Recognizing emotion from facial expressions has always been an easy task for humans (especially parents), but doing the same with a computer algorithm remains challenging. With the advances in computer vision and machine learning over the past decade, it has become possible to detect emotions from images. In this paper, we propose a novel technique called facial emotion recognition using convolutional neural networks (FERC). FERC is based on a two-part convolutional neural network (CNN): the first part removes the background from the picture, while the second part concentrates on extracting the facial feature vector. In the FERC model, an expressional vector (EV) is used to identify five different types of regular facial expressions. Supervisory data were obtained from a stored database of 10,000 images (154 persons). With an EV of only 24 values, it was possible to identify the emotion with 96% accuracy. The two levels of the CNN work in series, and the last perceptron layer adjusts the weight and exponent values at each iteration, improving accuracy stage by stage. FERC thus contrasts with generally followed single-level CNN strategies, thereby improving accuracy. Furthermore, the novel background-removal step applied before EV extraction avoids several problems that may otherwise occur, such as variation in distance from the camera. FERC was tested with the extended Cohn-Kanade expression dataset. We expect FERC emotion detection to be useful in many applications, such as predictive learning of students and lie detection.
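
The pipeline described above (background removal, then a 24-value EV, then a final classification layer over five emotions) can be sketched in simplified form. The sketch below is a minimal stand-in, not the paper's actual FERC network: the background stage is replaced by an intensity-threshold mask, the feature stage by 24 random convolution filters with global average pooling, and the final perceptron by an untrained linear layer; the emotion label set, threshold value, and image size are all assumptions for illustration.

```python
import numpy as np

EMOTIONS = ["anger", "disgust", "fear", "happiness", "sadness"]  # assumed label set


def remove_background(img, thresh=0.2):
    # Stand-in for FERC's first CNN stage: zero out low-intensity pixels.
    # (The paper uses a learned CNN for background removal.)
    return img * (img > thresh)


def extract_ev(img, filters):
    # Stand-in for FERC's second CNN stage: 24 3x3 convolution filters,
    # each reduced by global average pooling -> a 24-value EV.
    h, w = img.shape
    ev = np.empty(len(filters))
    for k, f in enumerate(filters):
        resp = np.zeros((h - 2, w - 2))
        for i in range(h - 2):
            for j in range(w - 2):
                resp[i, j] = np.sum(img[i:i + 3, j:j + 3] * f)
        ev[k] = resp.mean()  # global average pooling
    return ev


def classify(ev, weights):
    # Stand-in for the final perceptron layer: linear scores over 5 emotions.
    return EMOTIONS[int(np.argmax(weights @ ev))]


rng = np.random.default_rng(0)
filters = rng.standard_normal((24, 3, 3))   # untrained filters, for illustration
weights = rng.standard_normal((5, 24))      # untrained output layer

img = rng.random((48, 48))                  # stand-in grayscale face crop
ev = extract_ev(remove_background(img), filters)
label = classify(ev, weights)
print(ev.shape, label)
```

In the actual system, the filters and the final layer's weights would be learned from the supervisory data rather than drawn at random, and the background-removal stage would itself be a trained CNN.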