Introduction

Facial expressions are a vital identifier of human feelings because they correspond directly to emotions. Much of the time (roughly 55\% \cite{Mehrabian_2017}), the facial expression is a nonverbal channel for expressing emotion, and it can be considered concrete evidence of whether an individual is telling the truth or not \cite{Ekman_2005}.
Recently, researchers have made remarkable progress in facial expression detection \cite{Xie_2019}\cite{2010}\cite{Mal_2017}. Advances in neuroscience \cite{Parr_2009} and cognitive science \cite{2017} drive research on facial expression forward, while developments in computer vision \cite{1} and machine learning \cite{Xue_2006} make emotion identification more accurate and more accessible to the general population. As a result, facial expression recognition is growing rapidly as a subfield of image processing. Possible applications include human--computer interaction \cite{Hyoung_2007}, mental-patient monitoring \cite{Ernst_1934}, drunk-driver detection \cite{S_ChidanandKumar_2012}, and, most importantly, lie detection \cite{2013}.
Current approaches primarily focus on facial analysis while leaving the background intact, and hence build up many unnecessary and misleading features that confuse the CNN training process. Five essential facial expression classes are commonly reported: anger/displeasure, sadness, happiness, fear, and surprise/astonishment \cite{FRIDLUND_1994}. The FERC algorithm presented in this manuscript aims to analyze the expression in a given image and classify it into these five essential emotion classes. Reported techniques for facial expression detection fall into two major approaches: the first distinguishes expressions using an explicit classifier \cite{Gizatdinova_2007}, and the second performs classification based on extracted facial features \cite{Liu_2017}\cite{Fasel}. In the Facial Action Coding System (FACS) \cite{Gavrilescu_2014}, action units (AUs) are used as expression markers; these AUs are discriminated by changes in the facial muscles.
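To make the five-class setup concrete, the final stage of such a classifier maps a network's output scores to one of the five emotion labels. The following is a minimal illustrative sketch (not the FERC implementation itself; the label ordering and function names are assumptions for illustration):

```python
import numpy as np

# The five essential expression classes named in the text
# (the ordering here is an assumption, not taken from the paper).
CLASSES = ["anger", "sadness", "happiness", "fear", "surprise"]

def softmax(logits):
    """Numerically stable softmax over the last axis."""
    z = logits - np.max(logits, axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def predict_expression(logits):
    """Map a network's raw output scores to the most probable class label."""
    probs = softmax(np.asarray(logits, dtype=float))
    return CLASSES[int(np.argmax(probs))]

# Example: scores favoring the third (happiness) slot.
print(predict_expression([0.1, -0.3, 2.4, 0.0, 0.5]))  # -> happiness
```

In a real pipeline these logits would come from the last fully connected layer of the trained CNN; the sketch only shows the score-to-label mapping shared by essentially all five-class expression classifiers.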