Detecting facial expressions and recognizing emotions is a crucial tool for perceiving the true sentiment a person holds toward a stimulus. Humans perform this function instinctively through our natural empathic response. To give arbitrary devices this capability, we used deep learning techniques to train an ensemble of neural networks to detect seven emotional states: anger, disgust, fear, happiness, sadness, surprise, and neutral.
Neural architecture and training
We used transfer learning from the FaceNet architecture, which was originally designed for facial identification. Features from a middle layer of this network served as embeddings that were then passed through two convolutional and two fully connected layers ending in a softmax activation. An ensemble of four such neural networks was built to produce the final classification. Training and testing were performed on a database of 32,300 images pre-labeled with the identified emotion, using an 80/20 random train-test split.
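The head described above (two convolutional layers, then two fully connected layers, with a softmax over the class probabilities, replicated across four ensemble members whose outputs are combined) can be sketched as follows. This is a minimal illustration, not the paper's exact model: the feature-map shape (256 channels, 7x7), the hidden-layer widths, and probability averaging as the ensemble rule are all assumptions, since the text does not specify them.

```python
import torch
import torch.nn as nn

class EmotionHead(nn.Module):
    """Classifier head on top of mid-layer FaceNet features.

    The input feature shape (256, 7, 7) and the hidden widths are
    illustrative assumptions; the paper does not report them.
    """
    def __init__(self, in_channels=256, num_classes=7):
        super().__init__()
        # Two convolutional layers, as described in the text.
        self.conv = nn.Sequential(
            nn.Conv2d(in_channels, 128, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv2d(128, 64, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        # Two fully connected layers ending in class logits.
        self.fc = nn.Sequential(
            nn.Flatten(),
            nn.Linear(64, 32),
            nn.ReLU(),
            nn.Linear(32, num_classes),
        )

    def forward(self, x):
        # Return logits; softmax is applied when combining the ensemble
        # (or inside the cross-entropy loss during training).
        return self.fc(self.conv(x))

def ensemble_predict(models, features):
    """Average the softmax outputs of the independently trained heads."""
    probs = torch.stack(
        [torch.softmax(m(features), dim=1) for m in models]
    )
    return probs.mean(dim=0)

# Usage: four heads, as in the ensemble described above.
heads = [EmotionHead() for _ in range(4)]
features = torch.randn(2, 256, 7, 7)   # stand-in for FaceNet mid-layer output
probs = ensemble_predict(heads, features)   # shape (2, 7), rows sum to 1
```

Averaging softmax probabilities is one common way to combine ensemble members; majority voting over the argmax predictions would be an equally plausible reading of the text.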