Emotion recognition using image processing


Emotion recognition using Machine Learning
People often say that the eyes are the gateway to the soul. Humans have an innate ability to express themselves using just their faces. Whatever the emotion may be, the human face can express it and get the message across efficiently. That is why reading body language and facial expressions is an important skill for understanding a person. So, what if computers could do that as well? What if computers could look at your face and understand exactly how you feel? This is already happening on a small scale, and this project will help you understand how. In this image processing project, we will build an application that recognizes emotions using facial detection, recognition and analysis.

Project Description

Facial detection and recognition techniques can be used for more than just surveillance. In fact, beyond security, these methods have many applications in our day-to-day lives. We already use them extensively, for everything from unlocking our phones and paying our bills to assisting in solving crimes. Because it is so versatile, facial recognition is an active research topic around the world. In this project, we will look at how these techniques can be used to understand the emotion conveyed in an image.

Concepts Used

  1. Fundamentals of Image Processing
  2. Basics of Data Segmentation
  3. Python Programming
  4. Fundamentals of Computer Vision
  5. Clustering and Analysis of Visual Data
  6. Feature Extraction
  7. Neural Networking
Project Implementation

  • Three steps are involved in identifying the right emotion:
  • Facial Detection, the ability of a program to detect the presence and location of a face within an input image or frame.
  • Facial Recognition, the method by which we compare multiple faces to determine whose face each one is and to gather data from it.
  • Emotion Detection, the process of classifying the emotion shown on a detected face in the input image.
  • We will use the Python library face_detection to make things easier for us.
  • The library scans the input image, smooths it to reduce noise and disturbance, and then detects the presence of a face by analysing the pixel values across the image.
  • It then returns to the caller the coordinates of the bounding box that surrounds the face.
  • Next, the program will use the library face_recognition to test and compare the bounding box we have against other faces. This process requires extracting key features from the bounding box and comparing them.
  • A face encoding vector is generated for each image, and the comparison essentially checks the distance between these vectors feature by feature.
  • Kaggle hosts the Facial Expression Recognition (FER-2013) dataset, which labels faces with seven emotions: happiness, sadness, fear, disgust, anger, neutral and surprise.
  • Now, we train a multi-layered Convolutional Neural Network on this dataset and tune it to improve the model's performance.
  • Finally, after enough training and testing, the model will be able to distinguish between various human facial expressions and categorize each one under the appropriate emotion.
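The encoding-comparison step above can be sketched in plain Python, assuming the 128-dimensional encoding vectors have already been extracted (for example with face_recognition.face_encodings). The names face_distance and is_match, the tolerance value and the stand-in vectors are illustrative assumptions, not part of any library:

```python
import numpy as np

def face_distance(known_encodings, candidate):
    """Euclidean distance between each known encoding and the candidate vector."""
    return np.linalg.norm(np.asarray(known_encodings) - candidate, axis=1)

def is_match(known_encodings, candidate, tolerance=0.6):
    """True wherever the distance falls below the tolerance (0.6 is a common default)."""
    return face_distance(known_encodings, candidate) <= tolerance

# Illustrative 128-d vectors standing in for real face encodings
known = np.zeros((2, 128))
known[1] += 1.0                    # a clearly different "face"
candidate = np.zeros(128) + 0.01   # very close to known[0]

print(is_match(known, candidate))  # → [ True False]
```

The smaller the distance between two encodings, the more likely they belong to the same person; the tolerance threshold trades false matches against missed matches.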
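A multi-layered CNN for classifying the seven FER-2013 emotions could look like the following Keras sketch. FER-2013 images are 48x48 grayscale; the layer sizes here are illustrative assumptions, not a tuned architecture:

```python
from tensorflow.keras import layers, models

NUM_CLASSES = 7  # happiness, sadness, fear, disgust, anger, neutral, surprise

model = models.Sequential([
    layers.Input(shape=(48, 48, 1)),          # 48x48 grayscale input
    layers.Conv2D(32, 3, activation="relu"),  # low-level feature maps
    layers.MaxPooling2D(),
    layers.Conv2D(64, 3, activation="relu"),  # higher-level features
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(128, activation="relu"),
    layers.Dropout(0.5),                      # regularization against overfitting
    layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="categorical_crossentropy",
              metrics=["accuracy"])
```

The model would then be trained with model.fit on the FER-2013 training split and evaluated on the held-out test split.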



