Emotion recognition from facial expressions using SVM algorithm
Diploma Thesis
2019-01-17 | en

Recognizing facial expressions and the emotions they convey is something every human does instinctively and universally, without any training; it is effectively "coded" into our DNA. Human-computer interaction can benefit greatly if machines can recognize human emotions simply by looking at us, without needing a shared spoken language, or any spoken language at all. The approach taken here is as follows: frames (or live video) depicting human faces with different emotional expressions are converted to grayscale, which makes the facial area easier and faster to locate. A pre-trained model then places 68 landmark points (each with unique coordinates) around the main facial regions that carry emotional detail, namely the orifices (mouth, eyes, nose). All possible pairs of these 68 points are formed and the Euclidean distance of each pair is computed; using these distances as features, a Support Vector Machine (SVM) is trained to recognize four basic emotions (Anger, Joy, Tranquility, Sadness). Two data sets (depicting expressions with their corresponding emotions) are used for training and validation of the SVM: the Patras A.I.nD.M. data set of 84 directed facial poses (portrait angle), and the Fer2013 data set of 960 non-directed facial poses (random angles). The Fer2013 data set was created for a facial-recognition competition intended to push recognition algorithms to their limits: the images have a very small resolution of 48x48 pixels, many are out of focus or obscured (by hair, hands, sunglasses, hats, bad angles, etc.), and some are not even real photographs but portraits or drawings of faces, covering a wide variety of people from around the world. After training, a validation test measures how accurate the classifier is.
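The feature-extraction step described above, taking all pairwise Euclidean distances between the 68 landmark points, can be sketched as follows. The landmark coordinates here are placeholder values: the thesis obtains them from a pre-trained 68-point facial shape predictor (dlib), which is not reproduced in this sketch.

```python
from itertools import combinations
import math

def landmark_distances(points):
    """Compute the Euclidean distance for every pair of the 68
    facial landmark points, yielding one feature vector per face."""
    return [math.dist(p, q) for p, q in combinations(points, 2)]

# Placeholder landmarks: 68 (x, y) pixel coordinates, standing in
# for the output of a pre-trained 68-point shape predictor.
points = [(i % 10, i // 10) for i in range(68)]

features = landmark_distances(points)
print(len(features))  # C(68, 2) = 2278 distances per face
```

These 2278-dimensional vectors are what the SVM classifier is trained on, one vector per labelled face image.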
Of the four examined emotions, Joy is distinguished best overall, with 100% in all three statistical measures (Sensitivity, Specificity, Accuracy) in both Patras data set tests, and 72%, 75% and 85% (Sen, Spe, Acc) in the Fer2013 data set. Anger comes second with 100%, 83% and 87.5% in the Patras data set and 56%, 85% and 78% in Fer2013. Tranquility is third with 50%, 100% and 87.5% (Sen, Spe, Acc) in the Patras data set and 45%, 67% and 62% in Fer2013. Finally, Sadness scores 87.5%, 50% and 83% in the Patras data set and 21%, 80% and 62% in Fer2013.

License: http://creativecommons.org/licenses/by/4.0/
Technical University of Crete :: School of Electrical and Computer Engineering
Lemonis_Ioannis_Dip_2019.pdf
Chania [Greece], Library of TUC, 2019-01-17
application/pdf, 2.9 MB, free
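The three statistical measures reported above are the standard confusion-matrix metrics. A minimal sketch of how they are computed for one emotion class, using hypothetical true/false positive and negative counts (not the thesis results):

```python
def sensitivity(tp, fn):
    # Sensitivity (recall): fraction of actual positives detected.
    return tp / (tp + fn)

def specificity(tn, fp):
    # Specificity: fraction of actual negatives correctly rejected.
    return tn / (tn + fp)

def accuracy(tp, tn, fp, fn):
    # Accuracy: fraction of all samples classified correctly.
    return (tp + tn) / (tp + tn + fp + fn)

# Hypothetical counts for one emotion on a validation set.
tp, fn, fp, tn = 7, 1, 1, 7
print(sensitivity(tp, fn), specificity(tn, fp), accuracy(tp, tn, fp, fn))
# 0.875 0.875 0.875
```

Each emotion is scored one-vs-rest, which is why each class gets its own Sen/Spe/Acc triple in the results above.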
Lemonis Ioannis
Zervakis Michail
Kalaitzakis Konstantinos
Sergaki Eleftheria
Technical University of Crete
Facial expressions
Emotion
Computer vision
Support vector machine
Dlib
OpenCV
Image processing
Classifier
Fer2013
IBUG 300-W