Role
Lead contributor for code implementation, responsible for preprocessing, model architecture, expression classification, and visualization.
Course
MATH 3710 – Data Analysis
Objective
This project aimed to recognize facial expressions in user-uploaded images through deep learning-based emotion classification. VGG19 and ResNet18 models were implemented and compared for classifying expressions into categories such as Angry, Happy, and Sad. Visual results and prediction probability distributions were also generated.
Technologies
Python, PyTorch, OpenCV
VGG19 and ResNet18 implementation and comparison, CK+ dataset, emotion category recognition and expression classification, confusion matrices, softmax score bar charts
My Contributions
- Handled input image processing: grayscale conversion, resizing, cropping, and tensor preparation (a preprocessing sketch follows this list)
- Implemented and trained deep learning models (VGG19 and ResNet18) for emotion recognition (see the model-setup sketch below)
- Visualized results with classification probability bar plots and corresponding emoji icons
- Plotted confusion matrices and analyzed classification accuracy for each expression (see the visualization sketch below)
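
The preprocessing step can be summarized with a minimal sketch like the one below. It assumes a simple center crop and a 48 x 48 target size, which are illustrative choices rather than the project's exact settings; the function name `preprocess` is also hypothetical.

```python
import cv2
import torch


def preprocess(image_path, size=48):
    """Load an image, convert to grayscale, center-crop to a square,
    resize, and return a (1, 1, size, size) float tensor in [0, 1]."""
    img = cv2.imread(image_path)                      # BGR uint8 array
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)      # single-channel grayscale
    h, w = gray.shape
    s = min(h, w)                                     # side length of the center square crop
    top, left = (h - s) // 2, (w - s) // 2
    crop = gray[top:top + s, left:left + s]
    resized = cv2.resize(crop, (size, size))          # assumed 48 x 48 input size
    tensor = torch.from_numpy(resized).float() / 255.0
    return tensor.unsqueeze(0).unsqueeze(0)           # add batch and channel dimensions
```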
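For the model setup, the following sketch shows one common way to adapt torchvision's VGG19 and ResNet18 to grayscale emotion classification. The 7-class label count, training from scratch (`weights=None`), and the builder function names are assumptions, not confirmed details of the project.

```python
import torch.nn as nn
from torchvision import models

NUM_CLASSES = 7  # assumed emotion label count (CK+-style categories)


def build_vgg19(num_classes=NUM_CLASSES):
    model = models.vgg19(weights=None)
    # Adapt the first conv layer to 1-channel grayscale input.
    model.features[0] = nn.Conv2d(1, 64, kernel_size=3, padding=1)
    # Replace the final classifier layer with the emotion head.
    model.classifier[6] = nn.Linear(4096, num_classes)
    return model


def build_resnet18(num_classes=NUM_CLASSES):
    model = models.resnet18(weights=None)
    model.conv1 = nn.Conv2d(1, 64, kernel_size=7, stride=2, padding=3, bias=False)
    model.fc = nn.Linear(model.fc.in_features, num_classes)
    return model
```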
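The two visualizations can be sketched as below, assuming matplotlib and scikit-learn are available and using a hypothetical 7-emotion label list: a softmax score bar chart for a single prediction, and a confusion matrix over the test split.

```python
import matplotlib.pyplot as plt
import torch.nn.functional as F
from sklearn.metrics import ConfusionMatrixDisplay, confusion_matrix

LABELS = ["Angry", "Contempt", "Disgust", "Fear", "Happy", "Sad", "Surprise"]  # assumed label set


def plot_softmax_scores(logits, labels=LABELS):
    """Bar chart of per-class softmax probabilities for a single image."""
    probs = F.softmax(logits.squeeze(), dim=0).detach().cpu().numpy()
    plt.bar(labels, probs)
    plt.ylabel("Probability")
    plt.title(f"Predicted: {labels[probs.argmax()]}")
    plt.show()


def plot_confusion_matrix(y_true, y_pred, labels=LABELS):
    """Confusion matrix of true vs. predicted emotion labels."""
    cm = confusion_matrix(y_true, y_pred, labels=list(range(len(labels))))
    ConfusionMatrixDisplay(cm, display_labels=labels).plot(cmap="Blues")
    plt.show()
```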
Results
- Successfully classified facial expressions from images and visualized emotion output
- VGG19 achieved higher accuracy than ResNet18 but required longer inference time
- Delivered interpretable visualizations including classification bars, emojis, and confusion matrices
GitHub Repository
GitHub repository not yet published