Emojify is an innovative application that combines the power of machine learning and deep learning to bridge the gap between human emotions and digital communication. By leveraging Convolutional Neural Networks (CNNs), TensorFlow/Keras, and OpenCV, this project identifies facial expressions from static images and represents them with corresponding emojis. The application, featuring a user-friendly interface built with Tkinter, offers an engaging and intuitive way to visualize emotions for both personal and professional use.
Install Dependencies:
Ensure all required Python libraries are installed. You can use the following command to install the necessary dependencies:
pip install -r requirements.txt
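The exact dependency list lives in requirements.txt; as a rough guide based on the libraries named above (TensorFlow/Keras, OpenCV, and Pillow for handling images in the Tkinter interface), it likely contains entries along these lines. Treat the package names as assumptions rather than the authoritative list; Tkinter itself ships with standard Python and is not installed via pip.

tensorflow
opencv-python
numpy
pillow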
Prepare the Environment:
Place the pre-trained model files (model2.keras and model2.weights.h5) in the root folder of the project. Make sure the emoji images are available (in /emojis/) and properly mapped in the code.
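"Properly mapped in the code" refers to the lookup from a predicted emotion label to an emoji image under /emojis/. A minimal sketch of what this might look like inside emoji.py, with variable names and emoji file names assumed purely for illustration:

from tensorflow.keras.models import load_model

# Load the pre-trained CNN. model2.keras holds the full model; the separate
# model2.weights.h5 file can be applied with model.load_weights(...) if the
# code rebuilds the architecture first.
model = load_model('model2.keras')

# One label per output class, paired with an emoji image in /emojis/.
emotion_labels = ['Angry', 'Disgust', 'Fear', 'Happy', 'Neutral', 'Sad', 'Surprise']
emoji_map = {label: f'emojis/{label.lower()}.png' for label in emotion_labels}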
Modify the Input Path:
Open the emoji.py file in a text editor. Update the image_path variable with the full path of the image you want to analyze. For example:
image_path = '/path/to/your/image.jpg'
Run the Application:
Execute the app using the following command:
python emoji.py
View Results:
The application displays the detected facial expression and its corresponding emoji in the Tkinter interface.
Repeat with Another Image:
Update the image_path variable with a new image path in the emoji.py file.
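To make the steps above concrete, here is a rough sketch of the flow that runs when emoji.py executes: read the image, detect the face with OpenCV, resize it to the 48x48 grayscale input the CNN expects, predict the emotion, and look up the matching emoji. This is an illustration under stated assumptions (cascade file, label order, emoji file names), not the project's actual code:

import cv2
import numpy as np
from tensorflow.keras.models import load_model

model = load_model('model2.keras')
emotion_labels = ['Angry', 'Disgust', 'Fear', 'Happy', 'Neutral', 'Sad', 'Surprise']

image_path = '/path/to/your/image.jpg'  # same variable described in the steps above
image = cv2.imread(image_path)
gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)

# Detect faces with OpenCV's bundled Haar cascade.
cascade = cv2.CascadeClassifier(cv2.data.haarcascades + 'haarcascade_frontalface_default.xml')
faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

for (x, y, w, h) in faces:
    # Crop the face and match the 48x48 grayscale input the CNN was trained on.
    face = cv2.resize(gray[y:y + h, x:x + w], (48, 48)).astype('float32') / 255.0
    face = face.reshape(1, 48, 48, 1)

    prediction = model.predict(face)
    emotion = emotion_labels[int(np.argmax(prediction))]
    print(f'Detected emotion: {emotion} -> emojis/{emotion.lower()}.png')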
Dataset
The project uses a dataset containing grayscale images of human faces, annotated with corresponding emotion labels. The dataset is structured to include training and validation subsets to enable model training and evaluation.
Number of Classes: 7 (Angry, Disgust, Fear, Happy, Neutral, Sad, Surprise)
Image Resolution: 48x48 pixels (grayscale)
Number of Samples:
Total: 122,407 images
Training Set: 92,968 images
Validation Set: 17,356 images
Test Set: 12,083 images
Emotion Distribution (number of images per class in the training set):
Angry: 6,566
Disgust: 3,231
Fear: 4,859
Happy: 28,592
Neutral: 29,384
Sad: 12,223
Surprise: 8,113
Training Set: 80% of the dataset (used for training the model).
Validation Set: 20% of the dataset (used for evaluating the model's performance during training).
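For reference, a split like the one above can be loaded with standard Keras utilities. The sketch below assumes the images are organized as one subfolder per emotion class under data/train/ and data/validation/; the directory names are illustrative, not taken from the project:

import tensorflow as tf

IMG_SIZE = (48, 48)  # the dataset uses 48x48 grayscale images

train_ds = tf.keras.utils.image_dataset_from_directory(
    'data/train',
    label_mode='categorical',
    color_mode='grayscale',
    image_size=IMG_SIZE,
    batch_size=64,
)
val_ds = tf.keras.utils.image_dataset_from_directory(
    'data/validation',
    label_mode='categorical',
    color_mode='grayscale',
    image_size=IMG_SIZE,
    batch_size=64,
)

# Scale pixel values from [0, 255] to [0, 1] before training the CNN.
rescale = tf.keras.layers.Rescaling(1.0 / 255)
train_ds = train_ds.map(lambda x, y: (rescale(x), y))
val_ds = val_ds.map(lambda x, y: (rescale(x), y))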
The inspiration behind the Emojify project comes from the increasing integration of AI into creative and interactive applications. Facial expressions are a universal mode of communication that crosses language barriers, and translating them into emojis, a globally recognized form of digital expression, felt like a natural and innovative bridge between human emotions and technology.
Utility
Enhanced Communication: This application can be used in a messaging app or social media to add an extra level of personalization by suggesting emojis automatically based on a user's facial expressions.
Accessibility: It can help individuals with communication challenges, such as speech impairments, express their emotions in an alternative way.
Educational Use: Teachers and therapists can use Emojify to help children, or individuals with conditions such as autism, better recognize and identify their emotions.
Entertainment and Gaming: The project can also be applied in games or entertainment applications for a more immersive and interactive user experience.
Target Audience
Social Media Users: For fun and more expressive conversations online.
Developers and Innovators: As a tool to bring AI-driven emotional intelligence to their platforms.
Educators and Therapists: For teaching recognition of emotions and enhancing communication.
Gaming Enthusiasts: To make games even more dynamic and engaging with real-time emotion mapping.
Emojify blends innovation with practicality, seeking to make technology more empathetic, inclusive, and fun for a wide range of users.