Developed a real-time emotion detection system for analyzing students' facial expressions during classroom sessions. Leveraged a Convolutional Neural Network (CNN) trained on the FER-2013 dataset of 35,887 grayscale images to classify expressions into seven categories: angry, disgusted, fearful, happy, neutral, sad, and surprised. Deployed the model on a Jetson Nano with a webcam for real-time emotion recognition. The project aims to enhance classroom engagement and teaching effectiveness by giving instructors instant feedback on students' emotional states. Overcame challenges related to OpenCV compatibility, optimizer errors, and training on a large dataset. Future work includes integrating the technology into a classroom management system with real-time emotion monitoring via a web dashboard.
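
The sketch below illustrates the general shape of such a pipeline, not the project's actual code: a small Keras CNN over 48x48 grayscale FER-2013 crops, plus an OpenCV webcam loop that detects faces with a Haar cascade, classifies each crop, and overlays the predicted emotion. Layer sizes, the `model.h5` weights file, and the camera index are illustrative assumptions; a Jetson Nano CSI camera may additionally require a GStreamer pipeline string instead of `cv2.VideoCapture(0)`.

```python
# Minimal sketch of a FER-2013 emotion classifier and real-time webcam inference.
# Architecture, file names, and camera setup are assumptions, not the project's exact code.
import cv2
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Conv2D, MaxPooling2D, Dropout, Flatten, Dense

EMOTIONS = ["angry", "disgusted", "fearful", "happy", "neutral", "sad", "surprised"]

def build_model():
    # Small CNN over 48x48 grayscale inputs with a 7-way softmax output.
    model = Sequential([
        Conv2D(32, (3, 3), activation="relu", input_shape=(48, 48, 1)),
        Conv2D(64, (3, 3), activation="relu"),
        MaxPooling2D(pool_size=(2, 2)),
        Dropout(0.25),
        Conv2D(128, (3, 3), activation="relu"),
        MaxPooling2D(pool_size=(2, 2)),
        Dropout(0.25),
        Flatten(),
        Dense(1024, activation="relu"),
        Dropout(0.5),
        Dense(len(EMOTIONS), activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="categorical_crossentropy",
                  metrics=["accuracy"])
    return model

def run_webcam(model, weights_path="model.h5"):
    # Real-time loop: detect faces, resize crops to 48x48, classify, draw the label.
    model.load_weights(weights_path)  # hypothetical pre-trained weights file
    face_cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    cap = cv2.VideoCapture(0)  # default webcam; CSI cameras may need a GStreamer pipeline
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        for (x, y, w, h) in face_cascade.detectMultiScale(gray, scaleFactor=1.3, minNeighbors=5):
            roi = cv2.resize(gray[y:y + h, x:x + w], (48, 48)).astype("float32") / 255.0
            probs = model.predict(roi.reshape(1, 48, 48, 1), verbose=0)[0]
            label = EMOTIONS[int(np.argmax(probs))]
            cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
            cv2.putText(frame, label, (x, y - 10),
                        cv2.FONT_HERSHEY_SIMPLEX, 0.9, (0, 255, 0), 2)
        cv2.imshow("Emotion detection", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
    cap.release()
    cv2.destroyAllWindows()

if __name__ == "__main__":
    run_webcam(build_model())
```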