Classroom+

Nov 2018 - Jan 2019

About

Classroom+ was a real-time emotion recognition tool designed to help teachers understand student engagement using just a standard webcam. It captured facial snapshots and used artificial intelligence to estimate how students were feeling: no manual input, no surveys, and less room for the error that comes with self-reported feedback.

Why build Classroom+

This project started with a real classroom experience. I had a teacher who genuinely believed her lessons were engaging, but most of the class struggled to keep up and often felt disconnected. There was no easy way to give that kind of feedback without it feeling personal or uncomfortable.

That is what sparked the idea for Classroom+. We wondered if we could build a tool that gave teachers an objective view of how students were feeling during class, in real time.

How it works

You launch the application, place the webcam at eye level, and the tool takes it from there. It automatically detects faces in the frame and uses a trained model to analyse facial expressions, identifying one of six basic emotions for each person. Everything runs in real time, giving an instant view of how students are feeling during a lesson.
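As a rough sketch of that loop, the snippet below captures webcam frames, finds faces with a Haar cascade, and classifies each face crop with a trained model. The model file name, the label order, and the 48x48 input size are illustrative placeholders, not the exact values Classroom+ used.

```python
import cv2
import numpy as np
from tensorflow.keras.models import load_model

# Placeholder label set and model path; the trained weights are not published.
EMOTIONS = ["angry", "disgust", "fear", "happy", "sad", "surprise"]
model = load_model("emotion_model.h5")

# OpenCV ships a pre-trained frontal-face Haar cascade.
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

cap = cv2.VideoCapture(0)  # default webcam
while True:
    ok, frame = cap.read()
    if not ok:
        break

    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.3, minNeighbors=5)

    for (x, y, w, h) in faces:
        # Crop the face, resize to the model's input size and normalise.
        face = cv2.resize(gray[y:y + h, x:x + w], (48, 48)) / 255.0
        probs = model.predict(face.reshape(1, 48, 48, 1), verbose=0)[0]
        label = EMOTIONS[int(np.argmax(probs))]

        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
        cv2.putText(frame, label, (x, y - 10),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.8, (0, 255, 0), 2)

    cv2.imshow("Classroom+", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```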

Technical Specs

This project was developed entirely in Python (version 3.6.7), with TensorFlow (version 1.15) powering the deep learning model. We used OpenCV for real-time image processing and face detection, while libraries such as NumPy, SciPy, and Pandas managed the underlying data structures.

Face detection was implemented using Haar Cascades, and emotion classification was handled by a custom-trained neural network built with Keras.
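The exact network we trained isn't reproduced here, but a minimal Keras classifier of the kind described above might look like the sketch below; the layer sizes, dropout rate, and optimiser are assumptions for illustration, not the architecture Classroom+ actually shipped with.

```python
from tensorflow.keras import layers, models

def build_emotion_model(input_shape=(48, 48, 1), num_classes=6):
    """Small CNN for six-class emotion classification.

    Illustrative only: the layer sizes here are assumptions, not the
    network that was trained for Classroom+.
    """
    model = models.Sequential([
        layers.Conv2D(32, (3, 3), activation="relu", input_shape=input_shape),
        layers.MaxPooling2D((2, 2)),
        layers.Conv2D(64, (3, 3), activation="relu"),
        layers.MaxPooling2D((2, 2)),
        layers.Flatten(),
        layers.Dense(128, activation="relu"),
        layers.Dropout(0.5),
        layers.Dense(num_classes, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="categorical_crossentropy",
                  metrics=["accuracy"])
    return model
```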

The dataset was built entirely from scratch: many of the training images were captured at our school by photographing volunteers showing different emotions, and each image was then sorted, cleaned, and annotated by hand. In total, the system was trained on more than 600 curated images. Due to privacy considerations, this dataset remains private.
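The dataset itself stays private, but turning hand-sorted folders of labelled photos into training arrays typically looks something like the sketch below; the dataset/ folder layout, the label names, and the 48x48 image size are hypothetical rather than taken from the original project.

```python
import os
import cv2
import numpy as np
from tensorflow.keras.utils import to_categorical

# Hypothetical layout: one sub-folder per emotion, e.g. dataset/happy/img01.jpg
EMOTIONS = ["angry", "disgust", "fear", "happy", "sad", "surprise"]

def load_dataset(root="dataset", size=(48, 48)):
    images, labels = [], []
    for idx, emotion in enumerate(EMOTIONS):
        folder = os.path.join(root, emotion)
        for name in os.listdir(folder):
            img = cv2.imread(os.path.join(folder, name), cv2.IMREAD_GRAYSCALE)
            if img is None:
                continue  # skip unreadable files
            images.append(cv2.resize(img, size) / 255.0)
            labels.append(idx)
    x = np.array(images).reshape(-1, size[0], size[1], 1)
    y = to_categorical(labels, num_classes=len(EMOTIONS))
    return x, y
```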

What I learned

This was not a solo project, and it was only much later that I realised part of the model’s training logic had come from a GitHub repository. At the time, I didn’t understand what referencing or attribution meant in open-source work. If I were to build this again today, I’d handle credit and licensing very differently.

