Real-time emotion detection from webcam — no wearables needed

via Dev.to · EmoPulse

We’ve been running controlled trials with real-time facial affect analysis using nothing but a standard 720p webcam — no IR sensors, no EEG caps, no chest straps. The goal? Detect emotional valence and arousal with enough accuracy to be useful in high-stakes environments: remote proctoring, telehealth triage, UX research. Most open-source pipelines fail here because they treat emotion as a static classification problem. We treat it as a dynamic signal.

Our stack uses a lightweight RetinaFace for detection, followed by a pruned EfficientNet-B0 fine-tuned on dynamic expressions from the AFEW and SEED datasets — not just static FER2013 junk. Temporal smoothing via a 1D causal CNN on top of softmax outputs reduces jitter and improves response latency under variable lighting.

The real breakthrough wasn’t the model — it was synchronizing inference with gaze vector estimation and head pose to gate confidence. If the user isn’t facing the camera within ±30 degrees, we don’t emit a prediction.
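The smoothing-plus-gating idea can be sketched in a few lines. This is a minimal stand-in, not the production stack: a fixed-weight causal kernel replaces the learned 1D causal CNN, and the `yaw_deg`/`pitch_deg` inputs are assumed to come from a separate head-pose estimator; all function and parameter names here are illustrative.

```python
import numpy as np

def causal_smooth(probs: np.ndarray, kernel: np.ndarray) -> np.ndarray:
    """Causally smooth per-frame softmax outputs.

    probs:  (T, C) array, one softmax distribution per frame.
    kernel: (K,) causal weights, most recent frame last.
    Each output frame t depends only on frames t-K+1 .. t.
    """
    T, _ = probs.shape
    K = len(kernel)
    # Edge-pad the past so early frames have a full window.
    padded = np.vstack([np.repeat(probs[:1], K - 1, axis=0), probs])
    out = np.empty_like(probs)
    for t in range(T):
        out[t] = kernel @ padded[t:t + K]  # weighted sum over the window
    return out / kernel.sum()  # renormalize so each row stays a distribution

def gated_prediction(smoothed: np.ndarray, yaw_deg: float, pitch_deg: float,
                     max_angle: float = 30.0):
    """Emit a class index only if the head pose is within ±max_angle."""
    if abs(yaw_deg) > max_angle or abs(pitch_deg) > max_angle:
        return None  # user not facing the camera: suppress the prediction
    return int(np.argmax(smoothed))
```

A learned `Conv1d` would replace the fixed kernel in practice, but the control flow is the same: smooth the per-frame distributions first, then let the pose check veto the output entirely rather than merely down-weighting it.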

Continue reading on Dev.to

