Real-time facial emotion detection system using ROS2 Humble, OpenCV, TensorFlow, and Python. Detects 5 emotions (happy, sad, angry, surprised, neutral) from a webcam feed using the FER library with a pre-trained CNN model (trained on the FER2013 dataset).
- FER CNN-based classification: Pre-trained deep learning model for accurate emotion detection
- Real-time detection: 15 Hz video processing with smooth emotion tracking
- 5 emotion classes: Happy 😊, Sad 😢, Angry 😠, Surprised 😲, Neutral 😐
- Live video display: Webcam feed with face bounding boxes and emotion labels
- Emoji overlays: Large emoji indicator in video feed
- ROS2 integration: Publishes detected emotions to the `/facial_emotion` topic
- Production-grade architecture: Type hints, clean code structure
- Emotion smoothing: 5-frame history-based filtering for stable detection
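The 5-frame smoothing can be sketched as a majority vote over a rolling history. This is a minimal illustration; the class and method names below are hypothetical, not the package's actual API:

```python
from collections import Counter, deque


class EmotionSmoother:
    """Majority-vote filter over the last N per-frame predictions (illustrative helper)."""

    def __init__(self, history_size: int = 5) -> None:
        self.history = deque(maxlen=history_size)

    def update(self, emotion: str) -> str:
        """Add the newest per-frame prediction and return the smoothed label."""
        self.history.append(emotion)
        # The most common label in the window wins, so a single noisy
        # misclassification cannot flip the published emotion.
        return Counter(self.history).most_common(1)[0][0]


smoother = EmotionSmoother()
for frame_emotion in ["happy", "happy", "sad", "happy", "neutral"]:
    stable = smoother.update(frame_emotion)
print(stable)  # "happy" - 3 of the last 5 frames
```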
- Ubuntu 22.04
- ROS2 Humble
- Python 3.10+
- Webcam (USB or built-in)
```bash
sudo apt update
sudo apt install -y \
    python3-pip \
    ros-humble-cv-bridge \
    ros-humble-image-transport \
    python3-opencv
```

```bash
pip3 install fer --break-system-packages
```

This installs the FER library with TensorFlow and other dependencies.
```bash
mkdir -p ~/facial_emotion_ws/src
cd ~/facial_emotion_ws/src

# If using git
git clone <repository_url> facial_emotion_detector
# Or create the package manually and copy the files
```

```bash
cd ~/facial_emotion_ws
source /opt/ros/humble/setup.bash
colcon build --packages-select facial_emotion_detector --symlink-install
source install/setup.bash
```

```bash
# Run with launch file
ros2 launch facial_emotion_detector emotion_detection.launch.py

# Run detector with live video display
ros2 run facial_emotion_detector emotion_detector
```

Both methods open a webcam window showing:
- Your face with green bounding box
- Current emotion label on face box
- Large emoji in top-right corner
- Emotion text with emoji at bottom
- FPS counter in top-left
Press 'Q' to quit.
In a separate terminal:

```bash
source ~/facial_emotion_ws/install/setup.bash
ros2 topic echo /facial_emotion
```

| Emotion | Instructions |
|---|---|
| 😊 HAPPY | Smile wide with visible teeth. Facial muscles relaxed and elevated. |
| 😢 SAD | Look DOWN at floor. Let face droop. Pout lower lip OUT. Slightly close eyes. |
| 😠 ANGRY | Stare FORWARD intensely. Squeeze eyebrows DOWN and TOGETHER. Clench jaw. |
| 😲 SURPRISED | Open mouth VERY WIDE (O-shape). Raise eyebrows UP HIGH. Open eyes wide. |
| 😐 NEUTRAL | Completely relax all facial muscles. Natural resting face. No tension. |
Key tip: For SAD look DOWN. For ANGRY stare FORWARD. This separates them easily.
```
emotion_detector_node
├── Camera Capture (15 Hz)
├── FER CNN Model
│   └── Pre-trained on FER2013 dataset
├── Emotion Classification
│   └── CNN softmax scores with tuned thresholds
├── Emotion Smoothing (5-frame history)
├── ROS2 Publisher (/facial_emotion)
└── Video Display with Overlays
```
| Topic | Type | Description |
|---|---|---|
| `/facial_emotion` | `std_msgs/String` | Published emotion name |

- Message values: `"happy"`, `"sad"`, `"angry"`, `"surprised"`, `"neutral"`
- Publish rate: variable (only while a face is detected)
| Parameter | Type | Default | Description |
|---|---|---|---|
| `camera_id` | `int` | `0` | Camera device index |
```
facial_emotion_ws/
├── src/
│   └── facial_emotion_detector/
│       ├── facial_emotion_detector/
│       │   ├── __init__.py                 # Package initialization
│       │   ├── emotion_classifier.py       # FER-based CNN emotion classifier
│       │   ├── emotion_detector_node.py    # ROS2 node with video display
│       │   └── emotion_display_node.py     # Alternative display node
│       ├── launch/
│       │   └── emotion_detection.launch.py # Launch file for multiple nodes
│       ├── resource/
│       │   └── facial_emotion_detector     # Package resource marker
│       ├── test/
│       │   ├── test_copyright.py           # Copyright validation
│       │   ├── test_flake8.py              # Code style checks
│       │   ├── test_pep257.py              # Docstring validation
│       │   └── test_emotion_classifier.py  # Unit tests for emotion logic
│       ├── package.xml                     # ROS2 package dependencies
│       ├── setup.py                        # Python package setup
│       ├── setup.cfg                       # Package configuration
│       └── README.md                       # Documentation
├── build/    # Build artifacts
├── install/  # Installed files
└── log/      # Build and runtime logs
```
The system uses the FER library with a pre-trained CNN:

- Face Detection: OpenCV cascade classifier
- Emotion Classification:
  - FER library with a TensorFlow Lite CNN model
  - Trained on the FER2013 dataset (35,000+ labeled images)
  - Outputs softmax scores for 7 emotions
- Emotion Mapping (FER → 5 classes): `angry`, `disgust` → ANGRY; `fear`, `surprise` → SURPRISED; `happy` → HAPPY; `sad` → SAD; `neutral` → NEUTRAL
- Tuned Thresholds:
  - SAD: `sad_score + fear_score * 0.3 > 0.25`
  - ANGRY: `angry_score + disgust_score * 0.5 > 0.15`
  - Separation: SAD wins if sad > angry; ANGRY wins if angry > sad
- Smoothing: 5-frame majority voting for stable output
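The mapping and threshold rules above can be sketched as a pure function over FER's seven softmax scores. This is a sketch of the documented rules only; the real classifier may differ in details:

```python
def map_to_five_classes(scores: dict) -> str:
    """Collapse FER's 7 softmax scores into the 5 supported classes.

    `scores` holds FER's keys: angry, disgust, fear, happy, sad, surprise, neutral.
    """
    sad = scores["sad"] + scores["fear"] * 0.3         # tuned SAD score
    angry = scores["angry"] + scores["disgust"] * 0.5  # tuned ANGRY score

    sad_hit = sad > 0.25
    angry_hit = angry > 0.15
    if sad_hit and angry_hit:
        # Separation rule: whichever tuned score is higher wins
        return "sad" if sad > angry else "angry"
    if sad_hit:
        return "sad"
    if angry_hit:
        return "angry"

    # Remaining classes: fear/surprise map to surprised, plus happy and neutral
    rest = {
        "happy": scores["happy"],
        "surprised": scores["surprise"] + scores["fear"],
        "neutral": scores["neutral"],
    }
    return max(rest, key=rest.get)


scores = {"angry": 0.05, "disgust": 0.0, "fear": 0.1, "happy": 0.6,
          "sad": 0.05, "surprise": 0.1, "neutral": 0.1}
print(map_to_five_classes(scores))  # "happy"
```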
| Metric | Value |
|---|---|
| Detection rate | 15 FPS (video display) |
| Latency | ~100ms end-to-end |
| Model accuracy | ~63% on FER2013 test set |
```bash
cd ~/facial_emotion_ws
source install/setup.bash

# Run all tests
colcon test --packages-select facial_emotion_detector

# Run a specific test
python3 src/facial_emotion_detector/test/test_emotion_classifier.py
```

The unit tests cover:
- Emotion classification logic
- Threshold behavior
- Edge cases and defaults
Problem: "Failed to open camera"

```bash
# Check available cameras
ls /dev/video*

# Test the camera
ffplay /dev/video0

# Change the camera_id parameter
ros2 run facial_emotion_detector emotion_detector --ros-args -p camera_id:=1
```

Problem: Emotions not changing or incorrect
- Ensure good lighting (front-lit face)
- Face camera directly
- Make exaggerated expressions initially
- Adjust distance (60-120cm optimal)
- For SAD: look DOWN
- For ANGRY: stare FORWARD intensely
Problem: TensorFlow deprecation warnings
These are harmless warnings from TensorFlow Lite. The system works correctly.
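If the log noise is distracting, TensorFlow's warnings can be silenced before FER is imported. This is an optional sketch using TensorFlow's standard environment variable and logger:

```python
import logging
import os

# Must be set before TensorFlow is imported (directly or via `fer`)
os.environ["TF_CPP_MIN_LOG_LEVEL"] = "2"  # hide INFO and WARNING messages

# Also quiet TensorFlow's Python-side logger
logging.getLogger("tensorflow").setLevel(logging.ERROR)
```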
Problem: "cannot import name 'FER' from 'fer'"
# Use correct import path
# In emotion_classifier.py, change:
from fer import FER
# To:
from fer.fer import FER