Facial Emotion Detection System Project

Overview

Real-time facial emotion detection system built with ROS2 Humble, OpenCV, TensorFlow, and Python. It detects 5 emotions (happy, sad, angry, surprised, neutral) from a webcam feed using the FER library and a pre-trained CNN model (trained on the FER2013 dataset).

Features

  • FER CNN-based classification: Pre-trained deep learning model for accurate emotion detection
  • Real-time detection: 15 Hz video processing with smooth emotion tracking
  • 5 emotion classes: Happy 😊, Sad 😒, Angry 😠, Surprised 😲, Neutral 😐
  • Live video display: Webcam feed with face bounding boxes and emotion labels
  • Emoji overlays: Large emoji indicator in video feed
  • ROS2 integration: Publishes detected emotions to /facial_emotion topic
  • Production-grade architecture: Type hints, clean code structure
  • Emotion smoothing: 5-frame history-based filtering for stable detection

System Requirements

  • Ubuntu 22.04
  • ROS2 Humble
  • Python 3.10+
  • Webcam (USB or built-in)

Dependencies

System Packages

```bash
sudo apt update
sudo apt install -y \
    python3-pip \
    ros-humble-cv-bridge \
    ros-humble-image-transport \
    python3-opencv
```

Python Packages

```bash
pip3 install fer --break-system-packages
```

This installs the FER library along with TensorFlow and its other dependencies.
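To confirm the packages are visible to Python without triggering a slow TensorFlow import, a quick standard-library probe can help:

```python
import importlib.util

# Look up each package on the import path without importing it;
# find_spec returns None when the package is not installed.
for pkg in ("fer", "tensorflow"):
    found = importlib.util.find_spec(pkg) is not None
    print(f"{pkg}: {'installed' if found else 'MISSING'}")
```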

Installation

1. Create Workspace

```bash
mkdir -p ~/facial_emotion_ws/src
cd ~/facial_emotion_ws/src
```

2. Clone/Create Package

```bash
# If using git
git clone <repository_url> facial_emotion_detector

# Or create the package manually and copy the files in
```

3. Build Package

```bash
cd ~/facial_emotion_ws
source /opt/ros/humble/setup.bash
colcon build --packages-select facial_emotion_detector --symlink-install
```

4. Source Workspace

```bash
source install/setup.bash
```

Usage

Method 1: Using Launch File (Recommended)

```bash
# Run with the launch file
ros2 launch facial_emotion_detector emotion_detection.launch.py
```

Method 2: Direct Node Execution

```bash
# Run the detector with live video display
ros2 run facial_emotion_detector emotion_detector
```

Both methods open a webcam window showing:

  • Your face with green bounding box
  • Current emotion label on face box
  • Large emoji in top-right corner
  • Emotion text with emoji at bottom
  • FPS counter in top-left

Press 'Q' to quit.
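The FPS counter in the overlay can be computed from a sliding window of frame timestamps. A minimal, OpenCV-independent sketch (the class name `FpsCounter` is illustrative, not from this package):

```python
import time
from collections import deque

class FpsCounter:
    """Estimate frames per second over a sliding window of tick() calls."""

    def __init__(self, window: int = 30):
        # Keep only the most recent `window` timestamps.
        self.times = deque(maxlen=window)

    def tick(self) -> None:
        """Record one frame's timestamp; call once per processed frame."""
        self.times.append(time.perf_counter())

    @property
    def fps(self) -> float:
        if len(self.times) < 2:
            return 0.0
        # (frames - 1) intervals span times[-1] - times[0] seconds.
        return (len(self.times) - 1) / (self.times[-1] - self.times[0])
```

In the display loop you would call `tick()` once per frame and draw `int(counter.fps)` into the top-left corner.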

Monitor Emotion Topic

In a separate terminal:

```bash
source ~/facial_emotion_ws/install/setup.bash
ros2 topic echo /facial_emotion
```

Emotion Detection Guide

How to Express Each Emotion

| Emotion | Instructions |
|---------|--------------|
| 😊 HAPPY | Smile wide with visible teeth. Facial muscles relaxed and elevated. |
| 😒 SAD | Look DOWN at the floor. Let your face droop. Pout the lower lip OUT. Slightly close your eyes. |
| 😠 ANGRY | Stare FORWARD intensely. Squeeze eyebrows DOWN and TOGETHER. Clench your jaw. |
| 😲 SURPRISED | Open mouth VERY WIDE (O-shape). Raise eyebrows UP HIGH. Open eyes wide. |
| 😐 NEUTRAL | Completely relax all facial muscles. Natural resting face. No tension. |

Key tip: For SAD look DOWN. For ANGRY stare FORWARD. This separates them easily.

System Architecture

Node Structure

```
emotion_detector_node
├── Camera Capture (15 Hz)
├── FER CNN Model
│   └── Pre-trained on FER2013 dataset
├── Emotion Classification
│   └── CNN softmax scores with tuned thresholds
├── Emotion Smoothing (5-frame history)
├── ROS2 Publisher (/facial_emotion)
└── Video Display with Overlays
```

Topics

| Topic | Type | Description |
|-------|------|-------------|
| /facial_emotion | std_msgs/String | Published emotion name |

  • Message values: "happy", "sad", "angry", "surprised", "neutral"
  • Publish rate: Variable (published while a face is detected)
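A minimal subscriber for this topic might look like the following sketch (assumes a standard rclpy setup; the node name `emotion_listener` and the helper `normalize_emotion` are illustrative, not part of this package):

```python
# Pure helper: validate an emotion string received on /facial_emotion.
VALID_EMOTIONS = {"happy", "sad", "angry", "surprised", "neutral"}

def normalize_emotion(name: str) -> str:
    """Lower-case and validate; fall back to 'neutral' for unknown values."""
    name = name.strip().lower()
    return name if name in VALID_EMOTIONS else "neutral"

def main():
    # ROS2 imports are kept inside main() so the helper above is
    # usable without a ROS2 environment.
    import rclpy
    from rclpy.node import Node
    from std_msgs.msg import String

    class EmotionListener(Node):
        def __init__(self):
            super().__init__("emotion_listener")
            self.create_subscription(
                String, "/facial_emotion", self.on_emotion, 10)

        def on_emotion(self, msg: String) -> None:
            self.get_logger().info(f"emotion: {normalize_emotion(msg.data)}")

    rclpy.init()
    node = EmotionListener()
    try:
        rclpy.spin(node)
    finally:
        node.destroy_node()
        rclpy.shutdown()

if __name__ == "__main__":
    main()
```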

Parameters

| Parameter | Type | Default | Description |
|-----------|------|---------|-------------|
| camera_id | int | 0 | Camera device index |

Package Structure

```
facial_emotion_ws/
├── src/
│   └── facial_emotion_detector/
│       ├── facial_emotion_detector/
│       │   ├── __init__.py                    # Package initialization
│       │   ├── emotion_classifier.py          # FER-based CNN emotion classifier
│       │   ├── emotion_detector_node.py       # ROS2 node with video display
│       │   └── emotion_display_node.py        # Alternative display node
│       ├── launch/
│       │   └── emotion_detection.launch.py    # Launch file for multiple nodes
│       ├── resource/
│       │   └── facial_emotion_detector        # Package resource marker
│       ├── test/
│       │   ├── test_copyright.py              # Copyright validation
│       │   ├── test_flake8.py                 # Code style checks
│       │   ├── test_pep257.py                 # Docstring validation
│       │   └── test_emotion_classifier.py     # Unit tests for emotion logic
│       ├── package.xml                        # ROS2 package dependencies
│       ├── setup.py                           # Python package setup
│       ├── setup.cfg                          # Package configuration
│       └── README.md                          # Documentation
├── build/                                     # Build artifacts
├── install/                                   # Installed files
└── log/                                       # Build and runtime logs
```

Technical Details

Emotion Classification Algorithm

The system uses FER library with pre-trained CNN:

  1. Face Detection: OpenCV cascade classifier

  2. Emotion Classification:

    • FER library with TensorFlow Lite CNN model
    • Trained on FER2013 dataset (35,000+ labeled images)
    • Outputs softmax scores for 7 emotions
  3. Emotion Mapping (FER → 5 classes):

    • angry, disgust → ANGRY
    • fear, surprise → SURPRISED
    • happy → HAPPY
    • sad → SAD
    • neutral → NEUTRAL
  4. Tuned Thresholds:

    • SAD: sad_score + fear_score * 0.3 > 0.25
    • ANGRY: angry_score + disgust_score * 0.5 > 0.15
    • Separation: SAD wins if sad > angry, ANGRY wins if angry > sad
  5. Smoothing: 5-frame majority voting for stable output
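Steps 3–5 above can be sketched in pure Python (function and class names here are illustrative; the actual implementation lives in `emotion_classifier.py`):

```python
from collections import Counter, deque

# FER's 7 softmax classes collapsed to the 5 classes this package publishes.
FER_TO_FIVE = {
    "angry": "angry", "disgust": "angry",
    "fear": "surprised", "surprise": "surprised",
    "happy": "happy", "sad": "sad", "neutral": "neutral",
}

def classify(scores: dict) -> str:
    """Map FER softmax scores to one of the 5 output classes,
    using the tuned thresholds and sad/angry separation above."""
    sad = scores.get("sad", 0.0) + scores.get("fear", 0.0) * 0.3
    angry = scores.get("angry", 0.0) + scores.get("disgust", 0.0) * 0.5
    if sad > 0.25 and sad > angry:
        return "sad"
    if angry > 0.15 and angry >= sad:
        return "angry"
    # Otherwise take the highest-scoring raw class and collapse it.
    top = max(scores, key=scores.get)
    return FER_TO_FIVE.get(top, "neutral")

class Smoother:
    """5-frame majority vote over recent classifications."""

    def __init__(self, window: int = 5):
        self.history = deque(maxlen=window)

    def update(self, emotion: str) -> str:
        self.history.append(emotion)
        return Counter(self.history).most_common(1)[0][0]
```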

Performance

| Metric | Value |
|--------|-------|
| Detection rate | 15 FPS (video display) |
| Latency | ~100 ms end-to-end |
| Model accuracy | ~63% on FER2013 test set |

Testing

Run Unit Tests

```bash
cd ~/facial_emotion_ws
source install/setup.bash

# Run all tests
colcon test --packages-select facial_emotion_detector

# Run a specific test
python3 src/facial_emotion_detector/test/test_emotion_classifier.py
```

Test Coverage

  • Emotion classification logic
  • Threshold behavior
  • Edge cases and defaults

Troubleshooting

Camera Issues

Problem: "Failed to open camera"

```bash
# Check available cameras
ls /dev/video*

# Test the camera
ffplay /dev/video0

# Change the camera_id parameter
ros2 run facial_emotion_detector emotion_detector --ros-args -p camera_id:=1
```

Poor Detection

Problem: Emotions not changing or incorrect

  • Ensure good lighting (front-lit face)
  • Face camera directly
  • Make exaggerated expressions initially
  • Adjust distance (60-120cm optimal)
  • For SAD: look DOWN
  • For ANGRY: stare FORWARD intensely

TensorFlow Warnings

Problem: TensorFlow deprecation warnings

These are harmless warnings from TensorFlow Lite. The system works correctly.

Import Errors

Problem: "cannot import name 'FER' from 'fer'"

```python
# Use the correct import path.
# In emotion_classifier.py, change:
from fer import FER
# to:
from fer.fer import FER
```
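A version-tolerant variant can handle both layouts, since the `FER` class has lived at different paths across `fer` releases (a sketch; `make_detector` is illustrative, and the code degrades gracefully when `fer` is absent):

```python
# Try the common import path first, then the older fer.fer module.
try:
    from fer import FER
except ImportError:
    try:
        from fer.fer import FER
    except ImportError:
        FER = None  # fer not installed; fail loudly where it's used

def make_detector():
    """Build a FER detector, with a clear error if fer is missing."""
    if FER is None:
        raise RuntimeError("The 'fer' package is not installed (pip3 install fer).")
    return FER()
```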
