EmoLink: Facial and Emotion Perception System
Computer Vision Research @ MIT Media Lab Personal Robotics Group
For this research, I developed a multi-modal deep neural network for dyadic affect analysis of parent-child interactions, designed to enhance the performance of a speech-recognition-based model. I integrated YOLOv5 and DeepFace to differentiate parent from child for accurate attention estimation (95% accuracy), and I built a pipeline to analyze emotions and nonverbal cues during dyadic interaction. This work led to a first-author paper, “EmoLink: Facial and Emotion Perception System for Displaying Interpersonal Dynamics in Real-World Parent-Child Interactions” (submitted to ICMI 2024).
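
Below is a minimal, hypothetical sketch of how YOLOv5 person detection could be combined with DeepFace emotion analysis in a pipeline like this. The parent/child heuristic (bounding-box height), the input file name, and the `analyze_frame` helper are illustrative assumptions, not the actual EmoLink implementation.

```python
# Hypothetical sketch: YOLOv5 person detection + DeepFace emotion analysis.
# The parent/child rule below (taller box = parent) is an illustrative
# assumption, not the method used in the actual EmoLink system.
import cv2
import torch
from deepface import DeepFace

# Load a pretrained YOLOv5 model from the official Ultralytics hub.
model = torch.hub.load("ultralytics/yolov5", "yolov5s", pretrained=True)

def analyze_frame(frame):
    """Detect people in a BGR frame and estimate each person's emotion."""
    results = model(frame[..., ::-1])  # YOLOv5 expects RGB input
    people = []
    # results.xyxy[0] holds rows of (x1, y1, x2, y2, confidence, class)
    for x1, y1, x2, y2, conf, cls in results.xyxy[0].tolist():
        if int(cls) != 0:  # COCO class 0 == "person"
            continue
        crop = frame[int(y1):int(y2), int(x1):int(x2)]
        # Recent DeepFace versions return a list of dicts,
        # each with a "dominant_emotion" key.
        analysis = DeepFace.analyze(
            crop, actions=["emotion"], enforce_detection=False
        )
        people.append({
            "box": (x1, y1, x2, y2),
            "height": y2 - y1,
            "emotion": analysis[0]["dominant_emotion"],
        })
    # Illustrative heuristic only: tag the taller detection as the parent.
    if len(people) == 2:
        people.sort(key=lambda p: p["height"], reverse=True)
        people[0]["role"], people[1]["role"] = "parent", "child"
    return people

frame = cv2.imread("dyad_frame.jpg")  # hypothetical input frame
for person in analyze_frame(frame):
    print(person.get("role", "unknown"), person["emotion"])
```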
