FacePhys: Remote Vital Signal Monitoring

February 15, 2026 · 2 min read
projects

A camera-based technology for contactless measurement of heart rate, HRV, and other vital signs from facial videos.

Overview

FacePhys leverages remote photoplethysmography (rPPG) technology to extract physiological signals from subtle color changes in facial skin caused by blood flow. This non-contact approach makes health monitoring accessible using everyday devices like smartphones, webcams, and surveillance cameras.
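
At its core, the signal is recovered from how the average skin color of the face fluctuates with each heartbeat. The minimal sketch below is illustrative only: it assumes pre-cropped facial frames, and the function name and the 0.7-4 Hz pulse band are our assumptions, not FacePhys internals. It reads the heart rate off the dominant frequency of the green-channel mean.

```python
# Illustrative rPPG principle only -- not the FacePhys implementation.
import numpy as np

def estimate_heart_rate(face_frames, fps):
    """face_frames: iterable of HxWx3 color face crops; fps: camera frame rate."""
    # 1. Spatially average the green channel per frame -> 1-D temporal trace
    trace = np.array([frame[:, :, 1].mean() for frame in face_frames])
    # 2. Remove the baseline (mean illumination and skin tone)
    trace = trace - trace.mean()
    # 3. Dominant frequency in a plausible pulse band (0.7-4 Hz, i.e. 42-240 bpm)
    freqs = np.fft.rfftfreq(len(trace), d=1.0 / fps)
    power = np.abs(np.fft.rfft(trace)) ** 2
    band = (freqs >= 0.7) & (freqs <= 4.0)
    return float(freqs[band][np.argmax(power[band])] * 60.0)  # beats per minute
```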

Try it out →

Key Features

  • Universal Camera Support — Works with any standard RGB camera without specialized hardware
  • Multiple Vital Signs — Measures heart rate, HRV, respiration rate, and blood oxygen saturation
  • Real-time Processing — Enables continuous monitoring with low latency
  • Robust Algorithms — Handles various lighting conditions, skin tones, and motion artifacts
  • Open Source Toolbox — Comprehensive research and development framework

Technology

FacePhys employs advanced deep learning models and signal processing techniques to (see the sketch after this list):

  1. Detect and track facial regions of interest
  2. Extract subtle color variations from video frames
  3. Filter noise and motion artifacts
  4. Reconstruct physiological waveforms
  5. Compute vital sign metrics
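
As a rough illustration, the sketch below maps these five steps onto common open-source building blocks (OpenCV for face detection, SciPy for filtering). It is an assumed, simplified wiring of such a pipeline, not the FacePhys codebase; the detector choice, filter order, and frequency band are placeholders.

```python
# Hypothetical five-step rPPG pipeline sketch -- not FacePhys internals.
import cv2
import numpy as np
from scipy.signal import butter, filtfilt, find_peaks

detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def face_roi(frame_bgr):
    """Step 1: detect the largest face in a BGR frame and return its crop (or None)."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    faces = detector.detectMultiScale(gray, scaleFactor=1.3, minNeighbors=5)
    if len(faces) == 0:
        return None
    x, y, w, h = max(faces, key=lambda f: f[2] * f[3])
    return frame_bgr[y:y + h, x:x + w]

def vital_signs(frames_bgr, fps):
    # Step 2: one color sample per frame (spatial mean of the green channel)
    rois = (face_roi(f) for f in frames_bgr)
    trace = np.array([r[:, :, 1].mean() for r in rois if r is not None])
    # Step 3: band-pass filter to the physiological range (~0.7-4 Hz)
    b, a = butter(3, [0.7, 4.0], btype="band", fs=fps)
    # Step 4: the filtered trace approximates the blood-volume pulse waveform
    pulse = filtfilt(b, a, trace)
    # Step 5: metrics from inter-beat intervals (peak-to-peak spacing)
    peaks, _ = find_peaks(pulse, distance=int(fps * 0.4))  # beats >= 0.4 s apart
    ibi = np.diff(peaks) / fps                              # seconds per beat
    heart_rate_bpm = 60.0 / ibi.mean()
    hrv_sdnn_ms = ibi.std() * 1000.0                        # one common HRV metric
    return heart_rate_bpm, hrv_sdnn_ms
```

In practice, robustness to lighting changes, motion, and varied skin tones (the failure modes listed under Key Features) is where the learned models come in, replacing the fixed green-channel average used here.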

Applications

  • Healthcare Monitoring — Remote patient monitoring and telemedicine
  • Fitness & Wellness — Stress assessment and exercise tracking
  • Human-Computer Interaction — Emotion recognition and adaptive interfaces
  • Research — Physiobehavioral computing studies

Try It Yourself

Experience FacePhys technology in action with our interactive demo:

Launch Live Demo →

The demo runs entirely in your browser and uses your webcam to measure your heart rate in real time. No data is uploaded or stored.

Impact

The research behind FacePhys has been published in top-tier conferences and journals, contributing to the advancement of contactless health monitoring and physiobehavioral computing.

Authors

Yuntao Wang
Associate Professor (Research Track)
Yuntao Wang’s research centers on physiobehavioral computing and intelligent interaction for mobile and wearable systems. His work focuses on (1) developing robust, efficient sensing that performs reliably on mainstream devices, (2) extracting spatiotemporal patterns from multimodal signals to infer interaction intent by leveraging natural behavioral correlations, and (3) designing edge-efficient interfaces that deliver high performance on mobile and wearable platforms. He has published 90+ papers, received 10 international conference awards, and holds 30+ granted patents. His contributions have been recognized with honors including the Wu Wenjun AI Outstanding Youth Award (2024), the CAST Young Elite Scientists Sponsorship Program (2022), the Qinghai High-Level Innovation & Entrepreneurship Leading Talent (2024), and the First Prize of the China Electronics Institute Science & Technology Award (2019).