
QualityPoint Technologies News

Emerging Technologies News


New system that uses smartphone or computer cameras to measure pulse, respiration rate could help future personalized telehealth appointments

Posted on April 4, 2021

Telehealth has become a critical way for doctors to still provide health care while minimizing in-person contact during COVID-19. But with phone or Zoom appointments, it’s harder for doctors to get important vital signs from a patient, such as their pulse or respiration rate, in real time.

A University of Washington-led team has developed a method that uses the camera on a person’s smartphone or computer to take their pulse and respiration signal from a real-time video of their face. The researchers presented this state-of-the-art system in December at the Neural Information Processing Systems conference.

Now the team is proposing an improved system to measure these physiological signals, one that is less likely to be tripped up by different cameras, lighting conditions or facial features, such as skin color. The researchers will present these findings April 8 at the ACM Conference on Health, Inference, and Learning.

The researchers explain that machine learning is good at classifying images: given a series of photos of cats, it can learn to find cats in other images. But for machine learning to be helpful in remote health sensing, a system must identify the region of interest in a video that holds the strongest source of physiological information, such as pulse, and then measure that signal over time.

Every person is different. So this system needs to be able to quickly adapt to each person’s unique physiological signature, and separate this from other variations, such as what they look like and what environment they are in.

The team’s system is privacy preserving — it runs on the device instead of in the cloud — and uses machine learning to capture subtle changes in how light reflects off a person’s face, which is correlated with changing blood flow. Then it converts these changes into both pulse and respiration rate.
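The core signal-processing idea, recovering a pulse from tiny periodic changes in reflected light, can be sketched in a few lines. This is a simplified illustration, not the team's actual model: it assumes a face region has already been located and reduced to one mean pixel intensity per frame, and it reads the pulse off the dominant frequency in the human heart-rate band.

```python
import numpy as np

def estimate_pulse_bpm(frame_means, fps=30.0):
    """Estimate pulse (beats per minute) from a trace of mean facial
    pixel intensities, one value per video frame.

    Subtle periodic color changes in skin track blood volume; the
    dominant frequency in the heart-rate band is taken as the pulse.
    """
    signal = np.asarray(frame_means, dtype=float)
    signal = signal - signal.mean()            # remove constant brightness
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    # Restrict to a plausible heart-rate band: 0.7-4 Hz (42-240 bpm).
    band = (freqs >= 0.7) & (freqs <= 4.0)
    peak_freq = freqs[band][np.argmax(spectrum[band])]
    return peak_freq * 60.0

# Synthetic example: a 1.2 Hz (72 bpm) pulse buried in camera noise.
fps, seconds = 30.0, 10
t = np.arange(int(fps * seconds)) / fps
trace = (0.05 * np.sin(2 * np.pi * 1.2 * t)
         + np.random.default_rng(0).normal(0, 0.01, t.size))
print(round(estimate_pulse_bpm(trace, fps)))   # close to 72
```

Because everything here is a short FFT over a few hundred numbers, this kind of computation can comfortably run on the device itself, which is what makes the privacy-preserving, on-device design feasible.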

The first version of this system was trained with a dataset that contained both videos of people’s faces and “ground truth” information: each person’s pulse and respiration rate measured by standard instruments in the field. The system then used spatial and temporal information from the videos to calculate both vital signs. It outperformed similar machine learning systems on videos where subjects were moving and talking.
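How "ground truth" instrument readings can supervise which spatial parts of a video a model relies on can be illustrated with a deliberately simple linear stand-in. The team's system uses a learned spatial-temporal network; everything below, including the region layout and the data, is invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy data: per-frame mean intensities of 8 facial regions over 300 frames.
# Region 2 carries the pulse; the others are noise (lighting, motion, etc.).
n_frames, n_regions = 300, 8
t = np.arange(n_frames) / 30.0
ground_truth = np.sin(2 * np.pi * 1.2 * t)        # instrument-measured pulse
regions = rng.normal(0, 1, (n_frames, n_regions))
regions[:, 2] += 2.0 * ground_truth               # physiological signal

# Least-squares fit: learn one weight per region so the weighted sum of
# region traces reproduces the ground-truth waveform. Regions that carry
# no physiological signal end up with weights near zero.
weights, *_ = np.linalg.lstsq(regions, ground_truth, rcond=None)

print(int(np.argmax(np.abs(weights))))            # the pulse-bearing region: 2
```

The same supervision principle scales up: the real system replaces the per-region weights with a network that also exploits temporal structure, which is why it can keep tracking the signal while subjects move and talk.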

But while the system worked well on some datasets, it still struggled with others that contained different people, backgrounds and lighting. This is the common machine learning problem known as "overfitting": a model that fits its training data too closely fails to generalize to new conditions.

The researchers improved the system by having it produce a personalized machine learning model for each individual. Specifically, the system looks for the areas in a video frame that are most likely to contain physiological features correlated with changing blood flow in that person's face, across different contexts such as skin tones, lighting conditions and environments. It then focuses on those areas to measure the pulse and respiration rate.
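The personalization step can be caricatured as few-shot calibration: start from a generic model and take a handful of gradient steps on a short clip of the new person paired with a reference measurement. The function, the data, and the specific adaptation recipe below are all hypothetical, a minimal sketch of the idea rather than the team's method.

```python
import numpy as np

def personalize(weights, person_regions, person_truth, lr=0.1, steps=200):
    """Adapt generic per-region weights to one person with a few gradient
    steps on a short calibration clip (mean-squared-error objective)."""
    w = weights.copy()
    for _ in range(steps):
        error = person_regions @ w - person_truth
        grad = person_regions.T @ error / len(person_truth)
        w -= lr * grad
    return w

rng = np.random.default_rng(2)
n_frames, n_regions = 300, 8
t = np.arange(n_frames) / 30.0
truth = np.sin(2 * np.pi * 1.1 * t)
clip = rng.normal(0, 0.5, (n_frames, n_regions))
clip[:, 5] += truth                    # this person's signal lives in region 5

generic = np.zeros(n_regions)
generic[2] = 1.0                       # generic model favors region 2

before = np.mean((clip @ generic - truth) ** 2)
adapted = personalize(generic, clip, truth)
after = np.mean((clip @ adapted - truth) ** 2)
print(after < before)                  # adaptation reduces the error: True
```

The point of the sketch is that adaptation shifts the model's attention from the generic region to the one that actually carries this person's signal, which is how a personalized model can cope with variation in skin tone, lighting and environment that defeats a one-size-fits-all model.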

While this new system outperforms its predecessor when given more challenging datasets, especially for people with darker skin tones, there’s still more work to do, the team said.

The researchers are also collaborating with doctors to see how this system performs in clinical settings.

Any ability to sense pulse or respiration rate remotely provides new opportunities for remote patient care and telemedicine. This could include self-care, follow-up care or triage, especially when someone doesn’t have convenient access to a clinic.

News Source: University of Washington
