Google has announced that it will update the Google Fit app on Pixel phones to enable heart rate and respiratory rate monitoring. Users will be able to take these readings using their phones' cameras. The feature will roll out this month, Google said, adding that it plans to bring it to other Android phones in the future.

Both features rely on the smartphone's camera. To measure respiratory rate, the app uses the front-facing camera and optical flow (a computer vision technique) to detect subtle movements of the chest; the head and upper torso should be in view, and the user should breathe normally. For heart rate, the app uses the rear-facing camera to detect subtle color changes in a fingertip placed on the lens, which occur as freshly oxygenated blood is pumped from the heart through the body.
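Google has not published its implementation, but the fingertip technique is a well-known idea called photoplethysmography: blood flow slightly modulates how much light the skin reflects, so the average brightness of camera frames pulses at the heart rate. The sketch below illustrates the principle on a synthetic brightness trace (the signal values and frame rate are illustrative assumptions, not Google's data):

```python
import math

def estimate_bpm(brightness, fps):
    """Estimate heart rate from a per-frame brightness signal.

    Scans candidate rates in a plausible heart-rate band (40-200 bpm)
    and picks the one whose sinusoid correlates best with the
    mean-centred signal (a brute-force single-bin DFT).
    """
    n = len(brightness)
    mean = sum(brightness) / n
    centred = [b - mean for b in brightness]
    best_bpm, best_power = 0, -1.0
    for bpm in range(40, 201):
        f = bpm / 60.0  # candidate frequency in Hz
        re = sum(c * math.cos(2 * math.pi * f * i / fps)
                 for i, c in enumerate(centred))
        im = sum(c * math.sin(2 * math.pi * f * i / fps)
                 for i, c in enumerate(centred))
        power = re * re + im * im
        if power > best_power:
            best_power, best_bpm = power, bpm
    return best_bpm

# Synthetic 10-second clip at 30 fps: average red-channel brightness
# pulsing at 1.2 Hz, i.e. 72 beats per minute.
fps = 30
signal = [128 + 2.0 * math.sin(2 * math.pi * 1.2 * i / fps)
          for i in range(fps * 10)]
print(estimate_bpm(signal, fps))  # 72
```

A production system would work on real frame data and add filtering for lighting changes and motion, but the core step — finding the dominant frequency of a brightness signal — is the same.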

In a blog post, Shwetak Patel, director of Health Technologies at Google Health, said that these features are not a replacement for medical diagnosis and are not meant to evaluate medical conditions; instead, they are intended to let users track and improve their overall wellness. Once a measurement is complete, it can be saved in the app to monitor trends over time, alongside other health and wellness information.

Google has completed initial clinical validation of both features. For respiratory rate, the study examined accuracy among healthy individuals as well as those with respiratory conditions that might affect measurement; the algorithm is said to be accurate within one breath per minute on average across both groups. For heart rate, the validation examined the algorithm's performance among people with different skin types, and it is said to be accurate within two percent on average across all categories.

Patel said that increasingly powerful sensors and advances in computer vision have made it possible for a phone's camera to track tiny physical signals at the pixel level, such as chest movements for respiratory rate and subtle color changes in the fingertip for heart rate.

“We developed both features and completed initial clinical studies to validate them so they work in a variety of real-world conditions and for as many people as possible. For example, since our heart rate algorithm relies on approximating blood flow from color changes in someone’s fingertip, it has to account for factors such as lighting, skin tone, age, and more in order to work for everyone,” Patel said.

Jack Po, a product manager at Google Health, said that Google's machine learning technique emulates the way a doctor counts a patient's respiratory rate: by watching the chest rise and fall.
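The "watch the chest rise and fall" framing maps naturally onto a counting problem: optical flow yields a per-frame vertical displacement of the chest, and each breathing cycle is one rise. The toy sketch below counts breaths from such a displacement trace; the synthetic trace and threshold are illustrative assumptions, not Google's algorithm:

```python
import math

def count_breaths_per_minute(displacement, fps, threshold=0.0):
    """Count breathing cycles in a chest-displacement trace.

    Counts upward crossings of the threshold (each crossing marks the
    start of an inhalation) and scales to breaths per minute.
    """
    crossings = 0
    for prev, cur in zip(displacement, displacement[1:]):
        if prev <= threshold < cur:
            crossings += 1
    duration_min = len(displacement) / fps / 60.0
    return crossings / duration_min

# Synthetic 60-second trace at 30 fps: the chest rises and falls
# 15 times per minute (0.25 Hz).
fps = 30
trace = [math.sin(2 * math.pi * 0.25 * i / fps) for i in range(fps * 60)]
print(count_breaths_per_minute(trace, fps))  # 15.0
```

Real chest motion is far noisier than a sinusoid, so a deployed system would smooth the flow field and reject non-breathing motion first, but the final counting step is this simple.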

Google completed the initial clinical validation with people of varying health status and skin tones, under different ambient lighting conditions, and across different heart rate ranges, from users sitting at rest to users who had briefly exercised to elevate their heart rate. This is to make sure the features work for as many users as possible.