Google learns to measure heart rate using headphones

Wearable technology, such as smartwatches and wireless earbuds, typically relies on photoplethysmography (PPG) to monitor heart rate. Picture: Reuters

Published Oct 31, 2023

Wearable technology, such as smartwatches and wireless earbuds, typically relies on photoplethysmography (PPG) to monitor heart rate.

PPG shines light into the skin and measures the resulting changes in blood volume with each pulse. It has proven effective but comes with its own set of limitations. Google scientists have therefore explored an alternative approach called audioplethysmography (APG), which requires nothing more than off-the-shelf active noise-cancelling earbuds and a software update.

The concept behind APG involves bouncing a low-intensity ultrasound signal off the inner ear canal. As blood pulses through the skin, its surface deforms slightly, and those tiny perturbations modulate the echo picked up by the small microphone the earbuds already use for active noise cancellation.
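To make the idea concrete, here is a minimal signal-processing sketch of how a heart rate could, in principle, be recovered from such an echo. It is an illustration under simple assumptions (a single probe tone, a 48 kHz microphone and SciPy's standard filtering tools), not Google's actual APG pipeline.

```python
# Illustrative sketch only, not Google's APG implementation. It assumes the
# microphone signal contains a probe tone whose amplitude is weakly modulated
# by pulse-driven motion of the ear canal skin.
import numpy as np
from scipy.signal import butter, filtfilt, find_peaks, hilbert, resample_poly

FS = 48_000        # earbud microphone sample rate (assumption)
ENV_FS = 100       # rate at which the slow envelope is analysed

def estimate_bpm(mic_samples: np.ndarray, fs: int = FS) -> float:
    """Estimate heart rate in beats per minute from an ultrasound echo."""
    # 1. Demodulate: recover the slowly varying envelope of the probe tone.
    envelope = np.abs(hilbert(mic_samples))

    # 2. Downsample the envelope; heart-rate content lives far below 100 Hz.
    envelope = resample_poly(envelope, up=1, down=fs // ENV_FS)

    # 3. Keep only the heart-rate band, roughly 0.7-3.5 Hz (42-210 bpm).
    b, a = butter(2, [0.7, 3.5], btype="bandpass", fs=ENV_FS)
    pulse = filtfilt(b, a, envelope)

    # 4. Count pulse peaks, enforcing a ~0.3 s refractory gap between beats.
    peaks, _ = find_peaks(pulse, distance=int(0.3 * ENV_FS))
    return 60.0 * len(peaks) / (len(mic_samples) / fs)
```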

According to Google's research blog, the technique remained accurate even with a poor ear seal, variations in ear canal size, or darker skin tones. That last point is noteworthy, as heart-rate accuracy has historically been a challenge for wearables on darker or tattooed skin.

Google's researchers found that the ultrasound approach worked well even while music was playing, though it struggled in noisy environments, and the APG signal could be disrupted by body motion. They mitigated these issues by transmitting multiple ultrasound frequencies and isolating the most accurate signal among them.
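As a rough illustration of that multi-frequency idea, the sketch below scores the pulse signal recovered from each probe tone and keeps the one that looks most regular. The scoring metric, a normalised autocorrelation peak, is an assumption chosen for clarity rather than the selection criterion Google describes.

```python
# Hedged sketch of channel selection across several probe frequencies; the
# periodicity score is an illustrative stand-in for a real quality metric.
import numpy as np

def periodicity_score(pulse: np.ndarray) -> float:
    """Higher when the signal repeats regularly, as a clean heartbeat does."""
    x = pulse - pulse.mean()
    ac = np.correlate(x, x, mode="full")[len(x) - 1:]   # autocorrelation, lags >= 0
    ac = ac / (ac[0] + 1e-12)                           # normalise so lag 0 equals 1
    return float(ac[1:].max())                          # strongest repeat at a non-zero lag

def pick_best_frequency(pulses_by_freq: dict[float, np.ndarray]) -> float:
    """Return the probe frequency whose recovered pulse looks most heartbeat-like."""
    return max(pulses_by_freq, key=lambda f: periodicity_score(pulses_by_freq[f]))
```

In practice the two sketches would be chained: estimate the heart rate from whichever frequency pick_best_frequency selects.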

While heart rate monitoring headphones have existed for some time, they typically rely on the PPG approach, which can be sensitive to intense movement or an improper fit.

It's important to bear in mind that this is a research study, and it doesn't necessarily indicate that Google is preparing to release such headphones or update existing products. Nonetheless, it provides insight into Google's exploration of new ideas in the wearables space.

The use of ultrasound to monitor heart rate with everyday earbuds, while promising for health applications, could also raise concerns about invasive surveillance. If the technology were widely adopted, people could have their vital signs monitored without their consent or knowledge. As with past innovations, privacy and ethics deserve careful consideration, especially in cases like this, where no new hardware is needed and data collection could happen without people realising what their devices are able to measure.

Forbes’s chatbot paves the way for ubiquitous language interfaces

Forbes, in collaboration with Google Cloud, has introduced a beta version of its generative AI search platform called Adelaide. Named after BC Forbes's wife, Adelaide aims to offer personalised search experiences for readers. It allows users to ask specific questions or input general topics, receiving recommended articles related to their queries, along with summarised answers, as long as they fall within Forbes's coverage scope.
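Tools of this kind generally follow a retrieve-then-summarise pattern: find the most relevant articles, then have a language model answer from them. The sketch below illustrates that general pattern only; the Article type, the keyword ranking and the out-of-scope message are assumptions for illustration, not Forbes's or Google Cloud's actual implementation.

```python
# Generic retrieve-then-summarise sketch; not Adelaide's actual implementation.
# Everything here (Article, the keyword ranking, the scope message) is assumed.
from dataclasses import dataclass

@dataclass
class Article:
    title: str
    body: str

def search(query: str, index: list[Article], top_k: int = 3) -> list[Article]:
    """Naive keyword ranking standing in for a real semantic search index."""
    terms = query.lower().split()
    score = lambda a: sum(a.body.lower().count(t) for t in terms)
    hits = [a for a in index if score(a) > 0]            # stay within coverage scope
    return sorted(hits, key=score, reverse=True)[:top_k]

def build_prompt(query: str, index: list[Article]) -> str:
    """Recommend matching articles and frame them as context for a language model."""
    hits = search(query, index)
    if not hits:
        return "That question falls outside our coverage."
    context = "\n\n".join(f"{a.title}\n{a.body}" for a in hits)
    # A production system would now send this prompt to a hosted language model.
    return f"Using only these articles, answer: {query}\n\n{context}"
```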

Vadim Supitskiy, Forbes's chief digital and information officer, expressed the publication's desire to enhance engagement with both its search feature and its articles.

He stated that traditional search engagement had been relatively standard, prompting the adoption of generative AI to attract more users seeking information.

Previously, Macworld, PCWorld, Tech Advisor, and TechHive incorporated AI chatbots to answer reader questions based on their articles.

Supitskiy emphasised that Adelaide is Forbes's first generative AI tool but not its first foray into AI technology. In 2019, Forbes introduced Bertie, an AI-powered tool designed to assist Forbes journalists in refining their writing style. Both Adelaide and Bertie were developed using Google’s language models via its Cloud API services.

While it is currently only trained on the past 12 months of news articles, Forbes has ambitious plans for Adelaide, aiming to expand its knowledge base to cover its entire archive dating back to 1917.

Adelaide is a vision of one of the likely futures of AI integration, in which service providers each have their own language-based interface for users. These text-based AI programs will give businesses and users an easier way both to sift through large amounts of information and to deliver those insights in a human-friendly format.

They will undoubtedly make great tools for guiding users, troubleshooting problems and lowering the barrier to entry for information-intensive tasks. It is less clear whether this is a concerning acceleration into a digital, post-literate future, or a necessary improvement that makes our data-clogged, Wikipedia-like present more human-accessible.

James Browning is a freelance tech writer and local music journalist.

BUSINESS REPORT