Amazon asked the world this week, in effect, “Who wants a microphone-equipped wrist band that can police your tone of voice, connected to a service that measures your body fat based on pictures of you in your underwear?”

It’s asking $100 for the Halo device, which is also an activity tracker that would compete with similar offerings from Fitbit and Apple, plus $48 a year for the associated health and fitness service — though at least one insurer will buy it for you if you give it your data.  

But some tech analysts, journalists and would-be users reacting to the news on social media pointed out a range of potentially problematic features of Halo, drawing comparisons to dystopian science fiction classics such as “Do Androids Dream of Electric Sheep?,” the novel on which the movie “Blade Runner” was based.

“I usually call them all unwearables, but this is fascinating and more than a little scary,” veteran technology journalist and columnist Kara Swisher said on Twitter.   

Amazon positions the Halo device and service as a way for people to get better insight into what they’re actually doing, or not doing, in order to guide health improvements. Rather than a collaboration on the blockbuster Microsoft game franchise Halo (as some hoped on reading the headlines), Amazon’s Halo marks its deepest foray into what for years has been called the quantified self. It’s also another prong in its strategy to grow a major business in healthcare. Halo can be paired with electronic medical records from Cerner, among a host of other third-party integrations.

Kate McKean, a literary agent, was one of several people expressing concern on social media that the body fat assessment tool — which allows a user to “see yourself at different body fat percentages” by adjusting a scroll bar, with limits in place so as not to depict unsafe levels — could contribute to fat shaming and degraded self-image. Weight Watchers doesn’t see a problem, introducing a service that ties into Halo “to help members build good habits to live healthier lives.”


As for the tone analysis — pitched by Amazon as an opt-in tool “to help you communicate more thoughtfully and keep your relationships strong” — people including lexicographer and dictionary editor Kory Stamper questioned whether the technology to accomplish an accurate assessment of how someone is feeling, based on a few snippets of speech, is ready for use, particularly given the limited details available on how it was developed, such as what sort of data was used to train the artificial intelligence.

“If your training data is bad, then the result will be bad,” Stamper said on Twitter. “With a product like this, that has real-world consequences for people.”

An Amazon spokesperson said that it focused on “ensuring the data we use to train and evaluate our models accounts for all demographic groups. Tone was trained using tens of thousands of voices across demographics and regions in the US.” (An earlier version of this story erroneously said Amazon had not disclosed this information.)

The company said it used American English, and the analysis would be less accurate for people who “have an accent.”  

Setting aside the privacy implications of yet another always-on microphone, some people with autism, or who closely track their emotions as part of a therapy program, said on social media that a daily report on mood could be useful.

But there is also a deep current of distrust surrounding these services, given that personal data, particularly on people’s emotions, is highly valuable to businesses and has mostly been gathered for the purposes of selling more stuff. Indeed, Amazon has built its empire on an ever-growing trove of highly detailed customer data.


Will Ahmed, founder and CEO of a company called WHOOP that makes a similar activity tracker and fitness membership program, said he believes everyone will eventually be wearing such a device, but cautioned people to pay attention to who benefits.

“It’s hard to understate how invasive it is that a Trillion dollar company wants you to wear a 24/7 wearable that intentionally records EVERYTHING you say. Re-read that and tell me it’s not a dystopian future,” he said on Twitter.

Amazon says it designed Halo “with customer trust as a foundational tenet.” Speech samples will be controlled by the users, stored only on their phone and deleted after analysis, Amazon says in one of its more clearly worded privacy policies — documents that have typically been lengthy, obtuse and rarely read by users. (Still, the privacy policy runs to nearly 2,800 words, and, as with most such products, there are separate terms of service and legal notices.) Body scan images are sent to the cloud, but are deleted after processing, the company says. Health data is stored on Amazon’s servers, but the company promises never to sell it.

The Halo uses microphones on its band to listen to snippets of conversation and analyze how it thinks you come across to others. (Courtesy of Amazon)

Users can choose to give their data away, though. John Hancock said it would give its policyholders a free Halo and a three-year membership in exchange for sharing their sleep, heart rate and activity data, something the company has offered for other devices since 2018, OneZero reported Friday.

Other insurers have partnered with Apple, Google and Fitbit for similar programs that offer incentives for activities measured by these tracking devices.  

The features that differentiate Halo are also the most intimate and potentially invasive. While many people may abhor the idea of sharing photos of themselves in their skivvies for analysis by Amazon technology and allowing it to record them all day, for many others such trades of data for convenience and benefits are not a problem; to the contrary, they’ve become a fact of modern life. Amazon’s category-leading line of microphone-and-speaker devices is widely used.