Artificial intelligence experts doubt that Amazon's new Halo bracelet can accurately interpret human emotions and warn of privacy risks - Amazon presented Amazon Halo this Thursday, a bracelet meant to compete with Fitbit and the Apple Watch. Like its competitors, Halo can measure heart rate and sleep patterns, but it also seeks to differentiate itself with a peculiar feature: judging your emotional state from your tone of voice.

The "Tone" feature (Amazon Tone) ensures you can tell how you sound to others. It uses "machine learning to analyze the energy and positivity in a customer's voice so they can better understand how they sound to others, helping to improve their communication and relationships," says Amazon's Halo press release.

As an example, Amazon's medical director, Maulik Majmudar, points out that Tone can give you feedback such as: "in the morning you seemed calm, at ease and affectionate." According to Majmudar, the feature analyzes vocal qualities such as your "pitch, intensity, tempo, and rhythm" to tell you how it thinks you sound to others.
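Amazon has not disclosed how Tone actually computes these qualities, but the low-level vocal features the company names (pitch, intensity, tempo) are standard signal-processing quantities. The sketch below is a hypothetical illustration of how such features could be extracted from a speech clip with the open-source librosa library; the file name, sample rate and frequency range are assumptions, and this is not Amazon's implementation.

```python
# Hypothetical sketch of extracting the kinds of vocal features Tone is said
# to analyze (pitch, intensity, tempo). This is NOT Amazon's implementation.
import librosa
import numpy as np

# Load a short speech clip (the file name is a placeholder).
y, sr = librosa.load("speech_sample.wav", sr=16000)

# Pitch: fundamental frequency estimated with the pYIN algorithm,
# restricted to a typical speech range (~65-400 Hz).
f0, voiced_flag, _ = librosa.pyin(y, fmin=65.0, fmax=400.0, sr=sr)
mean_pitch_hz = np.nanmean(f0)            # average pitch over voiced frames

# Intensity: root-mean-square energy per frame.
rms = librosa.feature.rms(y=y)[0]
mean_intensity = float(np.mean(rms))

# Tempo/rhythm proxy: rate of acoustic onsets (roughly, syllable-like events).
onsets = librosa.onset.onset_detect(y=y, sr=sr, units="time")
onset_rate = len(onsets) / (len(y) / sr)  # events per second

print(f"mean pitch:     {mean_pitch_hz:.1f} Hz")
print(f"mean intensity: {mean_intensity:.4f} (RMS)")
print(f"onset rate:     {onset_rate:.2f} events/s")
```

Turning raw features like these into labels such as "calm" or "affectionate" would require a trained classifier on top of them, and that interpretive step is precisely what the experts quoted below consider doubtful.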

The experts Business Insider has spoken with doubt that an algorithm can accurately analyze something as complex as human emotion, and they are also concerned that data from this feature may end up in the hands of third parties.

"I have my doubts that today's technology will be able to decipher the very complex human code of communication and the internal functioning of emotions," says Dr. Sandra Wachter, assistant professor of artificial intelligence ethics at Oxford University.

"The way we use our voice and language is heavily influenced by social expectations, culture and Customs. To expect an algorithm to be able to read and understand all these subtleties seems rather an aspiration, " he says.


Wachter adds that claiming the algorithm can also tell you how other people judge your voice tangles things further.

"Here the machine has to understand how a person speaks (and what he says) and deduce how the other person understands and interprets these words. This is an even more complex task because you have to read two minds. An algorithm as a mediator or interpreter seems very strange, I doubt that a system (at least at this point) will be able to decipher this complex social code, " he explains.

Frederike Kaltheuner, of Mozilla, agrees that voice analysis has intrinsic limitations. Voice recognition systems have also historically struggled with different types of voices, she points out. "Accuracy is usually lower for people who speak with an accent or in their second language."

Amazon explains that the Tone feature is optional for Halo bracelet owners. Once activated, it runs in the background, recording small fragments of your voice throughout the day for analysis. There is also an option to turn the feature on for specific conversations of up to 30 minutes.

Amazon says all of this data is kept safe and secure, with all processing done locally on your phone, which then erases the data. "Voice samples are never sent to the cloud, which means no one hears them, and you have full control of your voice data," Majmudar said.

Amazon's insistence that no human will hear any of the recordings made by this feature seems to allude to the time when Amazon, along with other large companies, was caught up in a scandal after several reports revealed that sensitive Alexa recordings were being sent for analysis to humans, who were not even direct employees of Amazon but contractors hired through other companies.

But experts say that even without humans listening to Tone's audio recordings, there are important privacy implications.

Privacy policy expert Nakeema Stefflbauer tells Business Insider that Halo could be Amazon's gateway into insurance technology. "My first impression is that it's almost as if Amazon is moving as fast as possible to anticipate public revelations about its own incursions into the insurtech realm," says Stefflbauer.

"I am alarmed when I hear that such assessments are recorded, because, although I see no benefit in IT, entrepreneurs could. Insurers certainly could. Public administrations that monitor benefits (such as unemployment) could definitely," he adds.

"The definitive signal to me that you, as a customer, are not the ultimate goal of the data collected is that Amazon already has alliances with insurers like John Hancock and medical records companies like Cerner," stefflbauer adds.

John Hancock announced on Thursday that it would be the first life insurer to integrate with Amazon Halo. "Starting this fall, all John Hancock Vitality customers will be able to link the Amazon Halo bracelet to the program to earn Vitality Points for the small daily steps they take to try to live a longer, healthier life," the insurer said in a press release.

Kaltheuner says it's good that the Tone feature is optional, but anonymized Halo data could still be shared at scale with third parties. "Even if it's aggregated and anonymous information, it may not be something you want your watch to do," she says.

Chris Gilliard, a surveillance and privacy expert at the Digital Pedagogy Lab, tells Business Insider that Amazon's privacy claims seem unconvincing to him.

"Amazon experienced the pressure when it was discovered that real humans were listening to Alexa's recordings, so this is their attempt to short-circuit that particular criticism, but to say that these systems will be 'private' stretches the meaning of that word beyond its own meaning, " he comments.

Wachter says that if, as Amazon claims, an algorithm is able to accurately analyze emotion in people's voices, it could pose a potential human rights problem.

"Our thoughts and emotions are protected by human rights law, for example, freedom of expression and the right to privacy," says Wachter.

"Our emotions and thoughts are one of the most intimate and personal aspects of our personality. In addition, we are often unable to control our emotions. Our inner thoughts and emotions are at the same time very important for forming opinions and expressing them. This is one of the reasons why the human rights act does not allow any intrusion into them."

"It is therefore very important that this barrier is not invaded and that this border is respected," he says.

