Your iPhone Is Listening, Even to Your Bedroom Stories!

Siri can listen to everything

An Apple device in your pocket can keep your privacy at bay!

Perhaps it is time to coin a new proverb alongside the famous "an apple a day keeps the doctor away": an Apple device in your pocket can keep your privacy at bay! You heard that right. According to reports, there are serious privacy issues with Apple's beloved Siri, which is more like a companion for most of us. The report says there is a chance a human on the other side of the world may be listening to you.

Siri can listen to everything, since we have already granted Apple the necessary privileges. Someone may be hearing that conversation you had with your boss about a new marketing strategy, that awkward exchange with your doctor about a very private medical problem, or even an intimate moment with your partner.

That’s the takeaway from a troubling new Guardian story in which an Apple whistleblower details how the company lets contractors review audio of users’ Siri commands — as well as recordings never actually meant for Siri’s digital ears — to improve the digital assistant.

Apple told The Guardian it regularly sends Siri activations to “secure facilities” where human reviewers listen to the clips, which are usually just a few seconds long and stripped of the Apple user’s ID and name. These contractors grade the audio snippets, noting whether the AI handled the request appropriately and any mistakes, such as Siri thinking it had heard its “wake word” when it didn’t.

Fewer than 1 percent of all Siri activations are subjected to this process, Apple said, and the goal is to improve Siri’s ability to understand and assist users. But as The Guardian discovered, there are several key problems with Apple’s activation vetting process — and how the company describes it to users.

For one, Apple doesn’t explicitly state in the privacy documentation it develops for consumers that humans might be listening in when they talk to Siri. It also doesn’t put much effort into hiring trustworthy contractors or making sure the audio clips can’t be traced to their sources, the whistleblower told The Guardian.

Perhaps the most troubling revelation in The Guardian story, though, is the regularity with which these human reviewers hear audio that wasn’t even meant for Siri. The recordings can also be far longer than the few seconds Apple described to The Guardian, with the whistleblower noting that some can last upwards of 30 seconds.

It’s not entirely surprising that Apple lets humans review audio recorded by its AI assistant, given that we already knew Amazon and Google do the same thing. Digital assistants are not only here to stay but will likely become even more ubiquitous, meaning these companies probably aren’t going to stop trying to perfect the tech any time soon.

So, if Apple, Google, Amazon, and the rest of the tech giants are determined to let humans review audio recorded by their AI assistants, maybe they should all focus on perfecting just one aspect of the tech first: training the assistants to listen only when spoken to.

Content courtesy: The Guardian
