Who Is Really Listening To Your Conversations?

For the last century, we've imagined a future where we're surrounded by robotic butlers that are classy, smart, and discreet. We wouldn't think twice about asking a digital assistant an embarrassing question, or entrusting it with sensitive information, because its directive would be to serve only its owner. Right? Already, millions of these digital helpers are running around in our pockets, in the form of assistants like Apple's Siri, Microsoft's Cortana, Amazon's Alexa and Google's smart assistant. These virtual helpers use artificial intelligence to parse what users say or type and return useful information. More recent updates to Siri and Google have taught the assistants to guess at what users want to know before they're asked, chiming in with a notification about a traffic jam at just the right moment. But they aren't quite the robot butlers we had in mind. In the quest to make our digital lives more convenient, tech companies have run up against a familiar dilemma: it's hard to deliver convenience without sacrificing privacy and security. For many people today, the conversation is all about these virtual assistants and the privacy trade-offs that come with their convenience.


For now, conversing with digital assistants is largely a one-on-one affair. Invoke Siri on your phone and ask for directions, or hit up Google for the weather, and your query gets sent to headquarters (a giant server farm somewhere), where it's parsed, answered, and returned to your device. That's why you need to be connected to the Internet, or using your phone's data connection, before these digital assistants can work. Like nearly everything else on the Internet, your requests leave a trail of breadcrumbs. Questions directed at Siri and Google's voice search get sent to their respective companies, paired with unique device IDs that, as far as the companies have said, aren't connected to specific users. Apple stores Siri requests with the device ID for six months, then deletes the ID but keeps the audio for another 18 months. Amazon, Google, and Microsoft associate this data with your username, encrypt it, and store it indefinitely in the cloud. Virtual assistants can capture data related to your contacts, calendar, browsing history, location, music library, purchases, and other personal preferences. They can use it to initiate phone calls, schedule appointments, pull up traffic or weather reports en route to your destination, or suggest nearby restaurants.
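To picture that round trip, here is a minimal, purely hypothetical sketch in Python. The endpoint, payload format, and field names are invented for illustration; no company publishes its assistant protocol this way. It only shows the shape of the exchange: a device ID plus your transcribed query goes out over the network, and a parsed answer comes back.

```python
# Conceptual sketch only: the URL and payload below are made up for illustration,
# not a real Apple/Google/Amazon/Microsoft API.
import uuid

import requests  # third-party HTTP library

# A device identifier, not a username. Per the companies' statements, something
# like this is what travels with each voice query.
DEVICE_ID = str(uuid.uuid4())


def ask_assistant(transcribed_query: str) -> str:
    """Send a transcribed voice query to a hypothetical assistant backend."""
    payload = {
        "device_id": DEVICE_ID,      # identifies the handset, not the person
        "query": transcribed_query,  # e.g. "directions to the airport"
    }
    # Without a network connection this call fails, which is why the assistants
    # need the Internet or a data connection before they can answer anything.
    response = requests.post(
        "https://assistant.example.com/v1/query", json=payload, timeout=10
    )
    response.raise_for_status()
    return response.json()["answer"]


if __name__ == "__main__":
    print(ask_assistant("what's the weather tomorrow?"))
```

The point of the sketch is simply that every question becomes a network request tied to some identifier, and whatever arrives at the server can, in principle, be logged and retained.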


When served with a court order, each company will either surrender your personal data to the relevant legal authority or challenge the request. Apple says that in 2015 it received about 2,000 requests from U.S. law enforcement agencies for iCloud account data, and it complied with more than 80 percent of them. So if you ask Siri how to find something illegal and you are later called in for questioning by the authorities, I would not be surprised if your Siri recordings, along with your Web searches, are introduced as evidence in court.

Could these digital assistants be hacked to eavesdrop on you? Possibly. So far, though, the only known attacks on voice assistants have occurred in a lab. Researchers in France last year demonstrated a way to remotely control phones using Siri or Google Now; the hack required each handset to use an external microphone and have the voice assistant enabled from the lock screen. An April 2015 security report detailed flaws in voice-activated home-automation hubs from Ubi and Wink that would enable them to be used as listening devices. (These devices have since been modified or removed from the market.) And Mattel's voice-driven Hello Barbie doll initially came with a wide range of security flaws, some of which could theoretically allow hackers to eavesdrop on kids. ToyTalk, which built and manages Hello Barbie's voice technology, says it has since addressed the "relevant and credible vulnerability issues." So, if manufacturers don't build security in from the start, fixes like these tend to arrive only after the flaws have been exposed.


Is it possible, then, to delete any of these archived recordings? Yes. Most people are unaware of this, but Amazon, Microsoft, and Google let you view and delete your recordings individually online. Apple doesn't show you individual recordings to delete, but it automatically deletes all of the information it has gathered once you deactivate Siri. Beware that deactivating your digital assistant also nukes any data the service uses to customize the information it provides, such as your favorite restaurants or sports teams, so after reactivating it you essentially start from scratch. Keep in mind, too, that voice interfaces and digital assistants are here to stay, and I think the main challenge now is to make them more secure so that people's privacy is protected.


 
