It was long thought to be a myth and dismissed by big tech companies.
But experts have revealed how listening in on your conversations has become a multi-billion-dollar industry.
Earlier this week, a leak from a leading marketing firm appeared to confirm how companies use microphones on your smart devices to eavesdrop before selling the data to advertisers.
‘You can be talking to one of your friends about going on a vacation to Portugal through a phone call, and then a day later or that same day, what do you see? An advertisement for a trip,’ data security expert Andy LoCascio told DailyMail.com.
The first slide of CMG’s leaked pitch deck describes how its ‘Active-Listening’ software listens to your conversations and extracts real-time intent data
The deck goes on to break down the process step by step, from identifying a ‘data trail’ left behind by consumers’ conversations and online behavior to creating targeted digital ads
The leak came from a pitch deck created by CMG, a marketing partner of Facebook, Amazon and Google.
The deck – which appears to have been made for prospective clients – detailed CMG’s ‘Active-Listening’ software, which collects data from people by listening in on their conversations.
Active-Listening software can be enabled through any app on an Android phone or iPhone, and other devices like smart home assistants can listen in too, LoCascio said.
What’s more, these devices are listening practically all the time, not just when you’re intentionally using the microphone to make a phone call or talk to Alexa.
‘For most devices, there is no device state when the microphone is inactive. It is nearly always active when Siri is present on the device or any other voice activated assistant is present,’ LoCascio said.
Companies that want to capture your voice data and sell it often gain access to your microphone through apps.
Typically, apps are granted permission to use your microphone through a clause ‘buried in the myriad of permissions you accept when installing a new app,’ he added.
That means that many users are consenting to being tapped without even realizing it.
‘The problem is, the form of consent is an all-or-nothing Faustian bargain,’ data privacy expert and consultant Sharon Polsky said.
‘So many websites say ‘we collect information from you and about you. If you use our website, you’ve consented to everything that we do.’ You have no way of opting out,’ she added.
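On Android, for example, that bargain ultimately rests on a single runtime permission. The sketch below is illustrative only – the activity name and request code are hypothetical, though the permission APIs are Android’s standard ones – and shows the kind of one-tap prompt an app can fold into its setup flow. Once the user taps ‘Allow’, the grant persists until it is revoked in settings:

```kotlin
import android.Manifest
import android.content.pm.PackageManager
import androidx.appcompat.app.AppCompatActivity
import androidx.core.app.ActivityCompat
import androidx.core.content.ContextCompat

// Hypothetical onboarding screen: the mic request is one line in a setup
// flow that most users tap through without reading.
class OnboardingActivity : AppCompatActivity() {

    private val micRequestCode = 42 // arbitrary request code

    fun requestMicrophone() {
        val alreadyGranted = ContextCompat.checkSelfPermission(
            this, Manifest.permission.RECORD_AUDIO
        ) == PackageManager.PERMISSION_GRANTED

        if (!alreadyGranted) {
            // Shows the system 'Allow <app> to record audio?' dialog.
            // Once accepted, the grant persists until revoked in Settings.
            ActivityCompat.requestPermissions(
                this, arrayOf(Manifest.permission.RECORD_AUDIO), micRequestCode
            )
        }
    }
}
```

The app also has to declare android.permission.RECORD_AUDIO in its manifest, but that declaration is something few users ever see.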
LoCascio explained that this is how CMG and other companies get away with it, even in states like California whose wiretapping laws prohibit recording somebody without their knowledge.
‘To be perfectly clear, there are no laws about this. If we give somebody permission to use the microphone on our device, and we click off all the other terms of service that none of us ever read, they can certainly use it,’ LoCascio said.
That lack of protective legislation has ‘created an entire data broker industry that’s now worth billions,’ Polsky said.
Google, Amazon and Facebook are explicitly touted as CMG clients, but these tech giants have denied that they are using CMG’s Active-Listening software
This industry’s rapid growth is owed partly to the development of highly sophisticated large language models, like ChatGPT.
These extremely powerful AI tools have made it easier and faster for advertisers or other third parties to mine our voice data for valuable information, LoCascio noted.
‘All I have to do is take one of those transcripts, drop it in the ChatGPT box, and then ask it a simple question,’ he explained. ‘Like, “please tell me what products and services I could market to somebody based on this conversation.”’
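To make that concrete, here is a minimal sketch of what such a mining step could look like in code, assuming a captured transcript and an OpenAI API key and using OpenAI’s public chat-completions endpoint. The model name, prompt wording and function names are this example’s assumptions, not anything taken from CMG’s leaked deck:

```kotlin
import java.net.URI
import java.net.http.HttpClient
import java.net.http.HttpRequest
import java.net.http.HttpResponse

// Minimal JSON string escaping, enough for this demo.
fun jsonString(s: String): String =
    "\"" + s.replace("\\", "\\\\").replace("\"", "\\\"").replace("\n", "\\n") + "\""

// Hypothetical mining step: post a transcript to the chat-completions API
// and ask what could be marketed to the people speaking.
fun mineTranscript(transcript: String, apiKey: String): String {
    val prompt = "Please tell me what products and services I could market " +
        "to somebody based on this conversation:\n$transcript"
    val body =
        """{"model":"gpt-4o-mini","messages":[{"role":"user","content":${jsonString(prompt)}}]}"""

    val request = HttpRequest.newBuilder()
        .uri(URI.create("https://api.openai.com/v1/chat/completions"))
        .header("Authorization", "Bearer $apiKey")
        .header("Content-Type", "application/json")
        .POST(HttpRequest.BodyPublishers.ofString(body))
        .build()

    // Returns the raw JSON response; the model's answer sits in
    // choices[0].message.content.
    return HttpClient.newHttpClient()
        .send(request, HttpResponse.BodyHandlers.ofString())
        .body()
}
```

Feed it a conversation about a vacation to Portugal, and a list of travel products to pitch comes back – the sort of ‘real-time intent data’ the deck promises.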
Once that voice data is captured, it can be sold to advertisers to direct and inform their targeted marketing. But it can also be sold to other clients, who could be using it for entirely different reasons.
‘They could be capturing those conversations for anything,’ LoCascio said.
‘It’s one thing to say they’re doing it for ads, and they can claim that, but they sell that information blindly to other people. And they don’t scrub it, so they basically sell an audio transcript,’ he added.
Other buyers of voice data include insurance companies, which use it to set personalized insurance rates, and the federal government, Polsky said.
‘One of the purchasers of our information – information about us – everything from our opinions, our predilections, our associations, our travel routes, is the government,’ she said.
And there are other insidious entities that want to get their hands on our voice data too, such as ‘people from the dark web that want to profit from scamming us,’ Polsky said.
That means that saying your Social Security number or other sensitive personal details aloud within earshot of your devices could put you at risk of identity theft, LoCascio said.
CMG – Cox Media Group – is an American media conglomerate based in Atlanta, Georgia. The company provides broadcast media, digital media, advertising and marketing services, and it generated $22.1 billion in revenue in 2022.
CMG did not respond to DailyMail.com’s request for comment.
The leaked deck details the six-step process that the company’s Active-Listening software uses to collect consumers’ voice data through seemingly any microphone-equipped device.
It’s unclear from the slideshow whether the Active-Listening software is eavesdropping constantly, or only at specific times when the phone mic is activated, such as during a call.
As for whether legislators will move to protect the public from this kind of surveillance, LoCascio said it’s highly unlikely and probably would not make a meaningful difference anyway.
‘They can write all the laws they want, but the bottom line is that we all indemnify them the moment we go and click “yes, I approve of whatever your terms of service are,”’ he continued.
That’s why it’s important for device users to understand the privacy risks that come with skimming through an app’s Terms and Conditions and blindly accepting them.
To prevent your voice data from being captured and sold, LoCascio recommended going through all your apps and deleting any you don’t regularly use.
Once you have trimmed down your apps, go through the ones that are left and think critically about which ones you trust to have access to your microphone, and which ones you don’t.
For the ones you don’t, change the settings to prevent them from accessing your mic. That should stop them from potentially eavesdropping on your conversations.
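For readers who want to see the full list before deciding, the same information the settings menu shows can be pulled programmatically. This sketch uses Android’s standard package-manager calls (the function name is hypothetical) to list every installed app that currently holds microphone permission:

```kotlin
import android.Manifest
import android.content.Context
import android.content.pm.PackageInfo
import android.content.pm.PackageManager

// Returns the package names of installed apps that requested microphone
// access and currently have it granted – the candidates worth reviewing.
fun appsWithMicAccess(context: Context): List<String> {
    val pm = context.packageManager
    return pm.getInstalledPackages(PackageManager.GET_PERMISSIONS)
        .filter { pkg ->
            val requested = pkg.requestedPermissions ?: return@filter false
            val flags = pkg.requestedPermissionsFlags ?: return@filter false
            requested.indices.any { i ->
                requested[i] == Manifest.permission.RECORD_AUDIO &&
                    (flags[i] and PackageInfo.REQUESTED_PERMISSION_GRANTED) != 0
            }
        }
        .map { it.packageName }
}
```

Note that on Android 11 and later, an app running this query needs package-visibility access (for instance, the QUERY_ALL_PACKAGES permission) to see other installed apps.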
And if you downloaded an app for a specific purpose and it’s no longer needed, delete it, LoCascio said.
If you granted microphone access when you downloaded it, leaving it to sit unused on your phone means it could start listening to your conversations at any point.
Polsky added that it’s best to keep your phone and other devices turned off when you’re not using them.
And at the end of the day, educating yourself about the privacy risks associated with your devices is most important, she said.
‘Nowadays, you can’t trust anybody,’ LoCascio said.