Why You Think Your Devices Are Listening To You

 

At a dinner party the other night, a very accomplished business person told a story about how he and his wife were certain that their devices were listening to their conversations. “I was talking to my wife about a pair of designer shoes that she wanted to purchase, and not 10 minutes later while she was doing some online research for work, she saw an ad for that exact pair of shoes. She hadn’t searched for the shoes; the ad just appeared. Clearly, our computers or our phones are listening.” Some people nodded in agreement, and others began to chime in.

I listened politely for a few minutes more as the story was embellished and other guests shared their own versions of “surveillance state” anecdotes.

Then, I raised my hand like a school kid and said, “OK. Wait. Which do you think is more likely? (A) There is secret software that breaks about 20 different local, state, and federal surveillance and privacy laws, that neither I nor any of my clients know about but that are being secretly used by me, my clients, and other advertisers to put the right message in front of you at the right time in the right place?

Or, (B) Thanks to your online behaviors (and the privacy policies, terms, and conditions you have agreed to) we have access to enriched data sets and our predictive models and machine learning tools have evolved so quickly that we have an uncanny ability to understand your behaviors well enough to put the right message in front of you at the right time in the right place?”

Questions that followed included “What is an enriched data set?” “What is an online behavior?” “What is a predictive model?” “What kind of machine learning are you talking about? Is that AI?” And my favorite, “How do you know what I’ve been talking about with my friends?”

Enriched Data Sets

Data is more powerful in the presence of other data. If you have someone’s name and email address, you can send them a general offer via email. If you know where they live (phone book), what car they drive (warranty lists), if they own or rent their home (public records), where they work (location data from their phone, LinkedIn, or other public websites), what they do (LinkedIn or other public websites), what their hours are (location data from their phone or Yelp or Google), how many people they are responsible for (inferred from their purchasing data), what they ordered for dinner last night (their social media posts), where they had dinner last night (their credit card info – which is legal to obtain if the company has a business relationship with them), how much debt they carry (their credit report), their credit score (credit reporting organizations), etc., you can send them a more targeted offer. The more data you have, the more accurate your predictions can be. But there is more to enriched data sets than passive information. Let’s add in online behaviors.
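In code, this kind of enrichment is just a series of joins on a shared key. Here is a minimal, purely illustrative sketch – every name, source, and value below is made up for the example:

```python
# Toy sketch of "data enrichment": merging records from separate
# sources keyed on the same identifier (here, an email address).
# All sources and values are hypothetical.

base = {"email": "pat@example.com", "name": "Pat"}

public_records = {"pat@example.com": {"owns_home": True, "zip": "10001"}}
credit_bureau = {"pat@example.com": {"credit_score": 742}}

def enrich(profile, *sources):
    """Fold every matching source record into one combined profile."""
    enriched = dict(profile)
    for source in sources:
        enriched.update(source.get(profile["email"], {}))
    return enriched

profile = enrich(base, public_records, credit_bureau)
print(profile["credit_score"])  # -> 742
```

Each additional source widens the profile, which is exactly why "data is more powerful in the presence of other data."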

Online Behaviors

When you click on something, you are exhibiting an online behavior. This includes links in search, links to articles, links on websites, visiting a website, stopping while scrolling a social media site to look at a meme or message, swiping left or right, tapping an icon on your smartphone, picking up your smartphone (accelerometer), walking or running using a health app (GPS), using Waze, Google Maps, or Apple Maps for wayfinding, talking to Siri, Alexa, Google, Cortana, or Bixby, or playing a game of any kind on any device. All of these behaviors are captured, logged, and used to enrich your profile.
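Under the hood, each of those behaviors typically becomes a small, timestamped event appended to a log. A hypothetical sketch (the event names and fields are invented for illustration):

```python
# Hypothetical sketch of behavioral event capture: each click,
# scroll-pause, or tap is recorded as a structured, timestamped
# event. Event types and field names are made up for the example.
import time

event_log = []

def track(user_id, event_type, **details):
    """Append one online behavior to the event log."""
    event = {"user": user_id, "type": event_type,
             "ts": time.time(), **details}
    event_log.append(event)
    return event

track("u123", "click", target="designer-shoes-ad")
track("u123", "scroll_pause", post_id="meme-42", seconds=3.5)
print(len(event_log))  # -> 2
```

Millions of events like these, accumulated over time, are the raw material that enriches your profile.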

Is All My Data in One Place?

Your enriched profile is not in one place. But every company that wants to send you a targeted message does everything it can to create a “single view of the customer.” This includes cobbling together the most robust, most enriched data profile possible. The better the profile, the better the predictions. The private profiles that big tech organizations such as Google, Facebook, Amazon, Netflix, Microsoft, and Apple have for each of us are unimaginably large, and the predictions they make are extraordinarily accurate. In China, the government has pretty much 100 percent of the data everyone creates. In the EU, GDPR has been enacted to protect people from this. It is too early to tell if GDPR is working.

Predictive Models

Most predictive models fall into two general categories: classification and regression.

The goal of classification algorithms is to identify new data as belonging to a specific class or category. There are binary classifications (two possible outcomes, such as yes/no) and there are multi-class classifications (more than two possible classes or categories). This is roughly analogous to a person asking, “What is this?” then thinking about it and then declaring, “Oh, it’s a cup. Let me put it in the cupboard.”

You are part of several classes including your family members, your friends, and your communities of interest. If you have been mathematically placed in a class with people who are likely to be discussing designer shoes, you’re going to see ads for designer shoes. Is it a coincidence that you were “just” talking about designer shoes? No. The algorithm was 92 percent confident that you had a 71 percent chance of talking about designer shoes.
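A toy version of that classification step can be written in a few lines. This is a nearest-neighbor sketch, not a production ad-targeting model, and every feature and number is invented:

```python
# Minimal classification illustration: assign a shopper to an
# interest segment by finding the closest known example on two
# made-up features. All numbers are invented for the example.
import math

# (monthly_fashion_spend, fashion_pages_visited) -> segment
training = [
    ((500, 40), "designer-shoes"),
    ((450, 35), "designer-shoes"),
    ((30, 2), "budget"),
    ((50, 5), "budget"),
]

def classify(features):
    """Return the label of the nearest training example."""
    return min(training,
               key=lambda ex: math.dist(features, ex[0]))[1]

print(classify((480, 38)))  # -> designer-shoes
```

Real systems use far more features and far more sophisticated models, but the idea is the same: you land in a class, and the ads for that class follow.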

Regression analysis can be used to infer relationships between independent and dependent variables. If I know a bunch of stuff about you (such as your income, zip code, monthly mortgage payment, type of car you drive now, age, and gender), I can use regression analysis to predict what car you’ll want to buy or lease next.

Why is a PhD in data science worth $1.5 million per year to a data-rich organization? Because a PhD in data science knows how to creatively and efficiently apply analysis techniques to make super-accurate predictions.

Machine Learning and AI

If I showed you a 10 x 10 spreadsheet of data about your business, in only a few minutes you could tell me everything it represented. You know your business, your customers, your industry. The numbers would describe things you have experienced in real life, and you would be able to explain (using the language of arts and letters, not the language of mathematics) how the numbers spoke to you.

However, if that data set was 25,000 columns by 25 million rows, there is no way you or any other human being could ever look at or interpret the data. That’s why it’s called “big data.”

To look at big data, you need computers. And to make the data actionable, you can teach machines to do predictive analysis. Machines can now learn, and predictive analytics is one of the things machines learn to do very, very well.
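One reason machines can learn from data sets that size is that they never need to see the whole table at once: they can update a model one row at a time. A hypothetical single-weight sketch of that streaming idea (the data and learning rate are invented):

```python
# Sketch of online learning on streaming data: one pass of gradient
# descent fitting y ≈ w * x, updating the weight row by row, the way
# a model ingests a table far too large for any human to read.
def fit_online(rows, lr=0.01):
    w = 0.0
    for x, y in rows:          # each row is seen exactly once
        error = w * x - y      # prediction error on this row
        w -= lr * error * x    # nudge the weight to shrink the error
    return w

# Synthetic stream where the true relationship is y = 2x.
stream = ((x, 2 * x) for x in [1, 2, 3, 4] * 500)
w = fit_online(stream)
print(round(w, 2))  # -> 2.0
```

Scale that loop across thousands of machines and millions of weights and you have, roughly, how modern predictive analytics digests big data.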

Why You Think Your Devices Are Listening to You

First, we need to define “listening.” With respect to your private audible conversations, meaning spoken words that might be recorded and interpreted, unless you are under surveillance by a government agency with a warrant or being illegally eavesdropped on, no one is listening to your conversations with any tool that will be used to put advertising messages or content in front of you. No matter what you think Alexa or Google Assistant does when you have not said a wake word, it’s just not happening. (Note: There have been some sensationalist headlines recently about Alexa and Siri quality assurance (QA) workers who have heard things they should not have. They certainly were not using the data for anything nefarious; the company policies around QA were just poorly thought through.)

However, every other device in your world (including Alexa and Google Assistant after you say the wake words) takes whatever behaviors you exhibit and whatever data that can be gathered about you and uses it to make predictions about your behaviors.

So, in practice, everything is “listening” to you. Not humans in rooms with headphones, but rather computers in data centers using AI. The data you create about yourself is being gathered, analyzed, and used all the time – 24/7/365.

What to Do About It

Now comes the hard part. We have to figure out if the benefits of accurate messaging and the convenience of our machines knowing us at the most intimate level are worth the risks. Designer drugs created from our own DNA seem great. Tools that can read our emotions and help us cope with complex issues seem scary, but also great. Custom movies with custom soundtracks created in real time specifically for us individually seem like science fiction, but they are only a few years away. Self-driving cars that know where we are going and how we like to travel seem awesome too.

All of this requires big tech to have unfettered access to our data. Should it? If so, what data? Whose data? In the next few years we are going to have to vote – mostly with our wallets – about this. Our elected officials will have to deal with it too.

Do yourself, your kids and grandkids, and your unborn future descendants a favor: make this a personal priority. Our future depends on it.


Shelly Palmer is Fox 5 New York's On-air Tech Expert (WNYW-TV) and the host of Fox Television's monthly show Shelly Palmer Digital Living. He also hosts United Stations Radio Network's, ...


Comments

Carol W 4 years ago Contributor's comment

Sorry Mr Author, I couldn't disagree more. There is nothing that is great about any of this. The genie is out of the bottle and we now find ourselves the unwilling victims of future technologies we have no control over. Our awareness has come too little too late. Our privacy is no longer ours. And there's no one and no way to stop it. Oh, politicians will try to use it to gain votes, but their empty words will amount to nothing but empty promises and empty results. And don't tell me there's an app that will help my cortisol levels when it was the other 5000 damned apps that got me strung out in the first place. You think you're paranoid now? They're just getting started. You want privacy? Unplug your computer, throw away your phone.

Cheers, Carol, a retronaut in training.

Susan Miller 4 years ago Member's comment

Carol is right. Companies are getting out of control and crossing lines knowing that if they get caught, they'll only get a slap on the wrist. I just read about how when #Google's Street View mapping cars passed by people's homes, they intentionally hacked into their wifi routers and stole users' private data. This was barely in the news, which is shocking! $GOOG $GOOGL

They knowingly stole users' info, kept it for years, and now only have to pay $13 million. Outrageous!

www.cnn.com/.../index.html

Gary Anderson 4 years ago Contributor's comment

I do think data sets can be valuable in specific industries. Or in a call center where time is wasted identifying products and history of the caller. But as a general rule you are right. Susan and Carol, we can go into someone's house and an entire conversation can be recorded without our knowledge. That is just going too far.

Susan Miller 4 years ago Member's comment

Absolutely Gary. I don't understand how there has not been more outrage about #Google. They were found guilty and supposed to expunge all the private data they stole from their users almost 10 years ago. They never did. And they claimed it was an accident, but it later came out that they INTENTIONALLY stole this data.

I think it's no coincidence Google dropped the "Do no evil" from their mission statement.

Adam Reynolds 4 years ago Member's comment

Actually it was "Don't Be Evil" and was in #Google's 'Code of Conduct' since the beginning. Until it was removed last year. I guess their new motto became "Don't be Hypocritical" so the line about not being evil had to be removed.