May 15 2019
Google Has ‘VOODOO Doll Avatar-Like Version of YOU’ – Former Tech Giant’s Design Ethicist
Tristan Harris, a former Google design ethicist, says that contrary to popular belief, those eerily hyper-targeted ads for the very product you just talked about aren’t popping up because your phone’s mic is spying on you.
Instead, Google and Facebook have gathered so much data about you and your habits that these corporations can simulate what Harris calls a “little voodoo doll, avatar-like version” of people in order to predict which ads they might click or products they might buy, according to the Australian Broadcasting Corporation.
It’s a creepy image — giant companies using your online behavior to reconstruct your personality and interests from the ground up — that reveals the futility of digital privacy in the era of big data.
During his time at Google, Harris, who has since founded the Center for Humane Technology, was responsible for “ethically” influencing the thoughts of Google users, the ABC writes.
But based on his comments on Tuesday at a town hall conference about the impact of technology on society, it sounds like Google is still heavily invested in figuring out how to predict our behavior.
“I don’t have to listen to your conversations because I’ve accumulated all the… clicks and likes you’ve ever made, and it makes this voodoo doll act more and more like you,” Harris said on the topic of extremely hyper-targeted advertisements. “All I have to do is simulate what conversation the voodoo doll is having, and I know the conversation you just had without having to listen to the microphone.”
That’s right: a voodoo doll made up of your clicked links, location, likes, demographic information and other digital hair clippings is babbling away in a server somewhere – and it’s so lifelike it’s actually mimicking your conversations.
… and hacking your attention span
Tristan’s job at Google was, as he puts it, “studying how do you ethically influence two billion people’s thoughts?” by considering how apps hack into people’s psychology (the dopamine rush of an Instagram like), or the way incessant Gmail notifications can induce a state of anxiety.
Big tech companies, he argued in 2013, while employed at Google, were abusing their users by stealing their time.
… to draw you deeper
Tristan uses the example of watching one video on YouTube, then going down a rabbit hole and snapping out of the trance hours later.
“You’re like what the hell just happened to me?” he said.
“It’s because the same moment you hit play it wakes up an avatar voodoo doll version of you – it has one of these for one out of four people on Earth – and it knows exactly what video to play next because it simulates on that voodoo doll.
“[The YouTube algorithm] asks, if I tested these 100 million variations of videos which one would cause you to stay the longest?”
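The mechanism Harris describes is essentially an argmax over simulated outcomes: score every candidate video against a model of the user, then serve whichever one is predicted to hold attention longest. A minimal toy sketch of that idea (every name and scoring rule here is hypothetical and purely illustrative, not YouTube's actual system):

```python
# Toy "avatar": a stand-in user model built from accumulated behavior.
# All functions and fields below are invented for illustration only.

def predicted_watch_seconds(user_profile, video):
    """Pretend predictive model: score one candidate for this user."""
    interest_overlap = len(user_profile["interests"] & video["tags"])
    return interest_overlap * 120 + video["base_appeal"]

def pick_next_video(user_profile, candidates):
    """Simulate every candidate on the avatar and keep the one
    predicted to make the user stay the longest (the argmax)."""
    return max(candidates,
               key=lambda v: predicted_watch_seconds(user_profile, v))

user = {"interests": {"history", "space"}}
candidates = [
    {"title": "Cooking basics", "tags": {"food"}, "base_appeal": 30},
    {"title": "Moon landing deep dive", "tags": {"space", "history"},
     "base_appeal": 20},
]
print(pick_next_video(user, candidates)["title"])
# → Moon landing deep dive
```

The point of the sketch is the shape of the loop, not the scoring: a real recommender would rank vast candidate pools with a learned model, but the incentive is the same — maximize predicted watch time.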
Now consider that 70 per cent of YouTube’s traffic is driven by the algorithm, and people spend about 60 minutes a day on average on the platform.
With a billion users, that means about 700 million hours a day of human attention is being determined by a computer.
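Those round numbers do multiply out to the figure quoted. A quick sanity check of the arithmetic:

```python
# Back-of-the-envelope check using the article's round numbers.
users = 1_000_000_000      # "a billion users"
minutes_per_day = 60       # average daily time on the platform
algorithm_share = 0.70     # share of traffic driven by recommendations

total_hours = users * minutes_per_day / 60      # 1 billion hours/day
algorithm_hours = total_hours * algorithm_share
print(f"{algorithm_hours:,.0f} hours/day")
# → 700,000,000 hours/day
```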
In effect, the algorithm takes control of what people are thinking and feeling.
The same is true for any social media or tech platform that makes money by keeping users actively engaged; there is an incentive to hold their attention, and the best way of doing this is through sensational, hyperbolic content.
Tristan also uses the example of Facebook groups.