Technology is causing a set of seemingly disconnected things: shortening of attention spans, polarization, outrage-ification of culture, mass narcissism, election engineering, addiction to technology.
Tristan Harris
Technology steers what 2 billion people are thinking and believing every day. It's possibly the largest source of influence over 2 billion people's thoughts that has ever been created. Religions and governments don't have that much influence over people's daily thoughts.
Tech companies are distracting, dividing and outraging citizens to the point where there is little basis for common ground. This is a direct threat to democracy.
There's nothing in your life or in our collective problems that does not require our ability to put our attention on what we care about. At the end of our lives, all we have is our attention and our time.
Information that confirms our beliefs makes us feel good; information that challenges our beliefs doesn't.
With its onslaught of never-ending choices, never-ending supply of relationships and obligations, the attention economy bulldozes the natural shape of our physical and psychological limits and turns impulses into bad habits.
None of the most powerful tech companies answers to what's best for people, only to what's best for them.
The most important problems we face are complex, and require sustained attention. But we don't speak in terms of nuance or complexity. Is that by accident? It's because our minds have been entrained to expect shorter and shorter bite-sized bits.
For every design goal you have, you have to have a corresponding measurement to know how you're doing - a way of measuring success.
You're either on, and you're connected and distracted all the time, or you're off, but then you're wondering, am I missing something important? In other words, you're either distracted or you have fear of missing out.
We're going to need a new social contract with the tech world, one that asks for consent, and one with transparent goals. Right now, the goals of technology are not aligned with our goals as humans. We need technology that empowers us to make the life choices we want to make.
Magicians start by looking for blind spots, edges, vulnerabilities and limits of people's perception, so they can influence what people do without them even realizing it. Once you know how to push people's buttons, you can play them like a piano.
If you're an app, how do you keep people hooked? Turn yourself into a slot machine.
We continue to have this illusion that things outside of us aren't driving what we think and believe, when in fact so much of what we spend our attention on is driven by decisions of thousands of engineers and product designers.
We're all vulnerable to social approval. The need to belong, to be approved or appreciated by our peers is among the highest human motivations. But now our social approval is in the hands of tech companies.
While nations protect their physical borders, tech platforms leave digital borders wide open.
I'm an expert on how technology hijacks our psychological vulnerabilities. That's why I spent the last three years as a Design Ethicist at Google caring about how to design things in a way that defends a billion people's minds from getting hijacked.
Traditional companies have been subject to licensing for many years. Attention utilities should be required to obey limits on data extraction and message amplification practices that drive polarization, and should be required to protect children. We should ban or limit microtargeting of advertising, recommendations and other behavioural nudges.
I was a design ethicist at Google, where I studied how to ethically steer people's thoughts. Because what we don't talk about is how a handful of people working at a handful of technology companies will, through their choices, steer what a billion people are thinking today.
New technologies always reshape society, and it's always tempting to worry about them solely for this reason.
If one app or news site or friend gets your attention, that means something or someone else loses it. It comes out of our sleep, our time with family or our reflective time with ourselves.
Most notifications you get are because a machine is trying to get your attention. Those notifications aren't built to help you live your life. They're built to get your attention.
We want to have a relationship with technology that gives us back choice about how we spend time with it, and we're going to need help from designers, because knowing this stuff doesn't help. We're going to need design help.
Yes, online privacy is a real problem that needs to be addressed. But even the best privacy laws are only as effective as our Paleolithic emotions are resistant to the seductions of technology.
If, at any moment, reality gets dull or boring, our phone offers something more pleasurable, more productive and even more educational than whatever reality gives us.
Once you start understanding that your mind can be scheduled into having little thoughts or little blocks of time that you didn't choose, wouldn't we want to use that understanding to protect against the way that happens?
I spend a lot of my time thinking about how to spend my time. Probably too much - I probably obsess over it. My friends think I do. But I feel like I kind of have to, because these days, it feels like little bits of my time kind of slip away from me, and when that happens, it feels like parts of my life are slipping away.
Every year, more and more friends, apps, media or news stories want our attention. We need a better way to organize all the kinds of choices we have.
I actually worry that we're so mindlessly following the herd in treating privacy and data as the principal concerns, when the actual things affecting the felt sense of your life are where your time goes, where your attention goes, where democracy goes, where teen mental health goes, and where outrage goes.
Any company whose business model is advertising, or engagement-based advertising, meaning it cares about the amount of time someone spends on the product, makes more money the more time people spend.
Our online news feeds aggregate all of the world's pain and cruelty, dragging our brains into a kind of learned helplessness. Technology that provides us with near-complete knowledge without a commensurate level of agency isn't humane.
The EU can lead the world toward more humane technology. But doing so requires thinking more broadly about reining in social media platforms to prevent them from degrading our democracies.
Apps or media that make money on advertising are never satisfied with 'enough' of your attention. They will always fight for more.
With our Paleolithic instincts, we're simply unable to resist technology's gifts. But this doesn't just compromise our privacy. It also compromises our ability to take collective action.
I noticed when I was at Stanford that there was a class on persuasive technology design, a whole lab that teaches students how to apply principles of persuasive psychology to technology to persuade people to use products in a certain way.
Ergonomics is about designing for failure modes and extremes: how things break under repetition, stress or other limits. And the goal of ergonomic design is to create an alignment between the user's limits, the thing you're designing, and how people will ideally use that thing.
YouTube has a hundred engineers who are trying to get the perfect next video to play automatically. And their techniques are only going to get more and more perfect over time, and we will have to resist the perfect.
I'm not against technology.
If we really wanted to have a reorientation of the tech industry toward what's best for people, then we would ask the second question, which is, what would be the most time well spent for the thing that people are trying to get out of that situation?
The only form of ethical persuasion that exists is when the goals of the persuader are aligned with the goals of the persuadee.
Our brains are already bad at seeing exponential curves.
While you can use muscle-memory to unconsciously move your thumb to open an app without thinking, it's actually impossible to type on a keyboard unconsciously.