It seems like everyone is talking about The Social Dilemma at the moment. In truth, most people have probably never heard of it. And yet the perception I have through my own Reality Feed is that this movie is being talked about everywhere. Though in fairness, I believe it’s in the top 10 most-watched things on Netflix this week.
I’ll be honest. When I first heard about it, I had no interest. It didn’t appeal to me because the premise appeared to be going over old ground. Truths about the way we have sold ourselves to technology and social media. And the hype around this notion that we have bought a one-way ticket to the annihilation of humanity. Do I really want to watch that?
But then my friend Jas mentioned he had also heard it was worth checking out. So I opened Netflix. Wow, there it was. Top of the page, full width spread. As if it had been expecting me. 😳
If you’ve not heard of The Social Dilemma, it is a documentary that blows the whistle on the Persuasion Technology industry. It revolves around testimonies, reflections, and laments from some of Silicon Valley’s leaders. Those who have been instrumental in creating the biggest platforms and technological features we all know (and most of us use).
No Great Surprises
There was nothing in the film that surprised me. No great revelations that I hadn’t already been at least partially aware of. But the way The Social Dilemma presents the situation felt pretty fresh, and will hopefully pave the way for some good conversations.
I never intended to write and share thoughts on this, but my morning pages got a little bit overtaken by my own reflections on what I’d watched. Not least because it is another piece in the jigsaw I’ve been putting together in the wake of information burnout and news fatigue that really took hold of me about 4 months ago.
Maybe you’ve been feeling similar. Exhausted by the onslaught of news that makes you simultaneously furious and hopeless. The links, videos, and trending topics you know will take you into a rabbit hole that leaves you feeling dirty and used. And it’s made all the worse by the fact that you know it’s bad for you, but something pulls you in and you just can’t help yourself.
The Social Dilemma is a reminder that this is probably not an accident. We are moving deeper and deeper into a system which requires us to do what it manipulates us into doing.
When I finished the movie I couldn’t help but wonder if I’m being farmed like an animal.
I also began to wonder…is Artificial Intelligence really driving us towards self-obliteration, or does it simply bring into the open the stuff that is already lurking around, beneath, and within me? Is it already woven into the fabric of society, and the technology only makes it more visible?
I have loads of questions. And I’m interested in the power of resources like this to stimulate healthy conversations. I’ve already seen the cynical, the dualistic, and the ideological tear-downs. I have nothing invested in this film. I don’t think hype is the best reaction we can have to it (I’ve seen plenty of that too). It’s unhelpful, not least because it just feeds the very beast we’re looking to expose.
That said, I found a bunch of ideas really interesting to think about and reflect on…
What (or Who) Is The Product?
The idea that, “if you are not paying for it, you’re not the customer; you’re the product being sold” has been around for a while. Of course by definition, if you’re not paying for something, you are not a customer. And we don’t have to look far to see examples of companies collecting data on their users and building accurate profiles to sell to.
For some people this is seen as a positive. Targeted advertising means they only see things they might actually be interested in buying. Gone are the days of being shown irrelevant adverts and posts. Isn’t that something to celebrate?
But Jaron Lanier suggests that the idea that non-paying users are the product, not the customer, is a simplistic assessment that fails to see the whole picture.
“It’s the gradual, slight, imperceptible change in your own behaviour and perception that is the product.” – Jaron Lanier
Oh dear. Yeah, that sounds much more terrifying. And this is where it begins to get murky and confusing. These companies don’t just use our data to sell to our preferences. The Social Dilemma argues that they subtly manipulate our preferences, morphing them over time with shifts so gradual and imperceptible that they creep beyond our field of vision. Like a sleight of hand that we don’t see the magician perform as part of their trick.
The sleight of hand occurs through the use of advanced, closely guarded algorithms. Increasingly intelligent and perceptive readings of what will lure us in and distract us, so that we are more susceptible to perception management, information spreading, and parting with our cash.
I know others have resisted it, but I kind of enjoyed the way Jeff Orlowski personified the algorithms behind the screen of Ben’s phone.
In the fictional elements of The Social Dilemma, the algorithm is brought to life as a committee of three men, working to keep Ben looking, tapping, and scrolling, in between little auctions where they sell the fruits of his behaviour to the highest advertising bidder. It was slightly abrasive in some ways, running the risk of turning the whole thing into a pantomime (which arguably it did).
But I’ll be honest, I found something helpful about this image. Because imagining real people plotting to predict and manipulate your behaviour makes it more fun to try and subvert. It’s increased my own awareness of what I’m doing whenever I open my phone or laptop, and whether I’m being suckered into doing something that has my personal Algorithmic Committee high-fiving and celebrating behind the scenes.
Obviously it doesn’t work like that in ‘reality’. But it’s a potentially useful tool (for some of us) to use as a way to locate some measure of autonomy and control.
I Want The Mundane Details From Those I’ve Chosen to Follow…Not Controversy and Hype
I remember when Twitter moved to an algorithmic timeline in 2016. The experience was permanently altered and my frustrations with the platform grew from there. It hadn’t been perfect for a while. There were other elements that had taken it away from the elegant simplicity of its infancy.
But when you could no longer see what others posted in the order they posted it, everything changed. It became impossible to simply follow the people you’d chosen to follow and see their updates chronologically. The algorithms presented a more hyped-up and divisive timeline. What were people getting outraged by? Who said what to whom? And if you don’t post for a while, Twitter is sure to throw some ‘Recommended For You’ notifications your way, to try prompting you back down the rabbit hole.
Relentlessly Annoying Notifications (bait on the water)
There are notifications you can’t turn off. The little blue number tells you something is happening. But the truth is, nothing is happening. No one has spoken to you; these are baiting attempts to make you bite. We haven’t posted much on the Earlybird Twitter account, and it gets at least 10 new notifications per day. Someone we follow retweeted something by someone we don’t follow. Two people liked a photo of a cat. Or some Twitter news clickbait headline implicates a celebrity in some outrageous act.
Facebook does the same. When I check in from time to time, I’ll invariably be greeted by a notification. “Oooh”, I think. “Someone must be interacting with me”. Nope…they’re just telling me that “your post is getting more engagement than 90% of your other posts”. ‘OK, great, umm thanks?’
There are phantom notifications too. These are not mentioned in The Social Dilemma itself, but it made me wonder if the little red ‘1’ next to ‘Notifications’ on my Facebook page is no accident. Maybe it’s an intentional glitch, designed to make me keep looking for what’s causing it. Or am I just getting paranoid?
Notifications are hypnotic triggers. They both mould and stimulate our brains. And they pull us away from the thing we are focussed on. Before we know it, the ‘just a moment’ check turns into 2 hours of clicking, watching videos, and getting drawn into (and angry at) a debate between people we’ve never met, never followed, and never want to meet or talk to.
Am I A Tool? (OK, don’t answer that)
I’ve always thought of social media platforms and search engines as tools. Things we use for a particular purpose in order to achieve a specific outcome. But as they suggest in The Social Dilemma, a tool is something which sits passively. While it waits to be used, it doesn’t do much else.
But when we pick up our phone, the apps, platforms, and hardware itself may start to use and shape US. What if it is in fact we who are the tool in this equation? Again, this is a slightly simplistic statement to make, because in reality it is a lot more complex than that (we CAN and DO use technology in very effective and intentional ways).
It is, in fact, the subtle way it builds a picture of who we are, based on our behaviour, values, and beliefs (an analysis of how we’ve BEEN USING the tool), that then changes what it wants us to see and do when we NEXT USE the tool.
This Social Dilemma is Bigger Than One Individual…and It’s a Problem For Us All
These concerns are bigger than a single scapegoat we can blame or from whom we might demand change. In many ways the genie is out of the bottle and there is no way to put it back. It’s easy to generalise and catastrophise. And to create a simplistic straw-man monster baddie out of a particular platform, individual, or group. But this is something for all of us to grapple with.
As Tristan Harris says, the technology revolution is both a utopia and a dystopia. You can tap a button and a car will take you where you need to go. And on the same device, someone might struggle with their mental health because their social media post doesn’t get the positive engagement they seek.
The belief in technology as a foundation for positive change in the world is at the heart of the work many of the speakers now do. Though we might question the judgement of those who created features and tools that became centre-pieces of what they now see as a great evil in the world. Just sayin’. There are an awful lot of people in The Social Dilemma suggesting, ‘I invented it in good faith and it was corrupted by money’. And it might be nice to see a little more responsibility taken, and reassurance that they’ve learned some lessons for their next venture…
Mirror That Reflects Us, or Mould That Shapes Us (or maybe a little of both)?
So whatever we think of the technology itself, this poses a problem for all of us. There are implications that we all experience. And it’s no good to simply write it off in various ways. The Social Dilemma argues that the consequences are there for us all to see in the fragmentation of the fabric of our social system and our ways of understanding the world.
It would appear that everyone is literally presented with a different truth and a different reality. And false information spreads six times faster than verified, fact-checked information. Why? Because people find the truth boring.
I’m not sure this is a new thing. There is something in human beings that loves good gossip and a fun conspiracy theory, and has done since way before the invention of the internet. It might be fair to suggest that these algorithms are using our most destructive tendencies (towards ourselves and our communities) against us. We are willing participants because we literally cannot help ourselves.
Technology as an Existential Threat
In The Social Dilemma, Tristan Harris takes us through the mindset that leads us to inaction. I’m just watching cat videos on YouTube and connecting with people on Twitter. I call out BS when I hear it, and highlight people who have outrageous views. My behaviour is hardly an existential threat.
And this is where we might be sleepwalking. Because that judgement is completely right and fair. To call my mundane use of technology an existential threat to humanity and the world seems like a gross over-reaction. But as Harris suggests, the technology has an ability (and perhaps even a purpose) to bring out the worst in society…and “the worst in society is the existential threat”.
In other words, the way we’ve set things up, technology is not bringing out the best in society. And I don’t think there’s really any argument about that. Swathes of social media show a cesspit of bullying, narcissism, trolling, disconnection, and general vileness. But what if this wasn’t inevitable?
What Do The Creators Believe?
You’d think those who designed and built the products would be proud of them and believe in their value for the world, right? Well, perhaps not so much.
Those who build this technology strictly regulate how much they want their families to use the platforms. They know how potent and damaging it can be. And they know how susceptible they are themselves to its addictive power. I couldn’t help but think of the image of the drug lord who would never dream of touching the product they have profited so greatly from producing and distributing to unsuspecting (and trusting) members of the public. And according to those in The Social Dilemma, the main part of their business plan is weaning people onto the product, creating a dependency on it and a belief that life cannot exist beyond it.
And this leads me onto the final point that creates a whole new minefield to (carefully) explore.
Every time I pick up my phone and use one of these ‘persuasive technology’ products, I am subtly exposing myself to being re-shaped. Both physically, through the addictive programming of my brain, and through the immersive drip feed that shows me a mirror whose image isn’t quite what it should be. There is always more on the endless scroll. Another video.
It’s like disappearing into that proverbial rabbit hole. Every time we re-surface, something has altered. But the shift is so small as to be imperceptible.
I’m not sure, therefore, that this can be seen as a black and white issue of personal choice and responsibility. I can take the view that, ‘it’s OK, I know the risks but I’m choosing to do it anyway because I believe the benefits outweigh the costs’. But that reasoning breaks down when we are unaware of how those risks are harming us. And it’s not just about the risks to ourselves, but the risks to society at large. Our actions are no longer just about us. They’re about a potential toxin running through the bloodstream of democracy, which is, by definition, a fragile system needing protection and renewed commitment at the best of times.
Regulation for the Animals?
We have regulation in pretty much every legal industry. The reason being: so that individuals and companies can’t get away with harming people or the environment in which they operate.
What’s different here is that regulation is usually in place to protect customers. And as we know from earlier, the customers are the advertisers.
So what about protection for us? What are we again? The product. How are we protected from that “gradual, slight, imperceptible change in our own behaviour and perception”? Can we be? At the end of the day, if that change isn’t occurring, then the customer won’t be interested and the industry shrinks. Who in power is going to choose that route?
The cost is our mental and physical health. It’s our belief in a better world, and even our ability to imagine a world beyond this one. As well as many of the freedoms we take for granted.
Do we need agricultural style regulation? To make the farming methods more humane and less intrusive? What would that look like?
Can the farmers regulate themselves? Is it fair to expect that of them? There are perhaps issues with what is being passed off as healthy feed right now. Maybe that needs regulating.
What goes into the food that turns us into the most valuable product? Outrage, hype, division, the extremities, gossip, hostility, name-calling, public shaming etc. We love it (in the sense that we are attached to it), but it’s really bad for us. It makes us grow fast and tall, and it’s really easy and cheap to produce en masse. So as long as the harvest is ripe, the customers are loving it.
Is It Really That Bad?
What are we able to do to protect ourselves? What can we change in the systems themselves? And how can we raise awareness in ourselves and others, of the ways we are being manipulated into shifting our perception for the nefarious purposes of advertisers?
Will we ever recognise that what we think is our free-thinking, unshackled ability to see the truth is actually our own self-directed foray into an algorithmic rabbit hole, producing a caricature we believe to be nothing but 100% real?
I mean, I can easily tell others how they’re the product of this machine. Especially if they spout views I find ridiculous. But if I can’t see it in myself, then perhaps that’s why and how it works so seamlessly and perfectly. And I don’t know if you’d agree, but I find it really hard to see it in myself. Maybe that’s why I’m dwelling on this so deeply. It’s a quest to find how I’ve been (and am being) changed by these devices, platforms, and apps.
Over to You
Have you seen The Social Dilemma? What impact has it had on you? I’d love to read your response in the comments below.