By Adam Ashby Gibbard

You might be sitting here now reading The Leveller, a one-size-fits-all piece of media. But soon you will pick up your phone, open Facebook and receive a kind of personalised digital newspaper that is more up to date than The Leveller – even when it’s hot off the press.

I’d wager that the content on Facebook isn’t anywhere near the calibre of this fine paper collection. Touting quality is a losing battle though, when faced with a service that intimately knows who you are and caters to it. That sentence makes me feel dirty, but Facebook’s dirtiness is like eating a whole box of cookies in one sitting – it’s still tasty.

Facebook is a social media platform built on an algorithm. This is where the catering comes in – but not like catering at a nice wedding. This catering is done to keep your attention for as long as possible, so advertisers have more time to show you things.

Facebook sells the attention of its vast membership of 2.2 billion monthly active users – almost 29 per cent of the entire world – to whoever wants to pay for access. In the process, it reaps mountains of cash. In 2017 Facebook had a total revenue of just over $40B with 98 per cent of that coming from advertising.

While this is insidious on its own, it gets much worse. The algorithm that gets to know you better than your best friend is built to maximise profits. Roger McNamee, an early Facebook investor, pointed out that the algorithms are built to keep your attention “by sucking up and analyzing your data, using it to predict what will cause you to react most strongly, and then giving you more of that.”

It also comes packaged with a purpose-built Pavlovian dog notification system, telling you all the time about inane news like that time Yada Yada liked your post on “The Best Way to Butter Toast.” This only serves to pull you back in and away from actually living.

The only way The Leveller could possibly compete for your attention would be if we infused crack onto the surface of the pages. (We have not, BTW.)

Facebook’s algorithmic advertising is amazing and that’s no hyperbole – unlike the ‘amazing’ video of a kitten trying to get out of a teacup I watched yesterday. With Facebook’s tools anyone can advertise something to a very specific group based on demographics, interests and location.

If I wanted to push my services as a Sandwich Consultant – because people are eating terrible sandwiches – I could then target middle-aged people who make over $50,000 a year, are interested in sandwiches and live in downtown Ottawa. Whether or not my business would take off is questionable, but I would likely reach about 1,000 people for $20.

Nowhere else can advertising money be spent to target potential customers so specifically. Facebook is able to derive who such people are based on users’ supplied information, their activities on the platform and the sites that they visit elsewhere on the internet.

Basically every click you make gives them more info. And that’s the ultimate cost of their service – the sale of your identity.

Next time you see ‘sponsored’ content, think about how you were specifically targeted. It’s not by chance that you keep seeing ads for shitty beer if you’re in your early 20s, or the Facebook pages of bars in or around a university.

Facebook isn’t shy about what it knows about you either. If you see sponsored content, try clicking the “Why am I seeing this?” button and they’ll tell you. Click on “Manage Your Ad Preferences” and you’ll be treated to a buffet of things they know about you. You may even learn something about yourself – I had no idea I was interested in pelicans, for example!

This is the next step of capitalism: personal information as value. It’s big data finding yet another way to invade your personal life for profit. What’s more, people seem happy to give up their very personal information for a free social media service. And why? Because it’s more and more becoming a social requirement – and because it’s addictive.

Sean Parker, Facebook’s founding president, told Axios that Facebook was designed around “exploiting a vulnerability in human psychology.” Getting likes and liking others, seeing people’s posts and posting, getting messages and sending messages – these all play your dopamine levels like some viral toddler piano virtuoso.

Dopamine, of course, is the happy syrup your body dishes out to regulate rewards-based behaviour, and the main hormone involved in addiction.

“A little dopamine hit every once in a while, because someone liked or commented on a photo or a post or whatever… get[s] you to contribute more content, and that’s going to get you…more likes and comments,” Parker said. He calls this a social-validation feedback loop.

It’s an addictive loop, but it’s not controlled by the government in any way. The tobacco industry knew smoking was addictive and preyed on that for profit. The opioid crisis is under similar scrutiny now. With almost all harmfully addictive things there has been education, control and legislation. But is Facebook actually harmful?

Former Facebook vice-president for user growth, Chamath Palihapitiya, recently said that “the short-term, dopamine-driven feedback loops that we have created are destroying how society works. No civil discourse, no cooperation, misinformation, mistruth.”

In a surprise move Facebook then came out saying that use of social media could cause mental health issues, but that such issues came from passive use of the platform. Active use is apparently fine and even beneficial, a handy finding for their profit margin.

But none of this addresses issues surrounding social decay – and especially around other mental health issues experienced by younger generations who have grown up with the platform. Facebook recently came out with a kids’ version of Messenger, so now they are targeting young children too.

Now consider all the algorithms, social media dopamine fixes, phantom vibrations and posts about toast and add some political messaging. Welcome to the 2016 U.S. election, the Brexit referendum and government propaganda in Myanmar, Cambodia, the Philippines and China.

Never mind actual targeted political messaging – remember that Facebook relies on being an attention hog. And nothing hogs more attention than base emotional response.

The 2017 word of the year was ‘fake news,’ a term often used to incite anger at the media – and also a label for actual fabricated news used to incite anger and fear, base emotions at their lowest.

It’s scary to think that anyone with the money could systematically target very attentive groups in a particular demographic with propaganda, or really any message they like. This brings us to the darker side of Facebook, where self-affirming echo chambers operate, liberated from the burden of differing perspectives – sometimes to terrifying effect:

  • Facebook has helped anti-Rohingya misinformation and hate speech go viral in Myanmar, feeding what the United Nations has called a textbook case of ethnic cleansing.
  • China regularly uses Facebook advertising to spread word on how great the country is… not to their own people of course, as the platform is banned.
  • The ex-leader of the opposition in Cambodia, now in exile, is suing Facebook in California to find the extent to which the company colluded with the Prime Minister, who presently has more likes on his all-Khmer language page than there are Cambodians on Facebook.
  • Russian involvement through Facebook has potentially been responsible for giving us Trump by swaying the 2016 U.S. election – and could have been involved in tipping the Brexit referendum.

Facebook has said that it’s working hard to build trust in the sponsored content you see and that it wants to bring Facebook back to its roots as a social media site.

But we have to remember that Facebook is first and foremost an advertising platform. As transparent as they try to make themselves, what they truly are and what they really do isn’t the main talking point. What do we do, especially when we see what kind of damage it can create when abused by anyone with a political agenda?

Maybe it’s time we logged out and threw away the key.

This article first appeared in the Leveller Vol. 10, No. 5 (Feb/Mar 2018).