I joined Twitter in the apparently halcyon days of 2009, before Brexit, Sandy Hook denial, Covid-19 conspiracy-mongering, and the livestreaming of police brutality. At that time, it felt like a school playground: you larked about with like-minded individuals, made charming acquaintances and laughed at the antics of the resident show-offs. Maybe, for someone, somewhere, that version of social media still exists. But probably not. Anyone who has ignored the advice of the smugly offline to “never tweet” is aware that a successful afternoon on social media these days is one in which you somehow manage to evade harassment, racism, misogyny, videos of atrocities, or a distant family member’s radicalised rant about, say, the wokification of Waitrose.
Wading through digital sewage is the upfront cost of using these sites. Less obviously, we pay with our attention and creativity, freely providing the content that expands the fortunes of their founders. And yet social media remains an alluring prospect, especially for the lonely, the disenfranchised, the frustrated and those who feel alienated from society. It offers a semblance of community, somewhere to belong, the impression of followers who appear to care about you, and, most compellingly, a place where your views can be validated and reinforced.
In The Chaos Machine, New York Times reporter Max Fisher attempts to chart the development of these familiar and contradictory forces from the time of Facebook’s launch in 2004. Since then, the site has expanded from a dorm-room project for rating the attractiveness of female students to the world’s third most visited website, with the unregulated power to move fringe conspiracy theories toward the mainstream, elect governments on the back of misinformation and even, according to UN human rights experts, play a “determining role” in genocide in Myanmar.
Fisher has enjoyed more access than most. In 2018 he received a stash of documents from a Facebook contractor-turned-whistleblower (named Jacob, in the book) that purported to reveal the inadequacy of the social network’s moderation policies. Facebook duly invited Fisher to its offices to sit in on high-level meetings. This level of insight, he writes, left him alternating “between sympathy for and skepticism of Facebook’s policy overlords”.
Inevitably, the company – and others like it – claims that the patterns of radicalisation and abuse predate social media. Technology, it argues, has merely reduced “friction” in communication, allowing messages to propagate more widely. Clearly, the propensity to make snap judgments based on incomplete data, and to join like-minded mobs when pricked by outrage, are general human flaws. But this is something else. Fisher shows how social media algorithms and design “deliberately shape our experiences”, exerting “such a powerful pull on our psychology and our identity that it changes how we think, behave and relate to one another”.
He quotes Facebook’s own researchers as saying “our algorithms exploit the human brain’s attraction to divisiveness”, leveraging that flaw to “gain user attention and increase time on the platform”. Twitter and Facebook are engineered in ways that “supercharge identity into a matter of totalising and existential conflict” – an idea familiar to anyone who browsed their feeds in the months leading up to the Brexit referendum.
In one sense this is a contemporary retelling of the myth of Narcissus. Social media provides the mirror in which we see our ideas and preferences algorithmically reflected. As these beliefs are reinforced, we fall increasingly in love with that reflection until some previously trivial thought or prejudice becomes a defining element of our identity. Simultaneously, we are not built for the omniscience social media affords, making us party to every tragedy and triumph across the world in real time. Fisher likens the platforms to the cigarette manufacturers of the 60s, claiming not to understand why people might be concerned about the impact of their products. At some point we’ll look back on these days in bewilderment.