In this pivotal year of elections, where the impact of who we elect will shape the trajectory of our collective and interconnected futures for many years to come, we must be aware of the subtle and not-so-subtle influences on how we behave, perceive each other, and vote. One important factor that consumers of information must take note of is the way disinformation, microtargeting, and social media have not only enhanced our lives but also polarised our societies.
In 2024, an estimated four billion people will go to the polls to elect leaders of their countries, in what has been deemed a global year for democracy. Over 60 countries, including El Salvador, the UK, Rwanda, Russia, Taiwan, Indonesia, India, South Africa, and the US, will host elections - some will be free and fair, others will be deeply compromised.
All of these elections, however, will be consequential.
On the global ballot, our votes will determine whether we can arrest and contain the climate crisis, whether the warmongers and their profiteers will be allowed to upend international human rights frameworks, and whether our democracies are actually resilient enough to manage the rise of right-wing authoritarianism. Complicating and contributing to all of these issues is the rampant use of disinformation and political microtargeting on social media, which has played a profound role in polarising our societies.
Can you hear me from your echo chamber?
Social media has undoubtedly revolutionised the way that we communicate, engage, think, and perceive each other and the world around us. It plays a role in mobilising public views and political opinions, and has democratised the dissemination of information, allowing for a more inclusive and participatory political environment in which politicians can connect directly with the electorate.
However, the days of naively trusting in the benevolent powers of the tech industry and its messiahs are thankfully over. Since 2016, evidence has steadily been growing - based on experiences from elections in the US, India, Brazil, the UK, and the Philippines - that, if left unregulated, social media can have a destabilising impact on the very tenets of our democracies. We have moved from the principle that “information is power” to the present moment, where disinformation is power - those who control and manage the information ecosystem influence how people vote, behave, interact, and perceive the world.
Disinformation is not a new phenomenon - it has been around for centuries. What is new, however, are the algorithms deployed on social media, where users, often without their knowledge or understanding, are shown specific content that they are likely to find agreeable based on their tastes, preferences, and biases. These echo chambers that we find ourselves in are reinforcing our existing beliefs (right or wrong!), isolating us from diverse perspectives, and further exacerbating polarisation in both the virtual and real worlds.
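To make the mechanism concrete, the toy sketch below (in Python, with invented posts, topics, and interests - it is not any platform’s actual ranking code) shows how a feed that ranks content purely by predicted agreement, and then learns from what it has just shown, keeps narrowing what a user sees.

```python
# A deliberately simplified illustration of engagement-driven ranking.
# The data and scoring rule are invented; real recommender systems use
# machine learning over far richer behavioural signals.

def affinity(post, interests):
    """Score a post by how many of its topics the user already engages with."""
    return len(post["topics"] & interests)

def build_feed(posts, interests, size=2):
    """Keep only the posts the user is most likely to agree with."""
    return sorted(posts, key=lambda p: affinity(p, interests), reverse=True)[:size]

def reinforce(interests, feed):
    """Engaging with the feed folds its topics back into the user's profile,
    so the next ranking skews even further towards the same views."""
    for post in feed:
        interests |= post["topics"]
    return interests

posts = [
    {"id": 1, "topics": {"immigration", "crime"}},
    {"id": 2, "topics": {"climate", "energy"}},
    {"id": 3, "topics": {"immigration", "borders"}},
    {"id": 4, "topics": {"healthcare"}},
]

interests = {"immigration"}
for round_number in range(3):
    feed = build_feed(posts, interests)
    interests = reinforce(interests, feed)
    print(round_number, [p["id"] for p in feed], sorted(interests))
```

Even in this crude form, once the initial preference has been reinforced the loop never again surfaces the climate or healthcare posts - a caricature of the echo-chamber effect described above.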
Microtargeting - The Invisible Hand
Added to the phenomenon of disinformation is the use of microtargeting. Microtargeting involves analysing vast amounts of data across social media and the internet to identify potential voters and tailor messages specifically designed to resonate with them. It is a powerful tool currently being used by political campaigns across the world to persuade voters directly, based on their preferences, habits, and ideologies.
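For readers who want a sense of what this looks like mechanically, the sketch below is a deliberately crude, hypothetical example - the voter profiles, field names, and message are invented, and it is not any campaign’s real data or tooling - of the basic pattern: filter a large pool of profiles down to a narrow segment and attach a message written for that segment.

```python
# A toy sketch of microtargeting in principle: select a narrow segment of
# profiles using a targeting rule, then deliver a message tailored to it.
# All profiles and attributes below are fabricated for illustration.

voters = [
    {"id": 1, "age": 22, "region": "urban", "concerns": {"student_debt"}},
    {"id": 2, "age": 67, "region": "rural", "concerns": {"pensions"}},
    {"id": 3, "age": 31, "region": "urban", "concerns": {"housing", "jobs"}},
]

def segment(profiles, predicate):
    """Keep only the profiles that satisfy the targeting rule."""
    return [p for p in profiles if predicate(p)]

# Targeting rule: young urban voters worried about debt or housing.
young_urban = segment(
    voters,
    lambda p: p["age"] < 35
    and p["region"] == "urban"
    and p["concerns"] & {"student_debt", "housing"},
)

for voter in young_urban:
    print(f"Send voter {voter['id']} the tailored housing-and-debt message")
```

Real campaigns do the same thing at vastly greater scale, with behavioural data bought or harvested from many sources rather than a handful of hand-written records.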
Barack Obama’s campaign team used this approach successfully in both 2008 and 2012, reaching young people in the US directly and targeting them with messages on the issues they were increasingly concerned about - healthcare, education, and student debt. At the time, this type of campaigning was hailed as innovative and impressive, using the direct power of social media to increase voter turnout.
Fast forward to 2016, when both the Trump campaign and the Brexit Vote Leave campaign used the same tactics as Obama, though they deployed significantly more sinister strategies, using disinformation to microtarget voters - appealing to their innate fears, biases, and belief systems - without users’ express knowledge that they were being intentionally targeted based on demographic and behavioural data that social media companies collected and sold to third parties.
While microtargeting can seem benign in some instances - users being served specific adverts based on their shopping, food, commercial, or lifestyle preferences - it is far more dangerous when users are targeted politically and socially, to influence the way they vote or don’t vote, based on their scrolling habits and engagement behaviours.
A New CODE
As a social justice activist with a keen interest in data and technology, I’ve long been interested in how rapidly our societies and behaviours are shifting because of the little tech devices that we carry around with us. Seeing how Donald Trump, Narendra Modi, Jair Bolsonaro, and Rodrigo Duterte all rose to power using highly targeted digital campaign strategies was a turning point for many.
Our world, elections, and democracies would probably never be the same again, because social media companies allowed their platforms, services, and users’ data to be used and manipulated by the world’s most dangerous despots. Furthermore, our ever-increasing reliance on digital technologies is shaping the way that we act, connect, transact, and interact. Those who write the code behind our screens write the rules by which the rest of us live. Software engineers are becoming the social architects of our time, framing the way that we see ourselves and each other.
For these reasons, and with the belief that activism can be pivotal in the digital domain, I have established the Campaign On Digital Ethics (CODE) - a non-profit organisation aimed at improving your digital rights and shaping a digital future based on human rights frameworks. At CODE, our strategy is simple: improve the digital literacy of internet users, and advocate for ethical and legal frameworks in the development and deployment of algorithms and AI.
It may seem untenable for a small NGO in South Africa to take on the giants of the big tech industry, but by working directly with you - a user and consumer of social media and digital technology - we can revolutionise the digital space, making it safer, more transparent, and more accountable.
In this pivotal year of elections, where the impact of who we elect will shape the trajectory of our collective and interconnected futures for many years to come, we have to be aware of the subtle and not-so-subtle influences on how we behave, perceive each other, and vote. The digital landscape, with its vast potential to both empower and mislead, is crucial to this dynamic.
As users of digital technology and online platforms, it is incumbent on us to critically evaluate the information we consume and to understand the mechanisms that determine why we are shown particular types of messages and content.
We will never be able to roll back the technological advancements of our times, so in that spirit we should insist that technology, social media, algorithms, and artificial intelligence be built to enhance, rather than undermine, our humanity and democratic values.
Kavisha Pillay is a social justice activist and the executive director of the Campaign On Digital Ethics (CODE).