
elec·tion

noun /i-ˈlek-shən/

▶︎ an act or process of electing

▶︎ predestination to eternal life

▶︎ the right, power, or privilege of making a choice

AI in U.S. election campaigns

Introduction: The Candidates 

Presidential candidates Trump and Biden are polar opposites in many respects, but their campaigns share common ground that, surprisingly, is rarely discussed by media pundits. The 2020 presidential election could be decided by vast data-gathering efforts on both sides, efforts that have harnessed the dangerous capabilities of communication tools powered by artificial intelligence (AI).

Chatbots

One such AI-powered tool is the chatbot, short for “chat robot,” a program designed to simulate and engage in conversations with voters, informed and guided by everything worth knowing about each voter’s life and profile. By leveraging algorithms and natural language processing (NLP) to microtarget voters, campaigns continuously enhance and fine-tune these conversations through automated Big Data integrations that capture voters’ opinions, moods, likes, and dislikes. It is surprising that AI-enabled microtargeting by political campaigns, involving millions of potential voters in 2020, has not gained more visibility in recent media coverage.

  

Chatbots and other AI tools now used for political microtargeting were developed over the years by tech and marketing companies to acquire and retain customers and to build brand and product loyalty. The digital advertising ecosystem is therefore a perfect toolset for altering negative views of candidates and inducing more positive ones. In addition, these tools connect voters to online communities of candidate supporters, where opinions and perspectives are shared and reinforced. Chatbots can even act as virtual “coaches,” mentoring voters or communities of voters so they become more effective advocates for candidates and help achieve political-organizing goals.


Candidate Apps

 

Each candidate has launched his own app with the goal of raising money and awareness, but each uses the technology differently. Joe Biden’s campaign app, called “Team Joe,” is a viral messaging system that prompts new users to share their contact lists. Each list is then instantly cross-referenced with the Democratic Party’s voter files. The software analyzes the contact list and flags specific individuals as prime campaign targets; those individuals then receive personalized messages from the app user promoting the candidate and encouraging them to download the app in turn. President Trump’s campaign app, on the other hand, focuses mainly on pushing out policy messages and gathering as much data from as many potential voters as possible. An AI program analyzes this data and delivers “News” and “Social” tabs containing AI-enabled personalized tweets and stories supporting the President’s campaign and policies.
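
For readers curious about the mechanics, a rough sketch of the contact-list matching idea described above might look like the short Python example below. Every detail in it (the voter-file fields, the scoring rule, the phone numbers) is an invented illustration, not the actual code behind “Team Joe” or any other campaign app.

from dataclasses import dataclass

@dataclass
class VoterRecord:
    phone: str
    party: str            # assumed affiliation code: "D", "R", or "I"
    turnout_score: float  # assumed 0-1 likelihood of voting

# A made-up voter file keyed by phone number.
VOTER_FILE = {
    "555-0101": VoterRecord("555-0101", "I", 0.45),
    "555-0102": VoterRecord("555-0102", "D", 0.90),
    "555-0103": VoterRecord("555-0103", "R", 0.80),
}

def flag_targets(contact_list, voter_file):
    """Return contacts worth a personalized message: matched voters who look
    persuadable (independents) or are supporters unlikely to turn out."""
    targets = []
    for phone in contact_list:
        record = voter_file.get(phone)
        if record is None:
            continue  # contact does not match anyone in the voter file
        persuadable = record.party == "I"
        low_turnout_supporter = record.party == "D" and record.turnout_score < 0.6
        if persuadable or low_turnout_supporter:
            targets.append(phone)
    return targets

print(flag_targets(["555-0101", "555-0102", "555-0104"], VOTER_FILE))  # ['555-0101']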

 

Microtargeting

 

Voter microtargeting is not new. It has been used in national political campaigns since 2004. The tools and techniques were comparatively simple back then, but microtargeting enabled the Bush campaign to reach a remarkably large percentage of likely supporters who did, in fact, turn out to vote for him. Big Data microtargeting was also used by the Obama campaign in 2012, though without any post-election analysis of its significance or benefits. It was not until the 2016 presidential campaign, and the use of Cambridge Analytica by Ted Cruz and Donald Trump, that microtargeting became far more publicly visible and controversial, yet once again without conclusive evidence of its actual value for those or future campaigns.

 

Looking ahead, political parties and their candidates will increasingly rely on Big Data, AI, machine learning, and analytics to connect and engage with voters. Software that analyzes voters’ online behavior, data consumption, social media patterns, and a host of other factors enables the creation of unique psychographic and behavioral voter profiles. Every voter can then be shown a different version of each candidate, one aligned with key aspects of that voter’s personal history and psychographic profile. Automated social media bots can be designed to motivate uncertain or wavering voters, or those who are nervous about a candidate’s ideology. Thus, microtargeted campaigns can be tailored to each voter’s unique profile.
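
A minimal sketch can make this concrete. The toy Python example below reduces a voter’s browsing and social activity to a couple of invented “traits” and then picks the message variant that matches. Real campaign systems are vastly more sophisticated; the traits, thresholds, and messages here are purely illustrative assumptions.

def build_profile(pages_visited, posts_liked):
    """Collapse raw behavior into a few crude, invented traits."""
    return {
        "economy_focused": sum("economy" in p for p in pages_visited) >= 2,
        "anxious_tone": any("worried" in p.lower() for p in posts_liked),
    }

def pick_message(profile):
    """Choose the message variant matched to the inferred traits."""
    if profile["economy_focused"]:
        return "Candidate X's plan protects jobs in your area."
    if profile["anxious_tone"]:
        return "Candidate X offers steady, experienced leadership."
    return "Candidate X: a fresh voice for your community."

visits = ["news/economy", "blog/economy-outlook", "sports/scores"]
likes = ["So worried about the future", "Nice weather today"]
print(pick_message(build_profile(visits, likes)))  # prints the economy-focused message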

 

Warnings

 

Foreign Interference

 

While Russian cyber-interventions in U.S. presidential elections are not a new threat to voters’ privacy and security, it bears mentioning that if these technologies are available to presidential candidates and advertisers, then they are certainly available to foreign entities as well. Most of us have heard plenty about Russian hackers (referred to by cyber researchers as “Fancy Bear”) breaking into the campaign of former presidential candidate Hillary Clinton and leaking the emails of her staff. According to the U.S. intelligence community, these attacks included phishing, a hacking method that seeks to trick users into disclosing passwords.

 

However, even more subtle than outright hacking are the pervasive efforts by foreign entities to leverage the digital advertising and social-media landscapes to influence voters’ opinions. Recently, these foreign efforts have consistently favored Donald Trump, but regardless of our individual opinion of him as a candidate, it should universally concern us that countries like Russia, China, and North Korea are one shell company away from having access to most of the opinion-swaying tools available to Super PACs and advertisers.

Everybody’s a Target

 

The key takeaway here is that awareness is critical. All voters of any political persuasion, in the U.S. and elsewhere, need to know these facts: they are campaign targets by county, ZIP code, town, and neighborhood; every click they make on a computer or mobile device, and every credit-card payment, can end up in an AI-enabled “voter persuasion dashboard” that is updated 24/7; data-driven online advertising can target them based on IP and browser-cookie data, online and offline purchase data, and third-party data points; and sooner or later they may encounter chatbot interactions customized to their personal profiles.
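
To illustrate what a “voter persuasion dashboard” might hold, the sketch below merges events from several assumed sources (ad clicks, purchases, cookie matches) into a single, continuously updated record. The schema, field names, and sample events are hypothetical, not a description of any real product.

from collections import defaultdict
from datetime import datetime, timezone

# One record per voter ID, accumulating events from many assumed sources.
dashboard = defaultdict(lambda: {"events": [], "last_updated": None})

def ingest_event(voter_id, source, detail):
    """Append an event (ad click, purchase, cookie match, ...) to a voter's record."""
    record = dashboard[voter_id]
    record["events"].append({"source": source, "detail": detail})
    record["last_updated"] = datetime.now(timezone.utc).isoformat()
    return record

ingest_event("voter-123", "ad_click", "clicked a healthcare ad")
ingest_event("voter-123", "purchase", "outdoor gear, $84")
ingest_event("voter-123", "cookie", "visited a local-news site five times this week")
print(dashboard["voter-123"])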

 

Looking Ahead

 

Today, and certainly in the future, chatbots engaged in political conversations with voters will draw on research in psychology and on advances in the AI, machine learning, and natural language processing already used by chatbots in “therapeutic conversations” that help patients manage mental health conditions. Many of the chatbots used for mental health purposes are built on Cognitive Behavioral Therapy (CBT) platforms designed to change the way patients think and behave.

 

It’s not much of a stretch to envision AI/CBT platforms being used to change the way voters think and behave before and on election day. In addition to making candidates’ political stances more acceptable or palatable, AI-enabled political chatbots can draw on a vast body of research on the ways chatbots are used by psychologists to relieve anxiety and depression. Chatbots have also been used in conjunction with the Facebook Messenger platform to monitor changes in users’ moods, in addition to gathering data about users. Think about the possibilities of political campaigning that uses NLP to change negative feelings about people (candidates, in this case) and life situations.
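
As a purely illustrative example of mood monitoring, the toy code below scores the sentiment of a user’s messages across a conversation using simple word lists. Real systems would rely on trained NLP models rather than hand-picked words, so treat every list, score, and threshold here as an assumption.

NEGATIVE = {"worried", "angry", "scared", "hopeless"}
POSITIVE = {"hopeful", "excited", "confident", "happy"}

def mood_score(message):
    """Crude sentiment score: +1 for each positive word, -1 for each negative word."""
    words = message.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

def mood_trend(messages):
    """Score each message in order so a shift in mood is visible over time."""
    scores = [mood_score(m) for m in messages]
    return scores, sum(scores) / len(scores)

chat = [
    "I'm worried about the election",
    "That article made me feel a bit more hopeful",
    "Honestly I'm excited to vote now",
]
print(mood_trend(chat))  # ([-1, 1, 1], 0.33...) shows the mood improving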

 

Mitigation

 

In 2020, Facebook is trying to prevent machine learning from being used to leverage its own targeted advertising system to deliver streams of misinformation, fake content, and negative messages tailored to users’ profiles. While Facebook should certainly be pressured to combat these harmful practices, this will be yet another game of technological cat-and-mouse, one in which large companies like Facebook and Google, which mostly operate under the law, are at a permanent disadvantage against creative and persistent criminal hackers and exploiters. Whether through automated bots or fake news delivered by Twitter or other social media, the documented evidence of malicious microtargeting is abundant and, frankly, scary and nauseating. Most importantly, it is far too ubiquitous for us to lay responsibility solely on the companies and their CEOs. As users and consumers, we must stay informed, vigilant, and always skeptical of what we see and read online, asking ourselves whether things are truly as they seem or whether a chatbot or foreign interference could be at play.

We at the St. James Faith Lab will keep you informed; please let us know your thoughts and concerns about AI’s use in elections and in shaping public opinion.

The Rev. Canon Cindy Evans Voorhees

Executive Director

St. James Faith Lab
