
The Role of AI in the 2024 Elections

The year 2024 was dubbed “the largest election year in global history,” with half the world’s population voting in national elections. Earlier this year, we hosted an event on AI and the 2024 Elections, where scholars spoke about the potential influence of artificial intelligence on the election cycle, from misinformation to threats to election infrastructure. This webinar offered a reflection on and exploration of technology’s impact on the 2024 election landscape.


In 2024, over 60 countries held national elections, making it one of the largest election years in history. These events coincided with the rapid advancement and adoption of artificial intelligence (AI), causing public concern over how these technologies might impact global elections. In May, we asked experts to weigh in on potential threats, such as misinformation and deepfakes, voting behavior, and vulnerabilities in our election infrastructure. With the end of the year approaching, the Allen Lab asked Danielle Allen, Nate Persily, and Bruce Schneier to revisit their predictions and reflect on the role of AI across the 2024 elections.

Despite early public concerns that AI would dramatically disrupt this year’s elections, the panelists agreed that most of those fears were not realized. Instead, a more nuanced reality has emerged: AI is permeating all aspects of elections, from campaign operations to the information ecosystem. As Persily noted, AI amplifies the ability of both good and bad actors to achieve the same goals they have always pursued in elections.

Reflecting on this Election Cycle

During the event, panelists noted that several platforms took proactive measures this election cycle to address election-related concerns. Anthropic and OpenAI both redirected political queries to authoritative sources, such as the Associated Press or CanIVote.org. Social media platforms continued to develop and deploy their own AI-driven countermeasures to identify and tackle coordinated inauthentic behavior. Persily highlighted a few interesting behaviors that emerged around AI-generated content. For example, most AI-generated images that spread virally were satirical and built on existing narratives. Notably, multiple politicians took to discrediting unflattering (but true) events as being “AI-generated.”

The Potential of AI in Political Systems

Schneier noted that AI’s unique election-related capabilities revolve around the speed, scale, and scope of these systems. AI has been used across many aspects of elections: communications, polling, political organizing, democratizing access to political participation, fundraising, and strategy. It has improved communications through live translation, as in the New York mayoral race, and voter outreach, as with the AI candidate in the Tokyo gubernatorial race. Campaigns and political organizers are leveraging AI to scale their voter outreach and communication efforts with services like Votivate, which use AI-generated voices for get-out-the-vote calls. AI may also help potential candidates navigate the complexities of running for office, especially in local races where resources are often limited.

However, as Allen noted, while AI will inevitably become as ubiquitous as electricity, we must remain mindful of how its aggregate effects might fundamentally alter political dynamics in ways we can’t yet predict.

Areas for Future Research

As the relationship between trust and technology continues to evolve, the panelists highlighted a few areas for further research and investigation:

  • Advance methods of disclosure so that we know when we are interacting with a computer rather than another person. Current watermarking methods may not be robust enough and won’t necessarily solve the disinformation problem.
  • With only a few companies leading the world of AI today, further examine the relationship between AI and the concentration of power. The Allen Lab’s recent work on the Political Economy of AI is focused on this dynamic.
  • Explore the balance between open-source and closed-source models, and the implications for democratizing access to these tools while managing associated risks.

As Schneier noted, democracy is about more than just an outcome; it’s about the human process of reaching that outcome, which AI should enhance rather than replace.

Watch the full event:


More from this Program

Political Economy of AI Essay Collection

Earlier this year, the Allen Lab for Democracy Renovation hosted a convening on the Political Economy of AI. This collection of essays from leading scholars and experts raises critical questions about power, governance, and democracy as the authors consider how technology can better serve the public interest.

Watching the Generative AI Hype Bubble Deflate

As a part of the Allen Lab’s Political Economy of AI Essay Collection, David Gray Widder and Mar Hicks draw on the history of tech hype cycles to warn against the harmful effects of the current generative AI bubble.

AI and Practicing Democracy

As a part of the Allen Lab’s Political Economy of AI Essay Collection, Emily S. Lin and Marshall Ganz call on us to reckon with how humans create, exercise, and structure power, in hopes of meeting our current technological moment in a way that aligns with our values.
