

According to a March 2019 Pew Research Center Survey, 90% of Americans believe that altered videos or images lead to confusion in public discourse. “Deepfakes”—the use of AI to generate deceptive visual media depicting real people saying or doing things they did not—pose serious threats to democracy. While tools for identifying altered or fabricated video are advancing, and social media policies designed to limit the distribution of such content continue to evolve, the broad availability of tools for producing doctored or fabricated videos raises concerns about whether and how deepfakes might be used to manipulate the 2020 election.

Sponsoring Organizations

The Silicon Valley Leadership Group Foundation is a non-profit 501(c)(3) organization. Its Board is a group of dedicated business and community leaders from Silicon Valley who work closely with the Silicon Valley Leadership Group.

The CITRIS Policy Lab, headquartered at UC Berkeley, supports interdisciplinary research, education, and thought leadership to address core questions regarding the role of formal and informal regulation in promoting innovation and amplifying its positive effects on society.

Winning Videos! 

First Place: Katy Jiang, master’s student in the Software Engineering Program at San Jose State University

Second Place: Rachel Moy, Sophomore at UC Berkeley studying Economics and Sustainability


Video Competition (This Competition is Now Closed)

We are seeking student submissions of educational video content about deepfakes. Competition criteria:

  • This competition is open to graduate students (master’s, PhD, or law), undergraduates, and high school students only.
  • Submissions should have a clear educational purpose, and any fabricated or manipulated content must be clearly labeled as such for its entire duration on screen.
  • Keep submissions to less than 3 minutes in length.
  • Keep it PG.
  • Make it something that watchers from all parts of the political spectrum can appreciate. (Let’s have fun, but please don’t fan the flames of political division.)
  • Don’t feature anyone currently serving in elected office, or currently running for office.
  • Each submission must have a faculty or staff sponsor and include faculty/staff sponsor contact information. (A faculty/staff member may sponsor more than one submission.)
  • Winners will be announced on Monday, October 27th.

Our Goal

To produce engaging video content to help educate the general public on how video can be fabricated and manipulated relatively easily, with little training and minimal resources.


Winning submissions will be featured through CITRIS media channels and broadly in the Silicon Valley community through Silicon Valley Leadership Group media channels, and may be featured during the Silicon Valley Leadership Group’s Oct. 30th flagship event with more than 1,000 industry and public sector leaders. There will also be modest cash prizes:

For undergraduate and graduate submissions: $2,500 for first place, and $1,000 for second place.

For high school submissions: $1,000 for first place and $500 for second place.

Participating Educational Institutions

Santa Clara University, UC Berkeley, UC Davis, UC Merced, and UC Santa Cruz. Please note that participation is not limited to students at these institutions.

Helpful Resources

Here are some links that might serve as an introduction for interested students:

Meet the Judges

Brian Brennan, SVP, Silicon Valley Leadership Group
Brian Brennan leads the Leadership Group’s membership development, and heads the Silicon Valley Leadership Group Foundation’s Emerging Tech Policy Initiative. Prior to joining the Leadership Group in 2008, he oversaw political development programs for the US Agency for International Development in Moscow.

Angus Forbes, Associate Professor, Computational Media, UC Santa Cruz
Professor Forbes’s research investigates novel techniques for visualizing and interacting with complex scientific information. Prior to joining the Computational Media faculty at UC Santa Cruz, he was an Assistant Professor of Computer Science in the Electronic Visualization Laboratory at University of Illinois at Chicago, where he designed and taught graduate courses in Information Visualization, Computer Graphics, and Deep Learning, and co-founded the first dual-degree MS/MFA joint program in Computer Science and New Media Arts.

Ashish Jaiman, Director of Technology, Microsoft
Ashish Jaiman is the Director of Technology and Operations in the Customer Security and Trust organization at Microsoft, focusing on the Defending Democracy Program. Jaiman’s mission is to help customers improve their security posture and defend against cyberattacks. He is currently working on disinformation defense and deepfake intervention strategies and their impact on society and democracy.

Yuhong Liu, Associate Professor of Computer Engineering, Santa Clara University
Professor Liu’s expertise is in trustworthy computing and cybersecurity. Her research interests include online social network security and privacy, trust management in cyber-physical systems, and trustworthy cloud computing.

Brandie Nonnecke, Director, CITRIS Policy Lab, UC Berkeley
Brandie Nonnecke, PhD is Founding Director of the CITRIS Policy Lab, headquartered at UC Berkeley. Brandie studies human rights at the intersection of law, policy, and emerging technologies with her current work focusing on fairness, accountability, and appropriate governance mechanisms for AI. She is a Technology and Human Rights Fellow at the Carr Center for Human Rights Policy at the Harvard Kennedy School. She served as a fellow at the Aspen Institute’s Tech Policy Hub and at the World Economic Forum on the Council on the Future of the Digital Economy and Society. Her research publications, op-eds, and presentations are available at

Magdalena Wojcieszak, Professor, Dept. of Communications, UC Davis
Professor Wojcieszak’s research focuses on how people select political information in the current media environment and on the effects of these selections on attitudes, cognitions, and behaviors. She also examines the effects of mass media, new information technologies, and various message types on tolerance, perceptions, and polarization, as related to intergroup relations. Her current interests include ways to minimize selective exposure and biased information processing.

Learn More

For questions, please email us at [email protected].