The Protocols of the Elders of Zion: 120 years of Antisemitic Propaganda
Panel Summary – What is to be Done, Part 2: The Challenge and Promise of Technology
The last panel of the symposium considered the double-edged sword of networked communication technology: the same platforms that enable communication, collaboration, and creation unimaginable a decade ago now face a threat landscape that exploits that very connective tissue. Daniel Byman of Georgetown University moderated a conversation among Justin Erlich, Elizabeth Neumann, and Cynthia Miller-Idriss about the current threats and what we must do about them. The panelists established that antisemitism is not an isolated threat confined to a single community but a deep-seated virus that serves as both the starting point and the endpoint of the ‘propaganda train’ of online radicalization, which also carries misogyny, homophobia, and xenophobia, among other hatreds. Panelists then discussed ways to stop the spread of hateful ideologies online and prevent youth from becoming radicalized, from short-form ‘pre-bunking’ videos that teach people to recognize when they are being manipulated to on-the-ground initiatives with parents, coaches, teachers, faith leaders, and other communities to create behavioral change. Tech companies like TikTok also address harmful content through five vectors – policies, enforcement, empowerment, education, and partnerships – and seek to strike a balance between empowering users and removing content institutionally. Community solutions were identified as among the most foundational and impactful elements of CVE/CT work, supplementing online tools and education. While the threat landscape has changed since 9/11, government structures for fighting dangerous content have failed to keep pace, so advocates have shifted their focus to grassroots initiatives and education. Technology has also made counterterrorism harder, as our society faces constant exposure to online content.
The politicization of content, and of words like ‘disinformation,’ has also complicated the work of the panelists and other advocates, who face a learning curve in communicating to the public how to combat harmful online content. A further complication is the lack of government investment in community-based solutions: panelists noted that questions about long-term efforts, such as building community trust and reducing recidivism among the radicalized, remain unasked and unanswered.
Justin Erlich – “Coded words are one of the great challenges here. We’re constantly trying to keep up without overkilling content that may be used in counterspeech or that may have some sort of neutral context. So this is some of the ongoing work that we (TikTok) rely on our partners – trusted NGOs and civil society groups – to help us with.”
Elizabeth Neumann – “Most companies trying to police their terms of service are limited by usually leveraging government designations of what a terrorist group is – and it’s not necessarily focused on movements. And so, Seamus (Hughes), in the previous panel, talked about how there is this artificiality in the way that the government functions: we designate terrorist organizations, we don’t designate terrorist movements. And the law is structured around organizations, not movements. So a lot of the tools, both in the federal government and the tools that the tech companies rely on, are not there to address antisemitism as a movement or as a type of content that we want to prohibit. So we have to get more creative.”
Cynthia Miller-Idriss – “One of the things that strikes me is that we are often talking about antisemitism, or antisemites, or even ‘The Antisemite’ as if it’s some kind of bounded or very recognizable and identifiable thing. I think that one of the things that we find in the (Polarization and Extremism Research & Innovation) lab is that antisemitism tends to be not just a starting point for the propaganda that we see online but it’s also the endpoint as well. It doesn’t matter where you get on the ‘propaganda train,’ whether it’s antisemitism or anti-immigration or male supremacism, it always ends up at antisemitism. You go far enough down the rabbit hole, and you get there.”
Daniel Byman, Georgetown University
Daniel Byman is a professor in the School of Foreign Service with a concurrent appointment in the Department of Government. He is an editor at Lawfare and a member of the Department of State’s International Security Advisory Board. He served as Vice Dean of the SFS undergraduate program from 2015 until 2020 and before that as director of Georgetown’s Security Studies Program and Center for Security Studies from 2005 until 2010. He also led a Georgetown team in teaching a “Massive Open Online Course” (MOOC) on terrorism and counterterrorism for EdX. Professor Byman is also a part-time Senior Fellow at the Center for Middle East Policy at the Brookings Institution. From 2002 to 2004 he served as a Professional Staff Member with the 9/11 Commission and with the Joint 9/11 Inquiry Staff of the House and Senate Intelligence Committees. Before joining the Inquiry Staff he was the Research Director of the Center for Middle East Public Policy at the RAND Corporation. Prior to that, Professor Byman worked as an analyst on the Middle East for the U.S. government.
Justin Erlich, TikTok
Justin is the Global Head of Issue Policy and Outreach & Partnerships for the Trust & Safety team at TikTok. He leads teams that develop global policy frameworks, engage with civil society and communities, and incubate Responsible Innovation practices. He also regularly teaches courses on Disruptive Technology & Regulation at the UC Berkeley Law School. Harnessing his strategic and policy background, he focuses on building organizations that operate in highly regulated environments. He combines big-picture thinking with executive leadership to deliver tangible impact. He is driven by a passion to ensure tech platforms bring us together rather than drive us apart. Prior to joining TikTok, Justin worked in the urban mobility tech sector at Uber as Global Head of Policy for Autonomous Vehicles & Urban Aviation and as V.P. of Strategy, Policy & Legal at Voyage. He served as the Principal Tech Advisor for the former California Attorney General and current Vice President Kamala Harris, overseeing the Department’s work on privacy, data, tech platforms, and the regulation of emerging technologies. He also spent five years at McKinsey & Co. as a consultant, with a focus on cities and the social sector. He holds a degree in Government with related fields in economics and behavioral psychology from Harvard University, and a J.D. from New York University School of Law. He is a member of the State Bar of California.
Elizabeth Neumann, Moonshot
Elizabeth Neumann is the Chief Strategy Officer of Moonshot, a tech-driven solutions provider harnessing the power of the internet for good. Moonshot develops new technology and methodologies to expose threats, disrupt malicious actors, and protect vulnerable audiences online. The company works to end online harms – such as violent extremism, disinformation, child sexual exploitation, gender-based violence, and human trafficking – making communities, governments, and businesses safer, both online and off, around the world. Previously, Ms. Neumann served as the Assistant Secretary for Counterterrorism and Threat Prevention at DHS, where she led eight program and policy teams addressing a range of issues including domestic violent extremism, screening and vetting, countering terrorism and transnational criminal organizations, countering hostile UAS (drones), and human trafficking. Over the past two decades, Ms. Neumann created and implemented multiple government-wide reforms, primarily in the areas of security and public safety. Ms. Neumann began her homeland security work in the aftermath of 9/11, serving on the inaugural staff of the White House Homeland Security Council (now part of the National Security Council). Ms. Neumann is a Board Member of the National Immigration Forum, founder and member of the Council on National Security and Immigration, and a National Security Contributor at ABC News.
Cynthia Miller-Idriss, Polarization and Extremism Research & Innovation Lab (PERIL), American University
Dr. Cynthia Miller-Idriss is a Professor in the School of Public Affairs and in the School of Education at American University in Washington, DC, where she is also the founding director of the Polarization and Extremism Research and Innovation Lab (PERIL). She is a Draper Richards Kaplan Foundation Entrepreneur and recently served as the inaugural creative lead for the Alexander von Humboldt Foundation’s residency program on social cohesion in Berlin, Germany. Dr. Miller-Idriss regularly testifies before the U.S. Congress and briefs policy, security, education, and intelligence agencies in the U.S., the United Nations, and other countries on trends in domestic violent extremism and strategies for prevention and disengagement. She is the author, co-author, or co-editor of six books, including her most recent book, Hate in the Homeland: The New Global Far Right (Princeton University Press, 2022). She is currently at work on a new book on the gendered dimensions of violent extremism. Dr. Miller-Idriss writes frequently for mainstream audiences as an opinion columnist for MSNBC, with other recent bylines in The New York Times, The Atlantic, Foreign Affairs, The Washington Post, Politico, USA Today, The Boston Globe, and more.