Interview with Jeff Deutch, Mnemonic

 

Mnemonic works globally to help human rights defenders effectively use digital documentation of human rights violations and international crimes to support advocacy, justice and accountability.

Funding from the European AI Fund enables Mnemonic to focus on legislation such as the Digital Services Act that has the potential to greatly affect platform transparency broadly and the availability of social media content for archives specifically. The organisation also advocates with tech companies to save the vast swathes of human rights documentation that live online from permanent deletion.

Jeff Deutch told us more about Mnemonic’s work.

Mnemonic works to preserve digital documentation of human rights violations. What is the role of AI in this field?

Jeff: AI impacts digital documentation in both positive and negative ways. There are positive uses of AI for object recognition and other technologies that can help us sort through large bodies of human rights documentation. However, these applications are still limited and our policy work has largely focused on the negative impact of AI.

Unfortunately, policymakers in companies and in government are increasingly turning to AI for content moderation – using various forms of AI to determine what content can stay on online platforms, and what content will be removed or limited in reach.

Digital documentation of human rights violations has been improperly removed at a very high rate by AI because:

– it may include content that wouldn’t be allowed were it not for various newsworthiness and education exceptions in content policies, and AI is bad at assessing the applicability of those exceptions

– it is often in Arabic, Ukrainian, or other languages using non-Roman characters that AI does not handle as well

– lists of groups and individuals deemed to be terrorist or violent extremist are heavily biased towards “Islamist” terrorists, and efforts to remove this content end up impacting human rights documentation

How does recent regulation such as the DSA affect what you do?

Jeff: The DSA will force companies to make changes to their content moderation practices, both in implementation and in transparency. On one hand, it is quite likely that the DSA will make it easier to advocate for the protection of human rights documentation, because it imposes due diligence requirements on "VLOPs" (very large online platforms), which are still the most popular places for human rights defenders to post content. On the other hand, the DSA could push companies to take down content even more rapidly, endangering human rights documentation.

You advocate for policy reform at the European level – how do you see any developments translating beyond EU borders?

Jeff: We see developments translating beyond EU borders by legitimizing content removal practices that have the potential to be especially dangerous in undemocratic countries. We also see the potential for copycat legislation; the NetzDG law in Germany is a popular example, but we are seeing all types of legislation travel globally. The GDPR also inspired legislation such as the California Consumer Privacy Act. As companies create systems and policies to comply with EU legislation, these systems could have a positive or negative impact on users elsewhere, including increased or decreased free expression online.

What are the main challenges you face in your work?

Jeff: One overarching challenge in our work is the difficulty of engaging with EU lawmaking. It requires inside knowledge to know when consultations are open, what the critical junctures are and which policymakers to engage with, and even to see changes made to laws throughout the legislative process, since "official versions" can take months to publish.

Another challenge in our work is that those most impacted by content moderation decisions may also be people in vulnerable situations – that includes our own teams. People who are seeing their content removed may also be facing surveillance, unlawful detention, torture, and murder. That means that, while content moderation is important, it can be hard for those people and the NGOs that serve them to justify taking the time to engage in AI and content moderation policies.

What is your vision for a digital environment that upholds human rights worldwide?

Jeff: The Internet is powerful and it is global. At this point, regulation is required to ensure that human rights are respected online. In our vision, policymakers in countries where companies are located, as well as policymakers in big markets, regulate with users inside and outside their borders in mind.

 
