Interview with Nani Jansen Reventlow, the Founder of Systemic Justice: Making litigation accessible as a tool for change.

Nani Jansen Reventlow. Photo credits: Tetsuro Miyazaki


Nani Jansen Reventlow is the Founder of Systemic Justice. She is an award-winning human rights lawyer specialised in strategic litigation at the intersection of human rights, social justice, and technology.

We spoke with Nani about what it means to support community-driven litigation and what Systemic Justice has learned about how communities experience technology-enabled harms.


Systemic Justice is the first Black-led organisation in Europe working to radically transform how the law works for communities fighting for racial, social, and economic justice. As “the movements’ law firm”, Systemic Justice helps ensure that those fighting for justice and equality can bring about change via the courts. This aims to help dismantle the power structures that fuel racial, social, and economic injustice. Systemic Justice works from a foundation of anti-oppression, intersectionality, and justice, across the digital and non-digital context.  


European AI & Society Fund: Systemic Justice has extensively mapped systemic harms that local communities across Europe are experiencing the most. What are some of the issues related to technology-enabled harms that have come through in your research? How do tech issues intersect with other systemic harms that people experience?

Nani: Our first Europe-wide mapping of racial, social, and economic justice priorities – published as Surfacing Systemic (In)justices: A Community View – provided one-of-a-kind insight into the harms experienced by communities and the opportunities they saw for addressing them. Looking at the data, we noticed that the role of technology was not as prominently present as we had expected, even though in many of the themes we explored, including policing, free movement, and social protection, tech plays a substantial role.

With the support of the European AI & Society Fund, we are doing a deeper dive into this question. We’ve expanded the data set we created in 2022, which identified just over 1,000 organisations, movements, and collectives across Europe resisting racial, social, and economic injustice, to 3,000 organisations, with new additions coming primarily from Central, Eastern, and Southern Europe as well as the Nordics. In addition, we conducted a series of 40 qualitative interviews with community activists to, amongst other things, gain a deeper understanding of the role technology plays in the systemic injustices they are facing.

Those conversations made clear that technology plays a key role in exacerbating social problems. This happens on different levels: from being racially profiled and subjected to surveillance in everyday life, to accessing service providers such as schools, job centres, or social services, to engaging with state institutions, police, and border control. Often, these harms stem from the sharing of information between different databases that operate based on racial profiling and predictive analytics, which in turn are driven by racialised categories that affect all levels of decision-making for individuals, communities, and neighbourhoods alike.

We’ll publish our findings in our next report, which will be launched in June this year, so stay tuned!

European AI & Society Fund: What resources and capacities do local organisations and collectives need to organise effectively to address these challenges?

Nani: A number of themes emerged from our research.

First, approaches to the understanding and regulation of AI and technology currently sit outside of community experiences. The digital rights field is dominated by white-led civil society and academic actors who are divorced from the realities of tech and data harms. Local community organisations are contending with the very real consequences of digital exclusion and the encroachment of tech into policing and law enforcement, and should therefore be the site for building knowledge and power to challenge the technologies that exacerbate systemic injustice.

Second, communities and actors in the digital rights field do not speak the same language. There is a need for a new vocabulary which better captures the concerns of communities and the harms they experience.

Third, there is a need to build a movement that has the knowledge and power as well as the political and legal strategies to resist the encroachment of technology that exacerbates systemic injustices. There also is a need to equip legal practitioners with the insights and understandings of the everyday harms as experienced by marginalised groups.

European AI & Society Fund: For someone new to community-driven litigation, where should they begin?

Nani: One of the reasons we do the work that we do is that we know that litigation is currently not accessible to everyone as a tool for change. Those using the courts now tend to have access to resources and knowledge that are not widely available. This means not everybody has the same starting point or frame of reference when it comes to community-driven litigation.

In our first conversations with communities, people shared with us that they wanted to learn more about litigation and what it can do for their communities. Following up on those requests, we’ve designed a toolkit that seeks to provide a starting point for gaining a better understanding of what strategic litigation is and – most importantly – how it can be used as a tool to support communities’ fights for justice. It includes a Guide for legal action, which breaks down some of the essential components of strategic litigation in an accessible way, with lots of case studies showing how litigation has supported campaigns for change in the past and present. This is a good starting point for anyone wanting to learn about litigation for the first time.

If you have engaged with litigation before, perhaps as a lawyer or an NGO, you might want to learn more about how we do it in a community-driven way. When we talk about community-driven litigation, we mean partnering with communities to support them in bringing about the litigation they want, on the issues they identify, which they believe can serve their goals – such as changing law, policy, or institutional behaviour. This is not just about consulting with communities, or centring them in the legal work, it’s about supporting communities in being the strategists themselves in the litigation that concerns them. We then do the legal “legwork” to put the strategy into practice. Having communities’ perspectives, experience, and expertise drive the litigation work shouldn’t be a novelty, but sadly there is a lack of community-driven litigation practice in Europe. This is why we exist.

European AI & Society Fund: For you, what are the drivers for systemic justice in the tech field?

Nani: This is a difficult question to answer, as I associate the term “tech field” with the tech industry or sector. This is a field that is built within a system of racial capitalism and oppression, and the technology it produces will only ever serve the objectives of these systems. That means that tech that is built “for good” will never truly attain justice, as it will never go as far as dismantling the harmful systems that lie at the root of the tech industry itself.

Having said that, there are many tech activists and academics I greatly admire, who have done groundbreaking work in exposing how these systems are entrenched and embedded in technology – Joy Buolamwini, Timnit Gebru, Inioluwa Deborah Raji, Ruha Benjamin, Safiya Noble, and Simone Browne, to name a few. And of course, the fabulous team at the Digital Freedom Fund (DFF) and EDRi, who are continuing the decolonising process for the digital rights field we initiated in 2019 when I was still running DFF.

European AI & Society Fund: What books or podcasts have you enjoyed recently that you would recommend to our readers?

Nani: Funny you should mention that! We just released an audio version of our Guide for legal action. Read by members of the Systemic Justice team, it’s a podcast version of the resource I mentioned earlier that deconstructs “strategic litigation” with lots of real-world examples. You can find it wherever you get your podcasts. At Systemic Justice, every time we publish a lengthy report or resource, we always work to make an audio version to make sure the information can reach as many people as possible. Legal information is all too often presented one way – and it usually involves endless text with footnotes and legal jargon – but we want to make legal know-how accessible in different formats, and podcasts are one of the ways we do this.
