Interview with Gabriela di Oliveira, Glitch



Glitch is a UK charity committed to ending the abuse of women and marginalised people online. Through workshops, training, reports and programmes, Glitch equips the intersectional community to become the digital citizens we need in the world today. From grassroots to systemic change, Glitch advocates for an online world that is a safe space for all. We talked to Gabriela di Oliveira, Glitch’s Head of Policy, Research and Campaigns, to find out more about their work.

What’s the connection between Artificial Intelligence and online abuse?

Gabriela: At Glitch we focus on online gender-based violence, particularly against Black women, and therefore understand online abuse as it relates to various forms of racialised and gender-based violence. Online abuse isn’t just the verbal abuse we often think of, like hate speech or harassment from others. Online abuse relates to a whole spectrum of ways communities can experience harm via the online space, including via harmful content (videos, images, text) such as racialised violence, sexual violence or disinformation, as well as abusive behaviours such as doxxing, image-based sexual abuse or threats of violence.

Artificial Intelligence (AI) can help filter, or even prevent, abuse in some cases, but in reality technology-facilitated abuse often develops as fast as or faster than AI solutions. More importantly, we now know AI itself is not neutral: AI systems can be harmful and discriminatory based on who builds them, how they’re developed, and how they’re ultimately used. Black women are more at risk of AI harm due to entrenched gender and racial inequalities reflected in AI systems and their data, development and use cases.

Despite this, since beginning to work in the AI space, Glitch has found a profound lack of research from this intersectional lens in Europe. We’re hoping to change that and improve understanding of the link between AI and gender and racialised online abuse.

Part of your current work revolves around the UK Online Safety Bill, which is yet to be passed. What are your concerns and how are you addressing them?

Gabriela: The Online Safety Bill (OSB) is set to be the first legislation of its kind in the UK. It aims to regulate user-to-user platforms and search engines to reduce the prevalence of online abuse, with a particular focus on preventing terrorism and child sexual exploitation and abuse. At Glitch we’ve been working to influence the OSB for almost two years, alongside other campaigning and rights-based organisations.

Our key concern with the OSB is that, despite evidence showing women are 27 times more likely to experience abuse such as harassment online, the Bill doesn’t mention women, girls or online gender-based violence once. We’ve been campaigning for women and girls to be named in the Bill, creating a policy lever for tech companies to develop ‘codes of practice’ for preventing violence against women and girls. So far we have over 60,000 signatures – please sign if you’d like to support the campaign.

Another issue we’re concerned about is related to risk mitigation for marginalised groups. Currently, the OSB only requires tech platforms to develop preventative measures for ‘certain groups’, with no obligation to consider specific named groups – allowing tech companies to choose who to prioritise. We’re calling for ‘protected characteristics’ under the 2010 Equality Act, such as sex, race, religion, disability and age, to be considered at the point of developing risk mitigations. Specifically, we want to see wording in the OSB that acknowledges ‘multiple’ or ‘combined’ characteristics in order to protect Black women on the grounds of their specific risk of harm, rather than separately as a ‘Black person’ and as a ‘woman’ (as it currently is in the law).

Right now though, our key concern is that the OSB has been halted in parliament due to political uncertainty after our Prime Minister resigned on July 8th. We are more determined than ever to increase pressure to get the Bill passed and to push our calls for change on improving protections against online gender based violence.

Why do you think most initiatives that focus on online abuse fail to recognise that women and minoritised communities are often the target of these attacks?

Gabriela: By now we know that unless we make women and minoritised communities the focus of our work, they will get ignored, sidelined and deprioritised.

Barriers to this work across the sector are well evidenced; the lack of diversity among staff working in digital policy, tech and civil society at the leadership level is an important one. This matters alongside organisational values and commitment (or lack thereof) to feminism and antiracism. Even where there is a commitment and desire to do this work, operationalising and embedding it can be a challenge if it’s not already part of our day-to-day.

We still need more specific, intersectional research in this area. And for that we need funding for organisations that are led by and for women and minoritised communities. More partnerships across the sector, which specifically explore the experiences of these communities, can help improve equity in our work and transform the effectiveness of the solutions we create.

At Glitch, we are led by a Black feminist approach, which means centering the rich and diverse experiences of Black women and non-binary people online. By doing so, we have to recognise, and work to tackle, the harm created by systems of power such as white supremacy and patriarchy in our online spaces. Developing solutions that take these into account works to unlock stronger digital safety for all – we cannot, as a sector, continue to ignore the disproportionate impact on women and minoritised communities.

You advocate for the taxation of tech giants. What is the link with online abuse?

Gabriela: Online abuse takes place on technology platforms, particularly very large social media platforms that are being regulated under the Online Safety Bill in the UK. Already, in 2018, the UK Chancellor announced a new Digital Services Tax of 2% on tech giants like Facebook, Google and Twitter. This tax is expected to raise an additional £400m a year.

At Glitch we’ve been campaigning for 10% of the new Digital Services Tax to be ring-fenced specifically to combat, and ultimately end, online abuse. We believe this funding should be pledged to civil society organisations to help fund their vital work of ending online violence and abuse, such as through training and education on digital citizenship and online safety, developing new tools in partnership with tech, supporting the government on policy development and supporting survivors of online abuse and violence. Without this funding, we cannot sustain the strong and diverse civil society network working in this space.

What are the biggest challenges you encounter in your work?

Gabriela: As with so many of us working in this space, there are many challenges. These are perhaps not the ‘biggest’, but here are some we face every day:

  • We are a small (but mighty!) organisation, and although we are growing, it is difficult to keep up with the level of demand on our capacity.
  • The level of demand is also particularly intense as one of the few organisations working specifically on online gender based violence with a focus on Black women, in a trauma-informed way.
  • Whilst we love partnering with others, the level of demand on our time sometimes makes it hard to do the work in the way we want to do it, particularly as a values led organisation.
  • Digital policy in particular is a very white elitist space at the EU level, so it can be a challenge to engage in policy discussions without losing our connection with our community – that’s why we’re exploring participatory policy making approaches to bring Glitch’s communities’ views into our work.
  • There is a disconnect between policy work and tech development, particularly around the practicalities of AI design and development – this is something we’re working to bridge in our own expertise.

Education through workshops is also part of Glitch’s approach to championing digital citizenship. What is your vision for bringing this concept to the wider public?

Gabriela: Over the last four years, our workshops have provided specialist support and safe spaces for women and non-binary people to better protect themselves and others from online abuse. We have also worked with a number of companies and organisations to improve digital literacy and develop the practice of what we call ‘Digital Citizenship’, including the importance of tech accountability. By championing and formalising Digital Citizenship work, Glitch is working to increase agency and accountability in digital spaces. This is part of what we call a public health approach to digital safety – everyone has a role to play.

Some important ways this work will lead to wider change for the public include:

  • Narrative change: we are working on a change in public narrative, including:
    • Shifting understanding from seeing online spaces as ‘less real’ than offline, to understanding that online abuse is abuse,
    • The online space is not equal for all – women and specifically Black women, are at disproportionate risk of harm, which is creating personal, communal and societal harm and inequality,
    • Digital Citizenship – considering our behaviours online as important and a responsibility,
    • Tech is not ‘neutral’: digital spaces and tools are designed, and therefore can be designed differently – they can also be harmful,
    • Better tech accountability and transparency can improve online safety for all.
  • Accountability and incentive structures: we are working towards developing an accreditation system for our programmes. This will allow communities and organisations to show leadership in this space and will formalise training and education in online safety and Digital Citizenship similarly to existing accreditation systems around anti-corruption practice or diversity, equity and inclusion.
  • Policy change: our Action work (workshops and resources) underpins our Advocacy work. We are working to influence the highest level of policy making and therefore change the law and regulations, to facilitate stronger tech accountability, protections for Black women and media literacy infrastructure.
  • The content, learnings and approach from our workshops feed into Glitch’s other work, while facilitating connection with our community. As we continue to develop this work, we hope to partner with others who are dedicated to a public health approach to digital safety.