Forum of European Muslim Youth and Student Organisations (FEMYSO)
Empowering young Muslims to track and call out AI Islamophobia
AI systems can embed and amplify Islamophobic biases. In 2024, the Forum of European Muslim Youth and Student Organisations (FEMYSO) demonstrated how easy it is to break the safeguards of several chatbots, generating over 270 examples of Islamophobic bias.
FEMYSO used this evidence to urge AI companies to take action against in-built discrimination in their systems. With support from the European AI & Society Fund, FEMYSO also developed an AI Islamophobia tracker.
The challenge
Technology is deeply intertwined with social structures and prejudices. For example, AI-driven content moderation on social media platforms – intended to counter the spread of terrorism – disproportionately suppresses Muslim voices. These systems reproduce anti-Muslim stereotypes, further entrenching inequalities. Large Language Models (LLMs), such as ChatGPT, also inherit biases from the data they are built and trained on, which reflects existing and historical social attitudes towards marginalised communities. Their biased outputs then feed back into the data used to train generative AI systems, perpetuating the cycle.
The action
Civic engagement and grassroots networks – like the Forum of European Muslim Youth and Student Organisations (FEMYSO) – play a vital role in holding these AI systems accountable. FEMYSO is a pan-European network with a mission to empower Muslim youth and build a diverse, cohesive and vibrant Europe.
For the European Action Day Against Islamophobia 2024, FEMYSO organised a conference at the European Parliament where young Muslim participants challenged AI chatbots with prompts related to Muslim identity, aiming to expose biases, discriminatory outputs, and potential vulnerabilities that could be exploited by malicious actors. Participants broke the safeguards of several chatbots and generated over 270 examples of Islamophobic bias, which they used to urge AI companies to do better and de-bias their datasets.
Not only did this experience show young Muslims the importance of independent audits and so-called red teaming exercises – which identify and address potential vulnerabilities in technologies – it also equipped them with the knowledge and skills to advocate for ethical AI development. These exercises uncover systemic bias, highlight where their communities' lives can be affected, and contribute to creating a more inclusive and equitable future.
How the European AI & Society Fund helped
The European AI & Society Fund grant supported research, capacity building and advocacy around cases of anti-Muslim bias in algorithmic systems.
With our support, FEMYSO compiled a comprehensive database and tracker of AI and Islamophobia cases in Europe. Called The [A.i]slamophobia Campaign, this collective powered by FEMYSO brings the perspective of Muslim youth to the impact of digital technologies in Europe. It also serves as a useful knowledge-building and advocacy resource for the community.