Interview with Yigit Aydin, ESWA

The European Sex Workers’ Rights Alliance (ESWA) is a regional network of more than 100 organisations providing services to sex workers and advocating for sex workers’ rights in Europe and Central Asia. ESWA is sex worker-led and aims to challenge repressive laws and policies that impact sex workers’ human rights, in particular their access to health and justice.

We talked to Yigit Aydin, ESWA’s Programme Officer, to find out more about the challenges that sex workers face in the digital field.


Where do you see the biggest challenges for sex workers’ rights in the digital field?

Yigit: Sex workers are facing growing challenges in the digital field. While some of these challenges originate from developments in AI, other existing challenges are intensified by the lack of sex worker involvement in decision- and policy-making or in the design of digital services. Sex work is often conflated with trafficking, and we see that this conflation is now present in the digital sphere as well. For example, FOSTA/SESTA, a US anti-trafficking law passed in 2018, is frequently used to target sex workers and exclude them from various digital services. FOSTA/SESTA essentially makes platforms liable for user activity and content. Because of this conflation of sex work and trafficking, platforms and other digital services have adopted policies that ban sex workers from their services: Mastercard, Airbnb and PayPal, to name a few, have banned sex work and sex workers. As an overarching theme, one of the biggest challenges in fighting for human rights for sex workers is the exclusion of sex workers from the development of the laws, policies, programmes and services that impact our communities. Despite all our efforts, sex workers are rarely involved in policy development, and our voices are not heard. A similar issue is visible in the design processes of digital services and technologies, where the interests of sex workers (and of other overlapping marginalised communities) are not considered by corporations whose concerns are market-driven.


What actions do you take to strengthen sex workers’ rights in the digital field and how do you challenge repressive laws and policies that impact sex workers’ human rights?

Yigit: There has been a lack of organisational knowledge regarding sex work and digital rights. Whilst a growing number of sex workers sell sex, communicate, or organise using digital tools, the level of digital literacy varies greatly amongst individual sex workers and sex worker-led organisations. Therefore, one of our aims at the beginning of our digital rights programme was to explore the specific needs of sex workers in different national and regional contexts. Digital rights is a broad concept that can encompass different issues depending on an individual’s context. For example, for street-based sex workers, offline surveillance technologies such as CCTV can be a more significant concern than online censorship. As a consequence, mapping out the different needs and realities of sex workers is an essential first step. We must say that this is an ongoing process that requires resources. To this end, ESWA regularly organises online and offline consultations with its membership on topics identified as being of great importance for sex workers, such as:

– online censorship and digital discrimination

– data protection, safety and privacy

– platformisation of sex work and its impact on working conditions

– algorithmic threats (including the use of AI in anti-trafficking tools)

Developing useful resources in these areas is one of our current focuses. ESWA is currently in the process of writing briefing papers on the topics I mentioned. The first paper is on online censorship and discrimination against sex workers. ESWA undertook in-depth interviews with sex workers and sex worker-led organisations in 17 countries to collect information, along with an online survey. This first paper will be launched in February.

The exclusion of sex workers from platforms raises new questions regarding freedom of speech and expression. What is allowed on the internet? Who allows it? How are marginalised communities impacted by these decisions? As our member organisation PION in Norway said: “We have challenged every censorship decision, and we have been dismissed automatically in a split second by the same automatic system that claims to be reviewed by humans. Even when talking in metaphors, we can’t use certain words. We also know non-sex workers who have been banned simply for using words the systems associate with sex work.”

Building alliances is vital for ESWA in our fight for human rights for sex workers. It is a pleasure to realise that digital rights organisations are more open to engaging with sex workers’ issues than many other ‘progressive’ civil society organisations, which can reproduce discriminatory or stigmatising attitudes towards our communities. Although digital rights is a relatively new domain for ESWA, we have reached out to different organisations and academic partners to forge new alliances and look for new opportunities to collaborate. The newly formed Digital Dignity Coalition within EDRi, where we represent sex workers’ voices, is one example.

EU-level policy activities are another area we are focusing our efforts on. For example, ESWA has been following the development of the EU AI Act and provided input to the Ad hoc Committee on Artificial Intelligence (CAHAI) consultation.
More recently, we have worked on the Digital Services Act amendments. One amendment in particular was of great concern to us: it sought to introduce mandatory phone registration for creators of pornographic content on porn platforms, with the aim of preventing revenge porn and other non-consensual content. We believe such practices are not effective in preventing image-based sexual violence and only put sex workers in danger of being outed and exposed. Moreover, it was highly problematic that in a matter that primarily impacts sex workers, our community was not consulted at all.

Lastly, we are fighting to protect sex workers’ digital rights by monitoring platforms and how they treat sex workers. Like other gig economy workers, sex workers have little to no protection and rights when working online. As you might know, in the summer of 2021, OnlyFans almost banned sex workers and pornographic content on its platform due to mounting pressure from Mastercard and other financial services. However, after a fierce backlash from sex workers and their supporters globally, it reversed course. Our statement on the issue can be read here.


In your view, what are the main risks and opportunities of AI and tech for sex worker communities?

Yigit: Technology has two faces for sex workers, much like for anyone. Technologies can simultaneously empower and disempower users, as we have witnessed during the COVID-19 pandemic. Thanks to ICTs, many sex workers could stay connected with their friends, families and communities, and find ways of working without violating the lockdown rules, such as camming, selling erotic pictures and videos, or working through an OnlyFans account. Some scholars argue that working online empowers sex workers by removing the need for third parties to help facilitate sex work, thus enhancing sex workers’ agency. Others suggest that internet and ICT-mediated sex work can even protect sex workers from violence compared to more traditional ways of working, such as working on the streets or in brothels, where the possibility of encountering dangerous actors, including the police, is higher.

While there is some truth in these optimistic views that focus on the potential of technologies to improve our lives, what we are also seeing is the weaponisation of certain technologies to target sex workers and other marginalised communities. AI is a good example of this. In countries with high uptake of AI, sex workers are reporting the adoption of AI by law enforcement to make predictions regarding the number of sex workers and their locations. We also know that certain AI systems that supposedly detect victims of trafficking are being used by the police and immigration officials to spy on sex workers who advertise their services online, affecting mostly migrant and racialised sex workers. Such systems are more prevalent in countries like the United States; however, it is difficult to obtain information on the exact level of use in the EU. Another example where AI use results in the disempowerment of sex workers is the algorithmic content moderation systems used by many platforms.
These systems disproportionately impact sex workers, as they often flag posts that mention ‘sex’, ‘sex work’, ‘hustling’ etc., without paying any attention to the context in which those words are used. For example, many sex worker-led organisations regularly lose their accounts because of the word ‘sex’ in their name or in the posts they publish. Often there are no redress mechanisms to challenge the decision, and valuable organisational resources are lost.

Data collection and processing practices also impact sex workers’ safety, privacy and wellbeing. An increasing number of key health and social services use data collection to improve their services. Avoiding surrendering data is almost impossible nowadays, as both online and offline services are increasingly dependent on it. Not all data collection is a threat to sex workers; however, normalising data collection and making it an entry point for the use of services can have a discriminatory impact on marginalised populations such as sex workers. Sex workers are heavily criminalised in many contexts and generally stigmatised, and it is important to understand that, for sex workers, online safety equals offline safety. Data collection therefore represents a great danger for sex workers because of the possibility of data leaks. Lastly, any potential risk a piece of technology brings is often exacerbated by structural issues such as racism, homophobia, sex work stigma and criminalisation, which pave the way for any technology to be used to undermine human rights.


Would you like to share anything you have learned so far?

Yigit: The recent debates at the European level (the DSA, the Council of Europe resolution on porn) clearly point to the priority of institutions, policy makers and tech companies: creating digital environments that limit harms to users, from hate speech, image-based violence and revenge porn to online child sexual exploitation and trafficking. These debates reflect discussions that have taken place for decades about the place of sex work in society. Trying to ban sex work through end-demand models such as the criminalisation of clients, or to abolish pornography in order to protect children or rescue victims (real or imagined), is a knee-jerk reaction to much more complex issues. Recently the UK had to reverse its anti-porn laws, created to ‘protect children’, as they were simply not enforceable. By not paying enough attention to the unintended consequences of hastily drafted and approved laws and resolutions, policy makers risk creating further dangers for marginalised communities such as sex workers, migrants, racialised people and LGBTI+ people. Privacy and freedom of expression are not just minority concerns: sex workers are often the canary in the coal mine, and attacks on our rights are frequently precursors of attacks on other communities. Anyone who cares about issues such as freedom of speech, anonymity, or the power of big tech should be alarmed by the treatment of all sex workers, including adult content creators.


