Over the past decade, social media has become increasingly important and influential. Some of this has a positive impact – raising awareness of policies and campaigns, for example, and helping people stay connected through lockdowns. Yet, as our research confirms, social media also plays a role in amplifying old and new forms of discrimination, hate and abuse.
Uncovering the unknown
This can be hard to detect, as users can mask racism through memes and emojis, and hide behind fake online personas to spread hate while avoiding consequences for their actions. As a group of young peer research trainees, we therefore set ourselves a project to better understand the prevalence and impact of racism on social media, hearing the experiences of people of colour (POC).
The social media space is fast evolving, and we identified gaps in research and evidence regarding experiences of online racism. We found that most of the existing literature focuses either on adults or on young children – so there is a need for research around the experiences of young adults and racism in online spaces. There’s also little research into how young people report online racial abuse.
‘The changing face of racism’
As a group of researchers, we are all POC, and we are motivated to shed light on this issue, untangling the complex picture of racism that exists in both the physical and the online world today.
The 2021 report by the Commission on Race and Ethnic Disparities fails to consider the changing face of racism which, when it moves online, is often ‘covert’: disguised as banter or conveyed through implicit posts littered with emojis that often go undetected by content-moderation processes.
As POC using social media, our personal experiences do not reflect a welcoming, multi-racial society – and neither do those of the POC we interviewed in this research. Activity during and after Euro 2020 is a good example of this: three black footballers received more than 1,622 racist comments after they missed penalties in the final’s shoot-out. Britain was revealed as by far the largest country of origin for the abuse.
Time for change
With the introduction of the Online Safety Bill (2021), we believe our research is timely and much needed, as many online platforms look to improve their safeguarding for users. The findings can benefit online platforms by highlighting the weaknesses of social media design and providing detail on how they can be addressed to create a healthier environment online.
Disha Das, a member of our Kickstart peer research team, comments: “As a Pakistani woman, I have experienced racist comments from people of various backgrounds. Whether it’s denying my Pakistani heritage or placing me in the forefront of colourist remarks, they have all been masked through ‘banter’, online and offline.
“Offensive imagery and content online is readily available and easy to share. Without even using words, another user can racially abuse me online. I wanted to conduct this piece of research to validate my experiences as a person of colour online and shed light on the experiences of others who have been silenced by ineffective moderation strategies on social media platforms, to help build a better community online for young people.”
Reflecting on the study
Publishing today, this study uncovers the experiences of a small group of young people who have experienced racism online. As Nora Zia, another member of the Kickstart peer research team that directed and conducted this research, reflects: “As someone who uses social media platforms for political engagement and to stay informed or raise awareness, I was interested to see what role social media had in spreading racism.”
However, as a team, we faced some challenges in compiling our evidence, including social media platforms flagging our research recruitment post as ‘too political’. This hindered our progress, and meant we had to rely on our own networks in order to recruit participants. It was also frustrating because, as social media users ourselves, we see ‘passive-aggressive’ and covertly racist content that is often left online. Social media platforms do not remove content that isn’t deemed to violate their guidelines, and these can lack the nuance to detect covert racism, such as racism using memes and emojis.
Another challenge came in analysing the data. While most of the data was consistent across participants, and consistent with previous literature, one survey submission contrasted heavily with the responses of other participants. This one participant commented: ‘No one is a ‘victim’ of [racism on social media] – unless they want to be’. (This was submitted through anonymous data collection, while our study itself highlights the misuse of online anonymity.) As researchers, we felt this response sought to invalidate the experiences of other young people of colour.
Both challenges emphasise the importance of presenting research around the topic of racism and continuing to give POC a platform to voice their experiences.
Valuable insights
This project is important as it explores how racism is presented through social media. It suggests flaws in social media design, and reflects on how these could be improved by recognising and addressing covert racism, such as the use of offensive emojis. This could benefit social media design in future, and also highlights the importance of education in addressing ignorance around acceptance of different cultures and identities.
This article and the accompanying report were written, devised and directed by Nora Zia, Disha Das, Jonah Celestin and Sharpay Salazar-Turner, a group of Kickstart peer researchers recruited and supported by The Young Foundation. The Kickstart Scheme provides opportunities and training for young people deemed at risk of unemployment.