Joseph Kwarteng's Spotlight Story on Misogynoir
Kiran Parmar, Wednesday 25 August 2021
What could you tell us about your research?
My research is about intersectionality in hate speech detection, focusing on "Misogynoir" - a particular kind of hate that Black women experience. Moya Bailey coined the term "Misogynoir" to describe "the specific hatred, dislike, distrust, and prejudice directed toward Black women". It is misogyny directed at Black women at the intersection of race and gender. The concept of misogynoir is exclusive to Black womanhood, and women of other races cannot experience it, but individuals of any gender or ethnicity can perpetuate it. For instance, the hypersexualisation of Black women and the stereotypes that characterise Black women as angry, unreasonable, or extraordinarily strong are examples of misogynoir that impact the health, safety and well-being of Black women and girls. For example, actress and comedian Leslie Jones had to quit the social media platform Twitter after receiving a disproportionate number of hateful remarks in reaction to her role in the all-female Ghostbusters film remake. A qualitative content analysis of the abusive comments revealed multiple forms of misogynoir, including remarks about Jones's attractiveness to men or perceived "masculine" features - gendered and racialised abuse that Black women are more likely to experience.
What exactly are you researching?
My research aims to explore how misogynoir manifests online and how it could be mitigated, since existing technologies for hate speech detection do not accurately identify this type of hate or protect Black women accordingly. The methodology we are considering for this research is multidisciplinary, combining social, computational and linguistic approaches.
Why is this important?
This study is vital because these targeted abuses can destroy lives and cause people to lose their self-esteem and the confidence to speak up. According to an Amnesty International survey, 41 per cent of women who had encountered online bullying or violence said that, on at least one occasion, these online encounters made them feel that their physical safety was threatened. Moreover, although most online social networking platforms ban hate speech, the sheer scale of their networks and web applications makes it very difficult to monitor all of their content. While these platforms have created automated techniques for detecting hate speech, research has shown that these approaches do not perform as effectively for specific marginalised groups, such as Black women, or for types of hate speech that are intersectional, such as Islamophobia and Antisemitism. Therefore, exploring ways to help platforms reduce the amount of hate Black women receive is crucial.
What would be your predictions for your study area in the future?
My hope for this study is to produce a state-of-the-art detection model that recognises the different types and characteristics of misogynoir - an automated model that can accurately identify misogynoir online. In addition, the study will help inform policies around hateful conduct online and expand knowledge around intersectional hate.
What is something you have learned about that you did not expect to?
This research has helped me recognise that multiple social categorisations, like race, gender, class and others, combine to produce distinct forms of oppression. Moreover, misogynoir is not synonymous with racism or misogyny as single components but instead exists at the intersection of the two, in ways unique to Black women. It is very contextual, and we therefore need to educate ourselves about it and learn to recognise it.