Unintended bias evaluation: an analysis of hate speech detection and gender bias mitigation on social media using ensemble learning.

NASCIMENTO, Francimara, CAVALCANTI, George and DA COSTA ABREU, Marjory (2022). Unintended bias evaluation: an analysis of hate speech detection and gender bias mitigation on social media using ensemble learning. Expert Systems with Applications, 201: 117032. [Article]

Documents
Da Costa-Abreu-UnintendedBiasEvaluation(AM).pdf - Accepted Version
Available under License Creative Commons Attribution Non-commercial No Derivatives.
Abstract
Hate speech on online social media platforms has reached a level that governments, media outlets, and scientists consider a serious concern, especially because it spreads easily, harms individuals and society, and has become virtually impossible to tackle using human analysis alone. Automatic approaches using machine learning and natural language processing are helpful for detection. For such applications, amongst several different approaches, it is essential to investigate a system's robustness to biases towards identity terms (for example, gender, race, or religion). In this work, we analyse gender bias in different datasets and propose an ensemble learning approach for hate speech detection based on different feature spaces, with the aim that the model can learn from different abstractions of the problem, evaluated using unintended bias evaluation metrics. We used nine different feature spaces to train the pool of classifiers and evaluated our approach on a publicly available corpus; our results demonstrate its effectiveness compared to state-of-the-art solutions.
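The core idea in the abstract — a pool of classifiers, each trained on a different feature space, combined into one ensemble — can be sketched as follows. This is a minimal illustration, not the paper's actual pipeline: the feature spaces shown (word unigrams, word bigrams, character n-grams), the base classifier (logistic regression), the soft-voting combiner, and the toy data are all assumptions chosen for brevity; the paper uses nine feature spaces and its own datasets.

```python
# Hedged sketch: an ensemble whose members are trained on different
# feature spaces (here, three TF-IDF variants) and combined by soft voting.
# The feature spaces, classifier, and toy data are illustrative only.
from sklearn.pipeline import Pipeline
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import VotingClassifier

# Tiny toy corpus: label 1 = hateful, 0 = benign (placeholder data).
texts = ["you are awful", "have a nice day",
         "awful hateful words", "nice kind words"]
labels = [1, 0, 1, 0]

def member(name, **tfidf_kwargs):
    """One pool member: its own feature space + a base classifier."""
    return (name, Pipeline([
        ("tfidf", TfidfVectorizer(**tfidf_kwargs)),
        ("clf", LogisticRegression(max_iter=1000)),
    ]))

# Each member sees a different abstraction of the same text.
ensemble = VotingClassifier(
    estimators=[
        member("word_unigrams", analyzer="word", ngram_range=(1, 1)),
        member("word_bigrams", analyzer="word", ngram_range=(1, 2)),
        member("char_ngrams", analyzer="char_wb", ngram_range=(2, 4)),
    ],
    voting="soft",  # average the members' predicted probabilities
)
ensemble.fit(texts, labels)
preds = ensemble.predict(["awful hateful", "nice day"])
```

In the paper's setting, each member's feature space would be one of the nine representations studied, and the ensemble's outputs would additionally be scored with unintended bias evaluation metrics on identity-term subgroups rather than plain accuracy alone.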