Queensland university develops algorithm to detect abuse of women on Twitter

A Queensland university has developed an algorithm to detect misogynistic Twitter posts.

Trained on tweets containing words abusive to women, such as whore, slut, and rape, the Queensland University of Technology’s machine-learning model can analyse millions of tweets and identify content targeting women, including threats of harm or sexual violence.

“Our machine-learning solution cuts through the raucous rabble on the Twittersphere and identifies hateful content,” computer scientist Professor Richi Nayak told AAP.

“We hope the model can be adopted by social media platforms to automatically identify and report it to protect women.”

QUT researchers developed a text mining system where the algorithm learns tweet-specific and abusive language as it goes.

“The key challenge in misogynistic tweet detection is understanding the context of a tweet,” Professor Nayak said.

“The complex and noisy nature of tweets makes that difficult.

“But, sadly, there’s no shortage of misogynistic data out there to work with.”

Prof Nayak said teaching a machine to differentiate context, without the help of tone, was key to this project’s success.

The QUT team built a deep learning algorithm, meaning the model can draw on its previous understanding of terminology and adjust its contextual and semantic understanding as it processes more text.

Researchers also ensured the algorithm could differentiate between abuse, sarcasm and friendly use of aggressive terminology as it built its vocabulary.

That’s tough to do because natural language is constantly evolving, Prof Nayak said.

“Take the phrase ‘get back to the kitchen’ as an example – devoid of context of structural inequality, a machine’s literal interpretation could miss the misogynistic meaning,” she said.

“But seen with the understanding of what constitutes abusive or misogynistic language, it can be identified as a misogynistic tweet.”
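For readers curious what training a deep-learning tweet classifier of this general kind can look like, the sketch below fine-tunes an off-the-shelf language model on labelled examples using the open-source Hugging Face Transformers library. The base model, example tweets, labels and training settings are illustrative assumptions only; this is not the QUT team’s actual system, which is described in the published research.

# A minimal, illustrative sketch only: the base model, example tweets, labels and
# training settings are assumptions for demonstration, not the QUT system.
from datasets import Dataset
from transformers import (AutoTokenizer, AutoModelForSequenceClassification,
                          Trainer, TrainingArguments)

# Hypothetical labelled tweets: 1 = misogynistic, 0 = not misogynistic.
data = Dataset.from_dict({
    "text": ["get back to the kitchen", "just cooked a great dinner in the kitchen"],
    "label": [1, 0],
})

base_model = "distilbert-base-uncased"  # assumed off-the-shelf pretrained model
tokenizer = AutoTokenizer.from_pretrained(base_model)
model = AutoModelForSequenceClassification.from_pretrained(base_model, num_labels=2)

def tokenize(batch):
    # Convert raw tweet text into fixed-length token IDs the model can read.
    return tokenizer(batch["text"], truncation=True, padding="max_length", max_length=64)

data = data.map(tokenize, batched=True)

# Fine-tuning adjusts the model's contextual understanding using the labelled examples.
trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="tweet-classifier", num_train_epochs=1,
                           per_device_train_batch_size=8),
    train_dataset=data,
)
trainer.train()

In a sketch like this, the classifier inherits a general grasp of word context from the pretrained model and then learns from the labelled tweets which usages count as abusive, which is the kind of contextual distinction the researchers describe.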

Prof Nayak said the model could also be expanded to identify racism, homophobia, or abuse toward people with disabilities.

“If we can make identifying and removing this content easier, that can help create a safer online space for all users,” she said.

QUT’s faculties of Science and Engineering and of Law, together with its Digital Media Research Centre, published the research on the digital science platform Springer on Friday.
