STSM interview: Biases in online media

Online media is part of everyday life. Have you ever considered that news and social media posts can influence your mindset or world view? When we spend time online, the posts we encounter often try to persuade us that the poster’s point of view is the right one. The people behind those posts are influenced by their own biases, and their behaviour is shaped by their world view. Influencers also use emotional language to reach a persuasive goal.

One way of understanding the world is through stereotypes. However, stereotypes can become harmful when they do not allow for exceptions. We all have biases that are rooted in stereotypes about groups of people. To better understand how the online media landscape affects us, Barbara Lewandowska-Tomaszczyk proposes an approach to identifying and categorising speech biases. She is working towards a typology for identifying biases in online discourse via computational modelling. According to Lewandowska-Tomaszczyk, biases can be explicit or implicit, and they can be identified from language use and behaviour. They can show up in, for example, political discourse, the printed press, or social media. To identify biases, we need to look at the history and culture of a country or a group of people.

When it comes to online media, the tension between free speech and moderation comes to mind. Problems may arise, for example, in political discourse and political news, where biases can shape narratives and lead to facts being reported selectively. Biases can stem from distinct, and often conflicting, background narratives. They become more difficult to identify when they are only implied by these contrasting interpretations of facts. Lewandowska-Tomaszczyk would like media users’ communities to become more aware of the presence of biases in different media, as well as of the impact these biases can have.

Biases are connected to emotions, and harmful biases can lead to hate speech. The question is when biased speech crosses the line into being harmful. On the one hand, if the language is explicitly offensive, vulgar, or built on harmful generalisations about a group of people, the case is clear. On the other hand, according to Lewandowska-Tomaszczyk, implied and vague language can mask harmful biases and intentions. Less clear-cut cases would benefit from a public discussion about where the line for moderation and censorship lies. A debate about regulation versus free speech is closely related, but before we can get there, we need to be able to identify and conceptualise biased speech in online spaces. This is what Lewandowska-Tomaszczyk is working towards.

Before embarking on her Short Term Scientific Mission (STSM), Lewandowska-Tomaszczyk wanted to connect with people researching similar topics. During the LITHME Working Group 6 meetings she heard a talk by Dr Sviatlana Hoehn, a colleague at her future STSM host institution, and had an opportunity to discuss the topic and related issues. Lewandowska-Tomaszczyk is particularly interested in bias identification in social media texts and in the computational modelling of bias. She has also been working independently on the language of offence and insult, sees some overlap with the scope of bias research there, and wanted to explore this avenue as well.

“In fact, my expectations have been exceeded, as I met a group of highly competent colleagues during my STSM at the University of Luxembourg. We managed to submit a paper for a conference, with some prospects for its publication, and we are now working on other aspects of bias modelling.”

“Considering my really fruitful research experience and the openness of other researchers to share their knowledge during my STSM at the University of Luxembourg, I strongly recommend applying for STSMs to anybody who has an interest in, and a concrete plan for, research cooperation in particular domains of language in the human-computer context.”

Barbara Lewandowska-Tomaszczyk visited the University of Luxembourg for one week, and her scientific report will be published on the LITHME website in autumn 2022. Lewandowska-Tomaszczyk was interviewed by LITHME assistant Enni Kuusela.
