The series has prompted conversations worldwide about the influence of social media on children and toxic online content, sparking debates in government and across schools about what can be done to protect young people.
Dr Lauren Burch, an expert in digital and social media communication and marketing, shares her insights on how normalised behaviour is contributing to online bullying becoming a continual problem; how the language we use and the way we communicate make moderating the issue more complex; and why parents need to be involved in their children’s online activity.
“Individuals are growing up with more and more technology - having immediate access to other individuals [online]. So it may be that sending a message to someone and not necessarily thinking about it is something that's just very commonplace now, whereas previously we might have held back or not sent that message in an instant. Our acceptance of technology and the way we utilise it makes it a continual problem.”
Although content moderation policies are in place across most social media platforms, Dr Burch explains how the language we use and the way we communicate online make the issue more complex.
“There are some interesting technical insights that go into this and what we can and cannot identify as abusive. If you’re using specific words within a message, for example, the word hate, that can be flagged. Homophobic or racist messaging can also be flagged. We're also getting better AI technology to help us identify this, but AI can still miss some things.
“We know that there's a huge gap in what we would refer to as incivility or intolerance, which can result in online bullying. So, say you use a word that isn't necessarily going to be caught by content moderation because it’s used in a very specific way. That can be dealt with using different types of technology and algorithms that don't necessarily use a dictionary definition of a word but instead look at sentence structure and the context the word is being used within.
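The contrast Dr Burch draws between dictionary-style flagging and context-aware moderation can be sketched in a few lines. This is a toy illustration, not any platform's actual system: the blocklist, the pronoun heuristic and the function names are all placeholders invented for the example.

```python
# Toy sketch of two moderation approaches. The word list and the
# "directed at a person" rule are illustrative placeholders only.

BLOCKLIST = {"hate"}  # dictionary approach: flag on exact word match
TARGET_PRONOUNS = {"you", "him", "her", "them"}

def keyword_flag(message: str) -> bool:
    """Flag a message if any blocklisted word appears anywhere in it."""
    words = [w.strip(".,!?") for w in message.lower().split()]
    return any(w in BLOCKLIST for w in words)

def context_flag(message: str) -> bool:
    """Flag only when a blocklisted word appears in an abusive context,
    approximated here by checking whether the next word targets a person."""
    words = [w.strip(".,!?") for w in message.lower().split()]
    for i, w in enumerate(words):
        if w in BLOCKLIST:
            nxt = words[i + 1] if i + 1 < len(words) else ""
            if nxt in TARGET_PRONOUNS:
                return True
    return False

print(keyword_flag("I hate mondays"))   # True: flagged despite being harmless
print(context_flag("I hate mondays"))   # False: not aimed at a person
print(context_flag("I hate you"))       # True: word plus targeting context
```

Real systems replace the one-word lookahead with trained classifiers over full sentence structure, but the trade-off is the same: the naive filter over-flags harmless uses, while the contextual one needs far more signal to decide.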
“Emojis are considered within this as well, so it becomes a complex issue in terms of natural language processing and how you identify their associated meaning. Adolescence highlights this brilliantly in terms of toxic masculinity on social media with the explanation behind specific emojis like the red pill or kidney beans.
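One way moderation tooling might surface coded emoji meanings is a lexicon lookup, sketched below. The two entries echo the examples discussed in the series; the mapping, glosses and function name are assumptions for illustration, and real systems maintain much larger, continually updated lexicons.

```python
# Toy emoji lexicon: maps characters to their coded meanings.
# Entries are illustrative only, echoing the examples from the series.
CODED_EMOJI = {
    "\U0001F48A": "pill emoji: 'red pill' manosphere reference",
    "\U0001FAD8": "beans emoji: coded incel slang",
}

def emoji_signals(message: str) -> list:
    """Return the coded meanings of any known emojis found in the message."""
    return [meaning for char, meaning in CODED_EMOJI.items() if char in message]

print(emoji_signals("see you later \U0001F48A"))
```

The hard part in practice is that these meanings shift and depend on surrounding context, which is why Dr Burch frames this as a natural language processing problem rather than a simple lookup.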
“The technology is there, and it can be implemented in better ways by the platforms. This is where partnerships between social media platforms and academia can be useful to identify how online abuse is communicated through language or symbols with this knowledge applied to enhance content moderation algorithms.”
Discussing the role TV series like Adolescence can play in raising awareness of these issues, and in supporting parents to make big decisions about their children's online activity, Lauren said:
“We’re seeing an increase in programmes that focus on online abuse, like Adolescence. Parents need to be aware that this behaviour exists. Their child could receive these types of messages. It's not out of the ordinary. We use devices to communicate so often in our daily lives. They are ingrained in everything we do, so that form of abuse is a very real possibility.
“Parents are having to decide whether or not they're going to allow their child to have social media - a very nuanced and complex, very individual decision, but something that people should think about.”
To arrange an interview with Dr Lauren Burch, email Dan Trussell or call 01509 228686.