
Commentary: I was a cyberbullying victim — it motivates me to develop an AI tool to tackle online toxic speech

As a digital native, I have pretty much grown up my entire life with some form of technology.

To visualise toxic speech, imagine a person marching up and announcing to a crowd that a particular individual is “cringy” and deserves to die, says the author.



From televisions to Nintendo games to smartphones, I was always impressed by the new gadgets the world created, and like most teens, I started using social media at 13.

Despite my love for technology, I view it now as both a blessing and a curse.

I had fun making content and texting my friends and family, but I also found myself being exposed to “cancel culture” and toxic speech.

To visualise toxic speech, imagine a person marching up and announcing to a crowd that a particular individual is “cringy” and deserves to die.

To be frank, hardly anyone would expose themselves like that in person, unless infamy is what they seek.

Yet this happens frequently online, where users casually send hateful comments because they feel safe hiding behind anonymous profiles.

TWO SIDES TO TECHNOLOGY

Nevertheless, I would not deny the positives of technology. As the saying goes, every coin has two sides.

On the positive side, with increased global connectivity, online interactions have become the norm and it is now easier than ever to study remotely and connect with people for work or family.

On the negative side, people are now more vulnerable to personal attacks by anyone, anywhere.

The exponential growth of user-generated content platforms and online communication platforms has enabled people to share their personal views or chat with one another.

The number of “flame wars” has also risen: what begins as a disagreement of views escalates into heated arguments in which both parties fight fire with fire.

These parties may not even remember what they were debating about, resorting instead to personal attacks and sometimes even threats to win.

Witnessing these “flame wars” in my class chats and on social media throughout my youth only reinforced for me that hateful speech is undeniably destructive and violent.

MY EXPERIENCES WITH TOXIC SPEECH

Empathly co-founder Jamie Yau, 19, poses for a photo at Singapore Polytechnic on Nov 24, 2022.

I had my first experience with a form of toxic speech, cyberbullying, when I discovered that my friend had been saying mean things about me and posting them on her private social media account.

She described me as “annoying” and “talkative”, among other hateful comments that cannot be repeated here.

I felt betrayed and upset. “Is friendship something that can be so easily overwritten when one's emotions change? If she was upset with something, why didn’t she let me know?”

Looking back, I learnt something important: nothing online ever stays private.

So when we post targeted hateful comments online, we are neglecting the feelings of the "subject". That is why empathy is so important in our digital spaces.

The second time I was exposed to toxic speech was a story I heard from my good friend.

She had bad experiences with her colleagues: she showed me screenshots of her supervisor speaking to her in a condescending tone on a workplace communication platform and cyberbullying her in their private group channels.

I was incredibly taken aback. I felt frustrated and angry that she had to go through this terrible experience.

Although these were just two experiences, such feelings of unhappiness and hurt became a motivation. A motivation to change the way people use online platforms. A motivation and vision to make Singapore’s online space kinder and more empathetic.

Driven by these strong feelings, I did some research into how big the issue of online toxicity really is.

I found that 27 per cent of respondents in Singapore had been involved in online personal attacks in 2021, and that 24 per cent of workers in Singapore had experienced a form of workplace bullying in 2019.

With a better idea of the issue, I had a casual chat with my friend Timothy (now my co-founder) about my hope to restart online conversations and create a safer internet for all.

He too was passionate about mental health and empathy, and shared that he was also looking to cultivate online safety through a startup idea he had: Empathly.

The idea started off as a conversational moderator for schools and parents to use with children. After going back and forth about Empathly, we realised it had the potential to be much more.

A DIGITAL CONSCIENCE

After plenty of development, Empathly became an AI tool that aims to be the digital conscience for online conversations.

Empathly identifies the hateful comment before it is sent and provides a behavioural nudge to the user to encourage them to reconsider their comment.

If I type out a sexist comment such as “Women belong in the kitchen”, Empathly will issue a behavioural nudge: “Are you sure you want to send that? The other person could feel discriminated against.”

Our AI is able to identify three types of hate speech, violent, sexist and racist language, while taking context into account. It can also detect abusive terms in English, Hokkien, Cantonese, Malay and Singlish (Singaporean English).

Most solutions in the market only remove hateful comments after they are sent and the damage is done; we want to pre-emptively change people’s behaviour at its core.

With behavioural nudges, we aim to let users practise empathy by putting themselves in the receiver’s shoes.
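Empathly’s actual classifier is an AI model and its internals are not described here, but the nudge-before-send flow above can be illustrated with a toy sketch. The categories, keyword patterns and nudge wording below are illustrative assumptions only, not Empathly’s real rules:

```python
# Toy sketch of a pre-send "behavioural nudge" check.
# Assumption: a real system would use a trained, context-aware model;
# simple phrase matching stands in for it here for illustration.

NUDGES = {
    "sexist": "Are you sure you want to send that? "
              "The other person could feel discriminated against.",
    "violent": "Are you sure you want to send that? "
               "This could read as a threat.",
}

# Hypothetical example phrases per category (not a real lexicon).
TOXIC_PATTERNS = {
    "sexist": ["belong in the kitchen"],
    "violent": ["deserves to die"],
}

def check_before_send(message: str):
    """Return (ok_to_send, nudge). Runs before the message is posted,
    so the sender can reconsider while no damage has been done yet."""
    lowered = message.lower()
    for category, phrases in TOXIC_PATTERNS.items():
        if any(phrase in lowered for phrase in phrases):
            return False, NUDGES[category]
    return True, None

ok, nudge = check_before_send("Women belong in the kitchen")
# ok is False, so the client shows the nudge instead of sending
```

The key design point the sketch captures is timing: the check runs on the sender’s side before anything is posted, which is what separates a behavioural nudge from after-the-fact moderation.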

As young startup founders, we have had our fair share of challenges. It has been difficult to establish ourselves in the startup space because of our age.

Timothy and I like to think: “With adversity comes strength”.

To prove our value, we had to pull late nights, juggle schoolwork, and even do ad-hoc sales with customers and partners to bring Empathly to greater heights.

Our journey to create Empathly from an idea to a working solution in 2022 has not been easy.

We are still an early-stage startup, but we are ambitious and tenacious. Our vision has not faltered: to create a safer and kinder internet for us to work and play in.

We believe that we can make this change. Most importantly, we hope to chart the way for future like-minded youth entrepreneurs to step up and take ownership of the positive changes we want to see in our community.

ABOUT THE AUTHOR:

Jamie Yau, 19, is chief operating officer of Empathly. She is passionate about online safety, eradicating hate speech and aims to make the internet safer with her startup. This is an adapted version of a piece that first appeared in The Birthday Book: Restart, a collection of 57 essays on what it means to have a new start in Singapore.
