Four Ways Online Games Can Prevent Cyberbullying

This post was published on the now-closed HuffPost Contributor platform. Contributors control their own work and posted freely to our site.

Let's talk about something serious: bullying. Although it has been around since time immemorial, bullying remains widespread, and it has become particularly prevalent online.

Ubiquitous internet access via smartphones, computers, and tablets has made it easier for people to anonymously bully others. Cyberbullying can have devastating effects on victims, including depression and physical pain. It can even lead to suicide.

We're all familiar with adults bullying each other on Twitter, Facebook and other social channels, particularly during election season. But we also see it increasingly impacting the lives of kids and teens; in fact, 68 percent of teens agree that cyberbullying is a serious problem.

This is an important child safety issue, and the responsibility for creating solutions doesn't fall exclusively to any one group. Parents are primarily responsible for monitoring their children's online activities, but game developers can also play a major role in the prevention of online harassment. Below are four ways developers of online kids' games can better prevent cyberbullying.

Implement a modern content filtering system

Kids are creative, and they will find ways around dated chat filters. Game developers must therefore stay current with the latest chat analysis and filtering technology. Find a solution that can handle day-to-day chat and content filtering, such as Community Sift, an AI- and heuristics-based smart content filter for businesses and communities. Tools like Community Sift integrate into existing systems and can do much of the heavy lifting to keep bullies at bay.

Hire in-game moderators to monitor kids' interactions

As innovative as machine learning and artificial intelligence are, humans still catch things that computers cannot. For this reason, professionally trained in-game moderators play an important role in preventing bullying. In Animal Jam, a top social network for kids, players have assigned new meanings to words ("scammed" becomes "scalloped," for example). This adds a layer of complexity that chat filters can miss. And it's not just words: inappropriate visual content is even harder to catch with automation alone. A good example comes from Animal Jam's in-game "Masterpiece" feature.

"Masterpieces are artworks that kids can create and display within the Animal Jam world," said Clark Stacey, CEO of WildWorks, the creator of Animal Jam. "Our community has submitted over a million Masterpieces, which are each reviewed individually by human moderators. These moderators make sure that any inappropriate content submitted never appears in the game."

Give parents easy access to their child's account

Developers should treat parents as partners when combating online bullies, especially since kids are unlikely to tell a parent that they're being bullied. Automated email updates, parent dashboards, and readily accessible customer support are a few solutions developers can offer to keep parents involved in their child's online activity.

Foster dialogue about bullying and safety

Perhaps the most important bullying prevention tool for game companies is consistent communication with their players. Developers should define acceptable behavior with clear rules and give players a simple mechanism for reporting violations.
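As a concrete (and entirely hypothetical) sketch, a minimal in-game report mechanism needs little more than a structured record and a queue that moderators can triage by urgency. The field names and priority levels below are illustrative assumptions, not taken from any real game's API:

```python
# Hypothetical player-report pipeline; fields and priorities are
# illustrative only. Lower priority number = more urgent.
from dataclasses import dataclass, field
from datetime import datetime, timezone
import heapq

@dataclass(order=True)
class Report:
    priority: int
    created: datetime = field(compare=False)
    reporter_id: str = field(compare=False)
    target_id: str = field(compare=False)
    reason: str = field(compare=False)

class ReportQueue:
    """Moderators always pop the most urgent report first."""
    def __init__(self):
        self._heap = []

    def submit(self, reporter_id, target_id, reason, priority=5):
        report = Report(priority, datetime.now(timezone.utc),
                        reporter_id, target_id, reason)
        heapq.heappush(self._heap, report)

    def next_report(self):
        return heapq.heappop(self._heap) if self._heap else None

queue = ReportQueue()
queue.submit("player123", "player456", "name-calling", priority=3)
queue.submit("player789", "player456", "threats", priority=1)
print(queue.next_report().reason)  # "threats" surfaces first
```

The design choice here is simply that severity, not arrival order, decides what a human moderator sees next, which keeps the most harmful behavior from waiting behind routine reports.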

As players take to social media to voice concerns, dedicated customer service is more important than ever. Nearly half of consumers expect a response from companies within an hour on social media. Prioritizing responses to bullying-related inquiries can reassure concerned parents and players that the developer takes bullying seriously.

Online bullying isn't going away, but game companies can mitigate it with good practices like adopting content filtering technology, employing human moderators, and partnering with parents.
