Cyberbullying is a real concern for students, parents and school administrators alike. But what are the most common sites for cyberbullying? And how can you protect your students from them?


Facebook

Many teens post to Facebook seeking positive affirmation. This makes them vulnerable to cyberbullying. Often, bullies create fake accounts by signing up under a random name and using a stolen photo as their profile picture. The now-anonymous bully can then access an unsuspecting victim’s page. This anonymity makes Facebook one of the most common sites for cyberbullying. On Facebook, cyberbullying most often takes the form of private messages, comments and emoji reactions.

Facebook has improved its anti-harassment and anti-bullying tools, strengthened its community standards against bullying and even created a Bullying Prevention Hub. However, parents and teachers should still remain vigilant. Discussing options such as unfriending, blocking or reporting the bully may be helpful. Also, spread the word about profile privacy settings, which can greatly decrease the risk of bullying.

To view more of Facebook’s privacy settings and options, go here.


Twitter

Twitter requires only an email address to create an account. After joining the platform, cyberbullies can tweet at and direct message their victims at will.

Twitter added new anti-cyberbullying features in 2017. Users can now filter the notifications they receive, mute certain keywords or phrases and see the status of the reports they’ve made. Any accounts or tweets a user has reported will appear lower in their feed and won’t turn up in their search results.

If reported, cyberbullies may receive a Twitter “time out,” meaning they’re suspended from the site for a certain period of time. If an account is banned outright, Twitter blocks the user from creating another account.

For more on Twitter’s anti-bullying stance, click here.


YouTube

Cyberbullies use YouTube to send messages, post comments and even mock people with videos of their own.

YouTube has the usual tools, such as reporting, deleting and disabling comments. Reported users receive a strike, which is akin to a warning. A second strike within three months brings a two-week ban, and a third strike in that same window results in account termination.

Additionally, YouTube offers a unique tool: Restricted Mode. Parents or school administrators can use Restricted Mode to block comments on videos.

For more information on YouTube’s policies, click here.


Reddit

Reddit contains communities called subreddits. Each subreddit (sometimes called a “sub”) focuses on a specific topic. Users can then communicate in a forum-like atmosphere, trading public messages in accordance with the subreddit’s rules.

In the past, Reddit accounts were completely anonymous, and blatantly offensive subreddits were free to flourish. There were subreddits dedicated to racism, white supremacy, misogyny and homophobia. In addition, some users created subreddits specifically to bully groups of people.

Recently, Reddit strengthened its anti-harassment rules and banned several subreddits for harassment and bullying. Users may now block and report anonymously. However, Reddit requires that a certain number of reports be received before it begins investigating. Once reported, individual user accounts may be suspended or banned, and entire subreddits may be quarantined or banned if they are offensive to “the average Redditor.”

For more information on Reddit’s policies, click here.


Text by Jennie Tippett