Recently, media coverage of the role technology plays in students’ mental health and self-image has soared. In public hearings, lawmakers have exposed the ways social media harms teen mental health and body image. Many have even accused Instagram and its parent company, Meta, of allowing content that encourages disordered eating. A recent article in the New York Times detailed another problem: the profusion of explicit images and videos on the internet. Shockingly, the Times revealed that students frequently access online pornography on school-owned devices.
In her New York Times article, Cecilia Kang reports the results of a 2022 survey conducted by Common Sense Media. According to the survey, 75% of teenagers have seen explicit images on the internet by the age of 17, and most first encountered online pornography at age twelve. These encounters were often accidental: Kang states that 58% of respondents came across explicit images not by seeking them out but through search engines or social media posts. Schools rely on content filters to keep such content off school computers, but Kang’s report shows that content filters are not working. In fact, the survey found that 41% of respondents “had seen images of nudity or sexual acts online during the school day.”
LearnSafe Detects Online Pornography on School-Owned Devices
The Children’s Internet Protection Act (CIPA) requires schools to monitor technology use by both students and staff. Many schools rely on content filters for CIPA compliance. However, students can easily bypass content filters, and Kang’s reporting highlights that filters alone are not always effective. To fully protect students, schools need to pair content filters with screen monitoring software. Screen monitoring software, like LearnSafe, works with filters to protect students from explicit images. LearnSafe can detect student access to online pornography on school-owned devices, and it can also detect instances of cyberbullying and grooming, keeping students safe online.
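For readers curious why filtering and monitoring differ, here is a minimal, purely illustrative sketch in Python. It is a hypothetical toy example, not LearnSafe’s actual implementation: the domain list, term list, and function names are invented for illustration. It shows how a URL filter checks only the address requested, so a proxy or mirror can slip past it, while content-based monitoring inspects what actually appears on screen.

```python
# Hypothetical sketch: why URL filtering alone can fail, and how
# content-based monitoring complements it. All names and lists here
# are invented placeholders, not LearnSafe's real code.

BLOCKED_DOMAINS = {"example-explicit-site.com"}   # placeholder block list
FLAGGED_TERMS = {"explicit", "nudity"}            # placeholder term list

def url_filter_allows(url: str) -> bool:
    """A URL filter only checks the address being requested."""
    domain = url.split("//")[-1].split("/")[0]
    return domain not in BLOCKED_DOMAINS

def content_monitor_flags(screen_text: str) -> bool:
    """A content monitor inspects the text actually rendered on screen,
    regardless of which URL delivered it."""
    text = screen_text.lower()
    return any(term in text for term in FLAGGED_TERMS)

# A proxy rewrites the address, so the URL filter lets it through...
print(url_filter_allows("https://proxy-mirror.net/example-explicit-site.com"))  # True
# ...but the monitor still flags the rendered content.
print(content_monitor_flags("Warning: explicit content ahead"))  # True
```

In this toy model, the filter passes the proxied request because the blocked domain never appears in the address, while the monitor flags the content itself; pairing the two closes the gap the survey describes.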