Access to the Internet in schools gives students many opportunities to practice independent research. However, it also requires teachers and administrators to place a great deal of trust in students, some of whom may not be responsible enough to handle such freedom. As a result, many states have mandated that content filters be used in public schools and libraries to better control the subject matter that students access on school-issued technology.

A content filter, or information filter, is a program that makes questionable, distracting, or inappropriate online material inaccessible. Many companies install these programs on office computers to keep employees focused on work and off social media. Similarly, schools install the software to keep students off sites that are prone to hacking, display pornography, or host other unsuitable content.

There are three types of content filters:

  • Blacklists are generally produced by private companies that compile lists of “bad” categories and buzzwords, which computers are programmed to block.
  • Content inspection is when a computer pre-examines a site for banned or suspicious activity (such as hacking, malware, etc.) and blocks it based on what is found.
  • Extension blocking prevents the downloading of programs and files (streaming media, executables, etc.) that could harm or slow down a computer.
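To make the blacklist and extension-blocking approaches concrete, here is a minimal sketch in Python. The keyword list and blocked extensions are illustrative placeholders, not any real vendor's lists, and a production filter would be far more sophisticated:

```python
from urllib.parse import urlparse

# Hypothetical examples of "bad" buzzwords a blacklist vendor might compile.
BLACKLISTED_KEYWORDS = {"gambling", "malware", "proxy"}
# Hypothetical file extensions flagged as risky downloads.
BLOCKED_EXTENSIONS = {".exe", ".torrent", ".bat"}

def is_blocked(url: str) -> bool:
    """Return True if the URL matches a blacklisted keyword or a blocked extension."""
    parsed = urlparse(url.lower())
    text = parsed.netloc + parsed.path
    # Blacklist check: block if any banned keyword appears in the host or path.
    if any(keyword in text for keyword in BLACKLISTED_KEYWORDS):
        return True
    # Extension blocking: block downloads of flagged file types.
    if any(parsed.path.endswith(ext) for ext in BLOCKED_EXTENSIONS):
        return True
    return False

print(is_blocked("http://example.com/casino-gambling"))   # True
print(is_blocked("http://files.example.com/setup.exe"))   # True
print(is_blocked("http://en.wikipedia.org/wiki/History")) # False
```

This simple string matching also illustrates the weakness discussed below: any site or keyword left off the lists passes straight through.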

With all of these options, content filtering would seem an effective way to protect students from the dangers of the Internet. However, these programs are often easily circumvented. Keywords and websites left off a blacklist remain accessible, and students can continue visiting them as long as no teacher or administrator catches them.

Many tech-savvy students use proxy servers to bypass content filtering programs, giving them access to whatever they wish online.

Students have access to plenty of technology, which gives them the upper hand in sidestepping even the most vigilant content filters. In such cases, there is little that can be done to monitor what students are doing on school computers. Additionally, these programs cannot monitor the words and phrases students actually type, which means students can log on to approved sites (such as chat rooms, email, etc.) and still send harmful material. With so much room for error, it is time to accept that content filters alone are not enough to protect students on the Internet.

Laura Jane Crocker

