The presence of the Internet in schools provides many opportunities for students to practice independent research. However, it also requires teachers and administrators to place a great deal of trust in students, some of whom may not be responsible enough to handle such freedom. As a result, many states have mandated that content filters be used in public schools and libraries to better control the subject matter that students access on school-issued technology.
A content filter, or information filter, is a program that makes questionable, distracting or inappropriate online material inaccessible. Many companies install these programs on office computers to keep employees focused and off social media. Similarly, schools install the software to keep students away from sites that are prone to hacking, display pornography or host other unsuitable content.
There are three types of content filters: blacklists, content inspection and extension blocking. Blacklists are generally produced by private companies that compile lists of "bad" categories and buzzwords that computers are programmed to block. Content inspection occurs when a computer pre-examines sites for banned or suspicious activity (such as hacking, malware, etc.) and blocks them based on what it finds. Extension blocking inhibits the downloading of programs and files (streaming services, executables, etc.) that have the potential to harm or slow down a computer.
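To make these ideas concrete, here is a minimal sketch of how a blacklist check and extension blocking might be combined in code. The category names, keywords and file extensions below are hypothetical examples, not taken from any real filtering product:

```python
# Illustrative sketch of a blacklist-style content filter.
# All keywords, categories, and extensions are hypothetical examples.
BLACKLIST_KEYWORDS = {
    "gambling": ["casino", "poker"],
    "malware": ["keygen", "crack-download"],
}

BLOCKED_EXTENSIONS = {".exe", ".torrent"}


def is_blocked(url: str) -> bool:
    """Return True if the URL matches a banned keyword or file extension."""
    lowered = url.lower()
    # Blacklist check: any keyword from any category blocks the request.
    for keywords in BLACKLIST_KEYWORDS.values():
        if any(word in lowered for word in keywords):
            return True
    # Extension blocking: stop downloads of potentially harmful file types.
    return any(lowered.endswith(ext) for ext in BLOCKED_EXTENSIONS)


print(is_blocked("http://example.com/casino-games"))   # True (keyword match)
print(is_blocked("http://example.com/homework.html"))  # False (no match)
```

Even this simple sketch hints at the weakness discussed below: any keyword or extension left off the lists passes through unchallenged.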
With all of these options, it would seem that content filtering is an effective method for protecting students from the harm the Internet can do. However, these programs are often easily circumvented. Certain keywords and websites can be left off of blacklists, enabling students to continue accessing them so long as no teachers or administrators catch them. Many tech-savvy students can also use proxy servers to bypass content filtering programs, giving them access to whatever they wish.
Students have access to plenty of technology, which gives them the upper hand in sidestepping even the most vigilant content filters. In such cases, there is little that can be done to monitor what students are doing on computers at school. Additionally, these programs lack the ability to monitor what students type for harmful words and phrases. That means students can log on to approved services (such as chat rooms, email, etc.) and continue sending harmful material. With all of this room for error, it is time to accept that content filters alone may not be enough to protect students from the Internet.
Laura Jane Crocker