In a recent press announcement, Internet service providers AOL, Yahoo!, Microsoft, EarthLink, and United Online announced that they have formed a Technology Coalition with the National Center for Missing and Exploited Children to battle child exploitation on the Internet. This latest effort at computer-industry self-regulation by some of the largest ISPs suggests that the fight against child exploitation may soon migrate to the workplace.
Executives responsible for determining electronic use policies at private and public companies may want to consider reviewing and adopting measures taken by the coalition.
Computer Industry Self-Regulation
Historically -- due to commercial and privacy concerns -- the computer industry has been reluctant to pledge the level of cooperation identified in the recent announcement. ISPs have, however, provided measured assistance to law enforcement efforts over the years. For example, since 1996 ISPs have removed over 20,000 pornographic images of children from the Web.
Under current federal law, the Child Protection and Sexual Predator Punishment Act of 1998, ISPs are required to report known incidents of child pornography to the authorities. Although ISPs are not required by law to actively monitor customers or sites, members of the Technology Coalition have expressed their intent to do so.
The Technology Coalition's "aggressive new campaign" has four principal objectives:
- Developing and implementing technology solutions that can detect and disrupt the known images of child pornography on the Internet;
- Improving knowledge sharing among industry by establishing a central clearinghouse for known images of child pornography;
- Improving law enforcement tools that can assist in the location and identification of predators and distributors of child pornography; and
- Researching perpetrators and developing technologies to enhance industry efforts.
Although specific program details of the new Technology Coalition are not available, the group's objectives indicate a proactive approach to the problem. According to an executive at AOL, such measures may include scanning email attachments, web uploads, and instant messages.
In addition to industry self-regulation, workplace responses have been identified by law enforcement as a major strategy to reduce child pornography on the Internet. Many workplaces maintain their own web and email servers that allow employees to access and store data on their work computers. As a result, work computers are frequently implicated in incidents involving child pornography.
Child exploitation is not confined to a single profession or industry. In the United States alone, roughly 1,000 individuals are convicted of child exploitation crimes each year. There is accordingly an abundance of judicial precedent establishing that employees in both the public and private sectors have no legitimate, objectively reasonable expectation of privacy from their employer when it comes to their work computers.
Employers who wish to participate in the fight against child exploitation will soon have the tools to do so, courtesy of the Technology Coalition. Microsoft is among the ISPs in the coalition, and it is entirely possible that anti-child pornography tools could be integrated as features in the new Windows Vista operating system, or come bundled in a Microsoft Outlook or Explorer security update pack.
There are several workplace strategies recommended by the Department of Justice which may be implemented to deter and alter the behavior of potential offenders:
- Adopting and enforcing computer use policies.
Corporate use policies often contain broad prohibitions on pornography and on sending photo attachments by email. In organizations that regularly use images for legitimate purposes, such a blanket prohibition may not be practical. Companies need to balance workplace effectiveness with their responsibility to the community.
- Auditing computer usage.
Many employers already have ways to audit computer usage. The coalition's expected technical solutions include scanning email and email attachments for known images of child pornography. Such a system would include some form of image tagging, updated on a regular basis. It seems likely that workplace system administrators could install the updates and scan their mail servers' incoming and outgoing communications for offending materials.
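The coalition has not published how its image matching would work. A common approach to flagging known files is to compare a cryptographic digest of each attachment against a regularly updated list of digests of known images. A minimal sketch in Python, with the blocklist contents and function name purely hypothetical:

```python
import hashlib

# Hypothetical blocklist of SHA-256 digests of known images. In a real
# deployment this set would be distributed by a central clearinghouse
# and refreshed on a regular schedule. (The digest below is a
# placeholder: the SHA-256 of the bytes b"foo".)
KNOWN_IMAGE_DIGESTS = {
    "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae",
}

def attachment_is_flagged(attachment_bytes: bytes) -> bool:
    """Return True if the attachment exactly matches a known image."""
    digest = hashlib.sha256(attachment_bytes).hexdigest()
    return digest in KNOWN_IMAGE_DIGESTS
```

Note that exact-digest matching catches only byte-identical copies; a resized or re-encoded image produces a different digest, which is one reason the coalition's objectives include researching better detection technologies.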
- Filtering web usage.
Employers routinely engage in web-filtering practices that restrict the sites employees may visit. For example, some workplaces block access to free web email accounts so that employees cannot circumvent the company's official email systems.
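As an illustration of this kind of filtering, a workplace proxy might check each requested URL's host against a domain blocklist, matching subdomains as well as exact hosts. A minimal sketch (the blocked domains are hypothetical examples):

```python
from urllib.parse import urlparse

# Hypothetical blocklist; real deployments typically use
# vendor-maintained category lists updated on a schedule.
BLOCKED_DOMAINS = {"mail.example.com", "webmail.example.net"}

def is_blocked(url: str) -> bool:
    """Block a URL whose host is a blocked domain or a subdomain of one."""
    host = (urlparse(url).hostname or "").lower()
    return any(host == d or host.endswith("." + d) for d in BLOCKED_DOMAINS)
```

Matching on the suffix `"." + domain` rather than a bare substring prevents, say, `notmail.example.com.evil.org` from being treated the same as the listed domain.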
If your workplace needs a compelling reason to implement technologies recommended by the coalition, consider that the campaign against child exploitation is a key priority of federal law enforcement, the judiciary, and the legislature.
Over the past twenty years, the courts have issued a string of opinions that have paved the way for private industry's action against child pornography. In New York v. Ferber, the Supreme Court held that child pornography does not enjoy First Amendment protection under the Constitution. In United States v. Dost, a federal district court expanded the working definition of child pornography to include sexually suggestive depictions of a lascivious nature. In Osborne v. Ohio, the Supreme Court held that even the private possession of child pornography may be outlawed.
In addition to the judiciary's prescriptions against child pornography, there has been almost thirty years of federal legislation targeting child exploitation, including:
- The Sexual Exploitation of Children Act of 1978
- The Child Protection Act of 1984
- The Child Protection and Obscenity Enforcement Act of 1988
- The Child Pornography Prevention Act of 1996
- The Child Protection and Sexual Predator Punishment Act of 1998
As noted above, details of the industry initiatives have not been fully disclosed. However, the general idea of a central database of known pornographic images of children does raise privacy and due process questions. These potential issues become especially serious if the database will be used to detect, identify and criminally prosecute individuals who may or may not be actually engaged in illegal activities.
One such issue involves "virtual child pornography" -- computer-generated images, or adult models who appear to be minors. The Supreme Court recently addressed virtual child pornography in Ashcroft v. Free Speech Coalition, holding that virtual images of children are not included in the definition of child pornography. What if such a virtual depiction is wrongfully identified and included in the database? The issue of "false positives" looms as a very real hurdle for any database of child pornography.
The industry coalition could avoid this issue by implementing an administrative appeals process or some other way to remove false positives from the database. While a due process argument would not work against private industry ISPs, who are not state actors, the argument could be raised in the likely event that the industry collaborated with a government agency to populate the image database.
Of course, the vast majority of employers would not care whether or not a particular sexually explicit image is child pornography. In either event, an employee caught with such an image would be violating a broadly drafted computer use policy's prohibition on pornography.
"False positives," or images mistakenly marked as child pornography, raise a more serious issue for employees. For example, detection of a false positive could mean the difference between censure from the employer and criminal prosecution for possession of child pornography.
While elimination of child pornography from the Internet is impossible, workplaces can take steps to reduce the volume by making it more difficult to access.