Online child exploitation is a horrific crime that requires an effective response. A draft bill, first proposed by Sen. Lindsey Graham (R-SC) in January, intends to provide exactly that. However, technology experts warn that the bill not only fails to meet the challenge but also creates new problems of its own. My job is to enable journalists to do their work securely — to communicate with others, research sensitive stories and publish hard-hitting news. This bill significantly harms journalists’ ability to protect their sources.
Under the Eliminating Abusive and Rampant Neglect of Interactive Technologies (or EARN IT) Act, a government commission would define best practices for how technology companies should combat this type of material. On the surface, EARN IT proposes an impactful approach. A New York Times investigation in September found that “many tech companies failed to adequately police sexual abuse imagery on their platforms.” The investigation highlighted features, offered by these companies, that provide “digital hiding places for perpetrators.”
In reality, the criticized features are exactly the same ones that protect our privacy online. They help us read The Washington Post in private and ensure we see only authentic content created by its journalists. They allow us to communicate with each other. They empower us to express ourselves. And they enable us to connect with journalists so the truth can make the page. This raises the question of whether the bill will primarily protect children or primarily undermine free speech online.
It should be pointed out that EARN IT does not try to ban the use of these features. In fact, the bill does not specifically mention them at all. But if we look at how companies would apply the “best practices,” it becomes clear that the government intends to make these features difficult to provide and to discourage companies from offering them or expanding their use. By accepting EARN IT, we will give up our ability — and our children’s future ability — to enjoy online, social, connected and private lives.
Four of the “best practices” relate to requiring companies to have the ability to “identify” child sexual abuse material. Unfortunately, it’s not possible to identify this material without also having the ability to identify any and all other types of material — like a journalist communicating with a source, an activist sharing a controversial opinion or a doctor trying to raise the alarm about the coronavirus. Nothing prevents the government from later expanding the bill to cover other illegal acts, such as violence or drugs. And what happens when foreign governments want to have a say in what is “legal” and what is not?
Our digital life is protected by the same features that allow some bad people to do bad things online. They protect us as we visit The Washington Post website, use the Signal app to contact one of its journalists or use the Tor Browser to submit information to their anonymous tip line. These features all enable privacy, a core component of the journalistic process. They enable journalists to pursue and tell the truth, without fear or favor. And not just in the U.S., but globally. We should empower and enable this work, not sabotage it by removing crucial capabilities, even in the name of child protection.
The same New York Times investigation found that law enforcement agencies devoted to fighting online child exploitation “were left understaffed and underfunded, even as they were asked to handle far larger caseloads.” The National Center for Missing and Exploited Children (NCMEC), established by Congress in 1984 to reduce child sexual exploitation and prevent child victimization, “was ill equipped for the expanding demands.” It’s worth asking, then, why EARN IT does not instead empower these agencies with additional resources to solve crimes.
We must consider the possibility that this bill fails to achieve its stated goal. That it will not protect children online, and will instead harm their digital lives and their ability to speak freely. Everyone deserves good security, and it’s on us to find ways to prevent harm without compromising our digital rights. To force companies to weaken our protections to give law enforcement greater insight would be the equivalent of forcing people to live without locks and curtains in their homes. Are we willing to go that far?
That’s not to say there is no solution. But it can’t be this one.