
Newest Word to Take on Orwellian Overtones in Internet Age: “Trust”

Jay Stanley,
Senior Policy Analyst,
ACLU Speech, Privacy, and Technology Project
April 18, 2013

What could be warmer and fuzzier than “trust”? Between two human beings, it’s a hard-won bond that binds them together. In society, it is a currency that helps create a prosperous and efficient economy and culture, as various thinkers have argued. But recently the word has taken on a new cast of ambiguity, and seems to be fast becoming the newest entry in the lexicon of Orwellian formulations, along with once purely warm and positive words such as “security,” “defense,” and “intelligence.”

For example, some usages of “trust” include:

  • Trusted Identities in Cyberspace. In the context of cybersecurity, where one of the biggest problems is “attribution”—the ability to figure out who is carrying out a cyberattack—“trusted” often means the ability to deny you anonymity online. The National Strategy for Trusted Identities in Cyberspace is a current effort to create a new infrastructure for identification online. As I explained here, such an effort could be a good thing if it makes use of existing cryptographic techniques that allow one to be trusted while also remaining anonymous or pseudonymous (a sketch of one such technique follows this list). But it’s unclear whether those techniques will end up being used.
  • Trusted Computing. Trusted computing involves using special microchips with hard-wired encryption capabilities that can’t be changed by the computer’s owner or operator. In theory this permits transactions to take place on a person’s computer—such as the downloading of a movie—that a remote party can trust will only be performed in an authorized manner. But in order to protect such transactions, the owner of a computer would have to be blocked from carrying out certain manipulations of data on their own machine—meaning that big companies or other parties could wrest control over the computer from that owner.
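
To make the first point concrete, here is a minimal sketch of one family of such cryptographic techniques: a Chaum-style RSA blind signature, in which an issuer certifies a credential without ever seeing it, so the holder can later prove they are “trusted” without that proof being linked back to their identity. The tiny key, the integer credential, and the function names are all illustrative toys, not the mechanism any particular program has adopted.

```python
# Toy sketch (not production crypto): a Chaum-style RSA blind signature.
# The issuer signs a blinded value, so it vouches for the credential
# without ever learning what it signed or who presents it later.
import secrets
from math import gcd

# Tiny textbook RSA key for illustration only (real keys are 2048+ bits).
p, q = 61, 53
n = p * q                            # public modulus
e = 17                               # public exponent
d = pow(e, -1, (p - 1) * (q - 1))    # private exponent

def blind(message: int):
    """User blinds the message before sending it to the issuer."""
    while True:
        r = secrets.randbelow(n - 2) + 2
        if gcd(r, n) == 1:
            break
    return (message * pow(r, e, n)) % n, r

def issuer_sign(blinded: int) -> int:
    """Issuer signs without ever seeing the underlying message."""
    return pow(blinded, d, n)

def unblind(blind_sig: int, r: int) -> int:
    """User strips the blinding factor, leaving a valid signature."""
    return (blind_sig * pow(r, -1, n)) % n

def verify(message: int, signature: int) -> bool:
    """Anyone can check the signature against the issuer's public key."""
    return pow(signature, e, n) == message % n

credential = 42                       # stand-in for a hashed credential
blinded, r = blind(credential)
signature = unblind(issuer_sign(blinded), r)
assert verify(credential, signature)  # trusted, yet the issuer never saw it
```

Real deployments add proper hashing, padding, and far larger keys, but the flow is the point: the issuer vouches for the credential without ever learning who is presenting it.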

The basic dynamic here is that trust is being transferred from people to machines. In order to trust a machine, one must block anyone’s ability to change, disguise, or spoof its identity, to program a computer to appear to be something that it is not, or to deploy a whole host of other tricks and stratagems that can be used for ill by hackers.

That can be good in some circumstances—but all of those “hacker” stratagems are also a source of freedom. They are what allow a person to communicate anonymously online, or to look at a web page without being tracked and recorded. To block such possibilities with mathematical certainty, governments or companies must fundamentally alter the nature of a computer, turning it from a “Turing Machine” whose operator can use it to run an infinite number of arbitrary programs, into something less.

In short, building a machine that can be “trusted” is a pretty close equivalent to building a machine that is protected against control by humans who are distinctly not being trusted. In the human world, “trust” means that you are willing to allow another person more control, because you trust them. But in its new formulation, it means precisely the opposite: that control is being taken away from you.
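The movie-download example above reduces to a few lines of logic. The sketch below uses invented names throughout (DEVICE_KEY, EXPECTED_MEASUREMENT, serve_movie are illustrative, not a real TPM interface) to show the basic remote-attestation pattern: the remote party checks a signed measurement of what the machine is running against a value the vendor chose, so the owner’s own modifications lock the owner out.

```python
# A minimal sketch of the remote-attestation pattern behind "trusted
# computing": the content server only talks to machines whose measured
# software state matches a value the vendor approved, not the owner.
import hashlib
import hmac

DEVICE_KEY = b"burned-into-the-chip"   # the owner cannot read or change this
EXPECTED_MEASUREMENT = hashlib.sha256(b"vendor-approved-player-v1").hexdigest()

def attest(running_software: bytes):
    """The chip reports (and keys) a hash of whatever is actually running."""
    measurement = hashlib.sha256(running_software).hexdigest()
    quote = hmac.new(DEVICE_KEY, measurement.encode(), hashlib.sha256).digest()
    return measurement, quote

def serve_movie(measurement: str, quote: bytes) -> str:
    """The remote party trusts the chip's report, not the owner's word."""
    genuine = hmac.compare_digest(
        hmac.new(DEVICE_KEY, measurement.encode(), hashlib.sha256).digest(),
        quote)
    if genuine and measurement == EXPECTED_MEASUREMENT:
        return "movie stream"
    return "access denied"   # e.g., the owner modified their own player

# The owner's modified software fails the check on the owner's own machine:
print(serve_movie(*attest(b"owner-modified-player")))      # access denied
print(serve_movie(*attest(b"vendor-approved-player-v1")))  # movie stream
```

Nothing in that check depends on the owner’s wishes; the only key that matters is one the owner cannot touch, which is exactly the sense in which this kind of “trust” takes control away rather than extending it.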

There are other, broader uses of the word “trust,” outside the computing context, that carry similarly Orwellian overtones. For example:

  • Trusted traveler programs. In these programs, the government carries out background checks of one sort or another in order to label travelers as “trusted” or “untrusted,” with those who win the former designation getting preferential treatment in a screening process. That’s great for those lucky people who are “trusted,” but for those stamped as untrusted—often unable to determine why, or to change their designation—the system raises many disturbing questions. Examples include CBP’s Global Entry program and the TSA’s Pre-Check.
  • Trusted Situational Awareness. The latest example of a spooky usage of the word is in Dayton, Ohio, where, as I wrote about recently, an aerial surveillance scheme has been labeled “Trusted Situational Awareness.” (Since I wrote that post, the program has apparently been terminated.)

Like the computer examples, these usages contain the word “trust” but imply its opposite: a broad increase in government power over ordinary people, rather than a granting of more power to those people.

Now that I think of it, “trust” already became a dirty word for a previous generation of Americans—the Populists and Progressives who learned that the newly gigantic corporate “trusts” were a new form of power over individuals that needed to be curbed, leading to the emergence of a whole new area of law called “antitrust” as well as “trust busters” like Theodore Roosevelt. Perhaps today’s crusaders for digital freedom can be thought of as the newest “trust busters.”

In the current and future battles over security, identity, digital freedom, and privacy, the newly ambiguous usage of “trust” reflects the longstanding struggle to find the right balance between order and liberty as we search for the ideal of a society based on “ordered liberty.”
