Pearl Harbor to 9-11 to the Panopticon

This post has been percolating for two years. It's a request to folks working on networking, social networks, and other technologies to consider the implications of their actions. My thesis is that advertising-driven firms that sell their audience as the product, combined with efforts to prevent another 9-11, have created a more pervasive surveillance state than we could have imagined a decade ago, but without any increase in security. Like Number 6 after his escape from the Village in "The Prisoner," we have all relocated to the panopticon. Be mindful of what you are working on so that you don't contribute unintentionally.

Understand the Implications and Impact of Your Work

I took a course on Decision Analysis from Ron Howard at Stanford in 1980, when I was getting my Masters in Engineering-Economic Systems, and I still remember very clearly one of the lectures he gave on the importance of understanding the implications of what you are working on and knowing the uses your work will be put to. He described a pair of brass valves he had seen in a Time-Life book. They were beautifully engineered, designed to redirect the exhaust from a large diesel truck into the cargo area. The Nazis, inspired by earlier Russian models, used them to turn trucks into mobile gas chambers. His point was to ask whether the German engineer who designed the valves had really thought through the implications of his work.

Who Will Watch the Watchmen?

“That everyone is being spied on, or that there is, de facto, a violation of our rights through the mass maintenance of metadata, is in my opinion, not a serious debate. It is a pretend scandal. The real scandal — that of misuse of the data — has yet to occur. But it will if we don’t liberalize the secrecy of the FISA court to some meaningful point.”
David Simon, "We Are Shocked, Shocked" (2013)

My argument is not against the need for surveillance, or other weapons for that matter, but for thoughtful management of it to maintain a free society. David Simon wrote a great essay, "We Are Shocked, Shocked," in 2013 that offered a clear-eyed view of current police surveillance techniques and the need to be realistic about how to balance privacy with security while avoiding corrupt practices on the part of government workers. He elaborates further in a back-and-forth in the comment section:

“Is nuance so elusive? Apparently.

There is no scandal in the data acquisition. What the Guardian has revealed and claimed as indiscriminate is legit, despite the scope. What should be debated is the lack of oversight of the FISA court and its activity. And that is not the focus of the debate.

I don’t care that the FISA-court has allowed the feds to establish a data base of all American international phone activity. I would expect such given the geopolitical realities of global terror. I worry that in the event of the misuse of that data, we will not learn of the wrongdoing because the civilian oversight of the FISA court is too minimal. The cure for that isn’t to pretend that the relevant data doesn’t exist. The cure is to ensure accountability and greater transparency for the process, while still ensuring that national security secrets offered to the court remain secrets. That is the real issue.

That everyone is being spied on, or that there is, de facto, a violation of our rights through the mass maintenance of metadata, is in my opinion, not a serious debate. It is a pretend scandal. The real scandal — that of misuse of the data — has yet to occur. But it will if we don’t liberalize the secrecy of the FISA court to some meaningful point.

If that is hard to grasp, I apologize. But not a word of it is contradictory.”
David Simon in a comment on June 7, 2013 at 7:37 pm on "We Are Shocked, Shocked"

The challenge is how to create a system of checks and balances and mutual accountability as many aspects of our lives migrate on-line, either through our direct interactions (e.g. making a post on Facebook, running a search on Google) or through the inadvertent actions of devices that we carry (a cellphone reporting our location, a NEST thermostat reporting whether someone has adjusted it recently). The Internet of Things means that cyberspace will fully evert into the many items that we carry; our online footprints and fingerprints will put us into the equivalent of Jeremy Bentham's Panopticon, his vision of a modern prison where everyone's actions can be observed at all times.

Keeping the NSA in Perspective

George Friedman wrote an essay in 2013, "Keeping the NSA in Perspective," that provided a historical view of government surveillance and outlined some key differences between what happened after Pearl Harbor (we defeated the Japanese in four years) and the War on Terror (we continue to refine and extend a surveillance-state apparatus that was only meant to be temporary).

What drove all of this was Pearl Harbor. The United States knew that the Japanese were going to attack. They did not know where or when. The result was disaster. All American strategic thinking during the Cold War was built around Pearl Harbor — the deep fear that the Soviets would launch a first strike that the United States did not know about. The fear of an unforeseen nuclear attack gave the NSA leave to be as aggressive as possible in penetrating not only Soviet codes but also the codes of other nations. You don’t know what you don’t know, and given the stakes, the United States became obsessed with knowing everything it possibly could.

In order to collect data about nuclear attacks, you must also collect vast amounts of data that have nothing to do with nuclear attacks. The Cold War with the Soviet Union had to do with more than just nuclear exchanges, and the information on what the Soviets were doing — what governments they had penetrated, who was working for them — was a global issue. But you couldn’t judge what was important and what was unimportant until after you read it. Thus the mechanics of assuaging fears about a “nuclear Pearl Harbor” rapidly devolved into a global collection system, whereby vast amounts of information were collected regardless of their pertinence to the Cold War.

The Pearl Harbor dread declined with the end of the Cold War — until Sept. 11, 2001. In order to understand 9/11’s impact, a clear memory of our own fears must be recalled. As individuals, Americans were stunned by 9/11 not only because of its size and daring but also because it was unexpected. Terrorist attacks were not uncommon, but this one raised another question: What comes next?  [..]

There are two major differences between the war on terror and the aforementioned wars. First, there was a declaration of war in World War II. Second, there is a provision in the Constitution that allows the president to suspend habeas corpus in the event of a rebellion. The declaration of war imbues the president with certain powers as commander in chief — as does rebellion. Neither of these conditions was put in place to justify NSA programs such as PRISM.

Moreover, partly because of the constitutional basis of the actions and partly because of the nature of the conflicts, World War II and the Civil War had a clear end, a point at which civil rights had to be restored or a process had to be created for their restoration. No such terminal point exists for the war on terror.  [..]

The threat posed by PRISM and other programs is not what has been done with them but rather what could happen if they are permitted to survive. But this is not simply about the United States ending this program. The United States certainly is not the only country with such a program. But a reasonable start is for the country that claims to be most dedicated to its Constitution to adhere to it meticulously above and beyond the narrowest interpretation. This is not a path without danger. As Benjamin Franklin said, “They that can give up essential liberty to obtain a little temporary safety deserve neither liberty nor safety.”

George Friedman in “Keeping the NSA in Perspective” (2013) [hyperlinks preserved from original]

The issue is not the need for surveillance of foreign nations but how to manage it without corroding our freedoms. The pervasive monitoring of on-line activities by domestic corporations seems to require a different set of control structures and may require firms like Google and Facebook to become highly regulated utilities to the extent that they exert monopoly control over what are, in effect, new public spaces. Micah Sifry pointed out one set of risks in Mother Jones in October 2014 in "Facebook Wants You to Vote on Tuesday. Here's How It Messed With Your Feed in 2012" and elaborated in "Why Facebook's Voter Megaphone is the Real Manipulation to Worry About."

How exactly might independent reporters verify that Facebook indeed did what it said it did on a couple hundred million of its users’ pages? One of the least-noticed implications of our new age of data-intensive politics is that one side has nearly all the marbles. Until the media and other observers develop the tools to independently monitor the uses of Big Data by third-party platforms (as well as campaigns), the integrity of the process will rest entirely on the honesty of the data scientists and engineers inside these organizations, for only they will know if they are playing fairly. And we already know that campaigns will do whatever they think they can get away with to win. Will the big publicly traded corporations that are today’s new media platforms behave more ethically?

Researchers can be as coin-operated as any salesperson, and the excuse that full details cannot be disclosed until the research has been published seems like a next-generation variation on attorney-client privilege: we cannot disclose that because it's covered by an agreement with our research partners to embargo disclosure until they have had a chance to publish.

See also these 9-11 related posts

Postscript: What Happens Next Will Amaze You

Maciej Ceglowski, founder of Pinboard, gave a great talk on Sep-14-2015, "What Happens Next Will Amaze You," that is worth watching (or reading the transcript). Here are some key points:

  • What was the most damaging data breach in the last 12 months?  The trick answer is: it’s likely something we don’t even know about.
  • Our daily activities are mediated with software that can easily be configured to record and report everything it sees upstream. But to fix surveillance, we have to address the underlying reasons that it exists. These are no mystery either. State surveillance is driven by fear. And corporate surveillance is driven by money.
  • In his excellent book on surveillance, Bruce Schneier has pointed out we would never agree to carry tracking devices and report all our most intimate conversations if the government made us do it. But under such a scheme, we would enjoy more legal protections than we have now. By letting ourselves be tracked voluntarily, we forfeit all protection against how that information is used. Those who control the data gain enormous power over those who don’t. The power is not overt, but implicit in the algorithms they write, the queries they run, and the kind of world they feel entitled to build.
  • It’s easy to get really depressed at all this. It’s important that we not let ourselves lose heart. If you’re over a certain age, you’ll remember what it was like when every place in the world was full of cigarette smoke. Airplanes, cafes, trains, private offices, bars, even your doctor’s waiting room—it all smelled like an ashtray. Today we live in a world where you can go for weeks without smelling a cigarette if you don’t care to.
  • It took a long time to establish that environmental smoke exposure was harmful, and even longer to translate this into law and policy. We had to believe in our capacity to make these changes happen for a long time before we could enjoy the results. I use this analogy because the harmful aspects of surveillance have a long gestation period, just like the harmful effects of smoking, and reformers face the same kind of well-funded resistance. That doesn’t mean we can’t win. But it does mean we have to fight.

He proposes six fixes:

  1. Right To Download
  2. Right To Delete
  3. Limits on Behavioral Data Collection
  4. Right to Go Offline
  5. Ban on Third-Party Advertising
  6. Privacy Promises

It’s worth reading the entire transcript.
