This afternoon at the Accelerating Change Conference 2004, David Brin, author of The Transparent Society, and Brad Templeton, chairman of the board at the Electronic Frontier Foundation, debated the relative virtues of transparency and privacy.
(I should admit a personal interest in this right up front: way back in 1996, I "debated" Brin in the pages of Time magazine on this very subject; the text of the article can be found online. My own views have evolved a bit since then; the "participatory panopticon" essays I've written here at WorldChanging reflect the ambivalent nature of transparency-enabling technologies, and my appreciation of both their benefits and drawbacks.)
In the extended entry, you'll find the notes I took during the Brin-Templeton debate. Each wanted to play the role of the "realist" in the discussion. The core of Brin's argument is that surveillance technologies are here, and we should fight to make sure they are two-way, not just in the hands of elites; the core of Templeton's argument is that widespread transparency technologies will inevitably be corrupted and co-opted by those in power, and that we're better off fighting to hold the line where it stands now. It's the sign of a good discussion that both made very strong cases for their views.
David Brin: wants a civilization that is open and accountable - wants more suspicion of authority, tolerance of diversity, and appreciation of eccentricity.
The key is a knowledgeable, knowing people. A good transparent society is one where most of the people know most of what's going on most of the time.
Markets, democracy, courts, and science = four great accountability media; they change human competition into emergent cooperative results. But such cooperative properties only emerge if we know what's going on.
We will have to redefine privacy. We're better off defending that remaining privacy if we're using collective transparency. Regulatory privacy (e.g., EPIC) doesn't work because it requires control of thoughts. Encryption-based privacy (e.g., EFF) doesn't work because nobody's shown how it could work.
Argues for accountability based on plausibility of harm.
Brin's corollary to Moore's Law: "Cameras are getting smaller, cheaper, more accurate, and more mobile every two years."
Brad Templeton: Quotes the Earl of Spencer: "Privacy is what they take away when they want to torture you."
A watched populace never boils. When we feel we're under surveillance, we self-censor. Anonymous communication is the foundation of free societies, including this one.
People tend not to care about loss of privacy until their own privacy is invaded. We must protect others' privacy to protect our own.
Problems with transparency: transparency will be suborned; elites are too strong; national security and global competitiveness give strong motivations (as well as false motivations); "must see all to assure your privacy."
If you put in place the tools that would make a police state possible, they make the police state inevitable.
Repressive states are real; near-panopticons are real; transparent utopias are hypothetical.
Transparency can have unintended results, e.g., business quarterly reports leading to business focus on short-term results.
Surveillance doesn't work completely:
- not even in prison camps
- the oppressed always win, at least in the small
- it's always abused
- what we build with the best of intentions will be used in less-than-enlightened areas
DB: Argues that social abuses of transparency could, in fact, turn out for the good. Even if the government is unable to successfully suppress the populace, majorities can still oppress minorities (via transparency technologies). But since we've seen (for example, with gay populations in New York and San Francisco) that more information eventually leads to acceptance, widespread observation will lead to broader tolerance. [Jamais: this was his weakest argument.]
Transparency definitely useful against repressive governments. Cites Abu Ghraib -- the more you have people who will take the pictures and distribute them, the harder it is to keep oppression secret.
Any time power gathers into elites, they'll try to use it maliciously. Transparency is a good way of undercutting elite power.
BT: Problem isn't the accountability of the elites, but the changes in society that alter balance -- ideas to have all transactions appear in public, for example, and public reputation records.
DB: Surveillance and "village" model is inevitable, so wants to make the village more beneficial.
BT: Doesn't argue for hardcore encryption & secrecy model, but for protection of what we have now. (DB: wow! That's the best argument against me I've ever found!)
DB: How do you know that moderate privacy is real, and not (for example) an NSA front? It's epistemologically simpler to see what you know than to find out what others don't know.
Need to protect each other's privacy? No! Need to protect each other's freedom, from which protection of privacy would emerge.
Comments (3)
jamais:
I didn't realize you were here at AC 2004. I have long been a reader of WorldChanging; let's meet up tomorrow. I'm the guy with the Berkeley sweatshirt and the bleached hair :)
Posted by Gregor J. Rothfuss | November 6, 2004 10:10 PM
Excellent report, Jamais. Sounds like the presentations were nuanced as befits the complex subject.
Posted by Jon Lebkowsky | November 7, 2004 6:09 AM
The discussions are more than nuanced, they're outright theoretical.
Is total transparency even possible? Wouldn't there be a tremendous motivation to develop an exception to the rule so as to reap the benefits but suffer none of the costs?
Then, who decides which information gets emphasized, de-emphasized, or transformed in transit to a different "standard" of representation? Just because everyone has to make all information available doesn't mean that the person who wants it can find it easily, or can even look at it within a reasonable time period (e.g., a filibuster requiring you to find "viewer X.Y", which requires you to find "font library A.B.C", which requires you to find another finder, et cetera). A small enough needle in a haystack practically does not exist, and even a sufficiently long string attached to it still makes retrieving it impractical.
Then the whole problem boils down to statistical analysis (another name for which is "guessing" :), and so you come back to square one.
Posted by shox | November 10, 2004 10:11 PM