Laws of Robotics
Here's a sneak preview of the talk I'll be giving tomorrow.
Comments
None of those laws seem to be applied to scientists and engineers who make things now. The laws appear to be things that tend to follow rather than precede the engineering.
So how do you propose to get there from here?
For example:
Responsibility and power are left to the politicians and the military.
We haven't even got much in the way of liability for software we create today. License agreements always remove that inconvenient aspect.
You will also run up against the "it's a tool, not a sapient thing" argument.
Rights and Empathy are going to be a hard sell until robots are proven sapient, and probably even then.
Posted by: John B Stone | March 22, 2009 5:43 PM
Up until now, humans were a substrate that allowed memes to manifest themselves in reality. Without complex tools (levers), these memes tended to be religious by and large, and only self-organized to perpetuate as religious movements. They worked as states, but states tend to be too big and clumsy to perpetuate indefinitely.
But with leverage exerted by value (money) the memes could perpetuate as corporations and economies. These entities are fiercely territorial and will kill competing ideologies.
In the past humans were the substrate for this. Humans revolted, humans believed, humans granted faith to all these entities. In turn humans unionized, established human rights, democratized society. Humans used to be the lynchpin in this arrangement.
There is a huge problem now. Humans will no longer be the carrier for valuing things. In the very near future there will be devices that attribute value and compete with humans in doing so.
The big change will be an economic shift - those corporate entities that use robotics more, use A.I. more will outcompete corporations that do not. This is clearly self-amplifying, especially if corporations start to become managed by A.I.
What if humans are no longer valuable? Right now humans are the critical component in the economy, as workers and consumers. In a couple of years there will be absolutely no reason to have 'a few billion humans' around. They won't be able to do meaningful work, and they have hideously inefficient consumption patterns.
Worse, when this transition occurs (if it hasn't already), humanity will be hard-pressed to protest. Revolution will become tricky when mass-produced, fearless SWORD robots patrol the streets.
The above rules ALL assume we (human majorities) have some kind of managerial freedom as human beings. That assumption is already debatable for anyone who is not a male Anglo-Saxon American, clearly.
At some point the center of gravity of decision-making, empowerment, economic power, consumption, and fashion will shift from humans to the cloud of resources surrounding corporations, and then it will concentrate in the machines inside them.
I know that criticism of corporations and the current market mechanisms is ideologically incorrect. It is heresy. And we cannot revolt, not without collapsing a house of cards that has already more or less collapsed. We cannot abandon modern information-based society. The people of the world are already being held ransom, effectively, intentionally or not.
Supply and demand. I have no clue how it will happen, but I am concerned that, as the slideshow above suggests, "machines get out of control" - I am not even much worried that machines take over.
What I am worried about is that humans lose their inherent value, in tangible terms (no jobs, no economic power) or in more diffuse terms.
Instead of a Skynet scenario, we may find humans becoming absolutely worthless, something that clings on for dear life at the fringes of real economic traffic. This would potentially be a Dante's Inferno for all who live to see it happen.
Posted by: Khannea | March 23, 2009 8:53 AM