Wednesday, February 16, 2005

Things That Make You Go Hmmm....

Down the rabbit hole we go. Read the following passage:
"First let us postulate that the computer scientists succeed in developing intelligent machines that can do all things better than human beings can do them. In that case presumably all work will be done by vast, highly organized systems of machines and no human effort will be necessary. Either of two cases might occur. The machines might be permitted to make all of their own decisions without human oversight, or else human control over the machines might be retained.

"If the machines are permitted to make all their own decisions, we can't make any conjectures as to the results, because it is impossible to guess how such machines might behave. We only point out that the fate of the human race would be at the mercy of the machines. It might be argued that the human race would never be foolish enough to hand over all the power to the machines. But we are suggesting neither that the human race would voluntarily turn power over to the machines nor that the machines would willfully seize power. What we do suggest is that the human race might easily permit itself to drift into a position of such dependence on the machines that it would have no practical choice but to accept all of the machines' decisions. As society and the problems that face it become more and more complex and machines become more and more intelligent, people will let machines make more of their decisions for them, simply because machine-made decisions will bring better results than man-made ones. Eventually a stage may be reached at which the decisions necessary to keep the system running will be so complex that human beings will be incapable of making them intelligently. At that stage the machines will be in effective control. People won't be able to just turn the machines off, because they will be so dependent on them that turning them off would amount to suicide.

"On the other hand it is possible that human control over the machines may be retained. In that case the average man may have control over certain private machines of his own, such as his car or his personal computer, but control over large systems of machines will be in the hands of a tiny elite - just as it is today, but with two differences. Due to improved techniques the elite will have greater control over the masses; and because human work will no longer be necessary the masses will be superfluous, a useless burden on the system. If the elite is ruthless they may simply decide to exterminate the mass of humanity. If they are humane they may use propaganda or other psychological or biological techniques to reduce the birth rate until the mass of humanity becomes extinct, leaving the world to the elite. Or, if the elite consists of soft-hearted liberals, they may decide to play the role of good shepherds to the rest of the human race. They will see to it that everyone's physical needs are satisfied, that all children are raised under psychologically hygienic conditions, that everyone has a wholesome hobby to keep him busy, and that anyone who may become dissatisfied undergoes "treatment" to cure his "problem." Of course, life will be so purposeless that people will have to be biologically or psychologically engineered either to remove their need for the power process or make them "sublimate" their drive for power into some harmless hobby. These engineered human beings may be happy in such a society, but they will most certainly not be free. They will have been reduced to the status of domestic animals."

Would you like to know who the author is? Carl Sagan, perhaps? Isaac Asimov? Click for the answer. And apparently, he's not alone.

Bill Joy, cofounder and Chief Scientist of Sun Microsystems, wrote an intriguing article back in April 2000 for Wired magazine, "Why the Future Doesn't Need Us", that cites the book by Ray Kurzweil that cites the passage above. Joy, a leading computer architect, is also deeply troubled by the ramifications of unlimited development of robots, genetic engineering, and nanotechnology, and he suggests some frightening scenarios that could result.

Personally, I think our drifting into complete machine dependence is virtually inevitable. We've been doing it for years, and I don't see us ever stopping. We just can't resist the benefits, or the competitive necessity. I also think increased and perhaps total integration of machines into our bodies is virtually inevitable (here's why), something I took for granted in my last novel, Asteroid Burn.

But I'm also not sure it's a bad thing. Sure, being wiped out in a matter of hours by a nanotech accident in New Mexico is a bit alarming. But the evolution of humans into cyborgs over the years? Not so troubling. Joy warns that "on this path our humanity may well be lost." Lost in its current form, yes. But lost completely? No. Not for a long, long time. It will change, evolve. If this is the evolution of humanity, it is what we chose for ourselves. The shift may happen faster than I think (as Kurzweil argues), but I still think we won't be able to say no seriously enough to avoid it. Bill Joy and Freeman Dyson may criticize the scientists for succumbing to the irresistible seduction of more power through knowledge, but consumers, corporate managers and government planners are just as guilty, if not more so. They create the market for the scientists to play in.

I do have to say one thing, though. Will you futurists, technologists and armchair theorists PLEASE STOP SAYING TECHNOLOGY WILL FREE US FROM WORK?!??! I'm so tired of hearing that. Technology MAKES work, people. Every invention that "frees" us, whether it's a fax machine or a cellphone, increases the productivity demands that are placed upon us and the overall complexity that we must manage on a daily basis. If that weren't enough, you also have to maintain each fancy new invention with things like a renewable power supply, file and resource management, preferences, updates and patches, repairs, and all the rest of it. This is why I think, as the writer up top suggests, we will eventually be reduced to the status of domestic animals. We simply don't claim for ourselves what we work so hard to get, namely time for ourselves and for our lives as humans. Truthfully, we'd all rather be robots.

We envy their efficiency and productivity. We wish we could see the world with as much certainty and authority as they do. We envy their ability to get the job done, no matter how mundane or dangerous, without the trouble of fears and anxieties and office politics and last night's dispute with the spouse getting in the way. We love the entertainment that we get from them and other forms of technology. In short, we admire them, and even if we knew how to stop their march across our species, I'm not convinced we would want to.

I have many more thoughts about this subject, and could write about it for a couple of hours.

But the battery in my Palm Pilot is running low, my email icon is flashing, and I want to create a new playlist for my iPod. My hard drive will be sluggish unless I defragment it, I need to replace the printer cartridge, the fax machine is beeping for a paper refill, my cellphone will die if I don't plug it in, my computer has reminded me to scan it for spyware, and Windows wants to install a new security patch. Oh, and at some point today, when I get a break, I need to eat lunch. (While I'm doing that, I'll relax from all this techno-slavery by downloading a new ring tone for my cellphone.) Oh, and it's a good thing I have a good-paying job, or I wouldn't be able to afford all this maintenance!

Sorry, folks, I would like to write more, but duty calls! (But isn't this an amazing time we live in?!)

What do you think? Leave a comment...