Colin Conner
Dr. Adam Johns
Seminar in Composition
29 August 2008
Stop Before It’s Too Late
What if in thirty years we are all living in a world of misery, oppression, disease, and overcrowding, controlled by machines? According to Bill Joy, co-founder of Sun Microsystems, this may become a reality. In his article “Why the Future Doesn’t Need Us,” Joy argues that if we keep advancing our genetic engineering, nanotechnology, and robotics (GNR) technologies, we could make humans an endangered species. To prevent this, he feels that as a species we would have to agree on what direction to take these GNR technologies and what we would like to get out of them (Joy 14). I agree with his statement and think it would be very beneficial to put all of the world’s knowledge to good use. The technology needs to be slowed down before humans come to rely completely on GNR technologies. In the future we need to make sure that GNR technologies cannot give humans immortality, that machines cannot exist without human control, and that machines built with GNR technologies work only for the good of our species.
If humans are never to rely completely on GNR technologies, and if those technologies are never to take over mankind, humans must never obtain immortality from them. All humans want to live forever, but few stop to think about what would happen if that dream were in fact a reality. Bill Joy states, “Neither should we pursue near immortality without considering the costs, without considering the commensurate increase in the risk of extinction” (16). He seems to be saying that by becoming immortal we would be living only with the help of GNR technologies, so in effect we would be under the practically total control of the machines. Being controlled by machines would mean the end of our species as we know it. Our world is also already grossly overpopulated, and immortality would only make that problem worse. Immortality does not equal a utopian world; it only makes us dependent.
If machines could do everything independently, we would rely on GNR technologies to run our lives so that we don’t have to. Giving robots outright control of themselves would take jobs, land, and resources away from humans. We need to keep the on and off switch in our hands rather than theirs. At the rate we are going we could see a changing of the guard soon: “This is the first moment in the history of our planet when any species, by its own voluntary actions, has become a danger to itself” (Joy 10). This is a mistake of our own making that we have to fix, and staying in control of what we create is a start.
Our worst nightmare would be for GNR technologies to fall into the wrong hands. Someone could use them to control all of mankind and force us to follow their every whim. We do not want an arms race like the one we had with nuclear, biological, and chemical technologies, or else we could “create a White Plague” (Joy 8). As Bill Joy says, “Science, they recognize, grants immense powers. In a flash, they create world-altering contrivances” (10). If we are going to continue advancing our GNR technologies, they have to be put to good use and benefit all of mankind. Since there are millions starving in third-world countries, GNR technologies should be used to create food that can grow in even the harshest climates. We should also use GNR technologies to make our world environmentally friendly so that we do not go extinct because of pollution and global warming. GNR technologies have a great upside and can really help mankind’s progress as a species.
In his article “Why the Future Doesn’t Need Us,” Bill Joy says that one of his biggest concerns is that if humans allow ourselves to rely completely on machines, we will become extinct. He quotes the Unabomber, who comments that if humans rely on machines, “People won’t be able to just turn the machines off, because they will be so dependent on them that turning them off would amount to suicide” (Joy 2). I have identified three goals for our future if we do not want the Unabomber’s prediction to become a reality. First, we must never achieve immortality through GNR technologies, or else we will become totally dependent on them. Second, humans, not the machines themselves, must be able to turn machines on and off at will. Finally, we must use GNR technologies only for the betterment of our race; otherwise we will bow down to whoever holds the most powerful GNR weapons. If we achieve these three goals, we will be able to coexist with machines, and we will be able to tell our grandchildren about the fears we used to have of becoming extinct.
1 comment:
Strong introduction, which competently summarizes Joy while presenting your own views. This may be the best *intro*, strictly conceived as an intro, so far.
After that, your strengths dissipate somewhat. I thought your discussion of immortality was fine, although it reads, in some ways, a little bit like a second introduction. However, your concerns are pretty articulate, and build nicely from Joy.
The next couple paragraphs are jarring. Really, you’d just started to develop your discussion of immortality (you might have focused on it entirely, or used it as your main example). Instead, the next two paragraphs seem almost like you’re just rephrasing some of Joy’s main points - the independent focus which you developed in the section on immortality practically disappears.
The conclusion is weak - it’s long and windy, and I don’t think you say anything in it which you didn’t say better earlier.
There was, I think, the beginning of a good paper on immortality here, but you ended up rehashing Joy instead of really claiming your own productive direction.