Hofstadter on AI
June 14, 2008
Speaking of the Singularity - and how can you avoid it, really, these days? - Douglas Hofstadter, author of the classic Gödel, Escher, Bach as well as, more recently, I Am a Strange Loop, spots the misanthropy that lies beneath the sunny surfaces of the AI millennialists and many other techno-utopians:
Am I disappointed by the amount of progress in cognitive science and AI in the past 30 years or so? Not at all. To the contrary, I would have been extremely upset if we had come anywhere close to reaching human intelligence — it would have made me fear that our minds and souls were not deep. Reaching the goal of AI in just a few decades would have made me dramatically lose respect for humanity, and I certainly don't want (and never wanted) that to happen ...
Do I still believe it will happen someday? I can't say for sure, but I suppose it will eventually, yes. I wouldn't want to be around then, though. Such a world would be too alien for me. I prefer living in a world where computers are still very very stupid. And I get a huge kick out of laughing at the hilariously unpredictable inflexibility of the computer models of mental processes that my doctoral students and I co-design. It helps remind me of the immense subtlety and elusiveness of the human mind.
Indeed, I am very glad that we still have a very very long ways to go in our quest for AI. I think of this seemingly “pessimistic” view of mine as being in fact a profound kind of optimism, whereas the seemingly “optimistic” visions of Ray Kurzweil and others strike me as actually being a deeply pessimistic view of the nature of the human mind.
Thanks for that note. I have often described techno-utopians, singularitarians (and transhumanists in particular) as megalomaniacs. They'd wipe out the whole human race in order to achieve superhuman status. I suspect this is the source of their misanthropy: they feel that if they wiped out all other humans, they'd be superhuman by default.
Posted by: Charles at June 14, 2008 05:32 PM
Remember, I just told you that from the other direction - "One basic way to tell the difference is essentially when science types can extend "themselves" through technology, they think "This is cool! Wonderful! Great! More!", while humanities types angst about "How has the basic nature of our essential souls been corrupted?".
[Tedious - that's an overview sentence, which does not cover every nuance, OK?]
So you're right, ZOMG!, there are tech types who do not think some idealized version of What It Means To Be Human is the highest state of grace. You think this is a bad thing - anti-human, if I may paraphrase. The opposite view is that it is a very good thing _per se_, a desire to transcend the limits of one's existence (and of course, as with all such transcendent impulses, there's a cadre of con-men and snake-oil sellers around it).
Posted by: Seth Finkelstein at June 14, 2008 06:15 PM
However impressive a computer's intelligence, finesse, or poetry might become, you'll only have humans to tremble while reading it, while the screens flash it 'a high score' in poetic relevance, in the silence of the CPU fans.
Posted by: Bertil at June 15, 2008 09:12 AM
"Riveting" -San Francisco Chronicle
"Rewarding" -Financial Times
"Ominously prescient" -Kirkus Reviews
"Riveting stuff" -New York Post