The serendipity machine is low on oil

It’s Friday, which means it’s time for the unveiling of the Official Rough Type Sentence of the Week. This one comes from Steven Johnson, and it appears at the end of a vertiginous post about mental hyperlinking:

“People who think the Web is killing off serendipity are not using it correctly.”

Now, first of all, I hadn’t even realized that there was a correct way to use the Web. I wish someone had explained this to me years ago, because I’m sure it would have saved me all sorts of time.

But what really drew me to Johnson’s line was the way that it immediately conjured up in my mind this vision of a scene that looked like something out of a Terry Gilliam movie. There’s this big, windowless room, and sprawling across it is a vast, elaborate steampunk contraption. It’s got all sorts of pipes and pulleys and gears and bellows, and it’s belching smoke and making loud metallic noises, and there’s a sign hanging from it that reads: Serendipity Machine. A guy is running madly around it yanking levers and pulling out stops and pushing buttons and fiddling with dials. Behind him, observing, is an old man in a white lab coat, a scientist, obviously. He’s stooped over, a grim expression on his face. The guy operating the machine suddenly stops, turns, and, exhausted, exclaims, “I can’t get any serendipity out of this damn thing!” To which the old scientist, wagging a crooked finger, responds, in a deep Austrian accent, “You are not using it correctly!”

From hunter-gatherer to cutter-paster

Edge is running a fascinating interview with the evolutionary biologist Mark Pagel, who puts the development of human culture into a cosmic perspective. He draws a parallel between the replication of successful innovations in a society and the replication of successful genes in an environment: “Natural selection is a way of sorting among a range of genetic alternatives, and finding the best one. Social learning is a way of sifting among a range of alternative options or ideas, and choosing the best one of those.”

Pagel argues that our evolution as “social learners” has likely had the effect, as it’s played out through hundreds of millennia, of encouraging the development of copying skills, perhaps over the development of originality. “We like to think we’re a highly inventive, innovative species,” he explains. “But social learning means that most of us can make use of what other people do, and not have to invest the time and energy in innovation ourselves … And so, we may have had strong selection in our past to be followers, to be copiers, rather than innovators.”

What that also means is that as the scope of our potential copying broadens, through advances in communication and networking, we have ever less incentive to be creative. We become ever more adept at cutting and pasting. The internet and social networking, observes Pagel, may mark the culmination of this long evolutionary trend:

As our societies get bigger, and rely more and more on the Internet, fewer and fewer of us have to be very good at these creative and imaginative processes. And so, humanity might be moving towards becoming more docile, more oriented towards following, copying others, prone to fads, prone to going down blind alleys, because part of our evolutionary history that we could have never anticipated was leading us towards making use of the small number of other innovations that people come up with, rather than having to produce them ourselves.

The interesting thing with Facebook is that, with 500 to 800 million of us connected around the world, it sort of devalues information and devalues knowledge. And this isn’t the comment of some reactionary who doesn’t like Facebook, but it’s rather the comment of someone who realizes that knowledge and new ideas are extraordinarily hard to come by. And as we’re more and more connected to each other, there’s more and more to copy. We realize the value in copying, and so that’s what we do.

And we seek out that information in cheaper and cheaper ways. We go up on Google, we go up on Facebook, see who’s doing what to whom. We go up on Google and find out the answers to things. And what that’s telling us is that knowledge and new ideas are cheap. And it’s playing into a set of predispositions that we have been selected to have anyway, to be copiers and to be followers. But at no time in history has it been easier to do that than now. …

What’s happening is that we might, in fact, be at a time in our history where we’re being domesticated by these great big societal things, such as Facebook and the Internet. We’re being domesticated by them, because fewer and fewer and fewer of us have to be innovators to get by. And so, in the cold calculus of evolution by natural selection, at no greater time in history than ever before, copiers are probably doing better than innovators. Because innovation is extraordinarily hard. My worry is that we could be moving in that direction, towards becoming more and more sort of docile copiers.

This gives a whole new twist to Mark Zuckerberg’s promotion of “frictionless sharing.”

UPDATE: David Brin is dubious.

May I toot my own horn?

Two nice notices of The Shallows appeared out of the online blue today, and doggone it if I’m not going to share them. At Paste, Kurt Armstrong reviewed the book, calling it “essential”:

It lays out a sweeping portrait of the thing we’re moving too quickly to see. It’s easy for someone like me to piece together opinions or carve rhetorically charged rants about the deleterious effects of our growing technological dependency. In contrast, Carr’s book bursts with research — from neuroscientists, cognitive psychologists and sociologists — and careful analysis. And anxious as Carr might be about what the Internet is doing to our brains, his writing isn’t shrill or self-righteous. It’s intelligent, deeply researched, articulate and, much to my dismay, most likely prophetic: “The great danger we face as we become more intimately involved with our computers … is that we’ll begin to lose our humanness, to sacrifice the very qualities that separate us from machines.”

And at The Millions, novelist Jonathan Safran Foer pegged The Shallows as “the best book I read last year”:

Carr persuasively — and with great subtlety and beauty — makes the case that it is not only the content of our thoughts that are radically altered by phones and computers, but the structure of our brains — our ability to have certain kinds of thoughts and experiences. And the kinds of thoughts and experiences at stake are those that have defined our humanity. Carr is not a proselytizer, and he is no techno-troglodyte. He is a profoundly sharp thinker and writer — equal parts journalist, psychologist, popular science writer, and philosopher. I have not only given this book to numerous friends, I actually changed my life in response to it.

Suddenly, I’m in the mood to go out and do some caroling.

About Facebook

Mike Loukides, 2011:

Let’s go back to music: It is meaningful if I tell you that I really like the avant-garde music by Olivier Messiaen. It’s also meaningful to confess that I sometimes relax by listening to Pink Floyd. But if this kind of communication is replaced by a constant pipeline of what’s queued up in Spotify, it all becomes meaningless. There’s no “sharing” at all. Frictionless sharing isn’t better sharing; it’s the absence of sharing. There’s something about the friction, the need to work, the one-on-one contact, that makes the sharing real, not just some cyber phenomenon. If you want to tell me what you listen to, I care. But if it’s just a feed in some social application that’s constantly updated without your volition, why do I care? It’s just another form of spam, particularly if I’m also receiving thousands of updates every day from hundreds of other friends.

Theodor Adorno, 1951:

If time is money, it seems moral to save time, above all one’s own, and such parsimony is excused by consideration for others. One is straightforward. Every sheath interposed between men in their transactions is felt as a disturbance to the functioning of the apparatus, in which they are not only objectively incorporated but with which they proudly identify themselves. That, instead of raising their hats, they greet each other with the hallos of indifference, that, instead of letters, they send each other inter-office communications without address or signature, are random symptoms of a sickness of contact. Estrangement shows itself precisely in the elimination of distance between people.

The programmers of the commercial web have always seen their goal as the elimination of distance and friction from transactions, and that objective has, not surprisingly, come to shape online social networks. But, when carried too far, the minimization of transaction costs in personal relations ends up having the effect of reducing those relationships to mere transactions. Intimacy without distance is not intimacy, and sharing without friction is not sharing. Qualities of tenderness become, in the end, forms of commerce. “The straight line,” Adorno went on to say, as if he were explaining, sixty years before the fact, Facebook’s social graph, “is now regarded as the shortest distance between two people, as if they were points.”

The cloud giveth and the cloud taketh away

I’ve been using Apple’s iDisk syncing service for – I can’t believe this – about ten years now. When I signed up, iDisk was part of the company’s iTools service, which was subsequently revamped and given the goofy name MobileMe. The transition to MobileMe was a little hair-raising, as there were moments when my iDisk seemed to flicker out of existence. Since I was using the service to sync critical documents between my computers, seeing the iDisk folder disappear caused, to say the least, a little panic. But Apple, after much bad press, worked out the kinks. iDisk since then has worked fine, and I’ve grown ever more dependent on it. Now, as Apple replaces MobileMe with iCloud, iDisk is about to get tossed onto the great junk pile of abandoned software. And I have to go through the nuisance of finding a replacement. Apple is also discontinuing its (fairly crappy) iWeb service, which I’ve been using to publish theshallowsbook.com. So there’s another pain in the ass I’m going to have to deal with.

The cloud is great in many ways, but it’s also fickle. Look at all the cloud services that Google has shut down: Google Health, Wave, Friend Connect, Buzz, Aardvark, Notebook, Sidewiki, Subscribed Links, Desktop, Jaiku, and so on (all the way back to that would-be eBay killer Google Base). None of them were particularly successful – you can certainly see why Larry Page decided to flush them down the famous Googleplex toilet – but given the scale of the net, even services and apps that don’t achieve a critical market mass may have a whole lot of users. Discontinued products and services are nothing new, of course, but what is new with the coming of the cloud is the discontinuation of services to which people have entrusted a lot of personal or otherwise important data – and in many cases devoted a lot of time to creating and organizing that data. As businesses ratchet up their use of cloud services, they’re going to struggle with similar problems, sometimes on a much greater scale.

I don’t see any way around this – it’s the price we pay for the convenience of centralized apps and databases – but it’s worth keeping in mind that in the cloud we’re all guinea pigs, and that means we’re all dispensable. Caveat cloudster.

People in glass futures should throw stones

Remember that Microsoft video on our glassy future? Or that one from Corning? Or that one from Toyota? What they all suggest, and assume, is that our rich natural “interface” with the world will steadily wither away as we become more reliant on software mediation. The infinite possibilities of our sense of touch are reduced to a set of scripted gestures.

Former Apple engineer Bret Victor makes a passionate, and nicely illustrated, case that we need to challenge the reigning visions of future computer interfaces, which he sums up as “Pictures Under Glass”:

Pictures Under Glass sacrifice all the tactile richness of working with our hands, offering instead a hokey visual facade. Is that so bad, to dump the tactile for the visual? Try this: close your eyes and tie your shoelaces. No problem at all, right? Now, how well do you think you could tie your shoes if your arm was asleep? Or even if your fingers were numb? When working with our hands, touch does the driving, and vision helps out from the back seat. Pictures Under Glass is an interaction paradigm of permanent numbness. It’s a Novocaine drip to the wrist. It denies our hands what they do best. And yet, it’s the star player in every Vision Of The Future.

As Anne Mangen has argued, in her work on the tactile aspects of reading and writing, we tend to ignore the role that our sense of touch plays in our intellectual and emotional lives, probably because we are unconscious of its effects. Unfortunately, that makes it easy for us to sacrifice the richness of our tactile sense when we use, or design, computers. We settle for pictures under glass, for numbness.

Chitchat

From an interview with William Gibson in The Paris Review:

For someone who so often writes about the future of technology, you seem to have a real romance for artifacts of earlier eras.

It’s harder to imagine the past that went away than it is to imagine the future. What we were prior to our latest batch of technology is, in a way, unknowable. It would be harder to accurately imagine what New York City was like the day before the advent of broadcast television than to imagine what it will be like after life-size broadcast holography comes online. But actually the New York without the television is more mysterious, because we’ve already been there and nobody paid any attention. That world is gone.

My great-grandfather was born into a world where there was no recorded music. It’s very, very difficult to conceive of a world in which there is no possibility of audio recording at all. Some people were extremely upset by the first Edison recordings. It nauseated them, terrified them. It sounded like the devil, they said, this evil unnatural technology that offered the potential of hearing the dead speak. We don’t think about that when we’re driving somewhere and turn on the radio. We take it for granted.

So true. When we think about technological change, we always think forward. But what was it like to have no electricity, no automobiles, no indoor plumbing, no air conditioning, no telephones, no recorded music, no movies, no TV, no radio? It was just a few generations ago – no time at all, really – and yet it’s gone, “unknowable,” as Gibson says. Weird.

From an interview with George Dyson in The European:

We used to have only human intelligence, and now that has been supplemented by computational intelligence. So we would expect the potential for innovation to become supplemented as well.

Yes and no. The danger is not that machines are advancing. The danger is that we are losing our intelligence if we rely on computers instead of our own minds. On a fundamental level, we have to ask ourselves: Do we need human intelligence? And what happens if we fail to exercise it?

The question becomes: What progress is good progress?

Right. How do we maintain our diversity? It would be a great shame to lose something like human intelligence that was developed at such costs over such a long period of time. I spent a lot of my life living in the wilderness and building kayaks. I believe that we need to protect our self-reliant individual intelligence—what you would need to survive in a hostile environment. Few of us are still living self-reliant lives. That is not necessarily a bad thing, but we should be cautious not to surrender into dependency on other forms of intelligence.

But although we like to think we’re in control, that we’re the ones driving the car, we’re rarely cautious about such things. As Gibson observes, looking backward, “Nobody paid any attention.” Is it any different now? If what we take for granted is invisible to us, how can we possibly know what it will mean to lose it – or, having lost it, what it was worth?