About Facebook

Mike Loukides, 2011:

Let’s go back to music: It is meaningful if I tell you that I really like the avant-garde music by Olivier Messiaen. It’s also meaningful to confess that I sometimes relax by listening to Pink Floyd. But if this kind of communication is replaced by a constant pipeline of what’s queued up in Spotify, it all becomes meaningless. There’s no “sharing” at all. Frictionless sharing isn’t better sharing; it’s the absence of sharing. There’s something about the friction, the need to work, the one-on-one contact, that makes the sharing real, not just some cyber phenomenon. If you want to tell me what you listen to, I care. But if it’s just a feed in some social application that’s constantly updated without your volition, why do I care? It’s just another form of spam, particularly if I’m also receiving thousands of updates every day from hundreds of other friends.

Theodor Adorno, 1951:

If time is money, it seems moral to save time, above all one’s own, and such parsimony is excused by consideration for others. One is straightforward. Every sheath interposed between men in their transactions is felt as a disturbance to the functioning of the apparatus, in which they are not only objectively incorporated but with which they proudly identify themselves. That, instead of raising their hats, they greet each other with the hallos of indifference, that, instead of letters, they send each other inter-office communications without address or signature, are random symptoms of a sickness of contact. Estrangement shows itself precisely in the elimination of distance between people.

The programmers of the commercial web have always seen their goal as the elimination of distance and friction from transactions, and that objective has, not surprisingly, come to shape online social networks. But, when carried too far, the minimization of transaction costs in personal relations ends up having the effect of reducing those relationships to mere transactions. Intimacy without distance is not intimacy, and sharing without friction is not sharing. Qualities of tenderness become, in the end, forms of commerce. “The straight line,” Adorno went on to say, as if he were explaining, sixty years before the fact, Facebook’s social graph, “is now regarded as the shortest distance between two people, as if they were points.”

The cloud giveth and the cloud taketh away

I’ve been using Apple’s iDisk syncing service for – I can’t believe this – about ten years now. When I signed up, iDisk was part of the company’s iTools service, which was subsequently revamped and given the goofy name MobileMe. The transition to MobileMe was a little hair-raising, as there were moments when my iDisk seemed to flicker out of existence. Since I was using the service to sync critical documents between my computers, seeing the iDisk folder disappear caused, to say the least, a little panic. But Apple, after much bad press, worked out the kinks. iDisk since then has worked fine, and I’ve grown ever more dependent on it. Now, as Apple replaces MobileMe with iCloud, iDisk is about to get tossed onto the great junk pile of abandoned software. And I have to go through the nuisance of finding a replacement. Apple is also discontinuing its (fairly crappy) iWeb service, which I’ve been using to publish theshallowsbook.com. So there’s another pain in the ass I’m going to have to deal with.

The cloud is great in many ways, but it’s also fickle. Look at all the cloud services that Google has shut down: Google Health, Wave, Friend Connect, Buzz, Aardvark, Notebook, Sidewiki, Subscribed Links, Desktop, Jaiku, and so on (all the way back to that would-be eBay killer Google Base). None of them were particularly successful – you can certainly see why Larry Page decided to flush them down the famous Googleplex toilet – but given the scale of the net, even services and apps that don’t achieve a critical market mass may have a whole lot of users. Discontinued products and services are nothing new, of course, but what is new with the coming of the cloud is the discontinuation of services to which people have entrusted a lot of personal or otherwise important data – and in many cases devoted a lot of time to creating and organizing that data. As businesses ratchet up their use of cloud services, they’re going to struggle with similar problems, sometimes on a much greater scale.

I don’t see any way around this – it’s the price we pay for the convenience of centralized apps and databases – but it’s worth keeping in mind that in the cloud we’re all guinea pigs, and that means we’re all dispensable. Caveat cloudster.

People in glass futures should throw stones

Remember that Microsoft video on our glassy future? Or that one from Corning? Or that one from Toyota? What they all suggest, and assume, is that our rich natural “interface” with the world will steadily wither away as we become more reliant on software mediation. The infinite possibilities of our sense of touch become reduced to a set of scripted gestures.

Former Apple engineer Bret Victor makes a passionate, and nicely illustrated, case that we need to challenge the reigning visions of future computer interfaces, which he sums up as “Pictures Under Glass”:

Pictures Under Glass sacrifice all the tactile richness of working with our hands, offering instead a hokey visual facade. Is that so bad, to dump the tactile for the visual? Try this: close your eyes and tie your shoelaces. No problem at all, right? Now, how well do you think you could tie your shoes if your arm was asleep? Or even if your fingers were numb? When working with our hands, touch does the driving, and vision helps out from the back seat. Pictures Under Glass is an interaction paradigm of permanent numbness. It’s a Novocaine drip to the wrist. It denies our hands what they do best. And yet, it’s the star player in every Vision Of The Future.

As Anne Mangen has argued, in her work on the tactile aspects of reading and writing, we tend to ignore the important role that our sense of touch plays in our intellectual and emotional lives, probably because we are unconscious of its effects. Unfortunately, that makes it easy for us to sacrifice the richness of our tactile sense when we use, or design, computers. We settle for pictures under glass, for numbness.

Chitchat

From an interview with William Gibson in The Paris Review:

For someone who so often writes about the future of technology, you seem to have a real romance for artifacts of earlier eras.

It’s harder to imagine the past that went away than it is to imagine the future. What we were prior to our latest batch of technology is, in a way, unknowable. It would be harder to accurately imagine what New York City was like the day before the advent of broadcast television than to imagine what it will be like after life-size broadcast holography comes online. But actually the New York without the television is more mysterious, because we’ve already been there and nobody paid any attention. That world is gone.

My great-grandfather was born into a world where there was no recorded music. It’s very, very difficult to conceive of a world in which there is no possibility of audio recording at all. Some people were extremely upset by the first Edison recordings. It nauseated them, terrified them. It sounded like the devil, they said, this evil unnatural technology that offered the potential of hearing the dead speak. We don’t think about that when we’re driving somewhere and turn on the radio. We take it for granted.

So true. When we think about technological change, we always think forward. But what was it like to have no electricity, no automobiles, no indoor plumbing, no air conditioning, no telephones, no recorded music, no movies, no TV, no radio? It was just a few generations ago – no time at all, really – and yet it’s gone, “unknowable,” as Gibson says. Weird.

From an interview with George Dyson in The European:

We used to have only human intelligence, and now that has been supplemented by computational intelligence. So we would expect the potential for innovation to become supplemented as well.

Yes and no. The danger is not that machines are advancing. The danger is that we are losing our intelligence if we rely on computers instead of our own minds. On a fundamental level, we have to ask ourselves: Do we need human intelligence? And what happens if we fail to exercise it?

The question becomes: What progress is good progress?

Right. How do we maintain our diversity? It would be a great shame to lose something like human intelligence that was developed at such costs over such a long period of time. I spent a lot of my life living in the wilderness and building kayaks. I believe that we need to protect our self-reliant individual intelligence—what you would need to survive in a hostile environment. Few of us are still living self-reliant lives. That is not necessarily a bad thing, but we should be cautious not to surrender into dependency on other forms of intelligence.

But though we think we’re in control – we’re the ones driving the car, after all – it’s rare that we’re cautious about such things. As Gibson observes, looking backward, “Nobody paid any attention.” Is it any different now? If what we take for granted is invisible to us, how can we possibly know what it will mean to lose it – or, having lost it, what it was worth?

Utopia is creepy

Works of science fiction, particularly good ones, are almost always dystopian. It’s easy to understand why: There’s a lot of drama in Hell, but Heaven is, by definition, conflict-free. Happiness is nice to experience, but seen from the outside it’s pretty dull.

But there’s another reason why portrayals of utopia don’t work. We’ve all experienced the “uncanny valley” that makes it difficult to watch robotic or avatarial replicas of human beings without feeling creeped out. The uncanny valley also exists, I think, when it comes to viewing artistic renderings of a future paradise. Utopia is creepy – or at least it looks creepy. That’s probably because utopia requires its residents to behave like robots, never displaying or even feeling fear or anger or jealousy or bitterness or any of those other messy emotions that plague our fallen world.

I’ve noticed the arrival recently of a new genre of futuristic YouTube videos. They’re created by tech companies for marketing or brand-burnishing purposes. With the flawless production values that only a cash-engorged balance sheet can buy you, they portray a not-too-distant future populated by exceedingly well-groomed people who spend their hyperproductive days going from one screen to the next. (As seems always to be the case with utopias, the atmosphere is very post-sexual.) The productions are intended to present us with visions of technological Edens, but they end up doing the exact opposite: portraying a future world that feels cold, mechanical, and repellent. And the creepiness is only intensified by the similarities between the future they conjure up and the present that we live in.

The latest in this genre comes from Microsoft, and like its predecessors it seems to be the product of a collaboration between Stanley Kubrick and David Lynch. Make sure you watch it with the sound on, because the music in these videos is always richly creepy in itself:

I love the title of this video: Productivity Future Vision (2011). It’s so evocative.

Minds askew

Iain McGilchrist, the psychiatrist and former English professor whose 2009 book on the human brain, The Master and His Emissary, is endlessly fascinating, discusses his ideas on the meaning of the brain’s hemispherical divide in this wonderful animation:

That helps explain, among many other things, why we’re so drawn to the metaphor that portrays the brain as a computer.

Retransmission of a language-based practice

Penn prof Kenneth Goldsmith has seen the future of culture, and it’s a content farm:

For the past several years, I’ve taught a class at the University of Pennsylvania called “Uncreative Writing.” In it, students are penalized for showing any shred of originality and creativity. Instead they are rewarded for plagiarism, identity theft, repurposing papers, patchwriting, sampling, plundering, and stealing. Not surprisingly, they thrive. Suddenly what they’ve surreptitiously become expert at is brought out into the open and explored in a safe environment, reframed in terms of responsibility instead of recklessness.

We retype documents and transcribe audio clips. We make small changes to Wikipedia pages (changing an “a” to “an” or inserting an extra space between words). We hold classes in chat rooms, and entire semesters are spent exclusively in Second Life. Each semester, for their final paper, I have them purchase a term paper from an online paper mill and sign their name to it, surely the most forbidden action in all of academia. Students then must get up and present the paper to the class as if they wrote it themselves, defending it from attacks by the other students. What paper did they choose? Is it possible to defend something you didn’t write? Something, perhaps, you don’t agree with? Convince us.

All this, of course, is technology-driven. When the students arrive in class, they are told that they must have their laptops open and connected. And so we have a glimpse into the future … While the author won’t die, we might begin to view authorship in a more conceptual way: Perhaps the best authors of the future will be ones who can write the best programs with which to manipulate, parse, and distribute language-based practices. Even if, as Christian Bök claims, poetry in the future will be written by machines for other machines to read, there will be, for the foreseeable future, someone behind the curtain inventing those drones, so that even if literature is reducible to mere code — an intriguing idea — the smartest minds behind the machines will be considered our greatest authors.