The editors of n+1 examine the rise of “webism” and some of its paradoxes:
The webists met the [New York] Times’s schizophrenia with a schizophrenia of their own. The worst of them simply cheered the almost unbelievably rapid collapse of the old media, which turned out, for all its seeming influence and power, to be a paper tiger, held up by elderly white men. But the best of them were given pause: themselves educated by newspapers, magazines, and books, they did not wish for these things to disappear entirely. (For one thing, who would publish their books?) In fact, with the rise of web 2.0 and the agony of the print media, a profound contradiction came into view. Webism was born as a technophilic left-wing splinter movement in the late 1960s, and reborn in early ’80s entrepreneurial Silicon Valley, and finally fully realized by the generation born around 1980. Whether in its right-leaning libertarian or left-leaning communitarian mode it was against the Man, and all the minions of the Man: censorship, outside control, narrative linearity. It was against elitism; it was against inequality. But it wasn’t against culture. It wasn’t against books! An Apple computer—why, you could write a book with one of those things. (Even if they were increasingly shaped and designed mostly so you could watch a movie.) One of the mysteries of webism has always been what exactly it wanted …
In The American Scholar, Sven Birkerts thinks about technological change and the future of imagination and the creative mind:
From the vantage point of hindsight, that which came before so often looks quaint, at least with respect to technology. Indeed, we have a hard time imagining that the users weren’t at some level aware of the absurdity of what they were doing. Movies bring this recognition to us fondly; they give us the evidence. The switchboard operators crisscrossing the wires into the right slots; Dad settling into his luxury automobile, all fins and chrome; Junior ringing the bell on his bike as he heads off on his paper route. The marvel is that all of them—all of us—concealed their embarrassment so well. The attitude of the present to the past … well, it depends on who is looking. The older you are, the more likely it is that your regard will be benign—indulgent, even nostalgic. Youth, by contrast, quickly gets derisive, preening itself on knowing better, oblivious to the fact that its toys will be found no less preposterous by the next wave of the young.
In the Times Magazine, Gary Wolf speculates that obsessive self-monitoring may be moving out of the fringe and into the mainstream:
Ubiquitous self-tracking is a dream of engineers. For all their expertise at figuring out how things work, technical people are often painfully aware of how much of human behavior is a mystery. People do things for unfathomable reasons. They are opaque even to themselves. A hundred years ago, a bold researcher fascinated by the riddle of human personality might have grabbed onto new psychoanalytic concepts like repression and the unconscious. These ideas were invented by people who loved language. Even as therapeutic concepts of the self spread widely in simplified, easily accessible form, they retained something of the prolix, literary humanism of their inventors. From the languor of the analyst’s couch to the chatty inquisitiveness of a self-help questionnaire, the dominant forms of self-exploration assume that the road to knowledge lies through words. Trackers are exploring an alternate route. Instead of interrogating their inner worlds through talking and writing, they are using numbers. They are constructing a quantified self.
Placing the spreadsheeting-of-the-self trend in the context of the social-networking trend, Wolf observes, “You might not always have something to say, but you always have a number to report.” To give it a different spin: Who needs imagination when you have the data?