Blogging: a great pastime for the elderly

I remember when it was kind of cool to be a blogger. You’d walk around with a swagger in your step, a twinkle in your eye. Now it’s just humiliating. Blogging has become like mahjong or needlepoint or clipping coupons out of Walgreens circulars: something old folks do while waiting to croak.

Did you see that new Pew study that came out yesterday? It put a big fat exclamation point on what a lot of us have come to realize recently: blogging is now the uncoolest thing you can do on the Internet. It’s even uncooler than editing Wikipedia articles or having a Second Life avatar. In 2006, 28% of teens were blogging. Now, just three years later, the percentage has tumbled to 14%. Among twentysomethings, the percentage who write blogs has fallen from 24% to 15%. Writing comments on blogs is also down sharply among the young. It’s only geezers – those over 30 – who are doing more blogging than they used to.

Here’s how Pew puts the bad news:

While blogging among adults as a whole has remained steady, the prevalence of blogging within specific age groups has changed dramatically in recent years. Specifically, a sharp decline in blogging by young adults has been tempered by a corresponding increase in blogging among older adults.

They even have a chart, just to rub salt in the wound:

[Pew chart: teen blogging in decline]

When I blog these days, I feel like I should be sitting in a rocking chair, wearing a highly absorptive undergarment, and writing posts debunking some overhyped new bunion treatment (iPads?).

Yesterday I was out taking a walk and I happened to pass a group of tweens congregating on a street corner. I heard one of them say, “Hey, that guy’s a blogger,” and then they all started throwing their empty energy-drink cans at me. I had to take refuge in a Starbucks. I spent a half hour crying into my double-tall.

I hear that in middle schools “blogger” has become the most common term of abuse, playing roughly the same role that “wuss” used to play:

“You’re such a blogger, Derek.”

“You’re the blogger, Sean.”

“Am not.”

“Are too!”

Blogger jokes are turning into the new big thing on college campuses:

Q: How many bloggers does it take to screw in a lightbulb?

A: Who cares?

Very funny.

Eric Schmidt’s second thoughts

I admit to having a bit of a personal interest in this, but I’ve been fascinated to see how the thinking of Eric Schmidt, Google’s CEO, has evolved over the past few years on the question of the Net’s effect on reading and cognition. Here are three quotes from Schmidt, the most recent of which came yesterday:

July 30, 2008: “I just got this in my in-box. Anybody read it? The Atlantic: ‘Is Google Making Us Stupid?’ I mean, we’ve got a problem if this is true, right? In the article, the author … points out that deep reading is equal to deep thinking, and since we’re not reading deep anymore, we’re obviously not deep thinking. And what I was realizing in reading this – and I encourage you all to read it – is that this is exactly what people said when color television arrived in my home in Virginia 40 years ago. This is also what people said 25 years ago when the MTV phenomenon occurred, about short attention spans and so forth. And I observe that we’re smarter than ever. So the important point here is that [despite] all of these sort of histrionics about the role of information and other changes, society is enormously powerful, enormously capable of adapting to the threats.”

March 6, 2009: “I worry that the level of interrupt, the sort of overwhelming rapidity of information — and especially of stressful information — is in fact affecting cognition. It is in fact affecting deeper thinking. I still believe that sitting down and reading a book is the best way to really learn something. And I worry that we’re losing that.”

January 29, 2010: “The one thing that I do worry about is the question of ‘deep reading.’ As the world looks to these instantaneous devices … you spend less time reading all forms of literature, books, magazines and so forth. That probably has an effect on cognition, probably has an effect on reading.”

I’m glad Schmidt has continued to ponder this issue, and I salute him for having the courage to air his concerns publicly.

Tweet fantasy

How cool would it have been if Twitter had been invented a couple hundred years ago so our forebears could have used it?

transcendo: RT @emerson new idea: “the making a fact the subject of thought raises it” http://bit.ly/cAhzDL <----interesting!

J. D. Salinger and me

I just heard the sad news that J. D. Salinger has died. He was 91.

I went to school at Dartmouth College in Hanover, New Hampshire, which is just a few miles north of Cornish, New Hampshire, where Salinger lived. During the summer between my junior and senior year, I had a job at the circulation desk at the Dartmouth library. I was working one morning when my boss tapped me on the shoulder and motioned with his head over to the side of the desk. I just caught a glimpse of a tall, slender, slightly stooped man going through the doorway into the stacks. “That’s J. D. Salinger,” my boss whispered.

Holy crap, I thought. I just saw J. D. Salinger.

About ten minutes later Salinger suddenly reappeared at the desk, holding a dollar bill. I went over to him, and he said he needed change for the Xerox machine. I took his dollar and gave him four quarters.

That’s my claim to fame: I gave J. D. Salinger change for a buck.

From writing to texting

The Britannica Blog is running, in conjunction with The Futurist magazine, a forum on Learning & Literacy in the Digital Age, which includes a piece by me on the resilience of the written word. (The brief piece originally appeared in a recent issue of The Futurist.)

First paragraph:

The written word seems so horribly low tech. It hasn’t changed much for a few millennia, at least since the ancient Greeks invented symbols for vowels. In our twitterific age of hyperspeed progress, there’s something almost offensive in such durability, such pigheaded resilience. You want to grab the alphabet by the neck, give it a shake, and say, Get off the stage, dammit. Your time is up.

Read the rest of it.

Hello iPad, Goodbye PC

The New Republic has published my commentary on Apple’s iPad announcement. I reprint it here (with the important second sentence, which was cut from the New Republic version, restored):

The PC era ended this morning at ten o’clock Pacific time, when Steve Jobs mounted a San Francisco stage to unveil the iPad, Apple’s version of a tablet computer. What made the moment epochal was not so much the gadget itself – an oversized iPod Touch tricked out with an e-reader application and a few other new features – but the clouds of hype that attended its arrival.

Tablet computers have been kicking around for a decade, but consumers have always shunned them. They’ve been viewed as nerdy-looking smudge-magnets, limited by their cumbersome shape and their lack of a keyboard. Tablets were a solution to a problem no one had.

The rapturous anticipation of Apple’s tablet – the buildup to Jobs’s announcement blurred the line between media feeding-frenzy and orgiastic pagan ritual – shows that our attitude to the tablet form has shifted. Tablets suddenly look attractive. Why? Because the nature of personal computing has changed.

Until recently, we mainly used our computers to run software programs (Word, Quicken) installed on our hard drives. Now, we use them mainly to connect to the vast databases of the Internet – to “the cloud,” as the geeks say. And as the Internet has absorbed the traditional products of media – songs, TV shows, movies, games, the printed word – we’ve begun to look to our computers to act as multifunctional media players. They have to do all the work that was once done by specialized technologies – TVs, stereos, telephones, newspapers, books – as well as run a myriad of software apps. The computer business and the media business are now the same business.

The transformation in the nature of computing has turned the old-style PC into a dinosaur. A bulky screen attached to a bulky keyboard no longer fits with the kinds of things we want to do with our computers. The shortcomings of the PC have created, the iPad hype suggests, a yearning for a new kind of device – portable, flexible, always connected – that takes computing into the cloud era.

Suddenly, in other words, the tablet is a solution to a problem everyone has. Or at least it’s one possible solution. The computing market is now filled with all sorts of networked devices, each seeking to fill a lucrative niche. There are dozens of netbooks, the diminutive cousins to traditional laptops, from manufacturers like Acer and Asus. There are e-readers like Amazon’s Kindle and Barnes & Noble’s Nook. There are smartphones like Apple’s iPhone and Google’s Nexus One. There are gaming consoles like Nintendo’s Wii and Microsoft’s Xbox. In some ways, personal computing has returned to the ferment of its earliest days, when the market was fragmented among lots of contending companies, operating systems, and technical standards.

With the iPad, Apple is hoping to bridge all the niches. It wants to deliver the killer device for the cloud era, a machine that will define computing’s new age in the way that the Windows PC defined the old age. The iPad is, as Jobs said today, “something in the middle,” a multipurpose gadget aimed at the sweet spot between the tiny smartphone and the traditional laptop. If it succeeds, we’ll all be using iPads to play iTunes, read iBooks, watch iShows, and engage in iChats. It will be an iWorld.

But will it succeed? The iPad is by no means a sure bet. It is still, after all, a tablet – fairly big and fairly heavy. Unlike an iPod or an iPhone, you can’t stick an iPad in your pocket or pocketbook. It also looks to be a cumbersome device. The iPad would be ideal for a three-handed person – two hands to hold it and another to manipulate its touchscreen – but most humans, alas, have only a pair of hands. And with a price that starts at $500 and rises to more than $800, the iPad is considerably more expensive than the Kindles and netbooks it will compete with.

But whether it finds mainstream success or not, the iPad is the clearest sign yet that we’ve entered a new era of computing, in which media and software have merged in the Internet cloud. It’s hardly a surprise that Apple – more than Microsoft, IBM, or even Google – is defining the terms of this new era. Thanks to Steve Jobs, a bohemian geek with the instincts of an impresario, Apple has always been as much about show biz as about data processing. It sees its products as performances and its customers as both audience members and would-be artists.

Apple endured its darkest days during the early 1990s, when the PC had lost its original magic and turned into a drab, utilitarian tool. Buyers flocked to Dell’s cheap, beige boxes. Computing back then was all about the programs. Now, computing is all about the programming – the words and sounds and pictures and conversations that pour out of the Internet’s cloud and onto our screens. Computing, in other words, has moved back closer to the ideal that Steve Jobs had when he founded Apple. Today, Jobs’s ambitions are grander than ever. His overriding goal is to establish his company as the major conduit, and toll collector, between the media cloud and the networked computer.

Jobs doesn’t just want to produce glamorous gizmos. He wants to be the impresario of all media.

The Shallows: table of contents

My next book, The Shallows: What the Internet Is Doing to Our Brains, argues that the tools we use to think with – our “intellectual technologies” – not only shape our habits of thought but exert an actual physical influence on the neurons and synapses in our brains. I look at the Internet, an extraordinarily powerful intellectual technology, in this context, examining what the scientific and historical evidence tells us about the effects it is having on our thoughts, memories, and even emotions – and how different those effects are from the ones exerted by earlier intellectual technologies such as the printed book.

Here’s the table of contents for The Shallows:

Prologue: “The Watchdog and the Thief”

Chapter 1: “HAL and Me”

Chapter 2: “The Vital Paths”

Chapter 3: “Tools of the Mind”

Chapter 4: “The Deepening Page”

Chapter 5: “A Medium of the Most General Nature”

Chapter 6: “The Very Image of a Book”

Chapter 7: “The Juggler’s Brain”

Chapter 8: “The Church of Google”

Chapter 9: “Search, Memory”

Chapter 10: “A Thing Like Me”

Epilogue: “Human Elements”

The Shallows will be published in June in North America, by W. W. Norton, and in September in the U.K., by Atlantic Books. Translated editions are also forthcoming.