The Twitter Trap?

Has technology robbed us of more than it's given in return?

NYT executive editor Bill Keller isn’t a Luddite–he just hates technology. His 13-year-old daughter’s joining Facebook has set him off:

I don’t mean to be a spoilsport, and I don’t think I’m a Luddite. I edit a newspaper that has embraced new media with creative, prizewinning gusto. I get that the Web reaches and engages a vast, global audience, that it invites participation and facilitates — up to a point — newsgathering. But before we succumb to digital idolatry, we should consider that innovation often comes at a price. And sometimes I wonder if the price is a piece of ourselves.

Joshua Foer’s engrossing best seller “Moonwalking With Einstein” recalls one colossal example of what we trade for progress. Until the 15th century, people were taught to remember vast quantities of information. Feats of memory that would today qualify you as a freak — the ability to recite entire books — were not unheard of.

Then along came the Mark Zuckerberg of his day, Johannes Gutenberg. As we became accustomed to relying on the printed page, the work of remembering gradually fell into disuse. The capacity to remember prodigiously still exists (as Foer proved by training himself to become a national memory champion), but for most of us it stays parked in the garage.

Sometimes the bargain is worthwhile; I would certainly not give up the pleasures of my library for the ability to recite “Middlemarch.” But Foer’s book reminds us that the cognitive advance of our species is not inexorable.

My father, who was trained in engineering at M.I.T. in the slide-rule era, often lamented the way the pocket calculator, for all its convenience, diminished my generation’s math skills. Many of us have discovered that navigating by G.P.S. has undermined our mastery of city streets and perhaps even impaired our innate sense of direction. Typing pretty much killed penmanship. Twitter and YouTube are nibbling away at our attention spans. And what little memory we had not already surrendered to Gutenberg we have relinquished to Google. Why remember what you can look up in seconds?

Robert Bjork, who studies memory and learning at U.C.L.A., has noticed that even very smart students, conversant in the Excel spreadsheet, don’t pick up patterns in data that would be evident if they had not let the program do so much of the work. “Unless there is some actual problem solving and decision making, very little learning happens,” Bjork e-mailed me. “We are not recording devices.”

Although, apparently, we used to be.

Look, there’s no doubt that technology changes how we live, including how we use our brains. But doesn’t the mere fact that our technology is constantly evolving refute the notion that we’ve lost problem-solving capability? I don’t know whether the cognitive advance of our species is inexorable–but it certainly continues to roll full steam ahead.

Have we lost something? Well, I suppose. My penmanship certainly isn’t what it was in junior high. Then again, my typing has improved–and is actually more readable than my best cursive ever was. Would I know my way around better if I had to rely on my memory rather than Google Maps? Probably. I often find myself looking up directions to somewhere I’ve been once or twice before. But, strangely, I actually find it much easier than it used to be to get to places I’ve never been, precisely because I can look them up and get directions.

While my fact retention remains pretty good, it’s not what it used to be. How much of that is a function of age versus technology, I can’t say. Then again, I’m able to verify and supplement what I do know with amazing ease thanks to search engines. As my colleague Alex Knapp reminds me, the ability to look stuff up isn’t a substitute for being widely read and informed; having independent context is invaluable. But being able to look things up without leaving my desk–or, indeed, wherever I happen to be, since I have a smartphone–beats having to rely on nothing but memory or having to take a trip to the library.

Next, though, Keller goes from the arguable to the absurd:

Basically, we are outsourcing our brains to the cloud. The upside is that this frees a lot of gray matter for important pursuits like FarmVille and “Real Housewives.” But my inner worrywart wonders whether the new technologies overtaking us may be eroding characteristics that are essentially human: our ability to reflect, our pursuit of meaning, genuine empathy, a sense of community connected by something deeper than snark or political affinity.

We now have the ability to connect, in quite intimate fashion, with people we’d never have met in the pre-Internet world. I have hundreds of interactions with people from literally all over the world on a daily basis. Contrast this with the provincial existence most had twenty years ago, and it’s obvious that the opportunities to reflect and develop genuine empathy have blossomed rather than atrophied. If you want to spend the saved time playing FarmVille instead, that’s your problem.

The most obvious drawback of social media is that they are aggressive distractions. Unlike the virtual fireplace or that nesting pair of red-tailed hawks we have been live-streaming on nytimes.com, Twitter is not just an ambient presence. It demands attention and response. It is the enemy of contemplation. Every time my TweetDeck shoots a new tweet to my desktop, I experience a little dopamine spritz that takes me away from . . . from . . . wait, what was I saying?

I read and send a lot of tweets. I’m at over 23,000 sent and have read many times that number. Yet, somehow, I’m able to turn away from TweetDeck for hours–even days–at a time when I’ve got something else to do. And, seriously, the most prestigious newspaper in the United States is live-streaming nesting birds and a fake fireplace? While calling us shallow?

My mistrust of social media is intensified by the ephemeral nature of these communications. They are the epitome of in-one-ear-and-out-the-other, which was my mother’s trope for a failure to connect.

I’m not even sure these new instruments are genuinely “social.” There is something decidedly faux about the camaraderie of Facebook, something illusory about the connectedness of Twitter. Eavesdrop on a conversation as it surges through the digital crowd, and more often than not it is reductive and redundant. Following an argument among the Twits is like listening to preschoolers quarreling: You did! Did not! Did too! Did not!

Perhaps it’s because Keller has sent only 20 tweets in the two-year history of his Twitter account? If your account’s a placeholder, it’s hardly going to be a source of enlightening conversation and deep connection. Nobody says you have to eavesdrop. You’ve got 18,777 followers. Engage with them.

And, if all you’re getting is “You did! Did not! Did too! Did not!” you’re following the wrong people. Hell, try following people who work for you, like CJ Chivers and Liz Heron.

FILED UNDER: Science & Technology, Uncategorized
About James Joyner
James Joyner is Professor and Department Head of Security Studies at Marine Corps University's Command and Staff College. He's a former Army officer and Desert Storm veteran. Views expressed here are his own. Follow James on Twitter @DrJJoyner.

Comments

  1. So Mr. Keller thinks that we don’t know as much because we don’t memorize entire books? Really? Counter this with what Bill James has noted before concerning the difficulty of comparing baseball players of the early 20th century with those of today. All we really know about Honus Wagner is what his statistics are and what his contemporaries thought of him. Meanwhile, you’d be amazed how many people know who the fifth starter for the NY Mets is, what his ERA is, and even what he eats for breakfast.

    Now we can choose to fill our heads with trivia or meaningful information, but it is hard to imagine that the average person on the street doesn’t actually carry around a lot more information about his/her world now than the average person on the same unpaved cartpath did before Luddite became an appropriate adjective for an editor of the NY Times. I think Mr. Keller is merely lamenting his loss of importance as a gatekeeper to information.

  2. john personna says:

    He lost me at “digital idolatry” because “digital” is a medium. It’s a medium supporting a galaxy of messages. No one is such a darn fool that they endorse all of them. I’d suggest he read some Claude Shannon … if I thought he had a receptive mind.

    Other than that, since I am hiking and navigating these days, my personal theory is that our memory is for places and things. I enjoy a pretty good spatial and trail memory. How dare Keller subvert my navigational memory from its primary role!

  3. John Burgess says:

    Didn’t Plato rant some time ago about how writing and reading were going to weaken minds? Why yes, I believe he did… It’s so sad that humanity has been trapped in the Bronze Age since.

  4. Scott scoggins says:

    Yes, my dear. It’s called “Death by Technology,” by one of the most brilliant minds I’ve ever read.

    Keep living, you’ll see what I mean.