Thursday, May 19, 2011

The Rise of Social Media is NOT the Death of the Human Soul

   Bill Keller, executive editor of the New York Times, published this Op-Ed on the NYT website tonight in which he argues that the adoption of social media comes at the price of "a piece of ourselves": namely, "the soul." It seems to me that a lot of people who grew up before the advent of the internet are afraid that we're somehow losing ourselves to this giant, looming technology-beast. But I really think our souls are safe where they are.
   As an example of what humans have lost due to the progress of technology, Keller claims that we long ago lost the ability to memorize vast quantities of information due to the rise of the printing press:
Until the 15th century, people were taught to remember vast quantities of information. Feats of memory that would today qualify you as a freak — the ability to recite entire books — were not unheard of.
Then along came the Mark Zuckerberg of his day, Johannes Gutenberg. As we became accustomed to relying on the printed page, the work of remembering gradually fell into disuse. 
   First of all, it's not as if everyone in medieval Europe could just start belting out passages of their favorite epic poem on a whim. Some people could, sure. And they were called bards, and they made their living doing just that. But prodigious memorization has always been a limited skill — one that simply had a more prominent role before the rise of print and electronic means of preserving information. I also wonder why he thinks this skill has been lost. What about stage actors? They make their living doing this every day.
  But even if it were true: who cares? As he notes,
"What little memory we had not already surrendered to Gutenberg we have relinquished to Google. Why remember what you can look up in seconds?"
  Well... exactly. As the researcher he quotes states, "We are not recording devices." Why should we try to be? 
  
   Keller proposes that a downside of having machines sort through information for us is a decreased ability to problem-solve by ourselves:
Robert Bjork, who studies memory and learning at U.C.L.A., has noticed that even very smart students, conversant in the Excel spreadsheet, don’t pick up patterns in data that would be evident if they had not let the program do so much of the work.
“Unless there is some actual problem solving and decision making, very little learning happens,” Bjork e-mailed me.
  Fine, that's legit. It seems to me, though, that this is a problem we can get around by requiring students to work through a few problems early on, so they understand the process of arriving at a particular answer. We already do the same thing in high schools across the country every single day: remember proofs from geometry? The only reason they are taught is to show you the process of thinking through why something is true.

   Keller only vaguely entertains the value of saved time and effort, which is critical to the whole argument:
The upside is that this frees a lot of gray matter for important pursuits like FarmVille and “Real Housewives.”
   Clearly, he's joking here, but in doing so he brushes aside an important point: we end up with a lot more free time and brain space that people can put toward any pursuit - FarmVille, maybe, but maybe also reading a few more books, putting a few more hours into a career, or spending more time with family.
   Finally, he implies that communication through social media isn't real communication.
I’m not even sure these new instruments are genuinely “social.” There is something decidedly faux about the camaraderie of Facebook, something illusory about the connectedness of Twitter.
   I do see the argument, and I think there are legitimate concerns here. But I don't think conversation through social media is any less real because it is online: it is a bona fide, emotion-arousing, intellect-stimulating exchange of opinion, perspective, and feeling. I even believe that love can exist through the internet.
As a kind of masochistic experiment, the other day I tweeted “#TwitterMakesYouStupid. Discuss.” It produced a few flashes of wit (“Give a little credit to our public schools!”); a couple of earnestly obvious points (“Depends who you follow”); ... and an awful lot of nyah-nyah-nyah (“Um, wrong.” “Nuh-uh!!”). 
Almost everyone who had anything profound to say in response to my little provocation chose to say it outside Twitter.
   This is pretty unfair. If you're going to say that everything that happened within 140 characters was at best "a flash of wit," and that everything interesting took longer than that to say, then you're setting a double standard. If you're looking for a long, deep exchange of opinions, go elsewhere. Twitter simply does not facilitate that: it's built for dissemination, not discussion. And it's really good at what it was built for.

   Reason, debate, intellect, and emotion still exist and can be expressed online; because of the restrictions of Twitter as a platform, you're simply unlikely to find them there. It's unfair to declare that social media promotes "faux" conversation just because it isn't transmitted in the traditional mode. As with anything else, it is healthier to focus on the possibilities created by a new status quo than to stew over the loss of the old one.
