Archive for the ‘The View from Harvard’ Category

Robert Darnton is a very smart man. A professor and librarian at Harvard, he has recently been an outspoken critic of the Google Book Settlement Agreement (see “Google and the New Digital Future” and “Google’s Loss: The Public’s Gain”). In the wake of Judge Denny Chin’s sound (and righteous) rejection of the settlement on March 22, 2011, Darnton continues to press, courageously, for a Digital Public Library of America (DPLA), “a collection of works in all formats that would make our cultural heritage available online and free of charge to everyone everywhere.”

So I was pretty surprised to catch his glib and naive op-ed in The Chronicle of Higher Education, “5 Myths About the ‘Information Age.’” In an attempt to dispel what he sees as a lingering “false consciousness” about the ascendancy of new information technologies, he creates some pungent myths, and some curious straw-man arguments, of his own. I’ll take his points one at a time.

1. The Book is Dead. Darnton cites statistics showing that more books are produced each year than the year before.  Hence, the book is alive and well. While there is a contingent of sentimentalists who cling to the physicality of the book (how it feels, smells) and lament the shift to digital texts, most people who worry about the death of books are really concerned with the death of reading in whatever format, specifically the kind of careful reading that generates thoughtful analysis and original writing.

Two comprehensive studies by the NEA from 2004 and 2010 (see here) conclude that “there is a general decline in reading among teenage and adult Americans,” and that “both reading ability and the habit of regular reading have greatly declined among college graduates.” (Darnton asks later in the article whether it’s really true “that deep reading has declined.” I guess he has his answer.)

Just because hundreds of thousands of books are published every year doesn’t mean anyone is reading them.

2. We have entered the information age. “Every age is an age of information,” Darnton writes, “each in its own way and according to the media available at the time.” He admits that methods of communication are changing “rapidly,” but denies that the change is “unprecedented.” But does it really matter which adjective applies? What’s important is that more information is available today, and more readily available, than ever before. And that’s a good thing.

Good or bad, the popular technology industry is dedicated almost exclusively to producing a never-ending series of devices that allow us to consume ever greater amounts of information at ever greater speeds. If this isn’t an age characterized by a general, almost obsessive preoccupation with information—whether anyone is doing anything useful with it is no longer the point—then I don’t know what the hell to call it.

3. All information is now available online. I don’t know anyone who thinks that all information is available online, but quite a few people think, equally incorrectly, that any information not found online is superfluous or irrelevant. This is the misconception in need of remedy. Instead, Darnton goes on about the absurdity of the claim (which no one seriously makes) being obvious to “anyone who has ever done research in archives,” as if legions of people outside of universities are doing archival research, or any sort of research that involves typing more than a few words into Google.

I understand and appreciate that he wants to make a case for print here—and he knows more than anyone that physical texts endure like their digital equivalents never will—but I’m disappointed that he threw away such a great chance to address the genuine false consciousness at hand. That’s something that can be said about the whole essay, unfortunately.

4. Libraries are obsolete. Because the libraries at Harvard and in New York are “crammed with people,” libraries across the land must also be doing just fine, Darnton suspects, despite the fact that many of them have closed or are closing (see here, here, and here, for instance). At universities, libraries are being converted into student commons (see here and here) that feature cafés and computers but few, if any, books. The thriving library my parents took me to every week as a kid, the library in which I learned to love to read, is now closed three days a week, and operates the rest of the time on reduced hours with reduced staff. Hey, at least it’s still open.

Libraries are not obsolete. They are more important than ever. But, in regions less well-endowed than Cambridge and Manhattan, they are being discarded, shamelessly debased.

5. The future is digital. Darnton knows that we are shifting to a “dominantly digital ecology,” but wants to assure the skeptics that “new technology is reinforcing old modes of communication rather than undermining them.” In fact, it’s doing both. The question to ask is, Do the sellers of these technologies make more money by encouraging and rewarding distractibility than they do by encouraging and rewarding deep reading and archival research?

Things look much brighter from Harvard, clearly. But does the view from Ivy League windows provide a true representation of the world most of us live in? To quote the last words of The Sun Also Rises: “Isn’t it pretty to think so?”


Novelist Zadie Smith’s review of David Fincher’s film The Social Network (“Generation Why?”) is so palpably, patronizingly self-righteous that it’s damn near impossible to take seriously any of her more cogent criticisms of Facebook and Web 2.0. She argues that we should resist Facebook and programs like it because they reduce human beings to data sets on websites, but she does so in such a way as to reduce an entire generation of human beings to mesmerized idiots who are powerless to resist Facebook and programs like it.

Throughout the essay, Smith takes special pleasure in dissecting what she sees as the pernicious idiosyncrasies of Facebook and its creator, Mark Zuckerberg, but she seems awfully proud that she was there, at Harvard, suffering “the awful snow… turning your toes gray, destroying your spirit, bringing a bloodless end to a squirrel on my block,” when Facebook made its debut. It’s a hard-knocks life at Harvard, apparently, and the only things worse than the “inanimate, perfect” snow and dead squirrels were the animate, imperfect students:

I felt distant from Zuckerberg and all the kids at Harvard. I still feel distant from them now, ever more so, as I increasingly opt out (by choice, by default) of the things they have embraced. We have different ideas about things. Specifically we have different ideas about what a person is, or should be.

She goes on to describe these kids, the bloodless cyborgs playing theatrical foil to her intrepidly unique flesh-and-blood self, as “Generation Facebook.” Being British-born, Smith likely was never tagged with the Generation X epithet, but those of us who were can tell her that it’s always a bad idea to make baseless (and base) generalizations about everyone of a certain age (or color, or nationality, etc.). If Smith herself is too sentient to get “entrapped” by the program (which she used and then promptly—and bravely, we are led to believe—quit), might not others also possess such impressive powers of self-determination?

Facebook is different things to different people. For me, a thirty-eight-year-old married guy, it’s a diversionary toy that allows me to antagonize my friends and gauge my mother’s mood without picking up the phone or sending an email. If I were 20 years younger, it might be the virtual equivalent of a physical hangout, like the diner in Happy Days or the café in Friends—or it might still be a diversionary toy. There are no doubt others, of all ages, who think that adding more “friends” actually means having more friends, and who generally confuse virtual worlds with the real one. But these others are dupes in need of a philosophical education; we shouldn’t elevate their misrepresentations—through fear, ignorance, or, in Smith’s case, self-aggrandizing elitism—into majority rule.

“Connection is the goal,” she says of Zuckerberg’s program. “The quality of that connection, the quality of the information that passes through it, the quality of the relationship that connection permits—none of this is important.” She’s right, but that doesn’t mean social network users automatically have or want fewer meaningful relationships. (I’m repeating myself, but the fact that she thinks so poorly of our cognitive defenses and so highly of hers vexes me to my core.) Do we have to be vigilant about the time we spend wrapped up in the distraction technology (Nicholas Carr’s phrase) of the internet? Of course, and she quotes programmer Jaron Lanier (You Are Not a Gadget: A Manifesto) on the dangers of getting hooked on our screens:

Different media designs stimulate different potentials in human nature. We shouldn’t seek to make the pack mentality as efficient as possible. We should instead seek to inspire the phenomenon of individual intelligence.

And again:

Resist the easy grooves [these designs] guide you into. If you love a medium made of software, there’s a danger that you will become entrapped in someone else’s recent careless thoughts. Struggle against that!

Yes on all counts, but there’s a difference between a call to vigilance from a technology industry insider and a grandstanding performance by an actor playing The Last Real Person on Earth:

I am dreaming of a Web that caters to a kind of person who no longer exists. A private person, a person who is a mystery, to the world and—which is more important—to herself.

Mark Zuckerberg wants you to buy into the idea that a person—that personhood—can be genuinely represented by strings of data in a few boxes on a screen. We should resist this nonsense, just as we resist countless other lies on a daily basis. Just as important, we need to resist the ancient impulse to lump people we don’t really understand into pernicious categories that are, in words Smith uses to describe Facebook, “self-promoting” and “slickly disingenuous.”
