The Harold Hill Award for Most Transparent Con Job of the Year goes to Marc Prensky, who suggests in the formerly estimable Chronicle of Higher Education that paper books be banned from university campuses (“In the 21st Century, Let’s Ban (Paper) Books”). I honestly thought I was reading satire when I got to this part:

Any physical books in students’ possession at the beginning of the year would be exchanged for electronic versions, and if a student was later found with a physical book, it would be confiscated (in return for an electronic version).

That’s right—confiscated! Like a joint, or a switchblade, or firecrackers, or George Orwell’s 1984 (well, unless it’s on your Kindle, and wasn’t subsequently erased from your Kindle). Fascism is so fashionably futuristic!

What is, according to Prensky, the greatest advantage of this all-digital campus, as opposed to the seemingly more reasonable pursuit of utilizing the best of both worlds—digital books along with hard copy?

Ideas would be freed from the printed page, where they have been held captive for too many centuries.

Say again?

The physical book is, in many ways, a jail for ideas—once a book is read, closed, and shelved, for most people it tends to stay that way. Many of us have walls lined with books that will never be reopened, most of what is in them long forgotten.

But what if all those books were in our pockets and could be referred to whenever we thought of them?

Let me try to wrap my head around this one. Once I’ve read a book, not only do I forget it, I am for some reason unable to take it off the shelf and read it again. But if I had an e-reader on my person, I would be instantly empowered to do just that, as if the technology could do the work of reading for me. The argument is ridiculous to the point of being gibberish.

Most of Prensky’s witless rhetoric about “moving education into the future” I’ve written about before (see my posts here and here), and I’m not going to pick the whole pile apart again. So let’s skip to the punchline.

Many entrepreneurs are already inventing software that allows the quick and fertile connection of one’s ideas and those of others, but an all-digital campus would provide a powerful incentive to develop those programs even faster and take them further.

If you don’t know what Marc Prensky does for a living, here’s a hint: He’s an educational software designer and author of books promoting “21st-century student learning through technology.” His new educational paradigm can be summed up quite easily: Less reading of books, more playing of games.

In my last post I talked a lot about libraries, but not at all about librarians. That was a mistake. While I vehemently object to Godin’s flippant assertion that we need libraries “not at all,” I agree with him that librarians are more important than ever (though his suggestion that they teach students “how to use a soldering iron or take apart something with no user serviceable parts inside” is almost surreal). We need them to, in his words, “figure out creative ways to find and use data,” especially now that there is so much of it, and so much of it is suspect.

Bobbi Newman, at Librarian by Day, also addressed Godin’s post:

We ARE fighting for the future of the librarian as a producer, concierge, connector, teach[er] and impresario, but we know to do that we need books. We need the information contained in those books…

And she makes a great point about all this “free” information that apparently renders books obsolete. In fact, libraries pay for this information, and “The price of those databases is going up, not down.” I like Wikipedia a lot, but it is not and never will be a library.

The lesson here is that, if you want to know something about the future role of librarians, it’s best to ask librarians. Just be sure to wait until they’re done soldering.

Seth Godin writes motivational books for managers who think throwing day-long seminars featuring highly paid speakers (like Seth Godin) is a better motivator than taking everyone to a no-strings-attached lunch or, better yet, giving everyone a day off work. He says his latest book, Poke the Box, is about “the spark that brings things to life.” He firmly believes that “We need to be nudged away from conformity and toward ingenuity, toward answering unknown questions for ourselves.” First of all, I can neither ask nor answer a question of myself if that question is unknown. Second, if I’m supposed to answer questions for myself, then why do I need to buy his silly book?

On Godin’s blog (free of charge!) you will find sparky, nonconformist ruminations such as, “Just imagine how much you’d get done… if you stopped actively sabotaging your own work.” In another post (“Self directed effort is the best kind”) he writes, “Effort’s… incredibly difficult to deliver on a regular basis. So we hire a trainer or a coach or a boss… There’s an entire system organized around the idea that we’re too weak to deliver effort without external rewards and punishment.” Isn’t that cute? My dear Seth, you are that system.

I’m generally indifferent to Seth Godin and the self-help racket that keeps him and many others so comfortable and so slickly chipper, but he has gone and made one of those pronouncements about the future of libraries so popular these days among cost-cutting officials and charismatic geeks who stand to make oodles of money on the internet and digital technologies.

In short, Godin believes that libraries as warehouses of “dead books” are obsolete, or soon will be, thanks to Netflix, Kindle, and Wikipedia. (Wikipedia and online sources have, in fact, “basically eliminated the library as the best resource” for grade schoolers and undergraduates. It sounds wacky, I know, but if we assume students are researching headier topics such as Magic: The Gathering and Star Wars action figures, then he’s absolutely correct.) His library of the future “is filled with so many web terminals there’s always at least one empty,” and it has “The vibe of the best Brooklyn coffee shop combined with a passionate raconteur of information…” I found the following especially creative:

And the people who run this library don’t view the combination of access to data and connections to peers as a sidelight–it’s the entire point.

No, Seth. The point of a library is that it offers free public access to information, most of which is available to be taken home, freely, for extended periods of time.

Providing web access to those who can’t afford it has rightly become a core service of public libraries; so, by all means, bring in more terminals. But until every book currently available to libraries in physical copy is available online or via e-reader, and until computers and/or e-readers are readily available to be freely taken home by the public, then libraries must fulfill their mission—conformist though it may be—of offering “dead books” to those who seek them.

People with money tend to devalue the difference between something that is relatively inexpensive and something that is free. What they experience as an inconvenient gap is seen by those without money as an unbridgeable gulf, often painful to look upon. It is just conceivable that in a few years “[e]readers will be as expensive as Gillette razors, and ebooks will cost less than the blades,” but even if it’s true, the plain fact is that many will have to choose between razors and e-readers.

You can poke whatever box you like, Godin, so long as that box isn’t the one institution in the civilized world that offers the means for its citizens, regardless of economic or social standing, to freely educate themselves.

Robert Darnton is a very smart man. A professor and librarian at Harvard, he has recently been an outspoken critic of the Google Book Settlement Agreement (see “Google and the New Digital Future” and “Google’s Loss: The Public’s Gain“). In the wake of Judge Denny Chin’s sound (and righteous) rejection of the settlement on March 22, 2011, Darnton continues to press, courageously, for a Digital Public Library of America (DPLA), “a collection of works in all formats that would make our cultural heritage available online and free of charge to everyone everywhere.”

So I was pretty surprised to catch his glib and naive op-ed in The Chronicle of Higher Education, “5 Myths About the ‘Information Age.’” In an attempt to dispel what he sees as a lingering “false consciousness” about the ascendancy of new information technologies, he creates some pungent myths, and some curious straw man arguments, of his own. I’ll take his points one at a time.

1. The Book is Dead. Darnton cites statistics showing that more books are produced each year than the year before.  Hence, the book is alive and well. While there is a contingent of sentimentalists who cling to the physicality of the book (how it feels, smells) and lament the shift to digital texts, most people who worry about the death of books are really concerned with the death of reading in whatever format, specifically the kind of careful reading that generates thoughtful analysis and original writing.

Two comprehensive studies by the NEA from 2004 and 2010 (see here) conclude that “there is a general decline in reading among teenage and adult Americans,” and that “both reading ability and the habit of regular reading have greatly declined among college graduates.” (Darnton asks later in the article whether it’s really true “that deep reading has declined.” I guess he has his answer.)

Just because hundreds of thousands of books are published every year doesn’t mean anyone is reading them.

2. We have entered the information age. “Every age is an age of information,” Darnton writes, “each in its own way and according to the media available at the time.” He admits that methods of communication are changing “rapidly,” but denies that the change is “unprecedented.” But does it really matter which adjective applies? What’s important is that more information is available today, and more readily available, than ever before. And that’s a good thing.

Good or bad, the popular technology industry is dedicated almost exclusively to producing a never-ending series of devices that allow us to consume ever greater amounts of information at ever greater speeds. If this isn’t an age characterized by a general, almost obsessive preoccupation with information—whether anyone is doing anything useful with it is no longer the point—then I don’t know what the hell to call it.

3. All information is now available online. I don’t know anyone who thinks that all information is available online, but quite a few people think, equally incorrectly, that any information not found online is superfluous or irrelevant. This is the misconception in need of remedy. But instead he goes on about the absurdity of the comment (that no one seriously makes) being obvious to “anyone who has ever done research in archives,” as if legions of people outside of universities are doing archival research, or any sort of research that involves typing more than a few words into Google.

I understand and appreciate that he wants to make a case for print here—and he knows more than anyone that physical texts endure like their digital equivalents never will—but I’m disappointed that he threw away such a great chance to address the genuine false consciousness at hand. That’s something that can be said about the whole essay, unfortunately.

4. Libraries are obsolete. Because the libraries at Harvard and New York are “crammed with people,” libraries across the land must also be doing just fine, Darnton suspects, despite the fact that many of them have closed or are closing (see here, here and here, for instance). At universities they are converting to student commons (see here and here) that feature cafés and computers but few, if any, books. The thriving library my parents took me to every week as a kid, the library in which I learned to love to read, is now closed three days a week, and operates the rest of the time on reduced hours with reduced staff. Hey, at least it’s still open.

Libraries are not obsolete. They are more important than ever. But, in regions less well-endowed than Cambridge and Manhattan, they are being discarded, shamelessly debased.

5. The future is digital. Darnton knows that we are shifting to a “dominantly digital ecology,” but wants to assure the skeptics that “new technology is reinforcing old modes of communication rather than undermining them.” In fact, it’s doing both. The question to ask is, Do the sellers of these technologies make more money by encouraging and rewarding distractibility than they do by encouraging and rewarding deep reading and archival research?

Things look much brighter from Harvard, clearly. But does the view from Ivy League windows provide a true representation of the world most of us live in? To quote the last words of The Sun Also Rises: “Isn’t it pretty to think so?”

Voig knew that the machines could not really be trusted. The creations were no better than the creators, and indeed resembled them in many of the worst ways. Like men, the machines were frequently subject to something resembling emotional instability. Some became overzealous, others had recurring hallucinations, functional and psychosomatic breakdowns, or even complete catatonic withdrawals. And aside from their own problems, the machines tended to be influenced by the emotional states of their human operators. In fact, the more suggestible machines were nothing more than extensions of their operators’ personalities.

The above excerpt is from an extraordinary 1962 novel by Robert Sheckley called Journey Beyond Tomorrow. The story revolves around a young man named Joenes, who sets out from his insulated Pacific Island home on what becomes a Kafkaesque journey through a future America. Near the end of the book, Joenes, now a befuddled agent of the U.S. government, is flying back to the Octagon (built over the Pentagon, which turned out to be much too small for the bureaucracy within) when his plane is fired upon by an American missile defense system.

Back in Washington D.C., the War Probabilities Calculator spits out a stream of possible causes behind the attack, and the top five are given to General Voig, who “knew that most of his information came to him from extremely expensive machines that sometimes could not tell the difference between a goose and a rocket; machines that required regiments of highly trained men to minister to them, repair them, improve them, and to soothe them in every way.” 

Fifty years later, it’s not just the machines that are overzealous extensions of their creators’ personalities, but the programs designed to run on them. Sheckley’s words are eerily contemporary.

When I was a kid, I had a Fisher-Price Movie Viewer, and I can still remember turning that crank and watching, over and over again, the scene from Bambi where Thumper tries to teach the awkward fawn how to stand up and walk on the frozen pond. There was no sound, and the clip couldn’t have lasted much longer than 30 seconds, but it was lots of fun. I was grateful then to have even a slice of the movie.

More than 30 years later, with the release of the Blu-ray edition of Bambi, Disney is introducing something called Second Screen, an app that allows you to sync your laptop or iPad with your HDTV. While the movie sweeps by on the TV, you can paint pictures on your iPad, put together puzzles, “explore” trivia, etc.

Disney calls it “a revolutionary movie watching experience,” which I find pretty odd, because it’s instead a revolutionary distraction from the movie watching experience. We’ve had special features for almost as long as we’ve had home video, but as far as I know they’ve never run concurrently with the main event (I don’t count actor/director commentary). Has it come to the point that we need a slew of “interactive” pursuits to mitigate the burden of having to sit still (for a whopping 69 minutes, in this case) and give a landmark film our undivided attention?

When kids get excited about reading, it’s a good thing. When they get excited about e-readers, I’m not so sure. According to an article in the New York Times (“E-Readers Catch Younger Eyes and Go in Backpacks”), sales of young-adult e-books are booming, with HarperCollins reporting a 19% increase compared to a year ago.

Are parents buying these young adults Nooks and Kindles because they want to “use the platform… as a way for kids to learn,” as one parent-blogger put it, or because the devices are becoming social markers like smartphones, with “tweens and teenagers clustered in groups and reading their Nooks or Kindles together, wirelessly downloading new titles with the push of a button, studiously comparing the battery life of the devices and accessorizing them with Jonathan Adler and Kate Spade covers in hot pink, tangerine and lime green”?

I suppose it could be a little of both, and the article is quick to point out that we don’t know yet “if younger people who have just picked up e-readers will stick to them in the long run.” My guess is that the kids who developed a love of reading before they got an e-reader will be glued to both print and electronic books. Experiencing the story inside the “covers” is what will matter to them, so the format in which the stories appear will be largely irrelevant.

The kids who showed no interest in books pre-Kindle, and whose parents did not actively promote and nurture such an interest (no, buying a Kindle doesn’t count), will dress up their machines for a few weeks, then toss them under the bed next to the old video games and “learning technologies” their parents hoped would do the job of parents.

I expected that all these young-adult e-books would at least be cheaper than their “dead tree” counterparts—would make sense economically, in other words—but here’s a price comparison I did today on Amazon of all the titles mentioned in the article:

Hush, Hush, by Becca Fitzpatrick:

$9.99 paperback; $8.99 Kindle edition

The Lion, the Witch and the Wardrobe, by C.S. Lewis:

$6.99 paperback; $7.99 Kindle edition; $35.24 paperback 7-book series set (unavailable on Kindle)

Pretty Little Liars, by Sara Shepard:

$8.99 paperback; $8.99 Kindle edition

I Am Number Four, by Pittacus Lore:

$10.79 hardcover; $9.99 Kindle edition

Before I Fall, by Lauren Oliver:

$7.20 hardcover; $9.99 Kindle edition

Clockwork Angel, by Cassandra Clare:

$12.05 hardcover; $9.99 Kindle edition

By my calculations, parents would save a total of $0.07 on the Kindle editions, but of course they would have to buy a Kindle first (priced pretty reasonably now at $139).
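For anyone who wants to check the seven-cent figure, here’s a quick sketch that sums the differences between the print and Kindle prices listed above (the title names and prices are taken straight from the comparison; nothing else is assumed):

```python
# Print vs. Kindle prices, as listed in the comparison above.
prices = {  # title: (print price, Kindle price)
    "Hush, Hush": (9.99, 8.99),
    "The Lion, the Witch and the Wardrobe": (6.99, 7.99),
    "Pretty Little Liars": (8.99, 8.99),
    "I Am Number Four": (10.79, 9.99),
    "Before I Fall": (7.20, 9.99),
    "Clockwork Angel": (12.05, 9.99),
}

# Total saved by choosing the Kindle edition of every title.
savings = sum(print_price - kindle_price
              for print_price, kindle_price in prices.values())

print(f"Total Kindle savings: ${savings:.2f}")  # → $0.07
```

Note that half the titles are actually cheaper (or no cheaper) in print; the Kindle editions only come out ahead by pennies in the aggregate.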