We all know the cycle: If you want to get clicks on the Internet, take something people love and tell them they’re doing it wrong. It’ll prompt reads from people worried they’re doing it wrong, shares from early adopters who agree with you, hate-reads from people who disagree, and maybe a counter-take or two from people who were so flummoxed by your argument they had to publish a reply, like this one.
Such is the case with “Why you should quit reading paper books,” a perfectly titled troll from Andy Sparks, a startup cofounder and Medium blogger. “I believe everyone should quit reading print books almost entirely,” he opines in the opening paragraph of the trending piece. “In 2013, I broke up with print and decided to get engaged with my memory instead,” he adds, before detailing his workflow: read a book on the Kindle, highlight favorite passages, upload said quotes to Evernote for later retrieval. It’s a personal database, a way of stockpiling “passages that would have become lost memories,” he says. Though he, too, prefers the experience of reading a hard copy, he “can’t stand the feeling of wanting to go back to a passage and not being able to find it.”
There’s a saccharine sweetness to the sentiment, but beneath the Silicon Valley exceptionalism, the post not only swings and misses on what science says about how memory actually works; it also misreads what our literary tradition says reading is for, and where the book market is actually headed.
While the Kindle sold out on its debut a decade ago, the e-reader market has since cooled off, and sales of its (former) rival, the Barnes and Noble NOOK, have cratered. Despite an attention-sucking election cycle and a lack of breakout fiction hits, sales of American print books were up 3.3 percent in 2016, the third consecutive year of growth, while e-book sales were down an estimated 20 percent. In the UK, retail book sales were up 7 percent and e-book sales down 4 percent. Pew was already reporting declines in e-reader ownership in 2015, which some attribute to a loss of cool factor. “It was new and exciting,” literary agent Cathryn Summerhayes said of the Kindle in an interview. “But now they look so clunky and unhip.”
And forgive me for splitting epistemological hairs, but saving quotes to Evernote is not memory; it’s creating an archive. You are not on your way to knowing these lines by heart, as poetry lovers have done for centuries and as human culture was built on; you are filing them away with a click. It’s another example of how devices do not so much aid thinking as replace it: neuroscientists who study the intersection of computers and cognition tell Thrive that we outsource mental processes to hardware. The thing about human brains is that, primed by evolution, they tend to remember only the things they’ve been taught through experience to be important. If you only use GPS to navigate a place, you’ll learn your way around less quickly; rather than mental maps, it’s Google Maps. Jason Chein, who runs Temple University’s Neurocognition Lab, told me that if I’ve just moved to a new neighborhood, I should get around by way of landmarks, so that spatial memory is lodged in my noggin rather than my smartphone.
A 2011 paper with the foreboding title “Google Effects on Memory: Cognitive Consequences of Having Information at Our Fingertips” is also useful here. In that study, researchers found that when people are asked hard questions, they think more about where they could fetch the answer than about the answer itself. It’s not memory Sparks is humblebragging about; it’s information storage. If you’re forever attached to your devices, that might not seem like much of a difference (and the phone as extension of self is most certainly a thing), but if you want to think on your feet, it’s best not to be craning your neck down.
Books, unlike the “content” on a screen, are objects in the world. Their inhabiting three dimensions helps with memory, too: researchers since the 1970s have observed that memory is “visuospatial,” or contingent on depth perception and the arrangement of objects in your field of view (like how you can still recall how a treasured passage in a favorite novel looked lying there on the page the first time you fell in love with it). More recently, research has found that proofreaders catch more errors in print, and that the shifting heft of a book as you progress through it serves as a weight-based anchor, too. There is also much less distraction with a book: kids’ reading comprehension in paired reading is much better with the real thing, since their parents are more likely to talk about the story than about the device they’re reading it on. High schoolers showed better comprehension reading print in a 2013 study, a difference the authors credited partly to the spatial qualities of what they were reading. The students in the computer condition could view only one page at a time, while the print readers could see and feel the physical dimensions of the text. That allowed them to build mental maps of the reading, not unlike the intrepid traveler who navigates a new neighborhood sans GPS, and gain better comprehension for it. (If you want the brain to learn, you have to make it work.) Books are also notably free of screen fatigue, something you really appreciate if you spend fifty-plus hours a week typing at a keyboard, then dabble in some Netflix in your off hours.
There are, of course, individual differences and particular use cases that lend themselves to e-readership. A fellow media bro told me that he loves his Kindle because it allows him to read at night without annoying his slumbering girlfriend; the compact form factor is nice when you’re being squished into the subway on the way to work, ditto for all the space and weight you save lugging it around in your carry-on. It’s a supplemental, rather than primary, device, he explained to me over beer and cheese balls. The flexibility of a screen is great, too: older readers tend to dig e-readers because the contrast and backlighting aid legibility. College students whose course loads include e-reader-friendly classes are into them, and anything that helps students get less ripped off on textbooks has to be a net positive, too.
The biggest vote in print’s favor might come courtesy of the American philosopher and educator Mortimer Adler, in his How to Read a Book: The Classic Guide to Intelligent Reading, a book that lives up to its title. Adler distinguishes between three different ends for reading: entertainment, information, and understanding. The meanings of the first two are pretty immediate: you read for fun (Stephenie Meyer) or to learn lots of facts (Malcolm Gladwell). The third is on a higher plane, one that the good professor encourages us to pursue: not only to be able to regurgitate quotes, or look them up in an app, but to gain wisdom, to do the sort of learning that makes you, in a real way, a better person.
Adler challenges you (and certainly challenged me) to read the kinds of books, one might call them great, that stretch you, that move you from a state of understanding less about the world to understanding more, whether that’s a Dante or a Kahneman. We go through the effort of reading these books “for the light they throw upon human life and human society, past, present, and future,” he declares. The way you gain access to the wisdom inside them is by doing the work, and this, to me, is where the book thoroughly outclasses more modern technologies: you can highlight and underline, draw brackets and make margin notes (the second-hand ones, of course, being one of the reading life’s great joys). You can also, as Adler and Brain Pickings blogger Maria Popova encourage, make your own index of key ideas on an empty page in the book’s front matter, giving you a readily referenceable resource while also drilling those insights deeper into your memory. In all its analog glory, a book allows you to chew on the text in three dimensions; could there be a better way of making sure it’s digested? It’s not time for breaking up with print. In 2017, we’re renewing our vows.