In the neurobiology lab, technology is a requisite. A neurobiology lab is, by definition, pillared by electron microscopes, spectrophotometers, and genetically engineered bacteria that glow, survive, and die on command (or by a well-designed experiment). Science and technology reciprocate each other’s ability to press onward in this noble quest for the holy grail of knowledge. Superbly sophisticated equipment allows novel questions to be asked – questions that perhaps even a decade ago nobody had dared to pose because they did not have the means to answer them. Perhaps the ultimate emblem of technology in modern-day science belongs to the physicists; their Large Hadron Collider on the border of Switzerland and France, which with the turn of a switch measures the trajectories of proton-proton collisions, strives, of course, to out-think the philosophers in deciphering what reality really is.
I have no qualms with all of this.
Yet, technology has spawned, on the more commercialistic side of existence, items meant for a higher-efficiency lifestyle that have gone over the edge of the superfluous. Unlike central heating, vacuums, and radios, which are all technological advancements that justify themselves in some substantial manner by their greater degree of necessity, the Kindle, or the more generic notion of an E-book, is a (relatively) new device on the market that appears to be one of the grosser transgressions of our generation. I will overlook the fact that the Kindle has been most insensitively, forebodingly anointed with a name that the OED simultaneously defines as “to set fire to, set on fire, ignite, light (a flame, fire, or combustible substance).” A paperback book would, yes, fall into the latter category of “combustible substance.” (A sadistic nod to Fahrenheit 451?) While the convenience of an E-book such as the Kindle is clear – carrying a bookshelf that would otherwise cumbersomely weigh hundreds or (for the bookish) thousands of pounds in a bag slung over your shoulder – what it asks for in exchange is an eventual self-induced literary ruin. That is not to say that this is the case now, but we have sown the seeds so that sometime in the far-flung future, second-hand bookstores with all their beautiful musky smells, with all their books blossoming sepia-toned fringes that accompany inscribed marginalia, with pages that hold a story both within the text and within the physicality of being passed from hand to hand through time, finally (but not permanently) to land in your own, might very well become obsolete. Homer, Keats, and Woolf are instead impersonalized on a quietly glowing screen, a screen calculated by algorithms to best fool us into believing it is what it is trying to emulate: the papyrus, the paper, the ink, the textured print. With hard drives in the terabytes now, it is not inconceivable to have all the words in the world compressed onto a couple of data chips. It is phenomenal, really, that such a feat is well within the realm of possibility, but it is disheartening, too. Much as it would be tactless to end a relationship through a text message, the same family of principles seems to apply and make it vulgar to consider digitizing all our literary heroes. The visceral quality of smoothing a book’s pages, the proportional weathering dependent on how much handling it gets, and the satisfying weight of it after it has been read will be what we pay for the convenience that we acquire with their new, weightless bodies.
Gone will be the days of wandering and discovering the unpredictable in libraries and bookstores, letting the unread waft through you, luring you to pull a spine from the shelf, dust off its jacket, and earn it a spot in your home. Instead, we will be targeted with “based on your purchase history” recommendations, although perhaps one day a program that simulates the random encounter with a new book will finally be accurately coded.
Granted, I am looking toward and predicting a future that is not ten, twenty, or fifty years from now, but hundreds of years off (unless I am grossly underestimating the decline of our humanity). I desperately hope this will never be the case, but I must say that I am glad I won’t be here to see how it plays out.
Sue majors in Neuroscience & English and tends to lurk in bookstores.