Chad Wellmon has a smart essay in The Hedgehog Review arguing that “Google Isn’t Making Us Stupid…or Smart.”

Our compulsive talk about information overload can isolate and abstract digital technology from society, human persons, and our broader culture. We have become distracted by all the data and inarticulate about our digital technologies…. [A]sking whether Google makes us stupid, as some cultural critics recently have, is the wrong question. It assumes sharp distinctions between humans and technology that are no longer, if they ever were, tenable…. [T]he history of information overload is instructive less for what it teaches us about the quantity of information than what it teaches us about how the technologies that we design to engage the world come in turn to shape us. 

It’s something you should definitely read, but it also reminded me of a section of my book that I lovingly crafted but ultimately edited out, and indeed had pretty much forgotten about until tonight. It describes the optimistic and pessimistic evaluations of the impact of information technology on– well, everything– as a prelude to my own bold declaration that I was going to go in a different direction. (It’s something I wrote early in the project.)

I liked what I wrote, but ultimately I decided that the introduction was too long; more fundamentally, I was really building on the work of everyone I was talking about, not trying to challenge them. (I don’t like getting into unnecessary arguments, and I find you never get into trouble being generous in giving credit to people who’ve written before you.) Still, the section is worth sharing.

Since their earliest days, personal computers and Internet connections have been bundled with what I’ve come to think of as Digital Panglossianism, after the famously optimistic Dr. Pangloss in Voltaire’s “Candide.” “Those who have asserted all is well talk nonsense,” he told his student, Candide; “they ought to have said that all is for the best.” Having access to information simply makes people smarter. Having unlimited quantities of information means that you can learn and know anything. The intellectual demands of today’s video games, and their social character, help make us smarter, more collaborative, and just better people, more likely to devote our “cognitive surplus” to worthy world-changing causes.

Contrast this vision with that of the Digital Cassandras. In Greek myth, the god Apollo blessed Cassandra with the ability to see the future, but cursed her so that her prophecies would never be believed. Given that she foresaw the destruction of her home and royal line (she was a daughter of the Trojan king Priam), and knew that her warnings would never be taken seriously, Cassandra tended to be a bit of a pessimist.

Today’s digital Cassandras warn that our technologies are creating problems that we could avoid– but all too often choose to ignore. Online spaces that are designed to foster a sense of community, or connect users with friends, end up “offering the illusion of companionship without the demands of friendship,” as Sherry Turkle puts it. Further, as Jaron Lanier says, our infatuation with collective intelligence leads to a kind of “digital Maoism.” Treating humans as the raw material of metacognition leads us to “gradually degrade the ways in which each of us exists as an individual,” he argues, and to underestimate “the intrinsic value of an individual’s unique internal experience and creativity.” It impoverishes both inner life and public life: the Internet’s collective intelligence hasn’t solved many of the world’s problems, but it has watched a lot of cat videos. And finally, of course, Google is making us stupid.

Digital Panglosses and Cassandras disagree about a lot, but they share one common assumption: that the Internet and ubiquitous computing constitute something new. Sven Birkerts talks about a “total metamorphosis” brought about by the Web, of literary practice confronting a “critical mass” of technological and social changes. Pierre Levy calls the Web “civilization’s new horizon” and declares that “we are moving from one humanity to another.” In other words, while they disagree about whether this is to be applauded or feared, they agree that we’re living through an unprecedented period in history.

They share another, even more important, common assumption. Whether they see information technologies as having an ultimately positive or negative effect, both sides write as if their impacts are inevitable and unavoidable. Forgetting how to read novels will impoverish your inner life, or make you a playful postmodern trickster. Games are breeding grounds for sociopaths, or training academies for tomorrow’s leaders. Social software will usher in a new age of connectivity and community, or make us more isolated and alone than ever. Either way, good or bad, technologies are going to change you, and there’s not a lot you can do about it.

Neither side means to promote a sense of passivity and inevitability, but they do. So, I think, do historians of media and technology who try to navigate between the optimistic and pessimistic views of Pangloss and Cassandra. The Web, they argue, is but the latest information technology to change our brains. The invention of writing– particularly the invention of the Greek alphabet, the first that could accurately reproduce the full range of a language’s sounds– profoundly altered the way we thought. The printing press set information free five hundred years before the Internet, while the newspaper was the first near real-time medium, and a critical foundation for the growth of “imagined communities.” The radio, telephone, and television were transforming the world into a “global village” in the 1960s, according to Marshall McLuhan.

Not only have we been living through information revolutions for as long as history can remember; we’ve lamented the changes, too. Socrates distrusted the new medium of writing. In 1477, the Venetian humanist Hieronimo Squarciafico complained in his “Memory and Books” that “Abundance of books makes men less studious; it destroys memory and enfeebles the mind by relieving it of too much work.” A hundred fifty years ago, the telegraph was a “Victorian Internet,” talked about in the same apocalyptic and prophetic tones used to describe the Web today.

All this is true, but I think it suffers from two problems. First of all, it’s easy to read these histories and conclude that new information technologies have always created problems, and that we’ve always survived. Eventually, we make peace with the new, or adapt, or forget that things ever were different. Ironically, while many historians of technology see themselves as critics of technological determinism, they end up making the changes we’re experiencing seem inevitable and irresistible. History also tends to be cruel to its Cassandras. Socrates’ criticism of writing looks pretty stupid given Greek civilization’s incredible legacy of philosophy, theatre, science, and literature– all of which would have been lost, and arguably would never have been created, without writing. Few people remember Thomas Watson’s brilliant leadership of IBM; instead, they remember his (perhaps incorrectly attributed) offhand remark that he couldn’t imagine the world ever needing more than five computers.

Second, for all their thoroughness and careful scholarship, these histories offer an incomplete view of the past. It’s certainly true that human history is full of technological innovation, and some brilliant scholarship has shown how new information technologies– some as simple as the use of spaces between words– have changed the way we write, argue, and think. But it’s wrong to assume that only history’s losers have resisted the changes, or that we should take as a lesson that today’s worries about the decline of focus and attention– not to mention our own personal sense of dismay at being more distractible and having a harder time concentrating– are mere historical epiphenomena. In fact, there’s a long history of creating systems and institutions that cultivate individual intellectual ability, support concentration, and preserve attention. 

Some of humanity’s most impressive institutions and spaces are devoted to cultivating memory and contemplation: think of the monastery, the cathedral, or the university, all of which can be seen as vast machines for supporting and amplifying concentration. Or, at a much more personal level, the coffee-house, personal office, and scholar’s study all, to different degrees, support contemplation and deep thought. Contemplative practices themselves have a long history. As historians of religion have noted, Buddhism emerged in a South Asia that was newly urbanized, quickening economically, developing into a global center of trade, and on the verge of supporting large states. Christian monasteries first developed as an alternative to urban living; every culture has its holy men, shamans, or witches, but only complex societies need (and can support) monastic institutions with large groups working, praying, and studying together. Early universities straddled a physical and psychic line between drawing on the capital and infrastructure of cities (Paris, Oxford, Bologna, and Cambridge were all medieval centers of trade) and providing an intellectual respite from the bustle of trade and the machinations of politics.

In other words, alongside the history of technological change, cognitive shifts, and increasing distraction, there is a history of innovation in contemplation. Efforts to create spaces and practices that help people concentrate and be creative are as old as writing and print. This knowledge can help us today. There are specific things we can take from this history in crafting our own responses to today’s digitally driven pressures: as we’ll see later in the book, there are design principles that we can borrow from contemplative objects and spaces, and apply to information technologies and interactions.

Just as important, this history offers hope. It reminds us that for millennia, people haven’t just complained about or resisted the psychological and social challenges presented by new media and information technologies, nor have they passively allowed change to happen. They’ve developed ways of dealing with them. And some of those practices— like varieties of Buddhist meditation— possess a sophistication and potency that modern science is only now starting to fully appreciate.