Despite its birth in slavery and genocide, there was always some hope that the United States would one day live up to the aspirations of its founders rather than continue to turn a blind eye to the evils inherent in some of their political compromises. Even when we were certain we wouldn’t live to see that day, we had reasons not to feel like fools when looking forward to its eventual arrival.
Today, as Trump and his sycophants begin gleefully making plans to burn our books and chisel our names off the nation’s monuments, we should take a moment to remind ourselves what will inevitably become of them once their political orgasm has spent itself. Winning won’t magically make them any less ignorant, any more capable of coping with anything more complex than their own appetites for self-aggrandizement. While they’re busy gloating, grifting, and genuflecting to their preposterous version of the Christian god, the Chinese or Russians may very well show up to eat the lunch so cluelessly laid out for them, or, even more likely, climate change may finally turn Arizona into hell with the fire out, and blow down or drown everything in Florida from Mar-a-Lago to Tallahassee.
In the meantime, we should mind how we go. Remember that they don’t own the future—many of their children will come to hate them soon enough, especially their daughters. Ignore the taunts, save the bullied wherever and whenever we can, and never, ever forego an opportunity to pour a cup of virtual sugar into a coal-roller’s gas tank.
Part Two has been in progress for a painfully long time, but the future is proving to be an even more elusive beast than I thought when I first began this somewhat speculative apologia….
In its issue of April 4, 1994, the New Yorker published an article by Nicholson Baker, Discards, which called into serious question what he was convinced was an unwise rush by libraries to replace traditional card catalogs with a computer-based approach to information access and retrieval. Baker’s article was widely read in academic library circles, not only because it was critical of the work librarians were doing, but also because it had appeared in the New Yorker. Specialists working in fields as obscure as technical librarianship aren’t normally accustomed to reading such critical assessments of their work from outside the profession, especially when those assessments turn out to be as accurate in their details, and as forthright in their judgments as Discards was about our bibliographical stewardship.
When I first took up Baker’s article that April, I admit I found myself wishing that it hadn’t been quite so accurate, or quite so forthright, and for good reason. At the time, I was employed in the Cataloging Department of the University of California, Santa Barbara Library, where for the preceding ten years I’d been supervising the work of something called the Catalog Maintenance and Retrospective Conversion Section.
Catalog maintenance, the section’s traditional responsibility, meant managing the card catalog—adding new cards to the drawers as new books arrived and were cataloged, correcting errors in the existing cards, and updating them to reflect revised entries whenever the Library of Congress made changes to its subject thesaurus, or altered its preferred form of an author’s or editor’s name.
The retrospective conversion assignment had been added in the mid-1980s, at the point when all current cataloging was finally being done on our new computer system. To complete the transition from a card-based catalog to a computer-based one, we were tasked with entering the older, manually produced contents of the card catalog into our digital cataloging database. Once that nearly decade-long task was complete, we discarded both the cards—some 10 million of them—and the cabinets which had housed them, and replaced them with public access computer terminals. Access to the new digital catalog could then be provided first on dedicated terminals within any library in the University of California system, and then, after a surprisingly short interval, to anyone anywhere in the world who had UC library privileges, an Internet connection, and a Web browser.
That was what retrospective conversion meant and what it did, and that was precisely the activity, carried out by my section from the mid-Eighties to the early Nineties, and repeated in libraries all over the world, which had given rise to Baker’s article. He thought that what we were doing was not only short-sighted, but barbarous—an offense against civilization itself—and was saying so in no uncertain terms.
Despite his obviously careful research, his passionate indictment of our project—in the pages of the New Yorker, no less—struck me at the time as being perverse, perverse in the sense that he seemed much more determined to condemn us for what he believed we were destroying than to evaluate what we believed we were creating, or our success in creating it. In Baker’s view, it seemed, far from being the responsible stewards of the world’s intellectual patrimony we’d imagined ourselves to be, we were in fact vandals, the moral equivalent of those universally despised vandals who’d once set fire to the Library of Alexandria. Was this a fair judgment? I certainly didn’t think so, but just as I was deciding that the situation was unprecedented enough, and the outcome uncertain enough, not to take his judgment personally, on the very last page of Discards, I found this:
(U.C.S.B., incidentally, finished throwing out its main catalogue late last summer.)
Incidentally. Certainly not a word that I’d have chosen. UCSB was my library, throwing out its catalog was my job, and I could have told him, had he asked me, that there was nothing incidental about it. Arguments about intent, though, were apparently beside the point. What concerned Baker was not intent, but consequences, consequences which he was far more certain about than we were. I put down my copy of the New Yorker and recalled the end of Dr. Frankenstein’s career. Was I really a vandal? Would there be a mob of concerned citizens with pitchforks and torches waiting for me in the library parking lot after work?
Given that Twitter and Facebook didn’t exist in 1994, I really didn’t have anything to worry about. An indictment and trial of supposedly philistine librarians in the court of a public opinion generally indifferent to abstract policy squabbles was highly unlikely. Yet if Baker’s attempt at framing a public policy indictment of our work seemed perverse to me, his instinct that some sort of public policy questioning should be taking place was valid enough to be taken seriously. Indifferent or not, the public was clearly going to be affected by the technologies of the coming digital age, not just affected, but shaped by them. The disappearance of card catalogs from their libraries was, if anything, merely the thin edge of the coming wedge.
Thirty years later, deep into the age of Amazon, Google, and Wikipedia, of LLMs, ChatGPT, Simon and Bard, it’s hard to recall precisely what form my testimony might have taken in the event that Baker and the New Yorker had actually succeeded in putting us all on trial. All I can remember now with any precision is my certainty that printed books were already becoming an anachronism, that libraries were already in the process of becoming museums of the printed word, and that librarians would have little future except as their curators. All of this, I was convinced, would happen sooner than even Nicholson Baker feared, and would turn out in the end to be even more radically disruptive than many of my colleagues, committed as they imagined themselves to be to our digital future, could bring themselves to admit.
I haven’t spent more than an hour or so in the UCSB library—nor any other library—since I retired in late 2003, nor have I kept up with library journals, or the professional literature in general. As a consequence, I have only the vaguest of notions what, if anything, has changed in the intervening twenty years in the mission of libraries and librarians as viewed by librarians themselves. I can’t imagine that they still think of themselves as principal actors in the digital transformation of information storage and access, but I do hope that they’ve remained principled stewards of the triumphs of the past, and skeptical about some of the more outrageous claims made by those who are now in charge of the digital transformations of the 21st century. In any event, what happens now in libraries is no longer mine to judge. If there’s a problem, I’m willing to concede that I had a part in creating it. If there’s to be a solution, I’m well aware that I won’t have any part in devising it.
Ten years ago I was privileged to witness the emergence of a dragonfly from its nymph form. A creature that at first glance had seemed like a beetle to my untrained eye had crawled laboriously up from my back yard, attached itself to the side of the concrete step leading to the sliding glass door to my bedroom, and remained there, unmoving, tempting me to believe it had died.
I don’t remember exactly how long it remained there, but it was long enough for me to pass it a number of times on my way back and forth to the alley behind my house. Then, the last time I started to pass it, I discovered to my surprise that the back of its carapace had split open, and the dragonfly it had become was perched atop the empty shell that had sheltered it, had been it, and was now unfolding its wings in the afternoon sun. I watched until the wings, now fully dried, suddenly began to vibrate, then powered a beautiful, iridescent liftoff and arrow-like disappearance into the distance, a movement almost too fast to follow with the naked eye.
Being a child of the 20th/21st centuries, my first thought at witnessing this astonishing sequence of events was that any sufficiently advanced technology will be indistinguishable from magic, my second that there are more things in heaven and earth, Horatio, than are dreamt of in your philosophy. Being a child of the 20th/21st centuries, I didn’t find those two thoughts to be incompatible. Now, ten years later, I still don’t.
For an old Mac guy, John Gruber, bless his heart, has always done his damnedest to be fair in his judgments about tech. After several days of watching some of my favorite tech columnists lift their legs on iPads in general, and the new iPads in particular, reading his review of Apple’s M4 iPad Pro pretty much made me jump for joy.
I’m typing this on my new M4 iPad Pro with a nano-textured screen, and I don’t care what anybody says—the little girl in Apple’s “What’s a computer?” ad of 2017 got it, and John Gruber, prince of the grumpy old Mac diehards that he is, also gets it. He’s made my day….
Full disclosure: I’m 30 years older than John, and far grumpier, but the iPad still has the power to make me want to live another hundred years. That little girl—and John—speak to me, and for me, and I suspect I’m not alone.
Apple is certainly guilty of at least some of the transgressions it’s been accused of by Margrethe Vestager, the principal finger-wagger of the European Commission. Arrogant corporate behemoths are a tax on the general welfare, right enough, but so also are vengeful bureaucrats whose principal complaint seems to be that Americans got to the future before the French and Germans had a chance to certify it.
There are lots of smart people on both sides of this unfortunate culture clash, so I suppose it’s possible that some sort of quasi-equitable justice will eventually be done, but I’m not optimistic. I mean, c’mon people, really—does anyone at this late date actually want a cell phone designed by the European Commission?
Our grandchildren aren’t stupid. Their mental equipment isn’t inferior to ours. They just live in a different world, one which no longer belongs to us even though we helped create it. It’s theirs now, and whatever we imagine, we’re no longer in any position to judge them. Likely they’ll be fine, but if they turn out not to be fine, it’s going to be very hard to show how listening to us would have made the slightest bit of difference.
In the U.S., the Republicans’ sad entourage of the desperate, demented, and enraged are tearing at the Constitution’s exposed Achilles tendons. In Russia the gangsters of Prigozhin are battling the siloviki of Putin for control of the spoils of a twice-failed totalitarian state. In India, Hindutva pursues a scorched-earth battle against Islam. In Germany the AfD tidal wave has engulfed the SPD, broken the CDU, and arrived at last in Bavaria, intent on washing away once and for all what little is still left of the CSU’s liberal democratic pretensions.
In Italy a fascist consumerism has sprung full-grown from the brow of Meloni. The trains now run on time, and foreign investors are once again reassured. In Finland and Sweden, the local populists have decided that white people are the only real people after all. In Israel, Syria, Hungary, Belarus, and Turkey, the warlord grifters have outlasted everyone. In Saudi Arabia and the gulf states, the kings, emirs, sultans and satraps of one kind or another are now completely convinced that having more money than Allah the Merciful means not having to apologize to anyone ever.
In Iran, a cabal of wizened religious fanatics calling themselves the Islamic Republic have yet to see any reason to deny themselves the perverse pleasure of beating and imprisoning women at random, and of shooting their own children whenever the kids act like they might be the coming thing. In China, a suspiciously but undeniably prosperous Communist (sic) Party oligarchy has decided that the Universal Declaration of Human Rights is nothing more than a confession of the failures and arrogance of so-called Western civilization.
I don’t think Web 3.0 is going to be a lot of help in preserving what’s left of the secular humanism that evolved over four centuries in Europe, and was sealed with the French and American revolutions. Neither will eleven aircraft carrier battle groups or a triad of delivery systems for nuclear weapons, given that both have long been controlled by people who bear no allegiance whatsoever to secular humanism either as a creed or a philosophy of government.
I’m not one to cry O tempora, o mores! every time someone in Washington does something stupid, but I do think that if the last sad road show of the Enlightenment comes to your town, you should go and listen to what was once promised us, and what very shortly we’ll all be missing.
The concern expressed in The Center for AI Safety’s Statement of AI Risk seems justified to me, but it also seems to me that many of the signatories have still not grasped the real nature of that risk. It’s the second-order effects that’ll do us in—not the singularity and its presumptively implacable AI overlords, but rather the symbiotic processes already inherent in pervasive computing, processes we can all sense, even as we remain in denial about what it will take, in terms of an evolution in human consciousness, to successfully navigate the spaces which still exist between where the machine ends and we begin.
In his 1960 Critique de la Raison Dialectique, Jean-Paul Sartre indulged himself in a typically poetic digression about how we can’t tell—may never be able to tell—whether we’re dreaming the machine, or the machine is dreaming us. This is a commonplace now, but although it wasn’t entirely new in 1960, it was still controversial enough to meet with widespread ridicule among the opinion makers of the day. And of course Sartre was describing the strictly physical interactions of humans and industrial age assembly lines, when machines were dumb, and humans were still thought to be the masters no matter how deeply their own mental processes were conditioned by the mechanical repetitions of their jobs.
The machines today are no longer dumb, and we can no longer afford the illusion that we are the masters of either the physical or the mental aspects of the machine/human symbioses of the 21st century. I’m not sure why, but I’m not as bothered by this as the signatories of this letter are telling me I ought to be. It certainly isn’t because I’m an optimist in the narrow sense of the term. I expect great darknesses in our future, but not the ones that are supposedly keeping the tech bros up at night. These latter-day idiot savants aren’t the real heralds of our new distempered age; the real heralds are the kids now glued to TikTok all day. What their stewardship of our future will look like remains beyond anyone’s current power to predict. To make a long story short, it’s not the end of humans that should concern us, but the end of humanism, which seems to be losing its grip on the tiller of this ship of fools we’re crewing well before a new helmsman is ready to take its place.