21st Century Institutions: the U.S. Presidency

Donald Trump is not the first U.S. President to be elected in the 21st Century, and the odds are he’s not going to be the last.* He does seem likely, however, to go down in history as the century’s most characteristic one, at least as things look now, twenty years into the accursed opening maneuvers of its history-making engines.

President Trump himself is undeniably one of those engines, a steam-powered anomaly in an era increasingly lit up at night by the output from solar panels, wind turbines, and lithium ion batteries. It would probably be easier, and it would certainly be more pleasant, to write about the American Presidency without mentioning the Donald, but since he does seem to represent some sort of numinous final stage in the rot that’s been eating away at the office since 1945, there’s no credible way to avoid dealing with him in all his radiantly decadent glory.

Back in 2011, in snarking at the dozen or so GOP presidential candidates of the time, I called Newt Gingrich the Dorian Gray of the Republican Party. By the middle of 2016, as candidate Trump’s arsenal of creepy facial expressions began its final assault on our international media landscape, I realized that Newt had been a mere pretender. Not even Oscar Wilde himself could have imagined the world we were now living in, a world in which the real-life equivalent of his fictional character actually preferred having the evolving portrait of his depravity visible to everyone and his dog.

When January 20, 2017 finally did arrive, it was even weirder than usual for a Presidential Inauguration Day. Most of the political class and its media pilot fish were still hung over from the excesses of the election in November. To their momentarily everlasting astonishment, it seemed, Trump had actually managed somehow to get himself elected President, and was standing there now, live in front of the assembled cameras, taking the oath of office. Oh. My. God. The nuclear football in the possession of a sociopathic, blowhard hotel developer! Sackcloth and ashes! Baskets of deplorables! Facebook and Russians! Blah, blah, blah.

Trump himself was soon to be busy elsewhere in Washington. Once he’d gotten the rug swapped out in the Oval Office, had more Trump-suitable golden drapes hung above its windows, and settled his very stable genius behind the Resolute desk, he got down at last to the real work: redecorating the American political landscape with a stunning array of bagmen, bootlickers, generals and ex-generals, racist Dixie irredentists, religious fanatics, voodoo economists, firearms fetishists, Fox News ressentimentistes, rust belt coal rollers, libertarians looking for a hill to die on, and his own children. In 2020, I continue to wonder: is there really anyone left in the United States who still believes that this was all some sort of diabolical accident?

No, there isn’t. And no, it definitely wasn’t. Most of the electorate understands very well that this train wreck of an election was no accident, whether they voted for Donald Trump or not. And yet, amazing as it is to contemplate, our luck has held once again. Despite the best efforts of Trump and his merry band of magatrumpistas, the United States seems unlikely to become a failed state during his reign, no matter how diligent its political class is at helping him carve one out of the complex patrimony of the U.S. Constitution.

If we can somehow manage to ignore all the present din and idiocy, what is undeniable about the history which has led us to Trump in the White House is that already by the latter half of the 1970s, the international economic order set up by the western allies following the defeat of Nazi Germany and Imperial Japan was becoming alarmingly unstable. Contrary to the arrogant predictions of our so-called foreign policy experts, the economic restoration of our defeated enemies had not, in fact, bound them in perpetuity to political alliances dominated by the United States. China was not, in fact, going to be permanently denied the economic and political deference due it as a society which embraced more than 18% of the world’s population. Even the lesser nations of the world would not, in fact, continue to fear being denied a place at the trough of American largesse, especially as there came to be less and less in it for them.

The election of Donald Trump is far more, I think, than a macabre trick that the rubes in the MAGA hats have played on themselves. It’s also the clearest demonstration we’re ever likely to get that prominent members of the American political class are not as savvy as they make themselves out to be. Simply put, they’ve failed to prepare the American people for the historical metamorphosis which has brought the postwar Pax Americana to an end. Even more simply put, bearing humiliating witness to a Trump in the White House is the price they’re paying for that failure. Whether they realize it or not, the longer they keep propping up the status quo, the higher that price will be. After four years of the Trump administration, let alone eight years, I’m pretty sure the vig alone will wind up bankrupting them.

While it may be true that even in the hands of a Donald Trump the U.S. Presidency remains as remarkable an institution as it ever has been, it’s definitely true that it’s never been a less transparent one, especially with respect to the exact nature of its formal and informal powers. To give just one example, there are 17 agencies in the so-called United States Intelligence Community, which, according to the latest figures available to the public, have granted top secret security clearances to a total of over 900,000 people. Meanwhile, President Trump is reported to have restricted his daily security briefing to two pages, while supposedly watching four hours of Fox News a day. What reason is there for anyone to believe that he’s actually in charge of what is going on in these agencies? Who can predict the impact that such calculated ignorance will have on our national security, or our foreign policy in general? Certainly someone is in charge — many someones more likely — but I doubt that any of them are significantly more accountable to the President, at least on a day-to-day basis, than they are to the public at large.

Admittedly it’s hard enough to run an effective federal administration with a full complement of politically competent policy experts, and a chain of responsibility that extends to the lowest level of the executive branch. You certainly can’t run one effectively with a staff consisting of your daughter, your son-in-law, Sean Hannity, Sheldon Adelson, a rota of retired generals, lobbyists and golf partners, and an assortment of idiot yes-people whose only significant achievement is the byzantine complexity of their self-abasement. The country is too large, its political and economic infrastructure too complex, to be managed entirely from the top down; the responsibilities of the Federal Government are too extensive, and its interlocking bureaucracies too encrusted with decades of turf wars, interagency rivalries, and deviant ideological agendas to respond competently to even the most intelligently conceived policy directives.

The bottom line, I’m afraid, is that the U.S. Constitution is showing its age, and so is the U.S. Presidency. I think it’s significant that both President Trump and three of his Democratic Party challengers for the office in 2020 are over 70 years old. I remember when we used to laugh at the infirmities of the gerontocracy under Brezhnev in the Soviet Union, and Mao in China. These days, it looks as though the laugh is on us.

*YMMV. I have to say, though, that if I were a bookie, I’d be reluctant to offer the current Vegas line on that bet, especially if I had to lay off any significant amount of it. That’s the kind of move that might just wind up getting you your legs broken — or worse — even when most of your ordinary day-to-day tormentors would be running around shrieking and waving their hands, looking for a window to jump out of.

The Laws of Physics, the Limits of Desire

A couple of years ago, a friend of mine who’s relied on me off and on for thirty years for tech advice, came to me with a complaint that his iPhone 6 — then barely two years old, out of warranty, but a month or so short of being off contract — was shutting down at random, even though it still seemed to have plenty of charge left in the battery. We rounded up all the usual suspects to no avail, then hauled ourselves off to the nearest Apple store, where the genius at the genius bar, assisted by Apple’s own diagnostic tools, rounded up all of her usual suspects, and concluded that there didn’t seem to be anything wrong with his phone — except, of course, that the random shutdowns made it at best unreliable, and at worst, unusable. At my suggestion, my friend paid off his old contract, and replaced his suddenly unfaithful companion with the then current model, an iPhone 7. Two months later, I read that Apple was replacing batteries for free in out-of-warranty iPhone 6’s exhibiting random shutdowns, with no questions asked, and, as usual for Apple, no explanations given. A year later, again with no explanations given, Apple began throttling iPhones with aging batteries which could no longer supply the necessary voltages under peak load conditions.
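The throttling Apple eventually introduced follows a straightforward physical logic: a degraded lithium-ion cell sags in voltage under peak current draw, and capping the processor's clock reduces that draw, trading speed for the avoidance of a brownout shutdown. Here's a minimal sketch of that kind of policy; every name and threshold below is invented for illustration, and nothing in it reflects Apple's actual implementation:

```python
def max_cpu_frequency(battery_health: float,
                      measured_voltage: float,
                      nominal_frequency_mhz: int = 2400,
                      min_safe_voltage: float = 3.0) -> int:
    """Return a CPU frequency cap in MHz for a hypothetical power manager.

    battery_health:   0.0-1.0, fraction of original capacity remaining.
    measured_voltage: volts observed from the cell under load.
    """
    if measured_voltage < min_safe_voltage or battery_health < 0.8:
        # A worn cell sags under peak current; lowering the clock lowers
        # peak draw, so the phone slows down instead of shutting off.
        scale = max(battery_health, measured_voltage / 4.2)
        return int(nominal_frequency_mhz * min(scale, 1.0))
    return nominal_frequency_mhz
```

With a healthy battery the cap stays at the nominal clock; as capacity or voltage under load declines, the cap declines with it, which is roughly the behavior users observed and benchmarked.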

It’s been painful to watch Apple get Twittered, Facebooked, and ultimately sued over this whole affair. The fact is that those of us who were aware of the limitations of lithium-ion battery technology, Apple engineering executives above all, should have seen this coming. The very things that make the iPhone magical — its pocketable size, its ever-increasing computing power, and its appliance-like simplicity and ease of use — are also, to a greater degree than Apple marketers would have us believe, based on an illusion.

The sad truth is that Apple had painted itself into a corner bounded on the one side by an understandable, if misplaced confidence in its own hardware and software innovations, and on the other by a misguided attempt to protect the technological innocence of its customers from the consequences of their own addictions.

Marketing has its own imperatives, and as any marketing expert worthy of the name would probably concede, a certain blindness to the long-term consequences of its own cleverness has never been much of an impediment to its operating budget, or to its status in the corporate hierarchy. Until, of course, the shit hits the fan. Then the public dance of recriminations is performed, and everyone concerned goes back to business as usual. Except for the hapless consumer, who’s inevitably forced to grumble, sigh, roll his or her eyes, then pay whatever the going rate is to get back on the road.

Apple could have done a lot better a lot quicker. Its customers love what it promises, even those of us among them who know to what extent the promise exceeds the current limits of technology. Progress requires us to dream forward, and to accept that sometimes along the way our reach will exceed our grasp. That said, a little more transparency from those at the pointy end would be welcome. Infantilizing the consumer as a path to marketing success has all sorts of support from the countless schools of social science pilot fish who’ve attached themselves to corporate C-suites in the postwar decades. God forbid that I should deny what their statistics are telling them about our human vulnerabilities. I would ask them, though, to consider how they feel when their own strings are pulled.

If We Can Somehow Bring Ourselves To Take the Long View, We Probably Should….

Revised from a recent comment of mine on this Crooked Timber thread:

A sort of Marxist point about our present distempers: the conditions of existence have changed, probably irrevocably, for the Scots-Irish coal miners of West Virginia, the libertarian ranchers of the West, and the industrial workers of Ohio and Pennsylvania, and they’re not happy about it. Should Tim Cook, Mark Zuckerberg, Jeff Bezos, or Elon Musk feel any more sympathy for them than their own ancestors felt for Chief Joseph, Sitting Bull, or Geronimo? A similar observation could be made about our lack of sympathy for the Taliban and the Salafists.

One difference is striking, though, about our current last-ditch defenders of traditions outmaneuvered by modernity. They’re more widely distributed, and they’re also much better armed. The consolations of “Whatever happens, we have got / The Maxim gun, and they have not” have succumbed in their own fashion to a modernity not even the Moderns themselves seem to understand. Not yet, anyway.

Marx thought that once the conditions of existence had changed sufficiently, the past would be, or could be, swept away by revolutionaries with their eyes on the future. Seen up close, from the vantage point of an individual life, the process is far uglier, no matter what subsequent theoretical revisions from the foundries of Marxist ideology, or cheerleading from neoliberal think tanks promise us. Somewhere between Faulkner’s The past isn’t dead, it isn’t even past, and Gibson’s The future is already here, it’s just not evenly distributed, there’s a place to stand that won’t offend either our conscience or our common sense. Maybe. One hopes. YMMV.

The Evolution of Noblesse Oblige

Andrew Carnegie:

We’ll buy a lot of books, and build a lot of places where working people can go and read them. This will help them better themselves, and in the process provide a more civilized future for all of us.

Bill Gates:

The country is not producing enough of the kind of people we’ll need for the future we’re creating. Beneficial change will require us to seize control of the school curriculum from clueless education experts and teachers’ union officials — an annoying, but essentially trivial task. First, we’ll buy their compliance. Then, if necessary, we’ll buy their gratitude.

Can it really be this simple? Probably not. Still, if you have kids, this might be a good time to take more seriously your own contribution to the world they’re going to be living in.

From 1994: The Future of Our Discontents

This essay appeared here and there on the Web in the mid-Nineties in slightly different versions — at one point it was even published in translation by an Italian Blade Runner fan site. There are things in it that I’d change if I were writing it today, but since the only point in re-jiggering the parts that seem embarrassing twenty years later would be to fake a prescience I didn’t have then, and don’t now, who would I be fooling?

 

The Future of Our Discontents

The Contributions of Ridley Scott’s film Blade Runner
to the Landscapes of the Twenty-First Century

Four towering gas flares roar up in the foreground and drift away over the landscape like the exhaust from some monstrous calliope. In the valley below them, a vast, corroded jewel of a city glitters in the smog-tinted autumn twilight. The year is 2019, and the city is Los Angeles—capital of the Pax Americana and setting for Blade Runner, Ridley Scott’s thirty million dollar meditation on the future of post-industrial capitalism.

Scott’s City of Angels isn’t for optimists. His images of things to come are as melancholy and as architectural as Doré’s engravings of Hell, as intensely detailed as a flatworlder’s logic. Yet the darkly eloquent nightmare at the center of his vision is also strangely apolitical—no one is explicitly blamed for the horrors it offers us. I doubt it ever occurred to Scott that anyone would object—not on ideological grounds, at any rate.

So in the summer of 1982, when Blade Runner was first released to theaters, the reaction against it may well have come as something of a surprise. Reviewers pointed inevitably to flaws in the film’s execution, but I suspect that the real difficulty lay in its tone.

Rightly or wrongly, we Americans have always considered our immunity to the historical accidents which plague other people an essential part of our birthright. The decade of Vietnam, Watergate, and the Arab oil embargo had been a depressing ten years for us, and few of us were eager to be reminded of it, especially when we were expected to pay for the privilege. Scott’s problem was this: just when Ronald Reagan was promising to restore American bragging rights, Blade Runner appeared to conjure openly with the demon of false pride. To many moviegoers, the result may have seemed less a prophecy than an exercise in cultural defeatism.

It would be hard to blame them. Blade Runner was—and is—a disturbing film, but it is also an emotionally accurate one. For anyone who continues to harbor doubts about the future of American empire, there is something deeply unsettling about the rain-soaked Los Angeles in which Blade Runner is set, something eerily familiar in its crumbling architecture and punked-out Third World inhabitants. I wonder how many people, picking their way through theater parking lots on a warm June evening in 1982, imagined for a moment that they heard thunder in the air behind them, or looked apprehensively for oriental characters on the exit signs as they started their cars and drove away. I suspect that there were more than a few.

It may have come as unwelcome news to Scott’s investors, but these disoriented moviegoers, and their reactions to Blade Runner—irritation on the one hand, and deja vu on the other—may have been a better guide to its long-term impact than the box office receipts. However much—or little—it grosses, Blade Runner will remain a compelling reminder of just how nasty life in the twenty-first century may eventually become. It shouldn’t be surprising that audiences find it haunting even as they reject it.

The source of their ambivalence is clear enough, after all. We Americans are the inventors of the twentieth century, but we still haven’t found a home in it. Despite our history, and our pretensions to technological superiority over the rest of the world, we are a lot like the victims of the Chinese curse—fated, no matter what our individual desires, to live in times which are a lot more interesting than we would prefer.

Perhaps that is why, in philosophical terms, the second half of the twentieth century has been so devastatingly quiet. The testimony of Heisenberg and Freud is on record in every library, and incidents like Chernobyl or urban gridlock are reported daily in the media, yet the ruling orthodoxy of the post-war era remains unswervingly Newtonian. It understands only cause and effect, problem and solution, and it recognizes only one heresy: anyone who dallies, even for a moment, with the notion that uncertainty might be a permanent feature of human enterprise is automatically persona non grata. Like Socrates, he may be tolerated, but he is never left alone with the children.

The price of this attitude, especially among the orthodox themselves, is a profound and chronic restlessness. The hope that we can find solutions is tempered always by the fear, largely unexpressed, that we may ourselves be part of the problem. We sense that we are facing not one, but two futures: the future of our public allegiances, and that darker future of moral decay, terrorism and nuclear holocaust which occasionally decorates our private nightmares.

Unfortunately, both are based on the same evidence. Whether we like it or not, the choice between them is always an act of faith—which may be why the use of the future as a metaphor for the misgivings of the present has become such a commonplace. It serves the same function in our time as Hell did in Dante’s: it provides us with the excuse—and occasionally the means—to confront what we have all too carefully hidden from ourselves.

And what illuminations does Blade Runner offer? In the story itself, very few. Whether Scott was forced in the end to capitulate to the demands of big-budget Hollywood marketing, or simply held himself aloof from the political implications of his vision, plot and character development in Blade Runner never rise much above the sad conventions of the comic strip and television cop show. It is only in the incidentals that Blade Runner is interesting—the details of set design and lighting, the tight integration of sound track and music; the tantalizing view, in minor characters and in actions taking place at the edges of the frame, of a city in which much more serious business is being transacted than the pursuit and destruction of some assembly-line Prometheus.

If this business is familiar, it should be. In embryonic form, its essential features surround us already. The political and social dominance of so-called “corporate cultures” is a fact; so also is the increasingly neurotic synergy of computers and real-time video which so gleefully confuses our dreams and our waking experiences. If androids seem farfetched, we have only to remember that it has taken less than thirty years for the first tentative experiments in genetic engineering to develop into the industrial technology which makes worldwide headlines today. Whatever the level of our individual understanding, we must all be aware that a sudden environmental or political apocalypse is not the only threat we face.

It is in the light of this awareness, which he clearly shares, that Scott has rebuilt and illuminated the city of Los Angeles for us. We see it as it might look on a late autumn afternoon nineteen years into the coming century: corrupt and essentially ungovernable, the decaying but still powerful queen of a technological civilization which steadfastly refuses to look at either the compass or the clock.

In the foreground, Harrison Ford and Rutger Hauer argue whether androids have souls, and whether, having been denied the right to self-determination by no less a figure than their creator himself, they are justified in resorting to mayhem. We scarcely notice Scott in the background, quietly aiming his camera down a narrow alley at the approaching Götterdämmerung.

Scott imagines the end coming, as Eliot did, “not with a bang, but a whimper.” His camera conducts us through the overpopulated ruin of Hollywood and into the downtown fortress headquarters of the Tyrell Corporation, where Eldon Tyrell, its founder and CEO, has embarked on a mission which would impress even Lee Iacocca. With the natural ecology in terminal disarray, the Third World noisily hawking its noodles in the glass-walled canyons below him, and little hope for continued profit in his dominion over either one, Tyrell—one of those superbly educated and superbly callous white men who, for lack of a better word, we agree to call technocrats—has hit upon a radical solution to his dilemma: he will replace every living thing, human beings included, with more tractable models of his own design.

If his solution seems far-fetched, consider what the present century has already accustomed us to: purge trials and personality cults, Mutual Assured Destruction; the technologies of mass production applied to genocide…. Attempts to selectively re-engineer all of Creation are still comfortably beyond the limits of technology, but should a more capable technology someday become available, it isn’t difficult to imagine a future General Motors or Pentagon willing to experiment with it.

The likelihood of such a future, and the moral fatigue which accompanies it, are among the true horrors of modern life. Against that day, the day when Tyrell’s particular gladness comes to pass, the twin ideologies of politics and religion have little to offer us. The future isn’t an ideological problem.

Sadly, it isn’t an aesthetic problem either. Artists often see evil more clearly than the rest of us, but they rarely have any greater power to correct it. In the realm of the senses, they are not so much politicians as historians. What makes their history more interesting than the kind we are taught in school is that sometimes, as in Blade Runner, it has yet to be confirmed by events.

In painting and in literature this is a familiar concept; in film it is still relatively new, and still poorly understood. Film is a powerful medium, but the experiments with form which, for more than a century, have marked the best of the other arts, have yet to excite much interest among professional filmmakers.

There are two reasons for this, I think. The first arises from the economics of film. Unlike painting or writing, filmmaking is not a handicraft. Film is a collaborative, high technology medium, and feature-length works cost far too much to be sold profitably to the individual collector. Such works must be mass-marketed if they are to be marketed at all, which effectively restricts any experiments in form to those which are likely to be tolerated by mass audiences.

The second reason is more subtle. The illusion of control over reality is one of the most seductive attractions of film, not just for audiences, but also for filmmakers themselves. These days, a director with Hollywood financing at his disposal can do just about anything God can do—provided, of course, that his ambition (and imagination) are up to the task. It is only natural that his acolytes—the cameramen, grips and electricians; the make-up and wardrobe and continuity people—should harbor illusions of their own. Sending an antique Rolls Royce over a cliff just to capture a second or two of film is heady stuff, especially when someone is paying you a fabulous amount of money to do it.

Unfortunately, the grandeur and the exhilaration are available only when the film retains some connection, however tenuous, with its theatrical antecedents. Animation, computer-generated imagery, and other seemingly avant-garde graphic forms may be interesting, but they lack the appeal of film as wish-fulfillment theater.

Audiences in particular have very sensitive antennae for this quality—the quality of their own dreams (or nightmares) given palpable form. The reason that a film like Blade Runner can successfully attach itself to an audience’s subconscious has less to do with its technology—the computer-controlled cameras, digital sound sampling, traveling mats, etc.—than with its psychology, a quality relied upon as much in Shakespeare’s plays as it is in any film.

Of course the tools of traditional theater and of film are vastly different. In Shakespeare’s plays, natural language is the only artifice required—it conjures so powerfully by itself that any staging beyond the absolute essentials seems almost a distraction.

The situation in Blade Runner—as in films generally—is reversed: it is the visual environment of the film which carries the weight of the narrative. In Blade Runner this environment is so strongly present, and so richly suggestive, that at times the relatively little dialog which is required to advance the plot seems more a threat to the dramatic illusion than a support for it.

In large part, this is due to the subtlety of Scott’s visual vocabulary. He gives us brief, but detailed glimpses of a world which is a plausible, if not precisely logical, extension of our own, then openly invites us to interpret them.

To take just one example: almost everyone in Blade Runner smokes—the impenetrable tobacco haze is an important visual element in many of the interior scenes. It isn’t clear, in an age when smoking has become anathema, an almost universal symbol of dissipation and self-contempt, what Scott intends by giving it such pride of place in his version of the twenty-first century.

The answer, I think, is that the smoking in Blade Runner is intended as an echo, a reminder that all the poisonous exhalations of a capitalist society are voluntary. The fatal eroticism embodied in the fumes from Rachael’s smoldering cigarette is repeated in the orange skies above Los Angeles, in the continual dark rain which beats along its decaying rooftops and licks at the trash fires in its streets. Consumption pursued as an end in itself, Scott seems to be saying, will lead to the same, slow catastrophe whether it is an individual or an entire society which does the pursuing. The American Dream is, in fact, a nightmare, and it has always been one.

The psychological acuteness in such metaphors is evident, but for those of us who are roughly Scott’s contemporaries, there is a particular irony in the way he applies them.

We are the children of plenty—our own age of innocence began in the national celebration which accompanied the end of World War II. We grew up surrounded and sheltered by the post-war fantasies of the Fifties. According to the folklore of our childhood, it was American know-how, American technology which had brought us our final victory over war and poverty.

I remember articles in the Weekly Reader of my childhood about the industrial benefits of atomic power and the increased farm yields made possible by the invention of DDT and 2,4-D. I also remember the stories my parents let slip about their own lives; the indignities they had suffered during the Depression, their separation from one another during the war years.

For those of us who had the good fortune to be born white, the future held no particular terrors. What scared us was the past which our parents had so narrowly escaped. Until the Vietnam War arrived to provide us with horrors of our own, we were only too happy to join them in fleeing theirs.

For our generation, then, the more the future threatens, the more it resembles the past—and Scott seems to understand this instinctively. In Blade Runner, he presses the Thirties into service again, like the Bogeyman, to cast its long shadow over our hopes. The Mayan Revival is once again the dominant architectural style; the women of all men’s dreams are once again encased in Joan Crawford’s stiff-shouldered exoskeleton. The horrors of Scott’s landscapes turn on a delicate and beautifully original application of McLuhan’s principle of historical simultaneity—the visual artifacts of another era detached from their original context and pressed into service as emotional shorthand in our own. As the founding generation of the Information Age, we are perhaps the first to be conscious of these signals; we are certainly not the last to be vulnerable to the distortions which they can sometimes introduce into our collective memory. In any event, we respond to the language, and Scott speaks it to perfection.

Yet inventive as it is, this psycho-historical sleight-of-hand isn’t the only trick Scott has in his bag. His camera may imagine the future, but it also echoes the present with an accuracy that is often unsettling.

Consider the immigration patterns of the past twenty-five years. Most of us are aware of the Hmong and the Marielitos, of the continual flood across our southern borders. We’ve seen the changed population mix in our cities, especially along the southern rim of the country. Scott takes the trend a step further, imagining a time when all of our imperial chickens have finally come home to roost.

At street level, his Los Angeles is part Ginza and part Decline of the West—home not only to the rubber-clad, German-speaking dwarves who clamber over Deckard’s car looking for parts to pry loose and sell, but also to Gaff, the extraordinary Latino with white eyes who practices Origami and speaks an argot which sounds like an amalgam of Japanese and Indonesian. (Gaff is especially interesting. As I read him, he is the embodiment of that elegant nihilism which hereditary second-class citizenship sometimes produces among the more talented members of minority groups. If James Baldwin had been stripped of the last of his illusions and armed with a license to kill, he might have behaved a lot like Gaff.)

Although we may not yet be able to find exactly these creatures walking the streets of our cities (even in Los Angeles) there is a definite possibility that we may someday have to make a place for them. And if their numbers happen to include a few clones and cyborgs, who will complain? The same people who complain now, probably—”the kind of cop who used to call black men ‘Niggers.’” For the rest of us, what people are made of—then as now—will have little to do with the physical composition of their bodies.

After all, we are Americans, and Scott’s twenty-first century, despite its stylistic kinship with fascist public works, is an American century. He recognizes that Americans—Californians, at any rate—have always made their own decisions, have always flirted with anarchy.

When Chew tells Batty “I just do eyes,” he is testifying to a form of social disintegration which Orwell could never have imagined. Not ideology run amok, but the implosion of ideology has sent the people in the streets of Scott’s Los Angeles scurrying. They must re-invent themselves daily out of the unstable bits and pieces of a collapsing rationality. Every man in this New Jerusalem is an island, and a floating one at that. Such purpose as exists here isn’t totalitarian—it’s individual, and small. A subcontractor in human eyes fits the logic of such a society perfectly.

And whether we like it or not, we know that Chew speaks for all of us. If things sometimes go wrong, he believes, as we do, that it can’t be our fault—we are responsible only for a piece of the action, not for the whole. Politics is said to be the pride of free men, but in the twentieth century, the free man has learned to avoid all forms of responsibility, politics included. Politics has been reduced to a form of longing, an eloquence in the prayers of the oppressed. We may sympathize with their plight, but we ourselves are no longer interested in acting on their behalf. Outside the Third World, the clear and present danger in our era isn’t totalitarianism, it’s chaos—the lack of necessary and sufficient reasons for doing anything at all.

We know this, all of us. We don’t often discuss it, but we know it. Should automobiles someday come equipped with self-contained oxygen systems, and large-scale mining operations be set up in our urban landfills, we’ll have another chance to reassess Scott’s powers of prediction.

In the meantime, we can say only that his contribution to the vocabulary of our future discontents has been taken up by other filmmakers. When we hear petty fences speaking what sounds like Korean in Trouble in Mind, or see petty Ministry of Information hacks in Brazil driving three-wheel Messerschmitt staff cars (ca. 1949), it wouldn’t hurt us to remember who was first to disembark on these shores. Ridley Scott may not be aware of it himself, but he understands our nightmares.

If you doubt it, make your way over Mulholland Drive, or coast down from Griffith Observatory some night when you have nothing better to do. Try to see la Ciudad de Nuestra Señora, Reina de los Angeles without the gas flares, without the winking lights of ground effects spinners above the Harbor Freeway. If you find, as I do, that the present is overlaid by something darker, something more ominous, then you may agree with me that Scott has, in Wim Wenders’ wonderful phrase, “colonized our subconscious.” When all is said and done, this is what has always been expected of artists. We should be grateful that someone still possesses the ancient power.

A Cold Day in Hell

Today Paul Krugman has discovered that, gasp, technological unemployment is really real! And it’s really, finally here! And it really, really will result in a permanent transfer of wealth from labor to capital, no matter how many college degrees laborers go into debt to acquire! (And, coincidentally, of course, this also seems to imply that Marx might actually have been a bit smarter than we thought.)

I’m being unfair, or at least uncharitable, to the penitent Dr. Krugman, who’s a nice guy, and would be a nice guy even if he weren’t an economist. Still, this is an amazingly belated observation on his part. I thought that these economist guys all knew this stuff, but were afraid to mention it for fear of devaluing their Keynesian cheerleading. Horrifying to think that they didn’t actually know it at all.