Now can we please get back to taking care of the real business of the future? While we still have one, that is?
Attention Kayleigh McEnany:
This is the most famous picture of Mildred Elizabeth Gillars, Axis Sally, in U.S. military custody in 1946, on her way to be tried and convicted of treason against the United States. Is this really who you want to be when the sadistic lunatic whose lies you’re so diligently passing on to us finally gets his comeuppance?
Modularity, not convergence, is the future. There’s not going to be any other foreseeable way, short of magic, to approach the ideal state of computing hardware design, in which the use case alone determines the form factor. If you have the money to acquire its full arsenal of devices, Apple currently comes closer to this ideal than anyone else, Microsoft included.
In this future, it’s not going to matter where data is stored. So long as every device granted access to a unit of data sees the same instance of it, with both security and synchronization routinely embedded in every transaction, and therefore rendered trivial to the user, the question of where it physically resides at any given moment becomes irrelevant. Again, Apple gets this better than anyone else, even when its execution has been less than ideal. That’s why its somewhat premature effort to do away with files and file systems on the iPad will ultimately prove to be the right way to go. Files as a concept are obsolete. Computing devices understand this. Human beings do not. Handicapped by our reliance on the conceptual commonplaces of the past, we haven’t yet figured out what the ideal relationship should be between the tangible and the virtual, but we will. We’ll have to.
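The idea above can be sketched in a few lines of code. Everything here is hypothetical (none of it is Apple's actual architecture): each unit of data lives as a single versioned record in a shared store, and every device reads and writes through that store, so all devices see the same instance and "where is the file?" stops being a meaningful question.

```python
# Minimal sketch of "one instance, many devices" (all names hypothetical).
# Every device reads and writes through a single shared store, so the
# data's physical location is invisible; synchronization is just a
# per-record version counter bumped on every write.

from dataclasses import dataclass


@dataclass
class Record:
    value: object
    version: int = 0  # monotonically increasing per write


class SharedStore:
    """The single authoritative instance of every unit of data."""

    def __init__(self):
        self._records = {}

    def write(self, key, value):
        prev = self._records.get(key)
        version = (prev.version + 1) if prev else 1
        self._records[key] = Record(value, version)
        return version

    def read(self, key):
        return self._records[key].value


class Device:
    """Any device granted access sees the same instance of the data."""

    def __init__(self, name, store):
        self.name, self.store = name, store

    def edit(self, key, value):
        return self.store.write(key, value)

    def view(self, key):
        return self.store.read(key)


store = SharedStore()
phone, laptop = Device("phone", store), Device("laptop", store)
phone.edit("note", "draft v1")
laptop.edit("note", "draft v2")
# Both devices see the laptop's later edit, not a stale copy.
assert phone.view("note") == laptop.view("note") == "draft v2"
```

Real systems complicate this with conflict resolution, offline caches, and encryption, but the shape of the ideal is the same: the store, not the device, owns the data.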
Ambidexterity is the new black. Trackpad, touchscreen, or mouse? Keyboard, stylus, or voice? Why not all at once? An embarrassment of riches ought to be the goal here. On our most treasured devices, there’ll always be at least three or four ways of doing anything, no matter where our hands are, or our eyes. We should be thinking musical instruments, not typewriters; collages, not spreadsheets; and we should try to keep in mind that whatever advances are made in the underlying technologies, imagination is still the most formidable aspect of the human side of the human/computer interface. Steve Jobs understood this, which is undoubtedly why Apple still understands it today, and why I think they’re very likely to remain the most reliable overall steward of human interface design and development as the 21st Century progresses.
It’s probably a good thing that no one in the Republican Party dares contradict the vicious imbeciles of the Trump administration. Better for the rest of us if they’re all still looking in the wrong direction when Nemesis finally comes for them.
Donald Trump is not the first U.S. President to be elected in the 21st Century, and the odds are he’s not going to be the last.* He does seem likely, however, to go down in history as the century’s most characteristic one, at least as things look now, twenty years into the accursed opening maneuvers of its history-making engines.
President Trump himself is undeniably one of those engines, a steam-powered anomaly in an era increasingly lit up at night by the output from solar panels, wind turbines, and lithium-ion batteries. It would probably be easier, and it would certainly be more pleasant, to write about the American Presidency without mentioning the Donald, but since he does seem to represent some sort of numinous final stage in the rot that’s been eating away at the office since 1945, there’s no credible way to avoid dealing with him in all his radiantly decadent glory.
Back in 2011, in snarking at the dozen or so GOP presidential candidates of the time, I called Newt Gingrich the Dorian Gray of the Republican Party. By the middle of 2016, as candidate Trump’s arsenal of creepy facial expressions began its final assault on our international media landscape, I realized that Newt had been a mere pretender. Not even Oscar Wilde himself could have imagined the world we were now living in, a world in which the real-life equivalent of his fictional character actually preferred having the evolving portrait of his depravity visible to everyone and his dog.
When January 20, 2017 finally did arrive, it was even weirder than usual for a Presidential Inauguration Day. Most of the political class and its media pilot fish were still hung over from the excesses of the election in November. To their momentarily everlasting astonishment, it seemed, Trump had actually managed somehow to get himself elected President, and was standing there now, live in front of the assembled cameras, taking the oath of office. Oh. My. God. The nuclear football in the possession of a sociopathic, blowhard hotel developer! Sackcloth and ashes! Baskets of deplorables! Facebook and Russians! Blah, blah, blah.
Trump himself was soon to be busy elsewhere in Washington. Once he’d gotten the rug swapped out in the Oval Office, had more Trump-suitable golden drapes hung above its windows, and settled his very stable genius behind the Resolute desk, he got down at last to the real work: redecorating the American political landscape with a stunning array of bagmen, bootlickers, generals and ex-generals, racist Dixie irredentists, religious fanatics, voodoo economists, firearms fetishists, Fox News ressentimentistes, rust belt coal rollers, libertarians looking for a hill to die on, and his own children. In 2020, I continue to wonder: is there really anyone left in the United States who still believes that this was all some sort of diabolical accident?
No, there isn’t. And no, it definitely wasn’t. Most of the electorate understands very well that this train wreck of an election was no accident, whether they voted for Donald Trump or not. And yet, amazing as it is to contemplate, our luck has held once again. Despite the best efforts of Trump and his merry band of magatrumpistas, the United States seems unlikely to become a failed state during his reign, no matter how diligent its political class is at helping him carve one out of the complex patrimony of the U.S. Constitution.
If we can somehow manage to ignore all the present din and idiocy, what is undeniable about the history which has led us to Trump in the White House is that already by the latter half of the 1970s, the international economic order set up by the western allies following the defeat of Nazi Germany and Imperial Japan was becoming alarmingly unstable. Contrary to the arrogant predictions of our so-called foreign policy experts, the economic restoration of our defeated enemies had not, in fact, bound them in perpetuity to political alliances dominated by the United States. China was not, in fact, going to be permanently denied the economic and political deference due it as a society which embraced more than 18% of the world’s population. Even the lesser nations of the world would not, in fact, continue to fear being denied a place at the trough of American largesse, especially as there came to be less and less in it for them.
The election of Donald Trump is far more, I think, than a macabre trick that the rubes in the MAGA hats have played on themselves. It’s also the clearest demonstration we’re ever likely to get that prominent members of the American political class are not as savvy as they make themselves out to be. Simply put, they’ve failed to prepare the American people for the historical metamorphosis which has brought the postwar Pax Americana to an end. Even more simply put, bearing humiliating witness to a Trump in the White House is the price they’re paying for that failure. Whether they realize it or not, the longer they keep propping up the status quo, the higher that price will be. After four years of the Trump administration, let alone eight years, I’m pretty sure the vig alone will wind up bankrupting them.
While it may be true that even in the hands of a Donald Trump the U.S. Presidency remains as remarkable an institution as it ever has been, it’s definitely true that it’s never been a less transparent one, especially with respect to the exact nature of its formal and informal powers. To give just one example, there are 17 agencies in the so-called United States Intelligence Community, which, according to the latest figures available to the public, have granted top secret security clearances to a total of over 900,000 people. Meanwhile, President Trump is reported to have restricted his daily security briefing to two pages, while supposedly watching four hours of Fox News a day. What reason is there for anyone to believe that he’s actually in charge of what is going on in these agencies? Who can predict the impact that such calculated ignorance will have on our national security, or our foreign policy in general? Certainly someone is in charge — many someones more likely — but I doubt that any of them are significantly more accountable to the President, at least on a day-to-day basis, than they are to the public at large.
Admittedly it’s hard enough to run an effective federal administration with a full complement of politically competent policy experts, and a chain of responsibility that extends to the lowest level of the executive branch. You certainly can’t run one effectively with a staff consisting of your daughter, your son-in-law, Sean Hannity, Sheldon Adelson, a rota of retired generals, lobbyists and golf partners, and an assortment of idiot yes-people whose only significant achievement is the byzantine complexity of their self-abasement. The country is too large, its political and economic infrastructure too complex, to be managed entirely from the top down; the responsibilities of the Federal Government are too extensive, and its interlocking bureaucracies too encrusted with decades of turf wars, interagency rivalries, and deviant ideological agendas to respond competently to even the most intelligently conceived policy directives.
The bottom line, I’m afraid, is that the U.S. Constitution is showing its age, and so is the U.S. Presidency. I think it’s significant that both President Trump and three of his Democratic Party challengers for the office in 2020 are over 70 years old. I remember when we used to laugh at the infirmities of the gerontocracy under Brezhnev in the Soviet Union, and Mao in China. These days, it looks as though the laugh is on us.
*YMMV. I have to say, though, that if I were a bookie, I’d be reluctant to offer the current Vegas line on that bet, especially if I had to lay off any significant amount of it. That’s the kind of move that might just wind up getting you your legs broken — or worse — even when most of your ordinary day-to-day tormentors would be running around shrieking and waving their hands, looking for a window to jump out of.
Yeah, yeah, I know. But like they say, sometimes it does rhyme.
It’s not that Mark Zuckerberg, Travis Kalanick, or Jack Dorsey are unscrupulous; it’s that they don’t seem to have any idea what scruples are.
A couple of years ago, a friend of mine who’s relied on me off and on for thirty years for tech advice came to me with a complaint that his iPhone 6 — then barely two years old, out of warranty, but a month or so short of being off contract — was shutting down at random, even though it still seemed to have plenty of charge left in the battery. We rounded up all the usual suspects to no avail, then hauled ourselves off to the nearest Apple store, where the genius at the genius bar, assisted by Apple’s own diagnostic tools, rounded up all of her usual suspects, and concluded that there didn’t seem to be anything wrong with his phone — except, of course, that the random shutdowns made it at best unreliable, and at worst, unusable. At my suggestion, my friend paid off his old contract, and replaced his suddenly unfaithful companion with the then current model, an iPhone 7. Two months later, I read that Apple was replacing batteries for free in out-of-warranty iPhone 6’s exhibiting random shutdowns, with no questions asked, and, as usual for Apple, no explanations given. A year later, again with no explanations given, Apple began throttling iPhones with aging batteries which could no longer supply the necessary voltages under peak load conditions.
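The physics behind those shutdowns is simple enough to sketch. What follows is a back-of-envelope model with illustrative numbers, not Apple's actual algorithm: a battery's internal resistance rises as it ages, so its terminal voltage sags further under a peak current draw. If the sag dips below the system's cutoff voltage, the phone browns out and shuts down; capping peak current — throttling — trades performance for staying above the cutoff.

```python
# Back-of-envelope model of battery aging and throttling.
# All numbers are illustrative, not measured iPhone values.

CUTOFF_V = 3.0  # hypothetical minimum rail voltage before shutdown


def loaded_voltage(open_circuit_v, internal_ohms, current_a):
    """Terminal voltage under load: V = Voc - I * R_internal."""
    return open_circuit_v - current_a * internal_ohms


def max_safe_current(open_circuit_v, internal_ohms):
    """Largest peak current that keeps the rail above cutoff."""
    return (open_circuit_v - CUTOFF_V) / internal_ohms


# A new battery with low internal resistance handles a 3 A peak fine.
new_sag = loaded_voltage(3.8, 0.10, 3.0)  # 3.5 V, above cutoff
# An aged battery whose resistance has tripled browns out on the same draw,
# even though its open-circuit voltage (the "charge left") looks healthy.
old_sag = loaded_voltage(3.8, 0.30, 3.0)  # 2.9 V, below cutoff
assert new_sag > CUTOFF_V and old_sag < CUTOFF_V

# The throttling fix: limit peak current instead of shutting down.
assert max_safe_current(3.8, 0.30) < 3.0
```

This is why the genius bar diagnostics found "nothing wrong": the battery still held plenty of charge at rest; it just couldn't deliver it fast enough at peak load.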
It’s been painful to watch Apple get Twittered, Facebooked, and ultimately sued over this whole affair. The fact is that those of us who were aware of the limitations of lithium-ion battery technology, Apple engineering executives above all, should have seen this coming. The very things that make the iPhone magical — its pocketable size, its ever-increasing computing power, and its appliance-like simplicity and ease of use — are also, to a greater degree than Apple marketers would have us believe, based on an illusion.
The sad truth is that Apple had painted itself into a corner bounded on the one side by an understandable, if misplaced confidence in its own hardware and software innovations, and on the other by a misguided attempt to protect the technological innocence of its customers from the consequences of their own addictions.
Marketing has its own imperatives, and as any marketing expert worthy of the name would probably concede, a certain blindness to the long-term consequences of its own cleverness has never been much of an impediment to its operating budget, or to its status in the corporate hierarchy. Until, of course, the shit hits the fan. Then the public dance of recriminations is performed, and everyone concerned goes back to business as usual. Except for the hapless consumer, who’s inevitably forced to grumble, sigh, roll his or her eyes, then pay whatever the going rate is to get back on the road.
Apple could have done a lot better a lot quicker. Its customers love what it promises, even those of us among them who know to what extent the promise exceeds the current limits of technology. Progress requires us to dream forward, and to accept that sometimes along the way our reach will exceed our grasp. That said, a little more transparency from those at the pointy end would be welcome. Infantilizing the consumer as a path to marketing success has all sorts of support from the countless schools of social science pilot fish who’ve attached themselves to corporate C-suites in the postwar decades. God forbid that I should deny what their statistics are telling them about our human vulnerabilities. I would ask them, though, to consider how they feel when their own strings are pulled.