“In economics, the majority is always wrong.”
~ JK Galbraith
One of the unfortunate gifts of the current, star-crossed administration is that there's something for everyone that will get their knickers in a twist. If immigration or climate change isn't your thing, just wait a few days, and some administration official will come out with a statement that lands somewhere in the space between spectacularly ignorant and merely deeply ill-considered. My latest opportunity to double-take arrived a few days ago, when Secretary of the Treasury (and Goldman Sachs alum) Steven Mnuchin opined that the threat of artificial intelligence to employment is “not even on my radar screen”.
To be fair, the clip is brief enough that it is difficult to conclude whether or not Mnuchin knows what he is talking about. Too often when we talk about technology we fixate on one aspect of it, intending (although not always) that this aspect stand in for the entirety of the technological phenomenon. These days, favored metonymies are ‘AI', along with ‘robots' and ‘algorithms'. Keeping this in mind while listening to the Mnuchin clip, it's unclear what he actually means when referring to AI. I suspect, though, that he's talking about the holy grail of AI: artificial general intelligence, an AI that is indistinguishable from human intelligence.
If that is the case, then he did a disservice to the question, which was about the impact of AI on employment. Or, if you'll allow me to pluck out the metonymy, the impact of technology on employment, which is much more amorphous. Mnuchin's dodge was to say that, since we won't have human-equivalent AI for the foreseeable future, it's something that's not worth thinking about, at least until it happens. Come to think of it, I've heard this dodge before, mostly from the mouths of climate change skeptics and deniers. In both cases, the purpose is to obfuscate and delay until the truly catastrophic comes to pass, then innocently maintain that “no one could have seen this coming” or some such nonsense.
However, Mnuchin gives us a good opening for asking how technology and employment are influencing one another, or at least how we might think about these categories and phenomena. At another point in the same clip, he expresses optimism that technology is good for productivity, that it creates new jobs and industries and stimulates demand – all the old chestnuts. But how much of this is true, and is this time really different?
Oddly enough, it may make sense to begin the discussion from another perspective entirely. The vast majority of articles that have lately tackled the specter of automation and unemployment have approached it from the point of view of individuals. For example, the 3.5 million truck drivers who will shortly be rendered obsolete by driverless fleets. That's a lot of people, but we are still talking about an aggregation of individuals. This way of thinking may seem to make sense, since we persistently characterize the economy as exactly that: an aggregation of individuals. Individual agents make decisions to buy and sell, and the supply and demand curves shift accordingly. Firms put goods and services out on the market, and people either buy them, or they don't. Price is revealed, and all is well with the world. This may be adequate for anyone just beginning to learn about economics, but there are other interpretations that are perhaps even more powerful, and more resonant with the historical progress of industry.
Published exactly 50 years ago, economist John Kenneth Galbraith's The New Industrial State elaborated a theory of the firm that proposed a very different way of looking at production. As he observed large corporations engaging in extremely high-stakes, long-term bets, he noted that the commitment required to design and manufacture something as complex and expensive as a jetliner required an altogether different way of looking at the market. For Galbraith, the prime directive for a firm was not to be responsive to the market, but to subsume that market as thoroughly as possible. Galbraith fils summarizes his father's thought:
Large business firms often replace the market altogether. They do this by integration: replacing activity previously mediated by open purchase and sale with activity either internal to the corporation, or between a large, stable enterprise and its small, specialized suppliers, to whom risk is transferred. People reduce uncertainty…by forming up into structured groups large enough to forge the future for themselves. In politics these are countries and parties; in economics, corporations. Once control passes to the organization, Galbraith wrote, it passes completely.
Galbraith called this manifestation a “technostructure”; for him, this was the great narrative of American industrial history. There is much more to Galbraith's theories, but for the purposes of this essay, there are two important points in the above passage. The first is that no established corporation views the free market as desirable. Free markets only lead to uncertainty, threatening profitability and the ongoing viability of the firm. Uncertainty must be quashed at any cost – this includes both new entrants as well as existing consumers. It certainly doesn't work all the time, but it works well enough that companies hewing to this worldview may indeed last for decades. One need only look at Apple, which has famously built its success on designing products that it believes people need – and mercilessly removing functionalities that it no longer considers necessary. These deficits are then remedied by an extraordinary PR and marketing machine, which effectively uses the company's pole position to control the market's desires.
This allergy to uncertainty leads to the second insight. The drive to control the future is why cartels and monopolies tend to be the real equilibrium state for most ‘free-market' economies. Price fixing and other forms of cartel behavior are the scourge of free market ideologues, because the fact is that it's much easier to keep the disruptors out and make deals with your pals than it is to genuinely worship at the altar of innovation and entrepreneurship. Put another way, all Objectivists are aspiring oligarchs.
Of course, Galbraith was writing in the late 1960s, when manufacturing was king and digital information technologies were but a distant glimmer. And when we think of cartels, we are usually evoking OPEC and other “old economy” phenomena. Surely the digital economy is moving too quickly for Galbraith's principles to continue to hold fast. But consider this passage from Jason Smith, writing recently in the Brooklyn Rail:
Google's parent company Alphabet speaks in exalted tones of technological moonshots, but ninety percent of its revenue and almost all of its profits still come from advertising, most of it via search engines. It is buying up smaller robotics and AI firms, but not necessarily to ramp up investment: it is to establish monopoly conditions that will guarantee super-profits and higher market share within these stagnant conditions. Today, high profits are assured for firms able to disrupt market dynamics and price signals. Such firms are often “more adept at siphoning wealth off than creating it afresh”; they thrive less through innovation than through exorbitant market shares, and streams of technological rent.
Reading Smith in light of Galbraith, one really ought to replace “Today” with “As always.” And lest it be forgotten, Silicon Valley as a whole is not immune from cartel behavior: witness the $415m settlement reached in 2015 in a class-action lawsuit alleging that Apple, Google, Intel and Adobe had colluded with one another on the creation of “no-poach” lists, essentially promising not to hire away each other's employees in the never-ending war for engineering talent.
Smith's uncanny echo of Galbraith's half-century-old observations notwithstanding, his larger project is to come to a better understanding of the relationship between technology, productivity and employment. It is intriguing to view this set of relationships not simply from the point of view of economic data, which is endlessly contested, but rather from the perspective of the stakeholders, that is, firms and, to a lesser extent, the government.
In this sense technology, like the market, is subordinated to the drive of firms to neutralize uncertainty. Like capital, it is deployed selectively. The notion that there is a headlong rush to replace everything (or everyone) with automated systems is simply fictitious. Smith cites the fact that since 1999, “private investment in software and computer equipment has fallen precipitously, by a full quarter: it is, today, as low as it was in 1995.” This is coupled with another disquieting fact: that since the financial crisis, the United States has had “the slowest growth in productivity of any decade in American history.” With so much capital on the sidelines, it is easy to conclude that investment in automation is not proceeding at nearly the rate that it could be. Smith concludes:
Current speculations on both the promise and threat of automation are confronted with an ongoing crisis of accumulation [of capital]. In this climate, a fragmentary implementation of automation is unlikely either to liberate large fractions of humanity from work, or produce mass unemployment of the sort envisioned over and again by commentators for the past century.
At first glance, this “fragmentary implementation” may seem reassuring. As long as firms that occupy the technostructure niche are profitable and happy, the kind of catastrophic job losses implied by the sensationalizing media will occur much more slowly than they might otherwise. Firms will only implement automation to preserve their market positions; those truck drivers may yet have a chance to get retrained! But this is also a cold comfort, since it does not mean that automation isn't continuing to happen. It's just that it's happening at a pace set by the technostructure, and is meant to serve its interests, not the market's, and certainly not the public's.
There is a further, more troubling conclusion to be drawn, though. If the pace of automation is insufficient to dislocate the economy so suddenly, such that the torches and pitchforks stay stashed in people's garages, then what chance does labor have to assert its claims? What, in fact, even are the claims that labor may make, in a context that is bereft of unions and short on organizing? And what does work look like in an economy where automation is eventually, but nevertheless inevitably, deployed? Next month I'll look at these issues. In the meantime, though, perhaps Steven Mnuchin wasn't wrong when he said that AI replacing workers wasn't even on his radar. As long as the technostructure remains unperturbed, it would be more accurate to say these concerns remain comfortably under the radar. For the foreseeable future, it sounds like that's exactly where he wants them to be.