Heteromachinations

by Misha Lepetic

As for you, my galvanized friend, you want a heart.
~ The Wizard of Oz

There is an old joke that deserves to be made popular again. A CEO is touring the company's newest factory. The manager, with a great deal of pride, points out how everything is automated. As the tour reaches the final room, the CEO notices a man sitting sullenly in the corner, with a leashed dog sitting next to him. Somewhat surprised, the CEO asks why the man is there, to which the manager responds, "It's his job to feed the dog." Stumped, the CEO asks why the factory would need a dog. The manager responds, quite matter-of-factly, "Why, to keep the man from touching the equipment."

At least one telling of the joke can be attributed to Warren G. Bennis, a scholar of organizational psychology and more-or-less originator of the field of leadership studies. But what is more interesting to me is the fact that Bennis's version of the joke goes back to 1991—an indication that we have been thinking about technological unemployment for a long time. When I originally heard the joke, probably sometime in the mid-90s, I savored it for its absurdist connotations: man and dog, locked in an eternal, Monty Python-esque loop of feeding and guarding, so as to guarantee no interference in the well-tempered functioning of the machine that has almost entirely replaced them both.

But these days what resonates for me more profoundly is the notion that these two still have jobs, regardless of how marginal such jobs may be; someone still has to feed the dog, and someone still has to keep the man from messing up the machinery that does the actual work. The real subtlety of the joke is that no presence should be needed at all, and yet one is somehow still required. The jobs, for both man and dog, are a fig leaf, but evidently the owners of the factory have decided that such a fig leaf is necessary, or at least desirable. Why is this?

I was reminded of this joke when recently contemplating the ubiquitous headlines that sensationalize the wholesale replacement of human labor by non-human capital. Unsurprisingly, the mainstream media prefers the drama of entire sectors of labor being sidelined. For example, an evergreen topic is the imminent wipe-out of heavy truck driving, which accounts for 1.8 million jobs in the United States, a figure that nearly doubles to 3.5 million if you include taxis, delivery vans and the like. But paradigms are rarely overturned quite so rapidly, and the story that is already unfolding before us is much trickier to unravel, and more interesting.

*

What's necessary to understand from the start is that automation progresses fitfully and partially. It is driven by firms seeking local optima, and once an optimum is reached, a firm may stay there for quite some time, since it has no incentive to replace every possible job if there is no reason to do so. Instead, it will approach automation instrumentally, as a way of limiting its exposure to the risk posed by existing and budding competitors. This risk-based approach was a cornerstone of JK Galbraith's economic thought on what he termed the ‘technostructure', which I explored more fully earlier this year.

But even this more measured assessment still describes a massive sea change in the way that labor is structured. Nevertheless, there are plenty of commentators who maintain the old chestnut that technological change always creates new opportunities for workers. One of the more reliable standard-bearers in this regard is The Economist, which recently wrote:

…technology is creating demand for work. To take one example, more and more people are supplying digital services online via what is sometimes dubbed the "human cloud". Counter-intuitively, many are doing so in response to artificial intelligence.

Put another way, as automation broadens its interventions from the manufacturing and supply chain industries into services, what we are seeing is a re-emergence of piecework, but in a digital context. What does this look like in practice? Sometimes it is explicit, as evinced by Amazon's Mechanical Turk service, which since 2005 has been doling out ‘human intelligence tasks' that pay meager amounts to home workers for ‘microwork' such as audio transcription, image tagging, and even participation in experiments and surveys run by social scientists.
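To make that ‘doling out' concrete, here is a minimal sketch of how a requester might post a single image-tagging task programmatically. It is purely illustrative and not drawn from any company mentioned here: it assumes AWS's boto3 SDK and the Mechanical Turk sandbox endpoint, and the task content, three-cent reward, and timings are placeholders of my own.

```python
import boto3

# A rough, purely illustrative sketch of how a requester "doles out" a human
# intelligence task (HIT) programmatically, using AWS's boto3 SDK against the
# Mechanical Turk sandbox. The task content, reward, and timings are
# placeholders, not any real company's workflow.
mturk = boto3.client(
    "mturk",
    region_name="us-east-1",
    endpoint_url="https://mturk-requester-sandbox.us-east-1.amazonaws.com",
)

QUESTION_XML = """
<HTMLQuestion xmlns="http://mechanicalturk.amazonaws.com/AWSMechanicalTurkDataSchemas/2011-11-11/HTMLQuestion.xsd">
  <HTMLContent><![CDATA[
    <!DOCTYPE html>
    <html>
      <head><script src="https://assets.crowd.aws/crowd-html-elements.js"></script></head>
      <body>
        <crowd-form>
          <p>List the objects you can see in this image:</p>
          <img src="https://example.com/some-photo.jpg" width="400"/>
          <crowd-input name="tags" placeholder="e.g. dog, bicycle, stop sign" required></crowd-input>
        </crowd-form>
      </body>
    </html>
  ]]></HTMLContent>
  <FrameHeight>500</FrameHeight>
</HTMLQuestion>
"""

response = mturk.create_hit(
    Title="Tag the objects in an image",
    Description="Look at one photograph and list the objects it contains.",
    Keywords="image, tagging, labeling",
    Reward="0.03",                    # piecework: a few cents per task
    MaxAssignments=3,                 # three workers, so answers can be cross-checked
    LifetimeInSeconds=24 * 60 * 60,   # the task stays listed for one day
    AssignmentDurationInSeconds=300,  # five minutes allotted per worker
    Question=QUESTION_XML,
)
print("HIT created:", response["HIT"]["HITId"])
```

Multiply this by thousands of requesters and millions of tasks, and you have the ‘human cloud' The Economist describes.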

More interesting is the way in which human work is being concealed behind the facade of automation, or even passed off as automation itself. Facebook's personal assistant M made a splash when it debuted in August 2015 as a competitor to Amazon's Alexa and Apple's Siri. Its selling point was that M could handle a much broader range of requests than either, but by November enterprising journalists had deduced that whatever success M enjoyed in this brutally competitive field was mostly due to an undisclosed number of humans backing the technology up.

In one case, technology reporter Mat Honan asked M to deliver a parrot to a rival journalist's office. I suppose people do this sort of stuff in the Bay area with some regularity, because M was quickly able to find Happy Birds, a company that rents out birds for events. But Honan's request was built to test what happens "when M ventures in the outside world — which is a large part of its promise. It has to bump up against humans, and when it does, it drops the pretense of AI altogether and becomes just another cog in the gig economy." In this case, M shape-shifted into a human contractor, who called Happy Birds several times in order to negotiate the deal. Additionally, a similar call went out through another gig economy platform to see if anyone could resolve the client's request.

What does this tell us? For one thing, it points out the implicit ways in which we mold ourselves to our tools, even tools as allegedly flexible as virtual personal assistants. If Siri gives up on a request and just searches the web, then we learn not to ask anything so complicated the next time around. Facebook, however, is hoping to carve out a more decisive market share by addressing a much larger variety of requests. Quoted in MIT Technology Review two and a half years later, project lead Alex Lebrun noted:

People try first to ask for the weather tomorrow; then they say ‘Is there an Italian restaurant available?' Next they have a question about immigration, and after a while they ask M to organize their wedding… We knew it would be dangerous, and it's wider than our expectations.

It's also worth remarking that, in that intervening time, M's maximalist approach has allowed its user base to expand only minimally, from a few hundred users in August 2015 to about 10,000 in April 2017, all of whom are based in the same geographic area (guess which one). Obviously, with great scale comes great responsibility.

So here is one scenario where The Economist indeed has a point. As automation seeks to turn all of lived experience into a service, it still needs to have humans behind the curtain, pulling the levers that the software cannot. And even if M gets to the point that it can negotiate a parrot rental via telephone with a human interlocutor, people are still needed to drop off and pick up the birds, and so on; the physical world abides. But at the same time, once M knows how to negotiate for parrots, it need never receive that training again. At a certain point it will be able to translate that knowledge into negotiating for piglets, or hot air balloon rides, or what have you. There will be fewer edge cases, and therefore fewer people needed, although I am convinced that, if M is to maintain its position as the personal assistant that is closest to being a concierge, there will always be a watchful human standing by, ready to catch the occasional request to have my hotel room decorated with pictures of Jeff Goldblum.

*

At least with M, Facebook has admitted the ruse. Other companies have not been so forthcoming. Expensify, a startup that allegedly ‘automates every step of the expense reporting process', would seem to have the perfect solution for this onerous task. Customers scan their receipts, and the company's SmartScan service uses optical character recognition (OCR) to make everything machine-readable, except when it isn't. Here's a recent tweet I picked up on the matter:

"I just went browsing through @expensify's jobs on MTurk. Found boarding passes, hotel receipts (name, dates, details), medical receipts, addresses, signatures, and a whole lot of burgers and burritos. I also just turned off SmartScan, because that's super creepy."

Expensify knows that too many failures to read a crumpled or blurry receipt will cause customers to abandon its product. You don't sign up for a service that can only handle two-thirds or three-quarters of your month's expenses. So it's sensible to create a backup system to handle the edge cases. But it's no surprise that Expensify, like any other young company, would seek to cut costs and contract out this work, in this case to some random guy sitting at home in his pajamas, willing to sign a flimsy NDA on Mechanical Turk. The consequences of these decisions range from a merely sketchy regard for privacy to all-out alarm bells, once one realizes that this leaky approach can lead to everything from HIPAA violations (when it comes to medical information) to outright criminality (consider a network that gathers scanned images from multiple Turk contractors in search of identity information). In the meantime, Expensify played coy in response to inquiries as to who, exactly, was eyeballing clients' receipts; not a good look, as the kids say these days.
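The mechanism described above, automated OCR with a human safety net for the receipts the software cannot read, boils down to a simple fallback pattern. The sketch below is hypothetical, not anything Expensify has disclosed: the run_ocr stub, the HumanReviewQueue, and the 0.85 confidence threshold are all assumptions of mine.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class OcrResult:
    text: str
    confidence: float  # 0.0-1.0, whatever the OCR engine reports

def run_ocr(image_bytes: bytes) -> OcrResult:
    """Stand-in for a real OCR engine (Tesseract, a cloud vision API, etc.).
    Here it fakes a low-confidence read so the fallback path is exercised."""
    return OcrResult(text="", confidence=0.40)

class HumanReviewQueue:
    """Stand-in for whatever routes an unreadable image to a human
    transcriber: an internal tool, or a Mechanical Turk task like the one
    sketched earlier."""
    def __init__(self) -> None:
        self.pending: List[bytes] = []

    def submit(self, image_bytes: bytes) -> None:
        self.pending.append(image_bytes)

def extract_receipt_text(image_bytes: bytes,
                         queue: HumanReviewQueue,
                         threshold: float = 0.85) -> Optional[str]:
    result = run_ocr(image_bytes)
    if result.confidence >= threshold:
        return result.text         # the genuinely automated path
    queue.submit(image_bytes)      # crumpled or blurry: hand it to a person
    return None                    # the text arrives later, typed by a human

if __name__ == "__main__":
    queue = HumanReviewQueue()
    text = extract_receipt_text(b"<receipt image bytes>", queue)
    print("automated text:", text)
    print("receipts waiting for human eyes:", len(queue.pending))
```

Everything interesting, of course, happens in that queue; it is where the pajama-clad contractor comes in.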

Other manifestations of this behind-the-curtain phenomenon play a far more important role in technology that is coming down the pike. Specifically, driverless cars not only rely on state-of-the-art instrumentation and machine vision processing to sense their environment, but also require small armies of humans to tag what the sensors record so that the neural networks can be correctly ‘trained'. This human component tends to be underreported, which is a bit odd, given the homogeneous nature of most technology reporting. A common example is this CityLab piece, which is happy to grapple with the ins and outs of challenges like signage recognition, but never mentions the human component in the data preparation strategy of any of the several companies it covers.

More realistically, the Financial Times gathered a whole passel of quotes from leaders, academics and executives in this field, reflecting how much work it actually takes to present these systems with clean, useful training data (a sketch of what that labelling produces follows the quotes):

  • "Machine learning is a myth, it's all Wizard of Oz type work…The labelling teams are incredibly important in every company, and will need to be there for some time because the outdoor environment is so dynamic."
  • "We need hundreds of thousands, maybe millions of hours of data" for self-driving vehicles to go everywhere, requiring "hundreds of thousands of people to get this thing done" globally.
  • "It is very hard to get people to talk about this… They all like to say it's machine-learning ‘magic'."
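That labelling work is concrete and granular. Here is a minimal, purely hypothetical sketch of one unit of it, once a human annotator has finished: a single camera frame, a handful of bounding boxes, and the worker who drew them. The schema is my own illustration; real pipelines define their own formats.

```python
from dataclasses import dataclass, asdict
from typing import List
import json

@dataclass
class Box:
    label: str   # e.g. "car", "pedestrian", "stop_sign"
    x: float     # top-left corner, in pixels
    y: float
    w: float     # width and height, in pixels
    h: float

@dataclass
class LabeledFrame:
    image_path: str
    boxes: List[Box]
    annotator_id: str  # the human behind the curtain

# One unit of piecework: a single frame that a person has looked at and tagged.
frame = LabeledFrame(
    image_path="drive_0042/frame_001337.jpg",
    boxes=[
        Box(label="stop_sign", x=812.0, y=240.0, w=46.0, h=46.0),
        Box(label="pedestrian", x=400.0, y=310.0, w=60.0, h=140.0),
    ],
    annotator_id="worker_7731",
)

# Serialized, this becomes one line in the training set that a perception
# model consumes.
print(json.dumps(asdict(frame)))
```

Multiply that frame by millions, and the "hundreds of thousands of people" quoted above stop looking like an exaggeration.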

What is striking about all three of these cases (Facebook M, Expensify, and the autonomous vehicle industry) is their reluctance to admit to the continued need for human labor. Given that the gig economy is here to stay, one would think this an opportunity for Silicon Valley types to trumpet their job-creating prowess. But that doesn't seem to be happening. Are humans kept behind the curtain because their presence is considered a failure of engineering, code, business execution, or all of the above? Perhaps the world is just not as technologically fungible as these entrepreneurs have always told us? Or maybe it's simply that all these jobs just suck, and everyone knows it?

*

But the kingdom of digital piecework is much more extensive than any of these examples indicates. In fact, we may be thinking about it in completely the wrong way. What if we are, all of us, doing this work all the time? A recent book by Hamid Ekbia and Bonnie Nardi proposes the umbrella term ‘heteromation' to encompass not just the sort of digital piecework discussed above, but any sort of offering we make while online:

Heteromation includes many kinds of self-service (e.g., check your own groceries), volunteer work such as citizen science, creative engagement (e.g., in computer gaming), microwork (Amazon Mechanical Turk), writing online customer reviews, and a wide range of other activities that provide economic value to companies and organizations, but little or no monetary compensation to laborers—that is, most of us…

People are already "working alongside machines"—but they are not getting paid for it, or not very much. The debate is a false battle between AI and people—when in fact the critical issue is that people are working, but the terms of labor are radically changing. Seduced into (gaming, social media…) or forced into (self-service, gatekeeper apps like Academia.edu…) participation in heteromated labor, we supply [that labor]; capitalism has, remarkably, found its way to free or very cheap labor with absolutely no obligations to laborers as human beings with basic needs.

Consider the platforms that would simply not exist without that collective effort: not just Facebook or Twitter, but also review sites like Yelp, or the crowdsourced traffic displays on Google Maps. The question of how little technological ‘progress' there would be if we paid this labor more is only a starting point. It yields to a larger question: would we have any of the current stable of technologies if we valued labor at all? And we cannot even begin to consider how to value our labor if we cannot agree on whether to value it. That is, if we consider the conveniences afforded to us by these platforms to be commensurate with the effort we put into collectively building them, then any further discussion is moot, which seems to be where we are these days. So it is ironic that such a subtle yet thorough restructuring of work is transpiring right before our very eyes at the same time as other rumblings suggest that childcare, caregiving, and other traditionally unpaid activities should finally be recognized as labor and valued as such.

In the end, it's cold comfort to maintain, however correctly, that workers are rarely shut out of the employment sweepstakes permanently and unconditionally. But as machines learn to do more, labor is relegated to those marginal activities that machines have not yet mastered. Workflows are fragmented and never not under threat. So even if it may take years for automation to encompass an entire process, labor will still lay a claim to it, however diminishing, until its participation is purely administrative, or even incidental. Returning to the place occupied by truck drivers, Finn Murphy, a truck driver himself and author of The Long Haul, writes:

The only human beings left in the modern supply chain are truck drivers. If you go to a modern warehouse now, say Amazon or Walmart, the trucks are unloaded by machines, the trucks are loaded by machines, they are put into the warehouse by machines. Then there is a guy, probably making $10 an hour, with a load of screens watching these machines. Then what you have is a truckers' lounge with 20 or 30 guys standing around getting paid. And that drives the supply chain people nuts.

Truckers aside, a reader may justifiably grouse that one employee monitoring an otherwise automated process is pretty much an automated process. And while we're not very close to the joke at the beginning of this piece, we're probably not that far off, either. Of course, it would be funnier if there were a dog involved, but there's a slim chance it'll make it through the last round of layoffs.