Why don’t computers work? | Financial Times

It’s a well-researched oddity of the past few decades that as technology gets faster, people get slower. Digitisation of the workforce has failed to do what it promised and there is no agreement as to why.

Economist Robert Solow summarised the problem in 1987, saying: “You can see the computer age everywhere but in the productivity statistics.” Data keep proving his point. All measures of IT spending have kept trending higher, yet since 2005 rates of labour productivity growth have at least halved in the US, UK, Japan, Germany and France.

This paradox (sometimes called the productivity puzzle 2.0, in a nod to a similar trend in the 1970s and 1980s) excites plenty of debate. It might be, as Robert Gordon has argued, that recent technological advances just aren’t that great relative to history. Perhaps, as Jonathan Haskel and Stian Westlake argue, the measure itself is becoming obsolete. Other popular theories involve some combination of structural headwinds, mismeasurement, lag effects, fiscal suppression and a long-term return to the mean. What none of them captures is why constant incremental improvements, rather than arresting a weakening trend, appear to be contributing to it.

A paper from University of Lausanne PhD student Seda Basihos makes an interesting contribution to the debate. (Note: unreviewed preprint, there be dragons.) She argues that because of rapid obsolescence, computing is a uniquely pernicious force.

Computers are the worst thing to happen to the global economy in 150 years because . . . well, you will have probably guessed already. Every digital fix has a knack for creating three new problems. Any tweak threatens to invoke the recursive loop of pointless labour. A PC might look modular but it’s a morass of potential incompatibilities and performance bottlenecks, meaning entire corporate systems are junked whenever a software update or a withdrawal of OEM support prematurely terminates the usefulness of one part. And because of this accelerated replacement cycle, workers have to continually relearn their jobs.

Basihos’s paper takes as its starting point Microsoft’s launch of Windows 95. Exposure to the Brian Eno start-up sound coincided with a brief uptick in US worker productivity. In the longer term, however, the “permanent obsolescence shock” that followed might be responsible for roughly one-third of lost productivity growth, Basihos finds.

She suggests thinking of the economy as an airline, where jets are capital stock and pilots are labour. Any replacement jet part that isn’t like-for-like creates a potential mismatch, and every mismatch raises the likelihood of a plane ending up in the sea.

Airlines will generally try to crash no more frequently than their competitors, because planes that crash are very inefficient both in terms of capital allocation and labour productivity. A well-functioning, competitive market places the onus on airlines to keep pace with whatever incremental improvement any one airline rolls out, even when it requires Ship of Theseus style replacement of the whole fleet.

The pilots, meanwhile, have to retrain on new systems or retire. But retraining pilots isn’t such a priority, because learning to fly takes ages and the CEO keeps on promising Level 5 autonomy. The result: the income share going to capital increases, fewer new labour tasks are created so labour’s income share declines, and measured productivity goes into a tailspin.

As well as positing a tidy solution to Solow’s paradox, the paper touches on tech rot as a possible explanation for the dislocations between R&D spending and GDP growth, between wages and productivity, and between business investment and interest rates. Though it’s some distance from a fully worked thesis, it’s something to consider when contemplating the $10tn or thereabouts of equity wealth created by tech obsolescence (justifiable or by design) over the past couple of decades.

[The author typed this post on a 2008 Lenovo T500 running Windows Vista.]