LONDON – After the 2008 global financial crisis, a consensus emerged that the public sector had a responsibility to intervene to bail out systemically important banks and stimulate economic growth. But that consensus proved short-lived, and soon the public sector’s economic interventions came to be viewed as the main cause of the crisis, and thus as something that needed to be reversed. This turned out to be a grave mistake.
In Europe, in particular, governments were lambasted for their high debts, even though private debt, not public borrowing, caused the collapse. Many were instructed to introduce austerity, rather than to stimulate growth with counter-cyclical policies. Meanwhile, the state was expected to pursue financial-sector reforms, which, together with a revival of investment and industry, were supposed to restore competitiveness.
But too little financial reform actually took place, and in many countries, industry still has not gotten back on its feet. While profits have bounced back in many sectors, investment remains weak, owing to a combination of cash hoarding and increasing financialisation, with share buybacks – to boost stock prices and hence stock options – also at record highs.
The reason is simple: the much-maligned state was permitted to pursue only timid policy responses. This failure reflects the extent to which policy continues to be informed by ideology – specifically, neoliberalism, which advocates a minimal role for the state in the economy, and its academic cousin, “public choice” theory, which emphasises governments’ shortcomings – rather than historical experience.
Growth requires a well-functioning financial sector, in which long-term investments are rewarded over short-term plays. Yet, in Europe, a financial-transaction tax was introduced only in 2016, and so-called patient finance remains inadequate almost everywhere. As a result, the money that is injected into the economy through, say, monetary easing ends up back in the banks.
The predominance of short-term thinking reflects fundamental misunderstandings about the state’s proper economic role. Contrary to the post-crisis consensus, active strategic public-sector investment is critical to growth. That is why all the great technological revolutions – whether in medicine, computers, or energy – were made possible by the state acting as an investor of first resort.
Yet we continue to romanticise private actors in innovative industries, ignoring their dependence on the products of public investment. Elon Musk, for example, has not only received over $5 billion in subsidies from the US government; his companies, SpaceX and Tesla, have been built on the work of NASA and the Department of Energy, respectively.
The only way to revive our economies fully is for the public sector to reprise its pivotal role as a strategic, long-term, and mission-oriented investor. To that end, it is vital to debunk flawed narratives about how value and wealth are created.
The popular assumption is that the state facilitates wealth creation (and redistributes what is created), but does not actually create wealth. Business leaders, by contrast, are considered to be productive economic actors – a notion used by some to justify rising inequality: because businesses’ (often risky) activities create wealth – and thus jobs – their leaders deserve higher incomes. Such assumptions also result in the misuse of patents, which in recent decades have been blocking rather than incentivizing innovation, as patent-friendly courts have increasingly allowed them to be applied too widely, privatizing research tools rather than just the downstream outcomes.
If these assumptions were true, tax incentives would spur an increase in business investment. Instead, such incentives – like the US corporate-tax cuts enacted in December 2017 – reduce government revenues, on balance, and help to fuel record-high corporate profits, while producing little private investment.
This should not be shocking. In 2011, the businessman Warren Buffett pointed out that capital-gains taxes do not stop investors from making investments, nor do they undermine job creation. “A net of nearly 40 million jobs were added between 1980 and 2000,” he noted. “You know what’s happened since then: lower tax rates and far lower job creation.”
These experiences clash with the beliefs forged by the so-called Marginal Revolution in economic thought, when the classical labor theory of value was replaced by the modern, subjective value theory of market prices. In short, we assume that, as long as an organisation or activity fetches a price, it is generating value.
This reinforces the inequality-normalising notion that those who earn a lot must be creating a lot of value. It is why Goldman Sachs CEO Lloyd Blankfein had the audacity to declare in 2009, just a year after the crisis to which his own bank contributed, that his employees were among “the most productive in the world.” And it is also why pharmaceutical companies get away with using “value-based pricing” to justify astronomical drug-price hikes, even when the US government spends more than $32 billion annually on the high-risk links of the innovation chain that results in those drugs.
When value is determined not by specific metrics, but rather by the market mechanism of supply and demand, value becomes simply “in the eye of the beholder”; rents (unearned income) become confused with profits (earned income); inequality rises; and investment in the real economy falls. And when flawed ideological stances about how value is created in an economy shape policymaking, the result is measures that inadvertently reward short-termism and undermine innovation.
The writer is a Professor in the Economics of Innovation and Public Value and Director of the Institute for Innovation and Public Purpose at University College London, and author of ‘The Value of Everything: Making and Taking in the Global Economy.’
Copyright: Project Syndicate