2013-08-22

There were two interesting articles I read (among others) in recent days that attack mainstream economic analysis in different ways. The first, published August 18, 2013 – Removing deadweight loss from economic discourse on income taxation and public spending – is by Northwestern Economics Professor Charles F. Manski. He wants our profession to dump all its negative welfare analysis about the impacts of taxation (as “deadweight losses”) and, instead, focus on the benefits that come from government spending, in particular, highly productive infrastructure provision. So, an attack from the inside!

The second article was a Bloomberg Op Ed (August 21, 2013) – Economists Need to Admit When They’re Wrong – by the theoretical physicist, Mark Buchanan, who has taken a set against my profession in recent years. Not without justification and with some panache, one should add.

They both add up to the same conclusion – mainstream economics is defunct and we should decommission teaching programs throughout the world and introduce new progressive approaches to the discipline that will produce a new breed of useful PhD graduates, rather than the moribund graduate classes that get rubber-stamped out of our higher education institutions, ad nauseam, at present.

Charles Manski’s article is an interesting twist. He clearly sees the “anti-tax rhetoric evident in much lay discussion of public policy” as being informed by the “prevalent negative language of professional economic discourse”.

He notes that economists “regularly write about the ‘inefficiency’, ‘deadweight loss’, and ‘distortion’ of income taxation”, which, of course, stems from the textbook models they deploy and the pedagogy they use and pass on to students.

It is a self-perpetuating dynasty – students are imbued with an anti-government zeal from day one of their economics courses because they are forced to believe that the so-called model of perfect competition is an applicable benchmark of welfare optimality and that government policies – by construction – lead to departures from that utopia.

The language is then all value-laden – these departures are bad because the benchmark is good. Students are led to believe these departures are inefficient – which is taken to mean wasteful, destructive and loss-incurring.

Efficiency is the perfectly competitive outcome (except for the concession that in some rare cases there is so-called “market failure”, which justifies public intervention). How pervasive might market failure be? No real answer is given – sometimes, but clearly rarely.

Charles Manski quotes Harvard academic, Martin Feldstein from 1999:

The traditional method of analysing the distorting effects of the income tax greatly underestimates its total deadweight loss as well as the incremental deadweight loss of an increase in income tax rates.

I provided my opinion of Feldstein in this blog – Martin Feldstein should be ignored. Feldstein was one of the economists featured in the 2011 investigative movie – Inside Job – which the Director Charles Ferguson said was about “the systemic corruption of the United States by the financial services industry and the consequences of that systemic corruption.”

Charles Manski writes, in relation to the standard “welfare economics” that provides so-called mathematical proofs that government intervention (taxation in this case) is bad:

… prominent applied public economists continue to take the theory quite seriously … [and that] … The Feldstein article and similar research on deadweight loss appear predestined to make income taxation look bad … It focuses attention entirely on the social cost of financing government spending, with no regard to the potential social benefits.

A recurring theme relates to whether the public really understands the benefits of government service and infrastructure delivery. Even within a mainstream framework there is virtually no emphasis on these benefits.

From a Modern Monetary Theory (MMT) perspective, the idea that taxation funds government spending is inapplicable. Please read my blog – Taxpayers do not fund anything – for more discussion on this point.

But from a mainstream framework, which pretends to evaluate resource usage on the basis of costs and benefits, the scarcity of research on the benefits of government spending relative to the so-called costs of taxation is telling.

I used to ask students to imagine a world where everything was private, from the moment you left your front gate in the morning to the time you arrived back there in the evening. In this world, every transition would have to be negotiated via contracts with private providers. Imagine that at each street corner you had to negotiate a price and other terms to enter the next stage of your journey. A trivial example, but it only gets more complex from there.

But Charles Manski’s point is worth thinking about. He says that:

If applied economists are to contribute fair and balanced analysis of public policy … it is essential that we jointly evaluate taxation and public spending within a framework that restricts attention to feasible tax instruments and that makes a reasonable effort to approximate the structure of the actual economy.

He then specified how that might be done, drawing on the early 1970s work of James Mirrlees. In that work, value-laden concepts such as “inefficiency, deadweight loss, and distortion”, which prejudice our thinking against government spending, were absent, and taxation and spending were evaluated within a consistent framework to consider net outcomes.

While I won’t go into considerable detail on Charles Manski’s recommendations relating to a social-welfare function approach as the “only normative concept required for evaluation of public policy”, his main point can be understood when he says:

… the essential feature of my research is the transparent way that it characterises how public spending on infrastructure may enhance private productivity … [and that] … Increasing the tax rate is socially beneficial if and only if the additional infrastructure spending enabled by the tax rise yields a more than commensurate increase in wages.
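One stylized way to write that condition (my notation, a sketch rather than Manski’s actual model): let $w(t)$ denote the wage as a function of the tax rate $t$, with the dependence running through the infrastructure that the tax finances. A marginal tax rise is then socially beneficial if and only if after-tax income rises:

$$\frac{d}{dt}\Big[(1-t)\,w(t)\Big] > 0 \quad\Longleftrightarrow\quad (1-t)\,w'(t) > w(t)$$

In words, the wage gain generated by the extra infrastructure must more than offset the higher tax take – the “more than commensurate increase in wages”.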

So while his work remains firmly in the mainstream paradigm, it clearly recommends a break from the manic adherence to the perfectly competitive welfare economics that dominates public policy making.

I wonder what Charles Manski would think if he dropped “the use of tax revenue to finance public spending on infrastructure” as his starting point?

Once one jettisons that flawed assumption, the rest of the analytical framework would also fall apart. But it is salutary that a mainstream economist recognises the loaded terminology of the mainstream approach, which distorts the public debate and leads to poor public policy.

The importance of his observation is that much of the deficit-terrorist rhetoric that journalists love to promote is fed by the economists whom Manski is criticising.

Even within the mainstream approach – characterised by the assumption that taxes fund government spending – the deficit terrorists would have no oxygen if they heeded the words of Charles Manski.

The second article, as noted, is not an insider’s contribution. Mark Buchanan, a theoretical physicist, has written a number of highly critical and, at times, humorous articles about the mainstream of my profession.

In this article, he notes that “scientific activity” is not about proving one’s theories are right (the so-called truth pursuit that belongs in religion) but rather:

… what matters most is figuring out what’s wrong — an endeavor in which the economics profession has been failing spectacularly.

I have long taught my students in econometrics that, at best, all they are doing is establishing that the current conjecture is both data-consistent and, hopefully, the best explanation currently around.

The words “tentatively adequate” are an apt description of how students should think of their applied results.

Mark Buchanan quotes a fellow-physicist:

Unscientific ideas, by contrast, have a bloblike ability to conform to any set of facts. They are difficult to prove wrong, and so don’t teach us much.

He notes that it was a common feeling in the early days of the GFC that mainstream “economic thinking” was profoundly wrong (“profound errors”) and that notions such as the “wonderful efficiency and inherent stability of modern markets, all supposedly supported by volumes of sophisticated mathematics” could be “finally jettison(ed)” due to their deep flaws and new ideas could emerge.

I thought that too and have been somewhat amazed at the capacity of the public to continue accepting the hogwash from my colleagues. How many more times will they be wrong?

Mark Buchanan correctly notes that:

The dominant paradigm in macroeconomics recovered remarkably quickly, leading one to wonder if any conceivable turn of events could falsify the prevailing faith. Much of academia went into complete denial, while some people suggested that a few minor tweaks may be necessary to put things back on track.

He cites an example of recent research that is at the more unbelievable end of the rubbish that comes out every day from mainstream economists.

This article – Inflation in the Great Recession and New Keynesian Models – from some economists at the Federal Reserve Bank of New York is a classic!

Only read it if you are good at mathematics and want to waste 30 minutes of your life.

Its brief is to resurrect the standard DSGE model, which failed dramatically to predict the crisis. They accomplish this by adding what they call “financial frictions” and then adding some variations on the standard “marginal prior distributions” that the previous DSGE models used – what they call a “looser prior, one that has less influence on the model’s forecasting performance”.

They also “fix parameters” to make the theoretical model tractable as a state-space representation.

In English? The original made-up priors (beliefs about model parameters) clearly gave shocking forecasting outcomes, so they had to loosen them up a bit to get better forecasting performance.

In other words, fudge after fudge with no theoretical or behavioural justification provided for the values they chose.
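To see the mechanics of how a prior can steer an estimate, here is a minimal sketch – a textbook conjugate-normal example of my own, not the New York Fed’s DSGE machinery – showing that a tight prior drags the estimate towards the researcher’s chosen value, while a “looser prior” lets the data speak:

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in "data": draws from a normal distribution with known sigma.
theta_true, sigma, n = 1.0, 2.0, 40
y = rng.normal(theta_true, sigma, n)

def posterior_mean(prior_mean, prior_sd):
    """Posterior mean for a normal likelihood (known sigma) under a normal prior."""
    prec_prior = 1.0 / prior_sd**2   # precision contributed by the prior
    prec_data = n / sigma**2         # precision contributed by the data
    return (prec_prior * prior_mean + prec_data * y.mean()) / (prec_prior + prec_data)

print(f"sample mean                : {y.mean():.3f}")
print(f"tight prior centred on 5.0 : {posterior_mean(5.0, 0.1):.3f}")   # prior dominates
print(f"loose prior centred on 5.0 : {posterior_mean(5.0, 10.0):.3f}")  # data dominate
```

Choosing the centre and the tightness of each prior until the output looks right is exactly the degree of freedom being exploited.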

I have spent many years estimating econometric models and I would like to think I know all the tricks and all of the ways in which specific desired results can be produced from a given dataset.

The capacity to produce whatever is desired increases in a DSGE framework because of all the numerical priors that can be imposed on the model solution and the results it produces.

Best practice would tell us that there should be solid grounds for imposing any prior and that the process of model selection, estimation, and forecasting should be entirely transparent and able to be replicated by anyone who has access to the same data.

I doubt very much whether I could replicate the results of this paper very easily. That is not because the techniques used are overly complex or that the data is not available. It is just that the fudge is entirely opaque.

Please read my blog from 2009 – Mainstream macroeconomic fads – just a waste of time – for more discussion on DSGE modelling and New Keynesian economics.

Mark Buchanan says of this paper:

… an economic research team announced that after several years of determined effort, they had found a way that standard theory could explain the aftermath of the crisis after all. They managed, with enough tinkering in the workshop, to hammer one of the profession’s beloved mathematical models — known as a dynamic stochastic general equilibrium model — into a form that could produce something crudely like the 2008 financial meltdown and ensuing recession.

He wonders what motivates “such desperate efforts at rationalization”.

The answer he provides (from the work of economic historian Philip Mirowski) is that any departure from this sort of modelling would undermine our status in society:

Much of economists’ authority stems from their claims to insight on which policies will make people better off. Those claims arise from core theorems of mathematical economics — known as welfare theorems — which in turn depend on some wildly implausible assumptions, such as the idea that people are perfectly rational and make decisions with full awareness of all possible futures.

In my blog cited above (Mainstream macroeconomic fads – just a waste of time) I note that the theoretical models that New Keynesian economists build are incapable of dealing with the real world and so ad hoc responses to empirical anomaly quickly enter the fray.

But trying to build real world characteristics (such as a lagged dependence between output and inflation) into their models from the first principles that they start with is virtually impossible.

No New Keynesian economist has picked up this challenge, and instead they just modify their models with a number of arbitrary (empirically-driven) additions.
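The standard illustration is the New Keynesian Phillips curve. The version derived from first principles is purely forward-looking:

$$\pi_t = \beta\, E_t \pi_{t+1} + \kappa\, x_t$$

where $\pi_t$ is inflation and $x_t$ is the output gap. Because that version cannot match the inertia in actual inflation data, a lagged term is simply bolted on to produce the so-called “hybrid” variant:

$$\pi_t = \gamma_b\, \pi_{t-1} + \gamma_f\, E_t \pi_{t+1} + \kappa\, x_t$$

The backward-looking term $\gamma_b\,\pi_{t-1}$ has no grounding in the model’s own first principles – it is there because the data demand it.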

But the point is that once they modify their theoretical models to include some empirical facts the so-called “desirable” welfare properties of the theoretical models disappear.

So, like most of the mainstream body of theory, they claim virtue based on so-called microeconomic rigour, but when that “rigour” fails to deliver anything remotely consistent with reality, they respond to the anomalies with ad hoc (non-rigorous) tack-ons.

So at the end of the process there is no rigour at all – using “rigour” in the way they use it, which is, as an aside, not how I would define tight analysis.

Mark Buchanan correctly notes therefore:

If economists used more realistic assumptions, the theorems wouldn’t work and claims to any insight about public welfare would immediately fall apart. Take a few tiny steps from mathematical fantasy into reality, and you quickly have no theory at all, no reason to think the market is superior to alternatives. The authority of the profession goes up in a puff of smoke.

This is a point not often understood. The real world is nothing much like the theoretical world that mainstream economists hide out in, collecting their pay in secure jobs. Talk about inefficiency and unproductive pursuits!

In this blog – Defunct but still dominant and dangerous – I introduced the work of the late Kelvin Lancaster, an Australian economist who, like many in my profession, ventured to the US for graduate school because that was increasingly thought to be where it was at! Cultural and ideological cringe, mostly.

In 1956 two economists (Richard Lipsey and Kelvin Lancaster) came up with a very powerful new insight which was called the Theory of the Second Best. This was, in fact, a devastating critique of mainstream welfare economics.

You won’t hear much about the theory any more because, like all the refutations of mainstream theory, it got swept under the dirty neoclassical carpet and economics lecturers using homogenised, ideologically-treated textbooks continue blithely as if nothing happened.

In English, the Theory of Second Best basically says that if all the assumptions of the mainstream theory do not hold in a particular situation, then trying to apply the results of the theory in that case is likely to make things worse not better.

So “if one optimality condition in an economic model cannot be satisfied, it is possible that the next-best solution involves changing other variables away from the ones that are usually assumed to be optimal” (Source).

This is very applicable to the use of the model of perfect competition, which requires several assumptions to hold (for example, perfect information, perfect flexibility of prices, perfect foresight, no market power being wielded by firms, workers or anyone else) for the main theoretical insights (results) to have validity.

Virtually none of the required assumptions apply in the real world.

Economists often say that governments should try to dismantle real-world “rigidities” (as they call them) so that they can move the economy closer to (but never reaching) perfect competition – because in their fantasy comic-book texts this is the ideal state.

The Theory of Second Best tells us that if you do not have that “ideal state” and you dismantle one so-called “rigidity” but leave others in place, then you can make things worse.

The other point is that often it is “better” for governments to introduce new “rigidities” to confront existing departures from perfect competition.
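A toy numerical example makes the logic concrete. This is my own illustrative sketch (the functional forms and parameter values are arbitrary): two substitute goods, where good 1 carries an irremovable markup over cost. Pricing good 2 at marginal cost – the textbook “first-best” rule – delivers lower welfare than deliberately introducing a markup (a new “rigidity”) in market 2 as well:

```python
import numpy as np

# Linear demand system from quadratic utility; goods are substitutes (gamma > 0).
alpha, gamma, c = 10.0, 0.5, 2.0
p1 = c + 3.0  # market 1 carries an irremovable markup over marginal cost

def quantities(p2):
    """Invert the inverse demands p_i = alpha - q_i - gamma * q_j."""
    det = 1.0 - gamma**2
    q1 = (alpha * (1 - gamma) - p1 + gamma * p2) / det
    q2 = (alpha * (1 - gamma) - p2 + gamma * p1) / det
    return q1, q2

def welfare(p2):
    """Total surplus: utility of consumption minus the resource cost of production."""
    q1, q2 = quantities(p2)
    utility = alpha * (q1 + q2) - 0.5 * (q1**2 + q2**2) - gamma * q1 * q2
    return utility - c * (q1 + q2)

p2_grid = np.linspace(c, p1, 601)
best = p2_grid[np.argmax([welfare(p) for p in p2_grid])]

print(f"welfare with p2 at marginal cost : {welfare(c):.3f}")
print(f"second-best p2 (above cost)      : {best:.3f}")
print(f"welfare at the second-best price : {welfare(best):.3f}")
```

With these numbers the welfare-maximising price in market 2 sits well above marginal cost (about half the market-1 markup), because steering demand back towards the underconsumed good offsets part of the original distortion.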

The point is that the theory of second-best destroys the capacity of the mainstream economists to use “perfect competition” models as an authority in the policy debate. The text book models have no legitimacy in the policy domain.

That is why you won’t read much about it in the newspapers or other media where economic policy is discussed.

One of my influences is, as I have noted in the past, the great Polish economist Michal Kalecki.

In the 2010 book – Great Thinkers in Economics: Michal Kalecki – by Julio López and Michaël Assous, published by Palgrave Macmillan, we read that Kalecki, a highly skilled mathematician who turned his interest to economics, warned of the misuse of mathematics in economic analysis.

He is quoted as saying (Page 2):

… you should know that you must never use mathematics when you can say the same thing in a simpler way, in common language …

He was also well aware (in the 1930s) of the failures of the textbook models that still dominate today.

In Joan Robinson’s Collected Economic Papers II (page 241) we read:

The general discontent with the complacency of text-book economics found its main expression in Keynes’ General Theory, and the theory of employment was, of course, far more important, both for analysis and for policy, than anything concerned with the theory of individual prices. Keynes himself was not much interested in price theory, but the two streams of thought were combined by Michal Kalecki.

[Reference: Robinson, J. (1960) Collected Economic Papers II, Oxford University Press, Oxford]

The discontent with the standard text-book models is thus not new. We are talking here about a literature from the 1930s. However, my profession has been dogged, if nothing else. The fact is that the theories presented in mainstream textbooks, particularly in relation to microeconomics, had little merit then and have virtually no merit now.

Not a lot has changed.

In a paper by Michal Kalecki’s biographer, George R. Feiwel – On Kalecki’s Theory of Income Distribution – we read (Page 310):

His theory is not merely a deviation or departure from the neo-classical marginal productivity theory … He simply never started from it, but proceeded from a different approach in building his analytical construct and marginal productivity did not enter into his argument … Kalecki did not simply relax the restrictive assumption of universal rule of perfect competition. The model of perfect competition is foreign to his method of attacking economic problems. He argued that only by dropping the untenable assumption of perfect competition and penetrating the real world of industrial and market structures (imperfect competition and oligopoly) can any plausible propositions about determinants of macrodistribution be advanced … Kalecki’s theory of distribution ‘is important both because his theory is important in its own right and because it focuses attention on an aspect of distribution theory which had hitherto been neglected because of the preoccupation of earlier writers with the production function and perfect competition.’

The section in the above Feiwel quote that is itself quoted is from Frank Hahn’s 1972 book – F. H. Hahn, The Share of Wages in the National Income, London, 1972, pp. 2, 35, passim.

[References: Feiwel, G.R. (1974) 'On Kalecki's Theory of Income Distribution', De Economist, 122(4), 309-325; Hahn, F.H. (1972) The Share of Wages in the National Income, London]

The point of that stroll down Kalecki-memory lane is to reinforce the notion that economics should jettison these flawed starting points. Students should only encounter the model of perfect competition in a course on the history of economic thought.

Holding it out as a benchmark for optimal outcomes, against which all policy has to be judged, just sets up the discussion to reject almost any policy intervention as bad.

The language used is prejudicial (recall Charles Manski’s article) and the conclusions drawn are plain wrong.

Time to move on.

Conclusion

There is a coherent macroeconomics available – it is called Modern Monetary Theory (MMT).

There is also a good body of microeconomic literature around – that is marginalised – which builds on the sort of insights that Kalecki (and Marx) and others provided to us many, many years ago.

Why do mainstream economists hang on to such defunct approaches? First, they are so arrogant and thick-skinned they don’t feel the humiliation of being wrong all the time. Second, they hate (as in religiously) the implications that arise when a more coherent economic framework, which is grounded in real world realities, is used.

That is enough for today!

(c) Copyright 2013 Bill Mitchell. All Rights Reserved.
