William Paul Bell, Queensland University Researcher

Why is mainstream economics not a social science but ideological mathematics?


Termination of the history of economics courses contributing to the Global Financial Crisis (GFC)


Helge Nome: “The key to controlling humans does not lie in building fences around them, but in steering their minds away from unwanted questions.”

The elimination of courses in the history of economics has contributed to the Global Financial Crisis (GFC) by eroding the institutional memory that protected the structures designed to prevent a recurrence of the Great Depression, allowing those structures to be dismantled.  With little space in the curriculum for reflection on the past, a diet of neoclassical mathematics produces in graduate economists an extreme form of bounded rationality in which history is both irrelevant and unknown, which makes for a very powerful ideology by steering minds away from unwanted questions.

Real Business Cycle (RBC) and Rational Expectations Hypothesis (REH) contributing to the Global Financial Crisis (GFC) and the Dynamite Prize


This article discusses how neoclassical economics has contributed to the Global Financial Crisis (GFC).  In particular, it examines how two neoclassical theories, Real Business Cycle (RBC) theory and the Rational Expectations Hypothesis (REH), contributed to the GFC, and why these theories are false and unscientific.

Edward C. Prescott and Finn E. Kydland were awarded the 2004 Nobel prize in economics for their work in developing the RBC, and Robert E. Lucas Jr. was awarded the 1995 Nobel prize in economics for developing the REH. All three have been nominated for the Dynamite Prize in Economics, to be awarded to the three economists who contributed most to enabling the GFC.  The nomination cites Prescott and Kydland ‘for jointly developing and popularizing “Real Business Cycle” theory, which by omitting the role of credit greatly diminished the economics profession’s understanding of dynamic macroeconomic processes’, and cites Lucas for ‘his development of the rational expectations hypothesis, which defined rationality as the capacity to accurately predict the future, both served to maintain Friedman’s proposition that monetary factors do not affect the real economy and, in the name of “rigor”, distanced economics even further from reality than Friedman had thought possible.’

G8 or G20 Protests and Computable General Equilibrium (CGE) modelling and its Dual Instability Problem


This article discusses why Computable General Equilibrium (CGE) models are important to the G8 and G20 protests, and why CGE models are unsuitable for policy analysis for two reasons: their lack of microfoundations and the dual instability problem.

First, why are CGE models important to the G8 and G20 protests?  An example of a global CGE model is the Global Trade Analysis Project (GTAP 2009), coordinated by the Centre for Global Trade Analysis, Department of Agricultural Economics, Purdue University.  GTAP (2009) claims that its model provides a common language for global economic analysis, citing the use of GTAP in three of the five quantitative studies at the 1995 conference of the WTO’s Uruguay Round Agreement and in virtually all the quantitative work for the 1999 Millennium Round of Multilateral Trade.  This example indicates the credibility and perceived importance of CGE modelling.
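To see what a CGE model actually computes, stripped of GTAP’s scale, here is a minimal sketch of the core calculation: solving for the relative price at which excess demand vanishes in a toy two-good, two-consumer exchange economy.  The Cobb–Douglas preferences, expenditure shares and endowments are illustrative assumptions, not anything taken from GTAP.

```python
# Minimal sketch of the core CGE computation: find prices that clear markets.
# A toy two-good, two-consumer exchange economy with Cobb-Douglas preferences.
# All parameters are illustrative assumptions.

from scipy.optimize import brentq

# Consumer h spends a fixed share alpha[h] of income on good 1 (Cobb-Douglas).
alpha = [0.3, 0.7]                      # expenditure shares on good 1
endowment = [(1.0, 0.0), (0.0, 1.0)]    # (good 1, good 2) endowments

def excess_demand_good1(p):
    """Aggregate excess demand for good 1 at relative price p (good 2 = numeraire)."""
    z = 0.0
    for a, (e1, e2) in zip(alpha, endowment):
        income = p * e1 + e2            # value of the endowment
        z += a * income / p - e1        # Cobb-Douglas demand minus endowment
    return z

# By Walras' law, clearing the good-1 market clears the good-2 market as well.
p_star = brentq(excess_demand_good1, 0.01, 100.0)
print(f"equilibrium relative price of good 1: {p_star:.4f}")
```

Real CGE models like GTAP solve thousands of such market-clearing conditions simultaneously, calibrated to national accounts data.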

Capital Asset Pricing Model (CAPM) and Efficient Market Hypothesis (EMH) Contributing to the Global Financial Crisis (GFC)


The Efficient Market Hypothesis (EMH) and the Capital Asset Pricing Model (CAPM) are, respectively, a framework and a standard financial tool. Together, they provide a worldview for financiers and shape their decision-making in the financial markets.

Fama (1965; 1970) introduces the EMH in three levels of market efficiency: a strong level, where all relevant information regarding a stock is fully reflected in its price; a semi-strong level, where all publicly available information is reflected in its price; and a weak level, where current prices reflect the full history of past prices.
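The weak level is the easiest to state operationally: if current prices already reflect the full history of past prices, returns should show no serial correlation.  Here is a minimal sketch of such a check, run on synthetic random-walk data rather than real market prices; the sample size and significance band are illustrative assumptions.

```python
# Minimal sketch of a weak-form efficiency check: under the weak-form EMH,
# returns should be serially uncorrelated, so the lag-1 autocorrelation of
# returns should be statistically indistinguishable from zero.
# Synthetic random-walk prices are used purely for illustration.

import numpy as np

rng = np.random.default_rng(0)
log_prices = np.cumsum(rng.normal(0.0, 0.01, size=2500))   # random-walk log prices
returns = np.diff(log_prices)

r = returns - returns.mean()
lag1_autocorr = (r[:-1] @ r[1:]) / (r @ r)

# Under the null of no autocorrelation, the standard error is roughly 1/sqrt(n).
n = len(returns)
print(f"lag-1 autocorrelation: {lag1_autocorr:.4f} "
      f"(approx. 95% band: +/-{1.96 / np.sqrt(n):.4f})")
```

On real data, the same statistic sits at the heart of the standard autocorrelation and variance-ratio tests of weak-form efficiency.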

Fama and French (2004, p. 25) note that the CAPM of William Sharpe (1964) and John Lintner (1965) marks the birth of asset pricing theory (resulting in a Nobel Prize for Sharpe in 1990). Four decades later, the CAPM is still widely used in applications such as estimating the cost of capital for firms and evaluating the performance of managed portfolios. It is the centerpiece of MBA investment courses; indeed, it is often the only asset pricing model taught in these courses.
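For reference, the Sharpe–Lintner CAPM prices asset i as E[R_i] = R_f + β_i(E[R_m] − R_f), with β_i = Cov(R_i, R_m)/Var(R_m).  Below is a minimal sketch of the textbook use: estimating β from return data and computing the model-implied expected return.  The return series and every parameter are simulated and purely illustrative.

```python
# Minimal sketch of CAPM in practice: estimate beta from excess returns, then
# compute the model-implied expected return E[R_i] = R_f + beta * (E[R_m] - R_f).
# The return series and parameters here are simulated, purely for illustration.

import numpy as np

rng = np.random.default_rng(1)
risk_free = 0.0004                                    # per-period risk-free rate (assumed)
market_excess = rng.normal(0.0003, 0.01, size=1000)   # market excess returns
true_beta = 1.2
asset_excess = true_beta * market_excess + rng.normal(0.0, 0.005, size=1000)

# beta = Cov(asset excess return, market excess return) / Var(market excess return)
beta = np.cov(asset_excess, market_excess)[0, 1] / np.var(market_excess, ddof=1)

expected_return = risk_free + beta * market_excess.mean()
print(f"estimated beta: {beta:.3f}")
print(f"CAPM expected return per period: {expected_return:.5f}")
```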

EU acknowledges the failure of traditional economics to predict the crisis, so adopts agent-based modelling


“This long run is a misleading guide to current affairs. In the long run we are all dead. Economists set themselves too easy, too useless a task if in tempestuous seasons they can only tell us that when the storm is long past the ocean is flat again.”
— John Maynard Keynes
A Tract on Monetary Reform (1923), 80.

The EU says that traditional economics has failed to predict the knock-on effects of the financial crisis. The Eurace project is designed to remedy this failure: it uses an agent-based modelling methodology as an alternative to the rational representative agent model that is a cornerstone of neoclassical economics.  The post Progressing from game theory to agent based modelling to simulate social emergence further discusses agent-based modelling.
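To make the contrast with the representative agent concrete, here is a minimal agent-based sketch; it is not the Eurace model, and every rule and parameter in it is an illustrative assumption.  Two populations of traders, fundamentalists and chartists, interact in one asset market, and agents switch between rules based on recent forecast accuracy, so booms and busts can emerge from the interaction itself.

```python
# Minimal agent-based sketch (not the Eurace model): an asset market where
# agents follow either a fundamentalist rule (expect reversion to fundamental
# value) or a chartist rule (extrapolate the last price move). Agents drift
# toward whichever rule recently forecast better, so price swings emerge from
# interaction rather than from a single representative agent.

import numpy as np

rng = np.random.default_rng(42)
fundamental = 100.0
prices = [100.0, 101.0]
chartist_share = 0.5    # fraction of agents currently using the chartist rule

for _ in range(200):
    p, p_prev = prices[-1], prices[-2]
    fundamentalist_demand = 0.2 * (fundamental - p)   # buy below value, sell above
    chartist_demand = 0.8 * (p - p_prev)              # chase the recent trend
    aggregate = ((1 - chartist_share) * fundamentalist_demand
                 + chartist_share * chartist_demand)
    new_price = p + 0.5 * aggregate + rng.normal(0.0, 0.2)

    # Agents drift toward whichever rule forecast this step's price better.
    fund_forecast = p + 0.2 * (fundamental - p)
    chart_forecast = p + (p - p_prev)
    if abs(new_price - chart_forecast) < abs(new_price - fund_forecast):
        chartist_share = min(chartist_share + 0.02, 0.95)
    else:
        chartist_share = max(chartist_share - 0.02, 0.05)
    prices.append(new_price)

print(f"price range over the run: {min(prices):.1f} to {max(prices):.1f}")
```

A representative-agent model collapses this switching dynamic into a single optimiser, which is precisely the feature the agent-based approach is designed to avoid.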

GDP as a proxy for well-being misses the mark


The report prepared for the President of France, Nicolas Sarkozy, by two Nobel prize-winning economists, Joseph Stiglitz and Amartya Sen, proposes ways of improving our measurement of economic performance and social progress (Gittens 2009).  GDP measures the production of an economy, and there are at least three problems with using it as a proxy for well-being.  First, the proxy may only hold for countries outside the OECD, where basics such as shelter, food, access to medical services, and clean water and sanitation are lacking; within the OECD, further production buys little additional well-being.  Second, what is measured becomes a policy target, in this case a misguided target in OECD countries.  Third, GDP becoming a target circumvents the important discussion of what suitable measures of well-being would be.  Equating the level of GDP to the level of well-being reduces the study of economics to an optimisation problem, allowing neoclassical economics the pretence of being scientific.  My post ‘The G8 protests and the logically inconsistent foundations of neoclassical economics’ further discusses this scientific pretence.
[Figure: Inglehart–Welzel cultural map of world values]


The G8 protests and the logically inconsistent foundations of neoclassical economics


Neoclassical economics is deductive, using a mathematical axiom–proof–theory format.  Arnsperger and Varoufakis (2006) list the three basic axioms of neoclassical economics as methodological instrumentalism, methodological individualism and methodological equilibration.  In such an approach the basic axioms have to be correct, otherwise the whole framework becomes unsound.  In contrast to the deductive approach, the scientific approach is inductive, forming theories from observation and using prediction to falsify the theories (Neuman 2003, p. 51).  Neoclassical economists have become adept at avoiding empirical falsification by creating ad hoc explanations for why their theories fail when confronted with empirical evidence; for example, the Efficient Market Hypothesis predicts dividend volatility in excess of price volatility, but the converse is observed (Shiller 1981).  Falsification avoidance is the sign of a degenerative research program (Lakatos 1976).

So, rather than rely on empirical falsification, a more suitable approach to disproving a deductive framework is a logical proof showing that its axioms lead to an absurdity.  The Sonnenschein–Mantel–Debreu Theorem (Debreu 1959) proves the basic axioms of neoclassical economics are logically inconsistent: starting with the first two axioms leads to a shapeless excess demand curve.  A shapeless excess demand curve means there are multiple equilibria and the equilibria are unstable, making the third axiom untenable.  To fix this problem, it is assumed that all goods have constant Engel curves.  A good would have a constant Engel curve if people spent the same proportion of their income on it as their income grew (Keen 2001); the short derivation below makes this precise.  This is an unlikely scenario, because as income grows people consume more luxury goods and basic goods become a smaller fraction of their income.  Can you think of a good with a constant Engel curve?

Colander (2000, p. 3) equates neoclassical economics “to the celestial mechanics of a nonexistent universe” for using theory outside its domain assumption (Musgrave 1981).  That is, neoclassical economics as a pursuit in pure mathematics for intellectual exercise is fine, but claiming applicability to the real world is misleading.
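To make the Engel-curve assumption precise (a reconstruction of the standard argument, not a quotation from Keen): a constant Engel curve means the budget share of each good is fixed, which forces the income elasticity of demand to equal one for every good.

```latex
% A constant Engel curve: good i's budget share s_i is fixed, so with price p_i
% and income y the demand is q_i(y) = s_i y / p_i, a straight line through the
% origin. The income elasticity of demand is then exactly one:
\frac{p_i\,q_i(y)}{y} = s_i
\;\Longrightarrow\;
q_i(y) = \frac{s_i\,y}{p_i}
\;\Longrightarrow\;
\varepsilon_i = \frac{\partial q_i}{\partial y}\,\frac{y}{q_i}
              = \frac{s_i}{p_i}\cdot\frac{y\,p_i}{s_i\,y} = 1 .
```

Unit income elasticity for every good rules out both necessities (ε < 1) and luxuries (ε > 1), which is exactly the luxury-goods objection raised above.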