Presenters, titles & abstracts
(as of 16th May 2005)
Aalborg University, Denmark
University of Turin, Italy
Technological knowledge as an essential facility
The notion of essential facility and mandated interconnection can be applied to the governance of technological knowledge through the adoption of liability rules and a parallel reduction in the exclusivity of patents. Because knowledge is simultaneously an output of and an input into the production of new knowledge, this institutional innovation can improve the governance of technological knowledge and increase both its rate of dissemination and its rate of generation.
University of Jena, Germany
University of Canterbury, New Zealand
CEPREMAP, Paris, France
A theory of effective complexity and its relevance to ill-structured situations, including those found in evolutionary economics
Kurt Dopfer and Jason Potts
University of St Gallen, Switzerland and
University of Queensland, Australia
The scientific structure of evolutionary macroeconomics
University of Pisa, Italy
The evolutionary micro-foundations of macroeconomic dynamics
University of Oslo, Norway
The competitiveness of nations
Why do some countries grow much faster and have much better trade performance than others? What are the crucial factors behind such differences, and what can governments do to improve the relative positions of their economies? This paper outlines a synthetic framework, based on Schumpeterian logic, for analyzing such questions. Four different aspects of competitiveness are identified: technology, capacity, costs and demand. The framework is applied to a sample of 100 countries between 1993 and 2002.
Horst Hanusch and Andreas Pyka
University of Augsburg, Germany
The Neo-Schumpeterian research programme (provisional title)
University of Manchester, UK
University of Notre Dame, USA
Markets come to bits: evolution, computation and markomata in economic science
Based upon my previous historical work, I attempt to isolate and identify what appears to be a profound shift in the conception of the economy in recent economic research, focusing on five areas: mechanism design, zero-intelligence agents, ‘market microstructure’, engineering economics and artificial intelligence. The shift identified concerns treating markets as diverse algorithms, and will have profound effects upon the conceptual frames used to address the economy. Rather than deal in vague imponderables, the paper proceeds to sketch the emergent outlines of the implicit alternative program of an evolutionary computational economics, constructed from the theory of automata, which situates the problematic existence of diverse market species at the very center of the research agenda rather than, as happens all too frequently, relegating it to the margins of modern economic thought. The laws that are sought under the new paradigm are laws of the markets, not laws of human nature.
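The idea of "markets as diverse algorithms" can be made concrete with a minimal sketch of one hypothetical market species: a continuous double auction that repeatedly matches the best bid against the best ask. The matching rule and the midpoint-pricing convention here are purely illustrative, not drawn from the paper.

```python
# A market viewed as an algorithm: one illustrative "market species",
# a continuous double auction matching best bid against best ask.
def double_auction(bids, asks):
    """Match crossing orders; each trade clears at the midpoint price."""
    bids = sorted(bids, reverse=True)   # highest bid first
    asks = sorted(asks)                 # lowest ask first
    trades = []
    while bids and asks and bids[0] >= asks[0]:
        trades.append((bids[0] + asks[0]) / 2)  # midpoint price
        bids.pop(0)
        asks.pop(0)
    return trades

# One bid (10) crosses the lowest ask (8); the rest do not cross.
print(double_auction([10, 9, 7], [8, 9.5, 12]))  # -> [9.0]
```

Other market species (posted-price, sealed-bid auction, dealer market) would be different algorithms over the same order data, which is precisely what makes their diversity a research object.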
Volterra Consulting, London, UK
Information cascades and the distribution of economic recessions in capitalist economies
I consider in this paper the distribution of the cumulative size of recessions in 17 capitalist countries over the period 1871-1994, using data on annual percentage changes in real GDP. A recession is defined as a year in which GDP growth is negative, and the cumulative change is the change from peak to trough during a recession period. I examine both the whole sample and different partitions of the data.
The null hypothesis that the size distribution of recessions follows an exponential distribution is never rejected at the conventional level of statistical significance, p = 0.05. However, there are always a small number of large recessions, no matter how the data is partitioned, which are not well fitted by a least-squares regression of the log of size on the rank of size. In other words, in a qualitative sense we see a bimodal distribution of recessions, with an exponential fit to the bulk of the data, and a second peak describing a small number of very large recessions.
A previously published agent-based economic theory model of the business cycle, calibrated purely on US data, is able with no change in its parameters to generate an exponential distribution of the size of recessions very close to that which is actually observed. In this model, information flows between agents on a completely connected network.
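The exponential null hypothesis above can be checked with a simple rank-size diagnostic. The sketch below uses synthetic data, not the 17-country dataset: for exponentially distributed sizes, the r-th largest observation is approximately proportional to log(N/r), so an ordinary least-squares fit of size against log rank recovers (minus) the mean. The mean of 3 per cent and the sample size are purely illustrative.

```python
import math
import random

random.seed(42)

# Hypothetical recession sizes (per cent of GDP lost), drawn from an
# exponential distribution with mean 3 -- a stand-in for real data.
mu = 3.0
sizes = sorted((random.expovariate(1.0 / mu) for _ in range(500)),
               reverse=True)

# For an exponential distribution, size_r ~= mu * ln(N / r), i.e. size
# is linear in log(rank).  Fit size on log(rank) by least squares.
n = len(sizes)
x = [math.log(r) for r in range(1, n + 1)]   # log rank
y = sizes
xbar = sum(x) / n
ybar = sum(y) / n
slope = (sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
         / sum((xi - xbar) ** 2 for xi in x))

# If the exponential hypothesis holds, the slope is close to -mu.
print(round(slope, 2))
```

On real data, the handful of very large recessions would show up as systematic departures from this fitted line, which is the bimodality described above.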
University of Macerata, Italy
Knowledge and macro-coordination
This paper discusses the conditions for economic coordination at the macro level. Its focus is on the implications for the scope and the features of public action. Its main contention is that cognition has a key role both in terms of the internal consistency of the economy and in terms of the consistency between the economy and the society it is a part of.
The first part of the paper provides a brief outline of how major approaches to economic thought discuss coordination and consistency of the economy at the macro level. It suggests that these approaches can be distinguished in terms of the knowledge that they believe actors should share. This shared knowledge consists in a range of issues that actors need to interpret in a similar way if the economy and/or society is to achieve its goals.
Different theoretical approaches provide different definitions of knowledge. They define their field of inquiry, however, thereby explicitly or implicitly assuming what actors need to know, i.e. what issues are deemed relevant for them. In this perspective, theories range from the neoclassical approach – where agents only need to share data, which they will eventually compute – to the institutionalist approach of authors like K. Polanyi and K.W. Kapp, who suggest that societal disruption may be avoided only if shared knowledge includes what is required for its external consistency, i.e. its relation with the society it is a part of.
Based on this taxonomy, the next step in the paper consists in classifying different types of economic crisis in terms of the failure of the economy to meet some of the shared knowledge requirements. It therefore turns out that the theoretical approaches with a broader shared knowledge requirement are likely to provide more exhaustive accounts of crises and more appropriate policies. They are presented as the alternatives that arise when the more restrictive approaches fail both as explanatory devices and as means to outline a convincing policy strategy.
The last part of the paper focuses on the implications for public policy. It argues that policy makers must enable actors to achieve the shared knowledge they require, as individuals and as members of a community, by acting upon the appropriate institutional contexts. The issue, here, is what the relevant institutional contexts are. This is tantamount to asking what the actors need to be aware of, that is, what choices are relevant to them. Thus, if all that actors need is information on prices, the relevant institutions are likely to be those that enhance market institutions. Conversely, if the relationship between economy and society is at issue, democratic institutions are probably more relevant.
Apparently policy makers should simply enhance shared knowledge in its broadest sense: what is required for the economy’s internal, as well as external, consistency. The problem, here, is that the less restrictively relevant knowledge is defined, the more difficult it is to reduce it to technical issues, independent of value judgements. Greater knowledge implies greater awareness of alternative economic and societal priorities, and consequent attempts to change the status quo. This, in turn, may affect vested interests and generate social and economic turbulence. A paradox thereby emerges: policy makers should enhance shared knowledge in order to prevent economic crises but, in the process, they may actually precipitate the very crisis they are trying to prevent.
Volterra Consulting, London, UK
Macro uncertainty and monetary policy
The Bank of England has introduced fan charts around the published forecasts for real GDP growth and inflation. These represent the Monetary Policy Committee’s view of the probability of observing outcomes that differ from those of the central forecasts. Specifically, they represent the range within which the Committee expects the outcome to fall 90 per cent of the time.
The fan charts reflect an implied probability density function of the outcomes that is based on the judgment of Monetary Policy Committee members. In this paper we compare the empirical evidence with the judgmental views of the Monetary Policy Committee. We do this by analyzing post-war data on growth and inflation in the UK economy, and use a variety of standard techniques to calibrate empirical probability density functions.
We find that the MPC judges that the probability of growth or inflation becoming negative is very close to zero, whereas the empirical evidence suggests that there is approximately an 11 per cent chance of the former and a 13 per cent chance of the latter.
We also consider why the MPC judgment might differ and apply the same empirical techniques to Australian and US data, to establish whether the risks of recession and inflation/deflation are similar to those in the UK.
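One of the simplest ways to calibrate such empirical probabilities can be sketched as follows. The growth series below is synthetic (drawn from an illustrative normal distribution), not the actual post-war UK data; the two estimators shown, historical frequency and the left tail of a fitted normal density, are standard techniques of the kind the paper refers to.

```python
import math
import random

random.seed(0)

# Hypothetical post-war annual real GDP growth rates (per cent);
# a stand-in for the actual UK series used in the paper.
growth = [random.gauss(2.5, 2.0) for _ in range(55)]

# Empirical calibration 1: the historical frequency of
# negative-growth years.
p_empirical = sum(g < 0 for g in growth) / len(growth)

# Empirical calibration 2: fit a normal density and take its
# left-tail mass below zero, Phi(-mean/sd).
mean = sum(growth) / len(growth)
sd = math.sqrt(sum((g - mean) ** 2 for g in growth) / (len(growth) - 1))
p_normal = 0.5 * math.erfc((mean / sd) / math.sqrt(2))

print(round(p_empirical, 2), round(p_normal, 2))
```

The contrast in the paper is between such data-based tail probabilities and the near-zero tail probabilities implied by the MPC's judgmental fan charts.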
INRA/SERD, Grenoble, France
Technical University, Eindhoven, The Netherlands
Towards a framework for evolutionary macroeconomic scenario analysis
Evolutionary economics is on the rise in academia as well as in (selected) policy circles. Most of the theoretical work is focused on heterogeneity at the micro level, and the way in which meso-level selection mechanisms work upon micro-level heterogeneity. This includes a wide variety of simulation models based on population dynamics.
Despite the publication of John Foster’s Evolutionary Macroeconomics (1987), one of the fields in which evolutionary economics has traditionally not been represented very strongly is macroeconomics, especially applied macroeconomic modeling for concrete policy purposes. Large-scale macroeconometric models are (still) used widely for policy evaluation and forecasting. Recently, work in this field has been shifting towards models addressing macroeconomic interaction at the global scale, i.e., world macroeconomic models have been emerging (e.g., the WORLDSCAN model at CPB, the Netherlands).
It is argued that evolutionary economics can offer insightful models for this field, especially with regard to long-run scenario analysis. Using an evolutionary model changes the focus of the macro simulation exercises, from a predictive focus (in traditional models) to an ex ante scenario-based explorative evaluation. I will discuss the methodology common to this field, and how evolutionary economics fits into it. I will then present a concrete macroeconomic simulation model for the economy of the Netherlands. This model is based on a combination of (Post) Keynesian modeling principles (such as Thirlwall’s balance-of-payments constraint on growth, Pasinetti’s structural dynamics and Leontief’s input-output framework), as well as evolutionary ideas such as replicator dynamics and the crucial role of technological change.
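The replicator-dynamics component mentioned above can be illustrated with a minimal sketch. The fitness values, step size and number of sectors below are purely illustrative and are not taken from the model: shares (e.g. of competing technologies or sectors) grow in proportion to their fitness relative to the population average.

```python
# Discrete replicator dynamics over market shares: a share grows
# when its fitness exceeds the share-weighted average fitness.
def replicator_step(shares, fitness, dt=0.1):
    mean_fit = sum(s * f for s, f in zip(shares, fitness))
    new = [s * (1 + dt * (f - mean_fit)) for s, f in zip(shares, fitness)]
    total = sum(new)                      # renormalize to sum to one
    return [s / total for s in new]

shares = [1 / 3, 1 / 3, 1 / 3]            # three competing technologies
fitness = [1.0, 1.2, 0.9]                 # illustrative fitness levels
for _ in range(200):
    shares = replicator_step(shares, fitness)

# Selection concentrates the market on the fittest technology.
print([round(s, 3) for s in shares])
```

In a macro model of this kind, the fitness values would themselves evolve with technological change, so selection reshapes the sectoral structure over time rather than converging once and for all.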
Max Planck Institute, Jena, Germany
Evolutionary economics -- how much on demand?
This paper will look into whether there are any insights evolutionary economics can offer for understanding the demand/consumption side of macroeconomics. I will argue that there is probably only a limited contribution to be expected for the short-run equilibrium/-disequilibrium dynamics of the Keynesian effective demand hypothesis type. However, more may have to be said about longer-term effects of the evolution of demand on economic growth, structural change, and employment. (These issues may come closer to Alvin Hansen and the post-WW II debate on satiation and under-employment).