Concern is on the rise regarding financial uncertainty, its form and its nature. Is it merely exacerbated during a temporary crisis? Or, with the most unexpected phenomena always possible, is any attempt at forecasting doomed to be vain? It turns out both commonplace statements miss the point: financial uncertainty is permanent, malleable, and resistant; it is by no means an abstraction that could be dismissed by means of calculation. Far from being constant, its structure varies throughout history: it depends on the institutional frameworks that allow the flow and recording of economic information.
In economics, what figures record is not limited to the activity taking place in an exchange or the value agreed there. They also translate, in numerical form, the conditions under which the representation of this value or activity took place. Said conditions are a complex mix of institutional frameworks and regulations, but also of fine-grained arbitrations such as accounting entries. The body of these rules and conventions forms a framework on which people agree, and reassessing them is usually regarded as a pointless exercise.
However, while in business practice it is obviously preferable not to re-examine these conditions constantly, economists would do well to take a keener interest in them. As a result of their training, and also out of habit, they have a predilection for calculation. In doing so, they tend to forget that such numerical material is but an abstract commodity.
Admittedly, some economists do pay attention to the influence of institutions. They attempt to measure their impact, and attribute functions to them. But the important point here is not about objects that a scholar could correctly size up from a distance. It has to do, rather, with the very idea of scientific objectification. For the formal abstraction of a phenomenon necessarily involves transformations governed by conditions that are both institutional and conceptual.
It should be made clear that this is not exactly an alteration of the phenomenon. There can be no alteration, given that there is no initial, pure economic phenomenon to begin with that would or could be disrupted by the study. What we are talking about here is the intellectual construction of the phenomenon. For example, what makes us say that one activity belongs to labor and another to recreation; or the distinction that we make between a for-profit transaction and a nonprofit transaction; or, finally, the difference established by economists between unemployment and employment statuses, according to the rules laid down by institutions such as the ILO, the United States Census Bureau or the British OPCS. Extending this logic further, it could well be argued that what we call unemployment, a common notion that everyone agrees on, is merely an intellectual construct.
The point here is not to juggle with paradoxes, but to recall that the abstract construction of whatever objects economists study is gained at the cost of a combination of calculations and institutions.
Nonetheless, economic theory claims a scientific aim. So it should seriously consider its own theoretical assumptions, and especially remember this: figures are but the most convenient grip that phenomena offer, nothing more. In this regard, real progress should be noted. Over the last two decades, researchers have shown new insight, a certain way of questioning the relevance of the categories they use. But we still do not have a new economic framework to replace the standard theory. And even those most active in questioning the foundations of economics hardly ever venture outside their field. Historians study the past forms of economic activity, historians of science investigate the formation of calculation processes, and sociologists question the relationship between institutions and reasoning. While such deconstructive efforts are useful, we are still a long way from a genuine overhaul. However, a few breakthroughs can be found here and there. Let us take a topical example and focus more closely on the Stock Exchange, and more specifically on a factor that is pivotal both for the comprehension and for the functioning of financial markets: the concept of financial uncertainty.
Of the rhythm of exchanges and price memory
However coherent, the activity of a stock market takes different forms depending on the way the information mobilized to describe it is organized, processed, and calculated using abstract indicators. An economist can only grasp it by choosing, in a well-considered manner, a given state of the combination of institutions and calculations. In doing so, he runs the risk of ending up with uncontrolled circular arguments. Another, safer way is to work with controlled experimental variations of such combinations.
Does financial uncertainty retain the same form if one changes the settings within which variations of this activity play out? Spontaneous responses will tend to be affirmative.
Anyone inclined to doubt the impact of institutional givens will say that, uncertainty being what it is, whether you measure it one way or the other will not change a thing. In doing so, that person would revive an unconvincing posture of old, scholastic in kind, which consisted, as a matter of principle, in ruling out empirical observation...
Whoever has heard of fractals will also be inclined towards an affirmative answer, but for other reasons: if the randomness in question were actually fractal in structure, it would indeed yield the same shapes at various aggregation levels. Yes, but... the question is not whether to stick to a powerful and convenient model: it is to establish empirically whether or not the structure of uncertainty recorded by stock prices depends on the institutional frameworks in charge of its registration...
But financial uncertainty, as it happens, does not retain the same form: it varies with the underlying conditions through which its objectification is calculated. This is what empirical experimentation demonstrates, as we are about to see.
This empirical approach was trailblazed nearly seventy years ago by sociologist Maurice Halbwachs, who pointed to the importance of memory in price formation: “In the stock exchanges where securities are traded, whose prices change not only from day to day, but during the same session, from one hour to the next, all the forces that modify the opinion of buyers and sellers are immediately felt. They also immediately convey that there is no other way to guess or predict what the prices are going to be other than guiding oneself after what they were at the closest moment. As we move away from circles where the trade activity is the most intense, economic memory slows down, relies on a more remote past, and falls behind the present time.” (The Collective Memory, repr., 1997, p. 222-223).
That observation is at once quite realistic and still relevant: today, activity at the trading desk has exactly this character, its pace even faster than in the past for obvious technological reasons. Beyond such highly specialized and interconnected places, instruments that display updated prices at low frequencies have little meaning, except perhaps for ill-timed investors. Similarly, comments on stock markets in the general media always lag behind actual financial activity.
Empirical experimentation will therefore consist in varying the time frames implicit in the registration of economic data. In this matter, little data is available, despite the current mass of information: for each transaction or aggregate of transactions, we have a time and a rank in a database. Little information, but it will suffice.
In fact, a date provides a marker inscribed on a universal chronological base: calendar time, clock time. As for rank, it provides another chronological clue: the succession of transaction registration acts, a discrete and cumulative sequential time. One can easily imagine two scenarios showing the difference between the two temporal frames of reference. The first is a quiet period for the security or set of securities considered: exchanges come one after the other. The second case comes at a time of overheating for the security or securities in question: during the same clock time, the very high intensity of exchanges is reflected in an increased frequency for successive registrations.
Let us have two observers, each of them embedded in one of the two temporal frames of reference. The one living at the pace of the clock will perceive a period of tranquility in the first case and, in the second, of frenzy. For the other observer, time will tick as a function of the registration of successive transactions in the database. Lacking any other temporal point of reference, he will only be able to see the succession of acts: “this transaction, and then this one, now another one...” and so on, and he will be unable to distinguish the differences in intensity clearly registered by the first observer.
The hardest part is done. All that remains is to apply the conventional calculation of a probability distribution – computing, for a given interval, the variations in profitability of a security or of a basket of securities. The calculation will be carried out on the same data, but twice: once at clock-time intervals (this is the classic method, implicitly set in the frame of astronomical time); and a second time, according to the number of steps between two transactions.
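As a purely illustrative sketch, this double computation can be rendered in a few lines of code. The simulated tick stream, the interval sizes and the price model below are assumptions made for the example, not a reconstruction of the series discussed in this text.

```python
# Illustrative sketch: the same tick data sampled in two temporal
# frames of reference. All figures (tick model, interval sizes)
# are assumptions made for the example.
import random

random.seed(0)

# Simulate a tick stream: (clock_time, price). Ticks arrive at
# irregular clock intervals, so the two frames slice the data differently.
ticks, t, price = [], 0.0, 100.0
for _ in range(5000):
    t += random.expovariate(1.0)              # random waiting time between trades
    price *= 1.0 + random.gauss(0.0, 0.001)   # small multiplicative price move
    ticks.append((t, price))

def clock_returns(ticks, dt):
    """Returns sampled at fixed clock-time intervals (astronomical frame):
    the change is measured at the first trade past each time boundary."""
    out, next_t, last_p = [], ticks[0][0] + dt, ticks[0][1]
    for time, p in ticks:
        if time >= next_t:
            out.append(p / last_p - 1.0)
            last_p, next_t = p, next_t + dt
    return out

def sequential_returns(ticks, step):
    """Returns sampled every `step` transactions (sequential frame),
    regardless of how much clock time those transactions span."""
    prices = [p for _, p in ticks]
    return [prices[i] / prices[i - step] - 1.0
            for i in range(step, len(prices), step)]

r_clock = clock_returns(ticks, dt=10.0)      # classic, calendar-frame series
r_seq = sequential_returns(ticks, step=10)   # transaction-frame series
```

The two series can then be fed to the same distribution calculation; with real tapes, it is the divergence between their shapes that carries the information discussed here.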
All stock readings are well suited to such a double computation. Its application to famous indexes suffices to reveal variations in the contour of financial uncertainty. Below are three graphically rendered examples. In each case, the distribution of the profitability index is calculated in both of the aforementioned frames of reference and compared with the grayed-out area corresponding to a “normal” distribution, that is to say the distribution characteristic of Laplace-Gauss (or Laplace's second law).

First example: how was uncertainty evaluated on the Paris Stock Market in the nineteenth century?
Our first example is that of the Paris Stock Market from 1815 to 1848, up to the eve of the free-trade policy allowed by the Second Empire. It is based on the retrospective reconstruction of an index comparable to today’s CAC40 (based on a 2008 publication by economist and historian Pedro Arbulu) and on its sequential aggregation through volumes of trade, extrapolated from their taxation (based on the work published by his colleague Angelo Riva in 2009). If we consider the profitability index, even at coarse scales, such as monthly aggregates in the calendar frame and aggregates of the same order of magnitude, of 200 000 francs of tax revenue, in the sequential frame, the assumption of a normal Gaussian distribution of uncertainty cannot be supported (see Figure 1, inset).
Let us briefly go into technical details, which will allow us to highlight two points. To assess the “slenderness” of the distribution curve, there is an index of “leptokurtosis” (K), the excess kurtosis. If the distribution is normal according to the Laplace-Gauss law, then K equals 0. The higher the leptokurtic index, the more the distribution is concentrated close to the mean, trimmed around the edges and endowed with “fatter tails”: the curve is finer and sharper than the famous bell curve corresponding to the Gaussian distribution.
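By way of illustration, the index K can be computed directly from its definition as the fourth standardized moment minus the Gaussian benchmark of 3. The two simulated samples below are assumptions made for the example, chosen only to contrast a Gaussian with a fatter-tailed mixture.

```python
# Illustrative sketch of the index K (excess kurtosis): 0 for a
# Gaussian, positive for a more peaked, fatter-tailed distribution.
# The two simulated samples are assumptions made for the example.
import random

def excess_kurtosis(xs):
    """Fourth standardized moment minus 3 (the Gaussian benchmark)."""
    n = len(xs)
    mean = sum(xs) / n
    var = sum((x - mean) ** 2 for x in xs) / n
    m4 = sum((x - mean) ** 4 for x in xs) / n
    return m4 / var ** 2 - 3.0

random.seed(1)
gaussian = [random.gauss(0.0, 1.0) for _ in range(100_000)]
# A crude two-regime mixture: mostly small moves, occasionally large ones.
fat_tailed = [random.gauss(0.0, 1.0) * random.choice([0.2, 3.0])
              for _ in range(100_000)]

print(excess_kurtosis(gaussian))    # close to 0
print(excess_kurtosis(fat_tailed))  # clearly positive
```

Applied to the two series of returns obtained above (calendar frame and sequential frame), this single index already suffices to compare the “slenderness” of the two distributions.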
Practically, this reflects the fact that the security or basket of securities in question has experienced drastic variations. This is precisely what happened in the case of the Paris stock exchange in the 19th century. The observation is also consistent with a finding acknowledged by several authors in the wake of the work of mathematician Benoît Mandelbrot, a finding that has drawn the attention of specialists: it turns out that in things financial, extreme events are less rare than the second law of Laplace would have predicted.
But that is not all. The leptokurtosis of the probability distribution of returns on the Paris stock exchange between 1819 and 1859 is stronger when calculated in a calendar frame of reference than in a temporal frame embedded in the transactions and following their rhythm. It is indeed in the latter case, i.e. following the transactions’ rate and not the clock’s, that the lowest values of the leptokurtic index (K) are obtained among the various calculation methods explored. To anyone with an eye firmly fixed on business, such uncertainty had the sweetness of things financial as described in the seventeenth and eighteenth centuries: the highs and lows tirelessly balanced out and everything revolved around an average.
However, during this long period running from the seventeenth century to the first half of the nineteenth century, which corresponds to the slow formation of economic reasoning and of statistical processes and calculations, people did not use, whether at the shop or at the Exchange, processes explicitly based on the calculus of probabilities developed after Laplace, who synthesized it in the early 1810s (his Analytic theory of probabilities is exactly 200 years old this year). The theory of averages later promoted by Adolphe Quetelet was to be accepted by statisticians, economists and heads of financial institutions only in the second half of the 19th century. Economic agents therefore had no means of statistical objectification to rely upon corresponding to the present principles of dispersion calculus, even though the latter are more sophisticated – nor, of course, to the homologous calculations in sequential time proposed here.
The knowledge that specialists possessed came from being at the heart of transactions. Hence the now classic calculation of the distribution of uncertainty in profitability in Paris, in the early 19th century, in a calendar frame, does not make much sense in retrospect if one seeks to reconstruct the perception of trading activity at the time. Quite the contrary, the sequential-frame calculation is what best translates what someone familiar with the Exchange’s transactions could then perceive. Reasoning relied on guesswork or on the estimation of averages, and roughly matched the perception of uncertainty displayed by connoisseurs of the Parisian market. The Laplace-Gauss apparatus, promoted by Adolphe Quetelet from the 1820s-1830s onwards, therefore had every opportunity to match what the experts of the time had in mind and felt about the uncertainty of economic information.
A second example, Wall Street in recent years: up close, apparent calm; from afar, volatility
What of New York in the early 21st century? Homologous calculations are easily made based on the Dow Jones index (DJIA), for which, in this case, we have both the prices and the volumes of recorded transactions. For the period running from January 5, 2004 to April 27, 2009, for example, one can establish a daily interval in calendar time as well as a comparable volume interval, of $7 billion in transactions, in sequential time. The curves are similar, but they are denser in the central zone and more hollow around it (see Figure 2). The leptokurtic indexes confirm this. Again, one finds what various financial mathematicians have been assessing for over two decades: the dispersion of uncertainty in financial profitability is certainly not consistent with the normal, Laplace-Gauss dispersion. But the comparison between Paris and New York, a century and a half apart, further shows that the very structure of uncertainty is no longer even the same: in the meantime, leptokurtosis has increased – dramatically.
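The volume interval mentioned above amounts to cutting the trade tape into bars of constant traded value rather than constant duration. Here is a minimal sketch of that aggregation; the toy tape and the bar size of 100 are hypothetical figures chosen for the example.

```python
# Illustrative sketch: cutting a trade tape into bars of constant
# traded value (the "volume interval" of the sequential frame).
# The toy tape and bar size are hypothetical figures for the example.
def volume_bars(trades, bar_size):
    """trades: list of (price, traded_value) pairs. Emits one closing
    price each time bar_size of traded value has accumulated."""
    bars, acc = [], 0.0
    for price, value in trades:
        acc += value
        while acc >= bar_size:        # a heavy trade may close several bars
            bars.append(price)
            acc -= bar_size
    return bars

# Quiet trading at first, then a burst of heavy volume: the burst
# produces many bars in very little clock time.
tape = [(100.0, 30.0), (100.5, 40.0), (101.0, 50.0),
        (101.5, 200.0), (102.0, 250.0), (102.5, 30.0)]
closes = volume_bars(tape, bar_size=100.0)
```

In a calendar frame, the burst would be compressed into a single interval; in the volume frame it stretches into several, which is precisely why the two frames yield distributions of different shapes.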
So much so that over the course of exchanges, astronomical time notwithstanding, the structure of financial uncertainty in New York in the early 21st century no longer bears any similarity to a normal curve. An onboard observer following the rate of exchanges as recorded by the Dow Jones, in other words a trader glued to his screens to the point of forgetting the clock, will be subject to that impression of long periods of calm followed by brief moments of turbulence described by professionals and by those who study them – something quite different from the intuition that would be prompted by measured, mundane fluctuations around an easily guessed average.
As for the feeling of a distant observer, who does not follow stock quotations in real time but “from a distance,” at fixed times so to speak, his is a very unpredictable world, falsely stable, where suddenly, at the most unexpected moments, a given quotation collapses or consolidates – but by the time anyone realizes it, it is already too late. Both the commentator and the principal, far from the desk, will therefore be more nervous than the trader, to his great benefit: they will have no choice but to rely on him. The differentiation of uncertainty rates inherent in the flow of economic information therefore induces the formation of an element of the division of labor (i.e. between principals, financial engineers and commentators). It increases the autonomy of experts who are “at the heart of information.”
But one might ask: must we resort to probability-distribution calculations relying on temporal frames of reference that are a tad strange – though very intuitive – to conclude that the stock exchange’s life was more hectic in New York in 2007 than it was in Paris in 1840, and that in the meantime financial engineering has stratified into a professional competence? Granted. But in order to say such a thing, it first had to be established, and we now have access to indexes that are easily computed and compared, that are consistent with standard calculations, and whose empirical scope is broader.
Third example, Wall Street, 1973-2003: from a distance, apparent calm; up close, a storm
Taking a look at the preceding period, that of the late 20th century in New York, is probably the best way to convince oneself that the approach is worthwhile. The calculation is similar to the previous one, but the sequential time interval amounts to $1.5 billion, an order of magnitude comparable to the daily volume of exchanges during the period at hand, from January 2, 1973 to December 31, 2003 (see Figure 3). This period is characterized by distinctly different forms of dispersion of financial uncertainty. Indeed, uncertainty about the profitability of the Dow Jones, assessed along the flow of transaction activity, was much rougher than uncertainty appraised at the calendar’s rate. The gray curve and the black curve are inverted in comparison with the two previous graphs.
“Go with the flow” uncertainty, in times past, was less harsh than that which could be appraised from a distance by referring to the clock and its astronomical time. But over the last decades of the 20th century, it became the opposite: seen up close, financial uncertainty appeared steeper, while from a distance it appeared softer. Yet the only calculations available among professionals were (and still are) those built around the calendar standard. Specialists who peered at screens were using technical instruments that they felt smoothed reality. Meanwhile, seen from afar, principals and commentators did not perceive all this harshness and had to trust the most knowledgeable experts. The latter therefore had every reason to rely on their intuition, to distrust calculations, and thus to indulge in seemingly unbridled speculation, even if it meant rationalizing such behavior by invoking the emergence of a few black swans. As we see, the high opinion they held of themselves was not entirely unfounded. But very shortly afterwards, with a turning point situated roughly around 2004, those same intuitions were no longer so accurate – they probably became more hazardous and, thereafter, more frequently on the losing end.
The relativity of financial uncertainty
Financial uncertainty is not an abstract thing that one could do away with by means of some clever calculation. Neither is it that omnipresent quasi-entity, a power that should convince us that calculation is doomed to misery. Financial uncertainty is a concrete thing, one of the components vital to economic exchanges. It is something malleable whose form, as we empirically grasp it, depends on the institutional frameworks shaping the circulation and registration of economic information and on the calculation processes used to this end. This information, its transformations and its calculations are components of economic activity: they are accounting standards and operations, ranging from the creation of double-entry tables to the balancing of a system of equations or the construction of a stochastic process.
Consequently, scientists must take note that grasping a phenomenon has meaning only in relation to a particular frame of reference, which is not to say – here as in physics – that the phenomenon is inconsistent, far from it. Only by relating it to various frames of reference and then comparing the morphologies can it be circumscribed and can its consistency be highlighted and evidenced. Here, the databases of prices and indexes provide considerable empirical material, of which we have commented on only two highly aggregated series.
Probably, much remains to be done in economics and economic history. In any event, there is a strong relativity – in the epistemological sense – of financial uncertainty with respect to the frameworks implicit in the registration of economic information. Only by taking this strong relativity as a working basis can financial uncertainty be rigorously characterized, and only then can an analysis be outlined of the characteristic movements of financial activities and of the transformations of the social surroundings that contribute to such change.
But each of us holds an intuitive or practical relation (sometimes called “folk knowledge”), and in any event not a learned one, to this uncertainty whose structure is at once consistent, variable and relative. Here the social sciences (sociology, ethnography, economics, economic history) find their material for inquiry. The very sophisticated financial calculation processes that we know today thus appear, in the end, as extensions of these non-formalized skills: they are distinguished only by their affiliation to a particular frame of reference, universal time. Sometimes intuition is better than calculus, and sometimes not...