
Scotland’s currency options

One of the most controversial topics in the debate over Scottish independence is the currency question. The other day, the Financial Times asked several economists to consider four options available to an independent Scotland: a currency union with the UK, sterlingisation (the continued use of the pound sterling, but without the backing of the Bank of England), establishing a new Scottish currency, and joining the euro. Not surprisingly, opinions were divided.

Strictly speaking, however, the choice is not limited to these four options; there are others that deserve a closer look in this context as well.

In general, currency regimes differ by

the number of participants. Arrangements may be multilateral (such as the European currency union), bilateral (such as an agreement between two countries to accept one another’s currency as legal tender – before the introduction of the euro, Belgian francs were legal tender in Luxembourg, just as Luxembourg francs were in Belgium) or unilateral, as under sterlingisation or under the current Swiss franc cap vis-à-vis the euro.

the degree of institutionalisation, ranging from a managed float through various forms of currency pegs (with or without rule-bound or discretionary parity changes) to a currency board and a fully-fledged currency union.

the currencies involved. Arrangements may refer to an individual currency such as the US dollar or the British pound or to a currency basket like the Special Drawing Rights (SDR).

the degree of flexibility determining the range and frequency of changes allowed within existing arrangements. (Examples are the bands in the European Monetary System (EMS) which started with +/-2.25% in 1979 and ended up at +/-15% in August 1993.)

the criteria on which occasional changes are based. These include economic indicators such as relative prices or current-account data, foreign exchange reserves, or market prices.

These are the most common currency regimes:

Free float This is a regime where a currency’s exchange rate is allowed to fluctuate without government or central bank interference.

Free floats are rarely observed in practice: even if authorities abstain from buying and selling their currency in the market, they may exert an indirect influence by choosing appropriate economic strategies, signaling their preferences in one way or the other, or bluntly trying to talk the currency up or down.

Managed float In principle, under a managed float exchange rates are determined by supply and demand in the foreign exchange markets, but governments and central banks intervene sporadically or systematically for different reasons.

Although exchange-rate volatility is widely regarded as a problem for trade and economic development, most countries’ currencies are floating, including the US dollar, the Japanese yen, the British pound and the euro.

Unilateral peg Under a unilateral peg countries fix the parity of their currency vis-à-vis a major currency or a basket of currencies.

The chosen currency is typically the leading reserve currency in a region or worldwide and/or the invoice currency of the main trading partners. For Scotland, fixing the parity of a newly established currency to the British pound would be one option; other candidates could be, for example, the US dollar, the euro, or a basket of trading partners’ currencies.

Crawling peg A crawling peg is a fixed-exchange rate system which allows for regular parity adjustments.

Countries which temporarily adopted a crawling peg include Poland (1991-2000) and Hungary (until 2001). According to the IMF, in October 2013 only two countries (Nicaragua and Botswana) had a crawling peg, while 15 had crawl-like arrangements. (Note that the IMF classification of regimes differs slightly from the one chosen here.)

Unilateral currency substitution Under this regime countries adopt a foreign currency in parallel to, or instead of, their domestic currency.

Sterlingisation is one example. Another is dollarisation with the US dollar as substitute.

Multilateral systems of fixed but adjustable exchange rates Countries formally agree to fix the exchange rates of their currencies either vis-à-vis one another or a third currency.

Examples are the Bretton Woods System and the European Monetary System, but also the franc zone. Adjustments usually take place as a reaction to currency crises. There is an element of flexibility through fluctuation margins, or bands, around the parities.

Currency board A currency board goes beyond a mere fixing of the exchange rate. It is a constitutional guarantee of a currency’s foreign value which comprises explicit restrictions on the government’s ability to print money. Currency can be issued only in exchange for the foreign currency against which its rate has been fixed. The advantage of such a system is credibility. The disadvantage is that monetary policy is determined in the country of the reserve currency and the authorities lose the means to shield the economy from shocks. They cannot raise interest rates to defend the value of their currency or to fight inflation, nor act as a lender of last resort in the local currency. If there is a bank run, banks cannot turn to the central bank. However, given the system’s high credibility, crises may occur less frequently and be less severe.
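The issuance rule at the heart of a currency board can be stated in a few lines of code. The sketch below is purely illustrative – the class, the numbers and the 7.8 rate (reminiscent of Hong Kong’s peg) are my own assumptions, not a model of any actual board:

```python
# A minimal sketch of the currency-board issuance rule described above.
# All names and figures are illustrative assumptions.

class CurrencyBoard:
    def __init__(self, fixed_rate):
        self.fixed_rate = fixed_rate      # domestic units per unit of reserve currency
        self.reserves = 0.0               # holdings of the reserve currency
        self.notes_issued = 0.0           # domestic currency in circulation

    def issue(self, reserve_inflow):
        """Domestic currency is created only against reserve-currency inflows."""
        self.reserves += reserve_inflow
        new_notes = reserve_inflow * self.fixed_rate
        self.notes_issued += new_notes
        return new_notes

    def redeem(self, notes):
        """Notes are redeemable at the fixed rate; the board cannot print unbacked money."""
        reserve_outflow = notes / self.fixed_rate
        if reserve_outflow > self.reserves:
            raise ValueError("insufficient reserves: full backing is the whole point")
        self.reserves -= reserve_outflow
        self.notes_issued -= notes
        return reserve_outflow

board = CurrencyBoard(fixed_rate=7.8)      # e.g. HK dollars per US dollar
board.issue(1_000_000)                     # issuance requires a reserve inflow
print(board.notes_issued, board.reserves)  # 7800000.0 1000000.0
```

The constraint in redeem() is what distinguishes a board from a mere peg: the note issue can never exceed what the reserves can back at the fixed rate.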

Currency boards have existed in countries as diverse as Argentina, Hong Kong, Latvia and Estonia. Particularly noteworthy is Hong Kong’s success in withstanding speculative attacks on the HK dollar during various economic shocks and political crises, such as the 1987 stock market crash, the Tiananmen events in mainland China in 1989, the Gulf War in 1990, the speculative attack after the Mexican currency crisis of 1995 and the Asian financial crisis of 1997/98.

Currency union A currency union is a form of currency substitution where all parties agree on the arrangement and adopt a common currency.

As under a currency board, this implies a complete surrender of the monetary authorities’ control over domestic monetary policy. In contrast to a currency board, however, a currency union has no quasi-automatic, built-in credibility, as the euro crisis demonstrates.

As the list shows, an independent Scotland would have more options than most observers currently assume. As a rule, the choice for or against a particular regime is one between flexibility and commitment, with success or failure depending both on the ability to react swiftly to undesirable market movements and on the ability to credibly uphold the system.

The problem is that, in order to run smoothly, most regimes would require an idea of the main determinants of short- and long-term exchange rates so as to decide when and how to alter course, adjust parities or change the rules. In a global environment where exchange rates have become the plaything of all sorts of interdependent financial markets, rather than of the trade flows still dominating traditional views, theory fails to provide a solid framework for policy making, as I wrote elsewhere. As a consequence, authorities tend to react more or less helplessly to market turbulence without a clear concept instead of steering an active and independent course.

Given these conceptual weaknesses, I would generally have a preference for a currency regime which allows for a high degree of flexibility. Weighing the costs of exchange-rate volatility under floating against those which may arise at the breakdown of a system of fixed exchange rates, or the break-up of a currency union, the former seems the more desirable alternative. However, if a more institutionalised approach to limiting currency fluctuations and a stronger commitment are sought, a unilateral or bilateral approach would be preferable to a multilateral regime. My impression is that unilateral or bilateral arrangements are more solid and have a lower risk of failure than multilateral ones. Unilateral arrangements can function well over long periods of time, as the example of Hong Kong demonstrates. Panama has used the US dollar for more than a century, as Joseph Stiglitz stressed. It is the big multilateral constructs of heterogeneous countries with diverging interests – the Bretton Woods System, the European Monetary System, the European Monetary Union – which are fraught with conflicts and uncertainties and always on the brink of collapse.

With respect to an independent Scotland, the transition period would be difficult no matter which currency solution were envisaged. I agree with Frances Coppola that Scotland should have its own currency. In the beginning, a currency union with the rest of the UK could facilitate businesses’ adjustments and limit their costs of transition. But given the heterogeneity of the two economies, keeping the British pound seems no sustainable solution.

I also think that adopting the euro – unilaterally or by joining the eurozone – is no desirable alternative either. The eurozone is an unstable construct which, under a resurgence of market volatility or changing political constellations, may easily fall apart, hurting in particular its smaller members and all those linked to it. The resulting uncertainty is the last thing a new country in transition would need to deal with.

The danger of high currency fluctuations notwithstanding, in my view the flexibility of a floating exchange rate would make it easier to cope with the huge challenges of separation, since it would provide policy with an additional instrument. Like Frances Coppola, I see the advantages of a currency board, but I would advise allowing for a transition period to decide the matter. The big advantage of a currency board is credibility. Its big disadvantage is the loss of monetary control. In a sense, monetary independence is here traded for less dependence on the markets’ varying moods. But who knows? Maybe, in the case of Scotland, after an adjustment period the currency issue would turn out to be less dramatic than expected, the merits of a managed float would have become apparent, and the New Scotland would be spared the choice.

Interconnectedness, uncertainty and innovation – Brian Arthur on economic complexity

The other day, Charles Consult on Twitter drew attention to a fascinating paper by W. Brian Arthur on Complexity Economics.

At a time of never-ending financial crises and lasting imbalances, when people are becoming increasingly aware of the shortcomings of traditional economics with its focus on equilibrium as the natural state of an economy, Brian Arthur presents a different approach: a nonequilibrium view of economic processes and structures. In his paper, acting and interacting firms, consumers, investors and other economic agents collectively create an outcome and then adjust their strategies in response to what they see they have created. The result of these adjustments is another outcome, which causes them to make revisions anew – and so forth.

As a consequence, the economy is constantly in motion. It is not “something given and existing but forming from a constantly developing set of technological innovations, institutions, and arrangements that draw forth further innovations, institutions and arrangements.” In this scenario, equilibrium is the exception, not the rule. As the author notes: “For highly interconnected systems, equilibrium and closed form solutions are not the default outcomes; if they exist they require justification.”


Complexity economics is a field of research which developed at the Santa Fe Institute in the late 1980s and which has won many followers worldwide ever since. As the author emphasizes, it is not just an extension of standard economics but an entirely different way of seeing the economy. Agents “buy and sell, speculate, trade, oversee, bring products into being, offer services, invest in companies, strategize, explore, forecast, compete, learn, innovate, and adapt. In modern parlance we would say it is a massively parallel system of concurrent behavior [my emphasis]. And from all this concurrent behavior markets form, prices form, trading arrangements form, institutions and industries form. Aggregate patterns form.”

“Complexity is about formation—the formation of structures—and how this formation affects the objects causing it.”

Arthur contrasts this view with that of traditional economics, which asked not “how agents’ behaviors would react to the aggregate patterns these created, but what behaviors (actions, strategies, expectations) would be upheld by—would be consistent with—the aggregate patterns these caused. It asked in other words what patterns would call for no changes in micro-behavior, and would therefore be in stasis, or equilibrium.”

He gives three examples:

General equilibrium theory is asking what prices and quantities of goods produced and consumed would be consistent with—would pose no incentives for change to—the overall pattern of prices and quantities in the economy’s markets.

Classical game theory asks what strategies, moves, or allocations would be consistent with—would be the best course of action for an agent (under some criterion)—given the strategies, moves, allocations his rivals might choose.

Rational expectations economics asks what expectations would be consistent with—would on average be validated by—the outcomes these expectations together created.

The advantage of the traditional approach is, as Arthur notes, that it is more amenable to mathematical analysis. He admires it for its finesse and mathematical elegance, but “the construct is too pure, too brittle—too bled of reality.” He argues:

“If we assume equilibrium we place a very strong filter on what we can see in the economy. Under equilibrium by definition there is no scope for improvement or further adjustment, no scope for exploration, no scope for creation, no scope for transitory phenomena, so anything in the economy that takes adjustment—adaptation, innovation, structural change, history itself—must be bypassed or dropped from theory. The result may be a beautiful structure, but it is one that lacks authenticity, aliveness, and creation.”

As agents interact, and in interacting create patterns, and then in turn react to the patterns they have created together, nonequilibrium arises endogenously in the economy; it is not the result of an exogenous disturbance from outside, such as a natural disaster, after which the economy tends to return to a state of equilibrium. Arthur gives two main reasons for this endogeneity. One is fundamental uncertainty, the other technological innovation. Both trigger self-reinforcing processes which prevent the economy from coming to rest:

Fundamental (or Knightian) uncertainty arises from the fact that “all problems of choice in the economy involve something that takes place in the future, perhaps almost immediately, perhaps at some distance of time. Therefore they involve some degree of not knowing. In some cases agents are well informed, or can put realistic probability distributions over events that might happen; but in many other cases—in fact in most cases—they have no basis to do this, they simply do not know. I may be choosing to put venture capital into a new technology, but my startup may simply not know how well the technology will work, how the public will receive it, how the government will choose to regulate it, or who will enter the space with a competing product. I must make a move but I have genuine not-knowingness—fundamental uncertainty. There is no “optimal” move. Things worsen when other agents are involved; such uncertainty then becomes self-reinforcing. If I cannot know exactly what the situation is, I can take it that other agents cannot know either. Not only will I have to form subjective beliefs, but I will have to form subjective beliefs about subjective beliefs. And other agents must do the same. Uncertainty engenders further uncertainty.”

Technological change is the other driver that prevents the economy from coming to a standstill. Arthur refers to Schumpeter and his view of technology as a source of energy. But he regards this force as even more disruptive:

“Novel technologies call forth further novel technologies: when computers arrive, they call forth or “demand” the further technologies of data storage, computer languages, computational algorithms, and solid-state switching devices. And novel technologies make possible other novel technologies: when the vacuum tube arrives, it makes possible or “supplies” the further technologies of radio transmission and receiving, broadcasting, relay circuits, early computation, and radar. And these novel technologies in turn demand and supply yet further technologies. It follows that a novel technology is not just a one-time disruption to equilibrium, it is a permanent ongoing generator and demander of further technologies that themselves generate and demand still further technologies … Notice again the self-reinforcing nature of this process. The result is not occasional disruption but ongoing waves of disruption causing disruptions, acting in parallel across the economy and at all scales within the economy. Technology change breeds further change endogenously and continually, and this throws the economy into a permanent state of disruption.”

The result of interconnectedness, uncertainty and innovation is complexity:

“A picture is now emerging of the economy different from the standard equilibrium one. To the degree that uncertainty and technological changes are present in the economy—and certainly both are pervasive at all levels—agents must explore their way forward, must “learn” about the decision problem they are in, must respond to the opportunities confronting them. We are in a world where beliefs, strategies, and actions of agents are being “tested” for survival within a situation or outcome or “ecology” that these beliefs, strategies and actions together create. Further, and more subtly, these very explorations alter the economy itself and the situation agents encounter. So agents are not just reacting to a problem they are trying to make sense of; their very actions in doing so collectively re-form the current outcome, which requires them to adjust afresh. We are, in other words, in a world of complexity, a complexity closely associated with nonequilibrium.”
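The feedback loop Arthur describes can be illustrated in a few lines of code. The sketch below is written in the spirit of Arthur’s well-known El Farol bar problem rather than taken from the paper: agents forecast an aggregate that their own forecasts jointly produce, and revise their rules whenever the pattern they created defeats them.

```python
# A minimal sketch, in the spirit of Arthur's El Farol bar problem, of agents
# reacting to an aggregate outcome they collectively create. All parameters
# are illustrative; this is not the model from the paper.
import random

N, CAPACITY, ROUNDS = 100, 60, 20
# Each agent holds a simple predictor: "attendance next time = last time's + bias"
biases = [random.randint(-20, 20) for _ in range(N)]
attendance = 50  # initial aggregate outcome

for week in range(ROUNDS):
    predictions = [attendance + b for b in biases]
    decisions = [p <= CAPACITY for p in predictions]  # go if you predict "not crowded"
    new_attendance = sum(decisions)
    # Agents whose predictor misled them revise it -- adjusting to the very
    # pattern that all of them together have just produced.
    for i, went in enumerate(decisions):
        crowded = new_attendance > CAPACITY
        if (went and crowded) or (not went and not crowded):
            biases[i] = random.randint(-20, 20)       # discard and redraw the rule
    attendance = new_attendance
    print(week, attendance)  # the aggregate keeps moving; no equilibrium is imposed
```

Each round’s outcome is both the product of the agents’ rules and the input to their next revision – the endogenous churn Arthur has in mind.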

The main part of the paper deals with the implications of this view for theorizing which are truly challenging. But read for yourself …

W. Brian Arthur: Complexity Economics: A Different Framework for Economic Thought, Institute for New Economic Thinking (INET) Research Note #033, March 2013.

The social value of HFT – Salmon, Stiglitz and the problems of HFT reconsidered (2)

In a recent article, Felix Salmon summarized the key points of a widely noticed speech delivered by Nobel prize-winner Joseph Stiglitz on the problems of financial innovation in general, and high-frequency trading (HFT) in particular.

Stiglitz asked Are Less Active Markets Safer and Better for the Economy? and came to the conclusion that they are. His arguments revolved around three aspects: speed, costs and social value.

In the first part of this article, the speed aspect was discussed. This second part deals with the costs and social value of HFT.

As we have seen, HFT proponents emphasize the advantages of reduced spreads, higher liquidity and improved price discovery for individual investors. Joseph Stiglitz, however, doubts that these advantages will ultimately benefit the economy as a whole. In his view, high-frequency trading is in principle a zero-sum game, in that the profits made by some market participants are the losses experienced by others. Taking into account that the technology costs real resources, HFT becomes a negative-sum game for the economy (a simple arithmetic sketch of this claim follows the four points below). In particular, Stiglitz argued that

(1) there are limits to the social value of faster price discovery. The individual investor may benefit from obtaining information before someone else, but his private return can derive, at least in part, from taking “rents” that would otherwise accrue to others:

“… if sophisticated market players can devise algorithms that extract information from the patterns of trades, it can be profitable. But their profits come at the expense of someone else. And among those at whose expense it may come can be those who have spent resources to obtain information about the real economy. These market players can be thought of as stealing the information rents that otherwise would have gone to those who had invested in information. But if the returns to investing in information are reduced, the market will become less informative. Better “nanosecond” price discovery comes at the expense of a market in which prices reflect less well the underlying fundamentals. As a result, resources will not be allocated as efficiently as they otherwise would be.”

(2) Improved price discovery does not necessarily lead to better resource allocations. Stiglitz argued that knowing information this much faster does not improve the allocation of capital to one industry or another:

“Those making real decisions, e.g. about how much to invest in a steel mill, are clearly unlikely to be affected by these variations in prices within a nanosecond. In that sense, they are fundamentally irrelevant for real resource allocations.”

(3) Increased market volatility is harming the economy. The point is made by other authors as well. Commenting on Michael Lewis’s Flash Boys: Cracking the Money Code (2014), Satyajit Das, for example, argued that “increased trading volumes increase volatility, which actually has a detrimental effect on capital formation and investment.” Or, as Felix Salmon put it: “… faster price discovery is generally associated with higher volatility, and higher volatility is in general a bad thing, from the point of view of the total benefit that an economy gets from markets.”

(4) High-frequency trading is producing social waste. Stiglitz wrote: “There is an additional level of distortions: the informed, knowing that there are those who are trying to extract information from observing (directly or indirectly) their actions, will go to great lengths to make it difficult for others to extract such information. But these actions to reduce information disclosure are costly. And, of course, these actions induce … traders to invest still more to figure out how to de-encrypt what has been encrypted. If, as we have suggested, the process of encryption and de-encryption is socially wasteful—worse than a zero sum game—then competition among firms to be the best de-encryptor is also socially wasteful. Indeed, flash traders may have incentives to add noise to the market to disadvantage rivals, to make their de-encryption task more difficult. Recognizing that it is a zero sum game, one looks for strategies that disadvantage rivals and raise their costs. But of course, they are doing the same.”
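The zero-sum-to-negative-sum arithmetic underlying these four points is simple enough to spell out with illustrative numbers (all figures hypothetical, mine rather than Stiglitz’s):

```python
# Illustrative numbers only: in a zero-sum trading game, gross gains and
# losses cancel; subtracting real resource costs leaves a net loss overall.
gains_fast = +100.0    # profits of the faster traders (hypothetical millions)
losses_slow = -100.0   # mirror-image losses of everyone they trade against
tech_spend = 30.0      # real resources spent on speed (hardware, towers, fees)

net_for_economy = gains_fast + losses_slow - tech_spend
print(net_for_economy)  # -30.0: a negative-sum game once resource costs count
```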

The first argument is reminiscent of the distinction economists tend to make between “fundamentalists” and “chartists” (as described here in Chapter 5). “Fundamentalists” are strongly influenced by economic theories. They gather firm-specific data and information about macroeconomic developments in order to form expectations about future prices. In contrast, “chartists”, or investors relying on technical analysis, assume implicitly or explicitly that prices already reflect all relevant information available. They try to identify systematic patterns in financial series which may be exploited for trading.

In practice, there are countless variants of both fundamental and technical approaches. Both are frequently applied side by side – albeit with different weights – at all time scales. In both cases investors spend considerable resources to obtain the respective information. Furthermore, the way they form expectations about future market movements and the time horizon – years, months, hours or less – say nothing about motives. Those gathering information about the “real economy” may be no more involved in non-financial “real” business than those studying patterns, and both may make use of algorithms. Therefore, in principle, the “stealing” of information rents Stiglitz described should work both ways.

The second point, that faster information does not necessarily lead to better resource allocation, seems less controversial at first sight. A similar argument was made, for example, by James Allworth in High Frequency Trading and Finance’s Race to Irrelevance. He wrote:

“The broader point that has been missed in the discussion around HFTs is that they actually have very little impact on how companies are run. Because HFT firms are holding stocks for milliseconds, they’re not ever in a position where they’re voting on corporate governance issues. They have no real interest in the underlying fundamentals of the stock. As long as the stock is trading — regardless of whether it’s going up or down — HFTs can take their cut. Because of this dynamic, executives have no reason to pay any attention to them when making decisions.

In terms of the real world of building businesses and creating value, basically, HFTs don’t matter.”

The latter statement may hold for very short-term variations of a company’s own stock price, although (1) the relation between short- and longer-term market dynamics, and the way in which short-term influences feed into longer-term price movements, is far from clear, and (2) whether a stock is trading at all may depend at times on high-frequency traders’ willingness to make a market.

The argument certainly does not hold for firms in their roles as investors and traders in foreign exchange, financial and commodities markets in the course of running their business, where they benefit from the advantages of an HFT presence. For many years now, finance for “real” business has been a sort of by-product of the financial industry. Producers, merchants, investors and customers are able to trade in markets which are considered to offer the best prices because they are wide, deep and liquid. No matter whether faster information, reduced costs of trading and higher liquidity result from high-frequency trading or from other activities – remember the observation by Maureen O’Hara in High Frequency Market Microstructure that all trading is now fast – for companies these advantages can be expected to make a difference and to influence both daily decisions and the firms’ strategic choices.

The third point is based on assumptions about the economy which may or may not hold. Regarding the relationship between speed and volatility, Stiglitz referred to insights gained from his own theoretical research:

“There is a big debate over whether HFT or the other areas of hyperactive trading (cross border flows, derivatives) results in more or less volatility. As a matter of theory [my emphasis], we have made two observations: First, improved price discovery at a moment of time, by definition, increases price dispersion at that moment. Second, opening up new betting opportunities among those with heterogeneous beliefs results in more consumption volatility, which in turn can lead to more macro-economic volatility.”

Whether the theory is right in this case, and whether the relation between price discovery and macroeconomic volatility is as described – and is a dominant influence – is open to dispute. In his summary of recent theoretical and empirical research, Charles M. Jones, for instance, pointed to another important chain of causes and effects, stressing the role of stock market liquidity as “the main channel by which HFT can have societal value”. Accordingly, higher liquidity leads to a decline in transaction costs and

“explicit transaction costs affect share prices, because they subtract from returns every time a share of stock is bought or sold. A buyer knows that she will have to sell one day and incur transaction costs. She also knows that the investor who buys from her will have to pay transaction costs when he buys and again when he sells, and so on down the line. Thus, share prices should be reduced by the present value of all expected future transaction costs. Conversely, anything that permanently reduces transaction costs should permanently increase share prices.”

And:

“When they are justified, higher share prices are valuable for the economy, because they lower the cost of capital for firms. With a lower cost of capital, more investment projects are profitable, and firms should increase their level of investment. Greater investment should lead to higher levels of GDP and a better standard of living.”
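Jones’s present-value channel is easy to make concrete. Here is a minimal sketch with assumed numbers – the cost per round trip, holding period and discount rate are illustrative, not from Jones:

```python
# A minimal sketch of the present-value channel Jones describes: share prices
# should be lower by the discounted sum of all expected future transaction
# costs. All numbers are assumptions for illustration only.
def price_discount(cost_per_trade, holding_years, discount_rate):
    """PV of an infinite chain of trades, one every `holding_years` years."""
    per_period_factor = (1 + discount_rate) ** holding_years
    return cost_per_trade / (per_period_factor - 1)   # geometric series sum

# Suppose a $0.10 round-trip cost per share, a 2-year holding period, 5% rates:
before = price_discount(0.10, 2, 0.05)
# If competition halves the trading cost, the discount shrinks -- which, other
# things equal, means a permanently higher share price.
after = price_discount(0.05, 2, 0.05)
print(round(before, 3), round(after, 3))  # 0.976 -> 0.488 per share
```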

But, following Joseph Stiglitz’s fourth point, one would argue that social waste may reduce or eat up whatever benefit an economy gets from high-frequency trading. Felix Salmon quoted the following example Stiglitz gave:

“If there is one umbrella, and there is a 50/50 chance of rain, if neither of us has any information, the price will reflect that risk. One of us will get the umbrella. If it rains, that person will be the winner. If it does not, the other person will be the winner. Ex ante, each has the same expected utility. If, now, one person finds out whether it’s going to rain, then he is always the winner: he gets the umbrella if and only if it rains. If the other person does not fully understand what is going on, he is always the loser. There is a large redistributive effect associated with the information (in particular, with the information asymmetry), but no real social benefit. And if it cost anything to gather the information, then there is a net social cost.”
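Stiglitz’s umbrella story can be put into numbers. The following sketch (my own, with an assumed umbrella value of 10 and an information cost of 1) reproduces his point: information redistributes the surplus without enlarging it, and costly information makes the game negative-sum.

```python
# A worked version of Stiglitz's umbrella story with assumed numbers: one
# umbrella, a 50/50 chance of rain, and an umbrella worth 10 to whoever
# holds it when it rains, 0 otherwise. The price paid for the umbrella is
# omitted, since it only redistributes money between the two bidders.
import random

def expected_payoffs(informed, info_cost=0.0, trials=100_000):
    a = b = 0.0
    for _ in range(trials):
        rain = random.random() < 0.5
        if informed:
            a_gets_umbrella = rain                   # A buys if and only if it rains
        else:
            a_gets_umbrella = random.random() < 0.5  # symmetric: ex ante identical
        value = 10 if rain else 0
        a += value if a_gets_umbrella else 0
        b += 0 if a_gets_umbrella else value
    return a / trials - info_cost, b / trials

print(expected_payoffs(informed=False))               # ~(2.5, 2.5): equal ex ante
print(expected_payoffs(informed=True, info_cost=1.0)) # ~(4.0, 0.0)
# With information, A captures the whole surplus and B always loses out; total
# surplus is still 5, minus the information cost: redistribution plus a net
# social cost, exactly as in the quote.
```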

In Slow Your Judgments on Fast Trading Noah Smith made a similar point. He asked:

“What is the social benefit of all that expenditure? Does the reduction in bid-ask spreads — the much-vaunted liquidity provision of HFTs — really help companies know when to invest, buy back shares, issue dividends, etc.? Does it really help people use markets to share risk? It isn’t clear. We’ve understood at least since Jack Hirshleifer’s famous 1971 paper that the private benefit of getting information can exceed the social benefit. In other words, the billions HFTs spend in order to beat people to the punch by a couple of milliseconds might not be boosting the economy as a whole.”

In the paper Noah Smith mentioned, Hirshleifer, too, gave an example of the discrepancy between the private and the social benefit of information. He wrote:

“These considerations may be clarified by reference to a well-known activity for the generation of public information – horse racing. Viewed as a research activity, horse racing may be presumed to have a small positive value: the identification of faster horses works “to improve the breed.” This consideration is evidently a very minor motivating factor for the activity in comparison with the opportunity to speculate upon one’s supposedly superior knowledge. Without differences of opinion, it is said, there would be no horse races. That is, the social value is insufficient to motivate research – the activity is founded upon the contradictory expectations of speculative gain.

Suppose that it costs $100 in real resources to run a horse race, and that the social advantage of knowing which is the fastest horse is just $5. Evidently, if the race is run society is engaging in excessive research. Now imagine that the potential speculative gain, to an individual convinced that his horse is truly faster, is just $90 – he could still not earn enough, himself, to cover the costs of the race. But if several individuals are so convinced, each about his own horse, they may cooperate to stage the experiment. So conflict of beliefs may enormously compound the speculative factor that, even from the point of view of a single individual, tends to promote excessive investment in information-generating activity.”
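Hirshleifer’s figures can be checked line by line; the sketch below simply spells out the arithmetic of his example:

```python
# Hirshleifer's horse-race arithmetic, spelled out. All figures are from his
# example: the race costs $100 to run, its social value is $5, and each
# convinced owner privately expects a $90 speculative gain.
race_cost = 100
social_value = 5
private_expected_gain = 90   # per owner; the expectations are contradictory,
                             # since only one horse can actually be fastest

# No single owner would fund the race alone:
print(private_expected_gain >= race_cost)   # False

# But two owners, each expecting $90, will happily split the $100 cost:
owners = 2
share = race_cost / owners                  # $50 each, below the $90 expected gain
print(share < private_expected_gain)        # True -> the race gets staged

# Society, meanwhile, spends $100 of real resources for $5 of social value:
print(social_value - race_cost)             # -95: "excessive research"
```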

It is hard for HFT proponents to counter these examples of horses and umbrellas, as they are crude thought experiments, specifically designed to make the point against the social value of betting, which have no equivalent in real markets. But it is interesting to note the different ideas Stiglitz and Hirshleifer have of what determines social value.

In the Stiglitz example, social value is derived from the allocation of an umbrella via the price or market mechanism. In the example, this works perfectly well. Two persons are bidding for the umbrella, and the one getting it, having more advance knowledge than the other, can make the best use of it.

Why does this result not count as social value? Because the motives of the winning person are doubted and his price discovery based on superior information is not considered price discovery “in the relevant sense” – although the umbrella IS allocated and fulfils its purpose.

Society as a whole is not benefitting in this case – Felix Salmon is wonderfully clear about this – because, of the two bidders, one is deemed worth less than the other. It is an “algobot” who benefits from the trade and not a “real-money investor”.

If you ask me, this is not science. This is propaganda.

In the horse racing example, Hirshleifer assumes from the beginning that there is an exogenously determined positive social value, independent of the position of individuals and independent of the cost of running a race which, by assumption, is high but borne by the speculators. The author simply states that the social value is small. The cost of production, i.e. of running the race, is contrasted not with this assumed positive social value but with the sum of individual gains, which must be expected in advance to cover it – otherwise the race would not take place. Again, the example raises more questions than it answers. Shouldn’t society in this case rather encourage betting, because it allows it to benefit from “research” it otherwise could not afford? There are many parallels to the HFT world – but maybe not in the sense Noah Smith had in mind.

What are the costs which, in Stiglitz’s and other authors’ views, render HFT a negative-sum game for society? Stiglitz focused on the costs associated with efforts to reduce information disclosure, and with the process of encryption and de-encryption. Others stressed the fees exchanges charge for co-location, data feeds, and order types. Izabella Kaminska drew attention to the cost of storing, transmitting and analyzing data, which increased with rising quote traffic. In 2011, in Beware the market spam, she cited Eric Hunsader, founder of the data services company Nanex, who wrote that this cost “increases much faster than the rate of growth: that is, doubling the amount of data will result in much more than a doubling of the cost. For example, a gigabit network card costs $100, while a 10 gigabit network card costs over $2,000. A gigabit switch, $200; a 10 gigabit switch, $10,000.”
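The superlinear scaling implicit in Hunsader’s price points can be made explicit. Assuming, purely for illustration, that cost follows a power law in bandwidth, his two examples imply exponents well above one:

```python
# The superlinear cost scaling in Hunsader's examples, quantified under the
# assumption (mine, for illustration) that cost ~ bandwidth**k fits his
# two data points for each device.
from math import log

for name, (cheap, expensive) in {
    "network card": (100, 2_000),    # 1 Gbit ~$100 vs 10 Gbit ~$2,000
    "switch":       (200, 10_000),   # 1 Gbit ~$200 vs 10 Gbit ~$10,000
}.items():
    k = log(expensive / cheap) / log(10)   # bandwidth grows 10x in both cases
    print(f"{name}: cost ~ bandwidth^{k:.2f}")
# network card: cost ~ bandwidth^1.30
# switch: cost ~ bandwidth^1.70
```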

In the last paragraph of the epilogue to Flash Boys, Michael Lewis aroused the curiosity of his readers by referring to Federal Communications Commission license number 1215095, a number he discovered on a metal plate attached to a fence around a microwave tower in the countryside. In 1215095 – The Flash Boys Mystery Solved, Themis Trading, a fierce critic of high-frequency trading, lifted the secret, thereby hinting at one of the biggest and most debated sources of cost related to HFT: the efforts to reduce delays in information propagation between physically distant exchanges and to minimize latency, defined as the overall time it takes to receive signals from a trading venue, make trading decisions, and transmit the resulting order messages back to the trading venue (Charles M. Jones).

As Themis Trading explained, the microwave tower Michael Lewis referred to is located in Potter Township in Pennsylvania where it is used to beam stock quotes between Chicago, location of the Chicago Mercantile Exchange (CME), and Carteret, New Jersey, the primary data centre for Nasdaq OMX Group (NDAQ).

New York and Chicago are the two great trading places which emerged in the United States by “historical accident”, as Jerry Adler put it in Raging Bulls: How Wall Street Got Addicted to Light-Speed Trading. While equity markets reside in the New Jersey / New York area, derivatives such as futures and options are mostly traded in Chicago – “720 miles apart as the photon flies – about 3.9 milliseconds at the speed of light” (Jerry Adler quoting Katie M. Palmer). Information between the two cannot flow faster than this.
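The 3.9 millisecond figure is straightforward to reproduce; a minimal sketch (my own arithmetic, using the vacuum speed of light):

```python
# Reproducing the back-of-the-envelope number: the one-way light-speed limit
# over the quoted 720-mile distance between Chicago and the New Jersey area.
C_KM_PER_S = 299_792.458      # speed of light in vacuum, km/s
MILES_TO_KM = 1.609344

distance_km = 720 * MILES_TO_KM
one_way_ms = distance_km / C_KM_PER_S * 1_000
print(f"{one_way_ms:.2f} ms")   # ~3.87 ms -- 'about 3.9 milliseconds'
```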

In Information Transmission Between Financial Markets in Chicago and New York, Gregory Laughlin et al. identified two phases of latency decrease in recent years. The first occurred with the introduction of a latency-optimized fiber-optic connection in late 2010. The second is attributed to microwave networks, operating primarily in the 6-11 GHz region of the spectrum, licensed during 2011 and 2012. The authors estimated “the total infrastructure and 5-year operations costs associated with these latency improvements to exceed $500 million.”

In a “race to zero”, traders try “to whittle away the difference between the speed their orders travel at and the speed of light. Zero, the point at which that difference would disappear, has become a kind of holy grail”, as Scott Patterson observed in High-Speed Stock Traders Turn To Laser Beams. “It’s all about being as straight a line as possible. Pull the string as tight as you can without causing it to break”, wrote Eric Onstad, quoting Jock Percy, CEO of Perseus Telecom, the firm which established the first microwave link between London and Frankfurt in 2012.

Scott Patterson reported that according to estimates by research firm Tabb Group in 2013 market players worldwide spent about $1.5 billion on technology to increase trading speeds, nearly double the amount spent in 2009.

These figures are contrasted with the – highly variable – industry-wide estimates of HFT revenues Laughlin et al. reported: “For 2009, considered a peak year for industry profitability, estimates ranged from $7.2 billion to $25 billion. … A leading market analyst suggests that increased competition and lower trading volume have caused a significant decline … from $7.2 billion in 2009 to $1.8 billion in 2012.”

The shortest fiber-optic cable connection between Chicago and the New York area, completed in the mid-1980s, initially had a length of about 1,000 miles, consisting of multiple routes following rail lines, with “time-sucking jogs and detours” (Jerry Adler). It rarely reached a minimum round-trip time, or latency, of 14.5 milliseconds.

In August 2010, Spread Networks completed a new line which, by using more direct routes northwest through central Pennsylvania (see the map in Jerry Adler’s article), shortened the path length to 825 miles and the latency to 13.1 milliseconds. Buying its own rights-of-way, the company spent an estimated $300 million to build its new fiber-optic network.
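As a consistency check (my own, not from the sources): light in glass fiber travels at roughly the vacuum speed divided by the refractive index of silica, about 1.47, and on that assumption Spread’s 825-mile route implies a round trip very close to the quoted 13.1 milliseconds:

```python
# A consistency check on the fiber numbers above, assuming signals in optical
# fiber propagate at c divided by a refractive index of about 1.47.
C_KM_PER_S = 299_792.458
MILES_TO_KM = 1.609344
FIBER_INDEX = 1.47            # approximate refractive index of silica fiber

def round_trip_ms(route_miles):
    km = route_miles * MILES_TO_KM
    speed = C_KM_PER_S / FIBER_INDEX
    return 2 * km / speed * 1_000

print(f"{round_trip_ms(825):.1f} ms")   # ~13.0 ms: Spread Networks' 825 miles
```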

It was this route Paul Krugman referred to, contrasting it with a canceled public infrastructure project, a rail tunnel, in Three Expensive Milliseconds:

“Even as one tunnel was being canceled, however, another was nearing completion, as Spread Networks finished boring its way through the Allegheny Mountains of Pennsylvania. Spread’s tunnel was not, however, intended to carry passengers, or even freight; it was for a fiber-optic cable that would shave three milliseconds — three-thousandths of a second — off communication time between the futures markets of Chicago and the stock markets of New York.”

He added: “spending hundreds of millions of dollars to save three milliseconds looks like a huge waste.”

Does this ring a bell? What would Krugman have said to the first submarine cable had he lived in the 19th century?

Fiber optics are not the fastest way to transport a signal. In 2012, firms such as McKay Brothers and Tradeworx started to establish chains of microwave relay towers, like the one mentioned by Michael Lewis, which allow line-of-sight communication through the air, shortening the round-trip time for data to nine milliseconds or less. The technology is not only faster but also less expensive than fiber optics. The disadvantage is a lower capacity – the technology was once used to carry US long-distance telephone calls, until the volume of traffic became too large. Furthermore, microwave signals are easily disrupted by weather conditions, solar activity and even flocks of birds (Scott Patterson). As Eric Onstad reported in Lasers, microwave deployed in high-speed trading arms race, “since fiber optic channels typically carry 1,000 times more data than most microwave networks, most HFT firms ration microwave for their most speed-sensitive strategies.” And: “They retain fiber optics for other trades and as a back-up when microwave is hit by weather.”

Other technologies use millimeter waves, which have similar advantages and disadvantages. They have shorter wavelengths and can carry more information than standard microwave transmissions. But they don’t travel as far, have to be reinforced with relay devices at more points, and the networks are as vulnerable to weather conditions as microwave systems (Scott Patterson).

Millimeter waves and microwaves were far from the end of the “race to zero”. A third phase of latency decrease started as companies using US military technology began to install lasers on high-rise apartment buildings, towers and office complexes, developing a grid intended to eventually link nearly all US stock exchanges (Scott Patterson). Lasers’ advantage is that they are largely free of weather-related distortions. In addition, the devices come with a stabilizing system which allows putting them in places where microwave dishes cannot be. The result would be an even straighter path and even higher transmission speed.

In addition, developers are thinking about experimenting with drones as platforms for wireless links. Eric Onstad quoted one observer: “A fleet of unmanned, solar powered drones carrying microwave relay stations could hover across the Atlantic. … Someone will do this eventually.” Other possible candidates are balloons and even satellites. For example, Scott Patterson mentioned the idea of “turbocharg[ing] intercontinental trading by floating balloons carrying microwave dishes over the ocean”.

Drones, balloons and satellites may eventually contribute to overcoming the biggest obstacle to fast links to other parts of the world: water. So far, fiber optic cables laid across the oceans are the only way to bridge distances connecting the world financial centres in America, Europe and Asia, and efforts to improve communication under these circumstances meet extraordinary challenges.

One project, Hibernia Networks’ Project Express, a $300 million transatlantic cable and “the first attempt to lay a cable across the Atlantic in more than a decade” (Richard Irving), was temporarily suspended in 2013 after becoming embroiled in tensions between the US and China over cyber security. Another, the Emerald Express, will connect North America to Europe through Ireland, with a branch to Iceland. “Phase 1 of the system, which will connect Shirley, New York on Long Island to its cable landing station in Belmullet, Ireland and branching to Grindavik, Iceland, is expected to be ready by mid-2014. Meanwhile, Phase 2 of the system, which includes a landing in Southern Europe, is also expected to be ready shortly thereafter.” (Emerald Networks)

With the retreat of sea ice, several companies announced plans for fiber-optic cables under the Arctic Ocean between Europe and Japan. “One route skirts the Russian coast and comes ashore on the northern tip of Murmansk; another traverses the Northwest Passage through the Canadian Arctic. When they go into operation around 2014, they will cut latency from about 230 milliseconds on routes through Asia to between 155 and 168 milliseconds.” (Jerry Adler) However, as Jeff Hecht described in Fibre optics to connect Japan to the UK – via the Arctic:

“Sea ice and icebergs pose unique challenges. Ships rated to work in ice-ridden waters are needed to lay the cable, and operations are possible for only a few months of the year. Yet there are advantages to laying cables in the Arctic …. Once laid, the cable should be largely safe from the biggest threats to cables in warmer waters: fishing trawlers and ships’ anchors are extremely rare in the Arctic.”

Enormous efforts are being made. The 15,600-kilometre link via the Canadian Arctic, to be built by Arctic Fibre of Toronto, Canada, and scheduled to be in service in January 2016, will cut latency between London and Tokyo to 168 milliseconds. Optical amplifiers will boost signal strength every 50 to 100 kilometres. A tunnel 40 meters deep will be drilled to take a shortcut across the Boothia isthmus in the Canadian Arctic – “a thin strip of land that connects the Boothia peninsula to the mainland. Isolated Arctic communities will also be connected by extra sections of cable that branch off from the main one.” (Jeff Hecht)

The reduced transmission times will be a boon for high-frequency traders who, at least for the moment, will be the main – but not the only – beneficiaries.

Nothing seems unthinkable in this hunt for speed. After nanoseconds, picoseconds (trillionths of a second) “loom as the next time barrier”, as Brendan Conway wrote in 2011 in Wall Street’s Need For Trading Speed: The Nanosecond Age. And Jerry Adler quoted Harvard physicist Alexander Wissner-Gross saying “It is only a matter of time … before some hedge fund decides it needs a particle accelerator to generate neutrinos, and then everyone will want one. Yes, they travel slower than light, but they indisputably can tunnel through the earth, cutting thousands of miles off an intercontinental message.”

Even the speed of light seems no longer a limit. In their study of information transmission between Chicago and New York, Gregory Laughlin et al. found indications of “a possible evolution of predictive algorithms that statistically anticipate Chicago futures price changes before the information can reach the equity market engines. We observe a signal consistent with the emergence of such algorithms by documenting correlations that occur over timescales shorter than the theoretical limit of 3.93 ms for light to travel between the Chicago futures market and the New Jersey data centers. Alternately, firms that trade simultaneously in geographically separated markets may coordinate the issuance of their orders with respect to the GPS time frame. Such coordinated activity, if it occurs, would make it appear as if information is traveling between Chicago and New Jersey at faster than the speed of light.”

Coming back to the social value of these activities, the question is whether contrasting individual costs with hypothetical overall economic benefits is a meaningful exercise. The idea may appear justified in cases where society has to pay the price, because this would call for laws and rules to strike a balance between micro motives and macro effects. As a rule, however, even in these cases the focus is not on investors’ costs and related concepts of social waste but on concrete negative externalities – damaging third-party effects of firms’ activities, such as air pollution or systemic risk in financial markets – which are not included in market prices, and which, if they were, would make these activities far more expensive, thereby discouraging both demand and production.

As a rule, whether an investment justifies the related costs is first and foremost not a matter for policy makers and economists but for those calculating the risks and willing to take them in the expectation of corresponding returns. Usually, governments do not interfere if investors – successfully or not – spend their money on what parts of the public may consider socially useless and wasteful. Are Barbie dolls socially beneficial? Many would doubt it. But governments would neither hinder Mattel from producing them nor parents from buying them.

Noahpinion, in his review of Flash Boys, wondered why Michael Lewis did not elaborate further on this point as an argument against HFT:

“Lewis, who tells a tale of bad guys ripping off good guys, glosses over the massive waste represented by the bad guys’ own effort! … A tale of social waste is not as exciting as a tale of plucky good guys rebelling against nefarious evil geniuses, but from an economist’s perspective it’s a big problem. The Death Star [a detail from Star Wars], if you think about it, was a massive waste of engineering talent.”

In high-frequency trading, the process of encryption and de-encryption, and the competition among firms to be the best de-encryptor, may be socially wasteful, as Joseph Stiglitz wrote (as may be the competition for designing Barbie’s dreamhouse, or the Death Star example). However, given the economic success in these and other cases, “waste” is clearly in the eye of the beholder. It must be set against

both businesses’ revenues and customers’ benefits and satisfaction;

employment and demand effects in the financial industry with spillovers to other sectors of the economy;

positive externalities such as employment effects in those industries which are boring tunnels, laying cables, installing microwave dishes etc.;

the whole range of related producer services (as we learned from Saskia Sassen), from accounting and legal services through advertising to maintenance and even pizza delivery;

synergies with other economic activities that rely on fast and efficient data transmission;

the cost of investing in and running a business in other financial and non-financial industries, to put these efforts into perspective;

and, eventually, the sum of all this (and more), which can be expected to be much more than its parts in stimulating innovation and technology and contributing to economic growth.

Price discovery is not the only determinant of HFT’s value for society. But, while each cost argument against the industry is listed meticulously by its critics, analyses of the advantages tend to be limited to a small selection of aspects.

In general, restricting economic activities on the basis of a narrowly defined criterion of social value would not only unduly harm the economy by discouraging creativity and stifling innovation in a way reminiscent of command economies. It also risks curtailing efforts whose ultimate benefits might become apparent only much later, and often in a different context, as countless examples in the history of engineering and technology demonstrate.

Take once again the development of the telegraph. As Tom Standage showed in his wonderful book on The Victorian Internet, the history of long-distance communication is full of unbelievable hardships and obstacles. Huge efforts and large sums of money were needed to keep projects going, and at times, in particular in the early stages, the impression from outside of total uselessness and social waste must have been overwhelming.

Another, more recent example is the fracking industry. Supporters and adversaries alike may register with awe the large quantities – up to millions of gallons – of water, sand, and chemicals which are injected at high pressure deep underground to break apart shale and release trapped hydrocarbons like oil and gas. Or the millions of tons of sand currently put onto North American railroads, as Thomas Black (Bloomberg) vividly described in Fracking Sand Spurs Grain-Like Silos for Rail Transport. Maybe their awe is similar to that instilled in economists and other outside observers when they learn of tunnels bored and new fiber cables laid to allow stock trading in nanoseconds.

The Inside Story Of Yo is one of the latest examples illustrating how views about social value may differ. Is the hype – any hype – social waste, or benefit, or simply private folly? Should investors be hindered from offering millions of dollars for a “stupid app”? Should they be encouraged to contribute to economic growth in this way?

In their paper on Global HFT Regulation: Motivations, Market Failures, and Alternative Outcomes, Holly Bell and Harrison Searles remarked that, like the first people who experienced the steam engine and its then-inconceivable speed of 30 mph, individuals and institutions may be concerned that they will be harmed in some way by market speeds they cannot comprehend.

At all times, new inventions were greeted with both enthusiasm and distrust. Here are several quotes from Tom Standage’s book which remind me of current discussions:

“… after a while he [Samuel Morse] realized that everybody still thought of the telegraph as a novelty, as nothing more than an amusing subject for a newspaper article, rather than the revolutionary new form of communication that he envisaged. … Sending messages to and fro was merely thought of as a scientific curiosity; the telegraph was evidently not regarded as a useful form of communication.”

“The news that the Atlantic cable had failed caused an outcry, not to mention a great deal of embarrassment. Some even claimed the whole thing had been a hoax – that there had never been a working cable, and it was all an elaborate trick organized by Field [the man behind the project] to make a fortune on the stock market.”

And, comparing the telegraph with the Internet:

“After a period of initial skepticism, business became the most enthusiastic adopters of the telegraph in the nineteenth century and the Internet in the twentieth. Businesses have always been prepared to pay for premium services like private leased lines and value-added information – provided those services can provide a competitive advantage in the marketplace. Internet sites routinely offer stock prices and news headlines – both of which were available over 100 years ago via stock tickers and news wires. And just as the telegraph led to a direct increase in the pace and stress of business life, today the complaint of information overload, blamed on the Internet, is commonplace.”

There is much truth in a general observation made by Joel Mokyr in his book on The Lever of Riches:

“Technological change is a game against nature rather than against other players, what von Neumann and Morgenstern have called a “Crusoe game.” Invention occurs at the level of the individual, and we should address the factors that determine individual creativity. Individuals, however, do not live in a vacuum. What makes them implement, improve, and adapt new technologies, or just devise small improvements in the way they carry out their daily work depends on the institutions and attitudes around them. It is exactly at this level that technological change is transformed from invention, a game against nature, to innovation, a complex, positive-sum game with many players and very incomplete information. As C.S. Lewis pointed out, ‘Man’s power over Nature often turns out to be a power exerted by some men over other men with Nature as its instrument’.”

One might be tempted to agree with an anonymous comment to Noah Smith’s blog post on Economics, geekiness, and distraction from productive activity:

“Social Waste – this is, I am sorry to say, bad economics.

When private money is spent in the course of competing for profits in any industry, lots of positive externalities are generated. For example, HFT industry spends a lot of money on high performance hardware, thus effectively subsidizing its development. Or let’s take Spread Networks, the line that was installed at a cost of hundreds of millions, between Chicago and NJ. You realize, this is just a fast, high performance telecom line? Right? It can be used for voice, video, whatever you like. The existence of this line, which was heavily subsidized by HFT, is a great thing!

But, of course, the matter is more complex than that.

A comparison with Bitcoin comes to mind. As Matthew C. Klein (Bloomberg) wrote in Is Bitcoin Like High-Speed Trading? “both are seen as either a huge waste of resources or as a useful new technology that will lower costs in the financial industry for the benefit of consumers.”

And one must see that in both cases their social value, however defined, is diminished by illegal market practices and abuses. As Clifford S. Asness wrote in Why I Love High-Speed Trading: “while defending HFT broadly, we can’t and wouldn’t vouch that each HFT trader acts lawfully and ethically.”

This is a general phenomenon. Tom Standage remarked in his book on the history of the telegraph that every innovation provides “new ways to cheat, steal, lie and deceive”.

And:

“The hype, scepticism and bewilderment associated with the Internet [and with bitcoins and high-frequency trading one would like to add] – concerns about new forms of crime, adjustments in social mores, and redefinition of business practices – mirror precisely the hopes, fears and misunderstandings inspired by the telegraph. Indeed, they are only to be expected. They are the direct consequences of human nature, rather than technology. … Given a new invention, there will always be some people who see only its potential to do good, while others see new opportunities to commit crime or make money. We can expect exactly the same reactions to whatever new inventions appear in the twenty-first century.”

In reaction to unwanted activities, regulators all over the world have begun to redefine the boundaries for high-frequency trading. However, concepts and attitudes differ. Holly Bell from the University of Alaska, Anchorage, observed that “globally, not all countries agree that HFT has a negative impact on market quality. While the European Union has proposed aggressive regulation, countries including Japan, Singapore, Russia, and Mexico are embracing HFT. These countries are seeking to create regulatory and market infrastructure environments conducive to a growing HFT presence in their markets to attract capital and improve liquidity.”

(For those who want to know more, the paper by Holly Bell and Harrison Searles on Global HFT Regulation: Motivations, Market Failures, and Alternative Outcomes gives a good overview, with a long list of further references.)

Coming back to the concept of social value, the crucial aspect in this debate seems to me to be the point of reference. Proponents of HFT argue that this should be the status quo ante and its economic effects, not a hypothetical socio-economic optimum derived from theory. For example, in High Frequency Trading, Good or Bad? Sam Seiden wrote:

“If you think this is bad for the average investor, think again. HFT traders are having a big impact on markets but not in a negative way … Years ago, buying 100 shares of a blue chip stock would cost you around $60.00 or so in commissions and another $0.50 or more in the spread. Today, commissions have never been cheaper and spreads have never been tighter than any time in history for the average investor. Spreads have become tighter and tighter over the years simply because of the competition for profit of the spread. … Thanks to HFT traders, market makers, and other groups competing for pennies and fractions of pennies in the market, spreads have never been tighter for the average trader/investor. Remember, HFT traders also compete with each other which again is only going to lead to tighter markets. The group that is being hurt is not the average investor but instead, banks and financial institutions who used to enjoy massive spreads and huge commissions.”

A group, one would like to add, which apparently has already come to terms with this fast new world and – as the recent example of Barclays, sued by the New York attorney general over alleged ‘dark pool’ fraud, indicates – has found new ways to take its cut.
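Seiden’s arithmetic is easy to check. The following back-of-the-envelope calculation contrasts the costs he quotes for “years ago” with illustrative present-day figures (zero commission, a one-cent spread); the present-day numbers are my own assumptions for illustration, not figures from his article:

```python
# Rough comparison of the cost of buying 100 shares of a blue chip stock,
# using the figures Sam Seiden quotes for "years ago" and illustrative
# (assumed) present-day figures: zero commission, a one-cent spread.

def purchase_cost(shares, commission, spread_per_share):
    """One-way transaction cost: commission plus the spread paid per share."""
    return commission + shares * spread_per_share

shares = 100
then = purchase_cost(shares, commission=60.00, spread_per_share=0.50)  # "years ago"
now = purchase_cost(shares, commission=0.00, spread_per_share=0.01)    # assumed today

print(f"Years ago: ${then:.2f}")  # $110.00
print(f"Today:     ${now:.2f}")   # $1.00
```

Even on these crude assumptions, the explicit cost of the trade falls by two orders of magnitude, which is the core of the argument from the status quo ante.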

The stakes are high in this debate, and both science and the media must be careful not to become an instrument of one side or the other. As Cliff Asness and his co-authors emphasized in High Frequency Hyperbole, Part Deux, “most of the participants in this high-decibel debate are “talking their book,” meaning arguing for their own interests. While some are just trying to sell their latest yarn, too many are intentionally scaring the public, lobbying for special advantages and calling for rule changes that could hurt investors by raising costs for all, undoubtedly declaring a victory for “fairness” in the process.”

An example from the history of medicine, which Deanna Day gave in How to Tell If You’re Dead, comes to mind:

“Many physicians objected to the introduction of the thermometer in medicine because they perceived it to be a threat to their expertise. If a simple tool made of mercury and glass had the power to indicate whether someone was healthy or sick, or alive or dead, then physicians were concerned that they would lose both their cultural authority and their livelihoods.”

In his book Inside the Black Box, Rishi Narang made a similar point: “Many of the most active opponents of high-frequency trading are primarily fighting … because their livelihood is threatened by a superior way of doing things. This is understandable, and fair enough. But it’s not good for the marketplace if those voices win out, because ultimately they are advocating stagnancy. There’s a reason that the word Luddite has a negative connotation.”

Maybe Tyler Cowen was right when he remarked in A study of limiting HFT: “There are a variety of significant problems on Wall Street, but this really isn’t one of them.”

I would like to end with a quote from Noam Chomsky I found in Rishi Narang’s book:

“The major advances in speed of communication and ability to interact took place more than a century ago. The shift from sailing ships to telegraph was far more radical than that from telephone to email!”

It’s high time to adapt to the next stage.

—–

You may also be interested in

Salmon, Stiglitz and the problems of HFT reconsidered (1) – Speed

A perfect storm – first reactions to Michael Lewis’ Flash Boys

Michael Lewis’ Flash Boys – further comments and links

Inflation indexing and Bitcoin rhetoric

Bitcoin – one foot in the door to Europe

Bitcoin follow-up

Now on CARTA! Bitcoin – Nicht mehr nur für Träumer, Nerds und Spekulanten (Bitcoin – no longer just for dreamers, nerds and speculators)

Bitcoin update

Bitcoin, WIR, and Boston Bean. Parallel currencies in the economies of the world – an annotated link list

Salmon, Stiglitz and the problems of HFT reconsidered (1) – Speed

Building bridges between scientists and the public in communicating research findings is one of the most rewarding activities of bloggers and journalists. Felix Salmon is an outstanding example in this respect. In a recent article, he summarized the key points of a widely noticed speech delivered by Nobel prize-winner Joseph Stiglitz on the problems of financial innovation in general, and high-frequency trading (HFT) in particular.

Stiglitz asked Are Less Active Markets Safer and Better for the Economy? and he came to the conclusion that they are. His arguments revolved around three aspects: speed, costs and social value.

Read more…

Michael Lewis’ Flash Boys – further comments and links

The publication of Michael Lewis’ book “Flash Boys” renewed the general interest in high-frequency trading and fuelled the debate about its potential benefits and harms. After finishing my first compilation of comments and reviews of the book, I came upon further interesting and challenging contributions which I would like to share with you.

Read more…

A perfect storm – first reactions to Michael Lewis’ Flash Boys

For many years there has been a controversy about high-frequency trading, which has just reached the wider public with the publication of Michael Lewis’ latest book Flash Boys. His conclusion that US stock markets are rigged by high-frequency traders has aroused strong emotions on both sides. With the following links and excerpts, I would like to give an impression of some of the first days’ contributions and comments.

Read more…

Paul Einzig and the foreign exchanges

The other day, I came upon an interesting article on Vox, The Returns to Currency Speculation: Evidence from Keynes the Trader, in which Olivier Accominotti and David Chambers described the results of their research on currency speculation in London in the 1920s and 1930s. Looking for more, I found a preliminary unpublished paper by the same authors on the topic on the internet (which unfortunately must not be quoted). In that paper, describing the London foreign exchange market in the period under consideration, the authors drew heavily on Paul Einzig’s The Theory of Forward Exchange of 1937. This is one of several rich and fascinating books Paul Einzig wrote (actually, the author himself considered it his best effort, as he mentioned in his autobiography). Although from another century (in 1919 the telegraph had just begun to change the nature of foreign exchange dealing in London), it bears many lessons, not only for historians.

Paul Einzig was born in 1897 in Braşov in Transylvania, in a “quiet little backwater, a small town at the foothills of the South Eastern Carpathians” (Einzig 1960), at a time when this Romanian region was still a part of Hungary. In 1919, after studies at the Oriental Academy of Budapest and first steps as a financial journalist in his home country, he came to London.

The world Paul Einzig was set to conquer: 2nd October 1919: London Dandies attired in menswear by Beau Brummell promenade in London’s Fleet Street.

Read more…

Inflation indexing and Bitcoin rhetoric

The other day, the New York Times provided us with an example of how fads and fashions can be used to draw attention to, and win acceptance for, an economic argument. In his article In Search of a Stable Electronic Currency, Nobel laureate Robert Shiller proposed the introduction of an inflation-indexed unit of account similar to the Chilean unit of development, or unidad de fomento (UF), which has existed since the 1960s. The article is in large part a summary of the ideas of an academic paper the author published in 1998. In short, its main argument is that recent progress in computer technology has considerably widened the possibilities of inflation indexing, which would allow for better pricing, contracting and risk management in an economy.
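To illustrate the mechanism Shiller describes: an indexed unit of account decouples the quoting of prices from the nominal currency by letting the unit’s nominal value track a price index. A minimal sketch, with made-up index numbers rather than actual Chilean data:

```python
# Minimal sketch of an inflation-indexed unit of account in the spirit of
# the Chilean unidad de fomento (UF). All numbers below are made up for
# illustration; real UF values are published from Chilean CPI data.

def unit_value(base_value, cpi_now, cpi_base):
    """Nominal value of one indexed unit, scaled by the price index."""
    return base_value * (cpi_now / cpi_base)

rent_in_units = 20.0                     # a rent contract fixed at 20 units
base_value, cpi_base = 23_000.0, 100.0   # 1 unit = 23,000 pesos at index 100

for cpi in (100.0, 103.0, 106.1):        # hypothetical index readings over time
    pesos = rent_in_units * unit_value(base_value, cpi, cpi_base)
    print(f"CPI {cpi:5.1f}: rent = {pesos:,.0f} pesos")
```

The contract’s real value stays constant; only its nominal peso value moves with the index, which is what makes long-term pricing and contracting easier.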

Read more…

Regulators’ games

Are US regulators corralling their financial system with the latest financial-safety rules for foreign banks, as Patrick Welter (Frankfurter Allgemeine Zeitung) and others argued, or will their move eventually even pave the way for closer cooperation and a revitalization of the worldwide regime of bank supervision?

Read more…

Sneak previews: Deutsche and RBS

Let us hope that this will not become a habit. As David Enrich (The Wall Street Journal) wrote the other day on Twitter:

The tactic of choosing a favorable moment (in the case of Deutsche Bank, a Sunday) ahead of the regular presentation of results to confront the markets with bad news illustrates how much, five years after the peak of the financial crisis, both institutions are still struggling to explain their activities and performance to the public in a damage-limiting way. What happened to the two banks which were once the biggest in the world? The following is a short compilation of information from the banks’ press releases, media comments and other readily available sources to find out what the main problems are. Of course, this is no substitute for a thorough analysis. Unless mentioned otherwise, data are from the banks’ official websites.

Read more…
