Flash Crashes and Market Design

 

In my book, The Crisis of Crowding, I spoke about the crowding of market makers in the Flash Crash and the sudden disappearance of market liquidity. This led to people buying Apple stock (ticker: AAPL) for $100,000 per share and selling Accenture (ticker: ACN) for 1 cent. Many other trades occurred at ridiculous prices. Since the writing of the book, there have been several other mini crashes in which market liquidity seemed to vanish. Many people believe that improvements in technology and traders known as high-frequency traders (HFTs) are part of the cause of these mini crashes. That is, HFTs attempt to use their speed advantage to either pick off slower market makers or manipulate the marketplace to make a profit.

As I traveled around the world speaking about my new book, I would often get questions about whether we needed to trade so frequently. That is, why do we need a system where trading can occur continuously? I told people that on the surface this made sense and that perhaps trading should be limited to discrete intervals. Of course, without any research to rely on, it was difficult to know what that interval should be and whether it would be better. Every second? Every minute? Every hour? What interval would be most appropriate for the efficiency of the marketplace?

Recently, new research argues that the current market design is not beneficial to investors. To understand this, remember that information does not travel instantaneously from one place to another. For example, today it takes about 13 milliseconds for information to travel from New York to Chicago and back again. Thus, anything that happens in New York trading will be known in Chicago only with a delay, and vice versa. To give you an idea of how fast this is, it takes about 400 milliseconds to blink the human eye. Thus, there is an incentive for traders (e.g., HFTs) to invest in high-speed technology, not so much to make the market more efficient, but to make a profit at the expense of others in the market. This might be a waste of resources for society as a whole.

This new study takes two closely related financial instruments: SPY (the ETF that tracks the value of the S&P 500) and the S&P 500 futures contract. In theory, these instruments should be highly correlated, as they represent the value of the same underlying portfolio except for time value. It turns out that they are highly correlated over one-day intervals, one-hour intervals, and even one-minute intervals. However, they are nearly uncorrelated at smaller time intervals, such as 10 milliseconds. This was something that we saw repeatedly in The Crisis of Crowding, but for a different reason: when crowded spaces unwound, fundamental value was irrelevant; only supply and demand in that space at that time mattered for market prices. The new study estimates that these windows in which similar instruments behave very differently could lead to arbitrage opportunities between the two markets totaling $75 million per year. In fact, there are about 891 arbitrage opportunities per day in just these two securities! Over time, this arbitrage opportunity has not disappeared; rather, it has simply occurred at shorter and shorter frequencies. Thus, HFTs have had to continuously invest in speed to capture the never-disappearing arbitrage opportunity.
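To make the horizon dependence concrete, here is a minimal simulation sketch in Python. This is not the study's data or methodology; the volatility, the hypothetical 10-millisecond reaction lag, and the sampling intervals are all assumptions for illustration. Two series track the same random-walk fundamental, but one reacts with a small lag, so their returns look unrelated at a 10-millisecond sampling interval and almost perfectly correlated at one second or one minute.

```python
# Illustrative simulation (assumed parameters, not the study's data): two
# instruments follow the same random-walk fundamental value, but one of them
# reacts with a hypothetical 10 ms lag.  We then measure the correlation of
# their returns at several sampling intervals.
import numpy as np

rng = np.random.default_rng(0)

MS_PER_HOUR = 60 * 60 * 1000                          # one hour, in milliseconds
fundamental = np.cumsum(rng.normal(0.0, 0.001, MS_PER_HOUR))  # common value, 1 ms steps

futures = fundamental                                  # reacts immediately
lag_ms = 10                                            # assumed reaction lag for the ETF
etf = np.concatenate([np.zeros(lag_ms), fundamental[:-lag_ms]])

def return_correlation(a, b, interval_ms):
    """Correlation of returns when both series are sampled every interval_ms."""
    ra = np.diff(a[::interval_ms])
    rb = np.diff(b[::interval_ms])
    return np.corrcoef(ra, rb)[0, 1]

for interval in (10, 1_000, 60_000):                   # 10 ms, 1 second, 1 minute
    corr = return_correlation(futures, etf, interval)
    print(f"{interval:>6} ms sampling: correlation = {corr:.3f}")
```

In this toy setup the correlation is essentially zero at 10 milliseconds and close to one at the coarser intervals, which is the qualitative pattern the study documents.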

So why should we care? Spreads have declined over time, so it would seem all is well in the world. This study argues that the current market mechanism, the continuous limit order book, is not optimally designed. The authors propose that our financial system should instead use a uniform-price sealed-bid double auction conducted at frequent intervals. This interval should be long enough that both systems that receive information slowly and systems that receive information quickly have ample time to analyze the data and the market before making their trades. For example, the interval might be 10 seconds. Thus, every 10 seconds, all the market and limit orders, or demand and supply schedules,[1] for a stock would go into a batch system to find the market-clearing price (typical in an auction). Once this was complete, the market would be informed of all orders and all trades that occurred in that batch. Market participants could then prepare for the new batch 10 seconds later.
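To illustrate the mechanics of a single batch, here is a simplified Python sketch of uniform-price clearing. This is my illustration of the general idea, not the authors' exact specification; the example orders, the midpoint pricing convention, and the tie-breaking are assumptions.

```python
# Simplified uniform-price clearing for one batch (an illustration, not the
# authors' exact mechanism).  Buy orders are sorted from highest to lowest
# price and sell orders from lowest to highest; quantity is matched as long as
# the marginal buyer is willing to pay at least the marginal seller's ask, and
# every matched share trades at a single clearing price.
from dataclasses import dataclass

@dataclass
class Order:
    price: float     # limit price
    quantity: int    # shares

def clear_batch(buys: list[Order], sells: list[Order]):
    """Return (clearing_price, volume) for one batch, or (None, 0) if no cross."""
    buys = sorted(buys, key=lambda o: -o.price)    # most aggressive buyers first
    sells = sorted(sells, key=lambda o: o.price)   # most aggressive sellers first

    volume, clearing_price = 0, None
    bi = si = 0
    buy_left = buys[0].quantity if buys else 0
    sell_left = sells[0].quantity if sells else 0

    while bi < len(buys) and si < len(sells) and buys[bi].price >= sells[si].price:
        traded = min(buy_left, sell_left)
        volume += traded
        # Assumed convention: price the batch at the midpoint of the last
        # crossing buy/sell pair.  Other pricing rules are possible.
        clearing_price = (buys[bi].price + sells[si].price) / 2
        buy_left -= traded
        sell_left -= traded
        if buy_left == 0:
            bi += 1
            buy_left = buys[bi].quantity if bi < len(buys) else 0
        if sell_left == 0:
            si += 1
            sell_left = sells[si].quantity if si < len(sells) else 0

    return clearing_price, volume

# Example batch: 300 shares cross, and all of them trade at one clearing price.
buys = [Order(101.0, 200), Order(100.5, 100), Order(99.0, 400)]
sells = [Order(100.0, 150), Order(100.4, 150), Order(102.0, 500)]
print(clear_batch(buys, sells))   # -> (100.45, 300)
```

The key feature is that speed within the interval does not determine execution; everyone whose order crosses trades at the same price once the batch clears.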

The authors claim that this system would be better for the market in the following ways. First, there would be fewer opportunities for faster traders to take advantage of slower traders.[2] Second, it would create competition between the fast trading systems so that, in equilibrium, faster traders would bid correct prices rather than stealing profits from slower traders. Third, the authors believe that bid-ask spreads would be narrower than with the current system and the market would be deeper. That is, there would be more ability to trade in volume away from the best bid and offer prices.

So have we found the solution to flash crashes? Simply find an optimal trading interval of 1 to 10 seconds and replace continuous trading with batched auctions? Not necessarily. More research still needs to be done. For example, would this new system lead to HFTs vanishing from the marketplace? And what effect would that have on overall liquidity and depth? What is the optimal trading interval to ensure access to liquidity while limiting manipulation of market prices? Would markets be more or less stable under such a system? Would HFTs simply invest more in faster computers to process news and other sorts of information? Taiwan used to have a market similar to this, and it switched to real-time continuous auctions. The Philadelphia exchange went from price-time priority to price-size priority, and so far it has seen limited success.



[1] I have a patent on an idea for creating these types of demand and supply schedules for discount brokers needing customer liquidity.

[2] There is still some advantage because, in any interval, some news may be released to the marketplace that fast information systems can execute on before the new batch starts, while slower information systems cannot.