Computer Games
September 2015
Ivan Obolensky
Few understand what High-Frequency Trading (HFT) is or how it evolved, simply because it grew quietly behind the scenes, out of the public eye, at least until several computer glitches and flash-crashes put it in the public spotlight.
High-Frequency Trades are initiated by computer programs that operate at speeds measured in millionths of a second. With over half the trades on most stock exchanges now generated this way, it is important to understand the impact of this type of trading because it affects us all, directly or indirectly.1
The first thing to know is that High-Frequency Trading is big business.
For example, Virtu Financial, a High-Frequency Trading group, noted in its 2014 Initial Public Offering that in the five years since 2009 it had shown trading profits on 1,277 of the 1,278 days it traded. The group lost money on only a single day in five years and reported revenues of almost 500 million dollars for the first half of 2015. This is an impressive performance.2
HFTs, like Virtu, have become huge players in markets all over the world. HFTs accounted for 60-70% of the New York Stock Exchange volume in 2009.
To understand what High-Frequency Trading is, how it works, and why HFTs can generate such huge profits, it is necessary to review some history of the U.S. stock market and get some context.
Originally, stock markets were conducted at physical locations by members of an exchange, such as the New York Stock Exchange (NYSE). There, specialists would handle incoming trade orders for specific stocks. For instance, there was a specialist who handled only the buy and sell orders for IBM shares. The specialist’s purpose was to ensure an orderly market. If everyone was selling, the specialist was required to buy. If everyone was buying, he had to sell to them. He became known as a market-maker. The specialist made sure there was a market regardless of whether prices were going up or down. Specialists had to make money during normal market periods to compensate for occasional adverse market moves. This was done by creating a ‘spread’. As an example, suppose a stock last traded at $10.00. The specialist would offer 100 shares for sale at $10.25 and be willing to buy 100 shares at $9.75. The spread (50 cents) was the difference between the two prices. By offsetting buy and sell orders as they came in, the specialist would pocket the 50 cents. If he bought and sold 100,000 shares that day, the specialist would make around $50,000.
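The arithmetic of the spread is simple enough to sketch in a few lines. Below is a minimal Python illustration using the figures above; the assumption that every share bought at the bid is offset by one sold at the ask is a simplification.

```python
# A minimal sketch of the specialist's spread income, using the figures above.
# Simplifying assumption: every share bought at the bid is offset by a share
# sold at the ask.

bid = 9.75               # price at which the specialist will buy
ask = 10.25              # price at which the specialist offers to sell
shares_traded = 100_000  # shares both bought and sold during the day

spread = ask - bid                      # $0.50 per share
gross_profit = spread * shares_traded   # $0.50 x 100,000 = $50,000

print(f"Spread: ${spread:.2f} per share")
print(f"Gross profit: ${gross_profit:,.2f}")
```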
Besides the spread (the difference between the buy price and the sell price), the specialist had other advantages. The first was being on the floor itself and being able to observe the ebb and flow of market prices. Second, the specialist maintained a book of all existing orders above and below the current market price of a stock. If he knew that there were massive orders to buy above the current price, the specialist could accumulate shares in his own account in order to sell them at a profit when the price moved in that direction. The specialist had inside knowledge of future order flow by monitoring his book. Lastly, specialists controlled the size of the spread, that is, whether the difference between the buy and sell price was 25 cents or a dollar. The width he set was based on his perception of the amount of risk in the market at any given time.3
The above was typical of the state of markets before the 1970s and the dawn of the computer age. Many outside of this privileged network felt that the entire system was a monopoly hiding in plain view and were determined to force the exchanges to become open to all. Over time they were successful.
Several things happened to change the existing system.
The first was that the demand for stocks and other financial instruments (such as options on individual stocks) increased as the ’70s turned into the ’80s and the US economy expanded. The NYSE handled only the shares of large corporations. All else was traded either on the American Stock Exchange or through a system called the National Association of Securities Dealers, which operated over the telephone. (These exchanges merged and eventually formed NASDAQ.) The number of market-makers and broker-dealers (broker-dealers are those who trade shares on behalf of the public) multiplied as demand for financial products grew.
Eventually, transaction volume increased to such an extent that some sort of automation became necessary. Computers started to be used to record orders and display transactions in real time. As computer usage became more mainstream and user-friendly, traders began to notice that there were price differences between the options market, the stock market, and the futures markets where there shouldn’t be any. These could be taken advantage of, provided an order was put in fast enough to sell one and buy the other. Computers were programmed to automate many of the orders so as to profit from these price discrepancies. This was the start of computerized trading, or what became known as program trading. It was not yet fully automated.
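As a rough illustration of what these early program traders were doing, consider the toy Python sketch below. The venues and quotes are invented, and real program trading compared related instruments (a stock against its options or index futures) rather than the same stock on two markets.

```python
# Toy illustration of spotting a price discrepancy between two markets.
# The quotes are invented; real program trading compared related instruments
# (stocks vs. options vs. futures) and had to fire both orders fast.

quotes = {
    "stock_market":   {"bid": 10.05, "ask": 10.10},
    "futures_market": {"bid": 10.20, "ask": 10.25},  # trading rich vs. the stock
}

# If one market bids more than another asks, buy cheap and sell dear.
edge = quotes["futures_market"]["bid"] - quotes["stock_market"]["ask"]

if edge > 0:
    print(f"Buy at {quotes['stock_market']['ask']} on the stock market, "
          f"sell at {quotes['futures_market']['bid']} on the futures market: "
          f"{edge:.2f} per share, if the orders get there in time.")
```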
In 1987, the stock market crashed. Program trading was blamed. In addition, many investors were left high and dry because they couldn’t reach their broker to place an order, let alone execute a trade even when they got through. The volume at the time of the crash was so huge that many of the fledgling electronic systems and manual order systems broke down. Under heavy regulatory scrutiny, markets were retooled, and an electronic order-entry system became a requirement to ensure transactions could take place regardless of the volume. All market-makers and broker-dealers were connected to the system.
With so much data now available, and with so many market-makers having to show the prices at which they were offering to buy or sell the shares they traded, there were often price differences from one dealer to another for the exact same stock. Once again, electronic traders began to take advantage of these disparities. It was free money, and with better computers overall, electronic traders seized it.
After the ’87 crash, investors returned to the markets. Trading volume picked up, and there was more activity than ever, particularly in mutual funds. Institutions such as Fidelity, pension fund managers, and other large asset managers became the big players.
With the rise of mutual funds came much larger order sizes. The institutions mentioned above bought and sold millions of shares in large blocks. These institutions and the managers who transacted this large order flow did not want to pay the high commissions set by broker-dealers or the wide spreads set by specialists and market-makers, nor did they want other traders to discover they were taking a large position in, say, Microsoft, and have others profit from the inevitable move higher that all their buying would cause. Institutions needed a place to trade off the exchanges and away from public view. Computing power had by this time increased to the point that it was possible to create Electronic Communication Networks, or ECNs. ECNs were strictly electronic and matched buyers and sellers. They did away with the middlemen: the market-makers as well as the broker-dealers that charged high commissions.
This was the start of what are called “Dark Pools”. A pool is a market. A “Lit Pool” is one where all trades and quotes are open to view. A “Dark Pool” is one where transactions take place with no public viewing. Prices were posted on exchanges, but only after the fact, and sometimes at better prices than a public investor could possibly receive. The first of these exchanges was Instinet, followed by Island. Both of these ECNs began to transact business without a middleman, reducing costs for institutional traders.
Many considered this an unfair practice and clamored for an investigation by regulators.
As a result, in 1996, the NASDAQ was accused of price fixing. Regulators felt that to ensure investors received fair prices, broker-dealers must post all competing quotes (offers to buy or sell) and choose the one that gave the customer the best price. This required a great deal more computerization to ensure that these quotes could be made available to all who bought and sold for the public. Additionally, the ECNs, which had been the exclusive province of institutional traders, had to make their quotes visible as well. No more Dark Pools.
This was a game changer.
Institutions could no longer hide their order flow.
Further, by demanding the interconnection of all the sources of quotes in a common computer language, regulators made truly electronic trading possible. Of course, the purpose of the rule-making was to give the consumer better pricing, and it worked, but it also had consequences: electronic platforms became mandatory. This benefitted established ECNs, but those who ran them had a big problem: how to make money from all these trades if there was no market-maker, no middleman, no spread?
The answer was to make a mini-spread for every transaction that occurred. The thinking went this way:
If a market-maker put up a quote to buy or sell, it was providing liquidity to the system (more opportunity for a transaction to occur when it was needed). This was good because it allowed someone to trade. The liquidity-maker should be compensated for this. On the other hand, if someone purchased or sold a stock, the buyer or seller was taking away liquidity, because the quote disappeared when it was filled and denied those who wanted to transact an opportunity. Liquidity was reduced. This was bad. They should have to pay for taking liquidity away.
Those who ran the ECNs decided to charge liquidity-takers 0.003 cents per share to accept an offer to buy or sell and to rebate the liquidity-maker who made the offer 0.0028 cents for each share traded. (The ECN pocketed the 0.0002-cent difference.) Although the fee was small, it validated those who put up quotes (offers to buy or sell) and penalized those who accepted them. It also set the stage for what was to come. The solution, for those who owned and ran the ECNs, the exchanges that eventually merged with them, and the traders that profited from using them, was simple: create more volume. They did not have to wait long. The rise of the Dotcoms and the frenzied trading around them took care of that.
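The per-share amounts look negligible until they are multiplied by volume, which is exactly why volume became the goal. Here is a minimal sketch of the maker-taker arithmetic, using the per-share figures above and an arbitrary daily volume:

```python
# Maker-taker arithmetic with the per-share figures quoted above
# (units follow the article). The daily volume is an arbitrary illustration.

taker_fee    = 0.003    # paid by the side that accepts a quote, per share
maker_rebate = 0.0028   # paid to the side that posted the quote, per share
ecn_cut      = taker_fee - maker_rebate  # 0.0002 kept by the ECN

shares_per_day = 100_000_000

print(f"Takers pay:   {taker_fee * shares_per_day:,.0f}")
print(f"Makers earn:  {maker_rebate * shares_per_day:,.0f}")
print(f"ECN pockets:  {ecn_cut * shares_per_day:,.0f}")
```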
By the year 2000, two events happened that once again greatly influenced both the market and the infrastructure that supported it. The first was the bursting of the Dotcom bubble, and the second was, once again, the regulatory fix.
Several new rules were put together. The first was decimalization. Stocks previously traded in minimum increments of 1/8th (12.5 cents), or even 1/16th (6.25 cents), rather than simply 0.12 or 0.06. Regulators decided that it was in the best interests of the public that stocks be quoted in decimals rather than these archaic fractions left over from past centuries. The rule would drive the minimum spread from 1/16th (6.25 cents), as per the above example, down to a penny, saving the consumer money.
Decimalization had two consequences. The first was that spreads narrowed, which benefitted the consumer. The second was that many market-makers were driven out of business: the spreads were too narrow for them to make a living. Taking their place were more ECN platforms. They sprouted everywhere. There was no coordination, just tons and tons of data. It became extremely difficult for a broker to make sure he got the best price for his customer because of the rising number of ECN venues.
Shortly thereafter, a new set of regulations was implemented called Regulation National Market System, or Reg NMS. This set of rules required broker-dealers to access the very best price available before executing a trade, or risk fines and sanctions. By ordering that all broker-dealers have access to every available quote, regulators indirectly mandated that all market venues be interlinked. Again, regulators felt that this could only benefit the public, but it created difficulties for those who placed trades for the consumer. First, all broker-dealers who traded for the public had to monitor thousands of stocks on hundreds of venues, all the time. Trades sent to be executed were often put in queues awaiting their turn, but if prices changed while an order waited in the queue, it had to be re-routed to whichever venue now showed the better price. There, the same process repeated. Orders could take much longer to execute, and waiting could be expensive: those at the front of the queue received the better price while the trade was bumped to another venue whose quote was now the best available, though less advantageous than the original. Unless one knew how to place an order in exactly the language the computer networks understood, executions could happen at prices different from what was expected. But there was one group that understood the significance of this and how to manage their orders so they didn’t get bounced: the electronic traders who flowed a great deal of business to these electronic platforms.
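A toy sketch may make the routing obligation concrete. The venue names and quotes below are invented; real smart order routers scan thousands of symbols across every venue continuously.

```python
# Toy sketch of a Reg NMS-style routing decision: a buy order must go to
# the venue showing the lowest ask at the moment of execution. Venues and
# prices are invented.

venues = {"ECN_A": 10.04, "ECN_B": 10.02, "ECN_C": 10.03}  # best asks

def route_buy(asks):
    """Return the venue currently showing the best (lowest) offer."""
    best = min(asks, key=asks.get)
    return best, asks[best]

print("Route buy order to %s at %.2f" % route_buy(venues))

# If ECN_B's quote disappears while the order waits in its queue,
# the order must be re-routed to the next-best venue.
del venues["ECN_B"]
print("Re-route to %s at %.2f" % route_buy(venues))
```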
While the markets were retooling, traders had grown in sophistication. Their computers were replaced with the fastest available. Programmers fine-tuned small programs called algorithms (algos), which consisted of several logic steps expressed in computer language. A simple algo might be: “If the price is greater than 10.25, then offer to sell 100 shares. If the price is less than 10.00, offer to buy 100 shares.”
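That two-rule algo translates almost directly into code. Here is a literal Python rendering; the order-submission function is a stub standing in for a real exchange connection.

```python
# The article's two-rule algo, rendered literally. submit_order is a stub;
# a real algo would send the order to an exchange gateway.

def submit_order(side, quantity, price):
    print(f"Offer to {side} {quantity} shares at {price:.2f}")

def simple_algo(price):
    if price > 10.25:
        submit_order("sell", 100, price)
    elif price < 10.00:
        submit_order("buy", 100, price)
    # Between 10.00 and 10.25, do nothing.

for tick in (10.30, 10.10, 9.95):
    simple_algo(tick)
```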
Traders using algos could take advantage of the price discrepancies between the thousands of quotes that were now available to be processed.
Over time the algos grew much more complex so they could adapt to multiple situations. Since many of these traders generated the volume the ECNs needed, they also became privy to the type of order entry that could jump to the head of the queue, buy at a lower price, and then sell at a higher price to the order waiting behind it. Secondly, by offering to buy and sell, the electronic trader created liquidity and could therefore earn the liquidity-maker rebate of 0.0028 cents per share whenever its order to buy or sell was executed. By analyzing all the available quotes, the algos made sure their offer to buy or sell was the best available, even if the algo made no profit on the trade itself. The trick was to offer liquidity across the market and earn the fee. It was risk-free and extremely lucrative if one could trade a billion shares a day.
This meant that electronic traders required speed, the fastest available. They coveted ‘low latency’. Latency is the time it takes for an order to travel from a trader’s computer to the ECN that matches and executes the trade. Exchanges and their ECNs began to rent out server space next to their own servers so that trading firms wanting the fastest possible execution could co-locate their machines. This also required that the algos that offered liquidity, jumped order queues, or executed trading strategies be fully automated. Trading institutions researched machine learning (see Artificial Intelligence and Language), which raised the complexity of algos to a new level. Algos were programmed to change their executions with changing market conditions.
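Latency is straightforward to reason about with a timer around the send. In the sketch below, `send_order` is a placeholder that only simulates network delay; the whole point of co-location is to shrink precisely the leg this timer measures.

```python
import time

# Toy latency measurement. send_order only simulates a network delay here;
# co-located systems aim for round trips measured in microseconds, while a
# cross-country hop costs milliseconds.

def send_order():
    time.sleep(0.001)  # stand-in for a ~1 ms network round trip

start = time.perf_counter_ns()
send_order()
elapsed_ns = time.perf_counter_ns() - start

print(f"Round trip: {elapsed_ns / 1_000:.0f} microseconds")
```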
Meanwhile, institutions had not stood still during this time. They needed to buy and sell large numbers of shares without paying the higher prices created by queue-jumping algos that bought the shares first and resold them to the institution at a markup. They built computer systems of their own and created different algos that broke up large orders into small chunks and traded them across multiple venues in random patterns.4
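A minimal sketch of that slicing idea follows. The sizes, venue names, and randomization below are illustrative only; institutional execution algos also randomize timing and adapt to the market as they work the order.

```python
import random

# Toy order slicer: break a large parent order into small, randomly sized
# child orders scattered across venues, so no single venue sees the full size.

def slice_order(total_shares, venues, min_size=100, max_size=500):
    children = []
    remaining = total_shares
    while remaining > 0:
        size = min(random.randint(min_size, max_size), remaining)
        children.append((random.choice(venues), size))
        remaining -= size
    return children

for venue, size in slice_order(2_000, ["ECN_A", "ECN_B", "DARK_1"]):
    print(f"{size} shares -> {venue}")
```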
If this sounds like the Wild West, or some kind of cyber war between dueling computer systems and programs on a micro level, you would be correct. Volume went through the roof. Eventually major investment banks such as Goldman Sachs and other financial institutions saw the profit potential and either bought existing ECNs for large sums, or built their own.
But there was a problem. It is called uncertainty. Uncertainty is the bane of HFTs because it increases volatility (price movement per unit of time). It is like noise on a phone line: too much noise, and it is impossible to extract a signal. You hear only static. When volatility rises above a certain point, the algos can make mistakes, and they therefore have instructions to stop trading immediately and cancel all bids and offers.
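A minimal sketch of such a cutoff, with an arbitrary window and threshold: estimate recent volatility from tick-to-tick moves and, above the limit, pull every quote and stop.

```python
import statistics

# Toy volatility cutoff: if the standard deviation of recent tick-to-tick
# moves exceeds a limit, cancel all resting orders and stop quoting.
# The price series and the limit are arbitrary illustrations.

VOL_LIMIT = 0.05

def too_volatile(recent_prices):
    moves = [b - a for a, b in zip(recent_prices, recent_prices[1:])]
    return statistics.stdev(moves) > VOL_LIMIT

prices = [10.00, 10.01, 9.99, 10.02, 9.85, 10.20]  # a sudden shock
if too_volatile(prices):
    print("Volatility limit breached: cancel all bids and offers, halt trading.")
```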
What this means on the macro level is that in the event of a major shock that creates uncertainty and large volatility, HFTs will shut down their trading systems. What happens to all the liquidity that was being offered just seconds ago? It evaporates. What happens to stock prices? They drop. They drop hard.
Conclusions and observations:
- Regulation, even with the best of intentions, can often lead to unforeseen consequences. This is not to say that there should be no regulation, but rather that those who regulate must be aware that regulations can create as many problems as they solve. As a rule, regulation is best in small doses.
- With the lure of big money, it is likely that competition between HFTs will intensify, causing some to go out of business. Note that the HFT share of NYSE volume decreased from 60-70% in 2009 to about 50% as of 2012.5 It is possible that overall HFT volume has peaked, although it is highly unlikely HFTs will disappear anytime soon. They are now necessary under the current market framework simply because exchanges need the volume that HFTs provide to survive as for-profit entities. Note: Exchanges were not always for-profit publicly traded companies that depend on order volume to remain viable. The NYSE was recognized as a Not-for-Profit organization in 1971 but became for-profit in 2005.6
- In the same way, HFTs need the exchanges and the liquidity-maker fees the exchanges offer in order to make money. It is a symbiotic relationship. One can’t exist without the other.
- As an observation, with so many HFTs in competition, the easy money has probably already been taken. New HFTs will find it more difficult to make the profits needed to justify the effort and outlay for new systems. Of course, if an HFT trading group can create systems that operate at the nanosecond level, then it has the advantage, and the cycle repeats. Regardless, there is a limit to data speed, and that limit is the speed of light (a rough calculation of what that limit means in practice appears after this list).
- Complexity theory points out that highly networked entities are prone to create wide market swings similar to the population changes in predator-prey relationships. In spite of this tendency, such networks are extremely robust and tend to correct back the other way rather than simply collapse.7 The flash-crash of the US market in May 2010 is such an example.
- Referencing complexity theory again, it is likely that volatility will increase going forward. This brings up the most interesting and unknown factor of HFTs. HFTs reduce the spread between the bid and ask, which damps down short-term volatility; however, if volatility is a natural phenomenon inherent in all system interaction, similar to noise on signals, or the vibrations of molecules, damping it down in one form can lead to more volatility in another, similar to taking two pieces of bread with jam in the middle and pressing them together. The jam leaks out the sides and makes a mess.
- In and of themselves, HFTs are not inherently bad; they operate at such a small time scale that they do not often affect those trading on a longer time frame, with some exceptions. Algos can create order imbalances by generating many large orders above or below the market that trick other traders into anticipating a move higher or lower. The orders are then cancelled when the market starts to move. This is called spoofing and is illegal.8 Although this is a danger, the close, mutually supporting relationship between HFTs and exchanges is perhaps the greater one. The vested interest in maintaining the relationship at any cost could prove hard to unravel if required. Secondly, if a choice must be made between the public and the relationship, it is not difficult to predict which way the decision will go.
- Institutional traders have also gotten into the algo game by spreading their orders across multiple venues, at multiple times, and in small order sizes, reducing the impact of HFTs that jump the queue, buy first, and sell to the large order behind them. The playing field is more even than it was.
- Regulation is always the sword of Damocles that hangs over the heads of HFTs. What has been given can be taken away.
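To put a number on the speed-of-light limit mentioned in the list above, here is a back-of-the-envelope calculation. The New York-Chicago distance and the signal speed in optical fiber (roughly two-thirds of light speed in vacuum) are rounded assumptions.

```python
# Back-of-the-envelope light-time between two trading centers.
# Rounded assumptions: ~1,200 km New York-Chicago straight line,
# ~200,000 km/s signal speed in fiber (about 2/3 of c).

C_VACUUM_KM_S = 299_792
C_FIBER_KM_S = 200_000
DISTANCE_KM = 1_200

vacuum_ms = DISTANCE_KM / C_VACUUM_KM_S * 1_000
fiber_ms = DISTANCE_KM / C_FIBER_KM_S * 1_000

print(f"Theoretical minimum (vacuum): {vacuum_ms:.1f} ms one way")
print(f"Through fiber:                {fiber_ms:.1f} ms one way")
# No hardware upgrade can beat the vacuum figure of roughly 4 ms one way.
```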
HFTs are a factor in the investing/trading game and should be understood, but they are not the key player. Central Bank interventions to preserve market trends in the face of increasing debt ratios and tepid economic fundamentals are the elephant in the room. Comparatively, HFTs are small potatoes.
Still, if one sits down at a poker table, it is best to know who one is playing against.
1. Patterson, S. (2012). Dark Pools: The Rise of the Machine Traders and the Rigging of the U.S. Stock Market. New York, NY: Crown Business.
2. Virtu Financial, Inc. (2015). Form 10-Q Quarterly Report. Retrieved September 11, 2015 from http://files.shareholder.com/downloads/AMDA-2PR09O/687627359x0xS1104659-15-60023/1592386/filing.pdf
3. Harris, L. (2003). Trading and Exchanges: Market Microstructure for Practitioners. New York, NY: Oxford University Press, USA.
4. Patterson, op. cit.
5. The New York Times (2012). Declining U.S. High-Frequency Trading. Retrieved September 11, 2015 from http://www.nytimes.com/interactive/2012/10/15/business/Declining-US-High-Frequency-Trading.html?_r=0
6. NYSE History, The History of the New York Stock Exchange (2015). Retrieved September 11, 2015 from http://www.stock-options-made-easy.com/nyse-history.html
7. Page, S. E. (2009). Understanding Complexity. Chantilly, VA: The Teaching Company.
8. Levine, M. (2015). Why Is Spoofing Bad? BloombergView. Retrieved September 11, 2015 from http://www.bloombergview.com/articles/2015-04-22/why-is-spoofing-bad-
© 2015 Ivan Obolensky. All rights reserved. No part of this publication can be reproduced without the written permission from the author.