Algorithmic trading may be used in any investment strategy, including market making, inter-market spreading, arbitrage, or pure speculation (including trend following). The investment decision and implementation may be augmented at any stage with algorithmic support or may operate completely automatically.
A third of all EU and US stock trades in 2006 were driven by automatic programs, or algorithms, according to Boston-based consulting firm Aite Group LLC. Aite predicted that figure would reach 50 percent by 2010.
In 2006 at the London Stock Exchange, over 40% of all orders were entered by algo traders, with 60% predicted for 2007. American markets and equity markets generally have a higher proportion of algo trades than other markets, and estimates for 2008 range as high as an 80% proportion in some markets. Foreign exchange markets also have active algo trading (about 25% of orders in 2006). Futures and options markets are considered to be fairly easily integrated into algorithmic trading, with about 20% of options volume expected to be computer generated by 2010. Bond markets are moving toward more access to algorithmic traders.
Program trading is defined by the New York Stock Exchange as an order to buy or sell 15 or more stocks valued at over $1 million total. In practice this means that all program trades are entered with the aid of a computer. In the 1980s program trading became widely used in trading between equity and futures markets.
In stock index arbitrage a trader would buy (sell) a stock index futures contract such as the S&P 500 futures and sell (buy) a portfolio of up to 500 stocks at the NYSE matched against the futures trade. The program trade at the NYSE would be pre-programmed into a computer to enter the order automatically into the NYSE’s electronic order routing system at a time when the futures price and the stock index were far enough apart to make a profit.
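The trigger condition described above can be sketched as a cost-of-carry fair-value check. The function names, parameters, and thresholds below are illustrative, not taken from any actual NYSE program-trading system:

```python
import math

def futures_fair_value(spot_index, r, dividend_yield, t_years):
    """Cost-of-carry fair value of an index futures contract."""
    return spot_index * math.exp((r - dividend_yield) * t_years)

def index_arb_signal(spot_index, futures_price, r, dividend_yield, t_years, cost):
    """Signal a two-legged trade when futures and cash diverge by more
    than the round-trip transaction cost; otherwise do nothing."""
    fair = futures_fair_value(spot_index, r, dividend_yield, t_years)
    if futures_price > fair + cost:
        return "sell_futures_buy_stocks"   # futures rich: sell futures, buy the basket
    if futures_price < fair - cost:
        return "buy_futures_sell_stocks"   # futures cheap: buy futures, short the basket
    return None
```

A program trade would fire only when the computed spread exceeds the cost threshold, which is why such systems could sit idle for long periods and then enter both legs within moments of each other.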
At about the same time portfolio insurance was designed to create a synthetic put option on a stock portfolio by dynamically trading stock index futures according to a computer model based on the Black-Scholes option pricing model.
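The dynamic-trading rule behind portfolio insurance can be illustrated with the Black-Scholes put delta: the model tells the insurer how many index futures to short at the current index level, and the hedge is re-computed as the market moves. This is a simplified sketch under Black-Scholes assumptions; the function names and the contract multiplier are hypothetical:

```python
import math

def norm_cdf(x):
    """Standard normal cumulative distribution function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def put_delta(s, k, r, sigma, t):
    """Black-Scholes delta of a European put (a value between -1 and 0)."""
    d1 = (math.log(s / k) + (r + 0.5 * sigma ** 2) * t) / (sigma * math.sqrt(t))
    return norm_cdf(d1) - 1.0

def futures_hedge(portfolio_value, index_level, contract_multiplier, k, r, sigma, t):
    """Number of index futures contracts to short so the portfolio plus the
    futures position behaves like the portfolio plus a protective put."""
    delta = put_delta(index_level, k, r, sigma, t)
    exposure_to_shed = portfolio_value * -delta   # fraction of exposure the put would remove
    return exposure_to_shed / (index_level * contract_multiplier)
```

As the index falls, the put delta moves toward -1 and the model calls for shorting more futures, which is the mechanical selling often cited in discussions of the 1987 crash.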
Financial markets with fully electronic execution and similar electronic communication networks developed in the late 1980s and 1990s. In the U.S., decimalization, which changed the minimum tick size from 1/16th of a dollar ($0.0625) to $0.01 per share, may have encouraged algorithmic trading as it changed the market microstructure by permitting smaller differences between the bid and offer prices, decreasing the market-makers' trading advantage, thus decreasing market liquidity.
This decreased market liquidity led institutional traders to split up orders according to computer algorithms in order to execute their orders at a better average price. These average-price benchmarks are measured and calculated by computers by applying the time-weighted average price (TWAP, i.e., unweighted by volume) or, more usually, the volume-weighted average price (VWAP).
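The two benchmarks differ only in whether each trade's volume enters the average. A minimal sketch, taking trades as (price, volume) pairs:

```python
def twap(trades):
    """Time-weighted (i.e., unweighted) average price over (price, volume) pairs:
    every observation counts equally, regardless of size."""
    prices = [price for price, _ in trades]
    return sum(prices) / len(prices)

def vwap(trades):
    """Volume-weighted average price: total notional divided by total volume,
    so large trades pull the benchmark toward their price."""
    notional = sum(price * volume for price, volume in trades)
    total_volume = sum(volume for _, volume in trades)
    return notional / total_volume
```

An execution algorithm is then judged by how close its realized average fill price comes to the market's VWAP (or TWAP) over the same interval.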
As more electronic markets opened, other algorithmic trading strategies became possible, including arbitrage, statistical arbitrage, trend following, and mean reversion. These strategies are more easily implemented by computers because machines can react more rapidly to temporary mispricing and examine prices from several markets simultaneously.
Algorithmic trades require communicating considerably more parameters than traditional market and limit orders. A trader on one end (the “buy side”) must enable their trading system (often called an “Order Management System” or “Execution Management System”) to understand a constantly proliferating flow of new algorithmic order types. The R&D and other costs of constructing complex new algorithmic order types, along with the execution infrastructure and the marketing costs to distribute them, are fairly substantial. What was needed was a way for marketers (the “sell side”) to express algo orders electronically so that buy-side traders could simply drop the new order types into their system and be ready to trade them, without having to code custom new order-entry screens each time.
FIX Protocol LTD http://www.fixprotocol.org is a trade association that publishes free, open standards in the securities trading area. Members include virtually all large and many midsize and smaller broker dealers, money center banks, institutional investors, mutual funds, etc. This institution dominates standard setting in the pretrade and trade areas of security transactions. In 2006-2007 several members got together and published a draft XML standard for expressing algorithmic order types. The standard is called FIX Algorithmic Trading Definition Language (FIXatdl). Currently targeting March 2008 for final release, FIXatdl is now in broad beta testing with the following firms participating: Barclays, Bloomberg Tradebook, Cheuvreux, Citigroup, Credit Suisse, Fidelity Investments, Goldman Sachs, ITG, JPMorgan Chase, Merrill Lynch, Morgan Stanley, NeoNet, Pragma@Weeden, and UBS AG.
More information on FIXatdl, including example XML files and sample code may be found at: http://www.fixprotocol.org/working_groups/algowg/documents
Many different algorithms have been developed to implement different trading strategies. These algorithms or techniques are commonly given names such as "iceberging", "Dagger", "Guerrilla", "benchmarking", "Sniper" and "Sniffer".
Large orders are broken down into several smaller orders and entered into the market over time. This basic strategy is called "iceberging". The success of this strategy may be measured by the average purchase price against the VWAP for the market over that time period. One algorithm designed to find hidden orders or icebergs is called "Guerrilla".
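The order-splitting step can be sketched in a few lines. The function and its parameters are illustrative only; real execution algorithms also randomize child sizes and timing to avoid detection:

```python
def slice_order(total_shares, max_child):
    """Split a parent order into a list of child orders, each no larger
    than max_child shares, to be fed into the market over time."""
    children = []
    remaining = total_shares
    while remaining > 0:
        child = min(max_child, remaining)
        children.append(child)
        remaining -= child
    return children
```

Only the current child order is visible to the market at any moment; the rest of the "iceberg" stays hidden until the prior slice fills.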
A classical arbitrage strategy might involve three or four securities such as covered interest rate parity in the foreign exchange market which gives a relation between the prices of a domestic bond, a bond denominated in a foreign currency, the spot price of the currency, and the price of a forward contract on the currency. If the market prices are sufficiently different from those implied in the model to cover transactions cost then four transactions can be made to guarantee a risk-free profit. Algorithmic trading allows similar arbitrages using models of greater complexity involving many more than 4 securities.
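The covered interest rate parity relation mentioned above can be sketched as a fair-forward calculation plus a threshold check. Names and thresholds are illustrative; spot and forward are quoted as units of domestic currency per unit of foreign currency:

```python
def cip_forward(spot, r_domestic, r_foreign, t_years):
    """Forward exchange rate implied by covered interest rate parity,
    using simple interest over the horizon."""
    return spot * (1 + r_domestic * t_years) / (1 + r_foreign * t_years)

def cip_arbitrage(spot, forward, r_domestic, r_foreign, t_years, cost):
    """Signal the four-legged trade when the quoted forward deviates from
    the parity-implied forward by more than total transaction costs."""
    fair = cip_forward(spot, r_domestic, r_foreign, t_years)
    if forward > fair + cost:
        return "sell_forward"   # borrow domestic, buy foreign spot, invest abroad, sell forward
    if forward < fair - cost:
        return "buy_forward"    # the mirror-image set of four transactions
    return None
```

Each signal corresponds to the four simultaneous transactions described in the text; the profit is locked in at initiation regardless of where the exchange rate later moves.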
Market making involves placing a limit order to sell (or offer) above the current market price or a buy limit order (or bid) below the current price in order to benefit from the bid-ask spread. Automated Trading Desk, which was bought by Citigroup in July 2007, has been an active market maker, accounting for about 6% of total volume on both NASDAQ and the New York Stock Exchange.
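A bare-bones quoting rule makes the mechanics concrete. This sketch also skews both quotes against accumulated inventory, a common refinement; the function and the skew parameter are hypothetical, not a description of any real market maker:

```python
def make_quotes(mid, half_spread, inventory, skew_per_share=0.0001):
    """Quote a bid below and an offer above the mid price to earn the spread.
    When inventory is long, shift both quotes down to encourage selling it
    off; when short, shift both quotes up."""
    skew = -inventory * skew_per_share
    bid = mid - half_spread + skew
    ask = mid + half_spread + skew
    return bid, ask
```

With zero inventory the quotes straddle the mid symmetrically; a long position lowers both quotes, making the firm's offer more attractive to buyers.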
A "benchmarking" algorithm is used by traders attempting to mimic an index's return. An algorithm designed to discover which markets are most volatile or unstable is called "Snif-fer".
Any type of algo trading that depends on the programming skills of other algo traders is called gaming. Dark pools are alternative electronic stock exchanges where trading takes place anonymously, with most orders hidden or "iceberged." Gamers or "sharks" sniff out large orders by "pinging" small market orders to buy and sell. When several small orders are filled the sharks may have discovered the presence of a large iceberged order. They then front run the order.
“Now it’s an arms race,” said Andrew Lo, director of the Massachusetts Institute of Technology’s Laboratory for Financial Engineering. “Everyone is building more sophisticated algorithms, and the more competition exists, the smaller the profits.”
More sophisticated models and intelligent programs have created the question of whether the models will break down.
“The downside with these systems is their black box-ness,” Mr. Williams said. “Traders have intuitive senses of how the world works. But with these systems you pour in a bunch of numbers, and something comes out the other end, and it’s not always intuitive or clear why the black box latched onto certain data or relationships.”
Regulators in Great Britain are watching the development of algo trading.
“The Financial Services Authority has been keeping a watchful eye on the development of black box trading. In its annual report the regulator remarked on the great benefits of efficiency that new technology is bringing to the market. But it also pointed out that ‘greater reliance on sophisticated technology and modelling brings with it a greater risk that systems failure can result in business interruption’.”
Other issues include the technical problem of latency or the delay in getting quotes to traders, security and front running, and the possibility of a complete system breakdown leading to a market crash.
The cost of developing and maintaining algorithms is still relatively high, especially for new entrants, as the need for stability, bandwidth and speed is even higher than for regular order execution. Firms which have not developed their own algorithmic trading have had to buy competing firms.
"Goldman spends tens of millions of dollars on this stuff. They have more people working in their technology area than people on the trading desk...The nature of the markets has changed dramatically.
“Computers are now being used to generate news stories about company earnings results or economic statistics as they are released. And this almost instantaneous information forms a direct feed into other computers which trade on the news.”
The algorithms do not simply trade on simple news stories but also interpret more difficult to understand news. Some firms are also attempting to automatically assign sentiment (deciding if the news is good or bad) to news stories so that automated trading can work directly on the news story.
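At its simplest, assigning sentiment to a headline can be a bag-of-words score, which is far cruder than the systems the text describes but shows the shape of the problem. The word lists and function here are invented for illustration:

```python
# Hypothetical keyword lists; production systems use far richer language models.
POSITIVE = {"beats", "record", "upgrade", "profit", "raises"}
NEGATIVE = {"misses", "cuts", "downgrade", "loss", "lawsuit"}

def headline_sentiment(headline):
    """Crude sentiment label: positive-word count minus negative-word count."""
    words = headline.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    if score > 0:
        return "good"
    if score < 0:
        return "bad"
    return "neutral"
```

An automated strategy would map "good" and "bad" labels on breaking headlines directly into buy and sell signals, which is what makes reporting speed (see the Dow Jones example below) commercially valuable.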
“There is a real interest in moving the process of interpreting news from the humans to the machines,” says Kirsti Suutari, global business manager of algorithmic trading at Reuters. “More of our customers are finding ways to use news content to make money.”
An example of the importance of reporting speed to algorithmic traders was an advertising campaign by Dow Jones (appearances included page W15 of the Wall Street Journal, on March 1, 2008) claiming that their service had beaten other news services by 2 seconds in reporting an interest rate cut by the Bank of England.
In July 2007, Citigroup, which had already developed its own trading algorithms, paid $680 million for Automated Trading Desk, a 19-year-old firm that trades about 200 million shares a day, accounting for about 6 percent of trading volume in U.S. markets. Citigroup had previously bought Lava Trading and OnTrade Inc.
Though its development may have been prompted by decreasing trade sizes caused by decimalization, algorithmic trading has reduced trade sizes further. Jobs once done by human traders are being switched to computers. The speeds of computer connections, measured in milliseconds, have become very important.
More fully automated markets such as NASDAQ have gained market share from less automated markets such as the NYSE. Economies of scale in electronic trading have contributed to lowering commissions and trade processing fees, and contributed to international mergers and consolidation of financial exchanges.
Competition is developing among exchanges for the fastest processing times for completing trades. For example, in June 2007 the London Stock Exchange launched a new system called TradElect, which promises an average 10-millisecond turnaround time from placing an order to final confirmation and can process 3,000 orders per second.
Spending on computers and software in the financial industry increased to $26.4 billion in 2005.
Brokers have found it more difficult to monitor the risk of their clients' positions, especially for clients such as hedge funds.