Algorithmic trading, a systematic method that uses mathematical models to make transaction decisions in the financial markets, is a global phenomenon, but the subject is a complex one1.
Stock exchanges began transitioning from a traditional auction to computerized transactions in the early 1980s. In the late 1980s and early 1990s, Electronic Communication Networks (“ECNs”) became increasingly popular for traders looking for more efficient access to the markets.
In 2001, IBM researchers published Agent-Human Interactions in the Continuous Double Auction2. The paper found that in a Continuous Double Auction market process, simple software bidding-agent strategies were able to outperform human subjects by a clear margin, setting the stage for high-frequency trading, an algorithmic trading approach characterized by high speeds and widely used today in the financial markets.
Last month, on the occasion of the publication of his latest book, “A Guide to Creating a Successful Algorithmic Trading Strategy”, I interviewed Perry Kaufman. He began his career as a “rocket scientist,” first working on the Orbiting Astronomical Observatory (OAO-1), the predecessor of the Hubble Observatory, then on navigation for Gemini, later used for the Apollo missions, and subsequently in military reconnaissance. There is a certain connection between building a trading program and the world of rockets; in fact, the earliest systematic programs used exponential smoothing, a technique developed in aerospace for estimating the path of missiles. In the early 1970s, he started trading using automated systems while the idea was dismissed by professional traders as “ridiculous”: “the market just doesn’t work that way”, “you can’t make money if you don’t know the value of the stock”8. Now that opinion seems to have been turned upside down.
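Exponential smoothing, the aerospace-born technique mentioned above, is simple to state: each new smoothed value is a weighted blend of the latest observation and the previous smoothed value, with a smoothing factor alpha between 0 and 1 controlling how quickly the estimate reacts to new data. A minimal sketch in Python (the function name and sample prices are illustrative, not from Kaufman’s book):

```python
def exponential_smoothing(prices, alpha):
    """Simple exponential smoothing of a price series.

    alpha close to 1 tracks recent prices closely;
    alpha close to 0 produces a slow, heavily smoothed line.
    """
    if not prices:
        return []
    smoothed = [prices[0]]  # seed with the first observation
    for price in prices[1:]:
        # new estimate = alpha * new price + (1 - alpha) * old estimate
        smoothed.append(alpha * price + (1 - alpha) * smoothed[-1])
    return smoothed

# Illustrative use on a short hypothetical price series:
series = exponential_smoothing([10.0, 11.0, 12.0], alpha=0.5)
```

With alpha = 0.5 the series above yields 10.0, 10.5, 11.25: the estimate moves toward each new price but never fully catches it, which is exactly the lag-versus-responsiveness trade-off early trend-following programs exploited.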