“As a quick background, I’ve been working in HFT since graduating with an undergrad in mathematics and CS, I trade primarily equities (stocks, basically), market making (posting bids and offers and collecting the spread in between) in ~1700 different stocks…”
(click title for article)
“We think that HFT strategies, in particular the trend-following ones, are playing a key role…the very existence of cross-market correlations at high frequencies favours the presence of automated trading strategies operated by robots on multiple assets. Our analysis suggests that commodity markets are more and more prone to events in global financial markets and likely to deviate from their fundamentals.”
In this TED talk, Yan Ohayon demystifies and shares his experience with algorithmic trading and its impact on markets, our lives, and everything in between.
“We examined 50 years of historical S&P 500 Index data and compared the actual tail risk frequency and magnitude to the expectations of a typical investor operating under modern portfolio theory. The difference between the two is surprising, and it suggests that investors have significantly underestimated tail risk frequency and severity.”
Kevin Slavin’s TED talk on high-frequency trading. He shows how these complex computer programs determine espionage tactics, stock prices, movie scripts, and architecture. And he warns that we are writing code we can’t understand, with implications we can’t control.
“The ‘flash crash’ of May 6th 2010 was the second largest point swing (1,010.14 points) and the biggest one-day point decline (998.5 points) in the history of the Dow Jones Industrial Average. For a few minutes, $1 trillion in market value vanished. In this paper, we argue that the ‘flash crash’ is the result of the new dynamics at play in the current market structure…”
“CEP is a new paradigm of computing that allows organizations to quickly respond to data that is continuously changing. CEP algorithms are structured as sets of event-based rules. These rules monitor items in incoming data streams—termed “events”. Each event represents an update within the system…”
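The event-rule idea in that excerpt is easy to picture in code. Here's a minimal sketch of a CEP-style rule engine: a set of predicate/action rules checked against every event arriving on a stream. The `Rule`/`Engine` names and the price-alert example are mine, purely for illustration, not from the paper.

```python
# Minimal CEP-style rule engine sketch (illustrative only; names are my own).
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class Rule:
    """An event-based rule: a match predicate plus an action to fire."""
    matches: Callable[[dict], bool]
    action: Callable[[dict], None]

@dataclass
class Engine:
    rules: list = field(default_factory=list)

    def on_event(self, event: dict) -> None:
        # Each incoming event (an update in the stream) is checked
        # against every registered rule; matching rules fire immediately.
        for rule in self.rules:
            if rule.matches(event):
                rule.action(event)

alerts = []
engine = Engine()
engine.rules.append(Rule(
    matches=lambda e: e.get("type") == "trade" and e["price"] > 100.0,
    action=lambda e: alerts.append(f"{e['symbol']} traded at {e['price']}"),
))

# Feed a stream of updates; only the second event matches the rule.
for ev in [{"type": "trade", "symbol": "XYZ", "price": 99.5},
           {"type": "trade", "symbol": "XYZ", "price": 101.2}]:
    engine.on_event(ev)

print(alerts)  # ['XYZ traded at 101.2']
```

Real CEP engines add pattern matching across multiple events and time windows, but the core loop — continuously evaluating rules as data changes — is the same.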
February saw the market continue its slow trickle up, “climbing-the-wall-of-worry” behavior. Movement in the gaps and anemic intraday ranges were the order of the day, and unfortunately such an environment doesn’t offer intraday systems many opportunities. The system’s equity curve saw most days coming in under +/- 0.50% and many offered no trades at all, leaving the system flat (-0.40%) for the month. Yawn.
It seems this system is not alone, as a fellow algo trader friend of mine joked “Welcome to the $300 club!” after listening to me complain about February’s returns. His system has been seeing anemic returns as well, yielding him pocket change of plus/minus a few hundred dollars a day (vs the thousands per day he was seeing last Fall). I guess misery loves company, lol.
Do we really know what we think we know? White’s famous paper on his “Reality Check” procedure:
“Data Snooping occurs when a given set of data is used more than once for purposes of inference or model selection. When such data reuse occurs, there is always the possibility that any satisfactory results obtained may simply be due to chance rather than to any merit inherent in the method yielding the results. Our new procedure, the Reality Check, provides simple and straightforward procedures for testing the null that the best model encountered has no predictive superiority over a given benchmark model…”
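To make the idea concrete, here's a simplified sketch of the Reality Check logic: test whether the *best* of k strategies beats a benchmark, while accounting for the fact that you searched over all k. I use a plain i.i.d. bootstrap for brevity; White's actual procedure uses the stationary bootstrap to handle serial dependence, so treat this as a toy version.

```python
# Simplified Reality Check sketch (i.i.d. bootstrap; White uses the
# stationary bootstrap -- this is an illustrative approximation only).
import numpy as np

rng = np.random.default_rng(0)

def reality_check_pvalue(excess: np.ndarray, n_boot: int = 2000) -> float:
    """excess: (T, k) array of per-period performance of each of k models
    minus the benchmark. Returns a bootstrap p-value for the null that the
    best model has no predictive superiority over the benchmark."""
    T, k = excess.shape
    # Observed statistic: best model's scaled mean excess performance.
    observed = np.sqrt(T) * excess.mean(axis=0).max()
    # Center each column to impose the null of no superiority.
    centered = excess - excess.mean(axis=0)
    boot_stats = np.empty(n_boot)
    for b in range(n_boot):
        idx = rng.integers(0, T, size=T)  # resample time indices
        boot_stats[b] = np.sqrt(T) * centered[idx].mean(axis=0).max()
    return float((boot_stats >= observed).mean())

# 20 pure-noise "strategies": the best one always looks good in-sample,
# but the Reality Check p-value accounts for the search over all 20.
noise = rng.normal(0.0, 1.0, size=(500, 20))
p = reality_check_pvalue(noise)
print(f"p-value for best-of-20 noise strategies: {p:.3f}")
```

The point of the exercise: pick the best of twenty coin-flip strategies and its in-sample mean will look impressive, yet the bootstrap distribution of the *maximum* shows that result is exactly what chance predicts.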
“…whereas the normal distribution of the daily return of the S&P would suggest a negative three-sigma event (between -3.56% and -2.36% daily returns) should have occurred 27 days over the last one hundred years, this has actually occurred over a hundred times in the 81 years since 1927. And the “normal” likelihood of a negative four-sigma event is one day every one hundred years; yet we have seen this take place an astounding 44 times since 1927…”
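The normal-distribution expectations in that quote are easy to reproduce. The sketch below computes the expected number of -3σ and -4σ days per century under a standard normal, assuming 252 trading days a year; the quote's exact counts presumably use the S&P's own fitted mean and volatility, so the figures here match only in order of magnitude.

```python
# Expected tail-event frequency under the normal assumption (sketch;
# 252 trading days/year is my assumption, not from the quote).
from math import erf, sqrt

def phi(z: float) -> float:
    """Standard normal CDF."""
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

trading_days_per_year = 252
days_100y = 100 * trading_days_per_year

# A "-3 sigma event" here: a return between -4 and -3 standard deviations.
p_3sigma = phi(-3.0) - phi(-4.0)
print(f"expected -3 sigma days per century: {p_3sigma * days_100y:.1f}")

# A "-4 sigma event": a return below -4 standard deviations.
p_4sigma = phi(-4.0)
print(f"expected -4 sigma days per century: {p_4sigma * days_100y:.2f}")
```

The -4σ expectation comes out to roughly 0.8 days per century, matching the quote's "one day every one hundred years" — against 44 actual occurrences since 1927, which is the whole fat-tails point.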