Uncommon Data is the New Frontier in Risk Management

Minimizing volatility is important to investment managers focused on capital preservation. After all, lower volatility helps protect capital and improves the key portfolio performance metric, the Sharpe ratio, equal to the average annualized return (in excess of the risk-free rate) divided by annualized volatility. An acceptable Sharpe ratio for a portfolio starts in the 1.8 range, and some high-frequency trading funds produce Sharpe ratios as high as 20. Even very small positive returns can produce large Sharpe ratios that attract investors, but only if the volatility of the portfolio is tiny.
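As a minimal sketch of that calculation, assuming daily simple returns and an annual risk-free rate (the defaults and function name are illustrative):

```python
import numpy as np

def sharpe_ratio(returns, risk_free=0.0, periods_per_year=252):
    """Annualized Sharpe ratio: average excess return / volatility.

    `returns` are per-period (e.g., daily) simple returns; `risk_free`
    is the annual risk-free rate. Defaults are illustrative only.
    """
    excess = np.asarray(returns, dtype=float) - risk_free / periods_per_year
    ann_return = excess.mean() * periods_per_year
    ann_vol = excess.std(ddof=1) * np.sqrt(periods_per_year)
    return ann_return / ann_vol
```

Note that halving the volatility of a return stream doubles its Sharpe ratio, which is exactly why volatility minimization is so valuable.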

Minimizing volatility is a challenging task. In a nutshell, to minimize volatility, one needs to:
1. Identify the conditions that result in high volatility in the portfolio.
2. Correctly predict when the conditions identified in step 1 are about to occur, and
3. Select an action for managing this volatility. The appropriate actions may include trimming riskier portfolio holdings, or counterbalancing the riskier instruments with offsetting or protective financial instruments, such as futures or options.
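As a toy illustration of how steps 2 and 3 connect, the prediction might be mapped to an action as follows; the 20% threshold and the action names are hypothetical placeholders, not a recommendation:

```python
def choose_action(predicted_vol, threshold=0.20):
    """Map a predicted annualized volatility (step 2) to an action (step 3).

    The 20% threshold and the action labels are purely illustrative.
    """
    if predicted_vol > threshold:
        # Riskier holdings could be trimmed, or counterbalanced with
        # offsetting instruments such as futures or options.
        return "trim_and_hedge"
    return "hold"
```

In practice, the threshold would be calibrated to the portfolio's own risk tolerance rather than fixed in code.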

Much of the research of the 1980s and 1990s focused on managing volatility with a single approach for all market conditions. By designing all-weather securities that could be bought once and held passively until expiration (as is the case with futures and options), a new breed of financial engineers advanced portfolio frontiers, and also generated much excitement and revenue. For example, collar options protected against too much downward as well as upward movement in portfolio value. Other exotic derivatives helped manage changes in interest rates, oil prices and even the weather in Florida, mitigating the resulting swings in orange juice prices.

With time, many of the all-weather securities proved imperfect, failing to protect their owners' portfolios when it mattered most, typically in extreme market conditions. Anyone remember Enron? Many smaller investment firms toppled like dominoes in the market crises of the early 2000s, and seemingly unending lawsuits followed, with the consumers of exotic securities suing issuers and underwriters of options for misleading them about one aspect of the package or another. The bottom line is that all-weather strategies did not hold up against extreme events.

For a while, people tried to address the issue by deploying theories about extreme events and modeling the unlikely "black swans". However, the modeling proved too difficult and unrealistic, and the extreme events too rare and unusual to forecast with ease. New solutions had to be found and deployed.

Enter premium data, which has come into play since the late 2000s. As a complete surprise to some, premium data sources have emerged carrying previously unthinkable, highly granular, real-time front-line information: the number of cars shipped from factories to dealerships from companies like ThinkNum.com, the number of orders placed online for particular products by ReturnPath.com, or the number of high-frequency traders operating in the markets by AbleMarkets.com.

This front-line information, collected and channeled directly from the source to portfolio managers, has been made possible by the evolution of technology. The Internet has enabled software programmers to scan previously disorganized data sources and synthesize the information into meaningful inferences directly predictive of near-term market conditions. Advances in computing have enabled firms to deliver the data in a fast (often real-time), reliable and, above all, inexpensive fashion.

How does this uncommon data help manage portfolio volatility? Acquiring the data is only the first step. Next comes understanding how, and how far in advance, the particular data can detect the onset of certain market conditions, and the most appropriate now-traditional financial engineering responses are selected for each volatility-inducing market condition. Then, once the reliability of the prediction is firmly established, the streaming data is deployed to raise warnings about one market condition or another, and the previously chosen volatility-mitigation method is quickly applied to safeguard one's portfolio.
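The deployment step described above can be sketched as follows: streaming signal readings are compared against a warning level calibrated during the validation phase, and a pre-selected mitigation is triggered on a breach. The signal name, warning level, and mitigation mapping are all hypothetical, not actual data-vendor feeds:

```python
WARNING_LEVEL = 0.7  # calibrated during the validation phase (illustrative)

# Mitigation chosen in advance for each volatility-inducing condition:
MITIGATIONS = {"aggressive_hft_surge": "buy_index_puts"}

def process_reading(condition, value, triggered):
    """Record a mitigation whenever a streaming reading breaches the warning level."""
    if value > WARNING_LEVEL:
        triggered.append((condition, MITIGATIONS.get(condition, "manual_review")))

triggered = []
for value in [0.2, 0.5, 0.9]:  # simulated real-time readings of one signal
    process_reading("aggressive_hft_surge", value, triggered)
```

Only the final reading breaches the warning level, so exactly one mitigation fires; in production, the readings would arrive as a live feed rather than a list.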

Welcome to the new data-driven world. Whether data-enhanced portfolio management is revolutionary or evolutionary, it is no longer optional, but mandatory for sound portfolio management.

Irene Aldridge is a financial researcher dealing with terabytes of market data every day, so you can do other things, like safeguard the value of your clients' holdings. She is Managing Director at AbleMarkets.com, a provider of real-time indexes on current security interest across the Internet, participation of aggressive HFT in the markets, onset of flash crashes, presence of runaway algorithms, real-time institutional flow of funds in various financial instruments, and levels of toxicity observed in individual markets. All of the above offerings are strong predictors of short-term and medium-term volatility (from the next 20 minutes to 2 days). Irene is the author of High-Frequency Trading: A Practical Guide to Algorithmic Strategies and Trading Systems (Wiley, 2nd edition) and a co-author of Real-Time Risk: What Investors Should Know About FinTech, High-Frequency Trading and Flash Crashes (Wiley, forthcoming later in 2016). She can be reached at Irene@AbleMarkets.com. In addition, Irene is involved in organizing Big Data Finance seminars at New York University. Please reach out to Irene via IAldridge@BigDataFInanceConference.com to sponsor one or a series of high-quality educational seminars on the exciting topic of Big Data Finance.