Finding Alpha in the New Normal

Adapting to the new normal of the equity markets, diversifying into other asset classes, and confidently managing the risks for healthy returns can be achieved by understanding, managing and analyzing data.

Louis Lovas, Director of Solutions, OneMarketData

Within financial markets, understanding data is a game changer. Financial institutions are awash in data; it comes at them from every direction, ebbing and flowing like the tide. Yet taking full advantage of that ocean of information and navigating a path to capitalize on market opportunities is a difficult challenge. Under the crush of regulatory scrutiny, increasing competition, tighter spreads, thinner margins and a lower risk appetite, firms are digging deeper into big data to create smarter algorithms, squeezing harder on costs and reaching beyond single-asset safe harbors for alpha.

Yet at a time when big data has become all the rage, garnering enormous media attention as the next big source of insight and revenue, we are witnessing a strange paradox in the equity markets.

The 2013 January effect has been a sugar high. The market exuberance since the beginning of the year has been phenomenal: the S&P 500 has climbed above 1,500, a 5.19% increase over the close of 2012. This winning streak is the longest in eight years and puts the S&P 500 just 4% shy of its all-time closing high of 1,565.15, set back in 2007. Adding to the enthusiasm, Hedge Fund Research (HFR) is expected to report that the average hedge fund returned a positive 5.5% in 2012, a year-over-year improvement on 2011.

The backdrop to this fervor is a more sobering phenomenon. Notwithstanding the rally, volumes in the equity markets were disappointing throughout 2012, and the weakness has spilled over into 2013. Even the low bond yields fueled by the U.S. Federal Reserve's Quantitative Easing policy have done little to boost equity volumes. NYSE's volume composite index (MVOLNYE) indicates a 42% decline year-to-date.

This negative divergence in the equity markets between volume and price could usher in a quick end to the January rally. Divergences can signal a coming change and may warn that the price trend is at risk of reversal. A thin market also has the potential to whipsaw wildly, since a single transaction can have a disproportionate effect, making it difficult to determine whether the bid/offer represents real market value. And of course the big unknown is the possibility of a change in policy as the new SEC chairman, Mary Jo White, looks to make her mark on the industry. Pressing ahead with ill-conceived regulations could have the same halting effect.

Yet even amid this market uncertainty, defining a new normal does not warrant an emotional capitulation to lower returns and higher risk ahead.

In a recent survey by Linedata, maintaining investment performance emerged as the critical concern for the coming year, with 68% of hedge funds and 59% of asset managers naming it their top challenge for 2013. Falling volumes equate to thinning margins; paradoxically, that translates into increased use of algorithms as firms look to squeeze alpha out of a diminishing pot. Adapting to the new normal of the equity markets, diversifying into other asset classes and confidently managing the risks for healthy returns can be achieved. Understanding, managing and analyzing data is the key. Data is the fuel for the engine of the trade life cycle, from the research and discovery of new trade models through to measuring executions against benchmarks for trade performance.

The salient fact is that managing data across a multitude of markets is a messy business. Financial practitioners' worst fear is spending more time processing and scrubbing data than analyzing it. One significant challenge is managing scale. Poorly designed or legacy systems can easily translate into inordinate amounts of time and resources spent processing data, outfitting new storage and scrambling to deploy new systems. As trading firms diversify, the demand to capture and store ever more data from numerous sources across many asset classes places enormous strain on IT infrastructure. Improvements in compute power and storage per dollar have made consuming that data both technically and economically feasible, and cloud deployments help manage the scale, offering higher levels of data protection, commoditization and fault tolerance at a cost savings.

Firms are capturing data for a number of applications: back-testing new and evolving trade models in the never-ending hunt for alpha; risk management, to monitor and control exposure; and transaction cost analysis, to measure trade performance. The challenge is to achieve the needed timeliness and quality of data amid the data dump. That requires dealing with the vagaries of multiple data sources across global markets; the U.S. alone has 13 public exchanges. Data quality also means managing a sea of reference data: mapping ticker symbols across the global universe, tying indices to their constituents, preserving tick-level granularity, ingesting cancelations and corrections, and accurately inserting corporate action price and symbol changes. Any and all of these factors are vital to data quality and to the science behind smarter algorithms. Profitable algorithms are part genius, part inspiration and part perspiration, and their complexity is accelerating as markets evolve. They are born of the mathematical ingenuity of quants and become the lifeblood of trading firms.
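One small but representative piece of that reference-data work is back-adjusting historical prices for corporate actions. The sketch below applies a hypothetical 2-for-1 split to an invented symbol ("XYZ") so that the pre-split prices line up with the post-split series; the dates, prices and ratio are all made up for illustration.

```python
from datetime import date

# Hypothetical tick history and a 2-for-1 split effective 2013-02-01 (illustrative only).
ticks = [
    {"date": date(2013, 1, 30), "symbol": "XYZ", "price": 100.0},
    {"date": date(2013, 1, 31), "symbol": "XYZ", "price": 102.0},
    {"date": date(2013, 2, 1),  "symbol": "XYZ", "price": 51.5},
]
splits = {("XYZ", date(2013, 2, 1)): 2.0}  # (symbol, effective date) -> ratio

def split_adjust(ticks, splits):
    """Back-adjust prices before each split so the series is continuous."""
    adjusted = []
    for t in ticks:
        price = t["price"]
        for (sym, eff), ratio in splits.items():
            if t["symbol"] == sym and t["date"] < eff:
                price /= ratio  # divide pre-split prices by the split ratio
        adjusted.append({**t, "price": price})
    return adjusted

for t in split_adjust(ticks, splits):
    print(t["date"], round(t["price"], 2))
```

Real reference-data pipelines must also handle dividends, symbol changes, and cancel/correct messages, but the principle is the same: every historical price is restated in today's terms before any model sees it.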

Quantitative analysts apply an empirically tested, rules-based approach to exploit perceived market inefficiencies manifested by human behavior, geopolitical events and complex market structure. Nonetheless, it is well documented that trade models have a short shelf life. Evolving market conditions, such as the present negative divergence in the equity markets, can stress model performance. One side effect is growing demand for deep historical data spanning longer time periods. Recognizing when similar market conditions occurred in the past, as revealed by historical analysis, can shed light on their consequences. Back-testing models through these past market conditions provides a means to fine-tune algorithms to manage the inherent risk and reveal alpha.
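In its simplest form, a back-test just replays a rule over a historical series and tallies the profit and loss. The sketch below does this for a moving-average crossover rule on a synthetic price series; the rule, window lengths and prices are all illustrative stand-ins, not a production model or real market data.

```python
# Synthetic daily prices -- invented for illustration.
prices = [100, 101, 99, 102, 104, 103, 106, 108, 107, 110, 112, 111]

def sma(series, n):
    """Simple moving average; None until n observations are available."""
    return [None if i + 1 < n else sum(series[i + 1 - n:i + 1]) / n
            for i in range(len(series))]

fast, slow = sma(prices, 3), sma(prices, 5)

position, cash, shares = 0, 0.0, 0
for i in range(len(prices)):
    if fast[i] is None or slow[i] is None:
        continue  # not enough history yet
    if fast[i] > slow[i] and position == 0:    # fast above slow: buy 1 share
        cash -= prices[i]; shares = 1; position = 1
    elif fast[i] < slow[i] and position == 1:  # fast below slow: sell
        cash += prices[i]; shares = 0; position = 0

pnl = cash + shares * prices[-1]  # mark any open position to the last price
print(round(pnl, 2))
```

A serious back-test adds transaction costs, slippage, out-of-sample validation and regime-aware data selection, which is precisely why the deep, clean historical data discussed above matters so much.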

We are witnessing a new normal defined by diminishing volumes, wild rallies and uncertain regulatory policy. This does not signal an end to profitability and the discovery of alpha hidden within the depths of our markets – on the contrary, this is where it begins.