Quants: (Big) Data Whisperers

Tue, 10 Feb 2015 05:45:44 GMT

Quants are interesting people. Their presence spans both new and old analytics systems, architectures and cultures. Once hailed “masters of the universe”, quants apply mathematics to drive economic growth. They help create liquidity through derivatives, trade efficiently, manage portfolios systematically and assist the financial sector in risk management. Two events have disrupted their world.

First, there was the Global Financial Crisis. A heady mix of toxic derivatives, flash crashes and “Flash Boys”, inflated pay-checks and market manipulation ruined reputations. Science and humanity collided. Quants, I argue, were part of the system, not its creators. Sure, they applied their mathematical prowess, too secretively perhaps, but they worked within a system designed by others, siloed across profit centres rather than collaborating for the greater good. They battled entrenched legacy IT and bureaucracy on complex (and risky) software development programs, particularly on the sell-side. Their managers, perhaps, also saw profit rather than algorithms, insight and technology.

However, Quants continue to be respected, appreciated and in demand, because they have broad skills sharpened by experience. They are well placed to navigate the second disruptive event: technological change fuelled by “big data” and “data science”, worlds of machine learning, Hadoop, MapReduce, NoSQL and “business insight”.

37% consider risk management to be the main driver for their model development, while 27% note profit potential and 15% reference regulation. Risk management pursued on its own account, rather than enforced by regulatory compliance, is a positive sign.

31% report their institution has implemented a project in response to a specific big data challenge, while “data quality” (36%) and “creating effective models” (36%) are cited as the greatest big data challenges in financial services. Big data is still a minority discipline in the industry.

12% report using machine learning methods while 40% want to learn more about them. 22% and 23% worry about over-fitting and “black box” concerns respectively.
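The over-fitting worry is well founded. As an illustrative sketch (not from the survey, and using hypothetical toy data), a flexible model such as 1-nearest-neighbour regression can fit training data perfectly by memorising noise, while a smoother variant generalises better; the gap between training and test error is the tell-tale sign:

```python
import random

def make_data(n, rng):
    # Noisy samples from the true curve y = x^2.
    xs = [rng.uniform(-1.0, 1.0) for _ in range(n)]
    ys = [x * x + rng.gauss(0.0, 0.3) for x in xs]
    return xs, ys

def knn_predict(train_x, train_y, x, k):
    # Average the targets of the k nearest training points.
    nearest = sorted(range(len(train_x)), key=lambda i: abs(train_x[i] - x))[:k]
    return sum(train_y[i] for i in nearest) / k

def mse(train_x, train_y, test_x, test_y, k):
    errs = [(knn_predict(train_x, train_y, x, k) - y) ** 2
            for x, y in zip(test_x, test_y)]
    return sum(errs) / len(errs)

rng = random.Random(0)
train_x, train_y = make_data(50, rng)
test_x, test_y = make_data(200, rng)

# k=1 memorises the training set: each point's nearest neighbour is itself,
# so training error collapses to zero while noise is baked into the model.
train_err_k1 = mse(train_x, train_y, train_x, train_y, k=1)
test_err_k1 = mse(train_x, train_y, test_x, test_y, k=1)
# k=10 averages away some noise and typically generalises better.
test_err_k10 = mse(train_x, train_y, test_x, test_y, k=10)
```

A zero (or near-zero) training error alongside a much larger test error is the classic over-fitting signature the respondents are wary of.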

68% identify a “lack of agility to respond to market changes” as the biggest opportunity cost of slow model development.

Over 60% reported their institution encourages collaboration between model developers and business practitioners. This is positive, particularly after the Global Financial Crisis.

Remember too that Quants saw the rise of SQL, statistics, Monte-Carlo simulation and high-performance computing. “Revolution” is not new. New Quant kids on the block thus combine fresh ideas with mature discipline. Paraphrasing a leading (youngish) Quant I saw on a heated panel debate last year: “we have the right skills: we understand algorithms, we program and we have much to be proud of”. I think he’s right. If we are beyond the big data hype cycle peak, Quants can add clear-sighted focus to the vibrant anarchy of big data and data science.
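Monte-Carlo simulation, one of those long-mastered quant tools, can be sketched in a few lines. The following is a minimal illustration (function name and parameters are my own, not from any quoted source) that prices a European call under the standard geometric Brownian motion assumption by averaging discounted payoffs over simulated terminal prices:

```python
import math
import random

def mc_european_call(s0, k, r, sigma, t, n_paths, seed=42):
    """Monte-Carlo price of a European call under geometric Brownian motion."""
    rng = random.Random(seed)
    drift = (r - 0.5 * sigma ** 2) * t   # risk-neutral drift term
    vol = sigma * math.sqrt(t)           # volatility scaled to horizon
    payoff_sum = 0.0
    for _ in range(n_paths):
        z = rng.gauss(0.0, 1.0)          # standard normal draw
        st = s0 * math.exp(drift + vol * z)  # simulated terminal price
        payoff_sum += max(st - k, 0.0)   # call payoff at maturity
    return math.exp(-r * t) * payoff_sum / n_paths  # discounted average

price = mc_european_call(s0=100, k=100, r=0.05, sigma=0.2, t=1.0,
                         n_paths=100_000)
```

With enough paths the estimate converges towards the Black-Scholes closed-form value; the same scaffold extends naturally to path-dependent payoffs where no closed form exists, which is why the technique remains a quant staple.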

Second, good modelling is not just about the methodology, it’s also about good implementation. A brilliant analytic implemented badly is misplaced dynamite. Society, consumers, regulators, even executives and managers demand great analytics, agility and quality.

Global banks have struggled for years to maintain risky time-consuming “over the wall” development involving multiple languages and platforms. Smaller institutions have often compromised quality for speed. Most survey respondents seek a “single software environment”, taking model insights into production quickly, but with appropriate testing, maintenance, scale and transparency, in part to reduce model risk.

Other industries, such as automotive, aerospace and telcos, show this is a realisable ambition. Financial services has learned the hard way, and Quants are modelling the way. Data scientists, take note of the Quant (big) data whisperers.