What did you say you were doing?

Our global survey of algorithmic and high-frequency trading, conducted online through April and May 2010, has given us a fascinating insight into industry trends and best practice. Here, Bob Giffords presents his detailed analysis of the data.

[Charts: Geographical Region; Type of Firm; Assets Under Management; Job Roles; HFT as Percentage of Algorithmic Total]

High frequency trading is highly competitive, so firms are often
loath to reveal much about their strategies and technical
architectures for fear of eroding their advantage. Only a few
hundred firms are said to be serious players, although rather
more are now using it as a niche overlay strategy to diversify
sources of alpha. Large brokers and market makers have been
forced to use high frequency strategies due to latency arbitrage,
market fragmentation and the sheer aggregation of algorithmically
driven flow volumes. Thus, indirectly, most traders now make use of
high frequency strategies for a growing proportion of that flow,
although they may only have a vague awareness of the details
involved.

Technology advantages come and go, as new and faster routes are
engineered, technology firms leap-frog each other, and new types
of participants enter the market. Many of these are smaller, more
agile firms as technology and outsourcing have drastically
reduced the cost of entry. They have democratised the markets, as
some have described it. So secret sauces and constant innovation
are crucially important.

Given the hush-hush nature of the beast, evidence on the ground
about high frequency trading practice is fairly thin and
anecdotal. To remedy that, Automated Trader organised a global
survey through their website during April and May 2010, assuring
participants of full anonymity in the results and allowing them
early access to survey conclusions in exchange for their
participation.

The survey set out to gain a comprehensive picture of the speed
and extent to which algorithmic and high frequency trading is
evolving amongst different types of market participant across
different asset classes, together with some indications on the
challenges respondents face as they seek to find and keep their
edge. What emerges is a vibrant picture of a growing and
confident community of traders transforming global markets, yet
there's a warning as well that such speed implies risks that will
need to be addressed. The survey attracted reasonably complete
responses from 171 global participants from a wide range of sell
and buy side firms, augmented with the input of a number of
technology and market infrastructure companies.

Survey participants were self-selecting, freely responding or not
to the web survey. E-mail invitations were sent to the full
subscriber list of Automated Trader and to the 1,100+ people who
registered for two webinars on high frequency trading that took
place during the survey. Given the detailed and competitive
nature of the data, participants in the survey were free to skip
any questions for which they either did not know the answers or
preferred not to comment. Therefore, unless otherwise stated, all
distributions are compared to the total number of responses to
the question, rather than the whole of the relevant survey
population. Any conclusions need to bear these caveats in mind,
and any quantitative results must be deemed at best indicative.
However, they appear remarkably consistent with the picture of
high frequency trading that has been building up over recent
months, and so give a fascinating new perspective on this 'high
tech' market space.

Individually the participating organisations reported liquidity
contributions ranging from one order per minute to over five
thousand orders per second across more than 100 market centres.
45% of traders estimated their cumulative high frequency flow
across all algos as less than 5 orders per second, while 18%
estimated their firms contributed over 500 orders per second to
the markets.

Only 40% of respondents revealed their assets under management. Of
these, 60% claimed they were less than $100m. However, this may
relate to the size of the individual funds using high frequency
strategies as much as to the firm's total mandates. Of buy side
firms, only 32% were in this smallest category, while nearly 6%
of respondents claimed to manage over $10 billion in assets. Of
firms providing balance sheet size, 18 participants, or over 10%
of total survey responses, claimed balance sheets over $5bn, and
most of those exceeded $500bn.

A wide range of people participated. Trading related roles
dominated with over 58% describing themselves as traders, quants
or other front office managers including the 10% who described
themselves as owners. Next came technology and other market
infrastructure roles with over 38% of responses, while the
remaining 4% came from the middle or back office including risk
specialists.

61% of respondents described themselves as using high frequency
strategies for less than a third of their overall trading
activity. This probably ignores some discretionary trading that
may use high frequency broker algos on the way to market. Over
27% indicated they were heavy users, with such high frequency
strategies dominating two-thirds or more of their flow, while the
remainder, around 12%, were in between. This is probably a good
reflection of the overall market distribution given the growing
interest in high frequency as an overlay strategy.

Greatest usage appeared, perhaps predictably, in hedge funds, with
heavy high frequency usage rising to 45% of firms and nearly as
much among participating day traders. Meanwhile, sell side firms
reported using high frequency more as a niche activity.
Predictably North American traders used the most high frequency
strategies with Europe not far behind, while Asia Pacific and the
rest of the world tended to use these strategies less
intensively.

There were no real surprises on the current asset class
distribution. Three quarters of responses involved equities, a
little over half used equity derivatives, while well over a third
traded currency products. Commodities and their derivatives made
up nearly 29% followed by fixed income cash and derivatives (19%
and 25% respectively). Buy side firms tended to mention more
asset classes than sell side or other firms, but that may just
reflect a less siloed approach to trading and a more horizontal
view of what is happening. The North American responses included
overall much more emphasis on fixed income and commodities than
seen from other regions, although on the buy side Asia Pacific
traders actually reported the widest current use of equity
derivatives and commodities, for example.

Over the next couple of years, trading in non-equity instruments
is expected to grow strongly, with multiple asset classes
catching up with equities in some regions, while fixed income is
expected to lag a bit behind. The strongest growth is foreseen in
buy side trading of FX and commodity instruments in Asia
Pacific and North America with two thirds of firms expecting soon
to trade them, but equity derivatives also are forecast to catch
up with equities in most jurisdictions.

The survey reflects strong expectations of growth reported by
over 80% of participants. Buy side firms were the most bullish
with over half expecting their high frequency flow to grow next
year by over 25%. The strongest growth is forecast in the EU and
Asia Pacific, while in North America growth appeared a bit
weaker, perhaps due to the higher base. Indeed, around 10% of
responses there actually forecast a decline next year.

[Charts: Asset Classes; Predicted HFT Growth Over Two Years (all respondents)]

Typical Strategies

Participants were asked to describe one of their 'typical' high
frequency trades to illustrate the range of practice. 35% chose a
market making algo, 37% a systematic trading algo, and 24% a smart
order routing and slicing algo, while a few described a dynamic
hedging algo. This suggests that the survey was at least
representative of a wide range of experience and usage.

[Chart: Algo Purpose]

The compute platform for these typical trades was usually
described as having between 2 and 16 cores, ignoring data feed
handlers, with a majority of participants using not more than
four. Around a quarter of day traders used a single core, while a
few investment banks and others said they used over 64 cores
spread across a number of servers. Larger configurations tended
to be in North America. As the number of cores on a chip
continues to grow, this suggests technology costs for high
frequency trading should similarly continue to decline, further
reducing the barriers to entry.

45% of typical trades involved only one or two markets. A further
45% apparently managed up to 10 liquidity centres, while only
around 10% managed more than 10 centres.

Systematic algos tended to be somewhat slower, with around half
generating only up to 1 order in the peak second. Market making
algos were much faster with nearly two-thirds peaking at more
than 50 orders per second. Smart order routing and slicing algos
were described right across the order generation spectrum with
around half claiming to peak below 5 orders per second and half
above. Based on the small sample of dynamic hedging algos they
appeared typically to peak below 1 order per second, but at least
one claimed to generate over 500 per second. All regions provided
examples of all frequencies, although North America and Asia
Pacific both tended to describe algos somewhat faster than the
European examples.

The algos varied hugely in terms of the number of passive orders
a single algo might manage concurrently and the expected ratio of
orders per trade. Market making algos tended to manage more
passive, executable orders with higher ratios of orders per
trade. 37% were managing more than 100 exposed orders and 12%
were managing more than 1000, peaking at over 10,000. 50% of
these market-making algos issued more than 20 orders per trade.
In comparison only 23% of the systematic algos managed more than
100 exposed orders and only around 5% managed over 1000. They
also tended to have higher hit ratios, with 70% issuing fewer than
10 orders per trade. Smart order routing and hedging algos were
again spread broadly over the spectrum, with some managing as few
as one exposed order while others claimed to manage over 10,000.
Both tended to have higher hit ratios, the vast majority (90 to
100%) issuing fewer than 20 orders per trade.

[Chart: Processing Resource]

Alpha horizons also varied with market making algos typically
working in milliseconds to minutes, while systematic algos might
more typically hold assets for minutes to hours and sometimes for
weeks or months. Smart order routing algorithms were of course
not typically concerned with alpha horizons, although some people
suggested a wide range of alpha horizons were used.

System Architecture

Technical architectures are clearly advancing rapidly. Just over
half of all responses already had distributed their trading
servers, while the rest still traded from a single location.
Across all regions a single trading centre typically appeared to
mean the firm's proprietary data centre, but for high frequency
work it might sometimes be colocated at the exchange or in a
multi-tenanted proximity hosting centre. Where distributed
trading servers were deployed, there was a small preference for
exchange colocation, but proximity data centres and sometimes
even proprietary data centres were also used. Around one third of
these distributed trading servers were actively collaborating in
real time with each other, while the rest were operating
independently. Asian traders had more distributed servers working
independently, which is typical for a region with long inter-market
latencies. In Europe and North America, with their more clustered
markets, the more typical architecture was still to have trading
servers in one location, although more distributed architectures
were also described.

Where multiple trading engines were used, nearly half of
participants said they dynamically positioned individual orders,
while a further 40% positioned orders by a fixed rule typically
with a manual trader override. Less than 10% claimed there was no
real-time decision at all regarding algo positioning. Somewhat
surprisingly, perhaps, Asia Pacific appeared to have relatively
the most dynamic positioning, Europe and the rest of the world
had the least, while North American responses were evenly split.
The buy side also appeared to use more dynamic positioning than
the sell side.

[Chart: Co-Location]

Market access was provided by in-house algos and broker DMA in over half of
cases, although for North American buy side firms there seemed a
preference for sponsored access. Direct exchange memberships were
also used in nearly half of cases across all types of trader. On
the buy side direct memberships were somewhat more prevalent in
Europe than in North America. Third party broker algos or
sponsored access were generally said to be the least common
methods of high frequency access across all regions. That may
reflect more on the particular sample captured by the survey,
since many long-only buy side firms that use broker algos might
not have responded to the survey or, if they did, may not have
considered themselves to be using high frequency techniques for
this flow even though they were.

The range of market data being consumed by high frequency algos
is also diversifying rapidly. The survey highlighted two
important trends: increasing use of full depth and cross-asset
pricing data along with a range of new, more exotic data feeds.

Traditional market data includes top of book and full depth
pricing data, plus post trade feeds. Of those responding to
market data questions, 63% said they used cross asset data in
their high frequency algos. This was fairly consistent across
regions, ranging from 52% in Asia Pacific to 65% in Europe and
71% in North America. For full depth pricing data the responses
were somewhat higher and clustered tightly around 70% in all
regions with only the rest of the world 'region' dropping to 60%.
However, in only 30% of cases did participants report use of post
trade data in their high frequency algos.

[Chart: Dynamic Positioning]

The use of exotic or non-traditional data feeds is clearly
another growing trend. 43% of responses on market data indicated
some use of technical (32%), newsflow (21%) or liquidity
indicator (12%) data feeds. Technical signals included latency
measurements (22%), or queue lengths and similar metrics (19%).
Liquidity measures focused on IOIs or similar indicators, while
newsflow signals were either event or machine readable newsfeed
data (15%) or else sentiment or other news metrics (10%).

Exotic data feeds in Asia Pacific tended to be used more heavily
on the buy side, while in North America usage was much more on the
sell side, and in Europe it was split fairly evenly between buy and
sell side. Note also that no one reported using news sentiment in Asia
Pacific, while otherwise the exotic feeds were used generally
across all the regions.

Most participants said they developed their high frequency algos
in-house, while one-third used a third-party modelling framework
for this. When it came to production, just over half used a
third-party OMS/EMS framework, although a few had customized
this heavily. European traders were more inclined to use an
internally developed run-time environment (59%), whereas traders
used third party products for this in North America (58%) and
especially Asia Pacific (63%).

Risk and Performance Management

Overall the most important challenge was said to be dealing with
data volumes, mentioned in half of responses (50%), with a
fraction less mentioning the achievement of low latency (over
47%). However, in Asia Pacific these priorities were reversed.
The buy side was also a bit more focused on latency than the sell
side. Controlling DMA and sponsored access was the third greatest
concern (just under 47%), followed by managing real-time risks
more generally (39%). No one claimed that finding alpha was a
challenge!
However, resolving errors and technology failures was a worry in
around a quarter of cases.

Two thirds of high frequency traders used dynamic hedging to
manage risk, with around half of those always hedging. Buy side
traders were more likely to hedge than the sell side. Not
surprisingly, those who typically do market making were much more
likely to hedge than those doing systematic trading or smart
order routing.
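Mechanically, dynamic hedging of this sort amounts to a rebalancing loop that offsets the net exposure a strategy accumulates. The survey does not describe any specific implementation; the sketch below is purely illustrative, with the hedge ratio and tolerance band as assumed parameters:

```python
def hedge_order(net_delta, hedge_ratio=1.0, band=100):
    """Return a (side, qty) hedge order to offset net exposure, or None
    while the exposure stays inside the tolerated band.

    net_delta: signed exposure in units of the hedge instrument
    hedge_ratio, band: illustrative parameters, not survey findings
    """
    target = -net_delta * hedge_ratio  # quantity needed to flatten
    if abs(target) < band:
        return None  # inside the band: do nothing, avoid over-trading
    side = "BUY" if target > 0 else "SELL"
    return side, round(abs(target))
```

A market making engine would call something like this after every fill; "always hedging", in the survey's terms, corresponds to a band of zero.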

[Chart: Data Types]

Where traders typically used long-short market neutral strategies
or otherwise claimed to typically hedge their high frequency
flow, they also claimed, not unreasonably perhaps, that their
strategies tended to consume less capital than their low
frequency activity. So risk strategies appear to be tightly
linked to the particular trading style. While over half of
traders claimed they might equally apply high frequency
strategies to long only, short or market neutral trading styles,
many confirmed that they either applied the same risk strategies
to both high and low frequency flow or else that the risk
strategy depended on the specific trading strategy rather than
just the speed of trading operations.

In any case 87% of high frequency traders still said they managed
risks in real-time. The numbers were a little higher for the buy
side, but rather lower (70%) for the sell side. Those whose
typical high frequency strategy involved market making or
systematic trading also did rather more real-time risk
management, while those doing smart order routing and transaction
hedging did somewhat less. Perhaps if the trading algos are doing
their own dynamic hedging, separate real time risk management is
less critical. This in any case might help to explain the lower
sell side figures, which otherwise seem a little puzzling and
perhaps accidental.

[Chart: Top Business Challenges]

Just under half (46%) said they held capital only against their
executed real-time position, while over 30% held capital against
both their executed and potential exposed position, continuously
assessed in real-time. The remainder just calculated their risks
against executed positions periodically or at end of day. Asia
Pacific and Europe had the highest rates of real-time risk
management for market making (88% and 82% respectively), while
systematic traders in those regions had relatively lower rates (56%
and 67% respectively). North America had somewhat lower rates for
market makers (75%), but higher for systematic traders (82%).
However, since we are talking about rather small sample sets for
each sub-category, the broad trends are probably more significant
than the precise numbers.
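The difference between holding capital against the executed position alone and against the executed plus potential exposed position can be made concrete. A minimal sketch, assuming a flat notional capital basis (the function names and order representation are illustrative, not from the survey):

```python
def executed_exposure(position_qty, price):
    """Capital basis for the filled position alone."""
    return abs(position_qty) * price

def potential_exposure(position_qty, open_orders, price):
    """Worst-case basis if all resting orders on one side filled at once.

    open_orders: list of (side, qty), with side +1 for buy, -1 for sell.
    """
    buys = sum(qty for side, qty in open_orders if side > 0)
    sells = sum(qty for side, qty in open_orders if side < 0)
    # The position could grow to qty + buys (long) or qty - sells (short);
    # hold capital against whichever extreme is larger.
    worst = max(abs(position_qty + buys), abs(position_qty - sells))
    return worst * price
```

For example, a trader long 500 shares at $10 with 300 resting to buy and 1,000 resting to sell has an executed basis of $5,000 but a potential basis of $8,000; assessing only the former in real time understates the exposure the 30% group is capitalising.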

To measure performance the participants mainly tended to use
Sharpe Ratios (50%), net absolute returns (45%), maximum drawdown
limits (39%) or risk weighted rates of return (34%). Only around
a quarter said they used value-at-risk measures for their high
frequency trading with rather a greater emphasis in Europe than
elsewhere. The Sortino method, which is similar to Sharpe but
aims to differentiate good (upside) from bad (downside)
volatility, was used in 23% of cases overall and more in the US
on the sell side than anywhere else. A few traders used
time-weighted rates of return (20%), index comparisons versus
published indices (14%) or comparisons versus competitors where
fund classification benchmarks were available (8%). One firm used
return versus the number of messages sent, another arbitrage firm
compared the required funding rate to a hurdle rate, while a
third used a variety of unspecified in-house measures. No other
specific performance measurement techniques were mentioned.
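For reference, the three most-cited measures can be sketched in a few lines. This is an unannualised, per-period version; the return series and any risk-free or target rate are inputs the trader would supply:

```python
import math

def sharpe(returns, rf=0.0):
    """Mean excess return over the standard deviation of excess returns."""
    excess = [r - rf for r in returns]
    mean = sum(excess) / len(excess)
    var = sum((x - mean) ** 2 for x in excess) / len(excess)
    return mean / math.sqrt(var)

def sortino(returns, target=0.0):
    """Like Sharpe, but penalises only downside deviation below the target."""
    mean = sum(returns) / len(returns) - target
    downside = sum(min(0.0, r - target) ** 2 for r in returns) / len(returns)
    return mean / math.sqrt(downside)

def max_drawdown(equity):
    """Largest peak-to-trough fall as a fraction of the running peak."""
    peak, worst = equity[0], 0.0
    for value in equity:
        peak = max(peak, value)
        worst = max(worst, (peak - value) / peak)
    return worst
```

Note that a return series with no observations below the target makes Sortino's denominator zero, so production code would need a guard there.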

When asked to specify where regulators might usefully intervene
in the market, a majority actually called for minimum levels of
risk management for trading and clearing members of exchanges or
other electronic trading platforms and also for minimum capital
requirements for intraday exposures under various stress
scenarios. Clearly they felt that would level the playing field.
Flash orders were also seen to be ripe for regulatory
intervention by 40%. So-called naked sponsored access, more
prevalent in the US than in Europe, was also considered
potentially in need of intervention. 28% felt that co-location
and equal access rights for high-speed traders might be useful.
However only 12% thought that short selling uptick rules were
needed. Note that the SEC had only recently brought in a new rule
here to deal with rapidly falling markets. So the view on short
selling might have been influenced by the current proposals. In
any case, the community does seem to be open to discussion on
common rules on quite a range of issues.

The Way Ahead

Participants in the Automated Trader survey were a self-selected
community of people using these techniques, so it is not
surprising that advanced methods appear with some regularity.
However, since high frequency flow is starting to dominate many
markets, the community's modest absolute size does not detract
from its importance.

It is clear that the leading edge of this group is comfortable at
speeds above 500 or even 5000 orders a second, maintaining
thousands of open positions across quite a wide range of markets
distributed across a growing number of trading servers and cores,
co-located at exchanges or in proximity hosting centres. The
number of firms already using distributed yet collaborating
trading engines across multiple asset classes and large numbers
of markets is perhaps the most dramatic finding. While larger
firms often achieved higher scale operations, the survey also
showed smaller firms to be at the forefront of innovation and
also speed. The competitive barriers are well and truly coming
down.

Therefore, best practice has to involve a growing mix of data
feeds across asset classes, including news, technical and
liquidity data. It probably also needs to demonstrate ultra
real-time performance and risk management, including automated
hedging, with capital held against both exposed positions and
actually executed trades.

Given the projected growth for existing players and the
inevitable attraction for new players with falling technology
costs and ever more service offerings, market data rates are
likely to continue to rise exponentially especially once cross
asset trading and news flow really kick in. As competition
intensifies, the challenges will thus grow quickly even for the
fleet of foot or deep of pocket.

During the recent flash crash on 6 May in the US markets, we saw
what can happen when high frequency flow either spikes, betting
on reversion, or simultaneously withdraws, avoiding the
maelstrom. We can confidently expect many more exciting roller
coaster rides to come. Regulators should take note of the current
openness of the community to sensible rule making, before future
crashes bring more painful but predictable surprises. More
importantly regulators need to understand where the market is
heading as well as whence it has come.

Yet the survey clearly illustrates the many risks lurking both
for the unwary, slower trader and for the high frequency players
themselves. High frequency flow provides liquidity, cross-asset
predictability and dramatic economies of scale for the investor
at the cost of embedded leverage on the tail risks. Thousands of
messages per second and tens of thousands of exposed open orders
per robotrader assume liquidity will persevere or an orderly
disengagement can be managed. When sentiments darken and that
liquidity begins to melt away, the cherished predictability can
suddenly disappear. Without an appropriate market infrastructure
the regime can break down in seconds.

Market organizers and regulators must now make the markets safe
for high frequency. Indeed this survey should give food for
thought for all market participants. The skills demanded are
obviously ramping up with grim determination. Change continues to
accelerate everywhere. That, perhaps more than just the speed of
today's traders, is the most important conclusion for us to draw
from this survey. By lifting the veil on all this highly
secretive innovation, the Automated Trader 2010 survey on high
frequency trading will undoubtedly spur on some to invest,
encourage others to rethink their strategies and hopefully call
market organizers and regulators to action stations. Markets are
evolving very quickly indeed.