
Data Channel: The Defining Moment

Vicki Chan

02 April 2012

When I first joined Inside Market Data, the top item on my to-do list was to understand just what, exactly, is market data. I'll admit that the first thing I did was to Google it, and did you know there's a Wikipedia page for market data?

But nothing helped more than actually speaking with people in the industry, from exchanges and vendors supplying market data to the end-users consuming that data and the technology providers supporting its delivery. After all, these are the people working with market data on a daily basis.

Maybe the US Commodity Futures Trading Commission had a similar idea in mind when it enlisted Larry Tabb, founder and chief executive of research and consulting firm Tabb Group, for its Technology Advisory Committee's new subcommittee on automated and high-frequency trading, which is tasked with, among other things, creating a definition for high-frequency trading. And I suspect this will require more than Wikipedia (which, by the way, does have a page for high-frequency trading).

The current lack of consensus around what constitutes high-frequency trading poses a major challenge, but establishing a common foundation for everyone to work from is a crucial first step towards the creation of any rules around it. Surely, Symmetricom, a provider of time-synchronization systems, could relate to the importance of ensuring everyone (or rather, every application) is working from the same information. This week, the vendor is releasing a clock card to help sync time across all of a customer's servers to within 600 nanoseconds of global time.

Having synchronized time across an infrastructure also means latency monitoring systems can more precisely identify delays, and if there's one thing perhaps everyone can agree on, it's that speed is key to high-frequency trading.
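The link between clock synchronization and latency monitoring can be made concrete with a small sketch (the function name and timestamps below are illustrative, not from any vendor's product): once two servers' clocks agree to within a known tolerance, one-way latency is simply the receive timestamp minus the send timestamp, with the sync error bounding the measurement's uncertainty.

```python
# Sketch, assuming clocks synced to within a 600 ns tolerance
# (the bound cited for Symmetricom's clock card above).
SYNC_TOLERANCE_NS = 600

def one_way_latency_ns(send_ts_ns: int, recv_ts_ns: int) -> tuple[int, int]:
    """Return (measured latency, uncertainty) in nanoseconds.

    Only meaningful when the sending and receiving hosts' clocks
    are synchronized to within SYNC_TOLERANCE_NS of each other.
    """
    latency = recv_ts_ns - send_ts_ns
    return latency, SYNC_TOLERANCE_NS

# A hypothetical tick leaving one server and arriving at another:
lat, err = one_way_latency_ns(1_000_000_000, 1_000_012_500)
print(f"{lat} ns +/- {err} ns")  # 12500 ns +/- 600 ns
```

Without synchronization, only round-trip times measured on a single clock are trustworthy; it is the shared timebase that makes one-way, hop-by-hop delay attribution possible.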

Fixnetix and NovaSparks are delivering on that speed with an alliance that offers the latter's sub-microsecond FPGA feed handlers as a managed service, opening up access for firms that may not have the resources or expertise to maintain such a deployment on their own. As Fixnetix's business development director Alasdair Moore notes, "While FPGA solutions generally deliver about a hundredth of the latency of software-based solutions, managing an FPGA device is very different from managing software. There's a whole new skill set involved."

Meanwhile, Tim King of Level 3 Communications in this issue's Open Platform examines how demand for low-latency network routes is extending beyond traders' home markets into "secondary" financial centers across Asia, Eastern Europe and the Middle East.

In a bid to capture some of this growing demand, Hong Kong Exchanges and Clearing last week announced the launch of its HKEx Orion program, a $380 million project to overhaul its technology with new market data distribution and market access services, connectivity networks and a new datacenter, all aimed at improving the exchange's overall performance. HKEx has already recruited vendors, including Fixnetix, IPC, KVH and MarketPrizm, to be part of its "hosting ecosystem" in the new facility, which will go live at the end of this year. The exchange will also leverage NYSE Technologies' Exchange Data Publisher (XDP) data distribution platform for its HKEx Orion Market Data Platform, scheduled for launch in the second quarter of 2013; XDP will give HKEx the ability to distribute more than 100,000 messages per second at microsecond latency.

So whatever definition the CFTC ultimately comes up with, I imagine it will need to leave room to account for continuing technological advances that keep shaving off latency, which then leads to the commoditization of what was once considered cutting edge. For some, low latency isn't fast enough anymore and they require ultra-low latency, but where is the line between low latency and ultra-low latency? And as high-frequency trading gets more advanced, and even mainstream, will we eventually see ultra-high-frequency trading?