On Wall Street, having access to large amounts of pricing data is a key building block for successful trading strategies. In-memory data models are able to collect, analyze and derive trends from the flood of information in the market. Companies then use these models to drive algorithmic programs that automatically conduct trades based on pre-defined criteria. These algos, in turn, can power any number of profit-generating activities, from finding arbitrage opportunities to speculating on emerging trends.

Speed Matters

Today, approximately 73 percent of the daily volume on U.S. equity markets is driven by high-frequency trading (HFT) platforms. Millions of algorithm-based offers (and trades) take place every single day. A few microseconds can make the difference between a profit and a missed opportunity.

But even the best data models won’t generate results unless data is processed in a timely manner. New data must be received, analyzed and acted upon in real-time. If data models and the algorithms that act upon them take too long to update, firms can’t execute in time.

If exchanges don’t perform predictably, the results can be disastrous. Likewise, financial firms must trust their systems to perform trades or send new data in a timely (and accurate) manner; otherwise they can incur heavy losses or, in some cases, fail outright.

Wall Street is Memory-Intensive

Over the years, programmers have built data models that process ever-larger in-memory data structures. The financial industry has had to optimize its systems to handle that growing data capacity, and sometimes the predominant programming language of choice, Java, has gotten in the way.

Java is the language and environment that powers the majority of programs and applications in the financial sector. Java-based apps must manage large in-memory data sets in real-time in order to enable business-critical transactions. But Java comes with built-in delays, sometimes simply referred to as “jitter”. Low-latency applications with large in-memory models can stall for a variety of reasons – garbage-collection pauses being the most notorious – and in some cases even a few milliseconds of delay can be disastrous.
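One common way to observe this jitter is to time a tight loop that also creates allocation pressure: on a typical JVM, the worst gap between iterations is dominated by collector pauses rather than the work itself. A minimal sketch of such a probe (the class and method names are illustrative, not from any particular trading system):

```java
public class JitterProbe {

    /**
     * Runs an allocation-heavy loop and returns the worst gap (in nanoseconds)
     * observed between consecutive iterations. The retained array keeps some
     * objects live so the garbage collector has real work to do.
     */
    static long worstGapNanos(int iterations) {
        long worst = 0;
        long prev = System.nanoTime();
        byte[][] retained = new byte[1024][];
        for (int i = 0; i < iterations; i++) {
            // Steady allocation pressure: each iteration creates garbage.
            retained[i % retained.length] = new byte[4096];
            long now = System.nanoTime();
            if (now - prev > worst) {
                worst = now - prev;
            }
            prev = now;
        }
        return worst;
    }

    public static void main(String[] args) {
        // The loop body itself takes well under a microsecond; gaps far above
        // that are stalls imposed by the runtime, not by the application code.
        System.out.println("worst gap (micros): " + worstGapNanos(200_000) / 1_000);
    }
}
```

Probes like this only show that stalls occur; pinpointing their cause still requires GC logs or a profiler.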

The Evolution of Java in the Financial Sector

Recently, our prospects and customers have been specifying more rigorous service-level agreements (SLAs), especially for maximum transaction latency. Predictable performance and low jitter, in other words, are becoming new standards. We see this every day as our customers describe how they are using a combination of aggressive JVM tuning, specialized low-latency development frameworks and alternative JVMs to deliver the predictable performance their applications require. Even Oracle is merging key features from JRockit into its HotSpot JVM, a process that will continue through Java SE 8 and beyond.
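As an illustration of what “aggressive JVM tuning” can look like in practice, a HotSpot launch line for a latency-sensitive service might fix the heap size so it never resizes and make collector stalls visible in the logs. The flags below are standard HotSpot options, but the sizes are placeholders and the jar name is hypothetical, not a recommendation for any particular system:

```shell
# Identical -Xms/-Xmx pins the heap size, avoiding pauses from heap resizing.
# -verbose:gc logs collector activity; -XX:+PrintGCApplicationStoppedTime
# reports how long application threads were actually stopped, which is the
# number an SLA on maximum transaction latency ultimately cares about.
java -Xms4g -Xmx4g \
     -verbose:gc \
     -XX:+PrintGCApplicationStoppedTime \
     -jar trading-engine.jar
```

Tuning of this kind can narrow the pause distribution, but it does not eliminate stop-the-world collections, which is why some firms turn to specialized frameworks and alternative JVMs.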

As the core Java platform evolves and developers continue to push the language and its runtime further, we can readily see a time when IT managers won't have to ask their teams, “When will Java jitter cause our system to miss an opportunity?” Instead, they can enjoy asking a new question: “How much money are we making?”

Scott Sellers provides strategic leadership and visionary direction as the CEO and co-founder of Azul Systems, which delivers high-performance and elastic Java Virtual Machines (JVMs) so enterprises can effectively scale while meeting business objectives.