I'm running the latest 64-bit OP version from the most recent download.

Anyhow, I'm trying to optimize this hardware for better backtest/simulation speeds. I'm testing on bid tick data for futures such as the SPX and ND100. Right now it's taking about 8-9 minutes to backtest a single year, which is about 27 million lines of data loaded from a data file.

During a backtest run, I'll check the Windows 7 hardware use monitor and it says that OP is consuming about 5-6 GB of RAM, but the core usage is only running at about 5-15% across each of the 6 cores, with single cores occasionally spiking slightly, but for no significant time. I can also monitor the temperature of the chip, and it does not increase a single degree, so it's clear to me that the chip is not being used anywhere near its total capacity.

Consequently, I'm looking for suggestions from the user group here as to how I could adjust the machine or software setup to make better use of what is one of the most capable consumer chips ever produced. Should I add more RAM? I'm running Windows 7 Professional; should this be changed? Should I be coding my actual script in a particular way to take more advantage of the hardware, and is this even possible when the C# code is running through OP?

I would like to be able to run 10-100x the number of data points I'm currently using, and would also like to significantly speed up the process.

This probably means that your processor is much faster than your HDD, so that I/O operations (i.e. reading data points from disk) take much more time than the strategy calculations.
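A quick way to sanity-check this hypothesis outside of OP is to time a raw sequential read of the tick file and compare it with the total backtest time: if the read alone accounts for most of the 8-9 minutes, the disk is the bottleneck; if not, the time is going elsewhere. A minimal standalone Python sketch (the demo uses a throwaway temp file; point it at your real data file instead):

```python
import os
import tempfile
import time

def sequential_read_speed(path, chunk_size=8 * 1024 * 1024):
    """Time a raw sequential read of `path`; return (bytes read, seconds)."""
    start = time.perf_counter()
    total = 0
    with open(path, "rb") as f:
        while True:
            chunk = f.read(chunk_size)
            if not chunk:
                break
            total += len(chunk)
    return total, time.perf_counter() - start

# Demo on a small throwaway file; substitute the path to your tick data.
with tempfile.NamedTemporaryFile(delete=False) as tmp:
    tmp.write(b"\x00" * (16 * 1024 * 1024))  # 16 MB of dummy data
    tmp_path = tmp.name

nbytes, secs = sequential_read_speed(tmp_path)
os.remove(tmp_path)
print(f"Read {nbytes / 1e6:.0f} MB in {secs:.3f} s")
```

As a rough yardstick for the era under discussion, a 7200 rpm HDD sustains on the order of 100 MB/s sequentially, while an SSD is typically a few times faster; if your measured rate times the file size comes nowhere near 8-9 minutes, disk I/O is probably not the bottleneck.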

We are going to add a performance monitor window in OQ that will show how many events per second your strategy can process, as well as memory usage and other statistics. This should help you (and us) profile OQ/hardware.

When data is loaded into the OpenQuant software, are that data file's contents transferred and stored on the same disk among the other program files, or does OpenQuant read from the original data file in its original location every time a backtest is initiated?

OpenQuant uses a custom high-performance streaming binary database for market data storage and replay (called QuantServer). It's optimized for backtesting and can stream roughly a million or more ticks per second, but of course it's slower than in-memory data processing.
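That throughput figure also gives a back-of-the-envelope check on the original poster's 8-9 minute runs: at roughly a million ticks per second of raw replay, 27 million ticks should stream in well under a minute, so most of the wall-clock time is presumably going to strategy logic, event dispatch, or memory pressure rather than raw data replay. The arithmetic (using the rough figures quoted in this thread):

```python
ticks = 27_000_000       # ~one year of bid ticks, per the original post
replay_rate = 1_000_000  # ticks/sec, the rough QuantServer figure quoted above
observed = 8.5 * 60      # seconds; midpoint of the reported 8-9 minute backtest

io_seconds = ticks / replay_rate
print(f"Pure replay time: ~{io_seconds:.0f} s")
print(f"Observed backtest: ~{observed:.0f} s")
print(f"Share attributable to raw replay: ~{io_seconds / observed:.0%}")
```

On these numbers, raw replay would account for only around 5% of the observed run time, which is consistent with the CPU sitting mostly idle during the backtest.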

I've got a related question. I'm planning to buy a workstation (something in the $1200-1800 range) to use mainly for backtesting and autotrading, and I'm curious where I get the most bang for the buck.

As background, I currently work with daily bars, but am beginning to test tick-level strategies as well (quasi market making and pairs trading) on a sizeable universe (ideally 300+ equities) via the IB TWS API. My internet connection is ordinary home broadband, so I'm not winning any speed battles with the colocated folks, but I would like to make sure my hardware doesn't bog down. I'd also like to be able to crunch backtests as efficiently as possible.

Q#1: Based on my profile, am I wasting money here?
Q#2: Is there a ceiling to how much RAM the application can use in Win7 64-bit? Is it RAM sensitive? Is 12 GB sufficient/excessive?
Q#3: For processor speed, any general comments on number of cores, Xeon vs. i5/i7? Phenom? How sensitive is it to CPU?
Q#4: For HDD, the earlier poster had an SSD and sounded like he may be bottlenecked by I/O. Do I need to invest heavily here for good backtest speed? Is 10K vs. 7200 rpm important, or is it SSD vs. spinning disk that matters?
Q#5: If I can run the system in "XP Mode", is that faster/more stable? I've seen some comments on other message boards to that effect.
Q#6: Anything else in a config that matters?

OpenQuant uses a custom high-performance streaming binary database for market data storage and replay (called QuantServer). It's optimized for backtesting and can stream roughly a million or more ticks per second, but of course it's slower than in-memory data processing.

Regards,
Anton

Anton, has the market data storage been replaced by SQL Compact or something?

That's what I thought, but I was confused to see .sdf files in C:\Users\mylogin\AppData\Roaming\SmartQuant Ltd\OpenQuant\Framework\data

They're actually related to execution, instrument definitions and portfolio, not market data. It seems that market data is stored in the .data file in the same folder.

Is there an API I could use to programmatically extract data from the .data file if required at some point? Not urgent, just curious.

Take a look at DataManager.GetXXX in OpenQuant API. There is also a low level API but it's not accessible in OQ.

Thanks. Two choices then, if one wants to write standalone applications (outside of OQ) that access OQ's data:
- have the VS2010 project reference DataManager
- find which low-level API you're using and reference it directly
