Using the absolute minimum of one frame rendered for each test (unless the benchmark accepts a partially rendered frame and still gives a valid score, in which case all my calculations are invalid).

Dividing that by the length of the test for each (the lengths, as observed from the on-screen timer, are as noted).

I believe (clarification needed) that the score is rounded up to the nearest whole number, which would be two.
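The back-of-envelope estimate above can be sketched in code. This is a hypothetical illustration only: the per-test lengths below are placeholders (the post says to read them off the on-screen timer), and the ceiling rounding is the poster's guess, not confirmed 3DMark03 behaviour.

```python
import math

# Hypothetical per-test lengths in seconds; placeholders, NOT official
# 3DMark03 timings.
test_lengths = {"GT1": 40.0, "GT2": 55.0, "GT3": 60.0, "GT4": 30.0}

# One rendered frame over the whole test gives the minimum possible FPS.
for test, length in test_lengths.items():
    print(test, 1.0 / length)

# If the final score were rounded UP to a whole number (the post's guess),
# even a tiny positive total would land on at least 1.
total = sum(1.0 / t for t in test_lengths.values())
print(math.ceil(total))
```

With these made-up lengths the four minimum FPS values sum to well under 1, so a ceiling would still yield a nonzero whole number.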


That could be true if you are "lucky" enough to get 1 FPS per test and if the ORB shows it in the end. I got 2 frames in the first test and the ORB didn't show any result, as if it hadn't run at all.


Actually, I was referring to a total of 1 frame, not 1 FPS.

And yes, the benchmark is flawed: several times I have personally observed a test render two to four frames and still give an N/A. It may or may not be possible to get a valid score with only one frame rendered.

I believe it bears repeating; there is as much luck involved in obtaining a successful result as there is skill.

I'm able to nearly control the benchmark frame by frame, and I can tell you that 1 frame (not FPS) will always result in N/A.
With two to three frames it's a bit tricky: sometimes I get N/A, sometimes a valid value.
The problem is getting that in all four GTs in the same run.

How many decimal places does the benchmark calculate to? Or is it possible to view the .3dr file in another editor to see more accurate values?


Regards, Chri


First of all, open the results in Excel; you'll see detailed numbers up to 10 decimal places.


There is NO luck involved, simply conditions for each test to give a result.
And there is no rounding after the sum of the 4 parts; the rounding happens before the sum, which gives a very different result.
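The order of rounding matters numerically. Here is a minimal sketch of the point: the weights and FPS values below are made-up examples, not the official 3DMark03 multipliers, and only illustrate how rounding each part before summing can differ from rounding once at the end.

```python
# Per-test weights and FPS values are made-up examples, NOT the official
# 3DMark03 numbers; they only illustrate the order-of-rounding effect.
weights = [7.3, 37.0, 47.1, 38.7]
fps = [0.03, 0.01, 0.01, 0.01]

contributions = [w * f for w, f in zip(weights, fps)]

# Round once, after summing all four parts:
round_after_sum = round(sum(contributions))

# Round each part first, then sum them:
round_before_sum = sum(round(c) for c in contributions)

print(round_after_sum, round_before_sum)
```

With these numbers the four contributions sum to about 1.45, so rounding after the sum gives 1, while rounding each sub-1/2 contribution first gives 0.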

Benched @ 5.5k. I downclocked it a bit and now the sound test won't work, but sound works fine on YouTube and such. The old card this IBM came with was only DX7-ready, lol.
It's clocked @ 1.7GHz atm; I'll downclock it and test it again when I have the time.

Well, if you still don't understand how you get your scores, let's start at the very beginning.

Run 3DMark03 with command-line parameters:
3dmark03.exe -verbose
You'll get a report about the run. Open it; it will tell you how long the actual tests were (not that it matters at all, and just forget the on-screen timer during the tests), and it will tell you some more.

The numbers in the report are the same as the numbers in Excel; just use the math explained before and you'll end up with the proper numbers.


PS: RTFM (Read The F*cking Manual)


Thank you for trying to help me; I am sure this is all very frustrating.

Unfortunately, I only have the free version, for which command-line options are unavailable.

Well, you could save yourself a lot of trouble if you didn't just run the tests and relax, but instead took the effort to read this topic, the manual, and the whitepaper. It's a big disadvantage to slack on those while also only using the free edition.

Of course, if you still have questions after you've done the "homework", say so; someone will be able to help.


The graphics driver settings should be set for maximum quality, since the mipmap bias can otherwise be downgraded and other similar tweaks can produce less than the desired image quality.
If a separate slider is available for texture quality or mipmap settings, it should be set to maximum quality or 'bias=0', or whichever value produces the DirectX default mipmapping.
No graphics card tweak software may be used when running a default score. The system should be freshly booted, and as many background programs as possible should be switched off or disabled.
Forced AA or higher-quality texture filtering should be turned off, since it might produce a lower score than the system is capable of. All such settings should be set to 'application specific'.
The graphics driver may not display the content differently from how it was originally meant to be displayed/rendered. For example, textures that are not originally compressed may not be compressed by the driver.
Vertex shaders may not be forced to run on the CPU in a default run. There is an option in 3DMark03 to run all vertex shaders on the CPU, but the graphics driver may not alter this setting in any way.
In general, the graphics driver may not have any 3DMark-specific settings; it should run in a mode as default as possible. 3DMark is meant to measure general 3D gaming speed, and this is not gained in 3DMark-specific driver modes.

I would like a clarification from W1zzard on this point specifically:

No graphics card tweak software may be used when running a default score.

Does this include RivaTuner, NVIDIA Inspector, et al.?

Also, does this:

In general, the graphics driver may not have any 3DMark specific settings, it should run in a mode as default as possible.

mean using nVidia Control Panel to adjust AA, AF, etc. is not allowed?

Things you can't do (not a complete list, just examples):
modify 3DMark's executable to give a different score
modify the saved result with a hex editor or similar before uploading
change 3DMark's graphics settings so they are no longer at default