I'm thinking the new tessellation that will be in the release will kill 5xxx series cards.

Wrong. Whether tessellation kills anything 5000-based will depend entirely on what AMD does with its driver-side "AMD Optimized" initiative, which we have yet to see the fruits of.

Other than Civ5, they never really had a game that needed this initiative to do anything, but considering the large marketing push nVid is doing for GTX 580s in Battlefield 3, this may be the game that makes that team start earning their paycheques.

I have been playing BF3 and it's super fun. I have been playing on Xbox 360 though; I can easily get kills on the console (22K/10D) playing as Engineer, but whenever I play these games on PC I get killed all the time.

Console play is much slower; I tried it out last night. I'm assuming you are just camping? It seems easy to hold a great K/D on consoles if you do that.

BF3 will have joinable and creatable squads soon! You have asked and DICE has listened. They are implementing a squad system similar to BC2's in the next day or so!

That's nice to hear, but I'm doubtful about that news. People just figured that since you can't manage squads in the beta, that was it: none in the full game either. Everyone thinks beta means demo now; just look at the last page in the thread.

Hmmm, I have an i7-920 OCed to 4GHz and a reference 5850, and I dip low in FPS on Ultra? Are you guys using a tweaked driver?

I've played it both with CrossFire on and off, and I believe I'm using the auto graphics selection, which put everything on High except for Texture Quality, which went to Ultra. I'm only using the 11.10 preview driver they released for BF3 and no additional CAPs, as was recommended. And my 920 is only at 3.33GHz.

Not sure what to tell you. I'll edit this post once I've confirmed exactly what my specs are.

A lot of reviewers who tested the game on mid/high-end setups said at the end of their reviews that it was pointless to use the beta as a benchmark, as the game's performance in general is so inconsistent.

Finally, for the record, all of these performance reviews, both linked and elsewhere, are interesting for the curious but rubbish as far as the objective goes. The beta consists of one map (I would've loved to see data on Caspian Border, though I can imagine the difficulty a tester would've had pulling that off), an incomplete Ultra implementation, rushed beta drivers from both nVid and AMD, and a bug-filled play environment with map clipping, unoptimized foliage and animation glitches abound. None of this means anything until it's November, nVid and AMD put out WHQL drivers (or at least further-optimized preview ones), and a comparison is done between the FPS results of a tight Squad Deathmatch and a 64-player Conquest map set in the desert or urban environments EA was so happy to market in Fault Line and at E3.

Just wanted to quickly point out that TechSpot's review states, or at least implies via its wording, that they were using the Ultra and High presets. As they are presets, the game applies them as a package... and I quote from the testing methodology page:

"When set to Ultra every setting is maxed out with the exception of anti-aliasing post which is set to medium. The anti-aliasing deferred settings is set to 4xMSAA while anisotropic filtering is set to 16x. Other quality settings such as texture, shadow, effects, mesh, terrain and terrain decoration are all set to Ultra.

The Ultra preset was extremely demanding, so we also tested using the High quality preset. This turned anti-aliasing deferred off and left the anti-aliasing post settings on medium. All other visual quality settings as detailed above are turned to high. We'll be looking for an average of 60fps for stutter-free gameplay."

In other words, the High preset is what is turning off MSAA, not the tester. Furthermore, anything the tester did change from the preset would switch the "Graphics Quality" setting in the screenshots from 'Ultra' or 'High' to 'Custom', indicating tinkering.

I'm not saying the TechSpot review is perfect either, but it certainly does seem to be the data with the most integrity in that group of links. Even more so because it compares CPU performance across different chips really well (a lot of people, including myself, still have those legendary i7-920s and didn't have a reason to upgrade to every tester's poster child, Sandy Bridge), which is something not even my preferred GPU testers (Anandtech/Tom's) do often enough.

Sticking strictly to presets is something I'd expect more with video card drivers than with games, I would think. The game should have been tested with MSAA, as illustrated here. How else would any reader define "demanding" for that game if there is no numerical data to show them? Furthermore, I could understand MSAA not being used at 2560, but at 1920 and 1680 some of the video cards should have been capable of offering playable results. This is why I pointed out that no AA was used.
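
To make the "numerical data" point concrete: anyone with frametime captures can work these numbers out themselves. Below is a rough sketch, not anything from the review; the filenames are hypothetical and it assumes a simple log format of one cumulative millisecond timestamp per line (adjust the parsing to whatever your capture tool actually writes).

```cpp
// Rough sketch: turn two frametime logs (same run, with and without 4xMSAA)
// into average fps, worst single frame, and the percentage cost of MSAA.
// Assumed log format: one cumulative timestamp in milliseconds per line.
#include <algorithm>
#include <cstdio>
#include <fstream>
#include <string>
#include <vector>

struct FpsStats {
    double avg_fps;          // frames rendered divided by elapsed time
    double worst_frame_fps;  // slowest single frame, expressed as fps
};

static FpsStats StatsFromLog(const std::string& path) {
    std::ifstream in(path);
    std::vector<double> t;
    for (double ms; in >> ms;) t.push_back(ms);
    if (t.size() < 2) return {0.0, 0.0};  // empty or unreadable capture

    double worst_delta = 0.0;
    for (size_t i = 1; i < t.size(); ++i)
        worst_delta = std::max(worst_delta, t[i] - t[i - 1]);

    const double elapsed_ms = t.back() - t.front();
    return {1000.0 * (t.size() - 1) / elapsed_ms, 1000.0 / worst_delta};
}

int main() {
    // Hypothetical captures of the same Metro run at the High preset.
    const FpsStats no_aa = StatsFromLog("bf3_high_noAA.txt");
    const FpsStats msaa  = StatsFromLog("bf3_high_4xMSAA.txt");

    std::printf("No AA : avg %.1f fps, worst frame %.1f fps\n", no_aa.avg_fps, no_aa.worst_frame_fps);
    std::printf("4xMSAA: avg %.1f fps, worst frame %.1f fps\n", msaa.avg_fps, msaa.worst_frame_fps);
    std::printf("MSAA cost: %.1f%% of average fps\n", 100.0 * (1.0 - msaa.avg_fps / no_aa.avg_fps));
    return 0;
}
```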

But like I said before, let's wait until the final release to see what happens. There are a few things not mentioned yet that I would like to know. For example:
-Results using MLAA/SRAA/FXAA/MSAA: IQ vs. performance, and which (if any) will be better.
-Will any of the drivers from AMD or Nvidia use multithreaded rendering? If so, will we see any performance improvements? (There's a quick way to check what your driver reports; see the sketch after this list.)
-Does having VRAM over 1GB make a difference in performance?
-etc.
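
On the multithreaded rendering question, nobody outside DICE/AMD/Nvidia can say whether it will actually be used, but you can at least see what your own driver claims to support. A rough sketch (Windows/D3D11 only); note that a "no" for command lists just means the runtime would emulate them, not that multithreaded submission is impossible.

```cpp
// Rough sketch: ask the installed driver what D3D11 threading features it
// reports. Concurrent creates and driver command lists are the capabilities a
// deferred-context (multithreaded) renderer would lean on.
#include <d3d11.h>
#include <cstdio>

#pragma comment(lib, "d3d11.lib")

int main() {
    ID3D11Device* device = nullptr;
    D3D_FEATURE_LEVEL level = D3D_FEATURE_LEVEL_9_1;

    // Create a plain hardware device; no swap chain is needed for a capability query.
    if (FAILED(D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                                 nullptr, 0, D3D11_SDK_VERSION,
                                 &device, &level, nullptr))) {
        std::printf("Could not create a D3D11 device.\n");
        return 1;
    }

    D3D11_FEATURE_DATA_THREADING caps = {};
    device->CheckFeatureSupport(D3D11_FEATURE_THREADING, &caps, sizeof(caps));

    std::printf("Concurrent resource creation: %s\n", caps.DriverConcurrentCreates ? "yes" : "no");
    std::printf("Driver command lists:         %s\n", caps.DriverCommandLists ? "yes" : "no");

    device->Release();
    return 0;
}
```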