MojoKid writes with Hot Hardware's summary of what it takes to run the newest Crysis: "We've been tracking Crysis 3 for a while, from the trailer a few months ago to the recent alpha multiplayer preview. The game is available for preorder and it will launch in February. Crytek has now listed the minimum system requirements for Crysis 3 and they're as follows: Windows Vista, Windows 7 or Windows 8, DirectX 11 graphics card with 1GB Video RAM, Dual core CPU, 2GB Memory (3GB on Vista). Those aren't particularly stringent parameters by any means, but as we all know, 'minimum requirements' rarely are. Crytek suggests upgrading to a quad-core CPU, 4GB of RAM, with examples of CPU/GPU combinations that include Intel Core i5-750/NVIDIA GTX 560 and AMD Phenom II X4 805/AMD Radeon HD5870."

IF I had pirated it and played about half the campaign already (which I haven't; I'm too moral!), I would say it runs perfectly on my system: quad-core i5 2500K and a GeForce 670. That's fairly high end, though; no idea how it would run on a lower one. Or mine... since I haven't played it.

I never heard of a Crysis 3 leak; I think you're confusing it with Crysis 2. There was an almost-open multiplayer alpha (Nvidia would give keys away to pretty much everyone) at the beginning of November, but the performance was quite awful on my system (3930K @ 4.4GHz and a GTX 580). I think it was around 18fps with everything maxed out, but I guess it was probably a debug build, so it's hard to say how it will run when it gets released.

There was NOT a Crysis 3 leak.

There was a multiplayer alpha, which I was part of, and the game ran at 60fps on highest settings on my i7-920 and an Nvidia GTX 460.

These really aren't much in the way of system requirements, which just shows how this extended console generation has had an effect on PC graphics development. Not that I'm complaining: it saves me money in the long run, and it forces programmers to learn how to do more with less hardware, which isn't a bad thing for the most part.

This says it only has 512MB:
http://www.anandtech.com/show/2939 [anandtech.com]
"Coupled with the GT218 GPU on the G210 is 512MB of DDR2 RAM, using the customary 64bit memory bus. Interestingly, unlike most other entry-level products, the G210 only comes in 1 memory configuration: 512MB."

You're looking at an MSI 512MB model; he linked an eVGA 1GB model. Also, the GT210 was low end when it came out - not something you should expect a good gaming experience from. Either way, it's an extremely old card. I just bought an eVGA 620 a few weeks ago to add a few more monitors to my PC for under $50, and it had 1GB. Current high-end cards have 2-4GB onboard. Even my (also outdated) GTX 480 from a few years ago had 1.5GB.

I bought a low-end card as a temporary replacement when my main one went in for warranty work, and that low-end card had 2 gigs of video RAM. In fact, I have seen lots of low-end cards over the years with over 1GB of video RAM, but it's not as fast as the RAM on the high-end cards.

LOL wut? Informative? You can get a 5650 with 1GB of RAM for less than $40; keep an eye on the sales and it can be had for less than $20. Hell, if you don't mind refurbs (and I don't, never had a problem with 'em) you can get a 6850 for $100 and a 6870 for $120, and that's for a 256-bit card with GDDR5. Gaming, even with all the purty, has never been cheaper, friend.

Back in the day you were lucky if a card that cost less than $200 would last you a year. I'm just now getting ready to swap out my 4850 for a

The bad part is the "recommended" graphics card is now the upper level of the mid-range, the Nvidia 560 or 660, and the ATI 5870.

This is becoming a real big issue for Graphics cards, far more than video RAM or any other part of the system.

The problem is that the upper-mid-range cards now require *very* significant power. The 560/660 and the 5870 above really require TWO 6-pin supplemental power connectors, since they're now pulling 200W under load. That means a 500W+ power supply, and ONLY high-end workstations or custom gaming rigs have those, so you're inherently cutting out the section of the population which games, has a pretty beefy rig, but got a pre-made system from HP/Dell/whomever, none of which have more than a 400W (and usually a 300W) power supply.

I'm an excellent example: I happen to have an HP Z210 workstation - that's a Xeon E3-1200-class CPU (which kicks the crap out of everything consumer-class, including the i7 series), 16GB of RAM, and an SSD. Yet it was only designed with a 400W power supply, as it was targeted at mid-level pro graphics. I've been looking, and the absolute fastest GPU I can use is the Nvidia 650 Ti; everything else draws too much power. Consumer PCs are in an even worse situation: they might have a high-end i5 Ivy Bridge CPU, but they've only got 350W power supplies, which probably can't even drive my 650 Ti, let alone a 660. So you're looking at having to buy a $1500 system (sans graphics card) rather than a $500 one to play these games.

Realistically, game makers need to target the lower-mid-range cards - at least, they have to be able to play very well at around 1680x1050 or 1440x900 on one of those lower-power-draw cards (e.g. Nvidia 650 or AMD 7850).

Frankly, I think this is going to be a *big* drag on the PC gaming industry: unless they can convince Nvidia/AMD to cut down on the power draw, or somehow get PC makers to beef up their power supplies, new games won't be able to run reasonably on ANYTHING that isn't a custom gaming rig. And that's a *tiny* portion of the market.
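The power math above works as a quick back-of-the-envelope check. Everything here is an assumption for illustration (the 95W CPU, 75W for the rest of the system, and derating the PSU to 80% of its label), not a measured figure:

```python
# Rough PSU budget: label wattage derated to 80%, minus estimated load.
# All figures are ballpark assumptions, not measurements.
def psu_headroom(psu_watts, gpu_watts, cpu_watts=95, rest_watts=75, margin=0.8):
    """Spare capacity (W) after derating the PSU to `margin` of its label."""
    usable = psu_watts * margin
    load = gpu_watts + cpu_watts + rest_watts
    return usable - load

# A 200W GPU in a typical 400W OEM box vs. a 500W gaming supply:
print(psu_headroom(400, 200))  # -50.0 -> over budget
print(psu_headroom(500, 200))  # 30.0  -> barely fits
```

On these numbers a 400W OEM supply comes out about 50W short of a comfortable budget for a 200W card, which is the gist of the complaint.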

-Erik

Seriously, pre-made systems from HP/Dell/whoever have not been gaming systems EVER. 500W has been a bare minimum for any gaming system for several years now. It's also worth noting that 500W power supplies sell in the $30 range. Other than the fact that it'll cost you a lot for electricity and contribute to pollution on some level, the power requirements for current-gen cards are not a big deal. High-end cards these days only draw 195 watts (source: http://www.evga.com/Products/Product.aspx?pn=04G [evga.com]).

at least, they have to be able to play very well at around 1680x1050 or 1440x900 on one of those lower-power-draw cards (e.g. Nvidia 650 or AMD 7850).

I'm not sure what your desktop resolution is (I'm guessing it's around there). I feel like it's a bit much to expect a computer spec'd to run a desktop operating system (where the 3D portion only does basic texturing/compositing) to run a modern 3D game at full resolution. Commodity desktop computers have always lagged behind even mainstream modern games. Quake 1 required a floating-point math co-processor I didn't have; then games required 3D cards. There was usually a transition

Honestly, I'm disappointed. GPU advances seem to have been driven at least in part by game development. With new big name titles like this coming out with such low end requirements the game certainly won't be driving too many upgrades. This means the only reason AMD or nVidia have to innovate is simply to stay a little ahead of each other.

It has only hurt, if that's what you mean, but PC graphics and abilities are still miles beyond console.
All the current video game generation has done is stunt and screw over the growth of PC hardware, and it has practically destroyed the entire PC gaming market by only offering terrible, terrible second-hand ports of console games to PC, instead of the games being developed separately like used to be the case. Because of it you get utter crap in the way of customization in most PC ports - you get to "choose" high.

It seems the game consists of walking/running around with only part of your weapon visible on the screen and shooting stuff, with the object being to save the planet or the galaxy or something else. Anything different than all the FPS games over the last 20-some years?

or are people going to spend close to $1000 upgrading their computers just to be wowed by some extra graphical detail?

The Mona Lisa is not highly regarded because it is detailed. There are many similarly detailed paintings, and many far more detailed paintings. A high-resolution photograph of a sitting woman would be far far more detailed than any of those paintings. That's not what adds value.

There comes a point of diminishing returns where increasing levels of realism add less to the experience. Artistic touches go a long way in defining a distinctive and memorable look for a game. Battlefield 3, Call of Duty: Modern Warfare ___, Medal of Honor - they are all working off the same modern-day source material and have only minor visual details to distinguish one from another. Kane & Lynch 2: Dog Days, which had terrible reviews (deservedly so), and Splinter Cell: Conviction are two other games also set in the modern day that have taken the effort to add stylistic touches. KL2:DD, for all of its flaws, implemented a distinctive "caught-on-camera" perspective throughout the game, as though the viewer were watching the protagonists by chasing them with a camcorder: shaking as they run, static distortion in the camera when explosions go off, and film-bleeding effects for emphasis on the sleazy scraped-from-the-gutter atmosphere they sought to achieve. They put thought into the game's visuals, not just time.

Splinter Cell: Conviction projects objectives, text, and video of events happening elsewhere onto surfaces in the environment the protagonist moves through, and maps the timing and positioning of each of these to coincide with the player's likely orientation and pacing through that environment. Both games identified a theme to differentiate themselves, even if they only wanted a subtle touch, and made efforts to maintain thematic consistency throughout the game. This is very different from a simplistic, dogged adherence to replicating what already exists in the real world.

Stepping outside the realm of modern-day game settings: Katamari Damacy and Okami had a tiny fraction of the graphics budget these other games do, but both have a far more memorable visual experience. One glance at a screenshot of either and there's no mistaking what you're looking at. I'd rate the visuals of these two games above all others mentioned here, despite their being less technically complex.

are people going to spend close to $1000 upgrading their computers just to be wowed by some extra graphical detail?
My two-year-old machine is still better than the higher recommended specs. I just bought a $600 system for my kids that has better specs than the recommended ones. If I can get a whole system for $600, then it shouldn't cost that much.
Let's check Newegg:
Intel Core i5-750 - apparently there is no such thing, but the most expensive i5 is $250.
or
AMD Phenom II X4 805 - apparently there is no such thing, but the most expensive AMD Phenom II X4 is $85.
NVIDIA GTX 560 - The most expensive of these is about $250, but they can be had for less than $200.
AMD Radeon HD5870 - No longer available, but faster cards are available for less than $100.
4GB Memory? $50, assuming your computer doesn't already have that much RAM. It is not easy to find a computer these days with less than 4 GB.
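Totaling up the Newegg-style prices quoted above (these are the parent's figures, not current prices) shows how far under $1000 the cheaper route lands:

```python
# Tally of the quoted prices for an upgrade meeting the recommended specs.
# Prices are the parent poster's figures, taken as assumptions.
parts = {
    "AMD Phenom II X4 (top model)": 85,
    "NVIDIA GTX 560 (sale price)": 200,
    "4GB RAM": 50,
}
total = sum(parts.values())
print(total)  # 335
```

Roughly $335 all in, assuming your motherboard and PSU can take the parts; nowhere near the $1000 the grandparent worried about.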

About the time that the specs caught up with the hardware available. For Crysis 1, this was somewhere around 2-3 years AFTER its initial release. If only the Crysis devs had been smart enough NOT to release the debug build as the RTM... :)

Oh come on, you can dismissively summarize anything like that if you want to sound like some type of elitist (AKA: a douche).

Super Mario Brothers? I've heard that game consists of EXTREMELY poor graphics, jumping on stuff, and occasionally breaking bricks with your head, to save a princess or something. And it's only in 2D!

Minecraft... that's basically just legos, with exploding cacti. No thanks.

Well, I enjoyed Crysis 2. It was nothing too special, but I got it on a Steam sale for somewhere under $10, so it was worth it. It actually had a decent gameplay mechanism for allowing multiple different kinds of approaches to areas, from stealth to brute force, and the level design facilitated this aspect. I'll get Crysis 3...but not until it's on sale for $20 or less. Given how quickly PC games drop in price, I'm expecting that to be only a couple months after release (or sooner! Some games have steep discounts if you preorder them).

If you had said Unreal Tournament, you might have had a point, because UT was quite a bit ahead of Quake (new equipment like the translocator, new gameplay options like CTF, and a multitude of fan-made maps). Later versions added vehicles - always fun to get a little road rage out - and that's a gameplay advance. Oh, and there are awesome co-op games like L4D, quite a bit ahead of Quake. Come to think of it, you're full of ****. There have been several significant forward movements since Quake, yo

Those seem like pretty low recommendations to me, certainly relative to what was needed for the original Crysis compared to the hardware at the time. I haven't replaced my entire system in several years (I bumped my RAM from 4GB to 8GB two years ago) and haven't had any difficulty with games at all - not that I have time to play them often these days. I have a GTX 250 that I put in the system when I originally built it, and I still haven't had the time (or need, actually) to put in the GTX 465 that's been sitting on my desk for close to two years now.

My guess is that due to the need to run on laptops, most game manufacturers are not pushing the limits of bleeding edge hardware anymore. No one is going to replace their entire laptop every year just to play the latest and greatest game.

Are you sure it's laptops, or is it consoles? I imagine that companies with the resources to create assets detailed enough to tax an enthusiast PC also have the resources to qualify to be licensed developers on Xbox 360 and PlayStation 3.

I was just thinking to myself the other day that it used to be every year, if not certainly every other year, that I found myself dropping anywhere from half to a full grand on upgrades pretty consistently.
It's been at least 2 1/2 years now, and I'm fairly certain an i7 930, 12GB of triple-channel DDR3, 2x 6870s, and an SSD blow those recommended specs out of the water.
It feels silly posting those specs too, as if it's some kind of boast; I can only imagine anyone else who builds their own rigs is probably in the same boat. I thi

PC gaming should be using ray tracing by now; all these 1000-core GPUs and multi-card solutions should be able to process ray-tracing calculations, yet there are no ray-traced games out, showing that there has been little innovation in PC gaming for the last 10 years. Who cares if you can run a game at 300 fps on a 2560 x 1600 screen?

I would return to PC gaming in a heartbeat if they started using ray tracing in games and created some truly stunning and realistic graphics. You

Do you have any idea how much power ray-tracing costs? Obviously not, but the short answer is "a lot more than we have." You can't do it with anything available at the consumer level, not in real time with any decent level of rays or bounces.

On the other hand, some games really have pushed the PC gaming envelope (I'd say Planetside 2, for example). It's just getting pretty rare, since console level graphics look "good enough."
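To put rough numbers on "a lot more than we have": even modest quality settings at the 2560x1600 the grandparent mentions imply a ray budget in the billions per second. The sample and bounce counts below are assumptions for illustration:

```python
# Back-of-the-envelope ray budget for real-time ray tracing.
width, height, fps = 2560, 1600, 30   # resolution and target framerate
samples_per_pixel = 4                 # modest anti-aliasing / soft shadows
bounces = 2                           # each sample: primary ray + 2 bounces
rays_per_frame = width * height * samples_per_pixel * (1 + bounces)
total_rays = rays_per_frame * fps
print(f"{total_rays / 1e9:.1f} billion rays/sec")  # 1.5 billion rays/sec
```

And each of those rays still needs intersection tests against the scene and shading, so the gap to what consumer hardware of the time could actually trace is large. That's the short version of the "a lot more than we have" answer.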

PC gaming should be using ray-tracing by now, all these 1000 core GPU's and multi-card [solutions] should be able to process ray tracing calculations, yet there are no ray traced games out showing that there has been little innovation in PC gaming for the last 10 years.

No, wrong, Carmack has explained the issues involved with ray-tracing at least a dozen times. But clearly since you've worked out a better solution, maybe you should sell it and get rich?

I want to be excited about buying a $600 liquid cooled video card again. But when a $300 game console gives mostly the same graphics quality and performance as PC games, meh.

Yeah because gaming should be all about who has the most badass hardware and has overclocked their CPU/GPU for those extra FPS, not if the game is any good or whether you actually play it well. Yes new hardware isn't that exciting anymore when I already have a quad core with gigs of memory and solid graphics, but I don't miss those not-so-good old days.

Dude, it's the level; it's coded for shit. I have a hexacore with 8GB of RAM and an HD4850 OCed, and it'll still bog down when it gets to that ONE level, while everything else runs perfectly. Compare this to something like Just Cause 2, where I can set charges all over a compound and do my own "cool guys don't look at explosions" bit, with smokestacks falling and fuel tanks blowing and fireballs that block out the sky, and it still doesn't chug.

But go onto any forum and mention the carrier level and watch all the pissed o

For a long time I've had the impression that these developers only put marginal effort in optimizing code because the goal is to offer a game that's a resource hog. As long as the game is halfway decent you've given yourself months of free marketing. In an effort to stay relevant publications will immediately include these games in performance testing.

Some of us also like to play games using the latest hardware. There are still a lot of games using UT3 engine that will run just fine on older hardware. There is no need for the whole game industry to wait for every single Joe to buy new hardware.

Anyway, it's not like the game will rot or anything; it will still be there when you finally get a new machine, and then you can play it on your new PC at its best.

Then your customers shouldn't be wanting to play Crysis 3 now, should they? This is like bitching that your new econo car won't keep up with a 911 in the quarter mile. Well, no shit - you didn't pay 911 money either!

They can play tons of games on that rig, just not Crysis 3. If they want to play even more games they can add a cheap discrete card, but for Crysis 3 they should be looking at an HD5850 minimum, HD6850 better. BTW, why are you being such a cheap bastard when it comes to PSUs? The only things I put less than a

You do know that they basically took a Radeon graphics card and put the entire GPU into the CPU without changing much, right? So it has 384 cores, direct memory access, and just no GDDR5 (which kinda kills it based on my experience with a GT440 that had DDR3 onboard instead of GDDR5).

But still, they should have a quick and ugly mode so users can play it on BRAND NEW, gaming-oriented systems like this. They're definitely losing customers, seeing as how this can play basically any other game ever made on

You do know that graphics are usually very constrained by memory bandwidth, and that putting in much faster RAM will give you better results, but it still will not compare to an actual dedicated card, even if it has the same piece of silicon at its heart, because everything else is different.
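The bandwidth gap is easy to put rough numbers on: peak bandwidth is bus width (in bytes) times effective transfer rate. The bus widths and clocks below are typical-of-the-era assumptions, not specs for any particular part:

```python
# Peak memory bandwidth in GB/s = (bus width in bytes) * effective rate.
def bandwidth_gbs(bus_bits, effective_mts):
    """bus_bits: memory bus width in bits; effective_mts: effective MT/s."""
    return bus_bits / 8 * effective_mts * 1e6 / 1e9

igp_ddr3 = bandwidth_gbs(128, 1600)    # dual-channel DDR3-1600, shared w/ CPU
card_gddr5 = bandwidth_gbs(256, 4000)  # 256-bit GDDR5 at 4 GT/s effective
print(igp_ddr3, card_gddr5)  # 25.6 128.0
```

Call it a ~5x bandwidth gap, and the IGP also shares its 25.6 GB/s with the CPU, which is roughly why the same shader silicon performs so differently on-package.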

Starcraft II is hardly the most graphically-challenging game. Neither is R6: Vegas. And the WEI scores are essentially useless.

The A10 uses a Radeon 7660D which, in "real"-card terms, fits somewhere between a 7570 and a 7470. However, the integrated Radeons are known to be extremely memory-bandwidth-bound - enough that they're frequently used in RAM benchmarks. So in practice you're looking at a graphics processor that's already weak, and further crippling it by bottlenecking its

APUs are not really integrated graphics; they took a GPU core and put it inside the CPU. It's no different from the actual card itself, except for memory bandwidth and speeds. Those framerates are not acceptable to me, but I just had a customer who was running 20FPS in WoW on a laptop with an Intel GMA965 and a Core 2 and said it was "fine." To run an older Crysis at 48FPS, this thing definitely has some horsepower, so completely eliminating it from playability seems like a stupid decision to me on the game make

Are they integrated into the CPU or chipset? Then it's integrated graphics. And while they may be the best integrated graphics on the market, and can certainly handle light gaming and media playback, they are far from the level a discrete processor is at.

Yes, the cores on a Fusion processor are essentially identical to those in the discrete card. But they are clocked lower, are heavily memory-constrained, and to put it bluntly, there's just not enough of them. The 7660D in the top-end A10 has 384 shader cor

Kid, you are just showing your ignorance. ANYTHING can do 20fps in World of Warcraft; that is one of its strong selling points. If you don't know the difference between WoW and ANY of the Crysis games, then you just don't know what you are talking about. And as I said earlier, you are not in the market segment for this game. Go ask your daddy for a raise. Or get a real job, not one that pays in peanuts.

And I'll tell you just like I tell MY customers: "If you wanna game, then discrete all the way." Seriously, you can get an HD4850 for $35 or an HD6850 for $100, so there really isn't a point in trying to play games on an IGP.

No matter how you slice it, it's the shared memory that makes those worthless. You just can't pump enough through bog-standard DDR3 to make a killer gaming system with an IGP; it's just not gonna do it. Great for video, fine for casual, sucks balls for modern shooters and the lik

No, they don't want you as a customer, because if a normal PC is all you can afford, you can't afford PC games. It is actually a pretty common comment on piracy forums: "I downloaded this but it doesn't run." Turns out some kid is trying to run a bleeding-edge game on a decade-old Dell he got from his daddy.

That is what consoles are for. And then you get to pay the money you saved on your PC as the 10-20 dollar console premium per game instead.

Crysis does want customers, it just wants customers who actually got mone

They're saying the "bare minimum" is 1GB of video memory, so if it gets stuck in a dump-and-load loop because the total texture size on minimum is 650MB and I have 512MB, it will either not launch or fail to skin everything and crash. I think they're lying. I don't know if the APUs can grab more memory on the fly like Intel chips do, though. They do have an absolutely segregated 512MB that the system can't touch, instead of being 100% on-the-fly like Intel, so who knows.

The Xbox 360 is basically DX 9.1 with a bit of shader stuff from 10, so why the heck would the minimum requirement be DX11 on PC? I smell a rat here, as I doubt they will make a new DX11-only engine. I think they simply set it that way to avoid supporting XP or Vista.

I assume it requires hardware Tesselation support.

I just hope they do it better than Arkham City did... enabling hardware tessellation brought my framerate down to an inconsistent ~40fps on a Core i7 2600K with dual Nvidia GTX 570s running in SLI. I'm sure I could have lowered the graphics from the "Ultra" setting to make it work, but isn't that missing the entire point of doing things in hardware?

Uhh... that's kinda the POINT with Crytek, hence the meme: you buy the game now, play it on low, and then have your mind blown when the tech finally catches up and you can run it maxed out.

The nice thing about this is it makes their games some of the most awesome-looking for years and years. Fire up Far Cry 1 with everything on ultra and it looks so damned good you want to just sit there for an hour watching the flyby. I personally LOVE this about their games, as it gives me that sense of awe like I had the fi