I'm a firm believer that the problem here isn't people switching up what they want from a video game, but rather an oversaturation of the market with bland-looking games. This ranges from your typical AAA game to even your most indie of indie titles. At the time of Quake, brown was used to give it a grittier, more realistic appearance, which worked very well for it. I think palette choices in games now still lean towards that trend of making things gritty and realistic, without considering the sheer number of games doing the exact same thing.

I don't think all video games need to be colorful. In fact, for some games you can say a lot with a simple palette compared to a colorful one. It usually just depends on the game. However, using brown as a cop-out for realism at this point is going to get you some bad press. Granted, I also feel like many AAA devs are starting to branch out from brown into more colorful games. Slowly, but surely.

I lost all faith in dark and gritty "realistic" graphics as a concept when I played the Torvus Bog in Metroid Prime 2.

The rain and fog in the light world, coupled with the game's innate lighting, made distinguishing between areas much harder than it needed to be.

The underwater sections had this horrible draw distance to emulate a murky water effect, and the upgrade that cleared it didn't help much.

The dark world was this monochrome purple wasteland that made you want to leave immediately, and on top of that you really couldn't see where you were going at any time in that place, due to the lack of light to begin with; it wasn't any better when you had to find the temple keys.

But hey, the game never lost its basic lighting principles, and the cutscenes' lighting never looked like it varied wildly, right?!

The Torvus Bog really is the worst thing I've ever seen visually, and not once have I played or seen a "mud colored" game whose color scheme didn't make everything worse to a similar degree.

Amusingly enough, this example is from Nintendo, whose games I mostly play precisely for the consistently bright colors.

The same people who praised brown praised it for the wrong reasons, just as people who praise graphics now don't realize that gaming isn't about graphics.

Quake did its brown tones right because brown wasn't a cheap tool there; it had a style. Not all styles need to be colorful, but having a signature in your art style is what's important. You can have the next Crysis, but it won't mean anything if it's just another indie early-access zombie crafting survival game on Steam. AAA is crap, but that argument doesn't need to be made; the problem is the people who help create them.

What we see as "realism" in games has always been the industry's popular, genre-abiding, grounded yet mildly stylized picture of the game's world. Over time those goalposts have shifted, and more polygons, pixels, shaders, and dynamic effects have been computed, but the trend has remained the same.

It's familiar and rarely off-putting to general audiences, gives the sensationalist press something to crow about (look at how far we've come!), and is very easy to work with. Because the style behind the familiarity and fidelity is harder to notice, people usually write off such games' styles as "realistic".

Just as invariable as this trend is the backlash against it, which has been going on for over 20 years. So if you ever find yourself getting sick of "realistic" graphics, you might be burning out on the current trend, or you may have played a game which lacked any distinct visual style because the developers blindly pursued the same fallacy.

Despite the advancement of technology, we've used it as a crutch and rejected innovation and depth.

I don't think I'm burnt out on any trend either; I think that from an evolutionary standpoint, gaming is going to be stagnant for a long while. I weighed every decision I could make as a consumer and decided I don't need a shiny game; I want a stable one that won't crash, with an artistic signature at the very least.

Aside from that, I'm poor. My laptop can't handle games that require 8GB of RAM (I see you, 16GB), and after what I've heard, I really, really do not want to need 100GB of space for a game.

If people aren't told to keep their games from bloating, or to make them efficient, we're going to reach a crash driven by people who just can't keep up, because companies refuse to slim these games down.

Both muted colors and saturated colors have their place. A gritty game about human war probably shouldn't be bright and stylized, and should have more realistic looking textures. On the other hand, a relatively lighthearted fantasy game probably should lean towards a brighter palette and less detail. It's a matter of mood. It's also a matter of personal preference as to which you think is better.

As for me, I still think Spiral Knights has a great style, it's one of my favorites.

Triert wrote:Despite the advancement of technology, we've used it as a crutch and rejected innovation and depth.

I don't think I'm burnt out on any trend either; I think that from an evolutionary standpoint, gaming is going to be stagnant for a long while. I weighed every decision I could make as a consumer and decided I don't need a shiny game; I want a stable one that won't crash, with an artistic signature at the very least.

Aside from that, I'm poor. My laptop can't handle games that require 8GB of RAM (I see you, 16GB), and after what I've heard, I really, really do not want to need 100GB of space for a game.

If people aren't told to keep their games from bloating, or to make them efficient, we're going to reach a crash driven by people who just can't keep up, because companies refuse to slim these games down.

I think you're generalizing your viewpoint a bit too much. In terms of visual style, about the same proportion of publisher-backed titles are using the current fidelity baseline to build their own visual style on as in years past. It can be argued that the massive amounts of trite Steam garbage and asset flips fall into what you described, but they do so because they're asset flips, and they don't reflect the state of the industry as a whole.

As for the pursuit of graphical fidelity, games that gimp themselves to be visually impressive are long in the past, and the slowing advance of graphics hardware and the recent introduction of a new console generation have led to increased optimization. For every title like Arkham Origins and Mankind Divided there are another 50 that scale properly and look excellent too. In the case of Arkham Origins, the lack of performance was due to subcontracted developers, and it never should have shipped, seeing as the console versions worked fine. And the games which are absolute buggy messes, such as the most recent Homefront, are messes because of incompetence, with no relation to their drab, styleless aping of "realistic" graphics. Same with the recent Call of Duty titles. As graphical improvement has slowed, audiences are once again looking for good games underneath the spectacle, because EVERY game has spectacle now. This in turn has led to innovations across the industry's corners, and lagging sales for games that refuse to adapt.

Thanks to a now-established professional indie scene in the West, there are more games than ever that satisfy your requirements without straining. But on the whole, in a world where 1TB is the baseline consumer digital storage and storage costs are getting ever cheaper, there isn't ANY incentive for games to optimize to the extent you want them to. Even I only have 500GB, and I no longer run into such issues unless I want to play a notoriously space-heavy title like Star Citizen.

There are some horrible and pervasive issues with the games industry, I tell you, but this isn't one of them anymore.

Who said anything about Star Citizen? Is that your only reference point for games with the need for large storage?

I was referring to Final Fantasy 15 and Forza 7. I'm not even sure we're talking on the same wavelength, with you bringing up consoles here. Like them or not, I can avoid this issue with a console.

If I didn't make it clear before (which I feel like I did), I'll repeat it: growing hardware requirements are making it hard for a poor person like me to catch up. With games that need 8GB, and soon 16GB, of RAM, I feel like this is only going to get worse before it gets better. I don't think I'm living in the past in that regard.

If pointing that out gets me that kind of response, then I really feel like you're being misleading in favor of the industry, if not actively defending its practices outright, in favor of ideas people held in high regard before the consumer base got swamped with complete idiots.

Sorry, I misunderstood you. Because you said 8-16gb, I thought you were talking about installation size.
NO game currently effectively uses more than 6GB of system RAM. Some games at very high texture resolutions and concurrent polycounts can potentially exceed this, but it's exceedingly rare because it runs into other limits related to VRAM. What you are probably thinking of is VRAM, which lives exclusively on the graphics card. In the case of VRAM, at most I see 6-8GB pushed on the highest settings for memory-heavy titles such as Shadow of Mordor. The advance of graphics card technology has also brought 4GB and 6GB of VRAM to the entry level, so for even ~400 USD a full system can be built that can play most games without being VRAM-limited. And consoles are pretty much never VRAM-limited now, unless it's a horribly optimized title like The Order: 1886, a horribly playing game.

What I'm trying to say is that hardware needs are actually decreasing every generation, and while game development has been getting steadily less efficient, technology prices have been falling much faster.

If you think a response like mine is misleading in favour of the industry, I'm not sure what to tell you. I absolutely DESPISE the current games industry; don't get me started. But what you're trying to criticize is, I feel, an unfair over-extrapolation of your own situation and experience, which runs contrary to all evidence, my own experience in the industry, and the fundamental logic of the situation.

The short of it is: compared to any time in the past, playing modern games at decent fidelity and framerate is cheaper than ever before, with greater accessibility than ever before due to unilateral pushes (cough, Sony) for cross-platform play and the decline of concrete exclusivity. As the push for narrowing fab processes decelerates, the industry has also been hitting the wall of graphical diminishing returns, and as a result has focused on optimizing existing tech for the sake of consoles while adding more and more post-processing effects to games, which, while massive performance hogs (e.g. god rays), can easily be turned off without affecting graphical fidelity. This is because graphical targets, which were previously set at max settings, have steadily moved down the spectrum to medium or low targets thanks to the aforementioned optimization.

If you're still stuck on a 2010-era low-end laptop, then yes, NONE of this plays well, if at all, on your machine. I can sympathize, because I was in a similar situation for 7 years. If anything, though, this is at least good news: when you're eventually able to buy into a proper system, you'll be doing so at the most cost-effective time (with the widest variety of games) the industry has seen to date.

P.S. There is a slight wrinkle in what I just said: the recent rise of Ethereum mining and faults in the DDR4 supply have increased the prices of new hardware, but used parts are still very cheap, and once these two market factors cease, new hardware prices will fall back into line, or even lower than before the disturbance.

I think it's worth noting that PC games don't get to monopolize system resources or target a single hardware configuration the way console games do. The former point means that, for most PC gamers, a game has to compete not only with itself but with the bloat of Windows and whatever user programs are running in the background, and freeing up as much CPU, GPU, and RAM as possible isn't as simple as closing everything on the taskbar (and may not always be desirable anyway). The latter means that even well-coded games have to make trade-offs between optimization and compatibility, and from a business standpoint, running decently on a wide range of hardware is more important than running perfectly on only specific hardware. A game that itself only needs 6GB of RAM could easily push the total RAM requirement for a system to 8GB or even 16GB, depending on what the player is likely to have running at the same time.

Though, really, hasn't the cost of a rig that can handle current year games at top settings always been one of the downsides to PC gaming? To the point some deliberately stay a few years behind what's current so they can make do with buying a new machine once every few years and stick to parts that aren't prohibitively expensive?

As for graphics, being blind, I couldn't care less what a game looks like, I just wish there were more games I could play in the first place regardless of platform.

Just so you know, I am blind.

Those who approach life like a child playing a game, moving and pushing pieces, possess the power of kings.

Triert wrote:Wait, isn't Ethereum mining the thing that's putting Apple and Nintendo at war with each other over iPhones and the Switch?

Ethereum is a cryptocurrency meant to be mined efficiently on GPUs and compute cards, rather than on ASICs like Bitcoin. Its launch earlier this year drove up the price of several graphics cards, but given Ethereum's unsteady market performance, hardware prices may subside later next year.

Jeffery Mewtamer wrote:I think it's worth noting that PC games don't get to monopolize system resources or target a single hardware configuration the way console games do. The former point means that, for most PC gamers, a game has to compete not only with itself but with the bloat of Windows and whatever user programs are running in the background, and freeing up as much CPU, GPU, and RAM as possible isn't as simple as closing everything on the taskbar (and may not always be desirable anyway). The latter means that even well-coded games have to make trade-offs between optimization and compatibility, and from a business standpoint, running decently on a wide range of hardware is more important than running perfectly on only specific hardware. A game that itself only needs 6GB of RAM could easily push the total RAM requirement for a system to 8GB or even 16GB, depending on what the player is likely to have running at the same time.

Though, really, hasn't the cost of a rig that can handle current year games at top settings always been one of the downsides to PC gaming? To the point some deliberately stay a few years behind what's current so they can make do with buying a new machine once every few years and stick to parts that aren't prohibitively expensive?

As for graphics, being blind, I couldn't care less what a game looks like, I just wish there were more games I could play in the first place regardless of platform.

When I made those claims about system targets and RAM consumption, it was under the assumption that, under worst-case loads and the overhead of both the base system and the game, the player will need 6GB of RAM. Dividing it up: roughly 1.5GB is used by the operating system, ~0.35GB by other handlers, and the next ~4GB by the game itself. This also matches the methodology used to develop the published system requirements for a game.
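To make that budget concrete, here's a minimal back-of-the-envelope sketch using the post's own rough figures (the numbers are the poster's estimates, not measured values):

```python
# Rough worst-case RAM budget, using the approximate figures from the post.
os_gb = 1.5        # base operating system overhead
handlers_gb = 0.35 # background services and handlers
game_gb = 4.0      # the game itself under worst-case load

total_gb = os_gb + handlers_gb + game_gb
print(f"Estimated worst-case usage: {total_gb:.2f} GB")  # 5.85 GB

# Headroom against the claimed 6 GB ceiling:
print(f"Headroom vs 6 GB: {6.0 - total_gb:.2f} GB")      # 0.15 GB
```

The point of the division is that the published "6GB minimum" figure already bakes in OS and background overhead, not just the game's own footprint.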

Recently, publishers have started to artificially inflate system requirements, but actual testing reveals the same divisions, and it's quite difficult to exceed them. Notoriously inefficient games like Call of Duty: Ghosts can't even break 6GB of combined system and game RAM usage. This is, once again, because most of the system stress caused by increasing graphical fidelity is felt in GPU IPC count, VRAM, and CPU-to-GPU throughput.

While I don't have the time to make a graph for it now, I'd suggest looking at AnandTech's MSRP listings for each flagship GPU and the contemporaneous Intel i5 K-series chip. Even if you look at just the past few years, you'll see a marked drop in the price of systems built with these components. And for what it's worth, they are the highest-end new gaming components of their time. This drop in price is considerably more pronounced when your performance target is lower, and dives even more sharply when looking at used components. Why? Because GPUs like the

GTX 970
GTX 980
GTX 1060
R9 290X
R9 390
R9 Fury
RX 480
RX 570
RX 580

all reach ~60 FPS on any title at a 1080p target, because of resolution saturation. (Resolution saturation is a long story, but in short it sets a limit beyond which AA, tessellation steps, and polycount physically can't be seen, because there aren't enough pixels to represent them.)
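A quick illustration of the idea: a 1080p frame has a fixed pixel count, so once a scene pushes more on-screen primitives than there are pixels, the extra detail can't be individually resolved. (The 10-million-triangle figure below is made up purely for the arithmetic.)

```python
# Back-of-the-envelope sketch of "resolution saturation" at 1080p.
width, height = 1920, 1080
pixels = width * height
print(f"1080p frame: {pixels:,} pixels")  # 2,073,600

# With a hypothetical 10 million visible triangles (made-up figure),
# the average triangle covers well under one pixel on screen:
triangles = 10_000_000
print(f"Average pixels per triangle: {pixels / triangles:.2f}")  # 0.21
```

Below roughly one pixel per primitive, extra tessellation steps or polycount stop producing any visible difference, which is why those cards can all hold a 1080p target.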

That's a lot of choices for a graphics card. There are even more choices for a chipset platform, 6-8GB of RAM, and a CPU that won't bottleneck those GPUs. Before the Ethereum mining and DDR4 shortages, I could say with full confidence that such a system would cost just 700 USD new, and considerably less with well-selected used parts. I think that's the best value proposition I've seen in 15 years for near-universal gaming, especially with the rise of 7th- and 8th-generation console emulators and the refinement of those for previous generations.

The cost of a gaming PC is still higher than that of a console, but the added fidelity, flexibility, and game distribution channels, plus the lack of proper exclusives on anything but the Nintendo Switch, make the extra cost well worth it. Plus, the games are often half the price.

I think your mindset comes from the days of the 8800 GT and the GTX 580, years when resolution saturation wasn't even close to being reached and process nodes were three times the size of today's, leading to a mess in both PC game development and hardware that logically produced that kind of mindset. Thankfully, times have changed in that regard.

Triert wrote:
Did you get your second paragraph off an Xbox magazine from before you went blind? It ignores so many factors that I can't expect it to come from anything else.