Quote: Hilmar says: Having the perspective of having done this for a decade, I can tell you that this is one of the moments where we look at what our players do and less of what they say.

There you go. Though it's a shame all those people with fried GPUs can't seek recompense or something, because CCP KNOWS by now it's their CODE, not our computers.
--------
CCP knows better than the players what's good for their game. SOE knew what was best for SWG too. Better than all those players that left too. This is your NGE/CU moment, CCP.

However, could you PLEASE get it through your thick skulls that it is IMPOSSIBLE for CCP's code to have fried any GPU that was properly designed, manufactured, installed and maintained. If your GPU broke, it was the fault of the manufacturer for designing/building it badly... or it's YOUR fault for not ensuring it was correctly maintained.
----
I like the base.

Posted - 2011.08.30 15:13:00 -
[5] Edited by: Alice Katsuko on 30/08/2011 15:17:02
Depending on the computer, it's perfectly possible to stress a CPU or a GPU to the point of failure by running hardware-intensive software on it. This is because as processor utilization increases, the processor throws off more heat (common sense and basic physics). In theory, a computer's cooling system should be capable of cooling the GPU(s) and CPU(s) to acceptable temperatures even when they are under maximum load. However, most pre-built desktops and virtually all laptops do not have such a cooling system. In fact, laptops are notorious for running hotter under even moderate load compared with their desktop counterparts.

Where temperature exceeds safe levels, the GPU/CPU should automatically reduce voltage and clock speed, or shut down entirely. However, not all processors have this ability (see 1:10 in the linked video), and not all such systems can react in time. Furthermore, even if a GPU/CPU can continue to operate safely at a high temperature, continuous operation at that temperature will most likely cause it to degrade eventually, at the very least reducing its lifespan, and may damage other computer components.

In practice, this means that end-users do have to pay attention to how hardware-intensive a particular game or program is. So while it's theoretically "impossible" for software to melt a computer, in practice it's very much possible when the user runs software that he does not expect will push his computer to its operating limits.
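As a rough illustration of what "paying attention" can look like, here is a minimal sketch that polls the GPU core temperature while a game runs. It assumes an NVIDIA card with the nvidia-smi tool on the PATH; the gpu_temp_celsius helper and the 85C warning threshold are made up for illustration, not vendor specs:

    import subprocess
    import time

    def gpu_temp_celsius() -> int:
        """Query the current GPU core temperature via nvidia-smi."""
        out = subprocess.run(
            ["nvidia-smi", "--query-gpu=temperature.gpu",
             "--format=csv,noheader,nounits"],
            capture_output=True, text=True, check=True,
        )
        return int(out.stdout.strip())

    # Poll every few seconds and warn when the card runs hot.
    WARN_AT = 85  # illustrative threshold, not a vendor spec

    while True:
        temp = gpu_temp_celsius()
        print(f"GPU core: {temp}C")
        if temp >= WARN_AT:
            print("Warning: GPU is running hot; consider lowering settings.")
        time.sleep(5)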

Edit:

Captain's Quarters is a perfect example of unexpectedly hardware-intensive software. CQ uses significantly more hardware resources than the old hangar view, and more resources than a typical space scene, given the same general graphics settings. This means that running CQ on relatively high settings could unexpectedly 'melt' a computer that was already running near its thermal limit when rendering the in-space portion of the game.

Posted - 2011.08.30 15:25:00 -
[6]
Funny, given that even they realize there's an issue, and that's WHY there's a sticky on it.

But you all won't realize that. Keep defending, bois :D
--------
CCP knows better than the players what's good for their game. SOE knew what was best for SWG too. Better than all those players that left too. This is your NGE/CU moment, CCP.

Originally by: Marchocias
I know that y'all cannot help being computer illiterate... Your inherent inferiority means that your ignorance is guaranteed.

However, could you PLEASE get it through your thick skulls that it is IMPOSSIBLE for CCP's code to have fried any GPU that was properly designed, manufactured, installed and maintained. If your GPU broke, it was the fault of the manufacturer for designing/building it badly... or it's YOUR fault for not ensuring it was correctly maintained.

Actually, on the contrary: you can easily fry your properly cooled graphics card. Although not the GPU itself, most graphics cards are known to fry their PWMs in some situations, most notably when the framerate exceeds 1000 fps, which is easily achievable, for example, in menus or during pre-rendered cutscenes if the frame limiter is not properly implemented (or not implemented at all). This is because GPU power draw peaks at a certain moment during the rendering of each frame, and if those peaks match (more or less) the PWM working frequency, which is usually between 1 and 1.5 kHz, the power regulators start to resonate and can burn in seconds or milliseconds if the frequencies are a close match, faster than temperature protection can kick in.

This, however, doesn't apply to EVE, since it has a working frame limiter. If anyone burned a GPU with EVE, it was probably because of a heatsink clogged with dust and the GPU working close to its thermal limit for too long. But don't call people illiterate when you could clearly do much to expand your own knowledge.
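For what it's worth, the frame limiter that prevents this is conceptually tiny. A minimal sketch in Python (run_with_frame_cap and render_frame are illustrative names; a real client does this inside its render loop, but the idea is the same):

    import time

    def run_with_frame_cap(render_frame, max_fps: float = 60.0) -> None:
        """Render in a loop, sleeping so the frame rate never exceeds max_fps.

        Without such a cap, a trivial scene (a menu, a loading screen) can
        render thousands of frames per second, pushing the per-frame power
        spikes toward the ~1-1.5 kHz PWM range described above.
        """
        frame_budget = 1.0 / max_fps
        while True:
            start = time.perf_counter()
            render_frame()
            elapsed = time.perf_counter() - start
            if elapsed < frame_budget:
                time.sleep(frame_budget - elapsed)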

Originally by: Marchocias
I know that y'all cannot help being computer illiterate... Your inherent inferiority means that your ignorance is guaranteed.

However, could you PLEASE get it through your thick skulls that it is IMPOSSIBLE for CCP's code to have fried any GPU that was properly designed, manufactured, installed and maintained. If your GPU broke, it was the fault of the manufacturer for designing/building it badly... or it's YOUR fault for not ensuring it was correctly maintained.

You mean you can't stick dual GPUs in the "Sunday Superstore Special" with stock cooling?

I love how all these people complain about heat, but when you ask about cooling they won't list what they have.

Posted - 2011.08.30 17:01:00 -
[11]
Well, it's funny. I've been here since 2006, running two clients at once, and I clean out the inside of the computer regularly, yet until THIS BUILD I have never had heat issues with any game I've played EXCEPT THIS ONE.

That doesn't say to me that it's my ignorance of how heat works; it tells me this client is working my computer harder, as the computer is clean. It has a stock cooling system (or whatever it was when the computer was built when I bought it), and it has an ATI Radeon HD 4650 video card. But before you scream AHA THERE IT IZZZZ!!!!!

Note the part where I'd ALREADY SAID this is the ONLY GAME I'VE EVER RUN and ONLY IN THIS NEW BUILD that's caused heat issues. I've run this game previously without heat issues. Two clients, no problems. BUT NOW there is.
--------
CCP knows better than the players what's good for their game. SOE knew what was best for SWG too. Better than all those players that left too. This is your NGE/CU moment, CCP.

Posted - 2011.08.30 17:33:00 -
[14] Edited by: Alice Katsuko on 30/08/2011 17:34:20
It's partially CCP's fault. CQ is much more resource-intensive than the rest of the game at comparable settings. Most folk wouldn't expect a non-optional expansion (a check-box buried in the ESC menu doesn't really make a function like CQ truly optional, at least initially) to increase hardware requirements so significantly.

For example, my laptop runs the in-space portion of the game just fine on mid-range settings, outside of giant fleet fights. But entering CQ with those same settings makes the CPU and GPU temperatures both spike to 70C, and reach 80C within ten to fifteen minutes, despite a clean heatsink, cooling pad, and air conditioning.

Reasonably, I wouldn't have expected a relatively minor expansion to put that much stress on my computer compared with the rest of the game. Keep in mind that the original Classic graphics were not removed from the game for years after Trinity, precisely because the new Trinity graphics required greater system resources, so if anything it would have been reasonable to assume that any content requiring better hardware would be made fully optional.

The majority of PCs never use their whole power, not the GPU, not the CPU, not the RAM, so that's why companies screw you, the customers, and put in poor cooling systems. And why does this go on and on?

BECAUSE NOOBS DON'T CARE ABOUT COOLING THEIR COMPUTERS!

No program should use 100% of any system's resources on a constant basis. If it does, that's a sign of **** poor programming. Why? Because no matter how good a cooling system you have on your PC, there is only so much heat that can be moved off a component.

Now, I don't know whether or not Incarna is actually frying GPUs. I know that while mine are performing within spec for temperature, the new client is ridiculously resource-intensive.

Even though I am running a more than adequate rig with acceptable cooling, I still ended up turning off CQ due to how much running it degraded my system performance.

You simply cannot run it at acceptable settings if you run more than a single client.

I ended up having to get a second video card to run my secondary monitor and my additional accounts in order to continue running my little mining op while PvPing on my main account.

Even with this, the performance compared to what existed before is just horrible.

Oh, and to be quite frank, for all that resource utilization, the environment and avatars still look like **** that could have been released 10 years ago.

BTW, what's the point of working with NVIDIA to use PhysX if you aren't going to provide the option to offload that work to the PhysX-optimized GPU?

The majority of PCs never use their whole power, not the GPU, not the CPU, not the RAM, so that's why companies screw you, the customers, and put in poor cooling systems. And why does this go on and on?

BECAUSE NOOBS DON'T CARE ABOUT COOLING THEIR COMPUTERS!

This is very true. It's like when people were blaming that benchmarking program for their ATI 4850s and some NVIDIA cards cooking, when in fact the card itself was poorly designed and couldn't actually supply its own parts with the power needed.

CQ causes a high GPU load for sure, but my GTX 460 still only ever made it to 57C, which is about 3 degrees lower than it gets in Metro 2033. Though Metro also looks a metric ****ton better.

Originally by: Hicksimus
CQ causes a high GPU load for sure, but my GTX 460 still only ever made it to 57C, which is about 3 degrees lower than it gets in Metro 2033. Though Metro also looks a metric ****ton better.

And that's the really frustrating part. CQ looks "okay," but nowhere near as awesome as it should considering how resource-intensive it is. I maxed out most of the settings at one point just to see what the CQ looked like, and the avatars still looked like wax mannequins or something out of Hellraiser half the time. CQ might be awesome from a programming perspective, but the end-user result isn't all that great.

Posted - 2011.08.30 17:44:00 -
[18]
So... if it's our fault and we're all horrible noobs, then why is this the ONLY GAME that causes my rig heat issues?
--------
CCP knows better than the players what's good for their game. SOE knew what was best for SWG too. Better than all those players that left too. This is your NGE/CU moment, CCP.

Originally by: Alice Katsuko
And that's the really frustrating part. CQ looks "okay," but nowhere near as awesome as it should considering how resource-intensive it is. I maxed out most of the settings at one point just to see what the CQ looked like, and the avatars still looked like wax mannequins or something out of Hellraiser half the time. CQ might be awesome from a programming perspective, but the end-user result isn't all that great.

Indeed. I will concede that it is very heavy on resources, given that the result is only "quite nice... ish".
----
I like the base.

Posted - 2011.08.30 19:11:00 -
[20]
A lot of people here are saying stuff about poorly cooled systems. This isn't the case for me: my i7 is water-cooled, but I'm running a backup card right now because my good one died some time ago. My 8800-series card climbs to 70C idling in CQ.

My PC is custom built and has crazy good airflow, but what brought me to making this post is that even intense games such as Crysis and Far Cry do not bring the levels of heat on full load that CQ does, which for me is absolutely crazy.

Originally by: Garak Jakobs
A lot of people here are saying stuff about poorly cooled systems. This isn't the case for me: my i7 is water-cooled, but I'm running a backup card right now because my good one died some time ago. My 8800-series card climbs to 70C idling in CQ.

My PC is custom built and has crazy good airflow, but what brought me to making this post is that even intense games such as Crysis and Far Cry do not bring the levels of heat on full load that CQ does, which for me is absolutely crazy.

The 8800 was known to do that before CQ, however.

It's why I have a 430 and not an 8800 anymore.
Adapt and overcome or become a monkey on an evolution poster.

As for the chips without heat-protection circuits, that linked YouTube video compares a P4 to an Athlon XP+. AMD now has protections built into the CPU, although I will concede that GPUs, to my knowledge, do not have such protections. Also, even though the CPU may be able to protect itself, that doesn't mean the heat generated won't damage other components in close proximity.

If you're afraid to run your system at 100% CPU utilization, then I suggest you add better and more cooling, pure and simple. Places to do so are the chassis, CPU, GPU, chipsets, and PSU. Get the heat out fast enough and you won't have a problem.

Posted - 2011.08.30 19:29:00 -
[23]
Press Esc, go to Display, and uncheck the "load station environment" setting. This fixes almost all the CQ bugs I got, but the station hangar won't load anymore.
Eve online next expansion details

Originally by: Garak Jakobs
A lot of people here are saying stuff about poorly cooled systems. This isn't the case for me: my i7 is water-cooled, but I'm running a backup card right now because my good one died some time ago. My 8800-series card climbs to 70C idling in CQ.

My PC is custom built and has crazy good airflow, but what brought me to making this post is that even intense games such as Crysis and Far Cry do not bring the levels of heat on full load that CQ does, which for me is absolutely crazy.

There is no Prime95 for GPUs, unfortunately, but you can use rthdribl. It's probably the best tool for the job. http://www.daionet.gr.jp/~masa/rthdribl/

Technically speaking, it's a hardware/hardware-environment issue. While CCP's programming might be the cause by virtue of load, they operate within the parameters of the language and APIs that they have. I use air cooling on my 5770 and my card rarely gets past ~69C. Of course, the operating parameters for the 5770 put the max temp around 100C.

This fits within the findings from a TechSpot review of the 5770 done a while back:
The dual-slot cooler of the Radeon HD 5770 made little noise even when under load. We recorded a stress GPU temperature of 74 degrees, which is fairly low by today's standards. The idle temperature of 47 degrees was average, with many other GPUs operating cooler at idle. Still, far from a bad result.

Your 8800 is similar. nV cards have a thermal shutdown around 125C. Over 110C is more or less where the card starts to throttle down to protect itself. 70C is a safe operating temperature; the same site that reviewed the 5770 found a load temp of 83C with the 8800 GTS. Ultimately, I think that you're overreacting.

And, for the record, you don't "idle" in a 3D environment. It is not a still picture; the card is generating all those polygons and using all those shaders 100% of the time.
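To put the numbers above in one place, here is a small sketch that classifies a measured core temperature against the thresholds quoted in this post. The 110C and 125C figures are this post's numbers for 8800-series cards, not official specs:

    # Thresholds as quoted above for NVIDIA 8800-series cards.
    THROTTLE_C = 110   # roughly where the card throttles to protect itself
    SHUTDOWN_C = 125   # approximate thermal shutdown point

    def classify_8800_temp(temp_c: float) -> str:
        """Classify a measured core temperature against the quoted limits."""
        if temp_c >= SHUTDOWN_C:
            return "shutdown range"
        if temp_c >= THROTTLE_C:
            return "throttling range"
        return "safe operating range"

    print(classify_8800_temp(70))  # the 70C reading above -> safe operating range
    print(classify_8800_temp(83))  # the 8800 GTS load temp -> safe operating range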

The only way to cause hardware damage via software is via software overclocking, and EVE does not do this.

If you are having heat problems, then as said above, sort out your cooling. If your system is running properly, then in any game where you aren't hitting the max frame-rate limit, your system SHOULD be running fairly close to its max potential anyway. The only reason it wouldn't be is if parts of the system aren't being used or if the game you are running is poorly optimised.

On my machine, with all the features turned to full, there is no temperature difference between EVE, Dawn of War 2, and StarCraft 2 (which makes my CPU run hotter because more cores are running). World of Tanks leaves my GPU much cooler and only uses one CPU core, but that's simply because it's a very poorly optimised game.

Also, contrary to popular belief, temperatures of 150C+ are actually no problem for most modern CPUs and GPUs. The problem is that you get hot spots inside the chips at metal-melting, gate-popping temperatures; the higher the base temperature, the more likely you are to get one of these, and even a single speck of dust inside the chip when it is manufactured can cause this, hence the very strict conditions under which they are made. This is why every chip dies eventually, well, with the possible exception of chips under constant liquid-nitrogen cooling ;)

At the end of the day, if your setup is having a problem with temperatures at 100% usage, there are several options open to you.

1: Improve your cooling. More fans are NOT always better. You need through-flow over the CPU, GPU, and FSB. Heat-sinks on these components help (they draw the heat out to where it is more accessible to the airflow).

2: Under-clock your overheating components. This will prevent overheating, and you won't actually lose any performance if you do it right, since you are effectively setting your 100% performance max to the level your cooling can handle.

3: Buy components which don't run at stupidly high temperatures under full load. Certain ATI cards in particular are known for being 'egg fryers'. Review your hardware before purchasing.

Hope that helps.
-
The sword is only as sharp as the one who wields it.
Drenzul (My normal internet tag)

Posted - 2011.08.30 20:49:00 -
[26]
So I have yet to see an answer to why this game causes heat issues when no other does.

You can bang on about the cooling issues, but if there were a heat issue, wouldn't that issue be present in every game you play, then?

Or are you saying that the cooling system just chooses to stop when I open EVE?
--------
CCP knows better than the players what's good for their game. SOE knew what was best for SWG too. Better than all those players that left too. This is your NGE/CU moment, CCP.

Posted - 2011.08.30 20:54:00 -
[27]
If your GPU is overheating, or even close to it, you are doing something wrong, and most likely your computer isn't set up for heavy gaming in terms of air cooling.

Even if CQ is heavy, it doesn't stress the GPU that much. If it's overheating, your computer's airflow is just bad.

I'm running a client at almost full details at 1080p on a crappy i3-540 with 2GB of memory and a GTX 465 (one of the worst and hottest GPUs you can find), and it runs perfectly smoothly and far from very hot. CQ doesn't stress the GPU so much as it's just inefficient rendering-wise.

Posted - 2011.08.30 22:16:00 -
[28]
Just turn it off. It's simple: CQ is a resource hog. Games from 5 years ago look better than CQ, but CQ takes 10 times the GFX/CPU power to run.
Eve online next expansion details

Posted - 2011.08.30 22:56:00 -
[29]
Didn't Blizzard actually burn a lot of graphics cards due to faulty programming in SC2? And they did take responsibility for it. Trying to protect CCP in this case is meaningless, since there is already a precedent of a game company whose faulty code did burn hardware. Now, software CAN burn parts: overclocking software, for example, can overvolt hardware or set frequencies way beyond what it was intended to run at, despite the manufacturer's caution to keep it within reasonable limits. Obviously EVE Online doesn't do this, but the FACT is, my computer runs Crysis at FULL graphics way better and faster than Captain's Quarters, despite the former having way superior graphics, which means the coding in CQ is just not good enough, period. Argue all you want, facts are facts: CCP must fix their game, and if they were a reasonable company they would replace the burned video cards.

On the cooling issue: I use several aftermarket coolers in a proper full-tower case, and a video card with an aftermarket cooler, well greased and clean, yet I have been able to reach up to 100C. YES, enough to fry an adjacent network card, though the video card wasn't damaged; a very rare case that only happened in EVE Online. Fortunately my card manufacturer makes some really hardcore hardware and it has proven to be rather durable. (Can't say the same of my PSU supplier... I have burned five 700-watt PSUs in the last year.)

Originally by: Marchocias
I know that y'all cannot help being computer illiterate... Your inherent inferiority means that your ignorance is guaranteed.

However, could you PLEASE get it through your thick skulls that it is IMPOSSIBLE for CCP's code to have fried any GPU that was properly designed, manufactured, installed and maintained. If your GPU broke, it was the fault of the manufacturer for designing/building it badly... or it's YOUR fault for not ensuring it was correctly maintained.

****ing sick of morons like you.

My PC plays everything with everything set to high. My (noisy) GPU fan rarely comes on unless ambient temps are high, yet the instant I dock into a CQ environment, my GPU fan starts. What does this suggest to you?

Control groups:
- All general PC use, including watching TV: no GPU fan.
- High-level gaming at full settings, including the CoD series and Crysis: no GPU fan.
- EVE Online, in space: no GPU fan.
- CQ: immediate GPU fan switch-on.

In this experiment I have identified CQ as the reason my GPU fan comes on, indicating that CQ causes my GPU to heat up significantly more than ANY OTHER SINGLE THING I do on my PC.
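For anyone who wants to repeat this kind of comparison with numbers instead of fan noise, here is a minimal sketch that logs labelled temperature readings to a CSV file, one run per control group. It assumes an NVIDIA card with nvidia-smi on the PATH; log_workload and the ten-second interval are made up for illustration:

    import csv
    import subprocess
    import time

    def gpu_temp_celsius() -> int:
        """Read the GPU core temperature via nvidia-smi (NVIDIA cards only)."""
        out = subprocess.run(
            ["nvidia-smi", "--query-gpu=temperature.gpu",
             "--format=csv,noheader,nounits"],
            capture_output=True, text=True, check=True,
        )
        return int(out.stdout.strip())

    def log_workload(label: str, minutes: float, path: str = "gpu_log.csv") -> None:
        """Append timestamped temperature readings under one workload label."""
        end = time.time() + minutes * 60
        with open(path, "a", newline="") as f:
            writer = csv.writer(f)
            while time.time() < end:
                writer.writerow([time.strftime("%H:%M:%S"), label, gpu_temp_celsius()])
                f.flush()
                time.sleep(10)

    # Example: log ten minutes per control group, then compare.
    # log_workload("desktop", 10)
    # log_workload("EVE in space", 10)
    # log_workload("CQ", 10)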

My PC has a Windows Experience rating of 7.8 and is cleaned and maintained regularly, due to the fact it's built into a bookcase in my front room.
