What sort of code could I write in a low level language like C in order to induce maximum energy consumption (and hence melt my computer)?

I remember being shown graphs in my computer architecture class that showed most of the energy expended during computation was from moving bits around, and that the actual operations in the ALU didn't consume that much energy. Based on this information, I was thinking that writing some loops to copy random bits between a few arrays which fit in the L3 cache would heat my computer up the most.

The reason why I think the L3 cache would work better than copying to DRAM is because I've never seen or heard of DRAM needing cooling like the CPU does, so I figured DRAM doesn't consume much energy.

My question: would this actually result in more energy consumption than say, running your average computer stress test? I know the small FFT tests in Prime95 are designed to stress the fpu and the cache, so I wonder how this strategy would compare to that.

Also, does the same logic apply to testing out my GPU? Would it be better there to stress the main memory or the scratchpad memory? I know GPUs don't have much cache to speak of.

Unless the hardware has a serious flaw, the worst you'll be able to do is waste electricity and maybe shut down the system. The CPU and GPU are more likely to throttle and stay within safe temperature ranges, but if you have especially poor cooling (for example, a laptop packed full of dust), you might be able to get something hot enough that it shuts off to prevent damage.

Modern computers have very good thermal throttling - i.e., when they get hot, they slow way the fuck down to avoid overheating. Old CPUs (10+ years old by now) didn't always have thermal sensors, so you could get them to burn up by running even fairly standard programs without a heatsink - here's an example from back when Intel had thermal sensors but AMD didn't.

These days you can't really cause that to happen, but I remember seeing a StackOverflow post on how to get your CPU to use as much power as possible. The answer is here - this will burn a ton of power in a very localized area of your processor (the vector unit in each core), so expect it to start throttling pretty quickly if you try running that code.

If your goal is to ruin things and not merely waste electricity, throwing the hard disk drive head back and forth while repeatedly spinning the disk up and braking it can ruin a hard drive given enough time (it makes a cool noise, too).

As for the motherboard, write a script to flash the EEPROM repeatedly and it'll wear out, leaving a corrupt BIOS image.

For a final touch, create a looping system restart (as rapid as possible); if allowed to run long enough, the power supply will fail.

I have seen all of these things occur accidentally, so it shouldn't be too hard to make it happen on purpose.

Even on fairly modern computers, it’s possible to cause a lot of damage. For instance, you can brick your battery (or possibly set it on fire), or set your printer on fire. On older machines, I imagine it’s much easier to do terrible things. This Stack Overflow thread mentions melting magnetic-core memory and, my personal favorite, “moving the read/write head of a disk drive with the harmonic frequency of the drive cabinet, causing it to walk across a table and fall onto the floor.”

As other commenters have said, modern computers are pretty good at turning off before they get too hot, so any serious damage requires getting around the safety mechanisms first.

MostlyHarmless wrote:As other commenters have said, modern computers are pretty good at turning off before they get too hot, so any serious damage requires getting around the safety mechanisms first.

There have been exploits that enable code to be executed under Intel's System Management Mode - a highly privileged mode, way beyond kernel privilege. It is here that things like power management, security and cryptographic features, and thermal protection reside. Gain access to this and it's pretty easy to brick a machine.

So it does seem that GIMPS's prime-finding software using AVX can easily send processor temperatures to 98 °C in a few seconds. The only other code I've run that does the same thing is some rendering software written in pure Java (which I suspect makes use of AVX under the hood of the JVM).

Normal "heavy" usage without AVX usually doesn't see the temperatures go about 70.