
I agree with your points to a degree -- and, no, I am not one of the skeptics with my head in the sand completely.

I do believe CO2-driven climate change is PART of the bigger picture and is in part driven by man -- or at least accelerated by us -- and I agree completely that accelerated global warming is a bad thing.

That said, *part* of the cycle is natural and would probably have happened regardless -- but the amount of CO2 being pumped into the atmosphere by hydrocarbon burning is minuscule in comparison to the amount stored in the permafrost layer and dissolved in seawater.

The other problem I have is that most of the climate models I have seen tend to ignore historical data before our own era and often take great liberties with (or completely ignore) such things as increases in cloud cover and increases in solar reflection due to snow cover.

Basically, yes, I feel global warming is occurring and we are partly to blame (though I think we are merely accelerating the inevitable rather than being the primary cause -- i.e. it probably would have happened anyway, we are just speeding it up by several hundred years, to our own detriment).

What I disagree on is the final outcome -- there is just as much evidence, and possibly more, that the final result of global warming will be to plunge the Earth back into another ice age. Earth is basically a self-regulating/self-correcting system and when it gets too far out of balance, it tends to over-correct with a vengeance.

Earth will survive. Some plants and animals will survive. The Ice Age will end and species will spread and evolve to fill the empty niches. Life will go on.

The thing many people overlook (the global warming people especially) is that if you go back and look at ice core samples and prehistoric patterns of glaciation, the current weather patterns look eerily similar to what has happened before.

Specifically: initial warming leads to the melting of the permafrost, which leads to a massive release of CO2 into the atmosphere. This promotes runaway global warming -- which unfortunately means greater ocean temperatures and much more evaporation. This means more rain and more SNOW.

Additionally, it tends to disrupt ocean currents and the circulation of heat from the equator to the poles (i.e. the vast majority of Europe lies at latitudes higher than much of Canada -- and if it weren't for the warm ocean currents it would have equally frigid weather).

The basic problem is that if you get a long enough period of heavy snow, you may eventually get enough snowpack to resist melting well into the summer months. This is exacerbated by the fact that snow, being white, reflects a HUGE amount of light/heat back into space. In essence, due to snowfall, cold weather is somewhat self-perpetuating.

Eventually you reach a situation where the amount of extra snow that falls in the winter is too great at certain latitudes to EVER completely melt in the summer -- and then things start going downhill from there. Thanks to the fact that evaporation, refreezing, and remelting act as a wonderful method of desalinization, you also end up playing merry havoc with the ocean currents (and end up with much more coastal ice formation, as the freezing point of the freshwater runoff is much higher than that of seawater). Eventually the currents supplying heat to the North Atlantic basically shut down altogether and things go to hell in a handbasket (i.e. hell freezes over!).

The point that most of the "global warming" alarmists miss is that data shows that in the past both average global temperatures *and* CO2 levels peaked at levels significantly HIGHER than they are right now -- immediately before the planet plunged into the next ice age.

People need to realize that ALL of recorded human history has occurred in the current warm interglacial period -- which is only the most recent one. Furthermore, they need to realize that these warm interglacials of 20K-25K years are the EXCEPTION, not the rule -- with ice ages of 100K years or more being the norm, and the interglacial periods sandwiched between them.

While the movie "The Day After Tomorrow" was largely pure BS, there were some grains of actual science behind it (albeit they sped up the timetable of events by several orders of magnitude to make it exciting).

Spacezilla writes "EA is dropping the bomb on a number of their video game servers, shutting down the online fun for many of their Xbox 360, PC and PlayStation 3 games. Not only is the inclusion of PS3 and Xbox 360 titles odd, the date the games were released is even more surprising. Yes, Madden 07 and 08 are included in the shutdown... but Madden 09 on all consoles as well?"

maccallr writes "The DarwinTunes experiment needs you! Using an evolutionary algorithm and the ears of you the general public, we've been evolving a four bar loop that started out as pretty dismal primordial auditory soup and now after >27k ratings and 200 generations is sounding pretty good. Given that the only ingredients are sine waves, we're impressed. We got some coverage in the New Scientist CultureLab blog but now things have gone quiet and we'd really appreciate some Slashdotter idle time. We recently upped the maximum 'genome size' and we think that the music is already benefiting from the change."

likuidkewl writes "Two super-earths, 5 and 7.5 times the size of our home, were found to be orbiting 61 Virginis a mere 28 light years away. 'These detections indicate that low-mass planets are quite common around nearby stars. The discovery of potentially habitable nearby worlds may be just a few years away,' said Steven Vogt, a professor of astronomy and astrophysics at UCSC. Among hundreds of our nearest stellar neighbors, 61 Vir stands out as being the most nearly similar to the Sun in terms of age, mass, and other essential properties."

MojoKid writes "The PC demo for Codemasters' upcoming DirectX 11 racing title, Dirt 2, has just hit the web and is available for download. Dirt 2 is a highly-anticipated racing sim that also happens to feature leading-edge graphic effects. In addition to a DirectX 9 code path, Dirt 2 also utilizes a number of DirectX 11 features, like hardware-tessellated dynamic water, an animated crowd and dynamic cloth effects, in addition to DirectCompute 11-accelerated high-definition ambient occlusion (HDAO), full floating-point high dynamic range (HDR) lighting, and full-screen resolution post processing. Performance-wise, DX11 didn't take as much of a toll as you'd expect this early in its adoption cycle."
Bit-tech also took a look at the graphical differences, arriving at this conclusion: "You'd need a seriously keen eye and brown paper envelope full of cash from one of the creators of Dirt 2 to notice any real difference between textures in the two versions of DirectX."

An anonymous reader writes "Ben Kuchera from Ars Technica is reporting that EA/DICE has substantially changed the game model of Battlefield: Heroes, increasing the cost of weapons in Valor Points (the in-game currency that you earn by playing) to levels that even hardcore players cannot afford, and making them available in BattleFunds (the in-game currency that you buy with real money). Other consumables in the game, such as bandages to heal the players, suffered the same fate, turning the game into a subscription or pay-to-play model if players want to remain competitive. This goes against the creators' earlier stated objectives of not providing combat advantage to paying customers. Ben Cousins, from EA/DICE, argued, 'We also frankly wanted to make buying Battlefunds more appealing. We have wages to pay here in the Heroes team and in order to keep a team large enough to make new free content like maps and other game features we need to increase the amount of BF that people buy. Battlefield Heroes is a business at the end of the day and for a company like EA who recently laid off 16% of their workforce, we need to keep an eye on the accounts and make sure we are doing our bit for the company.' The official forums discussion thread is full of angry responses from upset users, who feel this change is a betrayal of the original stated objectives of the game."

One of the things this article ignores completely is the area of embedded programming -- and there is still a LOT of it going on. There are still a ton of NEW projects being done based on 8051 and 6800 series derivatives -- and those are just two of the major architectures.

Even if you are not specifically doing embedded programming per se, the more you know about the basic architecture of your system the more you can help the compiler take advantage of it.

For instance, on the vast majority of processors, comparing a register to zero is a VERY low-cost operation. Furthermore, many processors have dedicated instructions that decrement a register, compare it to zero, and branch if it is not zero, all in a single step. If you enable the most aggressive optimizations, some compilers will attempt loop re-ordering (often with disastrous results), and they do sometimes succeed. HOWEVER, more often than not there are chunks of code inside the loops that prevent effective re-ordering from occurring. If you are aware of your processor's underlying architecture and intentionally write most of your loops to count down to zero in the first place, you make life much easier for the compiler and allow it to emit more efficient code as a result. This is just one small case.

Also, as far as hand optimizations go, I still do it quite often -- even at the raw assembly level. With visual inspection and manual adjustment I have proved time and time again that I can do a MUCH better optimizing job than the Keil compiler can. I can also typically get some gains on ColdFire/Freescale stuff as well, just not quite as drastic.

I have worked on many projects over the years and seen more bad programmers than I care to admit -- and the most recent/youngest batch has both some of the best and worst I've ever seen (with far more of the latter than the former). This is not their fault; it is the fault of what the universities are teaching them. At one company I used to work for (and this was a BIG company with over a hundred thousand employees worldwide) our local HR department had a standing policy NOT to hire Computer Science graduates for permanent programming positions unless they had interned with us first. Basically, all the CS grads had far too many theoretical and inefficient/unreliable programming ideas to unlearn to be useful.

Also, there are many cases where even hand manipulation at the raw binary code level is still needed and useful. Although most projects I deal with now, thankfully, use flash for code space, a few still use OTP (one-time programmable) parts. It was not that many years ago that I had to spend the better part of two months figuring out a way to "overburn" a set of parts by finding a safe location where I could turn existing instructions into NOPs followed by a branch to a new chunk of code at the end of our programmed space (when re-programming/overburning an OTP you can still turn a 1 into a 0, but not the other way around -- thankfully, the architecture we were using treated 00 as a NOP and we had left all the unprogrammed space as FFs). And yes, this is a very extreme example, but it let me find a fix that allowed us to use over 35 THOUSAND mis-programmed parts that otherwise would have had to be tossed (and these parts cost in the $12 range EACH).

Similarly, on some large-volume consumer products, manual code optimization, low-level coding, and hand tweaking are still the norm -- for a very simple reason: it saves money. It is almost always cheaper to use the micros with less onboard flash, and hand-optimizing the code often allows us to keep it just below the threshold of the next size part. Similarly, on one project I was on, we had 3 engineers spend 6 months hand-writing a custom DSP algorithm that allowed us to remove a filter circuit whose component cost was on the order of $0.15 USD (yes, 15 cents). The management team was utterly thrilled, as the volume for the circuit in question was well over 1 million units (you do the math -- on projects like this, engineering cost is essentially free; you throw as many resources at it as you can get for as long as you have and just hope you can optimize another penny or two out somewhere).

The world of Windows and GUIs is far from the only game in town -- it's just the one that everyone sees and thinks about. There are FAR more devices out there with embedded code in them than there are PCs. Just sit down and look around your house... think about your TV set (although these days that's pretty high-level code due to the ATSC switchover), your DVD player, your microwave oven, your thermostat, your home security system, your garage door opener, your printer, your scanner, your fax, your LCD monitor, your cordless phone, your freezer/refrigerator, etc. Even your PC itself contains many small embedded processors with their own custom firmware (e.g. the firmware on your hard drive, the firmware in the motherboard chipset [a logic block for an 8042 keyboard controller still exists in every chipset that can support a PS/2 keyboard], and the keyboard itself even has firmware...).

Whether you have ever realized it or not, embedded programming is EVERYWHERE.