This is absurd; those temperature graphs appear nowhere else, and you want to turn a 5-year local trend into a failing of the large predictive models, which are successful.
You know, the very same Guardian newspaper she links to admits that she exaggerates the level of uncertainty (http://www.theguardian.com/environment/climate-consensus-97-per-cent/2013/sep/27/global-warming-ipcc-report-humans) (http://www.skepticalscience.com/certainty-monster-vs-uncertainty-ewok.html).
In essence, what you say is totally irrelevant to the larger trend.

"Does some QA" and "distributing" have been among the most innovative contributions from Canonical to the FOSS world, and one of the reasons why Linux is reaching the consumer mainstream.

Maintaining support channels, distribution, testing, QA, and infrastructure are very costly and time-consuming things, but they're the sort of stuff that separates enthusiast products from turnkey and consumer products. This distinction is not at all trivial, and I feel it is underestimated by most FOSS enthusiasts since, well, they've never been on the other side of the supply chain.

Basically, commercialization, distribution, integration, and support are the most costly parts of most products, more costly than the actual development. But that work is boring, tedious, and standardized, so it's not deemed to be of interest.

You think anyone in Europe or the US can just start throwing fiber without submitting to regulatory practices to be an ISP?
ISPs are held to many legal standards in data retention and privacy and whatnot.

But the thing is, predictive models, while imperfect, have had a significant enough degree of precision to merit the warnings.
You know, I'm surprised that laymen have no ability to distinguish between significant errors, insignificant errors, and acceptable margins of error. For AGW, being in the proper ballpark of the order of magnitude is a significant enough datum for systems which may have exponential behavior (or much worse, like the realities of the difficulty modeling climate). Considering how hard it is to make these models, and considering that the impact is still absolutely massive, it is intellectually dishonest to disregard these results.

In construction, materials are specified to withstand pressures and loads 30% above the expected usage. For all practical purposes it doesn't matter whether a structure can take 30% more or 200% more; what matters is that it clears that margin. Most constructions could stand double the load they actually carry. Well, in AGW, even 10% of the estimated damage is so great that it merits taking care of it.
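The safety-margin logic above can be made concrete with a toy calculation (the load figures here are made up for illustration; real design codes specify the factors):

```python
# Hypothetical numbers: a beam expected to carry 10 kN, designed with a
# 30% safety margin. Whether the as-built capacity turns out to be
# 13 kN or 30 kN is irrelevant for sign-off; what matters is that it
# clears the required margin.
expected_load = 10.0   # kN, expected service load (made-up figure)
safety_factor = 1.3    # 30% design margin

required_capacity = expected_load * safety_factor

for actual_capacity in (13.0, 30.0):
    # Both pass the check; the size of the excess is moot.
    assert actual_capacity >= required_capacity

print(required_capacity)  # 13.0
```

The same reasoning applies to the AGW estimates: once the projected damage clears the threshold that justifies action, being off by a factor doesn't change the conclusion.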

Engineers frequently make estimation errors of an order of magnitude, using incredibly precise measurement tools, and nobody complains. Let's not be fools about this either.

It is a small error in the grand scheme of things. Some measurements need only be precise to the order of magnitude to be significant. In this case, the fact that such a large amount of land can be underwater is still relevant even if they're off by a factor of 10.

The professional-grade alternative is that you drop half the superfluous bullshit J2EE seems to support and build an honest-to-God web architecture like God intended: shared-nothing workers on the web server, scale out on the database, and job queues feeding a compute farm if you need that. A decent web framework has transactions built in so I don't even have to think about the problem, SSO is handled with a cookie and a generic backend that plugs into whatever you want, and you don't need special clustering techniques because there are no hard dependencies between nodes.
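The shared-nothing worker idea can be sketched in a few lines. This is a minimal in-process stand-in (in production the queue would be Redis, RabbitMQ, or similar, and the results would go to a database); each worker owns no state beyond the job it pulled, so nothing couples the nodes:

```python
import queue
import threading

jobs = queue.Queue()
results = queue.Queue()

def worker():
    # A shared-nothing worker: pulls a job, does the work, pushes the
    # result. No state is shared between workers except the queues.
    while True:
        job = jobs.get()
        if job is None:          # sentinel: shut this worker down
            jobs.task_done()
            break
        results.put(job * job)   # stand-in for real work
        jobs.task_done()

threads = [threading.Thread(target=worker) for _ in range(4)]
for t in threads:
    t.start()

for n in range(10):
    jobs.put(n)
for _ in threads:
    jobs.put(None)               # one sentinel per worker
jobs.join()

total = sum(results.get() for _ in range(10))
print(total)  # 285, the sum of squares 0..9
```

Because workers are interchangeable and stateless, scaling out is just starting more of them — no clustering protocol required.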

Sorry, but J2EE has absolutely nothing a modern Perl/Python/Ruby framework doesn't have today, except mechanisms for managing the inherent complexity of a crappy language with a shitty object model that needs an overly complex architecture to make up for its fundamental lack of programming constructs.

You're not going to find equivalents to @Stateless beans, message-driven beans, or any of that bullshit, because decent frameworks make managing state absolutely trivial, not an exercise in frustration that requires configuring 5 XML files and implementing two interfaces just to save an object to a database. Nobody cares about RMI or "messages" because you can make a RESTful web service with 2 regular expressions and a 15-line Ruby file. Nobody cares about monstrous ORMs that can barely handle programmatic queries when SQLAlchemy does that, and with a 20-minute tutorial you're already up and running.
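The "web service with 2 regular expressions" claim is barely an exaggeration. Here's a toy dispatcher in Python (routes and handler names are made up; a real app would let a micro-framework like Sinatra or Flask do exactly this under the hood):

```python
import json
import re

# Two regular expressions mapping URL paths to handlers.
ROUTES = [
    (re.compile(r"^/users/(\d+)$"), lambda uid: {"user": int(uid)}),
    (re.compile(r"^/users/?$"),     lambda: {"users": [1, 2, 3]}),
]

def dispatch(path):
    """Match a request path against the routes and return a JSON body."""
    for pattern, handler in ROUTES:
        match = pattern.match(path)
        if match:
            return json.dumps(handler(*match.groups()))
    return json.dumps({"error": "not found"})

print(dispatch("/users/42"))   # {"user": 42}
print(dispatch("/users"))      # {"users": [1, 2, 3]}
```

Wrap `dispatch` in any WSGI/Rack adapter and you have a working service — no XML, no interfaces, no deployment descriptors.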

Oh, and just for the record, most people don't even need to check for memory leaks because they use a language VM that consumes at most 20% of your runtime's resources and does the hard work with C plugins.

Let the scourge of J2EE and "enterprise" frameworks burn in the underworld.

Well, which unregulated, unsubsidized utility that uses public space has ever succeeded? You simply can't have phones, internet, or wireless without regulation. The space for land lines is public, and the government must regulate to allow its use. Same thing with wireless spectrum.

Because knowing OS theory doesn't make you an OS specialist dedicated to implementing good practices on production systems. Even a kernel dev might not know how to install and deploy a production system and implement all backup, user, and processing policies.

Apparently those brilliant people who understand statistics can't be bothered to refute AGW. Might it be perhaps because some of them HAVE looked at the process, consider it sound, and don't find it worthy of their time to add an "I agree" to the endless posting of much more valid scientific work?

You're confusing a variety of unrelated things here. Javascript works fine in every browser that implements the standards accordingly (that is, every browser with the exception of IE 6, 7, and 8). The language is not only consistent across browsers, it's actually implementing some really interesting features such as list comprehensions, generators, and block scoping.
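For anyone unfamiliar with the features being name-dropped, here's what generators and comprehensions buy you, sketched in Python (the Javascript syntax differs, but the concepts behave the same way):

```python
def naturals():
    """A generator: lazily yields values on demand instead of
    materializing an (infinite!) list up front."""
    n = 0
    while True:
        yield n
        n += 1

gen = naturals()
first_three = [next(gen) for _ in range(3)]   # pull just what you need
squares = [x * x for x in first_three]        # a list comprehension

print(first_three, squares)  # [0, 1, 2] [0, 1, 4]
```

Lazy iteration and declarative list-building are exactly the kind of features that separate a real language from the toy people mistake Javascript for.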

And I don't know where you get the idea that debugging Javascript is any more difficult than any other scripting language. You can't claim to be a professional JS dev and never have heard of the tools available for it.

Oh, and as a scripting language, it is one of the fastest dynamically typed languages available, in the same league as Smalltalk and Lua. The fact that Palm developers obviously used the wrong tool for the job does not in any way detract from the qualities of the language.

Methinks there are a lot of people who talk crap about Javascript but have never bothered to get the proper documentation and tools. Newsflash for everyone: anyone who does professional Python or Ruby development uses debuggers and editors built specifically for that job. Just because JS runs in the browser doesn't mean it doesn't need the same level of attention.

What are you talking about? Javascript's only similarity to Java and C++ is its syntax, made to appease Java and C++ programmers. NOTHING else in the language is even remotely similar, and its developers have made that clear from day one.