
Also, the same article states that *they were not targeting either official, but the party they happened to be calling*, and that they were ordered to delete the records as soon as the involved parties were identified.

That does not quite sound like hiring a double agent from an allied secret service.

Even if I were to accept your explanation, what kind of mindset would shorten "this lousy excuse for a science" to "science"?

> On the other hand, the entire Heartland anti-AGW fund is smaller than the one bribe, er, "grant" paid to one NASA administrator, and a tiny drop in the bucket compared to the various government pro-AGW propaganda expenditures.

Taking into account the amount of factual results produced, I would say the Heartland Institute receives a disproportionate amount of money.

Science works in the sense that, for example, it allows us to build rockets, which got us to the Moon. If the Heartland Institute produces something similar, then I would consider putting it in the same league as a single NASA administrator.

With the same application mix, an even distribution of load is not a good sign: the sleeping CPUs have to be woken more often, so they cannot be switched to a deeper sleep state, and the scheduler is probably migrating threads between processors, which means more cache misses.

So, the load graph alone is not a good indicator. Of course, that doesn't make your observation any less valid.

Exactly for that reason, proving is done by deductive reasoning, not by testing. You are working on a different level of abstraction.

For proving, the number of variables or configurations is not a sensible measure of complexity. There are seemingly simple equations that have not been proven for decades or even centuries, and there are equations with an infinite number of scalars, each of which can take an infinite number of values, which are well understood, proven from several different angles, and used by undergraduate students every day.

The same way it doesn't "take 100 years to" write code that takes "every possible code path and input" into account, it doesn't take 100 years to verify it. Discovering an algorithm might take 100 years, but not writing the code. Those are separate problems, and usually one does the former, not the latter. Especially not in the cited case.

Writing correct code is about implementing an algorithm that already considers "every possible code path and input", and implementing it correctly. Software verification is purely checking whether the written code matches the algorithm; it is tedious, time-consuming, and error-prone in itself, but only takes a small constant factor of the time it took to write the code. Automated verification is a totally different beast, because there is provably no algorithm for it.

To my understanding, that is the quintessence of Gödel's incompleteness theorems: there are things which are intractable for automated systems, but not for humans.

The size of the "solution space" is mainly important for testing, which seems to have failed in the cited case.

Some things may be over-engineered. But in my experience, it is more often the case that people reinvent the wheel rather than bother to understand what someone else has done and how it is supposed to work. And over time it will bite: usually not the one who wrote the code, because that person is gone, but the project as a whole. And no, I don't see a difference between "own code" and foreign libraries; from a long-term perspective, it is the same.

> I don't see any reason, why the version would have to change in the middle of a file in any case.

It is probably not due to the fact that the version might change in the middle of a file, but for the case where you only have part of the file. This makes the format more robust and better suited for streaming: you can simply start sending from an arbitrary position, and the parser should be able to recover at some point.
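The recovery idea can be sketched with a toy, entirely hypothetical frame format (a magic marker, a one-byte length, then the payload); the marker is what lets a parser resynchronize when it is dropped into the middle of a stream:

```python
# Hypothetical frame format for illustration: 2-byte magic marker,
# 1-byte payload length, then the payload itself.
MAGIC = b"\xab\xcd"

def parse_frames(stream: bytes) -> list:
    """Yield payloads, skipping garbage until a frame marker is found."""
    frames = []
    i = 0
    while i < len(stream):
        j = stream.find(MAGIC, i)          # resync: scan for the next marker
        if j < 0 or j + 3 > len(stream):
            break                          # no further complete header
        length = stream[j + 2]
        payload = stream[j + 3 : j + 3 + length]
        if len(payload) == length:         # only keep complete frames
            frames.append(payload)
        i = j + 3 + length

    return frames

data = MAGIC + b"\x03abc" + MAGIC + b"\x02xy"
# Starting mid-stream (here: inside the first frame's payload),
# the parser still recovers at the next marker:
print(parse_frames(data[4:]))  # → [b'xy']
```

This is only a sketch: a real streaming format would also need to guard against the marker appearing inside a payload (e.g. with escaping or checksums), which is why recovery is "at some point" rather than immediate.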

Not really. Event-driven programming is more about decoupling caller and callee than about responsiveness. Instead of calling the handler directly by function call, the event either gets routed through a message loop or is fired over a delegate. Just try to handle a complex function in such a handler and see how responsive the GUI remains: it will become dead. Yes, you can work around it by moving the code into an OnIdle or Timer event, but that is essentially a poor man's multi-threading, as you now have to break your code down into smaller interruptible chunks yourself. Hence the standard approach suggested in API documentation: use a worker thread.
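The worker-thread pattern described above can be sketched in Python, with a plain queue standing in for whatever message-posting mechanism the real GUI toolkit provides (all names here, such as `long_computation` and `ui_queue`, are illustrative, not from any particular API):

```python
import queue
import threading
import time

# The long-running work happens off the "GUI" thread; results are
# posted back through a thread-safe queue so the event loop never blocks.
ui_queue: "queue.Queue[str]" = queue.Queue()

def long_computation() -> None:
    time.sleep(0.1)               # stand-in for the complex work
    ui_queue.put("done")          # post the result back to the UI loop

worker = threading.Thread(target=long_computation)
worker.start()

# A minimal stand-in for the event loop: it keeps handling events
# (here reduced to polling the queue) instead of blocking in a handler.
while True:
    try:
        msg = ui_queue.get(timeout=0.01)
        print(msg)                # the UI thread reacts to the result
        break
    except queue.Empty:
        pass                      # the loop stays responsive between polls

worker.join()
```

The point of the design is that only the cheap queue check runs on the event-loop thread; the interruptible-chunking problem disappears because the OS preempts the worker for you.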

Those sections refer to disclosure, not to "acquiring" information. As paragraph (c) points out, illegally acquired information is not covered.

> Here you go.

Well, that is fairly damning and hardly a moment of glory for the people involved. I have to give you that point. But fortunately, I can still retreat to ad hominem.

> > Which requirements of the FOIA have they supposedly been trying to circumvent?

> The emails referenced in the above link are subject to FOIA requests. Deleting them is a felony.

You keep reiterating it. I fail to see it written in law.

> Given the title of the leaked file, it is quite reasonable to conclude that the whistleblower was tasked with complying with an FOIA request, and when that request was denied leaked the information compiled to comply with it anyways. And quite rightly so, both as a matter of honour and a matter of law.

Hardly, because if the request was denied, the emails are de jure not subject to an FOIA request. So it was not within his rights to release the mail. That is why the police are investigating that person instead of the scientists.

Actually, no. A simple thought experiment: what would happen if you requested your professor's correspondence? Personal communications are exempt under Section II, 40, as they are protected by the Data Protection Act 1998, which overrules the FOIA, as explicitly stated in several places.

Let me reiterate: the FOIA is a personal right in your relation to a public authority, not to a person. There is no contradiction in the fact that a public authority is composed of private persons; it simply makes it more difficult to separate the two.

Strangely enough, the BBC refers to it as stolen. The police have been informed and are investigating. Unless you can point out under which law such an action is legal, my point still stands.

> In fact, *not* revealing it would be a crime!

Hardly; there is no legal requirement to publish one's personal communication, unless there is a court order.

> There is quite clear evidence [...]

I can only reiterate my wish for actual facts instead of half-baked assertions.

> evade the requirements of the FOIA

Which requirements of the FOIA have they supposedly been trying to circumvent?

> that is a felonious activity, to conceal your knowledge of it is the crime of misprision.

The FOIA is a law pertaining to the legal rights of a person in relation to a public authority. I am intrigued as to where you derive the legal framework for judging a person working there. Enlighten me by pointing out the name of the passed law and the section.

Even if it were a crime, you seem to claim that the persons in question are the perpetrators, which in turn would make not publishing it not a crime. The right against self-incrimination is fairly well established.

> Only instead of a few million Windows computers getting botted, our very economy is at stake from the "warmers" and their political machinations.

Those, as you call them, "warmers" are actually scientists publishing in peer-reviewed journals. Despite the illegal and unethical breach of their private communication, no new facts concerning data and/or methods have been unveiled; the affair has only added further to the list of ad hominem attacks.

Concerning the effect of the assumed counter-measures against climate change, I am astonished that you can claim to know the economic impact, as, at least to my knowledge, economic models are several orders of magnitude less reliable than climate models, as recent events may indicate.

Care to share your insight, which seems to exceed that of the tree huggers at McKinsey?