Disaster was averted so narrowly! He was so busy defending Christianity from the looming threat of pagans on Fox News that day that he couldn't be lost along with the plutonium and cause the apocalypse. Simply a scheduling conflict. Does Tuesday the 29th work better for you?

While it's true that a version transition like that hasn't happened again, it's utterly naive to think that, just because newer versions are available, they would be suitable for an industry such as ATMs or banking, which requires serious reliability. There have been periods during 2.6 and 3.x where, for instance, ext4 drivers would silently corrupt data for a few stable patches.

You're talking about going FAR beyond the notion of "long term support", into "indefinite, forever, and guaranteed stable/regression-tested/quality-assurance-tested" support of one particular version for at least the next 10-15 years. Version upgrades (like from 3.2 to 3.3) would be completely out of the question, because they could introduce a regression and take down a nationwide ATM network, or silently corrupt transactions without any way to easily debug and fix it in a short period of time. Every ATM also has to be effectively interchangeable with any other ATM. Version changes break the ABI and require any external modules (likely required for an ATM) to be recompiled. The slightest mismatch in perceived ABI between modules will cause corruption and/or crashes. So you are, in essence, only talking about security fixes on an already-reliable version, and those security fixes can't change any data structure sizes or layouts.
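To illustrate how a single field inserted into a struct breaks module ABI, here's a minimal sketch. The struct names are hypothetical, and it's modeled with Python's ctypes rather than real kernel headers, purely to show the layout shift:

```python
import ctypes

# "Old" layout a module was compiled against.
class RequestV1(ctypes.Structure):
    _fields_ = [("flags", ctypes.c_int),
                ("buffer", ctypes.c_void_p)]

# "New" layout after a field was inserted upstream.
class RequestV2(ctypes.Structure):
    _fields_ = [("flags", ctypes.c_int),
                ("deadline", ctypes.c_long),  # new field shifts everything below it
                ("buffer", ctypes.c_void_p)]

# A module built against v1 still reads 'buffer' at the old offset; on a
# v2 kernel that offset lands inside 'deadline' -- silent corruption.
print("v1 buffer offset:", RequestV1.buffer.offset)
print("v2 buffer offset:", RequestV2.buffer.offset)
print("sizes:", ctypes.sizeof(RequestV1), ctypes.sizeof(RequestV2))
```

The offsets and sizes differ on both 32- and 64-bit layouts, and nothing warns either side: the stale module happily dereferences whatever bytes happen to sit at the old offset.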

That is basically what Windows Embedded versions offer. The kernel-layer ABI sometimes changes in consumer Windows versions (which include the enterprise and server versions) due to compatibility or security fixes. It's still the #1 cause of BSODs from antivirus or other security software.

The linux kernel's QA is basically "if it compiles and looks remotely OK, it's fine to put in a stable version".

FreeBSD would be a much more appropriate target for such a device, due to strong QA practices (everything must be tested) and a total commitment to maintaining both the kernel and userland ABI for an entire major release version cycle, except that of course it's more or less only available for x86 (which is one of the reasons why Sony put ORBIS right over the top of a FreeBSD 9.x base).

So the other suggestions on this story, about embedded-specific OSes where you can buy upstream support forever (such as QNX), are entirely correct. Linux has to be treated with an extensive set of rules that nothing else requires. Newer Windows Embedded would have to be replaced too often and takes far too long to boot or service.

Once you commit to an ABI, though, that's it. The hardware can't change (except possibly external/add-in peripherals that are optional), and kernel modules can't be recompiled. No library versions will increment. System services and other binaries will also rarely change, and only for serious and required fixes that have been extensively tested. But any individual binary or library on the system, including any kernel-level code, has to be completely and seamlessly interchangeable with any other. That may well be a binary that's 15 years old, or one from 15 years in the future. That's how all of that works. If it doesn't work that way, it'll cause serious problems, and chances are the company was too cheap to have a "plan B-H" to get things working within the hour.

Unless I'm reading it wrong, it previously appears to've been released under a BSD-like license that is non-copyleft and allows commercial redistribution. The only reason it's GPL-incompatible is that it specifies the venue of law under which the agreement is binding.

And they aren't dual-licensing, but simply relicensing from one to the other. That...is actually a step backwards. In general. I suppose for this particular code release, there's no difference of practical value, but in general it's still going in the wrong direction.

Someone else mentioned that Comcast has to follow net neutrality rules for a few years regardless of court rulings. I'm getting full speed to AWS during prime time (single data point, but people keep saying it's an issue during prime time).

There's been much discussion lately about how to prove whether Comcast is throttling Netflix, or if Netflix is simply vastly over capacity and throttling everyone. Netflix using and depending on AWS is quite the opposite of their claims lately (toward the end of 2013) that they have a separate distributed network and offer to put Netflix servers on-premises with ISPs (as long as the ISP drops peering charges).

Given that Amazon has enough bandwidth to cover them, and that Amazon isn't being throttled (at least by Comcast), it appears pretty definitive that Netflix simply isn't increasing its AWS scaling as demand increases, despite posting record profits. Shock of shocks, Netflix likes extra money more than ensuring reasonable service for all of its paying customers.

Even if Verizon is throttling it any, Netflix is probably throttling it considerably harder based on recent reports and local tests.

Also, if the submitter is paying for FiOS and only getting 12mbit max... A) I'd better hope that's on the 15/5 plan, B) dear god, why is that as expensive as Comcast Blast! and only 15/5? Comcast is in the process of making the same tier 105/12 (currently 50/10). (I got 60mbit on all of the 'net neutrality' tests linked.)

"The European Union approved the use of paraquat in 2004. Subsequently Sweden, supported by Denmark, Austria, and Finland, brought the European Union commission to court. In 2007, the court annulled the directive authorizing paraquat as an active plant protection substance."

"It is also toxic to human beings and animals. Research has shown that it is linked to development of Parkinson's disease."

I'm only mildly surprised that I didn't see even a single semi-helpful comment yet. Everyone else appears to be bitching about the question, and saying that USB isn't capable of that. It is, of course, with a few caveats. But assuming you're not connecting a HID device to a USB3 port under Linux and expecting better than 125Hz (a driver limitation; it works fine on Windows), it all works out of the box, assuming you have the right devices connected.
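For what it's worth, the 125Hz figure under Linux comes from the usbhid driver's default polling, which can be overridden with a module parameter. A sketch, assuming a reasonably modern stock kernel with usbhid loaded as a module:

```shell
# Current override (0 means "use the interval the device requests")
cat /sys/module/usbhid/parameters/mousepoll

# Force a 1 ms (1000 Hz) poll interval for mice; re-plug the mouse afterwards
echo 1 | sudo tee /sys/module/usbhid/parameters/mousepoll

# Persist across reboots via modprobe config
echo "options usbhid mousepoll=1" | sudo tee /etc/modprobe.d/mousepoll.conf
```

Whether the device actually delivers 1000 Hz still depends on the hardware honoring the shorter interval.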

A4Tech makes pretty high quality low latency mice, and apparently you have one, but they also have a similar line of keyboards. 1ms response time on both keyboard and mouse, 8-key rollover on keyboard with programmable macro buttons. Try the A4Tech G800V. It's fairly often on sale somewhere.

My only gripe with it is that it has a 'large' enter key and \ is in a funny place. It's well engineered, and not completely over the top like some other 'gaming keyboards' that offer break-off numpad and other weirdness. It's pretty similar to a standard keyboard, merely with the addition of some macro buttons.

Are you kidding me? They release an indie game with absolutely no advertising. They put it up on a pirate website themselves with a known-bad copy. A few hours after going on sale, they're laughing at pirates and saying they have a huge piracy rate. This IS their advertising strategy, and it's as bad as they come.

When Hotline Miami released, it was available on multiple stores, was receiving a lot of coverage by major sites like Rock Paper Shotgun, and when a pirate version was released? They supported it as if it was official, because they didn't want pirates to get a bad copy of the game. They treated it like advertising, handled it well, and made significant profit with over 130,000 legitimate copies sold, and multiple ports and sequels in the works.

Hotline Miami got significant positive coverage because it was a good game, and they handled things right. This is a dismal thing which they admit is a poor clone of another game, and instead of going to bat for it, they shoot themselves in the foot and have the gall to whine for sympathy when they put it on a pirate site themselves, made it a known-bad copy, AND then proceed to laugh in people's faces after a few hours, while doing absolutely nothing else to promote themselves or their game? Let alone produce something reasonably innovative or fun?

It's been worrying me that the tagline "News for nerds, stuff that matters" has been removed from Slashdot (it's still in the source code, but gets replaced on any/all page loads), but this story also comes two full days behind both TFA and the actual patches being available.

It's no "Preskill mocks Stephen Hawking" quote from 2012, like the other article, but maybe this could've ended up -slightly- higher priority given that it fixes 1-2 remote unauthenticated exploits in Java, and IIRC 3 in Oracle DB.

Well, what does it do that Windows 7 doesn't? Not counting the whole "app store" paradigm, or that live tiles work like Dashboard... A native USB3 and bluetooth stack, as well as native mobile broadband support. Driver and application stacks can now lightly 'plug in' on top, instead of having to replicate an entire stack themselves. This notably made both my USB3 and bluetooth drivers smaller and use drastically less CPU for the same functionality. They still offer features beyond the 'standard' support, but everything works out of the box.
Enhanced Protected Mode for those crazy enough to use Internet Explorer as their main browser (apparently quite a few people).

WDDM 1.2 (for the video drivers) is more useful than people give it credit for. In addition to improving performance slightly (according to third parties anyway, I haven't noticed any difference) in a few isolated cases, it drastically improves GPU multitasking granularity as well as preventing legacy apps from needing to disable Aero for compatibility reasons. Everything that had to disable it before "just works" now, and all the cases where Vista and Windows 7 would make the UI unresponsive under GPU load are now quite butter smooth. This has caused me to not notice a few times when I was running multiple windowed games at once because I forgot to shut something down. :P

Does native Hyper-V support on the desktop version count? It doesn't even mess with your GPU performance, like the old versions used to.

And generally superior power savings. I can't say how well it works for a laptop, but it can save an extra 0.5W on my desktop's CPU when idle (balanced, not power saving), when it was already under 2W.

People complained about Windows 7's "improvements" too, in case such recent history was forgotten, including the annoying "libraries" support, and people becoming confused by Aero Peek's sudden transparency if you put your cursor in the wrong place. Windows 7's main improvements were kernel/driver related, much as Windows 8's are.

Some of the changes in applications or driver-level stuff (like networking) will primarily benefit those businesses (which I might consider strange) that are using Windows Server in production, such as far lower CPU utilization for the networking stack itself, though dependent on non-consumer-class networking hardware. This includes datacenters and financial stuff (for which specific options have been put in) which need as few microseconds of added latency as possible.

And don't get disingenuous on era gaps, please. 2K was the "reinvention" compared to Win9x and NT4. Vista was the overhaul (not quite as dramatic) compared to XP. Windows 7 was an iteration. Windows 8 is, by all conventional standards, another iteration. Most Microsoft devs would probably say it's nearly as big as XP to Vista, but I'd disagree.

Microsoft provides the standard. They didn't actually remove the ability for third-party software to override that. There are A GREAT MANY (over two dozen last I checked) start menu replacements that give you a functionally (if not aesthetically) identical start menu to Windows 7, boot you direct to desktop, and effectively disable any chance of 'accidentally' activating Metro. It's very disingenuous to say that is worse than Windows 7. Many people hated that Windows 7 completely and totally removed all traces of the Win2K "classic" menu. It got a lot of people to pay attention to start menu replacement applications, which had previously been rather niche.

Windows 8 is "forcing" nothing more compared to what Windows 7 "forced" on former Vista users. Just because it's a "tick" release instead of a "tock" release doesn't mean it's automatically horrible. Win2K was a "tick" release, but many people did like it and found it to be very stable for what it was. If you look at any of the threads (including Slashdot) mentioning ANY other windows release, the same year as the release, you see very similar complaints, flaming, and generally chicken-with-head-cut-off panicking. Then when it comes time for the next release, people vehemently defending the old releases. And yes, I did check earlier today.

You dislike it. That's terrific, and you have every right to, but as the old saying goes... don't piss on me and tell me it's raining. The only incidents I'm aware of where Windows 8 behaves in a manner that isn't easily and permanently corrected in two minutes inclusive (google, click, click, download, install) are purely vendor issues (like the touchpad one). These happen every time a new version of Windows is released. Vendors are almost never on the ball, and almost never release drivers or updates on time. If you use an OEM computer, it's always been the prevalent advice to wait around a year for OEMs to get their stuff together. If you built your own, you aren't likely to encounter those issues.

The problem is between keyboard and chair. Beyond the six or so things that everyone needs to learn for any new or different UI, even Metro is easy. The fact that you can effectively turn the whole damn thing off (start screen replacement, disable windows firewall, and nothing will load) mostly means that people are being a "tad" stubborn about not wanting a start menu replacement rather than disliking Windows.

If all Windows updates can be considered incremental, when do you get an upgrade? Not counting that an upgrade version is a mere $40 still, and just $140 for a full retail Professional box...while a retail Windows 7 Ultimate (W7 Ultimate == Windows 8 Professional) box is $320...it doesn't do things 'worse' than W7.

They actually, for once, worked exceptionally hard on backwards compatibility, and on getting driver vendors to produce early support BEFORE it went RTM, let alone before it hit official retail channels. Anyone who was an early adopter on Vista, Windows 7, and even the venerable, much-loved XP (if I could get away with running XP x64 still, I would), knew that it generally took up to a year for both AMD and Nvidia to release drivers without serious flaws specific to the new OS. They drastically reduced the price and made it very competitive. They streamlined things so it boots faster, and uses less memory and power while allowing more functionality. It's equally customizable and tweakable. You can simply replace Metro and have your old UI back. There's even ELAM so your antivirus can check out -all- drivers and services loading on the system to make sure they're safe. People can moan all they want, but can Windows users genuinely say, given all those facts (not even counting the stuff already mentioned previously), that a $40 upgrade is actually not worth it? That seems pretty unlikely.

People mostly complain about one aspect which can be hidden. Out of all of the issues I had at the MSDN launch date, 4 months ago, none remain that would be pertinent to ~98% of users, including 90+% of power users. Even then, everything worked out of the box as advertised. Naysayers will nay-say, but I can't recall any other Windows version launch which was as smooth, either on the technical or business (Microsoft) side of things.

It even lets me (just like Windows 7 :p) run the BIOS clock in UTC and run the official ntpd source code to sync up to time servers. Windows 8 also has a new timer API which allows ntpd to get 0.3 microsecond accuracy from the default timer, just like it does on Linux.
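Clock granularity like that is easy to sanity-check from userland. A small portable sketch (Python here for portability, not any particular ntpd internal; `measure_clock_granularity` is just an illustrative helper):

```python
import time

def measure_clock_granularity(clock=time.perf_counter, samples=10_000):
    """Return the smallest observed nonzero step of a clock, in seconds."""
    smallest = float("inf")
    last = clock()
    for _ in range(samples):
        now = clock()
        step = now - last
        if 0 < step < smallest:
            smallest = step
        last = now
    return smallest

if __name__ == "__main__":
    # On most modern systems the high-resolution counter steps in the
    # nanosecond-to-sub-microsecond range.
    print(f"perf_counter granularity: {measure_clock_granularity():.9f} s")
```

Note this measures the resolution of the counter, not its accuracy against a time server; ntpd's sync quality is a separate question.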

And sadly? That's mostly off the top of my head in ten minutes, and I still dislike Windows in general, I just don't believe that Windows 8 is in any way somehow worse, let alone inferior to Windows 7. In essentially all technical aspects, it's a big improvement. If you don't like the aesthetics, there are many simple and free methods by which you can permanently disable, or change them. If you've never done that before, or don't know how to install a start menu replacement, I'm sure there are some people or forums who would be willing to detail a step-by-step guide if you didn't want to read existing ones.

Just wanted to come back and point out that if you read the previous articles this author did for HardOCP, he slagged on Windows Vista for pretty much the same concerns, and also said "Yes, it is possible to enjoy both Windows and Linux - but unfortunately this product is unfit for any user." There's an EDITOR'S NOTE attached that says: "The fact is that Vista is far from "unfit for any user," and this statement by the author is simply incorrect."

Coincidentally, it was his last article on the subject for HardOCP. I wonder why...

I thought in a general sense, as a community, we'd moved past cheering for nerd-rage melodrama. Windows 8 makes a few gaffes, but they're largely the same problems that Windows 7, Office 2007, and others started introducing. It can be annoying, but it's the same stuff taken to a reasonable next step, as well as UI unification between desktop, laptop, and tablet.

None of that is necessarily a fun thing, but OSX has been pushing many similar UI changes for longer. A lot of people were unhappy with Lion's increasing similarity and unification with iOS, just in case anybody actually forgot that in less than a year.

The bottom line is, 8 works in the same ways as 7, just with some added complexity. The easiest way to almost entirely remove that complexity? A start menu replacer. People recommend Start8, ViStart, and others. My personal recommendation is "Classic Shell". It works exactly the same as it used to on Vista+, except it adds the "Apps" to the start menu as well.

But even so, why wouldn't somebody be able to figure this out? The video author was squealing about how the start menu "hurt him deeply". Trackpads aren't really supposed to do "touch gestures" by default. It's vendor opt-in. Logitech opted in, and chances are, this guy didn't install whatever Windows 8 drivers or control panel may or may not be available. Either way, it's a vendor issue. Just like 'no install/repair/recovery/etc' disk is a vendor issue. If you don't want vendor issues, you don't buy things from those vendors.

All of the UIs (Windows 95 through W8, OSX, KDE, iOS, Android, etc.) are different. What they all have in common is that there are roughly 6 different things you have to know about each; then consistency covers all of the multi-step operations and using various applications. Occasionally you get something that breaks out of that a bit (Office 2007+). There are so many "advanced" things, like command-line digging or reinstalling from scratch, that the overwhelming majority of people will simply ask a friend for help or pay a PC repair company. That's pretty much regardless of operating system.

But I digress. The rant is pretty simply over the top drama. It should sell itself as entertainment (if it at least had any humor), not as something relevant to 'tech news'. It's not politically correct to mention, but this guy sounds and acts like the stereotypical nerd, going into a panicky, narcissistic rage about primarily one change that, overall, isn't that significant to day to day use, AND for which there exist free, open source, and easy to use workarounds, while still obtaining benefits of a newer OS.

He himself admits he only tried it for 30 minutes, in a coffee shop, and didn't bother one iota further. Personally, I've been using it for 4 months (and preview versions before that) with NO issues that would meaningfully impact your average, or above-average, user. All of my personal complaints are exceedingly specific and technical, and have mostly been taken care of by various updates.

And, in the interest of disclosure, I'm not the kind of person who likes Windows, or most other OSes, in a general sense. I prod and patch kernels, have no problems custom-rolling EFI stub-only boot on Linux, etc. What I really miss is being able to run highly customized FreeBSD and still use ~90% of my Windows games at full speed. That's mostly a hardware/driver/wine(!) issue, though.

So when I say I'm using Windows 8 in the exact same manner as I use Windows 7, I'm not exaggerating. I actually like the availability of some of the new W8 features. I middle-click on the start button (or use Shift+Windows) if I want to see live tiles like the weather... just like on OSX, where you use F12 to get the Dashboard to pop up a full screen of 'one glance' kinda information. Even before using Classic Shell, the only quirk I took issue with on the 'start screen' is that when typing for programs, it wouldn't search for stuff like control panels "by default". You'd have to move the mouse over to select "settings". Most of the start menu replacements, however, work exactly like W7 and others did, in that they search everything.

I know it's really popular in tech circles to have a hate-on for Windows 8 right now, but most of the people raging about it have never even tried it. Most of the people who have tried it, but are still screaming bloody murder, haven't tried it for long (30 minutes isn't even long by twitchy standards), or on a regular PC without vendor screwups. Every version of Windows since 95 OEM (95 OSR2-C was fun) has had serious vendor problems, in that vendors mess everything up.

Also not mentioning that you cleared (you didn't actually say cleared though) LFR, not normal or heroic? I'm sorry, you really don't seem like you're the type to've been part of Conspiracy's world-first normal. If you don't have the patience to even "challenge" yourself with normal difficulty, let alone heroic, I'm sure you're one of the reasons Blizz is making everything "Barbie Play Time" easy in MoP. It's not because the heroic raiders are having trouble, it's because the average subscribing customer has the attention span of a horny gnat.

Blizz loves the casual mom and pop soccer-van gamer.

(For non-WoW people, LFR is the difficulty equivalent of kittens playing with yarn and somehow beating the game. Normal is about what you'd expect from a game. Heroic is punishing. Not Super Meat Boy/I Wanna Be The Guy: Gaiden punishing, but pretty hard.)

Less than 15% of guilds (not individual players) world-wide have completed the last two encounters on Heroic difficulty at the current date. Roughly 100% have completed LFR, and 83% have completed normal.

My mistake. ASLR was enabled by default in VS2008, and could be selected in VS2005...and in the WDK for Vista and above, also by default.

The summary claims that "AlwaysOn" ASLR isn't enabled by default "because of AMD". The summary also claims that AMD drivers are unsafe and insecure. TFA claims that it isn't enabled because of "some software, including AMD". The fact is Microsoft declares forced ASLR unsafe -for a reason-. Forcing it on at that level has no benefit for the things that already support it, and can clobber any software or drivers that don't support address randomization.

AMD drivers apparently didn't (or still don't) support randomizing the base address. There are several reasons why they might not, including performance. It could also be that there's simply legacy code, or legacy OS support to worry about, since AMD's fglrx supports kernel ASLR on Linux.

If AMD supported this, would Microsoft change the default by Windows 9? 10? Or would there still be so many other vendors of non-video applications and drivers, some of them legacy, that one vendor wouldn't make much of a difference?

AMD's lack of support for a hidden, "marked as unsafe" boot mode has essentially no end-user impact, security, stability, or otherwise. If the boot mode is required to have randomized driver base addresses, Nvidia is no more secure by default, or by any reasonable means available to a power user or security professional. If Microsoft changes it to the default (and maybe they have in Windows 8, but I'm certainly not keen on testing recovery mode), I'm sure the driver-signing and WHQL requirements will be changed accordingly, as they have in the past.

Yes, it is sensationalist to suggest that specifically and only AMD (rather than Microsoft) has anything to do with a real problem. The summary and article are worded in such a way as to suggest blame and danger, which gets people in a furor...over nothing. Anyone with half a brain knows that this is ultimately a Microsoft policy decision, one which the vendors are effectively bound to comply with. Microsoft makes lots of things optional (in this case, optional, unsupported, and strongly discouraged), so if you think "forced ASLR" is something worth having, you could always write a news story about how "Microsoft makes security optional!!" instead. Just as sensationalist, just somewhat more on-target. Articles like this, including the one on CERT, are getting well into FUD territory.

I'd be surprised if anyone reading this has had "forced ASLR", as described in the article, enabled since Windows 7 came out. There's not much point in crying over what you never had, and can't really have, at least not yet, according to Microsoft. I don't really care if the few open source, mingw-compiled programs I'm using use ASLR or not.

Just since people can't seem to keep things straight... The last AMD vulnerability that I can find confirmed was in 2007, a local driver signing workaround, after which they had major overhauls (including on performance). Nvidia had two last year, one of which was a remote denial of service.

By the way, are you the same "anonymous coward" that submitted the article? :p

EMET is a tool Microsoft releases to enable specific settings; then they hide stuff like "AlwaysOn" behind a registry setting they term unsafe.

Nowhere on any of the linked Microsoft pages does it say that this "unsafe" mode is hidden because of AMD, unlike what the article boldly suggests. Microsoft would be unlikely to grant WHQL status to drivers violating something it actually wants on by default.

Nobody gets the EMET settings "by default". You have to download and run it, many options you have to enable per-program, and many programs don't work with it. The article they link to says Skype, Microsoft's own Silverlight, and World of Warcraft all don't work with the EAF option (everything is enabled by default for a program you select).

Nobody is getting, or would get, any of these protections "by default". So saying that AMD drivers "are making your computer less secure" is ridiculous, given that even if it's still an issue (the only linked mention hasn't been updated in over a year), it's limiting the maximum POSSIBLE security, which you would have to enable and run yourself...turning on settings that Microsoft deems unsafe, and knowingly risking making your machine unbootable. All for having ASLR "potentially" work for binaries that don't declare themselves compatible? Great...

Microsoft's own documentation says that all binaries can opt in to ASLR (just as they have to opt in to DEP), but it says nothing about system drivers. Out of all of the processes running on my system, only two (an IM client, and a mouse hook service) don't have ASLR. These days, in VS2010, binaries are compiled with the ASLR and DEP flags set by default. You have to specifically opt out.
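Those opt-in flags live in the DllCharacteristics bitfield of the PE optional header, set by the /DYNAMICBASE and /NXCOMPAT linker switches. A small sketch decoding the relevant bits (`decode_dll_characteristics` is just an illustrative helper, not a real API):

```python
# Standard PE DllCharacteristics flag values from the PE/COFF format.
IMAGE_DLLCHARACTERISTICS_HIGH_ENTROPY_VA = 0x0020  # 64-bit high-entropy ASLR
IMAGE_DLLCHARACTERISTICS_DYNAMIC_BASE    = 0x0040  # ASLR opt-in
IMAGE_DLLCHARACTERISTICS_NX_COMPAT       = 0x0100  # DEP opt-in

def decode_dll_characteristics(value: int) -> dict:
    """Report which mitigation opt-ins a binary's header declares."""
    return {
        "aslr": bool(value & IMAGE_DLLCHARACTERISTICS_DYNAMIC_BASE),
        "dep": bool(value & IMAGE_DLLCHARACTERISTICS_NX_COMPAT),
        "high_entropy_aslr": bool(value & IMAGE_DLLCHARACTERISTICS_HIGH_ENTROPY_VA),
    }

# A binary linked with both /DYNAMICBASE and /NXCOMPAT:
print(decode_dll_characteristics(0x0140))
# A legacy binary with neither flag set:
print(decode_dll_characteristics(0x0000))
```

A binary without the DYNAMIC_BASE bit simply loads at its preferred base; that's what the "forced" registry mode overrides, and why it can break things that were never built for relocation.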

EMET's own user manual says that it uses a different, conflicting ASLR implementation than what the system natively does...which might explain why fewer things are compatible with it.

TL;DR: There is no evidence whatsoever that AMD drivers make your system "actually less secure". There's one note that they "could" make your system less secure, if Microsoft were pushing a security option that they don't support.

People should focus on actual issues, instead of inventing imaginary ones just to try to make themselves more relevant and "in the news". I'm disgusted by CERT's behavior. I would've thought they'd at least stick to the actual facts of the case, instead of acting like the dime-a-dozen "don't need no fact checking" bloggers.

Disclaimer: I currently have an AMD card, have used both Nvidia and AMD cards since the late 90s with varying success.

^ So says the article... too bad MPEG audio (including MP3) wasn't finalized until November 1992, with a public release in 1993 and formal specification in 1994... (the first software MP3 encoder wasn't released until July 1994). Unfortunately, such a gross overstatement kinda makes me doubt everything else in the article. :P

Apparently I'm a very strange girl, which is supposedly why everyone hates me. It's funny that things can get so screwed up by people overreacting and being dramatic to every possible event, like going to sleep.