Reading between the lines, it sure sounds like they just stacked two LCDs and bumped the brightness of the light source. Mind you, that's a very good idea. The new underneath layer probably only needs single R/G/B group resolution in order to achieve the claimed specs, making it somewhat easier to manufacture, although alignment is still going to be important to get right, as will appropriately close bonding of the two planes to control leakage from one luminance cell (for want of a better word) to the neighboring RGB cells in the color layer.

A highly-motivated enthusiast might be able to get close to the same results by merging two existing IPS monitors and bumping the light source brightness.
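For anyone tempted to try it, the back-of-the-envelope arithmetic is straightforward under an idealized model in which per-panel contrast ratios and transmission losses simply multiply (real stacks leak light between cells, which is exactly what the alignment and bonding concerns above are about). The figures here are rough assumptions, not measurements:

```python
# Idealized stacked-LCD arithmetic. Assumed, round numbers; real panels
# leak between cells, which this model deliberately ignores.
panel_contrast = 1000        # a typical IPS static contrast ratio, ~1000:1
panel_transmission = 0.05    # rough fraction of backlight a single LCD passes

stacked_contrast = panel_contrast ** 2          # contrasts multiply: 1,000,000:1
stacked_transmission = panel_transmission ** 2  # so do the losses: ~0.25%

# To keep the same peak brightness as a single-panel design, the backlight
# must be roughly 1/panel_transmission times brighter:
backlight_boost = 1 / panel_transmission
print(stacked_contrast, stacked_transmission, backlight_boost)
```

Which is why "bumped the brightness of the light source" is doing a lot of work in that description: under these assumed numbers the backlight needs to be on the order of twenty times brighter.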

That right there is the heart of the problem: an attitude that anything old is necessarily bad. That your otherwise calm and reasoned presentation allowed this pejorative to slip in betrays the psychological bias that underlies the wider arguments on the subject.

Lest we forget, Linux as a whole turned 25 recently. That's antique. Are you giving the whole thing up because it's old? Your favorite editor is probably (just going by popularity) either emacs or vi / vim. They are very, very old (heck, I've been using emacs since the early 1980s!). Are you dumping them because they are old? I hope you see why calling something "antique" is ill-conceived.

Now to make sure that my point is being made clear, allow me to be explicit: old does not necessarily mean bad, but it does not necessarily mean good, either. Things that are old now were once shiny and new, and weren't necessarily an improvement when they were introduced. But change merely for the sake of change -- which seems to be what was behind debacles in KDE, Gnome, systemd, and Wayland to name a handful -- is wasted effort. For systemd in particular, the primary argument for using it seems to be parallel init, something that as many others have pointed out really isn't much of an issue these days since (a) Linux is generally stable enough that reboots are rare (although there are specific use-cases that benefit, like demand-based VM creation), and (b) computers have become generally fast enough that reboots are inherently speedy.

Yes, sure, an interesting thought experiment, I suppose. Maybe. If you're the sort of psychopath who likes to pull legs off of small insects and animals just to watch them die. And, if that's the case, well, you need to be removed from direct contact with society and should be seeking treatment, possibly including protection from yourself. There is no legitimate use that comes to mind for a USB-killer other than to intentionally destroy property (unlike, say, a firearm which has legitimate uses beyond the raw ability to kill or maim). Moreover, it would seem to be targeted toward public-facing USB ports which are, in general, a public good, and destroying a public good brings us back to the psychopath issue.

For everyone else, well, that sort of creative energy is better put to more positive efforts.

Do not ascribe to malice that which can be explained by ineptitude. Or something like that.

My understanding is that the results on Google, both displayed search results and auto-completion suggestions, are based on a large number of factors including (wait for it) your personal past history of typing, and the most popular results.
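To make that concrete, here's a toy ranking model of my own devising (emphatically not Google's actual algorithm, and the queries, counts, and weight are all made up) showing how popularity plus a personal-history boost could shift suggestions:

```python
from collections import Counter

def suggest(prefix, global_counts, personal_counts, personal_weight=100.0, k=3):
    """Rank completions that match `prefix` by global popularity, plus a
    heavy boost for queries this particular user has typed before."""
    def score(q):
        return global_counts.get(q, 0) + personal_weight * personal_counts.get(q, 0)
    candidates = [q for q in set(global_counts) | set(personal_counts)
                  if q.startswith(prefix)]
    return sorted(candidates, key=score, reverse=True)[:k]

everyone = Counter({"voting for smith": 900,
                    "voting for jones": 400,
                    "voting for a third party": 50})
me = Counter({"voting for a third party": 5})   # a few personal searches...

print(suggest("voting for", everyone, Counter()))
print(suggest("voting for", everyone, me))      # ...and my suggestions shift
```

With no personal history the suggestions simply track global popularity; a handful of my own searches is enough to reorder them, exactly the effect described below.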

Expecting a "voting for..." search to autocomplete with both main candidates equally assumes that there are as many people typing one candidate as the other. That is an unfounded assumption that is likely false. I searched a few times for a third-party candidate and lo! my autocomplete changed. The horror! Bias! Lynch them! Or, wait, maybe if I were searching for widgets that might be a good idea. Maybe if I were looking for somewhere to vacation, that might be a good idea. Maybe, under circumstances that are not so emotionally incendiary, we might want exactly this behavior, because it works very well.

Any evaluation that demonstrates bias and is shocked by the results (or merely reports them) must first prove, unequivocally, that the underlying data are free of any fundamental bias of their own. Such a fundamental equality in the data (e.g. the same number of web sites for the two candidates, the same number of Twitter posts, etc.) is unlikely to hold in any of the cases being discussed in this thread.

In other words, the original posting is not news. It is, to use the current vogue terminology, fake news. Lying with statistics. Click bait. Something to be ignored.

And he got DAILY classified security briefings! And he talked to LOTS of foreign leaders!

Oh, wait, so did Clinton. Because they were both finalists for the most powerful job in the world, and both needed to be prepared by existing US and foreign administrations for a smooth transition to power.

I'd also say guys like Nate Silver and Sam Wang may want to find something else to do, because this election, even though I didn't believe it would, did end up being a Brexit-style vote, one where traditional demographic models failed utterly and the pollsters and aggregators by and large got it wrong.

Just because the models missed one election does not mean the models are useless, nor that they can't be improved. I'm sure that there is going to be a ton of analysis -- and rightly so -- to revise the models to the point that they would have accurately predicted the outcome. You don't abandon a solid, useful mechanism just because you encountered an exceptional case.

The assumption behind Noether's theorem, which states that continuous symmetries imply conservation laws, is that the universe is smooth: smooth in the mathematical sense that space is continuous and infinitely divisible. We know that last part isn't true: you cannot measure position to arbitrary precision in this universe.
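For reference, the textbook statement being leaned on here (a generic classical-mechanics sketch, not tied to any particular field theory):

```latex
% If the Lagrangian L(q, \dot q, t) is invariant under the continuous
% (smooth) transformation q \to q + \epsilon\,\delta q, Noether's theorem
% gives a conserved charge:
\[
  Q \;=\; \frac{\partial L}{\partial \dot q}\,\delta q,
  \qquad
  \frac{dQ}{dt} \;=\; 0 .
\]
% Example: invariance under time translation yields conservation of energy,
\[
  E \;=\; \dot q\,\frac{\partial L}{\partial \dot q} \;-\; L .
\]
% "Continuous" is what the smoothness assumption buys: the theorem needs
% differentiable symmetries, hence a continuum to differentiate over.
```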

Last I understood, the experiments that set out to prove space is quantized have not yet produced positive results.

I've been saying for years (probably could find my posts saying as much on Slashdot, were I less lard-assed) that there are two things that are going to screw DeBeers utterly and completely.

1. Diamond is a really quite nice semiconductor. Lots of good things about it.

2. The semiconductor industry produces single crystal ingots that dwarf a typical natural stone by, what, three orders of magnitude, at five to seven nines of purity. They know how to make big, ultra-pure crystals in vast quantities much, much better than Mother Nature. And they do it at low prices, too.

Once the semiconductor industry catches on to the idea of using diamond rather than silicon, it is game over for DeBeers.

Heck, there's already a huge market in industrial diamonds. I've noticed some jewellery designers starting to use them, too. Just a question of time until the death knell for DeBeers, and they know it.

I've seen SSIDs that were designed to repel freeloaders through an emotional response. My favorite was "Boston Police". Maybe it was the police, maybe not. Are YOU going to risk running a torrent through that AP?

Every now and then there's a phrase that's put forth in British English that has us Americans gob-smacked. For example, back when the Grexit was all the talk, and there was discussion in the British press of the "potential failure of the Greek government" we Yanks were all up in arms because those words mean "failure of the society's mechanism for sovereign rule." Failure of the government, in American English, only happens during things like revolution or invasion.

But to Brits, and those more familiar with the Parliamentary system, it means (to continue in American English) that the current executive-branch administration has lost power and a new administration will need to be elected through the normal mechanisms of the still-functioning political structure.

A phrase in the summary above sharpens the distinction: "Plans for Brexit are being challenged in a case with major constitutional implications, hinging on the balance of power between parliament and the government." Americans would think, "What? Huh? Isn't parliament part of the government?" A translation that would make us Yanks understand it better would be something like, "... the balance of power between the executive and legislative branches of government."

To paraphrase Phil's explanation: "we couldn't figure out whether SD or CF cards were better, so we decided to do neither."

And that's despite the fact that essentially every currently available consumer, prosumer and professional camera supports SD cards or some high-capacity variant thereof. As a semi-pro photographer (meaning I get paid to shoot events, but that's not how I earn all of my income), I have not used a CF-only camera for something like 10 years now. Heck, even Canon's flagship 1D line has supported SD since the Mark II back in 2004.

One is led to speculate that the real reason was to shave a few pennies off manufacturing costs by eliminating SD support.

According to the Wikipedia article on Yoyodyne (https://en.wikipedia.org/wiki/Yoyodyne), the one that Seth Godin started has been purchased by Yahoo! and is not the motorcycle parts supplier linked in the article summary.

I use the Fn keys every day to switch between virtual desktops (and thus between applications, as I use each of my F1-F12 desktops for a separate application; I can't figure out why everyone doesn't do that, since it's gobs faster than using the mouse or Alt-Tab). I won't ever, ever be buying a computer that lacks the function keys. Ever. Doing so would be highly counter-productive for me. If I were forced to use such a keyboard, it would only serve to engender deep resentment toward the manufacturer who created such an abomination, and toward the organization that required its use.

Is the smart thermostat we see today the same one that was there yesterday?

I bet this can be demonstrated to be equivalent to the halting problem. The question really should be: here are the specifications of a certain device (whether dictated by the manufacturer, or determined empirically): does the present device match them? For every query from here to eternity? Under all circumstances? That smells like the halting problem.

So, in other words, you can never be completely certain of the answer, only confident up to specific bounds. Maybe that's good enough, but $50K is not enough for that kind of work, and neither is it enough for the effort the general case would involve. A good solution to the problem is going to be the sort of thing that would take a startup into a medium-to-large corporation.
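The "confident up to specific bounds" point can be illustrated with a toy simulation (my own sketch, with made-up numbers): a device that deviates from spec only after some unknown number of queries passes any finite test budget just as convincingly as an honest one.

```python
def make_device(fail_after=None):
    """Simulated smart device: answers queries per spec, except that a
    compromised one silently changes behavior after `fail_after` queries."""
    state = {"queries": 0}
    def device(query):
        state["queries"] += 1
        if fail_after is not None and state["queries"] > fail_after:
            return "malicious"           # deviates from spec from here on
        return f"ok:{query}"             # spec-conforming answer
    return device

def bounded_check(device, budget=1000):
    """A finite verifier: it can only ever observe `budget` queries."""
    return all(device(q) == f"ok:{q}" for q in range(budget))

honest = make_device()
sleeper = make_device(fail_after=10**6)  # misbehaves far beyond our budget
print(bounded_check(honest), bounded_check(sleeper))  # both pass: True True
```

No finite budget distinguishes the two; all a checker can honestly report is "conformed for the N queries we tried."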

But there are really much better ways to avoid the problem in the first place. I mean, to paraphrase a professor of mine, we don't need a microprocessor in every doorknob. Just don't use the damned things. Your fridge does not need to be on the net. Nor do your chairs. Nor each door in your house. Your washing machine works perfectly well without being on the net. So does your garage door. The risks of putting highly insecure interfaces on such items just do not justify the potential benefit.

No kidding. I have a pretty sophisticated self-specializing application that we rely on in my lab. It occasionally needs to create a semi-custom C program to run a particular computation with new parameter settings. It's so fast to generate the few thousand lines of C code and compile them that you barely notice the times when it needs to spin a new specialization of the code versus running an already-compiled version.
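For the curious, that generate-compile-run loop can be sketched in a few lines (my own illustration, not the lab's actual code; the example computation and file names are invented, and it assumes gcc is on the PATH, skipping compilation otherwise):

```python
import subprocess, tempfile, textwrap
from pathlib import Path
from shutil import which

def specialize(n_terms):
    """Emit a C program specialized to one parameter setting: here the
    partial sum of 1/2^k for k in [0, n_terms) is baked in at generation
    time, standing in for a real semi-custom computation."""
    body = " + ".join(f"1.0/{2**k}.0" for k in range(n_terms))
    return textwrap.dedent(f"""\
        #include <stdio.h>
        int main(void) {{
            /* parameters were baked in when this source was generated */
            printf("%.6f\\n", {body});
            return 0;
        }}
    """)

src = specialize(8)
print(src)

if which("gcc"):  # compile and run the specialization only if gcc exists
    d = Path(tempfile.mkdtemp())
    (d / "kernel.c").write_text(src)
    subprocess.run(["gcc", "-O2", "-o", d / "kernel", d / "kernel.c"], check=True)
    print(subprocess.run([d / "kernel"], capture_output=True, text=True).stdout)
```

In a real system the compiled binary would be cached and keyed on the parameter settings, so the generate-and-compile cost is paid only when a genuinely new specialization is needed, which is the effect described above.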