Re: Sort of points out that winning against a multi-faceted adversary will never win

"What you say is only correct because the defenders use standard processes that are predictable: "common practice". Once you depart from this predictability, an attack becomes much harder and potentially less effective."

You can wake me up when you're running a fully containerized and microsegmented environment with complete data-path inspection, automated baselining, baseline-deviation sensing, and automated incident response that includes, at the very least, auto-quarantining.

Unless and until you manage to get your security solutions to at least the above level, you have no place disparaging standard security procedures. If you understood today's IT security and were able to implement it, you'd understand the huge gap between today's best practices and the poor bastards cowering behind an edge router like it was 1993.

Maybe if Microsoft put in a "stop fucking spying on me, you lousy git" button, people would be more inclined to adopt it. At least with Windows 7 I can murder the bloody call home with a hosts file. In Windows 10, they hard-coded the bastard in so that even this doesn't work.
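For the curious, the hosts-file trick amounts to a few lines like these. The domains below are illustrative examples of commonly cited Windows call-home endpoints, not a complete or verified blocklist:

```
# hosts file (C:\Windows\System32\drivers\etc\hosts on Windows,
# /etc/hosts elsewhere). Point call-home domains at a null address
# so they never resolve. Domain list is illustrative, not exhaustive.
0.0.0.0 vortex.data.microsoft.com
0.0.0.0 vortex-win.data.microsoft.com
0.0.0.0 telemetry.microsoft.com
0.0.0.0 settings-win.data.microsoft.com
```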

Re: Great /sarcasm

It's Red Hat. They'll give the whole thing to Poettering, and he'll build it into systemd. It will end up riddled with bugs, with security as an afterthought's afterthought. When the first logoed vulnerability emerges, he'll blame end users. Nobody will be able to fix anything because all the logs will be binary, and corrupted by the attacks.

Red Hat will snicker and say "who else are you going to buy from?" Three more RHEL derivatives will emerge the next day. Oracle will pretend it didn't hear anything about any of this.

Re: Does This Affect AMD Epyc CPUs

Why do you assume CPUs have to be "safe" for people to buy them? Do you honestly think that Spectre has slowed down CPU purchases? Do you think anyone but the handful of nerds that haunt these forums and some security nerds that read the Grugq a lot actually care about any of this?

Make no mistake, Intel is still the seller of server CPUs. They'll keep on being the seller of server CPUs through the lawsuits, and they'll emerge from this as the seller of server CPUs.

I'm as much of a fan of the underdog - and hence AMD - as anyone, but let's be realistic. Intel has an iron-fisted monopoly, and unless someone goes in there with the Almighty Axe Of Anti-Trust and cleans them out, they'll continue being a monopoly for at least the next decade.

You know it. I know it. Everyone except a few deluded die-hards knows it. So let's not pretend about this, shall we?

The people who buy things don't give a fnord about "security", and they never have. If they did, the Internet of Things wouldn't be such a gor'ram security dumpster fire. Nobody would ever buy Cisco or Supermicro again, and the list goes on and on and on.

But you know what else? It's not their asses that end up in front of the judge. It's us. The hoi polloi at the coal face. Nobody sends suits to jail. They make some poor bastard working in ops the lightning rod and ruin his life, and the lives of his family instead.

So yeah, Intel's dominance isn't going anywhere. Nobody's going to do a bloody thing about it. We're going to be responsible if/when it all goes horribly wrong, and we should know about this ahead of time so that we can take precautions and/or run the hell away in terror. (Depending on how you view risk.)

For suits, the only risk they care about is "will this cost me some of my bonuses?" For nerds, the risk we need to worry about is "will this land me in front of a judge?" I leave it as an exercise for the reader to work out how likely (or not) they feel this is to affect their chances of negative consequences.

But we should all have eyes open here and understand that this issue isn't going away, and that we, as the plebeians, have no choice but to deal with it.

Re: Does This Affect AMD Epyc CPUs

AMD and some ARM chips are affected by Spectre. But let me be 100% clear on this: AMD is completely irrelevant to this discussion.

AMD chips might power a smallish percentage of endpoints, but they power almost none of the existing fleet of deployed servers. Even if, for some reason, we all decided to buy AMD tomorrow, AMD couldn't deliver. At full ramp, AMD would struggle mightily to put out enough silicon to cover 10% of planetary server capacity during the next refresh cycle, and there is zero indication that demand exists for them to invest in that many wafers.

I'm sorry, but in the real world, AMD just isn't part of any discussion about server chips. There's only one player in that market, and they'll be the one everyone buys replacement chips from.

Re: "but we will always have at least the original black box's capabilities."

Yes, I'm sure. I'm sure because "serverless" already doesn't mean "Amazon". Serverless has a lot of ardent followers, and they're building open source solutions that do what Amazon's Lambda does.

Similarly, machine learning, AI, BI and other BDCA tools are seeing a lot of open source growth. In short: a commercial entity (like Amazon) might come up with the initial concept and get to milk it for a while, but all technology eventually ends up democratized.

The basic approach of serverless - write simple scripts (or have a UI/digital assistant write them for you), and have those scripts simply pass data from one black box to another - isn't going away. Humans don't typically uninvent things, especially in IT.

The increasing adoption of digital assistants also makes this seem like a permanent thing to me. The black boxes used by serverless types are really no different than the "skills" one can build for Alexa. Indeed, many of those skills are nothing but serverless scripts that call black boxes, and I've already seen Alexa used to create new serverless apps, which could be published as Alexa skills...

The whole thing has already reached critical mass and started a cascade. While a bunch of whingy nerds who can't disconnect "application development" from "enterprise apps" might not get the importance of an ever-increasing library of digital capabilities that can be used by any Tom, Dick or Harry, non-nerds seem to get the importance really quickly.

So we, IT nerds used to the way things were, might not see the utility, or ever get around to using serverless to do our jobs. Our kids, however, will use serverless-style tech to do all sorts of stuff. Using - and trusting - those black boxes will be as natural to them as smartphones are to my generation, or staring blankly at VCRs flashing 12:00 was to my parents' generation.

So sure, in the short term I expect some of these black boxes to disappear. That will lead to a backlash, to standardization, to the development of "skill libraries" and all of the predictable evolution of responses to this problem. Corporate greed is eventually overcome in tech, even if it takes a decade or so for us to get our shit together.

It is very early days for this technology yet, but the basic approach is sound. And those who have used serverless in anger tend to become adherents pretty quickly. Even the disenfranchised and cynical nerds.

Re: I worry the author is bluring Capabilites and Serverless Environments

@Craigh yes, but you are doing enterprise work. Application development for commercial purposes. What you - and all the rest of the angry nerd mob of commentards - are missing is the part in my article where I very specifically said "It is the commoditisation of retail and consumer application development".

Serverless isn't going to replace traditional development for organizations looking to build their own middleware anytime soon. VMware isn't going to make their next vSphere UI using serverless. That's not where the revolution comes in. The benefit of serverless is not "helping highly trained nerds do what they already do better", despite the inability of the commentariat to conceptualize anything different.

Serverless is going to let people who are not nerds create applications that solve problems that nerds and commercial entities are not normally interested in.

For example: let's say that I want to grow cannabis plants at home. (In a few months we can legally grow 4 plants per household here.) Cannabis is a particularly persnickety beast to grow. Much more so than the bell pepper plants that I have all over my house.

With serverless, I could take data from my house - images of my plants, or perhaps sensors similar to these - and feed that data into a black box up in the cloud. I could set the thing up so that if A happens, the plants get watered, if B happens, a fan turns on and if C happens, it sends me an alert.

Traditionally, if I wanted something like this I would have a few options:

1) Build something myself involving an Arduino (lots of work)

2) See if a vendor has already built a pre-canned solution, probably involving their own sensor ($$$)

3) Hire a human ($$$$$$$$$$$$$$$$$$$$$)

With serverless, however, I just need someone to write a "black box" that can accept some form of data I can provide (images, sensor data, etc) and spit out simple data about the plant in question. That black box does not, to my knowledge, exist today...but I am sure it will soon. (I could even train my own black box using machine learning, but that's another discussion.)
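A minimal sketch of the sort of glue script this amounts to - plain Python rather than any particular provider's serverless dialect, and every name here (the event fields, the thresholds, the water/fan/alert actions) is a hypothetical stand-in:

```python
# Hypothetical serverless-style handler. All names here (the event fields,
# thresholds and actions) are illustrative stand-ins, not any provider's API.

def handle_plant_event(event):
    """Route one sensor reading to zero or more actions."""
    actions = []
    if event["soil_moisture"] < 0.30:    # condition A: soil too dry -> water
        actions.append("water")
    if event["temperature_c"] > 28:      # condition B: too hot -> fan on
        actions.append("fan")
    if event["leaf_wilt_score"] > 0.7:   # condition C: plant in trouble -> alert
        actions.append("alert")
    return actions

# One (made-up) reading from the grow room:
reading = {"soil_moisture": 0.25, "temperature_c": 30, "leaf_wilt_score": 0.1}
print(handle_plant_event(reading))  # ['water', 'fan']
```

The entire "application" is the handful of if-statements; everything hard (the sensors, the image analysis, the actuators) lives in somebody else's black boxes.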

Essentially, serverless is a scripting platform allowing access to an ever-increasing number of "skills" or "capabilities", in a marketplace-like format. A platform I don't envision being used to replace super-niche industry development, but one I envision being used by civilians to automate and enhance their daily lives.

Application development is already a thing that is out there enhancing businesses that can afford qualified nerds. Now it is going to start being something we can use in our day-to-day lives to collect, modify and act on the data around us.

Re: how much will it cost

The individual applications won't last long, it's true. But the black boxes themselves can. Oh, they'll change and evolve as they're maintained, but as a general rule, once invented, they won't be uninvented.

If someone creates a black box that does facial recognition, we aren't going to find ourselves 50 years from now without a black box that does facial recognition. We may find that someone has created better facial recognition systems, but we will always have at least the original black box's capabilities.

The digital skills being created accumulate until, one day, you'll be able to say "computer, watch the roses in the garden and water them if they start to show wilt", and it will happen.

That phrase spoken to the computer is a script. It told the computer to gather a dataset consisting of images of the roses in the garden. It told the computer to send those images through a black box that determines if there is wilt. If there is wilt, it will trigger the garden's watering system. That, right there, is serverless. The only difference is the interface used to create the serverless script.
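The spoken sentence, written out as the script it amounts to - every function here is a hypothetical black box, and none of these names belong to a real service:

```python
# Each function passed in stands in for a black box: a camera feed, a
# wilt classifier, a sprinkler controller. The script is only the wiring.

def check_roses(get_images, detect_wilt, water_garden):
    """Gather images, run them through a wilt detector, act on the result."""
    images = get_images("garden/roses")            # black box: camera feed
    if any(detect_wilt(img) for img in images):    # black box: wilt classifier
        water_garden()                             # black box: sprinkler system
        return "watered"
    return "ok"

# Wiring it up with stand-in black boxes:
log = []
result = check_roses(
    get_images=lambda path: ["img1", "img2"],
    detect_wilt=lambda img: img == "img2",   # pretend img2 shows wilt
    water_garden=lambda: log.append("sprinklers on"),
)
print(result)  # watered
```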

The end user doesn't care if the black box that checks images of roses for wilt is the same today as it was yesterday. They only care that it works. All that matters to them is that a black box that performs that action exists, and that it doesn't stop existing.

Re: "Codeless" maybe?

There's a whole industry full of supposedly trained experts with loads of experience and education, and all our information is ending up on haveibeenpwned anyway. You'll excuse me if I don't feel that people doing the application development equivalent of drawing lines between black boxes that are designed and maintained by large multinational public cloud providers could possibly do any worse.

Hell, maybe those public cloud providers will actually hire enough security people that the stuff they offer is secure by default. It would be a nice change from the shitpocalypses created by the "I know better" or furiously cheapskate crowds.

Re: Serverless is a stupid name

Ease of use is the difference. Yes, serverless is basically "scripting for noobs". But the "for noobs" part is really critical.

Yes, really smart, well-paid, highly experienced nerds can write a bunch of gobbledygook scripts, or write applications in proper development languages that call libraries or other application components. That's hard. And it requires expertise. And if you are incorporating some black box (application, library or other component) into your solution, you first have to find it, download it, install it, make sure it's in the right place to be called, etc.

Almost all of that goes away with serverless. You write what amounts to a script, but you have access to this (constantly growing) library of tools and features supplied by the cloud provider. Making use of those tools and features is generally much easier than a bash script, and you don't have to manage or maintain the underlying infrastructure in any way.

When you take something that requires significant expertise and make it something that nearly everyone can use, that's a pretty big deal.

I'm sure that there were a bunch of nerds who knew how to build and maintain their own electrical generators who also said that national grids were a stupid idea. If they can run their own generation capacity, and wire up their homes/businesses exactly according to their own specifications, why can't everyone? Standardization means everyone isn't using the optimal plug/wire gauge/whatever for the job! There might be inefficiency! A national grid won't be revolutionary or change everything, because people can use electricity now!

Re: I worry the author is bluring Capabilites and Serverless Environments

@Craigh You are correct: there is no reason for various cloud features (such as BDCA tools) to be tied to serverless. On the other hand, serverless doesn't offer you much of any real use except the ability to push data into and pull it out of various cloud features.

The point here is that serverless basically allows one to lash together some pretty complex applications about as easily as one could write a (reasonably simple) batch script.

Real developers will build the various black boxes. Trained IT practitioners will be able to use those black boxes either through serverless or other cloud solutions. But for the masses, serverless will be their gateway to application development.

Right now, today, in order to build serverless applications you need to actually type a few lines of code. But the code you need to write in order to pipe data from A to B to C and back out again is simple. It's the sort of thing that you could build into a drag-and-drop GUI and have four-year-olds use.
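The "pipe data from A to B to C" glue, reduced to its essence - the three boxes are hypothetical stand-ins for provider-supplied services, and the point is how little code the wiring itself takes:

```python
# The whole "application" is one line of function composition; each box
# stands in for some cloud-provided black box.

def pipeline(data, box_a, box_b, box_c):
    return box_c(box_b(box_a(data)))

# e.g. enrich a number, scale it, then format it for output:
out = pipeline(2, lambda x: x + 1, lambda x: x * 10, str)
print(out)  # 30
```

Anything this mechanical is trivially expressible as boxes and arrows in a GUI, which is the point.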

Nerds think about serverless from the viewpoint of "how does this help me do what I do now?" Non-nerds don't do any of this now, and when shown how easy serverless is, say "wow, look at all the neat things I can do that I couldn't do before".

So serverless - in the form of AWS Lambda - is itself pretty useless. But it is the gateway to all the other things that Amazon's packed into AWS. It isn't the only way to use all of AWS's goodies, but it absolutely is the means by which individuals who aren't specialists, trained practitioners or experienced cloud architects will set about making tools for their own needs.

"Given the general lack of flat white walls in the general area when out camping in the woods/fields/wilderness, won't the average millennial fairly-well-off youngster be a bit pissed off lugging an 8ft projector screen around with them?"

I'd just bring a white blanket and hang the thing off the side of the car, but hey, that's me. "Flat" is a luxury. You're camping, eh?

Depends on whom you ask. I think of camping more as "getting away from people and all the noise of a city". If I could, I'd live on an acreage surrounded by trees all the time. I'm too poor. So I go camping.

Camping with a projector to watch a movie while I poke the campfire? Sign me up.

Re: Confused

They're less "technically challenging" than they are "a damned lot of tedious, boring, thankless work". I.e., a miserable pig.

You have to go through a lot of data before you even get to a candidate. Long after the fun work's been done, and the challenges of building the 'scope are long past, there's just crunching image after image. If you're really lucky an algorithm can help some. Even then, that's an awful lot of faint smudges and maths.

Re: Confused

G09 83808 isn't the galaxy spotted earliest in the universe's history; however, it is spotted very early on, and we've had a chance to get a slightly better look at it than we could with previous instruments. Any and all objects we can spot which are from about 1B years after the universe formed or earlier will get press, and rightly so. They're a miserable pig to find, and they can tell us a great deal about the early universe.

While the benefits of current robot-assisted surgery are somewhat questionable, this doesn't mean surgery robots are useless...or at least that they will remain so. Consider, for example, this prototype.

This prototype robot uses machine vision - amongst other modern techniques - to create a (mostly) autonomous surgery bot that is actually better than humans. Yes, it has some bugs, but it's an early prototype. I expect to see some rapid enhancement in this area.

Re: "the Arrakis Sandworms were actually quite delicate"

No they weren't. Drop a sandworm in water, watch it shatter into sandtrout. In relatively short order they would sequester all your planet's water beneath the surface and convert the planet into a desert filled with sandworms.

Re: Female CEO

"people like them give women a bad name"

So a handful of female CEOs that screwed up "give women a bad name", but thousands of years and millions of male leaders of companies/governments/transcontinental empires etc. don't give men a bad name?

Look, I question the validity of affirmative action as much as the next guy, but dude, that's some completely unsupportable sexist bullshit right there. What the fuck.

@Haefen

Speaking as a Canadian - and one who apparently has quite a bit more knowledge of what you're babbling about than you do - I would like to, in the kindest, politest possible way convey my feelings about your inane babble:

Re: Can you please....

No, we can't. And the reason is that politics, privacy and security are tightly wound up into an inextricable morass.

Regardless of your politics, and what side you take in the multi-tentacled debate, politics affects the balance chosen between security and privacy - or whether the needs/desires of individuals (as opposed to corporations and/or states) should be considered at all.

So put on your big boy pants and welcome to the real world. Politics is everywhere.

Thank you kindly for choosing to engage directly with the community. It is nice to see anyone from any vendor taking the time.

I do have one small question, however: does the rest of Microsoft know you're doing this? You're harming their ruthlessly customer-hostile image. (Well, not a lot, as the Windows team exudes so much animosity towards literally everyone that it's hard to overcome...but you are denting the evil overlord image a little...)

Re: Gamers?

I am told by some of the hardcore gamers in my sphere that if you want to do VR at 240Hz then having 8+ cores @ 2.8GHz or better is usually required. As I'm poor, and still working on a video card from 3 years ago and a Sandy Bridge-era CPU, I cannot confirm this.

Apparently VR is a thing that some people do. I don't understand. Why do you need VR to play Scorched Earth?

Re: I hate Agile as well

Yeah, I wasn't going to get into pricing with these sorts. Open source stuff like Ceph or LizardFS can handle the HCI storage layer, with OpenNebula and many others providing great management UIs. Then we go up through the various smaller contenders like Maxta, Yottabyte and Nodeweaver to the midsized ones like Hypergrid or Scale, to the big heavies like SimpliVity, VMware or Nutanix.

The price range varies wildly, and even Nutanix have entry-level gear that isn't that badly priced. HCI isn't expensive. It certainly isn't as expensive as ancient three-tier architecture. That said...

"It is difficult to get a man to understand something, when his salary depends upon his not understanding it!"