Posted by Soulskill on Tuesday December 10, 2013 @04:05PM
from the your-corporate-laptop-is-being-replaced-with-an-abacus dept.

mask.of.sanity writes "Google has revealed details on its Beyond Corp project to scrap the notion of a corporate network and move to a zero-trust model. The company perhaps unsurprisingly considers the traditional notion of perimeter defense and its respective gadgetry as a dead duck, and has moved to authenticate and authorize its 42,000 staff so they can access Google HQ from anywhere (video). Google also revealed it was perhaps the biggest Apple shop in the world, with 43,000 devices deployed and staff only allowed to use Windows with a supporting business case."

What they're saying is that the idea of border security is a bad model. One compromised system on the inside and you're pretty much done. IDS and DPI are good ideas, but they aren't effective enough. Breaking into any corporate network is as easy as spamming its users with social-engineering-laden email. Get them to click on a link and you own their soft, squishy, zero-day-vulnerable desktops. Keylog and steal their credentials and you've got a jumping-off point to worm into the rest of their network. It's that easy.

What they're saying is, once you move to a trust-nothing model, why bother investing in a huge corp network when you can't trust it anyway? When you don't have a big corp network, what are the advantages of running your own services over purchasing them from someone else? Like Google?

If you had secure operating systems, and encrypted data flows, and weren't listening on a bazillion ports, it would be just as easy to secure the network by securing individual computers as it would to secure the perimeter.
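The "not listening on a bazillion ports" part is something you can actually check from the machine itself. A minimal sketch (a rough stand-in for tools like `netstat` or `ss`) that audits which local TCP ports accept connections:

```python
import socket

def open_ports(host="127.0.0.1", ports=range(1, 1025), timeout=0.2):
    """Return the subset of `ports` accepting TCP connections on `host`."""
    found = []
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(timeout)
            if s.connect_ex((host, port)) == 0:  # 0 means the connect succeeded
                found.append(port)
    return found

if __name__ == "__main__":
    # Every port listed here is attack surface you have to justify.
    print("listening:", open_ports())
```

A real audit would also cover UDP and non-loopback interfaces; this only illustrates the principle that each machine can enumerate (and then shrink) its own exposure.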

The problem is that security is a bolted-on afterthought for some operating systems (Windows), printers, storage devices, and software applications. If we could get past that, we could stop building walls.

As the senior admin for such an outsourced network, I can tell you what will happen about 2 to 3 years after you migrate to an outsourced service like this.

"We're deprecating the ODBC connection as of January 1... no worries, we've got a great new API and it accepts SQL!"
"To reduce system load and improve overall performance of your system, we're limiting SQL requests to 100k rows."
"To enhance SQL efficiency, we've written our own proprietary query language called FU-SQL. It's fantastic!"
"We're aware that some of our customers are not happy with the speed of FU-SQL, so we've limited the number of joins you can make in a select statement to 1."
"To reduce costs for our customers, we now bill our FU-SQL module separately. If you don't use it, you don't have to pay for it! If you would like the unneeded additional FU-SQL feature, it will bill for $150k/year."
"Due to lack of interest, FU-SQL has been discontinued. If you need mass access to your data, please contact our professional services."

At this point they start doubling the price of their service every time you sign a new contract. Then your boss will ask you why your quote for migrating the network somewhere else was "A Metric Shitton of money"
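For what it's worth, the standard workaround for a per-request row cap like the fictional 100k limit above is keyset pagination: re-issue the query ordered by a unique key, resuming after the last key seen. A sketch against SQLite (the table and column names are invented for illustration):

```python
import sqlite3

LIMIT = 100_000  # the provider's hypothetical per-request row cap

def fetch_all(conn, table="orders", key="id"):
    """Pull an entire table through a row-capped interface, one page at a time."""
    last_key, rows = None, []
    while True:
        if last_key is None:
            page = conn.execute(
                f"SELECT * FROM {table} ORDER BY {key} LIMIT ?", (LIMIT,)
            ).fetchall()
        else:
            page = conn.execute(
                f"SELECT * FROM {table} WHERE {key} > ? ORDER BY {key} LIMIT ?",
                (last_key, LIMIT),
            ).fetchall()
        if not page:
            return rows
        rows.extend(page)
        last_key = page[-1][0]  # assumes `key` is the first column of the result
```

Of course, this only works until the vendor deprecates `ORDER BY` too.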

I'm no expert in the field, but my understanding is that there are several models of network security based on real-world notions of security.

VPN is a part of your traditional wall security, where your typical authentication and authorization happens at each level of security zone. Once you're in, you can do anything the zone permits you to do. VPN is, as stated by others, placed at the perimeter.

BTW, full internal company-wide encryption just means putting the secure zones under a roof so no one flying overhead can see what's going on from above (e.g. big brother).

Another model of security relies on negative feedback. There are no locks anywhere, and no one has keys, but missteps have consequences. That's the security model most modern governments employ against their citizens. The levels of surveillance, strictness of the deeds, and harshness of the punishment determine the repressiveness of the model. The level of security is proportional to the amount of monitoring (a place like prison being maximum security).

There are other models, I'm certain, but like I said, I'm no expert. These are the two most prevalent ones out there right now.

Zero trust is completely different. It's almost like a double-blind experiment. There's no trust anywhere: not the users/developers, not the administrators, not the auditors, not anyone. Authentication is fundamentally a trust-building mechanism, and a zero-trust model means authentication is obsolete (remember, encryption is merely erecting a roof over everything). Anyone can get in and do all the same things. The only difference is in the domain knowledge of the actors, which differentiates those able to do more from those able to do less, if anything at all.

A rather dirty analogy of zero trust would be hosting an open project on GitHub. Anyone can go in and make modifications, but only those who know the code could make modifications that do meaningful work. And then, of the people building the code and running it, only those who possess the ability to verify the modifications would know that they're not harmful for their specific use cases.

Another analogy of zero trust would be an open e-mail account. There are no guarantees that the sender is who the name says. Every e-mail is assumed to have been read by anyone capable of entering the system. (Changing or deleting e-mails can be universally prohibited.) Such an account would be mostly useful for communicating metadata, i.e. where and when to meet, and trivial matters.

I don't think Google's gone quite that far with their security model. They may have gotten rid of the VPN (or not...), but there are still SSH keys used for authentication and authorization, and users still need to log in to their machine to use it. After all, zero trust implies that even we the ultimate end users can't trust what's coming out of Google to be accurate (assuming that we could before--that's another debate for another time). And I don't think Google wants to make that impression.

It may be that they started with a zero-trust model, and identified the areas where trust is unnecessary, which they left insecure. At the same time, they also identified where trust is absolutely necessary, as well as the level of trust that's appropriate, and put up the necessary strength of walls to secure them, as well as levels of monitoring to see who's entering different zones. That sounds far more reasonable to me, especially considering the amount of trade and other secrets Google is holding onto.

"Why bother investing in a huge corp network when you can't trust it anyway?"

Redundancy in security.

And it's in hand. You can fix it, expand it, modernize it, control it, instead of shifting all that responsibility to some third party to which you are merely another customer.

Trusting nothing, with protection at the machine level, the user level, the application level, and the data level, will not do away with corporate networks. If anything, it may have the opposite effect, and encourage more use of such wholly-owned networks, perhaps melded with some cloud services.

But the sooner we move away from the Maginot Line mentality for our networks, the better.

It may seem counterintuitive in the physical world, but a point-defense system is easier to implement in computer networks than in the real world. Each computer should protect itself. Build this in from the beginning and it just happens naturally: each computer, each file, each application. Because relying on the stockade to keep out the attackers hasn't actually worked that well in the physical world, and it costs a boatload of money and expertise in the network world.

I wouldn't stick an employee with a slow half-top and expect them to be productive.

In my experience, a lot of companies buy whatever they can get a bulk price on and which someone in purchasing deems "good enough".

Resulting in employees with slow machines on which they're expected to be productive.

Hell, at an old job they bought a crap-load of new Dell boxes, and the native aspect ratio of the monitor was a non-standard thing in which a circle was drawn as an oval, because the monitor was optimized for watching movies at 720p but not for actually being a monitor (its native aspect ratio meant oblong pixels). Oh, and the machines came with 4GB of RAM, the OS they came with could only see 3GB of it, and it wasn't possible to install a newer OS because there were no drivers available.

In short, never underestimate how crappy of a machine companies will buy for their employees if it saves them a few bucks. Because many of them do it all the time.

Eh, my company spent more money on macs - but most places with ~35 employees have at least one "IT staff" guy and we never bothered with one - the savings more than made up for the "idiot tax." Besides, if you're even a few minutes more productive per week not dealing with an OS issue the nicer laptop pays for itself, and if the employees get a better experience that helps retention... there's a lot more to a good decision than just the number at the bottom of the credit card receipt.

Why use so many Apple computers when there's your own awesome Chromebook [google.com]?

Google employee here (but I don't speak for my employer and I am basing this purely on anecdotal observation, not hard data).

I'm only familiar with my impressions from the engineering side, so I don't know much about the sales and marketing side of things, but nearly all of the engineers use Linux desktops (unless they're developing client software, like Chrome). Laptops are a different story. As a Bay Area-wide phenomenon, software engineers sure like their Macbooks, and this place is no exception. A few of us run Linux laptops, but my impression is that Macbooks outnumber Linux laptops plus Chromebooks combined. But the internal hardware requisition site is now offering the Pixel (indeed, recommending it instead of Macbooks), so this should change with time.

There's also the matter of hardware refresh cycles. The Pixel is not even a year old yet, and it hasn't been available for requisitions for its entire lifespan, so a good number of employees haven't yet had the chance to switch even if they want to. (Returned working laptops are refurbished and reused, so turning over the inventory will take longer than you might expect.) Also, lack of VPN or native SSH impeded the Chromebook's internal usefulness in the early days, but today hardly anything still requires VPN (it works now regardless) and the Secure Shell [google.com] app is pretty workable (set it "Open as Window" so that ^W goes to the terminal). And... well, the early Chromebooks had anemic hardware specs, which is not true of the Pixel.

Care to share the distro of choice on those Linux-based non-Chromebook machines? Is it a free employee option? Are there a set number of pre-approved distros? Is there a top-secret Google GNU/Linux distro that dispenses chocolates on the half hour?

This. Content consumption =/= content creation. Sadly, the nuisance is missed to many in this supposedly nerd realm that slashdot is supposed to be.

First of all, it's 'nuance'. (Though an argument could be made for 'nuisance' too.)

But, the reality is, the overwhelming majority of non-nerds using the interwebs are purely doing content consumption, and that's all they ever will do. And, even as a nerd, a huge fraction of what I do outside of work is perfectly fine on a tablet.

How would this work with companies less profitable than Google?
Macs are expensive.
Most people don't own Macs personally.
Lots of people use personal computers to VPN to work.
How would it work with the files on file servers people use to get work done, like MS Access databases?

Both of my daughters have work-issued Macs. One is in education and the other at a tech company. When you look at the cost of a computer compared to the salary (and benefits) for an employee over the life of the computer, the cost of even an "expensive" computer is a small rounding error. In addition, the cost of protecting and cleaning up Windows computers is non-trivial, and the cost of a data breach can be enormous. This is not just a VPN; it is a VPN from a known, verified, secure computer. MS Access... what a joke.

I agree the cost of the computer is effectively a rounding error, but there are non-trivial costs in Windows' favour too, relating to compatibility.

It is getting a lot better with the rising popularity of Android / iOS, meaning fewer companies target a single platform, but I still find that when I take just my Mac, I often have trouble doing some small thing.

You have this a little wrong. The cost of the computers is trivial in comparison to other things. What you are seeing is that the bean counters are focusing on reducing one specific cost (computer hardware) without taking other costs into consideration (employee productivity). Undoubtedly this is a case of “penny wise, pound foolish”, and it is probably because no one can put the other costs into a spreadsheet, so the one number that is easy to define wins.

> Both of my daughters have work issued Macs. One is in education and the other a tech company. When you look at the cost of a computer compared to the salary (and benefits) for an employee over the life of the computer, the cost of even an "expensive" computer is a small rounding error.

And yet I have not heard of a company doling out computers with SSDs in them. A department full of people and I waste 15 minutes every morning waiting for laptops to boot up. We did the math, and the dollar amount of the lost productivity was staggering.
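The back-of-the-envelope math is easy to redo with your own numbers (the figures below are hypothetical, not the poster's actual department):

```python
# Hypothetical figures -- plug in your own department's numbers.
people = 20           # employees waiting on slow laptops
minutes_lost = 15     # per person, per day
hourly_cost = 60.0    # fully-loaded cost per employee-hour, USD
workdays = 230        # working days per year

annual_loss = people * (minutes_lost / 60) * hourly_cost * workdays
print(f"${annual_loss:,.0f} per year")  # prints "$69,000 per year"
```

Against that, an SSD upgrade at a few hundred dollars per machine pays for itself in weeks.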

You have computers which take 15 minutes to boot up? Every laptop I have owned for the past 10 years goes to sleep at night and takes about 10 seconds to wake up in the morning. I think you're doing something wrong.

It's best to just stay away from Windows programs. If you think you need Windows programs, or you work in a company that thinks it needs Windows programs, I feel sorry for you for working among the clueless zombies. Nobody needs Windows.

How exactly is your example much different from any other laptop? The Dell laptops here (about half Mac, half Dell) have docking stations, and adapters needed for using HDMI in conference rooms, just like the Macs do... your additional-cost example really makes no sense.

It doesn't just move it; it makes it wider. More connecting infrastructure that could be outside their control, more points where a fake certificate could be used to gain access, more ways to do MITM attacks or just inspect traffic. On internal networks you must check traffic; the ultimate vulnerability is always the user, and carrying in a trojan without knowing should be common enough (and if not, taking advantage of a 0-day exploit in Acrobat or Flash definitely is), but the physical location provi...

Fortunately, this seems unlikely to affect you unless you already have access to Google's corporate network. TFA is about Google redesigning its own network, not (as I feared) to start providing some kind of cloud-based service to other corporations. The headline is misleading, perhaps intentionally so.

The RJ45 jacks in the office are just plain old dirty connections to the Inet. We each have multiple OpenVPN connections on our localhost giving us access to different parts of the network depending on our roles. It's convenient because our workstations work identically wherever we are (home, work, coffee shop), and it's convenient when someone leaves because operations just invalidates the VPN certs and the former employee is cut off no matter where they physically are. A side effect is that whenever your VPN credentials don't work, you're left wondering if you're about to get fired and ops just jumped the gun, haha.

Interestingly, the company I work for is also like that. In our office, the "network" is just a regular consumer-grade router (plus an expensive Cisco AP). But we don't use VPNs (VPNs suck); all of our services are Internet-accessible and protected independently. So web stuff is SSL + HTTP authentication, email is IMAP, calendar is CalDAV, source code is SSH+git, etc. We have an internal SIP service (but that's also Internet-connected).
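When every service is exposed independently like this, each endpoint's TLS configuration becomes its own perimeter, so it's worth continuously verifying that every endpoint still presents a valid certificate. A small sketch of such a check (the hostnames are placeholders, not the poster's actual services):

```python
import socket
import ssl

def check_tls(host, port=443, timeout=5):
    """Return the certificate's notAfter date if the TLS handshake verifies."""
    ctx = ssl.create_default_context()  # verifies chain and hostname by default
    with socket.create_connection((host, port), timeout=timeout) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            return tls.getpeercert()["notAfter"]

# Placeholder hostnames standing in for the independently exposed services.
for host in ("mail.example.com", "git.example.com"):
    try:
        print(host, "cert expires", check_tls(host))
    except (ssl.SSLError, OSError) as e:
        print(host, "FAILED:", e)
```

Wiring this into a cron job and alerting on failures is the obvious next step; expiry of one cert only takes down one service, which is part of the appeal of this design.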

Also, look at how large open source projects operate, Mozilla, Debian, Gentoo, GNOME, KDE, LibreOffice, etc. They're all a bit like big companies, but without a VPN, where everything is Internet accessible.

We don't use any internal application that's not web-based, does anyone else do that?

I don't know about OpenVPN, but Cisco AnyConnect, for example, is pretty flexible for this kind of stuff. It uses IKEv2+IPsec if possible, then scales down to DTLS, and finally just HTTPS (even through a proxy if necessary), and as such can pretty much punch through any firewall. In addition, you get endpoint assessment, so you can, for example, enforce that updates and such are installed on the employee's device (whatever that might be).

Endpoint assessment is a stupid idea: a malicious (i.e. owned) client can easily lie to the server, while a legitimate user wanting to use a configuration not thought of by the sysadmin gets screwed. Also, having to use a proprietary client is terrible; you end up being unable to update your OS because doing so can break the third-party VPN client, or finding yourself with extremely restricted choices as to what OS you can use.

> Please tell us more about your setup.
We're a Java office in TX with a remote call center in OR and a handful of remote employees ( Chicago ).

> What type of work does the company and you do?
I'm the director of development; we're a J2EE web application development shop with special expertise in Oracle.

> Approximately how many users work like this?
All of us ~30

> Does this company operate primarily as a standard physical office environment, or is this a distributed(work from home) startup?
A couple of my developers work from home 3 days a week and most of ops ( the network guys ) work from wherever and, apparently, whenever they want. They're pretty hot shit, published authors, speakers at LISA, etc so they're left alone most of the time.

> Where are the servers, on-site, datacenter, cloud?
We keep our staging and UAT servers on site and colo for production + another colo for failover

> Approximately how many servers?
I have no idea, I know we have some serious SAN gear for the databases. We probably have around 50 virtual servers in our testing setup and maybe 20-25 production server clusters with an average of 3 nodes each. Some physical some virtual.

> What type of applications are used, web, small applications like QB, MS Exchange or SQL systems?
Web applications; we develop/maintain some very large rewards and loyalty programs for the big banks. The RDBMS is Oracle, email and IM are handled through Zimbra, and project management is handled with self-hosted Atlassian Jira.

> What are the negative aspects of this system?
The only problem I've ever faced is the VPN endpoints not staying connected. VPN connectivity becomes mission-critical because without it no work can get done. I don't know what they're using for the VPN server; I know ops is a big fan of OpenBSD, so it wouldn't surprise me if that's what they're using.

One other thing: I work on an iMac and use Tunnelblick to manage the VPN connections. I've had zero issues on a wired connection, but sometimes have issues using wifi; the VPN connections will drop and then reconnect after a minute or two. There must be something weird in the office, because when I take my Mac home I have zero issues on wifi.

From a security perspective, Google is right that the notion of your internal corporate network being "safe" is dead. Between all the laptops, tablets, smartphones, and very portable USB devices, there really isn't a secure perimeter on your network. Security needs to be applied at each entry point to the network, whether that is wired (internal or external doesn't matter), wireless, or virtual.

The summary implied that the need for security devices goes away once you give up the idea of a perimeter, but that isn't the case at all. The form that security comes in may change, but you still need it. Authenticated users connecting via secure tunnels doesn't eliminate the risk of malware, so you still need IPS and anti-malware devices (Fidelis, FireEye, etc.) to protect company assets from valid, authenticated users.

If you can't trust any of the devices on your network, then you need to inspect 100% of the traffic entering the network.

Not just at every entry point: security should be a serious consideration on every device. Work on the assumption that everything is directly exposed to the internet and start from there. Trying to only monitor the entry points is the problem; if anything makes it past your entry points, it could have free rein over everything inside.

In their whole talk they assumed the users of the services know what they are doing and how to behave. I'm sure that in Google's case all their workers are well trained, but I sure as hell couldn't allow VPN connections to our CRM database. Who knows what workers install on their laptops once they leave the office.

Ubuntu precise comes with gcc 4.6 and glibc* 2.15. So does Debian wheezy. So unless Google is really, really slow at updating their internal variant of Ubuntu to the latest LTS release, this shouldn't be an issue now (though it may have been an issue in the past and driven people to migrate).

* Strictly eglibc but afaict the difference is just in details surrounding some ports

I wouldn't necessarily consider the choice of newer than 4.6 gcc arbitrary. Later versions with better C++11 support are very attractive, because C++11 simplifies and standardizes a lot of things that can make development significantly easier.

I may be wrong about this, but if your computer sends data to their meta-inventory system, all a hacker needs is to capture that data with some packet-capture software, replicate it, and use it to log in... won't that work?
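Whether that works depends on the protocol: a straight replay of captured traffic fails if each login has to sign a fresh server-chosen nonce, since the captured response only matches the old nonce. A minimal challenge-response sketch (an illustration of the general technique, not Google's actual scheme):

```python
import hashlib
import hmac
import os

SHARED_KEY = b"per-device secret"  # provisioned out of band, never sent on the wire

def server_challenge():
    """The server picks a fresh random nonce for every login attempt."""
    return os.urandom(16)

def client_response(key, nonce):
    """The client proves key possession by HMAC-signing the nonce."""
    return hmac.new(key, nonce, hashlib.sha256).hexdigest()

def server_verify(key, nonce, response):
    return hmac.compare_digest(client_response(key, nonce), response)

# A captured (nonce, response) pair is useless for the next session:
n1 = server_challenge()
r1 = client_response(SHARED_KEY, n1)
assert server_verify(SHARED_KEY, n1, r1)       # legitimate login succeeds
n2 = server_challenge()                        # new session, new nonce
assert not server_verify(SHARED_KEY, n2, r1)   # replayed response fails
```

So packet capture alone only gets an attacker in if the system naively accepts the same bytes twice, which any competent design avoids.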

They picked a company that stands behind its platform over a platform that has no clear owner. It has nothing to do w/ how 'real' the UNIX is, or the license (okay, that may be a factor) or whether the company itself makes an arguable alternative.

Why would Google buy Macs if they don't use OS X? They could use Linux on ANY cheaper computer they choose but bought Macs anyway.

I believe Google thinks like a lot of us: OS X for desktops, Linux for servers, a mix of iOS and Android for mobiles.

Because Apple makes good, attractive hardware? Besides, hardware cost is inconsequential compared to the cost of a developer; whether his laptop costs $1500 or $3000 doesn't matter. Our entire development team uses Macbooks - and of 12 users, only two of them run OSX. One of them is even geeky enough to paste a Tux logo over the light-up Apple logo.

Since they deploy on Linux servers, it makes sense to develop on Linux. Write-once run-anywhere still isn't a reality - obscure platform specific bugs can still come back to bite you.

Google lives in a fantasy world, where the WAN is as fast as the LAN. For me, both at home and in the workplace, you're talking about two and a half orders of magnitude difference. That's the whole reason all this cloud stuff, streaming (as opposed to download) video, etc all seems so bizarrely alien. You're talking about such a tremendous performance downgrade, that I just can't begin to really take it seriously.

I suppose the thinking is that they are planning for the future, when some day the WAN gets reasonably fast, where my home and business DSL line is replaced with fiber. Cool. Be ready, Google. But how are you going to spend those decades of waiting? Some cons are a little too long, IMHO.
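For what it's worth, the "two and a half orders of magnitude" figure checks out for a typical setup of the time, gigabit LAN against a few-megabit DSL line:

```python
import math

lan_mbit = 1000.0   # gigabit Ethernet
wan_mbit = 3.0      # typical ADSL downstream speed, circa 2013

ratio = lan_mbit / wan_mbit
print(f"{ratio:.0f}x, {math.log10(ratio):.1f} orders of magnitude")
# prints "333x, 2.5 orders of magnitude"
```

On fiber at 100 Mbit or more, the gap shrinks to a single order of magnitude, which is roughly the point at which remote services stop feeling "bizarrely alien".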

But how much data do you really need to send to your home computer?

I deal with multi-terabyte datasets every day, and can work just as effectively from home as I do from the office since my data lives on the server and I never need to bring it down to my computer. I rarely even compile code on my local computer anymore since it's so much faster to do builds on the 16-core 32GB servers than on my little 4 core 8GB home computer (and even worse on the old 2core 4GB laptop).

Likewise, I don't have a Windows computer on my desk - I remote desktop to the Windows Terminal Server when I need to run a Windows app. As long as I'm not streaming video, it works just as well from home (~12mbit DSL) as it does from the office.

Then work in the office... Google's plan is to do away with a local corporate network, so that the network available in the office is just an internet connection and you connect over the internet to whatever services you require. If you are in the office then your connection will be just as fast, since the services you generally access are just as likely to be local as they were before. It's just that now, instead of being on a flat network with insecurely configured devices, you will connect to those devices

> Google lives in a fantasy world, where the WAN is as fast as the LAN. For me, both at home and in the workplace, you're talking about two and a half orders of magnitude difference. That's the whole reason all this cloud stuff, streaming (as opposed to download) video, etc all seems so bizarrely alien. You're talking about such a tremendous performance downgrade, that I just can't begin to really take it seriously.
>
> I suppose the thinking is that they are planning for the future, when some day the WAN gets reasonably fast, where my home and business DSL line is replaced with fiber. Cool. Be ready, Google. But how are you going to spend those decades of waiting? Some cons are a little too long, IMHO.

Some thoughts on this:

It may be fantasy for you and me, but Google actually lives in this world. When you can dabble in setting up gigabit city-wide networks as a freaking "experiment", it's reasonable to assume that bandwidth for remote connectivity isn't really an issue for you.

100kbit is more than enough to buy you a reasonably quick remote desktop session. If all your real work is being done in the datacenter across multiple redundant 10gbit links, then who the hell cares what the WAN connectivity is, as

You're kidding, right? Google - home of the cloud - is going to worry about local storage limits on drone machines. And...again...drone machines - onboard video is probably 4x as fast as they need it to be for nearly all conditions. They've rolled out fiber in an entire town; I'm going to guess that they've got a pretty speedy wireless system on campus.

Apple hardware is very limited if you're looking for a bargain and aren't on a corporate buying plan, if you're a hardcore gamer, if you're running massive analysis software, or if you're locked into industry software packages that are platform-locked. None of that is an issue for desk machines at Google.

I'm not, in any way, an Apple fan, but pretty much none of the problems you state are of any consequence to their usage profile.