Posted
by
Soulskill
on Friday January 25, 2013 @08:49AM
from the take-care-what-you-make-public dept.

mask.of.sanity writes "Github has killed its search function to safeguard users who were caught out storing keys and passwords in public repositories. 'Users found that quite a large number of users had added private keys to their repositories and then pushed the files up to GitHub. Searching on id_rsa, a file which contains the private key for SSH logins, returned over 600 results. Projects had live configuration files from cloud services such as Amazon Web Services and Azure with the encryption keys still included. Configuration and private key files are intended to be kept secret, since if one falls into the wrong hands, that person can impersonate the user (or at least, the user's machine) and easily connect to that remote machine.' Search links popped up throughout Twitter pointing to stored keys, including what was reportedly account credentials for the Google Chrome source code repository. The keys can still be found using search engines, so check your repos."

These kinds of repository hosts need to learn from this and not let folks do this sort of thing. It would be simple to use a regex to filter out the posting of these sorts of files. Maybe devs should even be charged a couple dollars to get a decent review of these things.

No. This is actually completely absurd. A developer that cannot grasp the concept that private keys have to be kept private, cannot be trusted to do anything but screw up the most basic security provisions when writing code.

They should get a kick in the ass, such as three months without any sort of commit privileges, and mandatory code review for a year. THAT should be enough to make it stick, and impress on them the real gravity of their failure. Otherwise, they will just chalk it up as "an annoyance caused by those uninteresting people who should learn to code before they go pestering code-gods".

Sysadmins should also know how to code. Nothing better than showing them their screwup and the solution to it.

Plus, since all sysadmins (real ones, anyway) are already competent in several scripting languages, it is not that hard a skill to add if all you need to do is be better than bottom-of-the-barrel programmers.

I dunno about that here. Ever since they rolled out Sophos Full Disk Encryption on every desktop and server here, it's contributed more to downtime than any virus/malware ever has. I think literally every person in this office has had to have their machine completely rebuilt after it got corrupted somehow, and that includes our testing servers as well.

All I can say is, thank god our production servers are out of our company's control. They haven't had any issues, but then again, they also don't have Sophos malware on them either.

No. This is actually completely absurd. A developer that cannot grasp the concept that private keys have to be kept private, cannot be trusted to do anything but screw up the most basic security provisions when writing code.

They grasp the concept just fine. It isn't that they don't understand, it's that they don't see it as their problem.

I don't think that would be easy to implement. In git you add and commit your changes to a local repository, then push them up to GitHub. Cutting them out server-side after the fact would give you really confusing errors. They would need to catch them locally, at commit time. The real question should be why those keys were placed inside the project directory at all, and not somewhere like ~/.ssh/

Could they not supply a .gitignore? Either way, it's simple enough to have a find script run before you make anything public. Basically, on every commit turn off public access, run your clean script, then turn it back on. If this causes errors, that still seems better than this.
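A "clean script" along those lines could be little more than a find plus a grep run before the repo goes public. A minimal sketch (the filename patterns and demo files are illustrative, not exhaustive):

```shell
#!/bin/sh
# Pre-publication "clean script" sketch: list files that look like
# private keys before a repo is made public. Patterns illustrative only.
set -e

scan_keys() {
    dir=${1:-.}
    # Filenames that conventionally hold SSH private keys
    find "$dir" -type f \( -name 'id_rsa' -o -name 'id_dsa' \
        -o -name 'id_ecdsa' -o -name 'id_ed25519' \)
    # PEM headers catch key material saved under innocent names
    grep -rl 'BEGIN.*PRIVATE KEY' "$dir" 2>/dev/null || true
}

# Demo on a throwaway directory
demo=$(mktemp -d)
echo 'fake key material' > "$demo/id_rsa"
printf -- '-----BEGIN RSA PRIVATE KEY-----\n' > "$demo/deploy.cfg"
echo 'hello' > "$demo/README"

suspects=$(scan_keys "$demo")
echo "$suspects"
```

Anything this prints would block the "turn public access back on" step. It obviously can't catch everything, but id_rsa and PEM headers would have caught the 600+ results from the summary.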

I admit at work we mostly use a combination of CVS and a 2x4 to hit developers with to avoid these issues while still having a nice simple repository.

I've seen several people comment that they have their home directory config files under version control. If you're using git for that, it's a fairly simple next step to then "backup" the repo to github. "It's only config files; nobody would be interested in those."

For new screwups the solution would be to just reject the push and let the developer sort it out.

For existing screwups it's not so easy. One of the characteristics of hash-based dvcs systems like git is that they make it REALLY painful to change history. You could generate new commits and blacklist the old ones but doing so would tip off all users of the repository that something was up and those users would still have their copies of the original commits.
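For the record, that painful history rewrite looks something like the following. This is a throwaway-repo sketch using git filter-branch; the file names are illustrative, and as noted above, anyone who already cloned still has the old commits, so the key has to be revoked regardless:

```shell
#!/bin/sh
# Sketch: purging a committed key from git history with filter-branch.
set -e
export FILTER_BRANCH_SQUELCH_WARNING=1

repo=$(mktemp -d)
cd "$repo"
git init -q
git config user.email demo@example.com
git config user.name demo

# Simulate the screwup: a key gets committed, then real work lands on top
echo 'fake-private-key' > id_rsa
git add id_rsa && git commit -qm 'oops: add key'
echo 'code' > app.sh
git add app.sh && git commit -qm 'real work'

# Rewrite every commit, dropping the file from each one's index
git filter-branch --index-filter \
    'git rm --cached --ignore-unmatch id_rsa' \
    --prune-empty -- --all

# The branch history no longer touches the file. Note filter-branch
# also leaves a backup under refs/original/ that must be deleted, and
# existing clones keep everything -- so revoke the key anyway.
git log --oneline -- id_rsa
```

Pushing the result requires --force, which is exactly the "tip off all users that something was up" part.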

This is why developers shouting "give me full access now" should always be denied - there is a totally different mindset between developers and admins (or DevOps) when it comes to protecting things like SSH keys.

Both groups have similar (or certainly overlapping) technical skill sets, but have very different motivations.

This happens in both privately and publicly developed projects. All too often the developers do not grasp the fundamentals of security. If you're lucky, they grasp 'enable encryption', but it's exceptionally rare for them to understand things like mutual authentication and appropriate key management, or even why a backdoor or fixed credential is very, very bad news. The 'answer' in many companies is to tack on a 'security expert' to audit the code and do some penetration testing. While this is certainly not a bad idea...

The first thing you learn is that your private SSH keys are sacrosanct. Most developers seem to just go through a howto on how to generate an SSH key and don't think about anything after that. They're probably all using node.js or something...


Followed by going through the git howto that tells them to: git init; git add .; git commit -m "Initial Commit"
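One extra line in that recipe would have prevented a lot of these leaks: add a .gitignore before the catch-all "git add .". A throwaway-repo sketch (the ignore patterns are illustrative):

```shell
#!/bin/sh
# The howto recipe, plus the .gitignore line the howtos leave out.
set -e
demo=$(mktemp -d)
cd "$demo"
git init -q
git config user.email demo@example.com
git config user.name demo

# A key lands in the project directory, as in the article
echo 'fake-private-key' > id_rsa

# Ignore key-shaped files *before* the catch-all add
printf 'id_rsa\n*.pem\n*.key\n' > .gitignore
git add .
git commit -qm "Initial Commit"

# The key never made it into the repository
git ls-files
```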

This is why key management should be part of the operating system, and every piece of software that doesn't use those APIs should be suspect.

It's simply too big a subject to expect everyone who is in danger of falling prey to something similar (everyone who uses a computer) to manage on their own. If you know where every individual piece of software you run stores every single key, you are a very, very rare person. You're also probably mistaken.

The developers aren't the only people at risk here. Anyone who checks out the development tree is at risk. And just having the source code doesn't automatically protect you. You aren't very likely to go over the code with a fine-toothed comb. And even if you did, it would be quite easy to miss an inserted "mal-feature". (I can't really call it a bug, since I'm thinking of code that was doing what it was intended to do, just not what *you* intended it to do.)

Actually, in the glory days of SQL injection, Google did shut down search for certain patterns (such as inurl:cfm inurl:page_id or inurl:asp inurl:password). If you did too many searches for these, you got a captcha to prove that you were not a "bot"...

Heck, Google disabled searching number ranges after some enterprising folks used them to harvest credit card numbers - searching for numbers between 4000000000000000 and 5999999999999999 would get back lists of credit card numbers (Visa/MC) that Google had indexed because someone put a list up.

So the smart move may indeed be to show how smart you are and get yourself fired. Of course, it all depends on the circumstances, as being fired rather than resigning diminishes your chances of finding a job elsewhere.

I ended up in a lower-paying job (probably because of the layoff), but 6 months later, when a LOT of guys were fired (the company decided to shrink the department - something I had foreseen from my previous experience in this field), I was already employed, with my bills being paid, while a couple dozen professionals (a few of them as good as or better than me) were struggling to find a new position in the market.

I was cruising eBay yesterday and saw that one of the laptops had its Windows license key exposed, readable in the pictures. I poked around some more and found that isn't terribly uncommon. Some people just don't think, no matter what website it is.


Those aren't actual working keys most of the time, though. Usually on machines from the big guys, they're nonworking keys - because the real activation key is built into the BIOS. For earlier (pre-Vista) versions of Windows, they would require manual activation...

Example: the keys for Vagrant [github.com]. Vagrant is a system for managing virtual machines for development purposes. The ssh keys are used to facilitate passwordless login. They aren't typically exposed to the outside world, and they are clearly labelled as insecure.

One of the things people do with it is build base boxes, which are preconfigured virtual machines, and share those base boxes for other people to build upon. In order to do so, the people who receive these base boxes need the private keys they are configured with.

You could distribute the private keys with the base boxes, I suppose, but then you are stuck sharing multiple files instead of just one, and you can't install a base box by running one single command with a URL argument any more. It increases...

I think you've misunderstood. A base box is a virtual machine. A digital file. They are run during development. They aren't production servers that other people access, and they aren't physical hardware.

I think the summary is wrong. status.github.com [github.com] seems to indicate that github's search cluster died, not that they took it down. More likely is that there was a flood of search requests for private keys at the same time and the search cluster buckled.

According to their Twitter and status pages, search is currently down due to problems with their search cluster. They recently released changes to their search, including, I believe, a move to ElasticSearch. The linked article says as much, too, so yet another fail in a Slashdot summary.

Back in the days when I was the root (of all evil, according to my fellow grad students) of our lab, one of the constant problems was people blindly doing chmod 777 .* on their $HOME. They have a .emacs or .profile or .cshrc that was customized ages ago by some grad student, and they want to share it with a new student. Somehow they stumbled onto "chmod 777 .*" as a solution to all their file-sharing problems. Now this "magic command" was also being blindly passed around without worrying about security implications. Oh yeah, they think they are clever and tape their login credentials to the underside of the keyboard, and laugh at secretaries who tape them to their monitors.

Looks like these grad students have all growned up and are uploading it all to the cloud.

When people did stuff like that in my sysadmin classes we were encouraged to teach them a lesson. Far better to edit their login script to log them right back out than delete their homedir contents, or change their path so they got other versions of common programs. Probably the meanest was to make it so instead of calling the work submission script it called rm on whatever they were trying to submit as their classwork.

A subtler prank that I pulled on a friend who left himself logged in to one of the public undergrad labs (where there was the risk that an actual asshole would delete your stuff, send email as you, or something similarly cruel) was to add "echo 'sleep 1' >> .cshrc" to the end of his .cshrc before logging him out. I chuckled to myself, and then forgot about it.

A week later, when it was 5 minutes before a submission deadline and he was yelling at the terminal to finish logging in (since it was taking 2-

I hate that attitude though. I told a friend once that I didn't use authentication with X windows because no one is ever going to bother to interfere and I didn't care if anyone was snooping. So he went out and decided to pop up random pictures on my screen and post messages until I relented. So I wasted a lot of time learning all about authentication and configuring it correctly, not to keep out adversaries but to keep out friends...

Yeah... I was "that guy". The first time I installed Linux in 2000, I was annoyed that I needed "permission" to write to a directory outside of my home directory. I was coming from a Windows world, after all.

I solved this "problem" by running chmod 777 on the entire filesystem. Hah. Problem solved. Needless to say, I couldn't start the machine back up again. I'm guessing it killed itself from the sheer embarrassment. After that, I decided it might be in my best interest to read the manual.

It's probably obvious and I'm just being stupid, but I can't think what you could possibly break by setting all perms to 777. Yeah, you'll mark a bunch of non-executable files as executable, but nothing should be trying to execute them anyway. There may be a few files (like /etc/passwd or /etc/shadow) which some components might refuse to use if they're world-readable, I suppose...

Any idea what broke?

BTW, my similar story: I purchased a NeXT machine in 1991. It came with a 110 MB hard drive, which wasn't a lot

Someone in my class installed a game in the officially-public network share. He was writing an AI for it, for a project. Other students found it, and played it.

It had taken a lot of hacking to get the game to run on Linux, and he was annoyed other students had played it without putting in that effort. So, he altered the 'start.sh' script to generate an ssh key, add the public part to the user's authorized_keys file, and move the private key somewhere obscure.

He then got bored with the AI project.

Some time later, while helping in a tutorial, I was showing a student how to set up an SSH key. The authorized_keys file already contained about 20 entries. The AI guy was sitting at the next computer, and told me what he'd done (I knew him quite well, but he hadn't told me what he'd done until now). He found over 200 private keys in the obscure place. He deleted them, chmod -R go-rwx'd the game, and we thought that was the end of it...

About a year later, Debian had that OpenSSL bug. The sysadmins ran a script across everyone's authorized_keys file, and removed any entries from keys generated by Debian OpenSSL. The email ended (I still have it):

By the way: some of you have FAR TOO MANY authorized_keys ENTRIES
and we seriously recommend that you radically shrink these down.
As I said, we recommend kerberos tickets or ssh-agent instead!
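The script the sysadmins ran was presumably in the spirit of Debian's ssh-vulnkey tool. A hypothetical miniature of the idea - rewriting an authorized_keys file to drop entries whose fingerprints appear in a blacklist of known-weak keys - might look like this (the function name, file layout, and blacklist format are all assumptions, not the admins' actual script):

```shell
#!/bin/sh
# Hypothetical sketch: filter an authorized_keys file against a
# blacklist of fingerprints of known-weak keys. Debian shipped a real
# tool for this (ssh-vulnkey); this is only an illustration.
clean_authorized_keys() {
    keys=$1
    blacklist=$2
    tmp=$(mktemp)
    line_file=$(mktemp)
    while IFS= read -r line; do
        # Pass comments and blank lines through untouched
        case $line in ''|'#'*) printf '%s\n' "$line" >> "$tmp"; continue ;; esac
        # Fingerprint one authorized_keys entry at a time
        printf '%s\n' "$line" > "$line_file"
        fp=$(ssh-keygen -lf "$line_file" 2>/dev/null | awk '{print $2}')
        if [ -n "$fp" ] && grep -qF "$fp" "$blacklist"; then
            echo "removing blacklisted key: $fp" >&2
        else
            printf '%s\n' "$line" >> "$tmp"
        fi
    done < "$keys"
    mv "$tmp" "$keys"
    rm -f "$line_file"
}
```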

Users want to protect their assets. Often, they don't know what their assets are. For the biomedical researcher, the assets are her experimental results, literature survey, etc. She does not even know that these credentials are valuable to some other people in a different universe.

This is common even in other walks of life. I see a few of my colleagues routinely bringing their passports in the briefcases they lug to work. They constantly open them to retrieve reading materials, music players, etc. on the bu...

Well, you have to consider quality over quantity. Maybe fewer than one in a thousand developers screwed up, but what if the keys that were exposed belong to super-important servers, such as those that control the Google Chrome source code or some other big project?

Not quite. They're already out there. The keys are still in the revision history. People have forked and cloned it.

Hopefully the developers who created these keys know that removing them from the repo isn't enough: the keys can no longer be used at all. They must be removed from every .ssh/authorized_keys file, from every service like Github that uses them for deploying code, etc.

Github is officially denying [github.com] that the search feature being killed has anything to do with the exposure of keys. They also have a link on the same page to information on how to purge keys from your repository. (Make of that what you will.)

This doesn't suggest github took anything down on purpose: https://status.github.com/messages [github.com]. Seems to me they were just experiencing some technical difficulties from all the people sharing those search links and having a laugh at the stupids... I skimmed over the github site and didn't find anything that would suggest otherwise, at least. Of course, I didn't read the articles, because they seem badly misinformed and confuse private keys with passwords.

Interesting how just last night this post about Arch users being pedos [4chan.org] showed up on 4chan. Someone had uploaded their zshell history file into the repository and OP happened to notice it. Today Github announces search is being killed...