A Web Security Primer – Sliced, Diced, and Giuliani’d

So, a couple of weeks ago Donald Trump nominated Rudy Giuliani to head up his cybersecurity advisory position. Shortly after the announcement, I thought it’d be interesting to see what the strengths of Giuliani’s cybersecurity experience were. I was, to say the least… unprepared for what was to come over the next 48 hours. Thousands of retweets, over half a million Twitter impressions, triple-digit Facebook shares, and several articles ended up sourcing the things I and a couple of friends found and shared (shout out to Paul Gilzow and Aaron Hill there). Things that, to me, were so incredibly basic that I was shocked apparently no one else had noticed them and called them out.

Which brings us to our lesson. I want to go over the specific things I brought up, how I found them, how you as a developer can learn from them and avoid the issues, and why you as a layperson should care. Every web developer should know basic security practices. In a world where tools like Let’s Encrypt exist, there’s no reason not to. Spinning up your own VPS for a website is trivial now, but lost in the process is the basic effort to lock it down. Let’s go through the bulleted rundown as I presented it on January 12th.

Caveat: I’m going to try to keep these talking points intentionally “light” as much as I can for the less tech-savvy readers who might come through. This isn’t meant to be a step-by-step guide to any of these issues – only to underline why they matter, and how easy they are for a developer to deal with.

Expired SSL

This is a basic maintenance thing. The certificate had expired a month prior, and all they had to do was get a new one reissued and copy it over. SSL certificates, like domain names, must be renewed on a schedule to remain active. They also needed one issued for the right domain, as the certificate they were using didn’t match the site’s domain. There’s literally no hiding from this error, as any attempt to access https://giulianisecurity.com threw a blatant security error screen telling you this exact thing. Browsers won’t trust a site if its certificate is expired or the name on the certificate doesn’t match the one on the website.

This is roughly a $150 fix that takes only a couple of minutes if you go through a commercial certificate authority, though anyone can now get a certificate free through Let’s Encrypt with a little effort server-side.
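
If you’re not sure when a certificate expires, openssl can tell you from the command line. A minimal sketch, using a throwaway self-signed certificate as a stand-in for a real one (all paths are illustrative):

```shell
# Generate a throwaway self-signed cert valid for 30 days (stand-in for a real one)
openssl req -x509 -newkey rsa:2048 -nodes -subj "/CN=example.com" \
  -keyout /tmp/demo.key -out /tmp/demo.crt -days 30 2>/dev/null

# -checkend asks: will this cert still be valid N seconds from now? (604800 = 7 days)
if openssl x509 -checkend 604800 -in /tmp/demo.crt >/dev/null; then
  echo "certificate valid for at least another week"
else
  echo "certificate expiring soon - renew it"
fi
```

For a live site you’d point the same check at the served certificate (e.g. via `openssl s_client -connect yourdomain:443`), or just let a Let’s Encrypt client handle renewal on a timer so the cert never lapses in the first place.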

Doesn’t force https

Let’s forget about the SEO reason for doing this and focus on why it should be standard. One Twitter user (sorry, but I’m not gonna search back through that madness of notifications to find who it was) said this was silly to bring up since the site wasn’t doing anything transactional. It’s not like they had a storefront or were processing payments. Oh, child… So here’s the thing. First, as I’ll point out next, they have a CMS (content management system). You have to log in to said CMS, and you do it from the site itself, so you should be securing that login process to prevent any interception of the data. Second, it’s a best practice. In a world where you can now get certificates for free, why wouldn’t you – especially when you’re running a security-focused website? Even if you’re just running a blog, you want to be sure any exchange between your users and your site is locked down, whether that’s storefront-related or just contact forms and logins. If nothing else, it helps ensure that if something on your site were compromised, there’s a chance it couldn’t leak the data to a third party in an unencrypted format.

Testing this is easy: if you go to somesite.com and it doesn’t redirect you to https://somesite.com, it’s not forcing you to use SSL. You can tell in basically every browser by looking for the padlock icon in the URL bar (in the image above, Chrome outright tells you when something is secured). Fixing it is just a matter of a basic mod_rewrite rule, or a redirect most firewalls can enforce for you. And even if this particular site wouldn’t benefit much from forcing secure connections, yours very well may. Keep in mind that people use open wifi access points and public computers every single day. If you aren’t securing the traffic, it can potentially be monitored in those and many other situations. Even if it’s not login data, there’s behavioral information and activity that can be determined that – depending on the situation – can be extremely… useful… to a hacker.
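
On Apache, that mod_rewrite rule is only a few lines. A sketch that appends a hypothetical force-https block to a vhost config (the /tmp path is a stand-in for your real config file; reload Apache after changing it):

```shell
# Append a force-HTTPS rule to an Apache vhost config
# (/tmp/example-vhost.conf stands in for your real config file)
cat >> /tmp/example-vhost.conf <<'EOF'
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
EOF
```

The `R=301` flag makes the redirect permanent, so browsers and search engines learn to go straight to the https version.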

Exposed CMS login

This was another one I got a “so what” about from some people. First of all, exposed login forms invite brute force attacks. Even if you have other mitigating utilities in place, it’s an attack vector. And with the ability to marshal large networks of machines – particularly with the growth of the Internet of Things – you leave yourself in a position where, if someone wants in, they will likely find a way given time and money. Fixing this is another trivial step, and there are three immediate ways to approach it on most platforms. First, you can password-protect the admin directory at the server level via .htaccess, so there are two layers to get through before someone reaches your CMS. Second, you can change the path used to access the backend. In WordPress, for instance, this is /wp-admin by default; you can write your own mod_rewrite rule to change it, or use a plugin like iThemes Security (if you’re on WordPress) that will do it for you if you aren’t comfortable hardcoding the change. Last is the best approach: firewall it. Make it so the site’s backend can only be reached via a whitelisted VPN, and any other request just gets rejected. That’s overkill for most folks, and very heavy-handed. It’s also a great way to make sure no one but you gets in.
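
As a sketch of that first option, here’s how you might password-protect an admin directory with Apache basic auth. Paths and the username are hypothetical (the `htpasswd` utility from apache2-utils does the same job as the openssl call here):

```shell
# Create a credentials file (user "editor" and the password are placeholders)
printf 'editor:%s\n' "$(openssl passwd -apr1 'use-a-real-password')" > /tmp/.htpasswd

# Drop an .htaccess into the admin directory pointing at it
cat > /tmp/.htaccess <<'EOF'
AuthType Basic
AuthName "Restricted"
AuthUserFile /tmp/.htpasswd
Require valid-user
EOF
```

Now anyone hitting the admin path has to clear the server-level prompt before they ever see the CMS login form.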

Joomla backend login screen

When you don’t know a website’s CMS, you can always test things out by trying a few common paths, like /login, /admin, /administrator/, or /wp-admin. In this case, not only did we discover that you could attempt to log in, but the form outright told you what CMS it was (as many do). Another mitigation option is to employ a secure proxy like CloudFlare, which will monitor for traffic that appears abusive (like a brute force attack or attempted injection into the form) and block it automatically for you. It’s not a complete solution, but it’s better than nothing at all.
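
Checking those paths is a one-liner with curl. A sketch – example.com is a placeholder, and you should only probe sites you administer:

```shell
site="https://example.com"   # placeholder - only probe sites you own
for path in /login /admin /administrator/ /wp-admin; do
  # %{http_code} prints just the status code (000 if the request fails outright)
  code=$(curl -s -o /dev/null -m 3 -w '%{http_code}' "$site$path")
  echo "$site$path -> $code"
done | tee /tmp/probe-results.txt
```

A 200 or a redirect to a login form on any of these tells you the backend is reachable from the open web; a 403 or 404 on all of them suggests it has been moved or firewalled.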

Uses Flash

Yeah, this one was a bit silly, and I pointed it out more to illustrate how out of touch the group was with modern web standards of any sort. But Flash is also not without security vulnerabilities; it has been under attack (in the figurative sense) for years now, to the point that even Adobe has capitulated on it. So I don’t feel that bad about pointing it out. It’s stupid and lazy for Flash to be in use on a site, particularly for the strictly decorative purpose it served here (moving banners). It’s also a good indication that the site hasn’t been touched or maintained in some time.

Using EOL’d PHP version (5.4.x)

How do you know what PHP version the server is running without hacking it? Easy. Super easy. Most servers straight up tell you if you look at the response headers sent to the browser when you communicate with them. Specifically, just check the X-Powered-By header.

An X-Powered-By header shown in a response (not giulianisecurity.com)

You can also potentially get this information out of tools like nmap, but I’ll talk about that in more depth below.
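As a sketch, here’s the kind of thing you’d see. The headers below are illustrative, saved as if from a `curl -sI` request:

```shell
# Hypothetical response headers, as saved by: curl -sI https://example.com > /tmp/headers.txt
cat > /tmp/headers.txt <<'EOF'
HTTP/1.1 200 OK
Server: Apache/2.2.22 (FreeBSD)
X-Powered-By: PHP/5.4.45
Content-Type: text/html; charset=UTF-8
EOF

# Pull out the version-leaking headers
grep -iE '^(Server|X-Powered-By):' /tmp/headers.txt
# → Server: Apache/2.2.22 (FreeBSD)
# → X-Powered-By: PHP/5.4.45
```

Two headers, and you know the web server, the OS family, and the exact PHP version – everything you need to start looking up CVEs. (Which is also why well-maintained servers suppress or blank these headers.)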

For those less familiar, EOL means end of life – as in, it’s no longer getting patched for any issues that are found. In this case, the 5.4 branch of PHP stopped getting security updates on September 3, 2015. Once you know the version, a trip to CVE Details will give you a list of every known vulnerability for the given software. CVE stands for Common Vulnerabilities and Exposures – basically, bugs that can let you compromise a system. That doesn’t mean all of them are serious or exploitable in every scenario, but once again, it’s the start of an attack vector.

Again, this is mostly a matter of basic maintenance to fix. A couple commands and PHP will be up-to-date. It’s minimal effort to resolve.
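
If you want to sanity-check a version yourself, `sort -V` can compare version strings. A sketch – the 5.6 cutoff is an assumption reflecting the oldest branch still receiving security fixes at the time; adjust it as branches age out:

```shell
current="5.4.45"     # version leaked by the server's X-Powered-By header
supported="5.6.0"    # assumption: oldest branch still getting security fixes
# sort -V orders version strings numerically; if $current sorts first, it's older
if [ "$(printf '%s\n' "$current" "$supported" | sort -V | head -n1)" = "$current" ] \
   && [ "$current" != "$supported" ]; then
  echo "PHP $current is past end of life - upgrade"
else
  echo "PHP $current is on a supported branch"
fi
# → PHP 5.4.45 is past end of life - upgrade
```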

SSL Labs grade of F

I’ve already mentioned that the site didn’t have a valid certificate and didn’t force SSL. Any time you access a website using https, you’re doing so over the secure sockets layer – or SSL. SSL Labs offers a tool that makes a secure request to a web server and analyzes the certificate and protocols used to determine how safe the setup is and what issues it has. Again, nothing nefarious is done to discover this: your browser knows all of this information any time you hit a site using https; SSL Labs just puts it into context and cross-references it with known issues.

SSL Labs site score

The two biggest things relevant here are that 1) the server doesn’t support the latest version of TLS (just a connection protocol that, like any software, is updated, patched, and improved over time), and 2) the server’s setup makes it susceptible to a DROWN attack. A successful attack lets a hacker compromise the server’s encryption and thus gain access to the meaty bits inside. Now, as mentioned, this site doesn’t really do much of value, so the impact here is limited. But it could allow an attacker to compromise something like a CMS login – if, you know, that was encrypted to begin with. This is always dangerous territory because of how frequently passwords are reused. If I get a CMS login, I might have a login to much more than that on my hands.

https://www.youtube.com/watch?v=fCkQxZYgh5Y

Fixing this is a bit more involved, as it requires patching a few different things, but we’re still talking about stuff covered by doing the bare minimum of regular server maintenance. Update the TLS configuration and get the certificate fixed, and the grade would instantly go up.
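
On Apache with mod_ssl, the protocol side of that fix is a couple of config lines. A sketch, written to a temp path here – in practice this belongs in your SSL vhost or ssl.conf:

```shell
# Disable the broken SSLv2/SSLv3 protocols (the DROWN-era ones), keep TLS
cat > /tmp/ssl-hardening.conf <<'EOF'
SSLProtocol All -SSLv2 -SSLv3
SSLHonorCipherOrder On
EOF
```

DROWN in particular depends on SSLv2 being answerable somewhere with the same certificate, so turning the old protocols off everywhere the key is used is the core of the fix.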

Using Joomla 3.1.1 (released in April 2013 – current is 3.6.5)

First, let’s all agree that there are plenty of jokes to be made about using Joomla to host a site that, arguably, should be nothing but flat HTML files to begin with. The reality is that it can be extremely easy to determine the version of a CMS. WordPress, for instance, spits this out by default in a meta tag unless you disable it. Just like with PHP, once you have a version number, go look up the CVEs on it and get to work.
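
For example, a quick grep over a page’s HTML will surface that generator tag. The HTML below is illustrative, saved as if from a plain curl request:

```shell
# Hypothetical page source, as saved by: curl -s https://example.com > /tmp/page.html
cat > /tmp/page.html <<'EOF'
<html><head>
<meta name="generator" content="Joomla! - Open Source Content Management" />
</head><body>...</body></html>
EOF

grep -io '<meta name="generator"[^>]*>' /tmp/page.html
# → <meta name="generator" content="Joomla! - Open Source Content Management" />
```

One line of output, and the CMS has identified itself before you’ve even looked at a login page.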

In the case of Joomla (and many CMSs), a lot of files get installed that, if you know their URLs, can be accessed and will give up information.

At the time of this writing, the site itself is down, so you can’t actually look at any of those files. Regardless of the CMS, you can also use the versions of the libraries in use to help determine a CMS version. Think of them as timestamps or fingerprints of when the site was last updated. Joomla uses all kinds of CSS and JS that can give it away. On top of that, Joomla doesn’t secure a number of otherwise harmless XML manifest files that state the version number outright.

I cannot stress enough how important it is to keep your CMS, themes, and plugins up to date to avoid exposing yourself to security exploits. WordPress in particular can be bad about this, since abandoned plugins can remain in use even after they’ve been pulled from the public repository for known issues, and you’d never know otherwise. Tools like MainWP can keep track of potentially abandoned plugins, among many other useful features.

To mitigate these issues, take out any meta tag that lists your site’s generator, and delete the readme and example files commonly shipped with a CMS’s initial install. Where you can, rename default folders so anything you leave in place is harder to find. Keep libraries up to date, and consider using combiners/minifiers for CSS and JS, which will strip out code comments that might leak version information.
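
Stripping the generator tag is a one-liner with sed. A sketch on a hypothetical template file:

```shell
# A hypothetical header template that leaks the generator
printf '<head>\n<meta name="generator" content="Joomla!" />\n<title>Home</title>\n</head>\n' > /tmp/head.html

# Delete any line carrying the generator meta tag (keeps a .bak backup)
sed -i.bak '/<meta name="generator"/d' /tmp/head.html
```

In WordPress the cleaner route is `remove_action('wp_head', 'wp_generator');` in your theme, which stops the tag from being emitted at all rather than scrubbing it afterward.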

SSH exposed to public access

So, this one is a little bit of a grey area. We get this information thanks to a tool called nmap, the output of which one user was kind enough to dump (with more human-readable versions posted to /r/sysadmin and elsewhere). Now, here’s the caveat: running around port scanning servers that aren’t yours falls into questionable legal territory (questionable as in it can be straight-up illegal depending on where you are), and I do not encourage you to go around doing it. But since the data is out there, I’m more than happy to comment on it.

SSH is a means of logging into and using systems that you aren’t sitting in front of – a remote access tool. Generally this is done via the command line, and what you can do is based on your level of access. The reason you generally don’t want it exposed to the public is that – just like with the exposed CMS login – it gives an attacker a vector to brute-force their way into your server. If they get in this way, they can potentially have completely unlimited access to whatever’s on the machine (and more, depending on what the machine is connected to).

Rather than enumerating all the ways you can secure SSH – which has been done a million times over – I’ll just tell you to go read this and this. Remember the KIF approach: Keys, iptables, and Fail2Ban.
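
As a sketch of the “Keys” and “iptables” pieces (paths and the address range are illustrative; the real config file is /etc/ssh/sshd_config, and you should confirm key-based login works before reloading sshd):

```shell
# Key-only SSH: no password logins, no direct root login
cat > /tmp/sshd-hardening.conf <<'EOF'
PasswordAuthentication no
ChallengeResponseAuthentication no
PermitRootLogin no
EOF

# The iptables side (commented out - requires root; 203.0.113.0/24 is a
# placeholder for your trusted network range):
# iptables -A INPUT -p tcp --dport 22 -s 203.0.113.0/24 -j ACCEPT
# iptables -A INPUT -p tcp --dport 22 -j DROP
```

With keys enforced, a brute force attack against passwords has nothing to guess at, and Fail2Ban cleans up whatever noise is left by banning repeat offenders.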

FreeBSD 6 (released in 2008)

EVERYTHING IS AN ATTACK VECTOR. That is to say, when you know something has a specific weakness, it’s a path to compromising a system – the CMS, the application layer, the libraries, and the operating system. In this case, it’s an operating system that is, strictly speaking, ancient by computer-world standards. You know that feeling when you go to your grandparents’ house and they have an old white-box PC running Windows 98? That’s roughly how this looks to folks who work in web. As before, knowing this means you can pull the CVEs and try attacks, and the information is easy to obtain via the aforementioned X-Powered-By header or nmap results. Compromising the OS means everything on that machine is open to you. So, you know, bad.

Solution: Keep your operating system up to date. Especially when it was installed 9 years ago.

Open ports, so many open ports…

I REPEAT, DO NOT GO AROUND PORT SCANNING MACHINES YOU DON’T HAVE PERMISSION TO SCAN. You’re just asking for headaches and possibly a visit from your friendly local law enforcement (or worse, more than local). But again, by this point the information was already circulating, so it was there to look at. Some of these we can totally give a pass on, like the FTP and email ports (and obviously http). But why, WHY do you leave MySQL (the database system storing information for the website and possibly other applications) just open to the world? Same with others like LDAP and the mysterious tcpwrapped ports (meaning something is running there, but who knows what).

If you aren’t using a service, shut it down or block it with a firewall – make it completely invisible. You can even take a server like MySQL and put it on an entirely separate machine that’s accessible only by the web server, meaning an attacker can’t reach it without going through the first machine, which gives you an extra layer of protection. Even if the web server is compromised, your database may stay safe since it’s not in the same place.
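
The simplest version of that for MySQL is a one-line config change: bind it to localhost so it never listens on a public interface at all. A sketch, written to a temp path here – the real file is typically /etc/mysql/my.cnf or a conf.d fragment:

```shell
cat > /tmp/mysql-bind.cnf <<'EOF'
[mysqld]
# Listen only on the loopback interface - invisible to the outside world
bind-address = 127.0.0.1
EOF
```

With that in place, only processes on the same machine (or a tunneled connection) can reach the database port, regardless of what the firewall does.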

Why This All Matters

First and foremost, the post is more about the basics of web security, and not meant to be political. That said, there is that underlying reason why this all came up to begin with, so I feel I have to address that at least on some small scale.

People have felt the need to point out to me – and several articles have mentioned – that the Giuliani Security site wasn’t about web security consulting. That is true. Their emphasis seems to be much more on crisis management and physical security – a combination that has helped Giuliani Partners bring in $40 million USD. But here’s the thing about that: Giuliani was just elevated to arguably the most important cybersecurity post in the world, and the best example we have of how he executes on that front was severely lacking and reminiscent of “old world” thinking. For example, as I discussed with Aaron earlier, the proper security response to the issues we pointed out would have been to resolve them with the steps mentioned. Instead, they decided to throw in the towel and take a “security through obscurity” approach. What I mean is, while http://giulianisecurity.com was taken out of the DNS records – meaning the domain name no longer pointed at any address – the server itself was left up, running, and accessible via its IP for hours after the issues were pointed out. Meaning that hours after the problems were known, the steps they’d taken mitigated the risk by exactly zero percent. These aren’t the actions of people versed in good web security practices, and they force one to question just what Rudy’s cybersecurity approach will be.

Giuliani Security’s services

“Yeah, but he’s going to use his position, power, and connections to bring in the best people from the industry to tackle this issue.” I’ve heard that argument, too. And if it were really true, you would expect to see it in the evidence we have on hand. We don’t. The Giuliani Security website is clearly a brochure site (and while I didn’t give much attention to the Giuliani Partners site, what little I did look at there was just as discouraging). It’s there just to show that they do something. It was set up – almost certainly by a third party – and promptly forgotten about. That’s fine for most places, but it’s hard to hide behind when so little care and attention was given to the only thing that would imply some basic competency in the position he’s taking. If you were working at a company that was hiring a Chief Security Officer, and this was the sort of résumé you were presented with, would you hire him? I wouldn’t even bother with a phone interview.

That’s what made all of this so concerning, and why the conversation grew the legs it did. Had I seen the site, ugly as it was, but noticed good practices on everything listed above, it’d be much harder to make the case that such a person was unfit for the role. Any web developer should be able to identify and address the things I brought up. You could fix every single thing listed here in an afternoon, and the result would be a website that was, for the most part, pretty freaking safe. There’s more you could do, of course – firewall rules, application security, penetration testing, etc. – but that’s not my area of expertise, nor is it most developers’. It’s not about being perfect; it’s about avoiding excuses to be stupid. Every designer should learn a little development. Every developer should learn a little design and security.

A final thought: we must understand that if someone trying to penetrate your site is gleaning points like these, it enables an additional, analog attack vector. A lot of hacking has nothing to do with code, but rather charisma. Social engineering plays a huge part in many system penetrations. Understanding the issues gives an attacker something to talk about – something that makes it sound like they know what’s up and should be given access to a system.

“Yeah, this is John Doe over at A-Plus Design. We were talking with Bob Smith over there and have been asked to update your site to get rid of that ancient Flash content that’s stored in Joomla and do an upgrade on the backend. But our project manager totally screwed up and forgot to have Bob set us up an admin account so we can get in and change the files out. And of course Bob’s gone on vacation with Rudy now. Anyway, long story short, who can we talk to about getting access?”

This approach works more often than you would ever suspect. When you sound like you have internal working knowledge, you can get people to do all kinds of things. It’s the equivalent of walking into a building with a tool bag and a clipboard and being waved right back into the storeroom. When we talk about a security firm – whether cyber or operational – you should expect them to be mitigating these opportunities (and, very hopefully, training their internal staff not to fall for such attempts).

Media References

I’m partly including this for my own reference and posterity, but if you’re interested in reading more, the following sites all covered this story after it gained traction on Twitter and Facebook, and cited the issues I shared above: