
First time accepted submitter punk2176 writes "Recently I started a free and open source project known as the PunkSPIDER project and presented it at ShmooCon 2013. If you haven't heard of it, it's, at heart, a project with the goal of pushing for improved global website security. In order to do this we built a Hadoop distributed computing cluster along with a website vulnerability scanner that can use the cluster. Once we finished that, we open-sourced the code to our scanner and unleashed it on the Internet. The results of our scans are provided to the public for free in an easy-to-use search engine. The results so far aren't pretty." The Register has an informative article, too.

Will his next project be a search engine showing which houses don't have good security systems, or one showing the weaknesses in each home's security? What an awful way to attention-whore: by giving criminals a list of defenseless people.

That's not a perfect metaphor and you know it, because if your house is insecure it puts you in danger, while if your website is insecure it puts your users in danger. If your house has no security system, you're personally aware of it. If your website is insecure, you likely are not.

I'm not making an argument as to whether or not this is a good idea, but you're oversimplifying things on purpose.

I don't see how this is different than publishing a searchable database of unlocked doors that I found in my neighborhood, with the claim that my purpose is to improve my neighborhood's security. I do not see this as oversimplification. A group (gaggle? herd?) of tweakers could use the database to find an unlocked house whose owners are on vacation and then squat there, using it as a base to burgle other houses in my neighborhood, just as malicious hackers could host malware on a vulnerable site. It's still

Well, at least one difference is that when a website gets hacked it is almost always the people visiting the website who are the target because the goal of the hacker is either to grab information about those users from the hacked system or to use the hacked system to distribute exploits to anyone that browses there.

Whereas when a house is broken into, it is basically a problem for the owners of the house and not really for anyone else.

So publishing a list of vulnerabilities on websites serves the purpose of shaming the website operators into better protecting their users.

So publishing a list of vulnerabilities on websites serves the purpose of shaming the website operators into better protecting their users.

So by that logic, I assume you rape every woman you pass on a dark street, mug the elderly who don't go out in groups, and commit every other crime of opportunity to shame people into what *you* consider proper, minimum safe behavior. How brave and noble of you.

I'm so tired of people dressing up shitty behavior under the guise of protecting them when really all they are doing is being selfish, self-satisfying little asshats.

If this guy wasn't such a douche, he'd be emailing the websites a notice letting the

So by that logic, I assume you rape every woman you pass on a dark street, mug the elderly who don't go out in groups, and commit every other crime of opportunity to shame people into what *you* consider proper, minimum safe behavior. How brave and noble of you.

Phew. For a moment I was worried this thread would descend into hyperbole and strawman arguments.

So by that logic, I assume you rape every woman you pass on a dark street, mug the elderly who don't go out in groups, and commit every other crime of opportunity to shame people into what *you* consider proper, minimum safe behavior. How brave and noble of you.

Yes, in fact I make sure to rape and mug every chance I get!!

The fact that you have to make such an absurd argument ought to be a clue that you have misunderstood the original point.

Funny; when discussing vulnerability scanners in a networking class recently, my professor told us it was seriously unethical to scan a system without permission - it would be like walking through a parking lot and checking which cars are unlocked. I think most people would agree with him. This project might have good intentions, trying to encourage the sysadmins to tighten up their security, but I think there's a better way to do it than public shaming.

Some of these sites are likely entrusted with sensitive user information. The car analogy is only apt if you borrowed $100 from a couple of your closest friends for rent and left that in the car you forgot to lock while you were getting a taco.
As I see it, the benefit of this type of public shaming is it reinforces in end users the idea that you should be careful who you trust with your data. For admins, if the majority of listed sites use web technology "x", maybe if you're designing a new site you look

And 99.97% are some guy trying to make ends meet by offering online chemistry lessons or showing you how to hook up your home theatre. If there were any sites found that held personal information, the right thing to do would be to contact those sites, not encourage people to hack the personal information.

Certainly it does no good whatsoever to give script kiddies a list of sites to deface. The most popular hosting is GoDaddy, with their $10/month hosting account. (35% of sites are GoDaddy sites.)

That's a fine point, but if you were that consumer shopping to implement a prefab site, wouldn't you like to know if the technical foundations are sound? If 1000 GoDaddy sites are hacked in a day maybe that prompts a response from the host.

The most popular hosting is GoDaddy, with their $10/month hosting account. (35% of sites are GoDaddy sites.) The sites with hosting budgets of around $10-$50/month make up 95% of all sites.

As a web developer I may say categorically, fuck them. If you put a site on the web, it is your responsibility to make sure that it is secure. If you are not able to do that to a professional standard, you should not do it. In point of fact, there is a need for a licensing organization to prevent amateurs from practicing web development. The problem isn't that this website is exposing poor security practices, it's that it's not promoting professionalism.

Actually I do somewhat agree with the spirit of this post here. Software Engineering is a discipline that can affect large amounts of people and where not many people actually understand it. This is very similar to any other type of engineering (civil, nuclear, electrical, etc.) and to practice those other disciplines generally you have to have a P.E. or at least a P.E. signs off on the work done. Under current models, this is not in any way required for software and while most of your real software engi

and the god damn license isn't worth the paper it's printed on. What makes the difference is that an engineer is legally liable for any screwups. In other words, they've got a bit more at risk than Joe Sixpack who's throwing shit at the web to see what in hell is going to stick. Until the liability issue changes for web developers, nothing is going to change.

I'm a little confused here, your post kind of contradicts itself. You say the license isn't worth the paper it is printed on, but then say

an engineer is legally liable for any screwups

The license is what allows someone to legally be an engineer for most disciplines. We went over this when I took my engineering ethics course back in college, and there have been numerous (some very frivolous in fact) lawsuits to keep people from using the term or actually practicing any form (of licensed) engineering. The only current exception to this is software en

If someone puts up a web site I have to figure that it might be for people to visit. If that site has vulnerabilities, I have to give the owner the benefit of the doubt that they would likely want to know about them, as I also have to figure that they wish for the site to be safe from attack - to prevent defacement, hijacking for attack-app insertion, making off with private info, etc.
Therefore I'

They're scanning large numbers of websites and putting vulnerability information up on a public website for anyone to view without notifying the website owners, much less giving them a chance to fix the problems before sending hackers their way. There's nothing ethical about that.

In the past, hackers would notify the owners and give them a chance to fix problems, and the owners would either do nothing or even threaten to sue. Any slashdot reader should know this.

Anyone claiming to know anything about the topic of web security should know the procedure used to remedy ~90% of all vulnerabilities. Those security updates you get each week don't appear out of nowhere. Someone like myself files a security ticket with the vendor or affected party. The vulnerability is confirmed and analyzed, then other vendors who are likely to have similar vulnerabilities are notified. A patch is pushed, THEN a CVE is issued. After that, more mainstream sites like slashdot pick up on, and link to, the CVE, which explains what the vulnerability was and links to the update that fixes it. That's typically about a week after the vulnerability is reported and 2-3 days after the fix is available. That's how security issues are normally handled; they aren't ignored. (If they were ignored we wouldn't average 100 security notices per week, would we?)

When I found the PowerDNS vulnerability I could have come straight to Slashdot with "how to take down wikipedia and millions of other sites". If I were a scumball attention whore I would have done so. Instead, I reported it through proper channels. Wikipedia was patched within 36 hours, then other sites. The next day, the CVE went out, THEN you heard about it. I still get to brag - I just do it AFTER a) wikipedia and other responsive sites are safe and b) I have something worth bragging about, having protected wikipedia from being exploited.
This jackass is merely attention-whoring at other people's expense. He hasn't done anything special - just ran Nessus - but is advertising himself via the results rather than handling them responsibly.

Suppose for a moment that some of the sites could leak sensitive information. Suppose also that sites which leak sensitive information should be slapped. Well, the slashvertised site, the cracker's search engine, is most certainly leaking sensitive information, ergo he should be slapped!

1) The statement that we "just run Nessus" is incorrect. We wrote our own scanner that works on a Hadoop cluster. Why is this important? It means that we can handle a lot more scans than anyone else (several thousand per day with a small cluster) and it's also specifically made for mass scans. This is important in point 2 below.

2) The process you're describing is for finding a vulnerability in a piece of software in general (e.g. a common CMS), not a specific vulnerability
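For readers curious what "a scanner that works on a Hadoop cluster" can look like mechanically, here is a minimal, hypothetical sketch in the style of a Hadoop Streaming mapper: each mapper task reads target URLs from stdin and emits tab-separated findings on stdout, which is what lets a cluster fan scans out across many nodes. The `probe` function is a placeholder for illustration only, not PunkSPIDER's actual scanning logic.

```python
# Hypothetical Hadoop Streaming-style mapper sketch. In a real job,
# Hadoop pipes a slice of the input (one URL per line) to this script's
# stdin and collects the "url<TAB>finding" pairs it writes to stdout.
import sys

def probe(url):
    """Placeholder vulnerability check. A real scanner would send
    crafted HTTP requests and inspect the responses; here we only
    flag URLs that carry query parameters as worth testing."""
    return "has-params" if "?" in url else "no-params"

def run_mapper(lines, out=sys.stdout):
    """Emit one tab-separated result line per non-empty input line."""
    for line in lines:
        url = line.strip()
        if url:
            out.write(f"{url}\t{probe(url)}\n")
```

In a real deployment this script would be registered as the `-mapper` of a streaming job; the point is only that mass scanning parallelizes naturally when each mapper handles an independent slice of the URL list.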

We hope to provide a view of this to the website owner and yes, push them a little to get their security ducks in a row.

No, you don't. If you did you'd have built your system to make *them* aware first, instead of posting a "don't blame the messenger" shame tool that exposes their vulnerabilities.

The hacking-promotes-security argument is weak sauce, even more so in your case. The vast majority of the people you've exposed (i.e. not anonymous mega-corps, but rather small mom-and-pops set up and left unmanaged by unskilled sysadmins, innocuous self-hosting newbies, etc.) will likely never encounter your list, even after it provides scriptkiddies with an easily digestible list of opportunities. The kiddies will wipe their servers and turn them into warez hubs, rinse and repeat, because the owners will *never* know any better.

You are merely a new vector for the disease, selling itself as a cure. Where in this is your moment to feel proud?

1) The software used is a very minor part of the point, and as far as the ethics argument goes means literally nothing.

2) The start of the process he's describing, reporting the bug to the people who can deal with it, is the important step that doesn't change. Yes, it is different than dealing directly with software developers. It also means they probably aren't capable of fixing it so quickly. The software developers have a huge edge in that area. It does need t

How 'good' or 'bad' this database is depends IMHO partially on its query interface. If one can only ask for single (FQDN) URLs (with a query rate limit) and gets the vulnerabilities of that specific URL as an answer (plus maybe some pointers on what the vulnerable software likely is), it might actually be useful for the somewhat technically-inclined web owner. The proposed list of vulnerable software would probably not help them as they would have to remember which SW their site
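The query interface the poster proposes (single-FQDN lookups behind a rate limit) can be sketched in a few lines. This is a hypothetical illustration, not the actual PunkSPIDER API: the function names, limits, and result strings are all invented for the example.

```python
# Hypothetical sketch of a rate-limited, single-URL lookup: a site
# owner can check one domain at a time, but bulk harvesting by the
# same client is throttled. Limits are illustrative.
import time
from collections import defaultdict, deque

WINDOW_SECONDS = 60
MAX_QUERIES_PER_WINDOW = 5

_history = defaultdict(deque)  # client_id -> timestamps of recent queries

def lookup(client_id, fqdn, results, now=None):
    """Return the scan record for one FQDN, or raise RuntimeError if
    the client exceeded its sliding-window rate limit."""
    now = time.monotonic() if now is None else now
    q = _history[client_id]
    # Drop timestamps that have aged out of the window.
    while q and now - q[0] > WINDOW_SECONDS:
        q.popleft()
    if len(q) >= MAX_QUERIES_PER_WINDOW:
        raise RuntimeError("rate limit exceeded; try again later")
    q.append(now)
    return results.get(fqdn, "not scanned")
```

A design like this lets the technically inclined site owner answer "what does the database say about *my* domain?" without handing out an easily enumerable bulk feed.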

I'm really not trying to be harsh, sorry for coming off that way. I'm probably a little biased because I have experience being a small business IT guy by default (as in no real training, just better with computers than the other people there), so that's really who I relate to the most. In my position I understood sometimes you need to seek help elsewhere, and I did, but I also learned that problems aren't as easy to fix as people think, nor are they always cheap, and money can be a big issue for a small bus

I see a note on the punkspider site to opt out of having your site scanned. Is there a specific way to opt in as well? I would be interested in seeing what results could come out of scanning a few of my sites. I've tried using Skipfish in the past, and a few other scanning utilities, and got a lot of false positives, and also a lot of misses: things I knew were vulnerable that I just wanted to see if the scanner would pick up on.

So are you saying your argument in favor of the name & shame strategy is pointing out times where companies were named and shamed and still didn't fix it? I see a flaw in your argument... Which is sad because it's a valid point that you're trying to make, sometimes the name & shame strategy does work. But that's not really what this search engine does anyway, it's not as if they're posting on their front page that a site has vulnerabilities, you still need to go out of your way to check a specific s

On the other hand you could just check the sites you manage and design with this tool and see if it finds any problems. It is important that your website is standards compliant, and it is just as important that your site is secure. If you ever get hacked you will soon find your site blacklisted, facing a pile of work to rebuild the site and, more importantly, restore your reputation.

Without this tool you will be hacked eventually if your site has vulnerabilities, it has to be a good thing for you to know beforehand so

I think a better example may be someone going through stores and seeing which ones are posting people's credit card info on a large board behind the cashier for all to see, versus the ones that are actually trying to keep it hidden. That is seriously about how stupid many of these web sites are, and if this were happening in meatspace, such a list would be uncontested as a supreme public service.

This is more like going to a public parking lot and testing to find out whether the security cameras are real and working, not working, or fakes, and then telling people they shouldn't park their cars there if they want to park where there are security cameras.

Actually, it's less like telling people they shouldn't park there, and more like creating a searchable database of areas with no security cameras. It doesn't take much thought to realize the people looking for a place to park aren't going to search this database, it'll be used by the people looking for safer areas to steal those cars. Or to drop all this stupid analogy crap that never seems to have a positive effect on discussions, this is a searchable database that's only going to be used by people who are

The prof gave a wrong simile; you are an idiot. WebAppSec scanners can inject harmful payloads (emptying-whole-DB-tables harmful - a simple string like "or 1=1" in the wrong place can cause loads of trouble) and should never be run against live/production websites.
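The "or 1=1" payload mentioned above works because user input gets concatenated straight into a SQL statement. A minimal, self-contained demonstration (using an in-memory sqlite3 database, purely for illustration) shows why the concatenated query leaks every row while a parameterized query treats the same input as inert data:

```python
# Why "' OR 1=1 --" is dangerous: string concatenation lets the input
# rewrite the query; a bound parameter does not. Demo data only.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, secret TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")

payload = "nobody' OR 1=1 --"

# Vulnerable: the payload closes the quote, adds OR 1=1, and comments
# out the rest, so the WHERE clause matches every row.
rows_vuln = conn.execute(
    "SELECT * FROM users WHERE name = '" + payload + "'"
).fetchall()

# Safe: the payload is passed as a bound parameter and is compared
# literally against the name column, matching nothing.
rows_safe = conn.execute(
    "SELECT * FROM users WHERE name = ?", (payload,)
).fetchall()
```

This is also why the parent is right that such probes should never be fired at production sites: a destructive variant of the payload could just as easily drop or empty tables.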

Also, those guys are overly excited about their own work to the point of arrogance, but give them time. They'll either come to appreciate all the complexities of these types of systems and power on, or just give up after a while.
They got the

How about this: you lent your expensive camera to a friend and you noticed he left it visible inside his car in the parking lot. A parking lot known for many burglaries; in other words, he was inviting somebody to steal your camera.

Would it be ethical to check if he locked his car so you can protect YOUR OWN belongings?

I agree with your statement, but looking at my log files I wonder why the good guys should not be allowed to perform one scan while the bad ones are performing hundreds a day. Why should the bad

The basic car analogy fails to capture that vulnerabilities in computer systems are often used as stepping stones for further attacks on other computer systems. In the car analogy proper, the only person affected by a break-in is the unlocked car's owner, while the other car owners are safe provided their car doors are locked.

But say the criminal is a joyrider.
He picks an unlocked car, and then drives around the parking lot smashing into other locked cars for fun, and then runs away. Now the question: is

Public shaming has its place. Think back, if you were involved then, to the mid to late 90s. Smurf attacks were the in thing. Places like powertech.no and others began listing networks that had smurfable broadcast addresses. For a while, this did the work for the script kiddies but eventually the networks made it a point to remove themselves from the database and now the problem is nearly gone. I can see this database having a similar impact.

It's a tool. Tools can be used for good and evil; it just depends on whose hands the tool is in. Take Metasploit for example -- it's used widely by both whitehat security researchers and blackhat criminals.

As a security researcher, I'll add that PunkSPIDER doesn't shine light on anything that the bad guys don't already know. I'm glad to see another tool that helps enable those who are charged with defending web applications.

So one thing that we've been trying to make clear is that the project is *on track* to scan the entire Internet, we haven't scanned everything yet. We have scanned about 70k sites and have under 4 million indexed. Our next version is going to be clearer on what is and is not scanned - currently we just say 0 vulnerabilities if we haven't scanned it, indicating that we have not found vulnerabilities in it yet - not necessarily that it doesn't have any.
This was all part of our ShmooCon presentation which ju
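The distinction the poster describes (a domain that has not been scanned yet versus one that was scanned and came back clean) is easy to encode explicitly. A tiny hypothetical sketch, with field names invented for illustration rather than taken from the project's real schema:

```python
# Sketch of an unambiguous scan-state record: None means "not yet
# scanned", 0 means "scanned, no findings". Names are illustrative.
from dataclasses import dataclass
from typing import Optional

@dataclass
class ScanResult:
    domain: str
    vuln_count: Optional[int] = None  # None = not scanned yet

    def summary(self):
        if self.vuln_count is None:
            return f"{self.domain}: not yet scanned"
        return f"{self.domain}: {self.vuln_count} vulnerabilities found"
```

Surfacing that difference in the search results would remove the ambiguity of displaying "0 vulnerabilities" for sites the crawler simply has not reached.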

I hope you've got a good lawyer and money to keep him or her happy. The first exploit you publish about a large (organization|government|important person) is going to give you a really, really, really big headache - at best.

Also... ethics - you have none. For this, as someone who has spent past lives working in IS, I hope you rot in a miserable existence.

I was at your talk at ShmooCon and was quite impressed.
What if for any domains that you discovered vulnerabilities on you were to automatically pull whois data (if the TLD has whois servers or web based whois without a captcha) and send a quick email about your findings to any emails listed?
A shameless plug: ruby whois [ruby-whois.org] is the best programmatic whois client and parser out there IMHO. It would make the above suggestion quite simple.
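The notification idea above is mostly a parsing problem once the raw whois output is in hand (fetched via a port-43 query, or a parser like ruby-whois). A minimal Python sketch of the extraction step; the regex is deliberately simplistic, and real whois records are far messier than this:

```python
# Pull candidate contact addresses out of raw whois text so the
# scanner could email its findings to the domain owner. The regex is
# a rough heuristic, not a full RFC-compliant email parser.
import re

EMAIL_RE = re.compile(r"[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,}")

def contact_emails(whois_text):
    """Return unique, lowercased email addresses found in raw whois
    output, preserving first-seen order."""
    seen = []
    for match in EMAIL_RE.findall(whois_text):
        addr = match.lower()
        if addr not in seen:
            seen.append(addr)
    return seen
```

Deduplicating matters because registrant, admin, and tech contacts often share one address; a notifier built on this would also want to skip registrar privacy-proxy addresses, which rarely reach the actual owner.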