
Trailrunner7 writes "Bug bounty programs have been a boon for both researchers and the vendors who sponsor them. From the researcher's perspective, having a lucrative outlet for the work they put in finding vulnerabilities is an obvious win. Many researchers do this work on their own time, outside of their day jobs and with no promise of financial reward. The willingness of vendors such as Google, Facebook, PayPal, Barracuda, Mozilla and others to pay significant amounts of money to researchers who report vulnerabilities to them privately has given researchers both an incentive to find more vulnerabilities and a motivation to not go the full disclosure route. This set of circumstances could be an opportunity for the federal government to step in and create its own separate bug reward program to take up the slack. Certain government agencies already are buying vulnerabilities and exploits for offensive operations. But the opportunity here is for an organization such as US-CERT, a unit of the Department of Homeland Security, to offer reasonably significant rewards for vulnerability information to be used for defensive purposes. There are a large number of software vendors who don't pay for vulnerabilities, and many of them produce applications that are critical to the operation of utilities, financial systems and government networks. DHS has a massive budget–a $39 billion request for fiscal 2014–and a tiny portion of that allocated to buy bugs from researchers could have a significant effect on the security of the nation's networks. Once the government buys the vulnerability information, it could then work with the affected vendors on fixes, mitigations and notifications for customers before details are released."
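The summary's "tiny portion" claim can be made concrete with a back-of-envelope calculation. This is purely illustrative: the 0.1% set-aside and the $10,000 average bounty are assumptions, not figures from the article; only the $39 billion budget request comes from the summary.

```python
# Back-of-envelope sketch: how far a small slice of the DHS fiscal 2014
# budget request could go as defensive bug bounties.
dhs_budget_request = 39_000_000_000  # fiscal 2014 request, per the summary
slice_fraction = 0.001               # assumed: 0.1% set aside for bounties
avg_bounty = 10_000                  # assumed average payout per bug

bounty_pool = dhs_budget_request * slice_fraction
bugs_purchasable = bounty_pool / avg_bounty

print(f"Bounty pool: ${bounty_pool:,.0f}")            # Bounty pool: $39,000,000
print(f"Bounties fundable: {bugs_purchasable:,.0f}")  # Bounties fundable: 3,900
```

Even under these conservative assumptions, a thousandth of the budget would fund thousands of payouts a year.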

There is a large difference. The exploits in code are generally used as attacks and don't affect the normal use of the code. This would be the equivalent of holding the engineering firms that designed the bridge responsible for the terrorist who bombed it.

Wouldn't it be more relevant to consider what happens to engineers who design bridges that collapse when someone blows them up? Pretty sure they get contracts to design replacement bridges.

It would be simple to limit liability to the cost of the software. That would give commercial vendors an incentive to fix their software without overly burdening FOSS.
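The cap proposed above can be sketched in a couple of lines; the function name and dollar figures here are hypothetical, chosen only to show why software given away for free would carry no liability under this rule.

```python
# Sketch of the "liability capped at the cost of the software" proposal.
def capped_liability(damages: float, purchase_price: float) -> float:
    """Liability is the lesser of actual damages and what the customer paid."""
    return min(damages, purchase_price)

# Commercial vendor: $1M in damages, but the customer paid $500.
print(capped_liability(1_000_000, 500))  # 500

# FOSS given away for free: the cap is zero, so liability is zero.
print(capped_liability(1_000_000, 0))    # 0
```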

Except that most FOSS isn't written for free. Companies pay people to write the code. If a company pays you to write a new device driver for Linux, and it turns out there is a bug, then you are liable for your full salary in government fines. Right? What programmer would work under those conditions? The company that paid you would, of course, have no liability, because they gave the software away for free (only charging for the hardware device).

The company would sue you/fine you? The end user did not pay, so he has no one to sue or fine. I would assume your contract would state that the company cannot hold you responsible. Companies are generally responsible for the actions of their employees.

So now we are going to support companies by buying their vulnerabilities for them?

It is worse than that. It is essentially rewarding companies for not taking security seriously.

There is software backed by companies that offer a bug bounty, and software backed by companies that offer none. Having a bug bounty for more software is desirable. But having the government pay it for companies that won't pay it themselves is not the proper solution. A much better solution would be that whenever the government buys software, it preferentially buys from vendors that do offer a bug bounty.

This would make the software being bought more likely to be secure. Additionally, it would exert pressure on the market, driving it in the right direction.

The only situation where the government should pay any bug bounties is when the bugs are in software or services offered by the government itself. For example, it could apply to security problems found in government websites. But if those products were bought from private companies in the first place, the contract should require the vendor to pay the bounty and fix the bug.

This is essentially a government subsidy to software companies that produce crappy code.

Look at Walmart. It pays its employees so little that they have to use government assistance like food stamps and Medicare. Walmart shareholders reap the benefit, and the public is left taking care of its employees.

Here's a better idea - if a company is making software that's critical to national infrastructure, make them liable for any bugs that occur (and for smaller companies, require them to carry insurance up to a certain level of liability).

That's just not true. The internet and world wide web both existed in the early 90s, and neither was critical to national infrastructure at the time.

So then as soon as they became critical, the original authors would have to assume billions in liability? Or would software be exempted if it was not critical at the time it was written? So the liability would only apply to things that were "critical" before they existed? It sounds to me like this hasn't been thought through very well.

It would be fairly easy to have DHS draw up a list of things (physical locations, services, etc.) to designate as critical to national infrastructure. In fact, I'd be shocked if they didn't already have such a list.

The organizations that run these locations/services would have to build a liability clause into all of their software contracts.

Problem solved.

Except the problem isn't solved. Our infrastructure is already underfunded. Making all the software cost ten times as much isn't going to help that. Every upgrade will also need new liability clauses and legal review. So upgrades will be less frequent, and our most critical infrastructure will be running the oldest and crappiest code, often written by companies that no longer exist because they were sued into bankruptcy. The military already learned this lesson: they found that the extremely expensive

There's a couple of logic holes here. First, who wrote that mil-spec software? Was it a contractor or private corporation? Ah, so the real blame for unreliable, expensive software is with some private corporation, not the government, right? As for COTS being more effective, that's great assuming the critical infrastructure can be run on COTS.

As far as this bug bounty, it is a terrible idea. Sorry corporate America, if you want to keep your code private and reap the corresponding profits, you also get to assume

If it was mil-spec, there should have been a pretty stringent acceptance process. Why would anyone sign up to unlimited liability?
- there was an agreed spec
- the client set the acceptance criteria
- they delivered what was in the spec
- triggering acceptance finalizes the contract and their liability is limited

The only viable solution is to assert a cost to the providers of the software. If said cost is linked to such a bounty program, all the better - but you clearly cannot create a scenario in which writing bad code somehow ends up benefiting the software producers.

This sounds like a terrible idea. There are times the government should get involved in something, and times they shouldn't. This is one of those times they shouldn't.

It isn't the charter of any federal agency to shore up the products of private corporations. Corporations should be doing that anyway, and under the typical "free market is awesome" attitude most users here have, the expense of paying for bug discovery and fixes should factor into the corporation's pricing, profits, potential liability (haha) and

When you find the bug, they are just going to throw you in jail like they do with other vulnerability exposers. Then they'll offer you an out - be employed by them permanently at crap wages to avoid prison time.

But the opportunity here is for an organization such as US-CERT, a unit of the Department of Homeland Security, to offer reasonably significant rewards for vulnerability information to be used for defensive purposes. There are a large number of software vendors who don't pay for vulnerabilities, and many of them produce applications that are critical to the operation of utilities, financial systems and government networks.

Why should the government subsidize these businesses? I wouldn't have a problem with it if the program were revenue neutral, meaning the companies had to pay the government to essentially run a bug program for them.

Alternatively, instead of the carrot, how about the stick? Penalize companies that refuse to implement secure design/coding practices, and penalize them separately if their hardware/software turns out to be insecure.

If Homeland Security said, "It is okay, attack our servers, our power grid, and other infrastructure. We'll pay you if you find a vulnerability," then they can't just haul you to jail for attempting it. I always thought "don't mess with the stuff to begin with" was a significant deterrent for most people. Now, you might say, "Fix it before an enemy of the state uses it for truly detrimental means," but then you'd have to argue with brass who would have to admit they were wrong all along.

What? You say that you caught me breaking into the CIA, FBI, the White House and another unnamed three letter agency? Naw, I was just participating in the Government sanctioned Bug Bounty Program. Proudly helping my country protect itself from evil-doers. If you don't believe that then I declare a fatwa on you and I want my Imam, I mean Lawyer.

In cases of murder, the first thing the police do is investigate the spouse, especially if the spouse is the one who, say, came home and discovered their wife had been killed. They are considered the initial suspect.

I would be surprised if anyone who reported a bug wasn't likewise investigated to see what they might have done right after they discovered it. Seems like a person would be opening themselves up to some possible grief doing this.

If I understand correctly, this is about the government running bug bounty programs for vendors that do not? That looks like an incentive for vendors not to run one, since the government will. Unless, of course, we introduce a tax on vendors that don't have bug bounty programs.

Is the government going into the software publishing business? No? Then why should the government be paying for other corporations' mistakes? If anything, it should be fining the corporations, giving them more incentive to find bugs themselves.
We don't need to find a way for DHS to spend more money; we need to find a way to get rid of DHS.