How to manage a closed source, high-risk project?

Secrets are no fun but sometimes they're necessary.

This Q&A is part of a weekly series of posts highlighting common questions encountered by technophiles and answered by users at Stack Exchange, a free, community-powered network of 100+ Q&A sites.

abel is in the early stages of developing a closed-source financial app within a niche market. He is hiring his first employees, and he wants to take steps to ensure these new hires don't steal the code and run away. "I foresee disabling USB drives and DVD writers on my development machines," he writes. But will that be enough? Maybe a better question is: will that be too much?

Trust goes a long way

Virtually all professional developers won't steal your source. It's understood that if you work for somebody else, the employer owns the code that you write. Devs might copy code for reference purposes, but it's highly unlikely they will offer it for sale to anyone else. Getting caught isn't worth the risk.

What's more, distrust breeds distrust. Disabling USB ports and DVD writers will engender a feeling of distrust which will, paradoxically, make it more likely that the developers will copy the code.

By all means add a secrecy clause to your contract, but it's probably unnecessary to highlight it as the most important part of the contract.

Who would buy on the black market?

Also, in the real world, third parties don't want stolen code. The risk is too great. Back when Informix and Oracle were duking it out for the enterprise relational database market in the mid-90s, one of Informix's developers quit to join Oracle (which was quite common), and took a hard drive full of Informix source with him (which wasn't). He told his new boss at Oracle, expecting a warm welcome, but instead he got a security team and an arrest. Then Oracle security called Informix security, and the hard drive went back to Informix without anyone from Oracle having looked at it.

I can't comment on their effectiveness or appropriateness, as I have limited experience with these solutions, but I thought it might be helpful to point them out. Feel free to edit this answer with additional software solutions to data leaks.

Your employees are your real resource

If these programmers can write the software in the first place, then...

THEY DON'T NEED TO STEAL IT.

They can simply rewrite it in a fraction of the time it took to originally develop it. Yes, it's true: developers aren't complete idiots... once they figure out how to do something, they can often remember how they did it.

So, I guess you're just going to have to trust them, or else write the software yourself.

This is actually more of a problem than the responders acknowledge. Much of the value isn't in the code itself; the business processes and details embedded in the code may be what is most sensitive. What if Coca-Cola has a software reporting system that contains its secret formula, or Amazon has a set of classes that, together, reveal its business secrets for keeping costs down? Or the US Air Force has specs for its new fighter's ECMs in code?

I think there is some benefit in limiting access. Of course, you as a manager need to know what you're protecting before you can limit accordingly.

Disabling USB ports? If the developers are allowed to communicate with the outside world at all, be it by carrier pigeon or whatnot, no amount of disabling USB ports nor armies of packet-inspection solutions will prevent the leaking of code if somebody wants to do it.

So as someone wrote, this is not a technical problem but a people problem. Deal with it like any other people problem ^^

But yes, limiting access will of course help, as long as the access isn't required for the work.

There is no way to enforce security in a way that makes stealing code impossible for your devs. There are hacks that use microcontrollers acting as USB HID devices, "blinking out" data via the keyboard LEDs. You will not be able to lock your stuff down enough to make working possible but stealing impossible, especially not for people who are technically skilled and need enough privileges on a machine to develop software.

Take sane measures that control the risk of accidents and minimize the outside attack surface. A USB lockdown, for example, could be valid or not. But keep it at a level that does not make working too cumbersome.

Tell those devs that this is secret and highly confidential. Be careful not to lecture them, and have them affirm the fact. If they say themselves that they are aware of the confidentiality and commit to keeping their mouths shut, that's psychologically different from just telling them five times a day that they have to. (Of course this also needs to be reflected in their contracts.)

Make sure they have a positive working environment. Loyalty and morale can easily be had when people are happy with the work they do. It's when people are treated badly in some way that some get the idea it would be just to run off with your codebase. Communicating distrust is a form of "treating badly", just to say it. Making work bothersome beyond a certain level is too.

In general you won't be able to make stealing or leaking impossible. You need to encourage people not to consider it an option. If they like the time they spend working for you enough to want to keep working for you, then you're about as safe as you can sanely get.

Can't help wondering why only really stupid questions show up in this feature. Previous selected questions bordered on silly. This one is outrageously silly. Serious suggestion: get a programmer involved in selecting questions.

Quote:

This is actually more of a problem than the responders acknowledge. Much of the value isn't in the code itself, but in the business processes....

Trust is essential, but so is taking measures to protect confidential data and code.

Yes, lock down any machine that is being used for product development. Yet also ensure that any developer who joins the project is willing to work within these restrictions before they join the project. It is a project requirement, much like any other project requirement.

It also helps to provide computers that are isolated from the project but are connected to the outside world. Those machines are used to access outside resources, whether for the project (e.g. documentation) or when your programmers are on breaks.

Finally, realise that no security mechanism is perfect. You have to be proactive in monitoring both the technical and social aspects of the project.

Quote:

Can't help wondering why only really stupid questions show up in this feature. Previous selected questions bordered on silly. This one is outrageously silly. Serious suggestion: get a programmer involved in selecting questions.

They're limited by the target audience. They need questions and answers that are interesting to a fairly mainstream audience.

I agree that some of the questions have seemed like they were of the type that seeks attention rather than information.

Security mechanisms are unlikely to work. If someone wants to steal the code, he will find a way.

Disabling USB ports etc will merely serve to alienate your employees.

If you're paranoid about security, you'd be better off leaving everything wide open, while using technology that will live-intercept transmission of your code. The lack of apparent security measures will encourage a would-be thief to use an easily-detectable exfiltration route.
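The idea above can be made concrete with a toy sketch. Everything here is hypothetical for illustration (the log format, hostnames, and the 10 MB threshold are all made up); a real deployment would use a proxy or DLP product, not a script like this. The point is simply that if exfiltration routes are left open but watched, an unusually large outbound transfer stands out:

```python
# Toy sketch: flag outbound transfers whose size is far above a baseline.
# Log fields and threshold are invented for illustration only.
from dataclasses import dataclass

@dataclass
class Transfer:
    user: str
    dest_host: str
    bytes_sent: int

def flag_suspicious(transfers, threshold_bytes=10 * 1024 * 1024):
    """Return transfers larger than the threshold, biggest first."""
    flagged = [t for t in transfers if t.bytes_sent > threshold_bytes]
    return sorted(flagged, key=lambda t: t.bytes_sent, reverse=True)

# Simulated proxy log entries (hypothetical users and hosts).
log = [
    Transfer("alice", "docs.example.com", 120_000),
    Transfer("bob", "filedrop.example.net", 48_000_000),  # a whole repo?
    Transfer("carol", "mail.example.com", 9_000_000),
]

for t in flag_suspicious(log):
    print(f"FLAG: {t.user} sent {t.bytes_sent} bytes to {t.dest_host}")
```

A size threshold alone is crude, of course; the sketch only illustrates why an easily-detectable route is more useful to the defender than a locked-down one.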

You have to trust your people because... you have to. You don't have a choice. They know what's going on. So hire people you can trust and make sure they know what is and is not ok to do with the code, or talk about outside the company.

I've enjoyed this feature in the past, but I agree that it's gotten worse over the last month or two. I wouldn't mind if it were more technical, even if it were technical about things that went over my head. It's nice to learn about new stuff.

At one place I worked, I was told that they were taking steps to make sure employees would not steal code and give it to our competitors. My reply: "Stopping them is dumb. If they use our code, they'll just wind up as fucked up as we are."

Make keeping secrets cool, and enlist the developers in support of defending the secrets.

For example, a number of large employers issue laptops with things like encrypted partitions and even those privacy filters attached to the screen. The idea is to communicate to the developers that what they're working on is crucial to the company's success, important enough to keep secret, and valuable enough to prying eyes that even looking over your shoulder at a coffee shop is to be discouraged. And don't ever let it be regarded as a joke… visitor logs, keycards, perimeter cameras… none of it deals directly with code, but it all draws a clear line between Us and Them, and it puts full faith in the developer to join the effort.

Now you're not management fighting against employees, you're a cohesive organization with defense of your IP built-in at every level, fighting against all comers. It shifts the scenario from an internally-confrontational footing to one of circling the proverbial wagons.

Cos if you try to solve this problem with secretive technical measures alone, you're only going to set up the challenge for your own people. Keep the challenge between your organization and your competitors, where it belongs, and you don't have to treat your own people like the enemy.

I have to agree with ChrisF, having been on BOTH sides of this equation. 13 years ago I was working for a startup. The boss was an OK guy, but he was paranoid and stingy. Over time his obsession with restrictions and security measures wore on the staff. Nobody felt trusted; they all began to imagine there were things going on that were both not in their interest and secret. Work slowed, key people were lost, and the business eventually failed. The ironic part is that the person who eventually DID abuse his position there was the one brought in as the security expert to ride herd on the rest of the development staff. You have to trust someone eventually; best make it the guy who's worthy of it and has at least an emotional investment (which brings up another point: offering your staff collectively a mere couple of percent ownership of a business they contributed key innovations to will NOT buy their loyalty).

OTOH, for the past 10 years I've been running my own software development firm, which sells products, services, and consulting in the financial industry. I've had developers come and go, of course, but I also have a reputation for rewarding people well and sticking to my word, and I have never kept ANYTHING secret. If I know it, they know it. If there is anything they need, they have unfettered access to it. That doesn't mean you have to be an idiot, of course; everyone has always had their own access and only the rights they needed. There are, of course, things like passwords and sometimes a few business details that simply cannot be shared, but if I hire someone, I am entirely sure they're not going to skeef source code. If they did? Well, it's a big industry, but they can't do business in my niche anyway, and nobody is going to invest in a venture based on stolen code.

Mainly its like ChrisF said, trust is a two-way street, you have to earn it and you earn it by extending it.

However, do lock down external media, or require them to encrypt their portable media - this will avoid honest but horrible mistakes such as losing a drive full of code.

I often take code with me to work at home but my drive is encrypted.

Rewriting code from scratch is a non-issue for a developer. More important is preventing information about your business idea from leaking before you enter the market - the idea itself is more valuable than the implementation. Your competitors might take your idea and enter the market ahead of you.

I like the "Trust goes a long way" answer. I would add one more thing to it.

You should be very upfront about what you are trusting them to do for two reasons.

1. Letting someone know that you are counting on them instills self worth. Employees with good morale are less likely to turn on the company. They'll be more loyal just knowing that you value their honesty.

2. Discussing important issues, even if you think it should go without saying, will make it harder for most people to justify deviating from what is expected. It has a way of making it clearer in the employee's mind that what they are about to do is wrong. Often it's just enough to keep them honest even if they're not feeling the best about their job at the moment.

It's like talking to your kids about drugs. They already know they're not supposed to, but for whatever reason just talking about it makes a big difference when the temptation arises.

I think security or trust aren't even what needs to be discussed here. The problem is that way too many companies/people act as if they are developing nuclear weapons. Whether this comes from an inflated sense of ego or something else, I'm not sure.

The question isn't, "How do I keep people from stealing my code?" It's, "Why on earth would anyone want to steal my code?"

Lesson learned (again) - never spend more than 3 minutes on ANY fucking comment on the internet.

Or, make sure you copy and paste your response that took an hour to write to a text editor as backup before you submit it to the mercy of the web server gods. (Heck, even save it to a scratch file in case your computer locks up.) Use the same practices you would on any computer work that you don't want to repeat an hour of your life doing.

Where I work, we have a secret project where we are developing a Continuum Transfunctioner. We are not even allowed to talk about the project's code name, "Cat Napper." (We are doing some cool stuff with the source code. If you want to see it, send me a text.)

We at SmashFusion, have even been known to fire or even terminate employees for talking in their sleep.

Quote:

This is actually more of a problem than the responders acknowledge. Much of the value isn't in the code itself; the business processes and details embedded in the code may be what is most sensitive. What if Coca-Cola has a software reporting system that contains its secret formula, or Amazon has a set of classes that, together, reveal its business secrets for keeping costs down? Or the US Air Force has specs for its new fighter's ECMs in code?

I think there is some benefit in limiting access. Of course, you as a manager need to know what you're protecting before you can limit accordingly.

The problem is that in order to develop software in that scenario, the developer has to actually understand said business process. Then, forget any media - the whole process has to be in their head. The worst part is that companies actually *like* people who carry around solved problems for other companies in their heads. That's what we call experience, isn't it? Someone who has seen what others do, where they went wrong, and who has solved somebody else's problems. Guess what: these are the people who can remember what they did, how they did it, and can do it again. That's why you are paying them. So no, you can't stop them going away with the code. Maybe they won't copy the specifics or the detailed implementation, but they sure know how to replicate a given solution.

There are many industries where this is a way of life - not in regard to trade secrets (which is this guy's issue), but for national security. I find it funny that he is looking at his employees as a potential threat, but I suspect that it is only because that is the threat he can see. If this is a real issue, rather than delusions of grandeur, then he needs to take a defense-in-depth approach.

1. Hire employees whom you can trust.
2. Treat them well enough that they won't want to leave.
3. Make it onerous to share secrets (prison, financial liability).
4. Make sure code is developed on a network with an air gap.
5. Split development across multiple teams with limited visibility.
6. Make sure software is only deployed on systems with an air gap.
7. Don't market your special sauce in public forums.
8. Have good physical security on your site (air-lock doors, no windows, no-man's-lands around buildings, etc.).
9. Have good network security, admins who are 'more' trustworthy than devs, and regular external audits.

Every one of these has a cost in development productivity, start-up costs, and product saleability. It is the classic castle approach to development security, and it only works if your project is worth hundreds of millions of dollars to big customers.

The other option is to just bring the product to market faster - the guerrilla-warfare approach to development. Choose option b :-).

I once worked for a large aerospace company. One time the government used an assembly hall for a 'best and final' selection for a project bid. Each presentation was confidential, technically the gov was just borrowing the hall. The first company presented, and finished in late afternoon. The second company came in to prep the hall. They found someone had left behind one of the thick slide decks from the first presentation. They took it, and subtly modified their bid, staying up all night. Punchline: The whole deck was phoney - a sting operation designed by the first company to trap the second company. The second company was disqualified.

Earlier, a poster told a story about a hard drive being refused. I guarantee you the only way that happened is if the first company found out, called, and told the second that the drive was missing. You will also note that a current class action accuses these companies of collusion in recruiting. A third strong possibility is that the story was a lot more vague and amorphous than the poster knows.

People will steal any damn thing. Period. Compartmentalization and "need to know" are your best bet. If you think about it, the most critical secrets are almost never needed by us code monkeys.


Disagree. I work at a Fortune 500 company and I know for a fact that they have turned somebody in at least once for this kind of stuff. Companies take this seriously, especially, as you pointed out, when they could lose government business as a result.

Well, it's very surprising that the replies here simply say "just trust them; they will steal it if they want to." What he wanted is a "keeping honest people honest" solution, not some draconian policy aimed at blocking every possible hole that exists.

Assuming you trust your employees, did background checks properly, paid them well, treated them with respect, etc. etc. etc., my suggestion is to create an internal network (which isn't connected to the external network at all) for all the development work, and also allow a network that is fully open, which can be used by any device the employee wants, with only minimal restriction. Have color-coded LAN cables marking which is the internal network and which is the external network. Clearly state that the development network should only be accessed by workstations issued by the company, which have only a minimal number of USB ports (for keyboard and mouse), and nothing else.

(Assuming you don't need those I/O devices - if you were doing embedded software work, you would need all sorts of I/O devices for communicating with the hardware, but it seems you don't.)

Make it clear that you are doing this not because you don't trust them - it's for preventing any external attacks to the development machines.


With this solution you have an additional benefit, from the customer's perspective: you make security part of the developer culture. If someone won't work in an environment that reduces the attack surface as a matter of course, they're writing buggy, insecure software. You certainly won't eliminate problems by making people conscious of security, but your product will certainly cause less annoyance to the people who deploy it.

I agree with a lot of the commenters that you need to hire people you can trust. Maybe you don't have the interviewing or people-skills to know that you've hired trustworthy people. That's something you're going to have to work on - locking down their laptops is just ridiculous. If they want to copy the code or ideas they will and there's nothing you can do about it.

What you will accomplish by treating them like children is either building resentment or driving them away. If I'm an ethical person and being trusted matters to me (the kind you want to hire), your treating me as guilty until proven innocent will piss me off. I may stay if the work is interesting enough and the pay is great, but you've lost my passion. Day one. I may even hate your guts. I've worked as an executive at a couple of companies and have seen this play out. I've built teams of loyal, hardworking developers. I've also seen theft (on a colleague's team) - so I know what I'm talking about.

What you should do is have an ironclad contract. This is just good business sense. Find a good IP lawyer and pay them the big bucks they're worth. Your employees will not have a problem signing it if you sit down and explain the contract to them and why it's important to you.

Most people want to do the right thing and have no interest in theft.

The other thing you'll want to do is make sure your code is protected (encrypt, firewall, no Dropbox, local Git repo etc.). This again is just good business sense. If your code is valuable then treat it like you would expensive jewelry. 5 employees at $50K/year - that's $250K. Would you keep $250K in your home without an alarm system and a safe? Unlikely. So lock everything down to prevent OUTSIDE theft. Write up policies around this so your employees appreciate the value of what they're doing. It will make them more careful.
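One small, concrete piece of that lockdown can be sketched with Python's standard library. This is a minimal illustration, not a complete access-control scheme, and the directory name is hypothetical: create the local repository directory so that only its owner and group can enter it, with no access for other users on the machine.

```python
# Minimal sketch: create a repo directory with rwx for owner and group
# only (mode 0o770). Uses a temp dir so the example is self-contained;
# a real setup would pick a permanent path and a dedicated group.
import os
import stat
import tempfile

def make_private_repo_dir(parent, name="repo.git"):
    path = os.path.join(parent, name)
    os.mkdir(path)
    # os.chmod sets the mode directly, unaffected by the process umask.
    os.chmod(path, stat.S_IRWXU | stat.S_IRWXG)
    return path

parent = tempfile.mkdtemp()
repo = make_private_repo_dir(parent)
mode = stat.S_IMODE(os.stat(repo).st_mode)
print(oct(mode))  # typically 0o770 on POSIX systems
```

Filesystem permissions are only one layer, of course - they do nothing against someone already in the group - but they are the cheap, obvious first step before firewalls and encryption.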

Your bigger worry is your code being stolen by people in a country with loose or difficult to enforce international copyright laws. China's unfortunately a good example. Ergo the above.

If you're still struggling with all of this, hire someone like myself, who's got a couple of decades of experience with IP, lawyers, employees, and software development, on contract to help you get it all in place.

Quote:

This is actually more of a problem than the responders acknowledge. Much of the value isn't in the code itself; the business processes and details embedded in the code may be what is most sensitive. What if Coca-Cola has a software reporting system that contains its secret formula, or Amazon has a set of classes that, together, reveal its business secrets for keeping costs down? Or the US Air Force has specs for its new fighter's ECMs in code?

I think there is some benefit in limiting access. Of course, you as a manager need to know what you're protecting before you can limit accordingly.

You deal with that on a few levels:

1. Hire employees you can trust. This is hard to do. So you do the best you can -- screening out people you think might be untrustworthy, going with recommendations from people you already trust, etc. But in general, most professionals you would hire are not going to screw you on purpose. Once you have them hired, you must build on that trust relationship. You let them know that you trust them and show them they can trust you by treating them fairly and acting conscientiously and with demonstrable integrity in everything they observe.

2. Set sensible information security policies. What can and can't be shared outside the company? What requires or does not require an NDA? Where can data be stored? Professionals are able to deal with these issues, but depending on their background, they may not be familiar with all the ways they could inadvertently expose corporate data. And do what you can to make compliance with good policy easy. For example, if they need to get an NDA for something, make sure the people who can authorize it are readily available.

3. Training. Get help from security professionals (your company may have its own or you may need to hire contractors) to get everybody up to speed on what they can do to protect corporate data. Is it OK to plug a commercial USB drive into a corporate computer? Under what circumstances? When is it OK/not OK to run third party software? Is it OK to connect to your private online accounts from inside the corporate network? What should your policy be on passwords, ssh keys, etc?
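To make one of those training questions concrete, a policy rule is easiest to follow when it is enforced mechanically rather than left to memory. Here is a hedged sketch of such a check; the specific rules (12-character minimum, three character classes) are examples invented for illustration, not a recommendation:

```python
# Example only: check a candidate password against a toy policy of a
# minimum length plus at least three character classes. Real policies
# should follow current guidance, which tends to favour length over
# complexity rules.
import string

def meets_policy(password, min_length=12):
    if len(password) < min_length:
        return False
    classes = [
        any(c in string.ascii_lowercase for c in password),
        any(c in string.ascii_uppercase for c in password),
        any(c in string.digits for c in password),
        any(c in string.punctuation for c in password),
    ]
    return sum(classes) >= 3

print(meets_policy("short1!"))            # too short
print(meets_policy("correct horse BT1"))  # long enough, three classes
```

The same "encode the policy, don't recite it" approach applies to ssh-key rotation, USB rules, and the rest: a check that runs automatically beats a rule people are merely told about.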

The Air Force one is a non-issue, at least in terms of protecting it: it would almost certainly be classified by the US government, and any vendor dealing with it would have to have the computer systems accredited by the Air Force security folks. Locking down USB ports, removing CD burners, etc is common practice, as is physically isolating the network.

Disagree. I work at a Fortune 500 company and I know for a fact that they have turned somebody in at least once for this kind of stuff. Companies take this seriously, especially, as you pointed out, when they could lose government business as a result.

This. For example, in the late '90s, Boeing acquired tens of thousands of pages of Lockheed Martin proprietary material, and didn't turn it in right away; as a result, they were stripped of over a billion dollars of government business. At one point there were criminal indictments in the works, although I don't recall if people actually went to jail or not.

The issues relating to your own employees are complicated, as many comments here obviously show. I'd like to add one thing: educate your employees about security. Your major problem will most likely not be employees stealing data, but outsiders getting in. There is a number of software solutions out there to prevent such attacks, but the best way is to educate your workforce, especially since there has been quite a number of targeted spear-phishing attacks recently. Such education would have the additional effect of making your employees aware of the value their product has.


Quote:

I think security or trust aren't even what needs to be discussed here. The problem is that way too many companies/people act as if they are developing nuclear weapons. Whether this comes from an inflated sense of ego or something else, I'm not sure.

The question isn't, "How do I keep people from stealing my code?" It's, "Why on earth would anyone want to steal my code?"

The corollary question is: could anybody get anything from the code even if they stole it? I recall one company that had put in place significant and annoying security measures to protect its source code, without noticing that the code was written in a custom language for a custom processor with a bizarre instruction set that did not exist anywhere else. Even if you got it, there was no place to sell it, as it only ran on one specific box, and the code performed functions so straightforward that any competent programmer could write them, so there was no value to be obtained from reading the source.

My experience has been that very little of the code I write has anything "proprietary" in it. The vast majority is straightforward math and physics that is widely known. Every now and then, there are "empirical factors" that are the results of in-house research. Recently, I came across a 0.18 and a 0.31 that were the only things setting all of their code apart from what could be derived from fundamentals.

Companies need to be more protective of their customer/client lists than just about any algorithm.