Thursday, December 24, 2009

I will be joining some very smart folks for a panel on PCI at Shmoocon next year. Yes, PCI at a hacker con. No, not a Pointy-Haired-Boss type presentation, but a panel discussion of PCI and its impact on our industry. This is part of a larger effort to bring compliance issues to a broader audience, focused on PCI but with insights into the larger compliance realm- look for more presentations and some podcasts in the new year. In this panel I will be joining Michael Dahn, Dr. Anton Chuvakin, and Joshua Corman to discuss everything from the origins of PCI through its unintended consequences and speculation about the future of PCI.

The abstract for this session:

Whether you love it, hate it, or are merely "friends with perks"- compliance is significantly changing what we call security. PCI has been accused of being the Spawn of Satan by some, and yet it has also been credited with advancing security by others. This panel of PCI experts, analysts, and victims will discuss and argue the realities of PCI: its origins, goals, and consequences (intentional and otherwise). PCI is having an impact on priorities, budgets, and personnel, which is being felt throughout the security industry. Unfortunately, there have been few informed discussions of PCI and compliance issues in the technical ranks of the security community. This panel will bring PCI subject matter experts with real-world experience to the technical security professional and hacker audience to discuss, engage, enrage, and argue about what may well be an existential threat to information security as we know it. The diverse viewpoints and experiences of panel members will guarantee a lively and often heated discussion, and will provide a broad base for fielding audience comments, questions, and criticisms. Bring plenty of Shmooballs to this session; you will need all you can get.

As for Shmoocon in general: yes, there will be a Shmoobus. Maybe more than one. There will be great talks, great people, much hilarity, etc. I hope to see you there.

Tuesday, December 1, 2009

US-CERT issued a bunch of new notices today, and one is BS. Not complete BS; there is a real problem with the way SOME web-based SSL VPNs break cross-domain security.

I have three primary problems with this, starting with the title: "Clientless SSL VPN products break web browser domain-based security models". Of course, there is no such thing as a clientless VPN; there are just systems which install the VPN client in your browser (sometimes without user interaction), and a few which use the browser itself as the client. Most of what I see called "clientless" actually installs an ActiveX, Java, or other client in the browser. "Clientless VPN" is a nonsensical marketing term which has no place in a technical discussion.

Next problem: the list of "affected systems" includes systems which are clearly not affected- and while their status is listed as "unknown", the implication is that they may be vulnerable. For the amount of vetting that went into the list, they could have included Microsoft Word and listed its status as "unknown". For example, OpenVPN is listed, but it is an installed application, not web-based- and unless they have completely butchered the description, there is no way OpenVPN is vulnerable to this. Also entertaining is the listing of several Linux distributions, most tagged with "unknown" status, with the notable exception of Red Hat, which is listed as "Not Vulnerable". Odd that they would commit, or even list OSes at all, given the multitude of VPNs which can be configured on a Linux system. Wait, not odd- useless and misleading.

Finally, and most critically, by the time we've peeled back the obvious mistakes and fluff, the full nature and extent of the vulnerability is not clear. After a bit of de-obfuscation and digging, you can probably figure it out. Silly me, I thought that was what CERT was supposed to do for us when they issued these notices.
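For those willing to do that digging, the underlying issue, as best I can piece it together, is this: these web VPNs fetch remote pages and rewrite every URL in them to live under the gateway's own hostname, so the browser sees all proxied sites as a single origin. A minimal sketch of that rewriting (the hostname and the /proxy/ path are hypothetical, not any vendor's actual scheme):

```python
from urllib.parse import quote, urlparse

GATEWAY = "https://vpn.example.com"  # hypothetical gateway host, not any vendor's

def rewrite(url: str) -> str:
    # Rewrite an external URL so the browser fetches it through the gateway.
    return f"{GATEWAY}/proxy/{quote(url, safe='')}"

bank = rewrite("https://bank.example.org/account")
evil = rewrite("https://evil.example.net/attack.js")

# Both rewritten URLs now share the gateway's origin (scheme + host + port),
# so the browser's same-origin policy no longer separates the two sites:
# script delivered through one can reach cookies and content served via the other.
print(urlparse(bank).netloc == urlparse(evil).netloc)  # True
```

That shared origin is the whole problem: the browser's domain-based security model assumes different sites live on different origins, and the rewriting gateway quietly voids that assumption.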

There are two posts on the topic over at Securosis that are worth a read; the first post isn't great, but the comments are. The second is a good clarification of the first.

At first, I was really wound up about Nick's post on the 60 Minutes piece because he seemed to be excusing sloppy "journalism" on the grounds that the value of reaching a wider audience outweighed the problems of questionable reporting. [Part of my reaction was certainly due to my contempt for 60 Minutes; I feel that they don't do investigative journalism, they are what is wrong with "investigative journalism" on television. The fact that 60 Minutes is generally less horrible than anything else in the genre is not comforting.] In case you somehow missed it, there was quite a bit of furor over 60 Minutes' claim that a Brazilian blackout was caused by hackers. Robert Graham had a pretty terse post about this on the Errata Blog, and Rich Mogull did a good job of providing a balanced perspective.

My take on the 60 Minutes bit is that investigative reporting should investigate, and do so with a significant dose of skepticism, and report findings honestly. I also think that we as the audience have a responsibility to be skeptical of the reporting. 60 Minutes' hacker claim could not be backed up conclusively (at least not publicly and on the record), so I believe they should have been honest about that instead of going for the hype. If they had said something like

"There are some conflicting reports as to the true cause of the outage, but we have high confidence in our sources. What may be more troubling than the actual cause of the outage is the fact these systems are so vulnerable to so many attacks, and so poorly monitored and regulated, that even after a major outage the true cause cannot be determined conclusively."

I would have been happy with that. But, that isn't a sexy soundbite. Oh, well, it is television.

I thought Nick's post on the Fudsec blog was good, but it included a fairly flip comment about the ease of mitigating the Aurora vulnerability (585k PDF), which triggered objections. Nick clarified his position on this in a comment to his post. The central idea of the post is an interesting one- that getting customers mad at the negligent utilities and demanding improvements is the way to address the problem of vulnerable private critical infrastructure. I am not sure how likely that is to happen, but that is the capitalist way to do it, and money talks (although it has nearly lost its voice of late).

Where does this leave us? Nick has made some very good points, and I think he has hit a fundamental problem in trying to get the word out to a larger audience than those of us in the security world: how to simplify the issues into concise and understandable language (NOT dumbed down) so that non-professionals can understand it, while not running afoul of appropriately detail-oriented, accuracy-demanding professionals.

This has to get sorted out; too much effort in the security community is spent on navel-gazing, chest-beating, choir-preaching, and other hyphenated silliness. We need to engage and educate people outside our community if we are going to make real progress.

We've heard the "IT is becoming a commodity utility, so IT careers are dead" idiocy from a variety of smart people, including Nicholas Carr and even The Bruce, and there is some truth to it- some parts of IT are becoming commodities, and IT is certainly evolving. Some people have extrapolated these ideas into claiming that careers in IT are dead ends. Now, I've got nothing against the judicious use of hype and hyperbole to make a point, but these ideas fall apart pretty quickly under a little scrutiny. As far as the "death" of the careers goes, these claims aren't even true for actual utilities such as power and water.

Let's start with "commodity". It is certainly true that in IT you can often get similar services from a multitude of sources, but the commodity/utility analogy only goes so far. For one thing, utilities usually offer little or no choice; your water company is the only game in town, unless you dig a hole in the yard. Other utilities do have some competition, but "the x company" is often responsible for "last mile" connectivity regardless of who you send the check to each month. Turning to the product: when I turn the knob on the faucet I get water; when my neighbor turns on her faucet she gets water, too- and it is the same water for the entire area, and whoever needs it, gets it. Same goes for electricity, natural gas, etc. Sure, there are a couple of different pressure/voltage/flow options, but it is all just increments of the same thing.

And as for the electricity analogy, it is crap. "You'll outsource your network the way you outsource electricity"- except NO ONE with a need for stable and reliable electricity outsources it completely. What comes off the wire is garbage; we have to use a variety of devices, from UPSes to power conditioners, to have any faith in what comes out of the wall. Oh, and I don't suppose you've noticed the booming sales of generators to businesses large and small (and to homeowners)- that's because the commodity is not good enough and not reliable enough to trust.

I hear the arguments- "but Jack, my phone number is MY number"- and that is true, but it is still the same capability set with a little personalization. Cable TV falls into this category; your whole neighborhood gets a set of available features, and if you want something unique, you get lots of practice at "wanting", because you aren't getting it. The phone company does offer a lot more than POTS lines these days, but they need a lot of people to do it- and you need people to take their services from the demarc point to something useful.

Moving on to the "... is dead" or "... is irrelevant" nonsense. Start with the obvious: if everyone doesn't generate their own electricity, but instead buys it, the electric company has to hire a buttload of electricians and engineers to make this work. The task is not "dead", it just moved. Moving beyond that, answer this: if something is dead or irrelevant once it is a "commoditized utility", can you explain why you see so many plumbers and electricians on your daily commute? Because things go wrong. Because it has to be installed. Because if you get a "one size fits all" commodity, someone has to make it fit for you. Because someone has to get the various commodities where they are needed and keep them from leaking into unwanted places. And let's not overlook all the plumbers and electricians you don't see, the ones who go to work at the same site every day- plants, retail facilities, hotels, and so on. They have careers in spite of working with utilities. Some have jobs because of the utilities' poor quality and service.

Part of this flawed mindset is human nature, at least the nature of humans who aren't curious or observant- if someone else does something for me, it is automatic, and I can ignore automatic things. Until they fail and I'm screwed because I don't know how it works, so I can't even figure out the right person to call. Here, we're actually on to something, because that describes a lot of what we deal with in IT.

As mentioned earlier, IT is evolving, and some things are being "commoditized". Cloud computing, whatever that means*, is a great example of this. Unfortunately, there is a lot of confusion about cloud computing, and even more misinformation. It will eventually get worked out, but for now, I like being on the sidelines of the cloud game.

The "dead-end" career talk about IT is, however, absolutely accurate- if you aren't ready, willing, and able to work in an evolving environment. On the other hand, if you are working to keep up with your industry and looking ahead, you are probably as safe as anyone in this volatile global economy.

*I actually have a grasp on "cloud" terminology, but it is not my focus. If you want to know about cloud computing issues you are already a reader of Hoff's blog, or you should be.

You may have noticed I didn't mention anything about the impact of commoditization on security, or security's impact on commoditization. That is a set of discussions for another time, but for now let's just go with "What could possibly go wrong?"

Saturday, November 21, 2009

Always accurate, insightful, and irreverent, the Layer 8 blog has another great post up, this time taking aim at the "security metrics" landscape. "The meaning of metrics" has a great take on metrics, and really separates reality from navel-gazing. It also provides some memorable quips and quotes.

Sunday, November 15, 2009

Those nice folks who give money to your company, you know, the customers- whose customers are they? Are they the company's customers, or the salesman's? Or a bit of both? Maybe it is more complicated than that, if your company sells through partners/agents/resellers- now whose customers are they?

And the tricky bit- you aren't trying to secure customer data without everyone involved understanding, and agreeing on, whose customers they are, and who is responsible for the data, are you? That would be a waste of time, wouldn't it?

If you are new at this, especially if you only see it from an information security perspective, this may seem fairly simple. It isn't. Salesmen (real salesmen, as opposed to people who just sell stuff) always have their "Rolodex" with their customers in it. That's part of what you get when you hire a salesman, access to their customer base- and the salesman takes it with them when they go. The salesman's right to take their customer list with them was supposedly codified in law in some states, but regardless of law, the practice has been universal. And now we have breach disclosure and data protection regulations preventing customer information from "leaking", so that magically stops, salesmen readily surrender their livelihoods without a battle (to a salesman, their customer list is their livelihood, make no mistake about that), and we're covered. And those jurisdictions which codified the salesmen's rights to their customers, I'm sure they updated their laws to reconcile the conflicts between the various laws and regulations protecting the salesmen's rights and the customer's data. No state would leave businesses stuck between contradictory laws, twisting in the wind. Things like that just don't happen.

I would like to offer a simple answer, but this is another one where lawyers most likely need to be consulted, the problems discussed, policies drafted, etc. The critical part will be making sure everyone involved knows and understands what the policies are, what legal implications drove the policies, and how the policies will be enforced. And then the policies must be enforced.

I do have a few ideas about this:

- Social Security, credit card, or other account numbers need to be expressly prohibited from entering or leaving via "the Rolodex"
  - No brainer, but it needs to be clear to all involved
- If any information is allowed to enter the company via "the Rolodex", it is only fair to allow it to leave that way
  - If it can't leave, don't let it come in.
  - If it comes in, it came from somewhere else where they are fighting the same battle
- The data is going to leave anyway. Deal with it.
  - Really, deal with it.
- Everyone has to know what is and is not allowed
- Steps need to be taken to control and monitor data

This doesn't excuse the company from doing the right thing whenever possible- but the nature of people, especially salespeople, must be taken into account.

Monday, November 2, 2009

I know, that cool Podcast.com widget over there needs an update. I tried that, but they are having "technical difficulties" at Podcast.com right now. I'll be adding Exotic Liability, Threatpost podcasts, and others, with some details soon- if they get the widget fixed. If not, I'll swap it out for a different widget.

While you're waiting, head over to Pauldotcom and listen to me humiliate myself and several others on their Halloween episode. Not for the faint of heart, the easily offended, or anyone burdened by a sense of decorum. The remaining parts of the podcast were great: tech segments, juvenile yet informative banter, etc.

Friday, October 30, 2009

What changed in the latest "final" version of Massachusetts 201 CMR 17.00? Here's what I see (emphasis is mine):

Under 17.02, Definitions

"Owns or licenses: receives, maintains, processes, or otherwise has access to personal information in connection with the provision of goods or services or in connection with employment."

became

"Owns or licenses: receives, stores, maintains, processes, or otherwise has access to personal information in connection with the provision of goods or services or in connection with employment."

That's a big win- adding that little word "stores" to the mix.

Also in definitions:

"Service provider: any person that receives, stores, maintains, processes, or otherwise is permitted access to personal information through its provision of services directly to a person that is subject to this regulation."

is now

"Service provider: any person that receives, maintains, processes, or otherwise is permitted access to personal information through its provision of services directly to a person that is subject to this regulation; provided, however, that “service provider” shall not include the U.S. Postal Service."

This just reflects the change in definition for those who store data, moving them from the "service provider" category to the "owns or licenses" group. The USPS exclusion seems redundant; the Commonwealth cannot impose regulations on federal agencies (especially that one).

17.03 (2)(f) 2 changed from

"Requiring such third-party service providers by contract to implement and maintain such appropriate security measures for personal information; provided, however, that any contract a person has entered into with a third party service provider prior to March 1, 2012, shall be deemed to be in compliance herewith, notwithstanding the absence in any such contract of a requirement that the service provider maintain such protective security measures, so long as the contract was entered into before March 1, 2010."

to

"Requiring such third-party service providers by contract to implement and maintain such appropriate security measures for personal information; provided, however, that until March 1, 2012, a contract a person has entered into with a third party service provider to perform services for said person or functions on said person’s behalf satisfies the provisions of 17.03(2)(f)(2) even if the contract does not include a requirement that the third party service provider maintain such appropriate safeguards, as long as said person entered into the contract no later than March 1, 2010."

And that's it. No more changes. See the previous version here for reference.

The "Final" (I think this is the third final version, but who's counting?) version of Massachusetts 201 CMR 17.00 was released today. I believe this is really final; I doubt that anyone has the stomach for more of the political process that crafted this regulation. Below is the complete and unedited final version. The changes seem subtle at first glance- I'll follow up once I have time to review and compare.

201 CMR 17.00: STANDARDS FOR THE PROTECTION OF PERSONAL INFORMATION OF RESIDENTS OF THE COMMONWEALTH

Section:

17.01: Purpose and Scope

17.02: Definitions

17.03: Duty to Protect and Standards for Protecting Personal Information

17.04: Computer System Security Requirements

17.05: Compliance Deadline

17.01 Purpose and Scope

(1) Purpose

This regulation implements the provisions of M.G.L. c. 93H relative to the standards to be met by persons who own or license personal information about a resident of the Commonwealth of Massachusetts. This regulation establishes minimum standards to be met in connection with the safeguarding of personal information contained in both paper and electronic records. The objectives of this regulation are to insure the security and confidentiality of customer information in a manner fully consistent with industry standards; protect against anticipated threats or hazards to the security or integrity of such information; and protect against unauthorized access to or use of such information that may result in substantial harm or inconvenience to any consumer.

(2) Scope

The provisions of this regulation apply to all persons that own or license personal information about a resident of the Commonwealth.

17.02: Definitions

The following words as used herein shall, unless the context requires otherwise, have the following meanings:

Breach of security, the unauthorized acquisition or unauthorized use of unencrypted data or, encrypted electronic data and the confidential process or key that is capable of compromising the security, confidentiality, or integrity of personal information, maintained by a person or agency that creates a substantial risk of identity theft or fraud against a resident of the commonwealth. A good faith but unauthorized acquisition of personal information by a person or agency, or employee or agent thereof, for the lawful purposes of such person or agency, is not a breach of security unless the personal information is used in an unauthorized manner or subject to further unauthorized disclosure.

Encrypted, the transformation of data into a form in which meaning cannot be assigned without the use of a confidential process or key.

Owns or licenses, receives, stores, maintains, processes, or otherwise has access to personal information in connection with the provision of goods or services or in connection with employment.

Person, a natural person, corporation, association, partnership or other legal entity, other than an agency, executive office, department, board, commission, bureau, division or authority of the Commonwealth, or any of its branches, or any political subdivision thereof.

Personal information, a Massachusetts resident's first name and last name or first initial and last name in combination with any one or more of the following data elements that relate to such resident: (a) Social Security number; (b) driver's license number or state-issued identification card number; or (c) financial account number, or credit or debit card number, with or without any required security code, access code, personal identification number or password, that would permit access to a resident’s financial account; provided, however, that “Personal information” shall not include information that is lawfully obtained from publicly available information, or from federal, state or local government records lawfully made available to the general public.

Record or Records, any material upon which written, drawn, spoken, visual, or electromagnetic information or images are recorded or preserved, regardless of physical form or characteristics.

Service provider, any person that receives, stores, maintains, processes, or otherwise is permitted access to personal information through its provision of services directly to a person that is subject to this regulation.

17.03: Duty to Protect and Standards for Protecting Personal Information

(1) Every person that owns or licenses personal information about a resident of the Commonwealth shall develop, implement, and maintain a comprehensive information security program that is written in one or more readily accessible parts and contains administrative, technical, and physical safeguards that are appropriate to (a) the size, scope and type of business of the person obligated to safeguard the personal information under such comprehensive information security program; (b) the amount of resources available to such person; (c) the amount of stored data; and (d) the need for security and confidentiality of both consumer and employee information. The safeguards contained in such program must be consistent with the safeguards for protection of personal information and information of a similar character set forth in any state or federal regulations by which the person who owns or licenses such information may be regulated.

(2) Without limiting the generality of the foregoing, every comprehensive information security program shall include, but shall not be limited to:

(a) Designating one or more employees to maintain the comprehensive information security program;

(b) Identifying and assessing reasonably foreseeable internal and external risks to the security, confidentiality, and/or integrity of any electronic, paper or other records containing personal information, and evaluating and improving, where necessary, the effectiveness of the current safeguards for limiting such risks, including but not limited to:

1. Taking reasonable steps to select and retain third-party service providers that are capable of maintaining appropriate security measures to protect such personal information consistent with these regulations and any applicable federal regulations; and

2. Requiring such third-party service providers by contract to implement and maintain such appropriate security measures for personal information; provided, however, that until March 1, 2012, a contract a person has entered into with a third party service provider to perform services for said person or functions on said person’s behalf satisfies the provisions of 17.03(2)(f)(2) even if the contract does not include a requirement that the third party service provider maintain such appropriate safeguards, as long as said person entered into the contract no later than March 1, 2010.

(g) Reasonable restrictions upon physical access to records containing personal information, and storage of such records and data in locked facilities, storage areas or containers.

(h) Regular monitoring to ensure that the comprehensive information security program is operating in a manner reasonably calculated to prevent unauthorized access to or unauthorized use of personal information; and upgrading information safeguards as necessary to limit risks.

(i) Reviewing the scope of the security measures at least annually or whenever there is a material change in business practices that may reasonably implicate the security or integrity of records containing personal information.

(j) Documenting responsive actions taken in connection with any incident involving a breach of security, and mandatory post-incident review of events and actions taken, if any, to make changes in business practices relating to protection of personal information.

17.04: Computer System Security Requirements

Every person that owns or licenses personal information about a resident of the Commonwealth and electronically stores or transmits such information shall include in its written, comprehensive information security program the establishment and maintenance of a security system covering its computers, including any wireless system, that, at a minimum, and to the extent technically feasible, shall have the following elements:

(1) Secure user authentication protocols including:

(a) control of user IDs and other identifiers;

(b) a reasonably secure method of assigning and selecting passwords, or use of unique identifier technologies, such as biometrics or token devices;

(c) control of data security passwords to ensure that such passwords are kept in a location and/or format that does not compromise the security of the data they protect;

(e) blocking access to user identification after multiple unsuccessful attempts to gain access or the limitation placed on access for the particular system;

(2) Secure access control measures that:

(a) restrict access to records and files containing personal information to those who need such information to perform their job duties; and

(b) assign unique identifications plus passwords, which are not vendor supplied default passwords, to each person with computer access, that are reasonably designed to maintain the integrity of the security of the access controls;

(3) Encryption of all transmitted records and files containing personal information that will travel across public networks, and encryption of all data containing personal information to be transmitted wirelessly.

(4) Reasonable monitoring of systems, for unauthorized use of or access to personal information;

(5) Encryption of all personal information stored on laptops or other portable devices;

(6) For files containing personal information on a system that is connected to the Internet, there must be reasonably up-to-date firewall protection and operating system security patches, reasonably designed to maintain the integrity of the personal information.

(7) Reasonably up-to-date versions of system security agent software which must include malware protection and reasonably up-to-date patches and virus definitions, or a version of such software that can still be supported with up-to-date patches and virus definitions, and is set to receive the most current security updates on a regular basis.

(8) Education and training of employees on the proper use of the computer security system and the importance of personal information security.

17.05: Compliance Deadline

(1) Every person who owns or licenses personal information about a resident of the Commonwealth shall be in full compliance with 201 CMR 17.00 on or before March 1, 2010.

REGULATORY AUTHORITY

201 CMR 17.00: M.G.L. c. 93H

I expect that additional information will be posted to the OCABR site soon. NOTE: they still have the old version of the regulation posted.
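To make one of the requirements above concrete: the lockout provision in 17.04(1)(e) boils down to counting failed attempts per user ID and blocking further tries past some threshold. A minimal sketch of that logic (the class name and the threshold of five are my own choices; the regulation specifies neither):

```python
from collections import defaultdict

MAX_FAILURES = 5  # policy choice; 201 CMR 17.04(1)(e) sets no specific number

class LoginGuard:
    """Track failed logins per user ID and block access past a threshold."""

    def __init__(self):
        self.failures = defaultdict(int)

    def is_locked(self, user_id: str) -> bool:
        return self.failures[user_id] >= MAX_FAILURES

    def record_attempt(self, user_id: str, success: bool) -> None:
        if self.is_locked(user_id):
            return  # already locked; ignore further attempts
        if success:
            self.failures[user_id] = 0  # reset the counter on a good login
        else:
            self.failures[user_id] += 1

guard = LoginGuard()
for _ in range(5):
    guard.record_attempt("jdoe", success=False)
print(guard.is_locked("jdoe"))  # True: blocked after repeated failures
```

Note the regulation's hedge, "or the limitation placed on access for the particular system"- the threshold, the reset behavior, and whether lockouts expire are all left to the implementer.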

Monday, October 26, 2009

I recently listened to a panel discussion on the regulation which shall not be named and heard someone say something stupid (amazing, I know). He tossed out some very large numbers for the dollars Hannaford Bros has lost and will likely lose in the future due to their breach- he said it could total up to one billion dollars over time, but that it could have been prevented with an expenditure of "only" ten million dollars. I'm with him so far, even if I am skeptical of the accuracy of some of the figures. Then he said that "ROI is the answer to your question" and I lost it. This has nothing to do with ROI; there is no such thing as Return on Security Investment- that's what led to the development of FOI, a real metric. But back to the case in question: losing a billion because you didn't spend ten million has nothing to do with ROI. If you must play acronym bingo, it is a case of LoFtI (Loss on Failure to Invest). Although LoFtI itself is bad, it is a valuable asset in the ITYS (I Told You So) budgeting process (assuming your company survives the loss). So, what if they had spent the ten million and nothing happened? There's no tangible return on that. What if you spent the ten million and something bad happened anyway? That is FOI.

Can we say Hannaford didn't spend "enough"? While some make that argument, I certainly will not. How about the opposite- can we say Hannaford (ChoicePoint, TJX, Heartland, et al.) spent too much? Well, not TJX, but that is a story best told over adult beverages. But for the rest, there is a strong argument to be made for this, because what they spent didn't prevent breaches, and thus was a waste of resources (unless the expenditures prevented other breaches- but we can't really prove the negative). At least we could argue that Hannaford and others spent money in the wrong places. Yes, I'm talking in circles, which is all you can do if you talk about security solely in terms of money. Security is about exposures, vulnerabilities, mitigations, and much more. Of course security costs money, but so does marketing.

Marketing, you say? Yes, let's talk about marketing in comparison to security. Marketing people try to provide the most effective programs possible for the money spent, and can measure the results in terms of leads per dollar, and then dig deeper into closing ratios, margins on closed deals, etc. That is measurable ROI. That kind of ROI can help steer effective future actions and expenditures. That kind of ROI doesn't exist in information security. (By the way, I am well aware that not all marketing expenses deliver measurable ROI).
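To spell out what that measurable marketing ROI looks like (every figure below is invented for illustration), the arithmetic is trivial- which is exactly the point, since security has no equivalent "return" column to plug in:

```python
# Toy marketing ROI calculation; every figure here is invented.
spend = 50_000.0           # campaign cost, dollars
leads = 1_000              # leads the campaign generated
close_ratio = 0.05         # fraction of leads that become closed deals
margin_per_deal = 4_000.0  # gross margin on each closed deal

gross_return = leads * close_ratio * margin_per_deal  # 200,000
roi = (gross_return - spend) / spend                  # 3.0, i.e. 300%

print(f"leads per dollar: {leads / spend:.3f}")  # 0.020
print(f"ROI: {roi:.0%}")                         # 300%

# Security spending has no analogous gross_return: a breach that never
# happened produces no number to put on the top line of this calculation.
```

Run the same formula for a security budget and the gross_return line is a blank- which is why "ROI" talk in security so quickly degenerates into hand-waving.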

I'm not suggesting that money isn't important, or that your security efforts shouldn't have value- but I am saying you cannot tell how blue the sky is with a yardstick.

Monday, October 19, 2009

I should be over it by now, but I'm not. I can accept that 201 CMR 17.00 has been reduced to a feeble checklist which only provides real security in the form of political cover for OCABR, the Office of Consumer Affairs Abandonment and Business Regulation. I cannot accept OCABR's behavior during the process, however. The hearings were scheduled for weekday mornings in an inconvenient location in downtown Boston, an arrangement guaranteed to skew attendance to those with a business need or justification. The location and timing of the events were very effective at keeping average citizens (aka victims) from attending or speaking. I have to hand it to AIM, they did an outstanding job of educating their members and rallying the troops- and their arguments carried the hearings. Of course, it is generally easier to win when you are largely unopposed.

The first hearing I attended was in a too-small room; many people were left standing, and more couldn't get in and left, or left once the sweltering and stagnant air in the room became too much for them. The most recent hearing was in a slightly larger room, but still nowhere near adequate- a failure that was completely predictable, unacceptable, and avoidable with reasonable planning. In such situations you might expect those responsible to be apologetic for their failures; you would be mistaken to expect such from Undersecretary Anthony and her team. Just another embarrassment for the beleaguered Commonwealth.

I am also astounded that in the Commonwealth of Massachusetts, one of the most openly hostile business environments in the US, OCABR chose to abandon their responsibility to protect consumers and repeatedly caved in to business demands. Come on guys, be consistent in your hostilities.

The OCABR welcome page states "At OCABR, we are committed to protecting consumers through consumer advocacy and education." If you happen to believe that, please contact me about some sure-fire business opportunities I have available for a nominal investment. Prospectus is on display in a locked file cabinet in the dark basement of a local planning office. Disregard the "Beware of Leopard" sign.

Estimated costs from bad things happening to your important business information

This is a simple, easy-to-understand introduction to information security, focused on the small business. There is nothing earth-shattering here, just the basics. You could easily pick apart some of the oversimplifications or other shortcomings in this document- but that misses the point; this is not for the seasoned infosec or IT professional, it is a tool to help us get the message out to those who need it.

Take a look at it, and spread the word to those who can benefit from it.

Sunday, October 11, 2009

It's been a while since I wrote about NAISG- so it must be time for an update. The original chapter in Boston has just kicked off its seventh year, and there are several other chapters running and a few forming. The current list includes:

Atlanta, GA

Boston, MA

Chicago, IL

Connecticut River Valley

Dallas, TX

Houston, TX

Midland, MI

Orlando, FL

Seattle, WA

Washington, DC

Bangalore, India

A few highlights of upcoming meetings:

This week's Atlanta meeting is on Wednesday night, October 14 and features "The Girls of Errata", Elizabeth Wharton, VP Legal Affairs & Business Development, and Marisa Fagan, Security Project Manager. The presentation is a Case Study Analysis: Social Networking ID Theft – Who You Gonna Call? The meeting will be at Taco Mac Lindbergh, and starts at 7pm. This pair delivered an outstanding presentation at Security B-Sides Las Vegas this summer and this one shouldn't be missed. More details on Andy Willingham's Andy, IT Guy blog.

The Boston meeting will be the following night, Thursday the 15th, and will feature The Pursuit of Security "Happyness," presented by Mike Rothman, SVP Strategy & Chief Marketing Officer for eiQ Networks, Chief Blogger at Security Incite and author of the Pragmatic CSO. The meeting will be at the usual location, Microsoft's Waltham offices. Details at the NAISG Boston site.

And [drumroll, please] NAISG's newest chapter, in Houston, will hold their inaugural meeting on Monday, Nov. 2. The presentation will be "Breaking Down the Enterprise Security Assessment" by Michael R. Farnum, CISSP. There will be "Eating and Mingling", the Official Chapter Kickoff and a book giveaway as well as the presentation and Q&A. The chapter will hold meetings at the Houston area Microsoft offices. Details and directions are on their NAISG page.

Note- even though Microsoft provides a venue for many of the meetings, NAISG is not a "Microsoft" group. Presentation topics span the spectrum of Information Security, regardless of venue- local Microsoft offices are simply generous with offers to provide meeting space for NAISG (and many other groups), and many chapters take advantage of this.

Saturday, October 3, 2009

Starting with the fundamental idea that information security is supposed to "secure information", we first need to determine what information must be protected. Regulations may help specify some of it, but there is much more information to protect in your environment than what is required- certainly confidential patient data and customer financial records must be protected, and not just because HIPAA or PCI DSS require it. Your organization may also have trade secrets, marketing campaigns, merger plans or other information which should be protected regardless of regulatory imperatives.

A basic rule of protection is that you must know what you have and where it is before you can protect it- even if the folks at MA OCABR can't figure this out. It doesn't matter if you need to defend jewelry from theft or credit card numbers from loss, you have to know where they are before you can protect them- so identifying the information you must protect is a logical first step towards both security and compliance.

The information to be secured will vary by organization and change over time, and therefore will require a flexible and versatile identification method. One effective approach is to start by asking three questions about the information to be protected:

How does the information enter the environment?

Identify every point of entry for the information.

Include the origins of internally created information.

Where is the information stored and accessed internally?

Not simply where it is stored, but also where it is used.

Not just where it is supposed to be, but where it really is stored and used.

How does the information leave your organization?

Map every egress point, including submissions to any outside organizations.

Note that you will have to account for remote workers, road warriors, and other "insiders" who store and access information while "outside".

Now for the truly informative step: connect the dots. All of the dots. Map all of those entry and creation points to the storage points to the use points, and then to the egress points. You will likely discover paths and storage locations previously overlooked; you may even need to go back and re-answer the three questions armed with your new insights.
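The exercise can be sketched in a few lines of Python. The data type and locations below are hypothetical examples for illustration, not a prescription for any real environment:

```python
from collections import defaultdict

# Answers to the three questions, recorded per data type.
# (All names here are invented examples.)
entry_points = {"cardholder data": ["web checkout", "phone orders"]}
storage_points = {"cardholder data": ["orders DB", "CSR spreadsheets", "backup tapes"]}
egress_points = {"cardholder data": ["payment processor", "fulfillment partner"]}

def connect_the_dots(entry, storage, egress):
    """Map every entry point through every storage point to every egress point."""
    flows = defaultdict(list)
    for data_type in entry:
        for src in entry[data_type]:
            for store in storage.get(data_type, []):
                for dest in egress.get(data_type, []):
                    flows[data_type].append((src, store, dest))
    return flows

for data_type, paths in connect_the_dots(entry_points, storage_points, egress_points).items():
    print(f"{data_type}: {len(paths)} paths to review")
```

Even this toy example yields a dozen paths for one data type- which is exactly the point: the number of places data can travel is almost always larger than anyone expects.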

With this exercise complete you can pick up the ClueBat and start cracking heads- er, begin to build a plan for both securing the information and meeting your compliance goals. Streamlining the information flow and reducing the number of storage points would be good starting points; these will reduce your exposure and simplify future security and compliance tasks.

Monday, September 28, 2009

In penance for the somewhat unpleasant tone of my last post, I offer this for your consideration.

A few months ago I was at an event where I exchanged pleasantries and made small talk with someone who is, shall we say, "not well respected" in some circles. I was later asked why I didn't just ignore the person. At the time I just shrugged it off, but here are a few reasons why I try to engage people, even when it takes some effort.

First, there's the decency/civility issue. The fact that someone else is a raging [insert preferred expletive/epithet here] does not give me permission to act like a [said expletive/epithet]. I realize that both "common courtesy" and "common decency" are almost as rare as common sense, but that does not excuse me from exercising them.

Second, just because someone is a [one of those] doesn't mean they aren't capable of intelligent conversation or providing a unique perspective on an issue. While it is generally more pleasant to learn from friends, it is often more insightful to view issues from a different perspective.

Third, the world is full of [them], learning to deal with [them] rationally is a necessity.

There certainly are some people who have crossed a line, established a pattern of behavior, or otherwise proven themselves beyond the range of civil discourse. At that point, avoiding them is usually better than arguing with them- messing with them will either give them exposure, make you look like a [you know], or both.

Finally, there are some people who deserve to be called out. Just make sure you have examined the skeletons in your closet carefully and thoroughly before taking this step.

Now, if only I could follow my own advice better...

Jack

P.S. Don't believe any of this? OK, then try acting civil and consider it a social engineering exercise. The older I get, the less I care about motivation and the more I care about results.

Thursday, September 24, 2009

John Godfrey Saxe once said "Laws, like sausages, cease to inspire respect in proportion as we know how they are made." [No, it was not Otto von Bismarck; and yes, that is the quote] I can't say much for "inspiring respect", but watching the creation of Massachusetts' 201 CMR 17.00 has been more like being a vegan and having to watch the manufacture of blood pudding.

Warning: harsh and unpleasant opinions and observations ahead.

On Tuesday I attended the latest public hearing on the data protection regulations, and it was not pretty. First up, the hearing was moved at the last minute- to a larger room, but still nowhere near large enough. It was pretty pathetic: given the interest in 201CMR17.00 and previous hearings' overflow attendance, I had hoped they would be prepared. Instead, dozens were left standing and the room quickly became uncomfortably hot and stuffy. To me, this showed a lack of respect for the attendees and citizens of the Commonwealth. Maybe it was part of OCABR's plan to keep the proceedings short; if so, it didn't work. Sure, a handful of people decided not to speak at the end, but a lot of us did testify. That's where it got ugly, and it got ugly fast.

First up was Scott Schafer, Chief of the Consumer Protection Division of the Mass. Attorney General's Public Protection & Advocacy Bureau. That title should give you a heads-up for what's coming. Mr. Schafer read a prepared statement that could only have come from an unholy alliance of Commonwealth PR flaks and attorneys. He spoke for quite a while without saying much- but it appears the AG's office likes the latest version of the regs, undoubtedly because they are vague enough to prevent the AG's office from ever having to prosecute except in the most egregious cases. I was particularly impressed with his lack of awareness of the data loss landscape, highlighted by his reference to the TJX breach putting thousands of consumers' records at risk. First, 96 million is a lot of thousands. Second, fraudulent charges were made on compromised accounts- but he's just an attorney and so can't be expected to respect the significance of linguistic subtleties. Not that the difference between "at risk" and "exploited" is subtle. [Note that once again I am forced to give someone the benefit of the doubt by assuming them ignorant, because to assume otherwise would mean the statements were intentionally misleading].

Speaking of uninformed, many speakers who followed Mr. Schafer raised their objections to the exclusion of many Commonwealth agencies and offices from 201 CMR 17.00. Much weeping about it, actually. Too bad the whiners were so oblivious to the regulatory landscape that they were unaware of Executive Order 504, mandating security and confidentiality of personal information for the Executive agencies of the Commonwealth. It is my understanding that EO 504 has not been repeatedly weakened or delayed; state agencies are dealing with it now.

There were actually several speakers with very specific concerns over definitions, requesting clarifications and tuning of individual sections of the regulations. Those in this group have my respect, even when I disagree with them.

Of course we had a few of the "I'm here to be heard but have nothing to say, look at me, look at me" minority you get at any gathering of people. At least none of these bloviators said "where are the TV cameras?" Well, not aloud. [There weren't any].

Another set of speakers fell into the category of "we need an exemption for our industry". There were a handful of these; I had no sympathy for any- except the public service orgs, and I doubt they will get any relief.

I did testify, and if you read my last post on the topic you will have a good idea what I said. I highlighted a few of the key weakenings of the regs that concerned me, such as:

Removal of the requirement to monitor the CISP

Removal of the requirement to minimize the amount of PII retained

Removal of the requirement to identify and classify PII

Removal of the requirement to have a written PII access policy

Evisceration of the encryption requirements

And misleading information regarding encryption in the FAQ issued by OCABR

Then I reminded everyone that 201CMR17.00 was mandated by 93H, which was stuck in legislative stupor until the TJX incident- and I asked if full compliance with 201CMR17.00 would have prevented that breach. My answer was no, the initial compromise vector was WEP, and with the weakened encryption requirements WEP could be argued to be adequate. I then opined that 201 would not have prevented the Heartland or Hannaford breaches, and pointed out the lack of web application security guidelines to protect PII. I reminded the panel of the credit and debit cards re-issued in recent months, all due to breaches. I raised the 263,471,744 lost records (as of that morning) on the Privacy Rights loss list- not counting the "unknown number of records" incidents on that list. I suggested that the lack of strong guidelines, the lack of established penalties, and the repeated changes and delays made for a lack of real risk for non-compliance with their risk-based scheme.

In closing, I may have indulged in a bit of hype and hyperbole by saying that the regulations provide enough leeway and plausible deniability that they would simply be a regulatory and policy-writing burden on those who do business in the Commonwealth, without significantly improving security- and if that was the case, the Commonwealth was wasting time and money by imposing a needless and pointless burden on business. I ended by observing that security is hard, and negligence is easy- until you suffer a failure.

One later speaker remarked that he disagreed with me and said that he was confident that any business making a good faith effort at compliance with 201 CMR 17.00 would be more secure. I really wished I had a unicorn to give him. While what he said was technically true- we are where we are because organizations have repeatedly failed to make "good faith" efforts at security. "Good faith" without a big stick backing it up is nothing but unicorn droppings.

Tuesday, September 22, 2009

Where do you get information on products for your environment? There are a lot of options- in fact there is no shortage of people with opinions who will happily share. Now, the hard bit: where do you get information you can rely on for products for your environment? There are a lot of analyst firms, labs, websites, magazines, etc. that generate product comparisons, product evaluations, market analysis, and even the ubiquitous "shootouts" (which generally make me want to shoot something- or someone). Sadly, most of it is crap for a variety of reasons: tainted (or at least made suspect) by vendor sponsorships, failure to define parameters and procedures, misconfiguration of tested systems, a fundamental misunderstanding of the products/market segment, and so on. Some reports crank out a lot of data, but return very little information, and nothing you can act on. Sometimes you can parse the raw data yourself and come up with better conclusions, or at least ones more relevant to your needs.

There are very good resources available, but you need to do a lot of filtering to find reliable sources of useful information. That is why personal recommendations are so valuable, and part of the appeal of real-world gatherings such as user groups, and virtual water coolers such as the Security Blogger's Network and the Security Twits- because real people are behind the answers.

These thoughts were triggered when I was fortunate enough to get previews of two reports released this week by NSS Labs. I hate to sound like a fanboy, but they have really put some thought into their analyses of endpoint security for web threats. Actually, the full title is "Endpoint Security, Socially Engineered Malware Protection, Comparative Test Results." That's a mouthful. There are two similar reports, one for consumer products and one for corporate products. These are not "anti-virus shootouts" or anything vague. The reports define a specific problem (web-based malware downloads) and define their testing methodology, including steps to ensure consistent testing. The testing cycles were repeated, and they used live systems for testing, not canned data sets. When the tests were complete and validated, the data showed some interesting things, both general and specific. Global observations include the increasing importance of reputation-based services in the cloud, and that no matter what anyone says, anti-malware packages are not "commodities"- there are significant differences in performance between the tested systems. That leads to the specific: the products which performed best dramatically outperformed the worst for the specific threats tested.

The consumer products report is available free (registration required); the corporate products report is not free- but depending on your environment, the $1800 price tag could be trivial compared to the cost of making a mistake in purchasing endpoint protection products. You can extrapolate some things from the consumer report, but the corporate version includes some additional observations on ease of management, and there are real differences in performance between corporate and consumer products.

When you need information, be skeptical, but keep looking- there is good information out there.

Monday, September 14, 2009

Toilets, auto emissions, Hunter S Thompson, all that 201 CMR 17.00 stuff, and now bathrobes- I know, but hear (read?) me out on this one.

I tend to be cheap about travel, all I usually want in a hotel room is a halfway decent bed, a functional shower, and relative cleanliness. Occasionally, however, I stay someplace without a complimentary "Hookers and Truckers" floor show in the parking lot. Sometimes, I end up staying in nicer places, and on rare occasions I stay in very nice places (this generally involves someone else's money, or a complimentary upgrade due to a colossal screw up by the hotel). How do you know when you are in a "luxury" hotel (or at least a luxury suite)? It isn't just the big things (big rooms, big TVs, big bathtubs, etc.) but the smaller things like mints on the pillow and the sure sign- the fluffy bathrobes.

So what's the point? You (or someone) has shelled out a lot of money, and the hotel wants you to feel like you are really getting your money's worth- because they know if you don't, you will not do it again. It is only fitting: if you spend a lot of money, you should know where it went and feel good about it.

Thankfully, the people who control security budgets don't expect to see where all that money has gone, nor do they expect to feel good about it... oh, wait. But we already do a great job at keeping people informed and... um, strike two.

It's different you say, they *have* to spend money on security. No, they do not. There may be a wide variety of factors which compel them to spend the money, but especially in this economic climate (hey, there's an area where some Global Warming is needed), financial pressures force some hard decisions. No matter what the regulations say, if the choice is between making payroll (and thus existing into next week) and anything else, "anything else" better be pretty darn compelling. Assuming your situation is not that dire, you should still think about the visibility of your security efforts, and whether the organization gets a good feeling from it. I am not advocating that your IT and security team(s) start offering a bed-time turn-down service to the boss (actually, I strongly advise against that) but think about what visible benefits your organization gets from their IT and security budget. If the answer is "not much", start thinking about how to change that. Don't bore people with too much detail or too frequent updates, but find a way to make your work visible (in a good way, not just as the jerks who always say no).

Tentative plans call for a departure from the Northern Virginia area, gathering Security Twits and others on the way to Boston, then heading into the Great White North. [I REALLY hope it is not so white in early October].

We have learned a few things from the previous adventures, and thus we hope to make new and exciting mistakes this time. If nothing else comes up, a border crossing should provide plenty of opportunity for entertainment.

Once again a big thank you goes to my employer, Astaro, for sponsoring the trip.

Friday, September 4, 2009

[I am occasionally contributing to the Corporate Overlords' blog, including a version of the post below. I am posting a version here, too, because the snark level on the official version was a bit low for my tastes].

Some people seem to be confused about compliance- some hate it, a few like it (I worry about these), and some really like to argue about it, especially when it comes to PCI-DSS. PCI-DSS is the much-maligned Payment Card Industry Data Security Standard, a set of requirements for companies which process credit card data. Full documentation is available from the PCI website. The standard is currently 72 pages, not a quick read- and that may be part of the problem; an amazing number of people like to argue about it without ever actually reading the beast. But then again, facts only serve to screw up a perfectly good uninformed rant.

I believe the root problem is that many people confuse being compliant with being secure. While they may be complementary goals, compliance and security are very different. Being compliant with a "security" standard or regulation does not make you secure, and chasing compliance first approaches the problem from the wrong direction- focus your efforts on being secure, then align with your compliance requirements, and you will end up with a more secure, sustainable, and affordable environment.

Even people who should know better have been confused by this (or lied and claimed to be confused); recently Heartland CEO Robert Carr said in an interview with CSO Online that he believed PCI compliance meant that Heartland was "secure". We all learned that Heartland wasn't secure when they suffered the "Largest Data Breach Ever". The reactions to Mr. Carr's comments were strong and swift, Rich Mogull and Mike Rothman were among the many people who took exception to Mr. Carr's statements about compliance and security- and eloquent though their responses were, the controversy Mr. Carr's comments sparked only serves to highlight the problem.

Part of the confusion comes from the different security postures of organizations before they begin their compliance programs. For a company with poor security and a lack of organizational awareness of security standards, becoming PCI- (or whatever)- compliant can introduce many positive changes and dramatically improve the overall security of the organization. On the other hand, if an organization already has a well established and effective security posture, becoming compliant should be fairly easy, BUT, it could result in losing focus on security as attention shifts to compliance. Worse still, if an organization has done a thorough risk assessment and focused their efforts accordingly, some regulations may require them to divert resources to addressing requirements which are not aligned with actual risk to the organization, effectively reducing their security.

Another problem with compliance is that while most security professionals understand that regulations define the minimum security standard, many outside of the field believe that compliance is all that you need to do to be secure- thus confusing a security baseline with a finish line. Or, maybe they don't confuse them, but a scrap of paper gives them cover to say they have done "enough".

In the absence of standards and regulations it is often easier to grasp that security is a process, not something you "are" or "aren't", and should be tailored to fit the situation. Unfortunately, it is also common for organizations to neglect security unless they are required to comply with some regulations or laws.

Finally, complaining about PCI, HIPAA, or any other regulation doesn't change the fact that we need to comply. Go ahead and work to change the laws or regulations you find onerous- but complaining is no substitute for an ongoing assessment of your environment, securing it as appropriate, and mapping your security posture to meet compliance requirements.

Saturday, August 29, 2009

If you are a CISSP, or hold any other (ISC)2 certification, read on- the elections are coming up. Just like in real life, I don't want to hear you whine about the state of things if you don't take the time to vote. There are three candidates who are not on the official slate, they each need 633 "signatures" to get onto the ballot. Jack Holleran, Claude Williams, and Seth Hardy are trying to get on the ballot and I encourage you to look into their positions and sign/vote as you feel appropriate. Remember, signing a petition does not commit you to voting for the candidate, it simply helps get them on the ballot.

Seth has a website outlining his positions. I took a look, liked what I saw, signed his petition, and plan to vote for him. Seth is an active and engaged member of the security community, the kind of person I believe may be able to help steer (ISC)2 in a direction I would like to see.

"Signing" actually means sending an email from the address of record with (ISC)2 (or including the address of record in the email), including your member number, and stating that you are signing the candidate's petition. (This assumes you are a member in good standing).

Friday, August 28, 2009

I need time to calm down before commenting further on this, but I believe this Slate article may be one of the stupidest and most irresponsible things you will read on technology this year.

On the other hand, this 147 KB, 20-page PDF from NIST, their draft of NISTIR 7621, Small Business Information Security: The Fundamentals, is one of the best things I've seen recently. Nothing earth-shattering, but it is a very good document on small business security. It is readable, explains the rationale for its recommendations, and while 20 pages isn't short, it is a quick read. By the way, some people wonder why I dwell on small businesses so much; this quote from 7621 may help you understand:

"In the United States, the number of small businesses totals to over 95% of all businesses. The small business community produces around 50% of our nation’s Gross National Product (GNP) and creates around 50% of all new jobs in our country. Small businesses, therefore, are a very important part of our nation’s economy. They are a significant part of our nation’s critical economic and cyber infrastructure."

Thursday, August 27, 2009

I don't usually just highlight someone else's blog posts, but sometimes...

A couple of weeks ago Jeremy over at PacketLife updated his cheat sheets, and they are great. Then, John at the Security Monks blog put together a huge list of cheatsheets including those from PacketLife and more from SANS, OWASP, and several individual contributors. Check them out, there is a lot of great information out there.

Tuesday, August 25, 2009

It has been a few days since the latest amendment and delay- er, evisceration- of 201 CMR 17.00 was announced, and it is time to take another look and give it a fair review. Besides the raw documents I recently posted, I strongly urge you to head over to David Navetta's post at InfoSecCompliance.com; he makes some very good points and clears up several changes. While you're there, review their redlined PDF version of the regulations- I think you'll agree that red is appropriate given the way 201CMR17.00 has been butchered over time.

There are several points which frustrate me in the updated version, but I will limit my comments to a few (I tried for a few, it appears several is a better description of the result). Note that emphasis in text excerpts is mine, added to highlight my points.

First, the definition of encryption has changed from:

"the transformation of data through the use of an algorithmic process, or an alternative method at least as secure, into a form in which meaning cannot be assigned without the use of a confidential process or key, unless further defined by regulation by the Office of Consumer Affairs and Business Regulation."

to

"the transformation of data into a form in which meaning cannot be assigned without the use of a confidential process or key."

"Confidential process" and "cryptography" is a pairing destined for failure, and a password is a key, right? You and I may understand the difference between "encrypted" and "password protected", but I assure you that this will lead to many people blurring the two and not encrypting their data when required, or doing it badly- and the state has provided them with a plausible excuse by this definition.

Second, the previous version stated in 17.03 (1)

"Every person that owns, licenses, stores or maintains personal information about a resident of the Commonwealth shall develop, implement, maintain and monitor a comprehensive, written information security program..."

It now states

"Every person that owns or licenses personal information about a resident of the Commonwealth shall develop, implement, and maintain a comprehensive information security program that is written..."

While the words "stores or maintains" are missing, I think those are covered adequately elsewhere- it is the loss of the word "monitor" which concerns me. Make a plan, print it out, and put it on the Shelf of Neglect with the others. Sure, the FAQ says you need to monitor your plan, but the regulation doesn't, and that's what counts.

The next one might not be that bad; 17.03 (3) 5. previously read:

"Preventing terminated employees from accessing records containing personal information by immediately terminating their physical and electronic access to such records, including deactivating their passwords and user names."

Or, the removal of strong language might give the impression that "immediately" "physical and electronic" aren't that important. That would be bad.

Now for a series of outright attacks on security fundamentals and common sense guidelines, 17.03 (3) 7. stated

"Limiting the amount of personal information collected to that reasonably necessary to accomplish the legitimate purpose for which it is collected; limiting the time such information is retained to that reasonably necessary to accomplish such purpose; and limiting access to those persons who are reasonably required to know such information in order to accomplish such purpose or to comply with state or federal record retention requirements."

The corresponding section of the current regulations is

Missing

That's right, the common sense suggestion to only keep the data you need is gone. Forget the logic of "you can't lose what you don't have", go ahead and keep anything you want.

Also missing is the section corresponding to 17.03 (3) 8.

"Identifying paper, electronic and other records, computing systems, and storage media, including laptops and portable devices used to store personal information, to determine which records contain personal information, except where the comprehensive information security program provides for the handling of all records as if they all contained personal information."

This is an absolute fundamental tenet of any kind of security/protection program, and has always been- if you don't know what you have and where it is, you cannot protect it. Read through breach reports and you will find that data is routinely lost from places that weren't documented. Yes, a data inventory and classification project is likely to be painful, expensive and imperfect. That doesn't make it any less fundamental or necessary.

Section 17.03 (3) 9. went from

"Reasonable restrictions upon physical access to records containing personal information, including a written procedure that sets forth the manner in which physical access to such records is restricted; and storage of such records and data in locked facilities, storage areas or containers."

To this in 17.03 (2) (g)

"Reasonable restrictions upon physical access to records containing personal information,, and storage of such records and data in locked facilities, storage areas or containers."

Because the old political advice of "never write what you can say, and never say what you can wink" is the best way to handle policies, too. Or not.

Some of the items removed from 17.03 are listed in the computer security sections, 17.04- but that means those protections are not required for the physical world, only the digital.

As long as I am on a roll, let's poke at the FAQ, too. Besides confirming some of the above, the FAQ offers a few items I find especially problematic. First,

"Technically feasible” means that if there is a reasonable means through technology to accomplish a required result, then that reasonable means must be used."

Odd that a regulation which has been altered to "ease the burden" on businesses doesn't provide an economic escape clause- as written "technically feasible" includes solutions which may be prohibitively expensive. But don't worry, there are enough weasel words in this to allow an out somewhere.

Also from the FAQ, two truly horrifying things:

"Must I encrypt my email if it contains personal information?

If it is not technically feasible to do so, then no."

Per the previous definition of "technically feasible", email encryption is absolutely feasible. Also, this fails to address the simple solution of encrypting the sensitive information and attaching it to a message. Between the ubiquity of Microsoft Office and the free and cross-platform availability of OpenOffice, there is no excuse for not encrypting PII sent via email. Reality and intent aside, expect to see this used to shoot down email encryption proposals on a regular basis.
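As a sketch of how simple "encrypt first, then attach" can be, here is a Python example using the third-party cryptography package; the package choice and the sample data are my assumptions, not anything the regulation or FAQ specifies.

```python
# Sketch only: assumes the third-party "cryptography" package
# (pip install cryptography); any reputable encryption tool works here.
from cryptography.fernet import Fernet

key = Fernet.generate_key()  # share this key out of band, never in the email
fernet = Fernet(key)

# Hypothetical PII payload that would otherwise ride in the message body
pii = b"name,ssn\nJ. Doe,123-45-6789\n"
ciphertext = fernet.encrypt(pii)  # this blob is safe to attach to an email

# The recipient, holding the key, recovers the original data
assert fernet.decrypt(ciphertext) == pii
```

The mechanism doesn't matter- an encrypted Office document would do as well. The point is that encrypting the data before it ever touches the mail system is neither exotic nor expensive.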

And this nonsense:

"Do all portable devices have to be encrypted?

"No. Only those portable devices that contain personal information of customers or employees and only where technically feasible. The "technical feasibility" language of the regulation is intended to recognize that at this period in the development of encryption technology, there is little, if any, generally accepted encryption technology for most portable devices, such as cell phones, blackberries, net books, iphones and similar devices. While it may not be possible to encrypt such portable devices, personal information should not be placed at risk in the use of such devices. There is, however, technology available to encrypt laptops."

It scares me to think someone in state government wrote this. Starting with the obvious targets- BlackBerries can be encrypted with minimal effort from the handset, or via policy using a BlackBerry Enterprise Server. Nothing to it, really. Netbooks have BIOS and drives just like "real" computers- from the free and Open Source TrueCrypt through many commercial offerings, they are easy to encrypt. Other devices can get tricky, but Symbian-based phones support encryption. iPhones, well, Apple would tell us "there's an app for that", and even though we have learned that the built-in encryption for the iPhone 3GS is nearly worthless, it probably still meets the requirements as currently written. Once again, reality notwithstanding, expect this blurb as a counterstrike to any suggestion of portable device encryption.

Maybe I need to look at it fresh, as if there had never been prior versions. Perhaps then it would look like a good start? Since the trigger for getting the parent law, 93H, passed was the TJX breach, would the current 201 CMR 17.00 have done anything to prevent that attack? No, it wouldn't. WEP is encryption, and this mess has enough wiggle room that I expect even the sadly broken WEP could stand up to 201's feeble scrutiny. What about other high-profile cases, such as Heartland? 201 doesn't require competent CEOs, web application code review, or web application firewalls; even the much-maligned PCI-DSS requires two of those three, and stopping simple SQL injection would have at least slowed down many recent attacks.

Now, for an immodest proposal with no chance of passing (as it would require a change to the law, 93H): forget all the prescriptive regulations, create specific and substantial penalties (financial and imprisonment) for failure, and make sure private lawsuits are expressly allowed. Let's put the RISK into this risk-based approach. (Yes, I understand that would drive some to try to keep their failures secret, but it will never happen anyway.) I didn't suggest what I really want for punishment, though...

Monday, August 24, 2009

It's been a while, but a vacation in Texas inspired me to write a new installment of Security Anecdote Theater for you.

In the early days of the Republic of Texas, President Sam Houston felt that Austin wasn't an appropriate capital and wanted his namesake city of Houston to become the new capital of the young republic. Houston claimed that Austin's isolated, western location was insecure, and that Houston would be a much more appropriate seat of power. Austin's real hold on power was the fact that it held the government archives. When persuasion failed to win the cause, Houston resorted to sending armed thugs- er, a military unit- to Austin to steal- that is, retrieve- the archives under cover of darkness. They quickly loaded the archives into wagons, and might have gotten away- but for innkeeper Angelina Eberly. Mrs. Eberly discovered the men loading the wagons; she knew she couldn't stop them, so she ran to the town cannon and fired it off ("ventilating" the Land Office building in the process), alerting the townsfolk. The alarmed citizens of Austin rallied, chased down the escaping men and wagons, retrieved the archives, and secured Austin's role as capital of Texas.

Sometimes all you can do about a problem is fire the town cannon- now we just need to work on getting the "townsfolk" to respond quickly and decisively when we raise alarms in Information Security...

Friday, August 21, 2009

I still want to do more work on them and release them as a podcast series, but for now- all audio from the Security B-Sides Las Vegas event is up in a SkyDrive folder. See the schedule on the B-Sides site for more information on the talks. Note that, at the request of the speakers, parts of Jennifer Jabbusch's and HD Moore's talks were not recorded, and none of Valsmith's talk was recorded.

Thursday, August 20, 2009

I have been on a couple of podcasts lately, and it is a testament to their hosts and producers that they managed to make me sound coherent. Well, almost coherent.

I'm a bit late on the first one: I filled in for Rich Mogull on the Network Security Podcast, episode 153, with Martin McKeay. I was very happy to be on Martin's show; he has been blogging and podcasting for quite a while, and he is one of the people who inspired me to start blogging myself.

BOSTON – Aug. 17, 2009 – In keeping with Governor Deval Patrick’s commitment to balancing consumer protection with the needs of small business owners, Massachusetts Undersecretary of the Office of Consumer Affairs and Business Regulation Barbara Anthony today announced adjustments to Massachusetts’ identity theft regulations that maintain protections and also reinforce flexibility in compliance by small businesses.

The updated regulations will take effect March 1, 2010. The regulations make clear that their approach to data security is a risk-based approach that is especially important to small businesses that may not handle a lot of personal information about customers. Under a risk-based approach, a business, in developing a written security program, should take into account its size, nature of its business, the kinds of records it maintains, and the risk of identity theft posed by its operations.

“In listening to the concerns of small business leaders, we understand there were issues regarding the impact these regulations have on those companies,” said Undersecretary Anthony. “These updated regulations feature a fair balance between consumer protections and business realities.”

New language in the regulations recognizes that the size of a business and the amount of personal information it handles plays a role in the data security plan the business creates. The new language requires safeguards that are appropriate to the size, scope and type of business handling the information; the amount of resources available to the business; the amount of stored data; and the need for security and confidentiality of both consumer and employee information.

The changes, Anthony said, make clear the regulations are risk-based in implementation, not just in enforcement as had been the case in earlier versions of the regulations. In addition, the regulations are technology neutral and acknowledge that technical feasibility plays a role in what many businesses, especially small businesses can do to protect data. The overall approach is more consistent with federal law, she said.

“Whether it’s a small amount of employee paperwork, or a large amount of consumer information kept on an electronic database, each requires its own appropriate level of security and protection,” Anthony said. “The changes we are making reflect that reality without exposing companies or consumers to a heightened risk of theft.”

The regulations are a product of the identity theft prevention law signed by Governor Deval Patrick. Governor Patrick signed an executive order last September requiring all state agencies to implement security measures consistent with the requirements in the regulations.

The Office of Consumer Affairs and Business Regulation today sent to the Secretary of State notice of public hearing on the changes. That hearing will be held on Tuesday, Sept. 22, at 10 a.m. at the Transportation Building, 10 Park Plaza, Boston.

For more information about identity theft protection, visit the Office of Consumer Affairs and Business Regulation website, www.mass.gov/consumer.

Pursuant to the provisions of M.G.L. c. 30A, and the authority granted to the Undersecretary of the Office of Consumer Affairs and Business Regulation under M.G.L. c. 93H, the Office of Consumer Affairs and Business Regulation will hold a public hearing in connection with the promulgation of 201 CMR 17.00, concerning the protection of personal information of residents of the Commonwealth. The public hearing will commence at 10:00 a.m. on Tuesday September 22, 2009, in Room No. 5-6, Second Floor of the Transportation Building, Ten Park Plaza Boston, Massachusetts 02116.

The purpose of the public hearing is to afford interested parties an opportunity to provide oral or written testimony regarding 201 CMR 17.00, Standards for the Protection of Personal Information of Residents of the Commonwealth. The purpose of 201 CMR 17.00 is to implement the provisions of M.G.L c. 93H relative to the standards to be met by those who own or license personal information about a resident of the Commonwealth. The regulation establishes standards for safeguarding such information, in paper and electronic records, in order to protect its security and confidentiality in a manner consistent with industry standards, to protect against threats and hazards to the security of such information, and to protect against unauthorized access to or use of such information in a manner that may result in substantial harm or inconvenience to any consumer.

Interested parties will be afforded a reasonable opportunity at the hearing to present oral or written testimony. Written comments will be accepted up to the close of business on September 25, 2009. Such written comments may be mailed to: Office of Consumer Affairs and Business Regulation, 10 Park Plaza, Suite 5170, Boston, MA 02116, Attention: Jason Egan, Deputy General Counsel, or e-mailed to Jason.Egan@state.ma.us.

Copies of the proposed regulation may be obtained from the Office of Consumer Affairs and Business Regulation website, or by calling (617) 973-8700.

What are the differences between this version of 201 CMR 17.00 and the version issued in February of 2009?

There are some important differences in the two versions. First, the most recent regulation issued in August of 2009 makes clear that the rule adopts a risk-based approach to information security, consistent with both the enabling legislation and applicable federal law, especially the FTC's Safeguards Rule. A risk-based approach is one that directs a business to establish a written security program that takes into account the particular business' size, scope of business, amount of resources, nature and quantity of data collected or stored, and the need for security. It differs from an approach that mandates every component of a program and requires its adoption regardless of size and the nature of the business and the amount of information that requires security. This clarification of the risk based approach is especially important to those small businesses that do not handle or store large amounts of personal information. Second, a number of specific provisions required to be included in a business’s written information security program have been removed from the regulation and will be used as a form of guidance only. Third, the encryption requirement has been tailored to be technology neutral and technical feasibility has been applied to all computer security requirements. Fourth, the third party vendor requirements have been changed to be consistent with Federal law.

To whom does this regulation apply?

The regulation applies to those engaged in commerce. More specifically, the regulation applies to those who collect and retain personal information in connection with the provision of goods and services or for the purposes of employment. The regulation does not apply, however, to natural persons who are not in commerce.

Does 201 CMR 17.00 apply to municipalities?

No. 201 CMR 17.01 specifically excludes from the definition of “person” any “agency, executive office, department, board, commission, bureau, division or authority of the Commonwealth, or any of its branches, or any political subdivision thereof.” Consequently, the regulation does not apply to municipalities.

Must my information security program be in writing?

Yes, your information security program must be in writing. The scope and complexity of the document will vary depending on your resources, and the type of personal information you are storing or maintaining. But, everyone who owns or licenses personal information must have a written plan detailing the measures adopted to safeguard such information.

What about the computer security requirements of 201 CMR 17.00?

All of the computer security provisions apply to a business if they are technically feasible. The standard of technical feasibility takes reasonableness into account. (See definition of “technically feasible” below.) The computer security provisions in 17.04 should be construed in accordance with the risk-based approach of the regulation.

Does the regulation require encryption of portable devices?

Yes. The regulation requires encryption of portable devices where it is reasonable and technically feasible. The definition of encryption has been amended to make it technology neutral so that as encryption technology evolves and new standards are developed, this regulation will not impede the adoption of such new technologies.

Do all portable devices have to be encrypted?

No. Only those portable devices that contain personal information of customers or employees and only where technically feasible. The "technical feasibility" language of the regulation is intended to recognize that at this period in the development of encryption technology, there is little, if any, generally accepted encryption technology for most portable devices, such as cell phones, blackberries, net books, iphones and similar devices. While it may not be possible to encrypt such portable devices, personal information should not be placed at risk in the use of such devices. There is, however, technology available to encrypt laptops.

Must I encrypt my backup tapes?

You must encrypt backup tapes on a prospective basis. However, if you are going to transport a backup tape from current storage, and it is technically feasible to encrypt (i.e. the tape allows it) then you must do so prior to the transfer. If it is not technically feasible, then you should consider the sensitivity of the information, the amount of personal information and the distance to be traveled and take appropriate steps to secure and safeguard the personal information. For example, if you are transporting a large volume of sensitive personal information, you may want to consider using an armored vehicle with an appropriate number of guards.

What does “technically feasible” mean?

“Technically feasible” means that if there is a reasonable means through technology to accomplish a required result, then that reasonable means must be used.

Must I encrypt my email if it contains personal information?

If it is not technically feasible to do so, then no. However, you should implement best practices by not sending unencrypted personal information in an email. There are alternative methods to communicate personal information other than through email, such as establishing a secure website that requires safeguards such as a username and password to conduct transactions involving personal information.

Are there any steps that I am required to take in selecting a third party to store and maintain personal information that I own or license?

You are responsible for the selection and retention of a third-party service provider who is capable of properly safeguarding personal information. The third party service provider provision in 201 CMR 17.00 is modeled after the third party vendor provision in the FTC’s Safeguards Rule.

I have a small business with ten employees. Besides my employee data, I do not store any other personal information. What are my obligations?

The regulation adopts a risk-based approach to information security. A risk-based approach is one that is designed to be flexible while directing businesses to establish a written security program that takes into account the particular business's size, scope of business, amount of resources and the need for security. For example, if you only have employee data with a small number of employees, you should lock your files in a storage cabinet and lock the door to that room. You should permit access to only those who require it for official duties. Conversely, if you have both employee and customer data containing personal information, then your security approach would be more stringent. If you have a large volume of customer data containing personal information, then your approach would be even more stringent.

Except for swiping credit cards, I do not retain or store any of the personal information of my customers. What is my obligation with respect to 201 CMR 17.00?

If you use swipe technology only, and you do not have actual custody or control over the personal information, then you would not own or license personal information with respect to that data, as long as you batch out such data in accordance with the Payment Card Industry (PCI) standards. However, if you have employees, see the previous question.

Does 201 CMR 17.00 set a maximum period of time in which I can hold onto/retain documents containing personal information?

No. That is a business decision you must make. However, as a good business practice, you should limit the amount of personal information collected to that reasonably necessary to accomplish the legitimate purpose for which it is collected and limit the time such information is retained to that reasonably necessary to accomplish such purpose. You should also limit access to those persons who are reasonably required to know such information.

Do I have to do an inventory of all my paper and electronic records?

No, you do not have to inventory your records. However, you should perform a risk assessment and identify which of your records contain personal information so that you can handle and protect that information.

How much employee training do I need to do?

There is no basic standard here. You will need to do enough training to ensure that the employees who will have access to personal information know what their obligations are regarding the protection of that information, as set forth in the regulation.

What is a financial account?

A financial account is an account that if access is gained by an unauthorized person to such account, an increase of financial burden, or a misappropriation of monies, credit or other assets could result. Examples of a financial account are: checking account, savings account, mutual fund account, annuity account, any kind of investment account, credit account or debit account.

Does an insurance policy number qualify as a financial account number?

An insurance policy number qualifies as a financial account number if it grants access to a person’s finances, or results in an increase of financial burden, or a misappropriation of monies, credit or other assets.

I am an attorney. Do communications with clients already covered by the attorney-client privilege immunize me from complying with 201 CMR 17.00?

If you own or license personal information, you must comply with 201 CMR 17.00 regardless of privileged or confidential communications. You must take steps outlined in 201 CMR 17.00 to protect the personal information taking into account your size, scope, resources, and need for security.

I already comply with HIPAA. Must I comply with 201 CMR 17.00 as well?

Yes. If you own or license personal information about a resident of the Commonwealth, you must comply with 201 CMR 17.00, even if you already comply with HIPAA.

What is the extent of my “monitoring” obligation?

The level of monitoring necessary to ensure your information security program is providing protection from unauthorized access to, or use of, personal information, and effectively limiting risks will depend largely on the nature of your business, your business practices, and the amount of personal information you own or license. It will also depend on the form in which the information is kept and stored. Obviously, information stored as a paper record will demand different monitoring techniques from those applicable to electronically stored records. In the end, the monitoring that you put in place must be such that it is reasonably likely to reveal unauthorized access or use.

Is everyone’s level of compliance going to be judged by the same standard?

Both the statute and the regulations specify that security programs should take into account the size and scope of your business, the resources that you have available to you, the amount of data you store, and the need for confidentiality. This will be judged on a case by case basis.