Adding a new dimension to the CERIAS 10th Annual Security Symposium, five panelists with varied backgrounds came together on the final day to share their work and experiences on “Unsecured Economies: Protecting Vital IP”.

Setting the platform for this discussion was this report. “Together with McAfee, an international team of data protection and intellectual property experts undertook extensive research and surveyed more than 1,000 senior IT decision makers in the US, UK, Japan, China, India, Brazil and the Middle East regarding how they currently protect their companies’ digital data assets and intellectual property. A distributed network of unsecured economies has emerged with the globalization of many organizations, leaving informational assets even more at risk to theft and misuse. This report investigates the cybercrime risks in various global economies, and the need for organizations to take a more holistic approach to vulnerability management and risk mitigation in this ever-evolving global business climate.”

Karthik Kannan, Assistant Professor of Management Information Systems, CERIAS, Krannert School of Management, Purdue University, opened the proceedings. He gave a brief overview of the above report, the product of collaborative research done by him, Dr. Jackie Rees and Prof. Eugene Spafford. The motivation behind this work was that more and more information was becoming digital and traditional geographic boundaries were blurring. Information was being outsourced to faraway lands, and as a result protecting against leaks was becoming harder and harder. Kannan put forth questions like “How do perceptions and practices vary across economies and cultures?”, and cited an example from India where salary was not considered personal information, and was shared and discussed informally. To get answers to more such questions, a survey was devised, targeted at senior IT decision makers, Chief Information Officers and directors of various firms across the globe. The US, UK, Germany, Brazil, China and India were among the countries chosen, giving the survey the cultural diversity it needed. Adding more value to the survey was the variety of sectors: defense, retail, product development, manufacturing and financial services. According to the results, a majority of the intellectual property (47%) originates from North America and Western Europe, and on average firms lost $4.6 million worth of IP last year. Kannan went on to explain how security is perceived in developing countries, and discussed how respondents reacted to security investment during the downturn. Statistics such as 42% of respondents naming laid-off employees as the biggest threat caused by the economic downturn showed that insider threats were on the rise.
The study put forth many case studies to show that data thefts by insiders tend to have greater financial impact, given their high level of data access, and pose an even greater financial risk to corporations.

Jackie Rees, also an Assistant Professor of Management Information Systems, CERIAS, Krannert School of Management, Purdue University, picked up where Kannan had left off and brought to light some of the stories that did not make it into the report. Rees explained the reasons the various sectors store information outside the home country. The finance sector viewed it as safer to store data elsewhere; the IT, product development and manufacturing sectors found it more efficient for the supply chain; and the retail and defense sectors felt better expertise was available elsewhere. On the amount these sectors were spending on security, 67% of the finance industry said it was “just right”, while 30% of retail felt it was “too little”. The other results seemed varied but consistent with our intuitions; however, all sectors seemed to agree that the major threat to deal with was their own employees. The worst impact of a breach was on the reputation of the organization. Moving on to the global scene, where geopolitical perceptions have become a reality in information security policies, Rees shared that certain countries are emerging as clear sources of threats to sensitive data. She added that, according to respondents, Pakistan is seen as a big threat by most industries, while China and Russia are also in the mix. Poor law enforcement, corruption and lack of cooperation in these economies were cited as a few reasons for them to emerge as threats.

Dmitri Alperovitch, Vice President of Threat Research, McAfee Corporation, began by expressing his concern that cybercrime is one of the headwinds hitting our economy. He pointed out that the economic downturn has resulted in less spending on security, and as a result increased vulnerabilities and laid-off employees are now the serious threats. Elaborating, he added that most of these vulnerabilities are exploited by insiders who not only know what is valuable, but also know how to get it. A worm such as Melissa, named after the attacker’s favorite stripper, seems to have had far less malicious intent than the threats of today, virtually all of which are financially motivated and bound up with money laundering. Citing examples, Alperovitch told us stories of an organization in Turkey that was recently caught for credit and identity theft, of members of law enforcement being kidnapped, and of how Al-Qaeda and other terrorist groups were using such tools to finance terrorist activities. Alperovitch vehemently stressed that this threat model is not understood by the industry, and hence the industry is not well protected.

Paul Doyle, Founder, Chairman & CEO, Proofspace, began by thanking CERIAS and congratulating the researchers at McAfee for their contributions. Adding a new perspective to the discussion, Doyle proposed that there has not been enough control over data. Data moves over the supply chain, but “control” does not move with it. Referring to the previous day’s discussion on cloud computing, where it was pointed out that availability is a freebie, Doyle said the big challenge here is handling the integrity of data. Stressing the point, he added that integrity of data is the least common denominator, and the least understood area in security as well. How do we know when a change has occurred? In the legal industry, we have a threat factor in the form of a cross-examining attorney. What gives us certainty in other industries? We have not architected our systems to handle the legal threat vector. Systems lack the controls and auditability needed for provenance and ensured integrity. The trust anchor of time has to be explored. “How do we establish the trust anchor of time, and how can confidentiality tools help in increasing reliability?” are important areas to work on.
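A common building block for the kind of change detection Doyle describes is pairing a cryptographic hash of the content with the time it was observed. The sketch below is purely illustrative (the record format and function names are assumptions, not Proofspace's actual mechanism):

```python
import hashlib
import time

def seal(document: bytes) -> dict:
    """Record a fingerprint of the content plus the time it was observed."""
    return {
        "sha256": hashlib.sha256(document).hexdigest(),
        # A real system would obtain this from a trusted timestamping
        # authority rather than trusting the local clock.
        "sealed_at": time.time(),
    }

def has_changed(document: bytes, record: dict) -> bool:
    """Any later modification to the bytes changes the hash."""
    return hashlib.sha256(document).hexdigest() != record["sha256"]

record = seal(b"quarterly design specification v1")
assert not has_changed(b"quarterly design specification v1", record)
assert has_changed(b"quarterly design specification v2", record)
```

This answers “has a change occurred?” but not “when?”; establishing the latter reliably is exactly the trust-anchor-of-time problem Doyle raises, since a local clock can be forged.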

Kevin Morgan, Vice President of Engineering, Arxan Technologies, began with an insight into how crime evolves in perfect synchrony with the socio-economic system. Every single business record is accessible in the world of global networking, and access enables crime. Sealing enterprise perimeters has failed, as there is no perimeter any more. Thousands and thousands of nodes execute business activity, and most of these nodes (like laptops and smartphones) are mobile, which in turn means that data is mobile and perimeter-less. Boundary protection is not the answer. We have to assume that criminals have access to enterprise data and applications. Assets, data and applications must be intrinsically secure, and the keys protecting them must be secure too. Technology can help a great deal in raising the bar for criminals, and recent trends are really encouraging.

After the highly informative presentations, the panel opened up for questions for the next hour. A glimpse of the session can be found in the transcript of the Q&A session below.

Q&A Session: A transcript snapshot

Q: We are in the Mid-West, no one is going to come after us. What should I as a security manager consider doing? How do you change the perception that organizations in “remote” locations are also subject to attack?

Alperovitch: You are cyber and if you have valuable information you will be targeted. Data manipulation is what one has to worry about the most.

Morgan: Form Red teams, perform penetration tests and share the results with the company.

Doyle: Employ allies and make sure you are litigation ready. Build a ROI model and lower total cost of litigation.

Q: CEOs consider cutting costs. They cut bodies. One of the biggest threats to security is letting the people go. It’s a paradox. How do we handle this?

Kannan: We have not been able to put a dollar value to loss of information. Lawrence Livermore National Lab has a paper on this issue which might be of interest to you.

Rees: Try to turn it into a way where you can manage information better by adding more controls.

Q: How do we stress our stand on why compliance is important?

Doyle: One of our flaws as a professional community is that we are bad at formulating business cases. We have to take a leaf out of the book of Kevin (of Cisco), who formulates security challenges as business proposals. To quote an analogy: at the end of the day it is the brakes and suspension that determine the maximum speed of the automobile, not the engine or the aerodynamics. The question is: how fast can we go safely? Hence compliance becomes important.

Q: Where do we go from here to find out how data is actually being protected?

Kannan: Economics and behavioral issues are more important for information security than we acknowledge. We need to build these into information security models.

Rees: Governance structure of information must also be studied.

Alperovitch: The study has identified those who may be impacted by the economy. We need to expose them to the problem. We also need to help law enforcement get information from the private sector, as the laws are not in place. And we need to figure out a way to motivate companies to share security information and threats with the community.

Doyle: Stop thinking about security and start thinking about risk and risk management. Model return-reward proposition in terms of risk.

Morgan: We need to step up as both developers and consumers.

Q: The $4.6 million estimate. How was it estimated?

Rees: We did a rolling average across the respondents, keeping in mind the assumption that people underestimate problems.

Q: Was IP so integral to the business model of any company that its total loss caused the company to go bust?

Rees: We did not come across any direct examples of firms that tanked and fell because of IP loss.

Q: Could you suggest new processes to enforce security of data?

Doyle: We need to find ways from the other side. If we cannot stop them, how do we restrict and penalize them using the law?

Q: The infrastructure at Purdue and in the US has been around for a long time, and we have adapted and evolved to newer technologies. However, other older organizations and developing countries still lag behind, and that actually seems to be helping them, as they need to be less bothered with new-age threats. What’s your take on that?

Kannan: True. We spoke to the CISO of a company in India. His issues were much less as it was a company with legacy systems.

Alperovitch: There is a paradigm shift in the industry. Security is now becoming a business enabler.

The main focus of the talk was to highlight the need for “information-centric security” over the existing infrastructure-centric security. It was an interesting talk, since John was instrumental in providing real statistics to augment his thesis.

Following are some of the trends he pointed out from their research:

Explosive growth of information: Digital content in organizations grows by about
50% every year.

Most of the confidential/sensitive information or trade secrets of companies are
in the form of unstructured data such as emails, messages, blogs, etc.

The growth of malicious code in the market place out-paces that of legitimate
code.

Attackers have found ways to get around network protection and get at
sensitive/confidential information, leaving hardly any trace most of the time.
Attackers have also changed their motivation: they no longer seek big press, and
they want to hide every possible trace of their attacks.

The threat landscape has changed markedly over the last ten years. Ten years ago
there were only about five viruses/malicious attacks a day, but now it’s a
staggering 15,000 a day.

Research conducted by the Ponemon Institute asked laid-off employees if they
left with something from the company, and 60% said yes. John thinks that the
figure could be still higher, as there may be employees who are not willing to
disclose it.

These statistics show that data is becoming more important than ever before. Given
the above trends, he argued that protecting infrastructure alone is not sufficient and
that a shift in the paradigm of computing and security is essential. We need to
change the focus from infrastructure to information.

He identified three elements in the new paradigm:

It should be risk-based.

It should be information centric.

It should be well managed, over a well-managed infrastructure.

John advocated adopting a risk-based, policy-based approach to managing data. The
typical organization today has strong policies on how it wants to manage its
infrastructure, but no equally strong set of policies for managing the information
that is so critical to the business itself. He pointed out that it is high time
organizations assessed the risk of losing or leaking the different types of
information they hold and devised policies accordingly. We need to quantify the risk
and protect the data that could cause the greatest damage if compromised.
Identifying what we most want to protect is important, as we cannot protect
everything adequately.
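John's point about quantifying risk and triaging protection can be made concrete with the classic expected-loss calculation (risk = likelihood × impact). The asset names and figures below are purely illustrative assumptions, not data from the talk:

```python
# Hypothetical asset inventory: estimated annual breach likelihood (0-1)
# and impact in dollars. All figures are made up for illustration.
assets = [
    {"name": "public marketing copy", "likelihood": 0.30, "impact": 1_000},
    {"name": "customer PII", "likelihood": 0.10, "impact": 2_000_000},
    {"name": "trade secrets", "likelihood": 0.05, "impact": 10_000_000},
]

# Expected annual loss per asset: a simple way to quantify risk.
for a in assets:
    a["risk"] = a["likelihood"] * a["impact"]

# Protect the highest-risk assets first, since we cannot protect
# everything adequately.
for a in sorted(assets, key=lambda x: x["risk"], reverse=True):
    print(f'{a["name"]}: expected loss ${a["risk"]:,.0f}')
```

Even with rough estimates, it is the ranking, not the absolute dollar figures, that tells an organization where to spend its protection effort first.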

While the risk assessment should be information-centric, one cannot achieve security
through encryption alone. Encryption can certainly help protect data, but what
organizations need is a holistic approach in which management (of data, keys,
configurations, patches, and so on) is a critical aspect.

He argued that it is impossible to secure content without knowledge of that content
and without good policies on which to base organizational decisions. He reiterated
that “you cannot secure what you do not manage”. To reinforce the claim, he pointed
out that 90% of attacks could have been prevented had the systems that came under
attack been well managed (the Slammer attack, for example). Such management involves
maintaining proper configurations and applying critical updates, which most of the
vulnerable organizations failed to do. In short, well-managed systems could mitigate
many of the attacks.

Towards the end of his talk, he shared his views on better security in the future. He
predicted that “reputation-based security” solutions to mitigate threats would augment
current signature-based anti-virus mechanisms. In his opinion, reputation-based security
produces a much more trusted environment by drawing on users’ past actions. He argued
that this approach need not create privacy issues if we appropriately change how we
define privacy and what counts as sensitive.

He raised an interesting question: “Do we have a society that is sensitive to and
understands what security is all about?” He insisted that unless we address the
societal issues related to security, technology alone will not be sufficient to protect
our systems. We need to create a society aware of security and an environment in which
students can learn computing “safely”. This will lead us to embed safe computing into
day-to-day life. He called for a national approach to security and law enforcement,
arguing that it is utterly inappropriate to handle data breach notification on a
state-by-state basis. He also called for an information-based economy in which all
entities share information about attacks, and for an information-centric approach to
security. He mentioned that Symantec is already sharing threat information with other
companies, but federal agencies are hardly sharing any. We need greater collaboration
between the public and private sectors.

It was an enlightening experience to listen to some of the infosec industry’s most respected and seasoned professionals sit around a table and discuss information security.

This time it was Eugene Spafford, John Thompson and Ron Ritchey. The venue was the Lawson Computer Science Building, and the event was a fireside chat held as part of the CERIAS 10th Annual Security Symposium.

Eugene Spafford started the talk by stating that security is a continuous process, not a goal. He compared security to naval patrolling: security is all about managing and reducing risks on a continuous basis. According to him, a lot of stress is nowadays placed on data leakage. This is undoubtedly one of the major concerns today, but it should not be the only one. When people focus on data leakage instead of addressing the core of the problem, which lies in the insecure design of systems, they get attacked, which gives rise to an array of problems. He further added that the losses from cyber attacks are comparable to the losses incurred in Hurricane Katrina, yet not much is being done to address the problem. This is partly because losses from cyber attacks, except for a few major ones, occur in small amounts that aggregate to a huge sum.

With regard to the recent economic downturn, Spaf commented that many companies are cutting their security budgets, which is a huge mistake. According to Spaf, security is an invisible but vital function, whose real presence and importance is not felt until an attack occurs and the assets turn out not to be protected.

Ron Ritchey stressed the issues of data and information theft. He said that the American economy is largely a design-based economy: many cutting-edge products are researched and designed in the US by American companies, and then manufactured in China, India and other countries. The fact that the US is a design economy underscores the importance of information security for US companies and the need to protect their intellectual property and other information assets. He said that attacks are getting more sophisticated and targeted, and malware is being carefully socially engineered. He also pointed out the need to move from signature-based malware detection to behavior-based detection.

John Thompson arrived late, as his jet was not allowed to land at the Purdue airport due to high winds. John coolly introduced himself as the CEO of a ‘little’ company named Symantec in Cupertino. Symantec is a global leader in security, storage and systems management solutions, and one of the world’s largest software companies, with more than 17,500 employees in more than 40 countries.

John gave some very interesting statistics about the information security and attack scene these days. About 10 years ago, when he joined Symantec, the company received about five new attack signatures each day. Currently, the number stands at about 15,000 new signatures each day, with the average attack affecting only 15 machines. He further added that attack vectors change every 18–24 months, and new techniques and technologies are being used extensively by criminals to come up with new and challenging attacks. Attacks today are highly targeted, intelligently socially engineered, and focused on covertly stealing information from a victim’s computer while silently covering their tracks. He admitted that, due to the increasing sophistication and complexity of attacks, it is getting more difficult to rely solely on signature-based detection, and he stressed the importance of behavior-based detection techniques. With regard to the preparedness of government and law enforcement, he said that law enforcement is not skilled enough to deal with these kinds of cyber attacks. In the physical world, people have natural instincts against dangers; this instinct needs to be developed for the cyber world, which can be just as dangerous, if not more so.

The panel discussion included a 5-10 minute presentation from each of the panelists followed
by a question and answer session with the audience.

The first presentation was from Lorenzo Martino. Lorenzo defined cloud computing as
‘computing on demand’, with its prominent manifestations being high-performance computing
and the use of virtualization techniques. He also included other forms of pervasive
computing, such as body-nets, nano-sensors, the intelligent energy grid, and so forth,
under cloud computing. The main security challenge Lorenzo identified was coping with the
complexity of the cloud environment as the number of nodes increases. Another issue he
identified was the decreasing knowledge of the locality of nodes, as well as of their
trustworthiness, with increasing scale. This, in turn, introduces issues of accountability
and reliability. He outlined two main issues to be resolved in the context of a cloud
computing environment. The first is striking the right balance between network security
and end-point security. The second is the lack of clarity on attack, risk and trust models
in a cloud environment.

According to Keith, the best way to explain cloud computing to a layman is to think of
cloud computing services as utilities, such as heat and electricity, that we use and pay
for as needed. His main security concern in the cloud environment was the legal
ramifications of the locality of data. For example, a company in the European Union (EU)
might want to use Amazon cloud services, but that may not be legal if the backend data is
stored in the United States (US), as the data privacy laws of the EU and US differ.
Another concern raised by Keith was the lack of standards among cloud computing services.
Because of insufficient standardization, verifying cloud services for compliance with
regulations is very difficult. Finally, according to Keith, despite all the current
security challenges in a cloud environment, the cloud will ultimately be widely adopted
due to the lower costs associated with it.

Christoph reiterated that cloud services will become mainstream because of their
flexibility (no need to worry about under-provisioning or over-provisioning of resources)
and lower total cost of ownership. But the key point to understand is that the cloud
computing paradigm is not for everyone and everything, as cloud computing means different
things to different people based on their requirements. Citing the example of grid
computing, Christoph highlighted an inherent lack of demand for security in the cloud: in
grid computing, for instance, more than 95% of customers opt out of the grid security
services for various reasons. According to Christoph, the main challenge in cloud security
is that the increased abstraction and complexity of cloud computing technology introduces
potential security problems, as there are many unknown failure modes and unfamiliar tools
and processes. He also added that cloud security mechanisms are not unlike traditional
security mechanisms, but they must be applied to all components in the cloud architecture.

Security issues arising out of the sheer complexity of the cloud environment were also
raised by Dennis. There are layers of abstraction in the cloud environment, with multiple
layers of technologies, each with its own configuration parameters, vulnerabilities and
attack surface. There is also a lot of resource sharing, and the hosting environment is
more tightly coupled. All such factors make articulating sound security policies for such
an environment a potential nightmare. The questions that seem most difficult to answer in
a cloud environment are how to achieve compliance, how trust is shared across different
regulatory domains, and how much of the resources are shared or coupled. There is an
inherent lack of visibility inside a cloud environment. A potential solution is to expose
configuration visibility into the cloud for performing root-cause analysis of problems.
Also, keeping virtualization kernels small and as close as possible to the hardware, so
that they can be more trustworthy and verifiable, will help address some of the security
risks.

Question and Answer Session with the Audience

A recurring theme during the question-and-answer session was trust management in the
cloud environment. The first question was how ‘trust anchors’ in the traditional
computing environment apply to the cloud. Dennis replied that the same trust anchors,
such as the TPM, apply, but the interplay between them is different in the cloud. There
are some efforts to create virtual TPMs so that each virtual operating system gets its
own view of trust, but how that affects integrity is still being worked out. Keith,
however, was more skeptical of the ongoing work on trust anchor solutions because of the
complexity of the provider stacks. Another question related to trust was why a normal
user should entrust his private information to a cloud provider. Dennis replied that
users will be driven to cloud services by cost and will use such utilities because of
the savings passed on to them; the onus is thus on the cloud service providers to take
care of security and privacy issues. Christoph added that, with respect to trust, the
cloud is no different from other computing paradigms such as web services or grid
computing: companies providing cloud services will need to answer trust issues for their
own customers. The next question was on the legal and compliance hang-ups of the cloud
with respect to the location of data; specifically, there were concerns about the chain
of custody of data and accountability. Dennis replied that legal and compliance
requirements haven’t kept up, and can’t (yet) keep up, with the change in technology.
But this also provides an incentive for providers to differentiate themselves from the
rest on the basis of the auditing and legal support offered with their services.
Christoph suggested that such concerns should be addressed in the Service Level
Agreements (SLAs), which then become binding on the service providers. Lorenzo pointed
out that coming up with new regulations for the cloud environment will be a very
difficult task, as even current regulations such as HIPAA have gray areas.

Another interesting question was whether outsourcing IT services to a cloud provider is
a security win because of the expertise at the cloud provider. Dennis agreed that, at
least with respect to managing configuration complexity, organizations are better off
outsourcing their services to a cloud provider. Christoph pointed out that availability
as a security issue is a big win in the cloud environment, as it comes almost for “free”
with any cloud provider, whereas maintaining in-house 24x7 availability is a very
resource-consuming task for most organizations.

There was a comment that there is a perception among customers that if the data is
outside the organization, it is not safe. In reply, Christoph reiterated that cloud
computing is not for everyone: if data is so sensitive that it cannot be outsourced,
then cloud computing is probably not the right answer. Strong SLAs can help mitigate
many concerns, but cloud computing may still not be for everyone. Some related queries
were on the safety of applications and on the heightened insider threat concerns in the
cloud environment. Dennis echoed similar concerns and mentioned that people working in
cloud environments are audited very carefully, as they have a lot of leverage. Christoph
added that there are techniques beyond auditing to mitigate insider threats, such as
application firewalls, no sniffing, scanning the data out of the hosted images,
intrusion detection on hosted images, and so forth. Such controls can be articulated in
the SLAs as well. Dennis gave an example that ESX with OVF-I can associate security
controls in the models for each hosted instance, which also puts less burden on the
application developers.

The next question was whether it is possible to keep applications local but still get
the benefits of cloud security. Dennis replied that it is possible, but organizations
need to be careful about applying security requirements consistently when using the
cloud for only some things. It is also important to understand the regulations and the
risks involved before doing so.

To the query as to what data can be put into the cloud without security worries (public
data), Keith replied that organizations need to have a classification system in place to
figure out what is acceptable for public access.
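Keith's answer presupposes some classification scheme behind the decision. A minimal sketch of such a gate, with made-up labels and a made-up policy threshold rather than any scheme discussed on the panel, might look like:

```python
# Illustrative classification levels; a real organization would define
# its own labels and policy thresholds.
LEVELS = {"public": 0, "internal": 1, "confidential": 2, "restricted": 3}

def allowed_in_public_cloud(label: str, max_level: str = "internal") -> bool:
    """Only data at or below the configured level may leave the organization."""
    return LEVELS[label] <= LEVELS[max_level]

assert allowed_in_public_cloud("public")
assert allowed_in_public_cloud("internal")
assert not allowed_in_public_cloud("confidential")
```

The value of even a crude scheme like this is that the "can this go to the cloud?" question becomes a policy lookup instead of an ad hoc judgment call.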

Many questions that followed at this stage were related to general cloud computing
matters, such as the sustainability of cloud computing, applications of cloud computing,
the distinction between a hosting provider and a cloud provider, and so forth. This was
expected, as the audience primarily consisted of people from academia who are yet to
come to grips with the aggressive adoption of cloud services in industry.

Finally, at the end there were two very interesting questions related to cloud security.
The first concerned the integrity of data in a cloud environment: how can data integrity
be preserved, and also legally proved, in a cloud environment? Christoph pointed out
that data integrity is guaranteed as part of the SLAs. Dennis pointed out that many
distributed systems concepts, such as the two-phase commit protocol, are applicable to
the cloud environment. Integrity issues may be temporal, as replication of data may not
be immediate, but long-term integrity of data is preserved. The second question was
whether clouds are now a more appealing target for attackers, in the sense that they
increase reward over risk. Dennis agreed that putting all our eggs in one basket is a
big risk, and that such convergence will lead to a greater attack surface and give much
greater leverage to attackers. But the underlying premise is that the benefits of cloud
technology are amazing, so we have to work towards mitigating the associated risks.

There has been a lot of discussion recently surrounding the issue of standards and standard adoption. Many questions have been posed and openly debated in an attempt to find the correct formula for standards. When can a standard be considered a “good” standard, and when should that standard be adopted?

According to Dr. Pascal Meunier of Purdue University CERIAS, standard adoption should be based on what he calls transitive trust. Transitive trust indicates that an evaluation of the standard using criteria appropriate to the adopters has been done by an outside source. This ensures the standard applies to the adopter and that it has been evaluated or tested. Dr. Meunier says this allows for sound justification that a standard is appropriate. Unfortunately, most adoption and creation of standards are focused on assumptive trust, or simply knowing someone, somewhere did an evaluation.

Another concern surrounding the creation and adoption of standards raised during the panel discussion was: when standards interfere with economic development or technological progress, should they be adopted, even if they are well-tested, “good” standards? Tim Grance from NIST responded by saying that, as of right now, standards are mostly voluntary recommendations, and they must be in accordance with the economic and technological desires of industry in order to be widely adopted and widely accepted. There are very few punishments for not following standards, and thus there must exist other motivations for industries to spend time and money implementing them.

Along with this, the audience posed a question surrounding the practical use of a standard. Even if a partner does decide to comply with a standard there is no easy method of ensuring they actually understand the standard or have the same interpretation of the standard as other partners. Simply establishing a mutual understanding of a standard within an industry poses another obstacle that requires time and resources.

As a result of this, “good” standards may never be used in practice if they are too costly to implement. Therefore, currently used standards may be out of date, flawed, or simply untested. This discussion lends itself to the question of which is better, a standard which is known to be flawed or no standard at all? There is no clear answer to this question, as there exists sufficient evidence supporting both sides.

An argument for the idea that a standard is better than no standard (even if it is a flawed or insecure standard) is that in this scenario, at least the flaw will be known, recognized and consistent throughout the industry. However, others point out that this could actually be detrimental, as now every entity that has adopted the standard becomes vulnerable to its flaws, as opposed to only a small number of them.

It is clear that industries need standards to follow in many scenarios. However, the difficult questions include when a standard is needed, when a specific standard should be adopted versus when it could reasonably be adopted, and whether or not a flawed standard is better than no standard at all.