The cloud plays an integral role in enabling the agility required to take advantage of new business models, and to do so in a convenient and cost-effective way. However, this also means that more personal information and business data will exist in the cloud and be passed back and forth. Maintaining data integrity is paramount.

Today's approach to security in the cloud may not be sufficient: it doesn't focus on putting controls close to data, which is now more fluid, and it doesn't distinguish one set of data from another. All data is not created equal and should not be treated in the same manner; a one-size-fits-all model doesn't work.

In this always-connected world, protection measures in the cloud need to focus on what really matters - the type of data, how it is used, and where it goes.

Data Classification

In order to adequately protect data in the cloud, organizations need to start considering how to classify data. One approach is to use a three-tier data protection model to cater to data of different sensitivities and relevance across industries. This model would include:

Tier 1, Regulated: Data subject to regulation, or data that carries proprietary, ethical, or privacy considerations such as personally identifiable information (PII). Unauthorized disclosure of regulated data may have serious adverse effects on an organization's reputation, resources, services, or individuals; this data requires the most stringent level of control.

Tier 2, Commercial: Business and commercial data that is not subject to regulation but is not intended to be public. Unauthorized disclosure could harm the organization's operations or competitive position, so this tier warrants moderate controls - more than collaborative data, less than regulated data.

Tier 3, Collaborative: Collaborative and DevOps-type data that typically is publicly accessible, requires minimal security controls and poses little or no risk to the consuming organization's reputation, resources, or services.

Using this model, security teams can strategically partner with business users to understand requirements and determine the right approach for their organization. Small to mid-sized organizations, enterprises, and service providers can apply this model to begin classifying their data based on contextual attributes such as how the data will be accessed, stored, and transmitted. Once the data is classified, they can then apply appropriate data protection measures to the work streams and transactions that continue to evolve to enable business agility. Given that most of today's data breaches are the result of user-access issues, security considerations such as Identity and Access Management, Authorization, and Authentication are critical.
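
As a concrete starting point, the sketch below shows one way a team might encode the three tiers and classify an asset from a few contextual attributes. It is a minimal illustration, not a prescribed implementation: the attribute names (contains_pii, is_regulated, is_public) are hypothetical stand-ins for the richer access, storage, and transmission context described above.

```python
# A minimal sketch of the three-tier model, assuming hypothetical
# attribute names; a real classifier would draw on how data is
# accessed, stored, and transmitted, as described above.
from dataclasses import dataclass
from enum import Enum


class Tier(Enum):
    REGULATED = 1      # PII or regulatory/ethical obligations
    COMMERCIAL = 2     # internal business data
    COLLABORATIVE = 3  # publicly accessible, minimal controls


@dataclass
class DataAsset:
    name: str
    contains_pii: bool = False
    is_regulated: bool = False
    is_public: bool = False


def classify(asset: DataAsset) -> Tier:
    """Assign a tier from contextual attributes, most restrictive first."""
    if asset.contains_pii or asset.is_regulated:
        return Tier.REGULATED
    if asset.is_public:
        return Tier.COLLABORATIVE
    return Tier.COMMERCIAL


print(classify(DataAsset("customer_records", contains_pii=True)))  # Tier.REGULATED
```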

The Data Integrity Challenge

Understanding and classifying data is just a first step, albeit an important one. Organizations also need to determine how to ensure data integrity when the perimeter is amorphous and control of the endpoints and the data is diminished by mobility and cloud services.

Business departments are increasingly encouraged to find efficient and innovative ways to generate new business. This requires identifying new applications and ways to support the business anywhere, anytime. Business users often decide to use the cloud before involving IT, since they can get up and running in a fraction of the time and cost it would take to provision in-house.

With this unprecedented change in operations and infrastructure comes an unprecedented need to ensure data integrity - ultimately working through the life cycle of data that can, at any point, be within the confines of a company, out with a network of partners and suppliers, or floating in a cloud. The challenge in this fractured landscape is that the perimeter is amorphous but legacy security solutions are not; they were designed for a time when the perimeter was well defined. As a result, attackers now use a variety of techniques to bypass traditional perimeter-based defenses and compromise data - whether by tampering with it, stealing it, or leaking it. Point-in-time defenses are no longer sufficient.

To effectively protect data wherever it may be, defenses must go beyond simple blocking and detection to include capabilities such as data correlation, continuous data analysis, and retrospective action when data is found to have been corrupted, tampered with, or exfiltrated.
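
One simple form of retrospective action is integrity verification: record a cryptographic hash of data when it is known to be good, then re-check it continuously or on access, so that tampering discovered later can still trigger a response. The sketch below assumes file-based data and a hypothetical quarantine step; it is illustrative only.

```python
# A minimal sketch of retrospective integrity checking: store a SHA-256
# baseline when data is known-good, then re-verify later so tampering
# discovered after the fact can still trigger a response. Paths and the
# quarantine step are hypothetical placeholders.
import hashlib
from pathlib import Path

baseline: dict[str, str] = {}


def record_baseline(path: Path) -> None:
    """Hash the file while it is known to be good."""
    baseline[str(path)] = hashlib.sha256(path.read_bytes()).hexdigest()


def verify(path: Path) -> bool:
    """Return True if the file still matches its known-good hash."""
    current = hashlib.sha256(path.read_bytes()).hexdigest()
    return baseline.get(str(path)) == current


# Usage (hypothetical): re-check continuously or on access.
# if not verify(Path("exports/customer_records.csv")):
#     quarantine_and_alert()  # hypothetical retrospective action
```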

A New Approach to Applying Controls

In order to protect the classes of data described earlier - regulated, commercial, and collaborative - security teams need a mix of policy, process, and technology controls. These controls should be applied based on user and location context and according to a security model that is open, integrated, continuous, and pervasive:

Open to provide access to global intelligence and context to detect and remediate breaches and to support new standards for data protection.

Integrated solutions that enable policy to be automated and minimize manual processes can close gaps in security and support centralized management and control according to data classifications.

Continuous capabilities, working alongside point-in-time solutions, to identify new threats to data.

Pervasive security delivers protection across the full attack continuum - before, during, and after an attack.

Let's take a closer look at the advantages of applying controls to protect data based on this model.

Openness provides:

The opportunity to participate in an open community of users and standards bodies to ensure consistent data classification and standards of policy and process.

Easy integration with other layers of security defenses to continue to uphold data protection best practices as IT environments and business requirements change.

The ability to access global intelligence with the right context to identify new threats and take immediate action.

Integrated enables:

Technology controls that map to data tiers and also track data through different usage contexts and locations to support the fundamental first step of data classification (one possible mapping is sketched after this list).

Identity and access controls, authorization, and authentication that work in unison to map data protection to data classifications.

Technologies and services to constantly aggregate and correlate data from across the connected environment with historical patterns and global attack intelligence to maintain real-time contextual information, track data movement, and detect data exfiltration.

The ability to leverage insights into emerging new threats, take action (automatically or manually) to stop these threats, and use that intelligence to protect against future data breaches.
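
Continuing the earlier classification sketch (and reusing its Tier, DataAsset, and classify definitions), here is one hypothetical way to encode a tier-to-controls mapping so that policy can be automated rather than applied by hand. The control names are illustrative, not a complete or prescriptive set.

```python
# A hypothetical tier-to-controls mapping, reusing Tier, DataAsset, and
# classify from the classification sketch above. Control names are
# illustrative examples, not a complete or prescriptive set.
CONTROLS_BY_TIER = {
    Tier.REGULATED: {"encrypt_at_rest", "encrypt_in_transit", "mfa", "audit_logging"},
    Tier.COMMERCIAL: {"encrypt_in_transit", "sso", "audit_logging"},
    Tier.COLLABORATIVE: {"basic_access_logging"},
}


def required_controls(asset: DataAsset) -> set[str]:
    """Look up baseline controls for an asset's classification."""
    return CONTROLS_BY_TIER[classify(asset)]


# Usage: a PII-bearing asset resolves to the Tier 1 control set.
print(required_controls(DataAsset("customer_records", contains_pii=True)))
```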

Pervasive translates into:

Defenses (including technologies and best practices) that address the full attack continuum - before, during, and after an attack. Before an attack, total, actionable visibility is required to see who is accessing what data, from where, and how, and to correlate that information against emerging threat vectors. During an attack, continuous visibility and control are necessary to analyze and take action in real time to protect data. After an attack, the key is to mitigate the damage, remediate, recover quickly, and prevent similar future data breaches, data tampering, or data corruption (a sketch of this loop follows this list).

The ability to address all attack vectors - including network, endpoints, virtual, the cloud, email and Web - to mitigate risk associated with various communications channels that could be used by an attacker to compromise data.
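
To make the before/during/after loop concrete, the hypothetical sketch below screens each access event against current threat intelligence as it occurs, keeps a history, and re-screens that history whenever new indicators arrive - a simple form of retrospective detection. The event fields and indicator format are assumptions for illustration.

```python
# A hypothetical before/during/after monitor: flag access events against
# current threat intelligence as they occur (during), keep a history for
# visibility (before), and re-screen that history when new indicators
# arrive (after: retrospective detection). All fields are illustrative.
from dataclasses import dataclass, field


@dataclass
class AccessEvent:
    user: str
    source_ip: str
    dataset: str


@dataclass
class ContinuumMonitor:
    bad_ips: set = field(default_factory=set)
    history: list = field(default_factory=list)

    def observe(self, event: AccessEvent) -> bool:
        """Record the event and flag it if it matches current intel."""
        self.history.append(event)
        return event.source_ip in self.bad_ips

    def add_indicator(self, ip: str) -> list:
        """New intel triggers a retrospective sweep of past events."""
        self.bad_ips.add(ip)
        return [e for e in self.history if e.source_ip == ip]


# Usage: an event that looked clean is flagged once intel catches up.
monitor = ContinuumMonitor()
monitor.observe(AccessEvent("alice", "203.0.113.7", "customer_records"))
print(monitor.add_indicator("203.0.113.7"))  # retrospectively flagged
```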

Today's cloud-driven, always-connected world is enabling organizations to be agile, but it is also putting data integrity at risk. IT teams need to adapt quickly to this new way of doing business despite having less control of the endpoints and the data. Traditional data protection models fail because they cannot distinguish one set of data from another. By putting in place protection measures based on the type of data, how it is used, and where it goes - backed by a security model that is open, integrated, continuous, and pervasive - organizations can take advantage of the new business opportunities the cloud affords without sacrificing data integrity.

Raja Patel is a Senior Director of Cloud Security Product Management at Cisco, where he is responsible for the portfolio strategy and development of security solutions for Cisco's Security Business. His responsibilities include building solutions and managing operations associated with Cloud, Threat Intelligence, Web and Email Security. Raja has been at Cisco for 13 years; during this tenure he has managed a broad portfolio of products within Cisco’s Enterprise Networking Business Group, developed and accelerated new consumption and business models such as Enterprise Licensing, and led strategic initiatives to develop more agile business practices across Cisco.

Mr. Patel holds a BS in Aerospace Engineering with a Minor in Mathematics from Embry Riddle Aeronautical University, and an MBA in Global Business Management.
