Internal and external self-serve portals are becoming more and more prevalent. And why not? They are great for help desk scenarios, customer service, and even sales. But quite a few are a flop: 52% of visitors will abandon a portal because they can’t find what they are looking for. Not good if 52% of the visitors who want to purchase your widgets leave the site.

Now what if you had to develop a portal for cancer patients and their caregivers? Sounds challenging, doesn’t it? It also sounds like a project I might not want. Moffitt Cancer Center defied the odds: after several years of development, they succeeded in building a portal for patients and caregivers. The goal was to empower and educate patients with highly targeted, useful information specifically relevant to their condition. Accommodating over 50K patients and caregivers seeking on-demand, up-to-date, secure, and medically sound information was not easy to achieve.

The challenges were many: security was critical, since the portal falls under healthcare industry regulations; diverse internal and external repositories had to be available through a single search interface; the portal had to support a wide variety of demographics and ages; and all information had to remain continually accurate and medically sound.

If you would like to find out more about this project, please join David Stringfellow, Manager of Portal and Web Technologies at Moffitt Cancer Center, as we explore what makes a self-serve portal a success. To attend, please register here; the webinar will take place February 10th, 11:30-12:30 EST. If you are thinking about or developing a portal, this session will provide insight from an expert who has been there and done that!

(If you have a few minutes and use SharePoint or Office 365, could you kindly take our metadata survey? You could win a free conference pass to Microsoft Ignite. We would greatly appreciate it)

Just another story that illustrates how stupid government thinks we are. Or perhaps another story that illustrates how stupid government is. On January 12th, President Obama announced new cyber reforms, calling on Congress to mandate that companies whose customer data is breached inform affected individuals within 30 days. But why don’t agencies that are hacked have to notify citizens when their data is compromised? A good question, it seems.

On a more humorous note, the silence on the government’s responsibility to protect its own data became awkward, as pro-ISIS hackers allegedly leaked personal information on U.S. military members around the same time Obama was speaking.
There is currently no U.S. requirement to notify breach victims within a certain time period. A hodgepodge of state regulations gives companies varying guidance on contacting victims. Fewer than 30 percent of federal agencies recently surveyed notified affected individuals of high-risk breaches, the Government Accountability Office reported last year.

The Federal Agency Data Breach Notification Act, introduced by Rep. Gerry Connolly, D-Va., in the last Congress, would require, among other things, notifying individual victims within 72 hours after discovering evidence of a personal data breach.

According to the article, Connolly does not feel the administration is applying a double standard by omitting agencies from its legislative agenda.

Data privacy and security are hot topics, and they should be. We saw quite a few hacks last year, and now we have Morgan Stanley, and potentially Sony, breached deliberately by their own employees. Last year alone, approximately 43% of companies experienced a data breach, and the handwriting on the wall suggests it’s only going to escalate.

The statistics for internal data leaks and breaches are quite high. According to the Ponemon Institute, sixty-nine percent of companies reporting serious data leaks responded that their breaches were the result of either malicious employee activity or non-malicious employee error. In fact, the single leading cause of data security breaches was non-malicious employee error (39%). The Ponemon Institute concludes that these breaches are typically the consequence of complacency or negligence, from lax or insufficient access controls on sensitive or confidential data. Only sixteen percent of serious data leaks were linked to hackers or external penetration.

If we consider unstructured content, a large part of the problem is that, for the most part, unstructured content remains unmanaged or mismanaged in the organization. That makes a breach of confidential information, whether accidental or deliberate, highly likely. The solution, of course, is to ensure security and access controls are in use; but taking a step back to proactively identify and manage unstructured assets, a seemingly minor step, can minimize the potential for an inside breach. Catch the breach before it occurs: remove the content from unauthorized access and portability.


I am one who is continually harping on the security and protection of all assets in an organization. I turned the tables on myself the other day and started thinking about the misuse and abuse of personal information by organizations. If we look at Morgan Stanley, why on earth was an essentially junior-level financial advisor given access to all client data? What were they thinking? Big mistake. What about from the marketing perspective? As a member of that profession, I know marketing loves to gather as much data as possible about clients to increase sales. In fact, our jobs depend on it. Just a fact of life, maybe.

But what about other uses, or misuses, of private data? Regardless of industry, including government, who does have access to my personal information? More people than I would think, and more information than I would expect. Not all internal breaches have nefarious purposes, but the information is there for the taking.

I suppose it can be attributed to the ethics of the organization, how it protects data, and the importance it places on protecting privacy. I’ve had my personal information compromised three times now. In the last incident, which involved HIPAA data, it was entirely up to me to protect my identity. That included notifying all the credit agencies, putting credit holds on all my accounts, and purchasing credit monitoring software. To say the least, it’s rather irksome. Given that most organizations don’t even report a breach until they absolutely must, we, the people, carry the burden of someone else’s mistake. And then we have to figure out how to get our identities back.

I wonder how bad this will get. Since most employers now do comprehensive background checks, you do have some recourse: you can request your own LexisNexis Accurint Person Report, which is free. At least you can see what a potential employer may see.

What do the IT, finance, and healthcare industries all share? A penchant for malware and data leaks, it seems. The cloud access security company Skyhigh Networks released its fifth quarterly Cloud Adoption and Risk Report (registration required) and found that the financial services industry is the second-riskiest vertical based on employee behavior, with healthcare a close third. The findings are based on the average number of malware incidents and data exfiltration events collected over the last quarter from more than 10.5 million enterprise employees across major industry verticals.

I understand why IT may be more susceptible. We IT-type folks (including marketing) seem unafraid of software and its potential security implications (unless you are a security professional). This makes some sense, since these companies tend to be early adopters and “have permissive policies regarding the use of cloud services.”

Now, as a person who puts money in the bank and (sort of) trusts the healthcare industry, I was rather surprised that finance and healthcare are a very close second and third among the industries with the highest risk of malware and data leaks.

Though enterprises have begun adopting cloud applications to expand their business, employees are bringing many of their own apps into the workplace and onto corporate devices. In 2014, the average number of cloud services used by an enterprise came in at 738, 10 times more than IT typically expects its employees to use.

According to the article, “employees put many kinds of sensitive information into cloud applications that their corporate IT does not support, like Sharefile and Dropbox. And something as simple as logging into Evernote or a photo-sharing app with the same password as the one used for a corporate account can offer an easy avenue for hackers”.

Skyhigh considers cloud applications high-risk when they lack security features like multi-factor authentication and encryption and have grey areas in the user agreements around the rights to use data uploaded to the program. These applications may also have “a discouraging known-compromise history” and permit risky behaviors, such as anonymous use. According to the report, the average company uploaded 86.5 GB to a high-risk service.

According to the author, the report raises an alarm because regulated companies are flush with resources to build an infrastructure that manages risk, yet at the end of the day these verticals find they are not that much better off in terms of risk.

Not that much better? Of all the industries I can think of, finance and healthcare, as far as I am concerned, should be leading the way, not trailing behind.

Organizations are increasingly reluctant to move to the cloud because of security issues, regardless of whether they have their hearts set on Google Apps or Office 365. According to Bitglass, a Microsoft and Google partner, cloud adoption is advancing at a much slower rate than initially predicted. And although Bitglass is seeing Google adopted more frequently, the biggest stumbling block for both mighty foes is the perceived (or real) lack of security for data and apps in the cloud.

Bitglass’ “Cloud Adoption Report” noted that 52 percent of large companies and one-third of small and medium businesses (SMBs) are not moving to the cloud because of security concerns. What’s more, those concerns are not decreasing; they’re increasing. A previous report from October 2011 indicated that 25 percent of businesses expressed some concern over cloud security; that figure increased to 42 percent in July 2013.

“Because larger companies have more-established IT processes, they generally have a higher amount of paranoia with respect to cloud security issues. However, they also have the largest economic gains to be had from moving to cloud,” said Nat Kausik, CEO of Bitglass, in a prepared statement.

But for those who have accepted the cloud as the vision of IT, cloud-based email is the “bellwether” of adoption. Private companies are more likely than public companies to adopt cloud-based email, and among organizations that have adopted cloud email, Google (GOOG) appears to have taken a market lead over major competitor Microsoft (MSFT).

According to Bitglass, 16.5 percent of private companies and 11.9 percent of public companies sampled had chosen Google’s Gmail as their cloud email solution, while the numbers dropped to 7.6 percent of private companies and 8.8 percent of public companies for Microsoft’s cloud email offerings.

“Since public companies are generally larger and older, they are more likely to have history and substantial ties to Microsoft,” Kausik said. “We believe that the lower rate of cloud adoption among public companies is due to additional regulatory and reporting burdens that private companies do not face. Given the compliance and audit capabilities lacking in most cloud apps, we expect third-party security technology will be required to help close this gap.”

So there you have it. Organizations are not jumping on the cloud bandwagon in droves because of security concerns and Google and Office 365 seem to be in a neck and neck race to gain the lead. Which one are you betting on?

A Forrester white paper referred to ‘toxic data’, which I thought was a very powerful choice of words, in reference to deploying DLP technology. With the recent announcement of DLP availability in Office 365, the white paper piqued my interest.

What I thought was interesting is that to protect data, you first must know where users have stored it; not too much of a brain teaser. With toxic data, security professionals may not know where the sensitive data is stored, and therefore struggle to deploy the technology that protects the organization. Of course, Forrester recommended strong policies.

The key nugget, I thought, was developing a life-cycle approach that continuously discovers data as users create it throughout the enterprise. The issue here is how to instill appropriate guardianship in end users so they are cognizant of security violations when they store data on laptops, mobile devices, external drives, and so on.

This is a strong argument for auto-classification tools capable of identifying potentially sensitive or confidential content, regardless of where it resides. Is it 100% foolproof? Of course not, but it can significantly reduce the time needed to find all potential toxic data and, most importantly, it reduces organizational risk.
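To make the idea concrete, here is a minimal sketch of how a rule-based classifier might flag potentially sensitive text. The patterns and labels are purely illustrative assumptions on my part; commercial auto-classification and DLP products use far richer detection (document fingerprinting, checksums, machine learning).

```python
import re

# Illustrative patterns only -- real products detect far more than this.
SENSITIVE_PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),            # e.g. 123-45-6789
    "credit_card": re.compile(r"\b(?:\d{4}[ -]?){3}\d{4}\b"),
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
}

def classify(text: str) -> set[str]:
    """Return the set of sensitive-data labels found in the text."""
    return {label for label, pat in SENSITIVE_PATTERNS.items() if pat.search(text)}

doc = "Patient SSN 123-45-6789, contact jane.doe@example.com"
print(sorted(classify(doc)))  # ['email', 'ssn']
```

Run across file shares, laptops, and cloud repositories, even a crude scanner like this surfaces candidate toxic data for review, which is exactly the continuous-discovery life cycle Forrester describes.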

Do you use DLP technology? If you are using Office 365, or thinking about it, would you consider automatic identification and auto-classification for toxic data? WDYT?

“More than 100,000 international laws and regulations are potentially relevant to Forbes Global 1000 companies—ranging from financial disclosure requirements to standards for data retention and privacy. Additionally, many of these regulations are evolving and often vary or even contradict one another across borders and jurisdictions.”

Lorrie Luellig is of counsel, Ryley Carlock & Applewhite, PC

The above is an eye-popping statement. The implications are not only for IT, who are somehow supposed to help make compliance happen, but also for the business, given the consequences of non-compliance and the associated organizational costs. It’s a no-win situation, even just keeping on top of the most pressing compliance challenges. Unfortunately, compliance isn’t a choice.

One of the key premises of the white paper was that if “internal audit departments are using a disproportionate amount of resources on compliance activities, there could be significant lost opportunities for value-add governance, operational, strategic, and IT audits”.

On the brighter side, the good news is that internal audit departments are speeding up their adoption of technology, specifically GRC and internal audit tools. This is one area where raising the bar on enterprise metadata management, capture, and use can become an important differentiator. With a framework of tools, the internal audit function can address compliance proactively, as a first step rather than a last step.

About a month ago, I had to go to the emergency room at the local hospital. A few weeks later, I received a letter from the hospital informing me that it had mistakenly given all my patient information to another patient, including all my personal information and social security number. The gist of the letter was “oops, we’re sorry”. What angered me was that it was my responsibility to notify all the credit agencies, banks, etc. that I could incur an additional data breach on my accounts because of the hospital’s mistake.

Of course, for a hospital, HIPAA is a really big deal. But what about the impact on other businesses that have nothing to do with healthcare? In the U.S., the federal Health Insurance Portability and Accountability Act (HIPAA) protects Personal Health Information (PHI). As of January 2013, the law was expanded to include ‘business associates’: typically insurance companies and other organizations that often deal with patient records.

A subtle but important change is that any company that creates, receives, maintains, or transmits PHI, which is the majority of companies, must also comply. Enforcement is also being stepped up. The latest Omnibus Final Rule, which updates HIPAA and the Health Information Technology for Economic and Clinical Health (HITECH) Act, expanded their regulatory scope and added more random audits as well as stiffer penalties, up to $1.5 million for egregious violations. The Omnibus rule also covers Personally Identifiable Information (PII), which directly impacts all businesses.

Ensuring compliance and protection of PHI or PII covers a vast array of information and sources: faxes, emails, scanned content, and recorded conversations are all legally protected. In the U.S., the National Institute of Standards and Technology classifies as PII what might seem to be innocuous information, such as a home address, which appears harmless compared to a national identification number.

Will most organizations outside of healthcare ever get audited? Most likely not. Although, as we have seen with Target, the cost of a data breach can be enormous: $148 million in that case. Why is the theft of personal information so lucrative? A full identity profile can bring $500 on the black market, where a credit card number or social security number fetches around $1 (Politico).

Along with this theme, I thought I would take a look at the impact of the cloud as a source of data breaches. Netskope created an infographic illustrating how the chances of a data breach rise as cloud usage grows; Netskope calls this the cloud multiplier effect. A report published in June by Netskope and the Ponemon Institute found that as cloud application usage increases, so does the chance of a data breach. They also estimated the economic impact to the organization.

The three major contributors to the cloud multiplier effect are the rate of cloud adoption, the growth of mobile, and the ease and speed of sharing through cloud applications. A shocking finding was that IT underestimates cloud application usage by 90%! In the example used, if your organization has 100 cloud applications and added 25 over a 12-month period, the possibility of a data breach would increase by 75%. Not good news for anyone.

Not to keep picking on IT, but 36% of business-critical applications are in the cloud, and IT isn’t aware of at least half of them. Finally, 30% of business information resides in the cloud, and IT has visibility into less than a third of it.

The costs?

- Loss or theft of 100K customer records: $20.1M potential loss; 11.8% chance of it happening in the next year; adjusted economic impact of $2.37M.
- Theft of high-value information: $11.8M potential loss; 25.4% chance of it happening in the next year; adjusted economic impact of $2.99M.
- The probability of a data breach increases by 124% if the number of employee-owned devices with access to cloud services increases by 50% over a 12-month period.
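The “adjusted economic impact” figures above are simply the potential losses weighted by their probability of occurring, which is easy to check with a quick sketch (the figures come from the infographic; the function name is mine):

```python
# Adjusted (expected) economic impact = potential loss x probability of
# the breach scenario occurring in the period.

def adjusted_impact(potential_loss_m: float, probability: float) -> float:
    """Probability-weighted loss, in millions of dollars."""
    return potential_loss_m * probability

# The two scenarios from the infographic:
records = adjusted_impact(20.1, 0.118)     # loss/theft of 100K customer records
high_value = adjusted_impact(11.8, 0.254)  # theft of high-value information

print(round(records, 2))     # 2.37 -> matches the $2.37M figure
print(round(high_value, 2))  # the report truncates this to $2.99M
```

In other words, these are standard expected-loss calculations: a modest probability multiplied by a large potential loss still leaves a multi-million-dollar exposure.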

The infographic is very interesting. A few of the statistics suggest that cloud management should be a concern for any organization that uses the cloud. Given the assumption (fact?) that IT isn’t fully aware of the organization’s use of the cloud, tools and technologies are clearly needed to help manage the environment. The financial impact to the organization isn’t pocket change.