This week on NVTC’s blog, the Virginia Commonwealth University School of Engineering shares research on Big Data footprints that the Electrical and Computer Engineering Department is working on with the Huazhong University of Science and Technology.

Xubin He, Ph.D., professor and graduate program director of the Virginia Commonwealth University School of Engineering’s Electrical and Computer Engineering Department, is working with Huazhong University of Science and Technology (HUST) to establish an international research institute focused on creating design techniques to improve data reliability and performance. Coordination efforts are currently underway to create rotation periods for students from VCU and HUST to conduct research within each university’s state-of-the-art laboratories.

“This next step in our partnership with VCU helps both universities attract more high-quality research students, while enhancing the breadth and depth of our research,” said Dan Feng, Ph.D., dean of the School of Computer Science and Technology at HUST. Feng also serves as director of the Data Storage and Application lab at HUST.

Managing big data

Data storage is a booming industry, with lots of opportunities. Just a decade ago, computational speed dominated research efforts and water cooler conversations. According to He, data is more important now. “Data empowers decision-making and drives business progress. No one can tolerate data loss, whether that data represents favorite photos or industry trends and analytics,” added He. And yet, trying to increase data capacity or replace obsolete data systems can shut down vital data centers for days.

Research teams from both universities find creative solutions to global data pain points. For example, these collaborative research teams reduced overhead costs associated with data failures by up to 30 percent. Their algorithms allow businesses to encode data that can be easily retrieved, instead of having to rely on costly data copies or redundant data centers.
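The encoding idea can be sketched with the simplest erasure code, XOR parity. This is an illustrative toy, not the VCU/HUST teams’ published algorithms: one parity block lets a system rebuild any single lost data block without keeping a full second copy of everything.

```python
# Toy XOR-parity erasure code: store k data blocks plus one parity block;
# any single lost data block can be rebuilt from the survivors.

def make_parity(blocks):
    """XOR all equal-size data blocks together into one parity block."""
    parity = bytearray(len(blocks[0]))
    for block in blocks:
        for i, byte in enumerate(block):
            parity[i] ^= byte
    return bytes(parity)

def recover(surviving_blocks, parity):
    """Rebuild the single missing block from the survivors plus parity."""
    return make_parity(list(surviving_blocks) + [parity])

data = [b"apple", b"grape", b"melon"]   # three equal-size data blocks
parity = make_parity(data)

# Simulate losing the second block and rebuilding it.
rebuilt = recover([data[0], data[2]], parity)
assert rebuilt == b"grape"
```

Production systems use stronger codes such as Reed-Solomon, which tolerate multiple simultaneous failures, but the trade-off is the same: a small amount of encoded redundancy instead of a full redundant copy.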

Currently, in addition to HUST, He’s team also works with top data storage companies such as EMC, which ranks 128th on the Fortune 500 and reported revenues of $24.4 billion in 2014.

The network effect

He has a simple philosophy to gauge the success of university research efforts — he looks at who else is there. “At top data storage and systems events such as USENIX’s File and Storage Technologies conference and USENIX’s Annual Technical Conference, we’re presenting with peers from Harvard, MIT, Princeton and other premier universities we admire,” said He. These conferences typically accept about 30 papers each — less than 20 percent of the global submissions they receive.

“Professor He’s leadership represents one of many efforts to build our international reputation in industry and academia,” said Erdem Topsakal, Ph.D., chair of the Department of Electrical and Computer Engineering. “HUST is ranked 19th on the U.S. News & World Report Best Global Universities for Engineering list. When leading universities like HUST want to work closely with you, you know you’re doing something right.”

For more news from the Virginia Commonwealth University School of Engineering, click here.

LeaseWeb’s Brittany Beman introduces six tips to help you find the perfect cloud partner for your business.

By now, the concept of the cloud is ubiquitous, but for many business leaders the idea still presents more challenges than opportunities. Understanding the complicated technology, not to mention the vast array of delivery models, degrees of services and levels of security available, can be a daunting task for companies under pressure to adapt or adopt.

Support and services — For most businesses, concerns about cost, security, vendor management and technology take the lead in the search for a reliable cloud partner. Surprisingly, the ability of a provider to smoothly and effectively deliver customer support, SLAs and managed services is often minimized or overlooked, at the expense of the customer. When deciding which cloud partner best fits your needs, don’t underestimate the crucial importance of the support and services they make available. It’s the difference between a cloud partnership that takes your business to new levels and one that just adds to your daily hassles.

Architectural alignment — One of the biggest considerations is whether to use a hyper-scale or traditional hosting model. Practically speaking, a hyper-scale provider requires users to be responsible for operational, day-to-day tasks, while hosting providers oversee the day-to-day management of the infrastructure elements. It’s up to you to decide which is a better fit for your technical team and business needs.

Security and compliance — Data centers are a frequent target of malicious attacks, so it’s important to make sure that your cloud provider is prepared for every eventuality. This means everything from physical security and network threat recognition, to regular security audits to updated compliance certifications like HIPAA. Your data is your most valuable asset, so make sure it’s going to be treated that way.

Support for data sovereignty and residency requirements — In tandem with security and compliance issues, data residency is another issue that frequently stalls cloud and hosting projects. The growth of “bring your own device” (BYOD), big data and cloud projects is dragging sensitive data to third-party clouds and data centers. This makes many business owners uneasy, which is why it’s so important to address the location of your data, the laws governing the export of data wherever it’s stored and the security and encryption of that data.

Financial management — Traditional hosting companies typically offer a more basic cost scheme, based upon initial configurations with monthly utilization. This traditional model works well for companies with steady and predictable usage patterns. Hyper-scale cloud services, on the other hand, were built around granular per minute or hourly costs from their inception. Provisioning is primarily self-service and allows users to turn up server, storage and network services. This feature appeals to users who need to spin up environments in near real time and then turn them down when not needed. Consider your requirements to determine which model fits you – or if you want a mix of both.
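The trade-off between the two pricing models comes down to utilization. A back-of-the-envelope sketch, with made-up prices rather than any provider’s actual rates:

```python
# Hypothetical rates: compare a flat monthly hosting fee with hourly
# hyper-scale pricing for the same server, at two usage patterns.

HOSTED_MONTHLY = 300.00   # hypothetical flat monthly hosting fee
CLOUD_HOURLY = 0.50       # hypothetical per-hour hyper-scale rate

def monthly_cloud_cost(hours_running):
    """Cost of running a metered cloud server for the given hours."""
    return CLOUD_HOURLY * hours_running

# A server needed only during business hours (~8 h/day, 22 workdays):
burst_hours = 8 * 22          # 176 hours
# A server running around the clock (30-day month):
steady_hours = 24 * 30        # 720 hours

print(monthly_cloud_cost(burst_hours))   # 88.0  -> metered cloud wins
print(monthly_cloud_cost(steady_hours))  # 360.0 -> flat hosting wins
```

The same arithmetic, run against your own usage profile and real quotes, is often enough to decide between the models, or to justify a mix of both.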

Cultural and strategic alignment — Cultural fit with your service provider is a key point that never receives enough attention in the RFP process. For nearly all enterprises, using a cloud or hosting provider is truly a new venture, one that requires extensive internal buy-in. For first-time cloud buyers, the ongoing degree of partnership is an unknown factor. Each provider engages and on-boards clients differently.

If you’re in the process of picking a cloud partner for your business, remember that no one becomes a cloud infrastructure expert overnight. But with a smart approach, you can make an informed decision that will lead to great results for your company.

Ultimately, remember that you will only achieve the higher-performance and lower-cost environments you are aiming for by choosing the provider that fits your needs and requirements best.

To learn more, visit us here to receive our full white paper on selecting the right cloud partner today.

Costpoint Enterprise Reporting (CER) is the king of BI. Yet alongside its state-of-the-art capabilities come potential roadblocks (employee turnover, varying user experience levels, changing data and accounting needs, and other systems within the enterprise) that can keep companies from fully reaping its benefits. Left unresolved, these challenges can cause inaccuracies, consume months of time and erode leader confidence.

1. Powerful Customization Capabilities

While running CER reports is simple, developing them can be challenging for users unfamiliar with Costpoint’s underlying data structure. To solve this problem, Deltek created standard reports (CER Reports) covering everything from project management to payroll to procurement. These prebuilt templates enable users to quickly generate reports that capture the most commonly used fields across a wide swath of businesses. However, they may not include user-defined fields or metrics that you have implemented for your particular needs.

To capture these data, enterprises will want to build custom reports or modify existing standard reports. Creating or modifying Cognos reports requires a strong working knowledge of both the Cognos tool as well as the structure of the Costpoint database. For example, you can’t simply click a button to tailor a report to your accounts payable process or labor management structure. You will need to understand where the pertinent data elements reside and how to access them using the Cognos toolset. Many intermediate users lack the skills to effectively craft custom reports.

2. Complexity of Government Contracting Accounting Data

Costpoint uses more than 1,800 inter-related data tables that capture a wealth of information about your company. Access to this complex store of data can be of great benefit to your organization, but creating a report that captures the data relevant to your needs challenges many organizations. Many users are unsure about which data to query and how to convey it on a well-designed report. Common questions include:

What data tables do I access?

How do I arrange the data?

Which charts do I use?

What rendering options are best?

Additionally, most organizations have budgets and forecasts and want to integrate this data with actual results within their reports. You may have used another system to create your budgets, such as TM1, Adaptive or even Excel. Accordingly, integrating this data into reports generally means pulling data from systems outside Costpoint, further compounding complexity.
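As a toy illustration of that integration step (all account names and figures below are invented), the report layer has to join actuals from one system with a budget exported from another, say a CSV such as Excel or TM1 might produce, on a shared key:

```python
# Join Costpoint-style actuals with a budget from a CSV export and
# compute the budget-vs-actual variance a combined report needs.
# Account names and amounts are invented for illustration.

import csv
import io

actuals = {"6001-Labor": 125000.0, "6002-Travel": 8400.0}

budget_csv = io.StringIO(
    "account,budget\n"
    "6001-Labor,120000\n"
    "6002-Travel,10000\n"
)
budget = {row["account"]: float(row["budget"])
          for row in csv.DictReader(budget_csv)}

# Variance per account: positive means actuals exceed budget.
variance = {acct: actuals[acct] - budget.get(acct, 0.0) for acct in actuals}
print(variance)  # {'6001-Labor': 5000.0, '6002-Travel': -1600.0}
```

Even in this tiny form, the sketch shows why the work compounds: every external source adds another key to reconcile and another format to parse before the report can be trusted.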

3. Robust Security

Increasingly, data breaches are affecting companies in all sectors. Breaches tarnish a company’s reputation, expose data and jeopardize audit compliance. Fortunately, Cognos and CER deliver robust and highly configurable security controls. The hundreds of available settings, however, can stymie many organizations.

Novice and even intermediate users may not realize that out-of-the-box Cognos installations may not incorporate security settings optimal for their company’s situation. Organizations could inadvertently expose proprietary data and confidential employee information. For instance, you might assume that granting access to the projects package would only enable users to see project-related data, but close examination reveals that confidential employee information may be included if it is not properly secured by appropriate user-specific security profiles. The default security settings may not be sufficient to provide the degree of security required in rapidly changing environments.
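The effect of a user-specific security profile can be illustrated generically. This sketch is not Cognos’s actual security model; it only shows why the same query must return different rows, and different columns, per profile:

```python
# Generic row- and column-level filtering: a user's profile controls
# which projects they can see and whether salary fields are stripped.
# All names, projects and figures are invented.

rows = [
    {"project": "P100", "employee": "Ann",  "salary": 95000},
    {"project": "P100", "employee": "Bob",  "salary": 88000},
    {"project": "P200", "employee": "Cara", "salary": 91000},
]

profiles = {
    "pm_jane": {"projects": {"P100"}, "see_salary": False},
    "hr_omar": {"projects": {"P100", "P200"}, "see_salary": True},
}

def secure_query(user):
    """Return only the rows and columns this user's profile permits."""
    prof = profiles[user]
    out = []
    for row in rows:
        if row["project"] not in prof["projects"]:
            continue                  # row-level filter
        row = dict(row)               # copy before mutating
        if not prof["see_salary"]:
            row.pop("salary")         # strip confidential columns
        out.append(row)
    return out

print(secure_query("pm_jane"))   # 2 rows, no salary field
print(len(secure_query("hr_omar")))  # 3
```

The point of the sketch is the failure mode in the paragraph above: if a profile forgets the column rule, the project manager’s “project data” query silently carries confidential salary data along with it.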

4. Analytics Development and Optimization

Unless you have a dedicated analytics staff with the required expertise, individuals developing your reports may not be effective. Why? Developing reports and maximizing efficiency requires experience with both Costpoint and Cognos.

Often, employees tasked with business intelligence reporting have other duties, and CER management is a part-time responsibility. They may have deep functional knowledge but minimal understanding of Costpoint data structures and Cognos query and reporting requirements.

If you are caught in this situation, the results can be challenging. Inexperienced users can take 10 to 20 times longer than an expert to develop a report. Reports arrive late, compressing the time available for meaningful analysis and diverting users from other business-critical duties that may better align with their hired skill sets.

5. Meaningful Results

Inexperience can also cause inaccuracy in reported results. Novice users may query the wrong data or omit data that would dramatically improve a report’s usefulness. Such mistakes could put contracts at risk or result in poor or ill-informed business decisions. You want your reports to carry the most meaningful data possible.

Imagine you’ve tasked your logistics manager with developing an incurred cost submission report. That individual skillfully maintains your supply chain. But he or she doesn’t use Costpoint every day and may not understand which direct and indirect cost tables to query. Your report might miss critical costs or include unallowable items.

Errors like these erode leader confidence. Just one inaccurate report, and senior managers may mistrust all your data outputs. You’ve damaged your reputation and possibly jeopardized your contracts simply because you didn’t fully understand the data structures and how to best capture the data that conveys the most meaning.

Signs You Need Help

How do you know if these challenges are hindering your data analytics? Talk to your users and business managers. If you hear the following, you are probably underutilizing the power of CER or you might need a CER tune-up.

Inexperienced users waste time collecting the wrong data and not enough time analyzing results.

Inaccuracies cause executives to doubt the validity of all your reports.

Poor security can expose proprietary data and compromise audit results.

Depending upon the number of users, solving these challenges could save your company months of labor and lead to better, more timely business decisions.

If you owned a Tesla, you would make sure you were trained in how to utilize it fully. At a minimum, you would take it to experts to make sure it was operating efficiently and effectively.

What You Need to Know:

To fully benefit from Deltek CER, companies should routinely assess their CER configurations, processes and output. A CER tune-up is easy and should be standard operating procedure within every Deltek organization. Maximizing the power of CER allows companies to reap substantial benefits, overcome these obstacles and succeed.


The greatest meaning of “big” in Big Data is the role of data in the digital economy. The question who owns the data is big too. With IoT and cloud, data ownership will matter soon even to those who don’t care now. Svetlana Sicular of the Gartner Blog Network explores this issue in this week’s blog post.


And there is no universal answer — data ownership is culture-specific. In some cases, nobody wants to own the data; in other cases, everybody wants to grab a piece (“it’s mine!” — although the “owner” didn’t even know the data existed before you asked). With external parties participating, things are even more complicated: for example, one party might learn that it does not have rights to the data it considered its own. To settle ownership (but not alleviate the problem!), some organizations decide that the data belongs to their customers, citizens or third parties, and that the company is only a custodian.

What successful approaches to data ownership have I seen?

The universal first step is establishing a data governance function. I just published a research note on how to do this: EIM 1.0: Setting Up Enterprise Information Management and Governance. You don’t have to call it “data governance.” It could be “data advocacy” or simply a name reflecting the nature of taking care of data. It should resonate with a specific organizational or ecosystem culture.

The next steps are specific to the culture and the nature of the business: figuring out what data is most vital. This narrows data ownership down to the decisions that matter (which will save a lot of grief and many hours).

Subject matter experts step forward to own the data on which they are SMEs (bottom up).

Application business owners are offered ownership of the data, accept it and take it (unexpectedly) seriously, which is fruitful for everyone.

Data operators become de facto data owners (which could be a solution, but could also be a greater problem). Transparency about what is being done with the data and explicit data access rules make it a solution.

When data ownership is hard to resolve at a high level, going more granular and resolving ownership of individual data elements (which is usually more obvious) answers the question.

A business executive assumes data ownership. The worst case is when such ownership belongs to an executive who has control but no idea about data: the executive owns the data but does nothing, because executives are busy doing other things. The best case is when this executive is a sponsor of data-related work.

Ownership is just part of taking care of the data. Look at the root of the issue: who can do what with which data without stepping on each other’s toes, while avoiding trouble with regulations and ensuring you put data to work ethically. Data governance often starts with compliance and ownership, but — unavoidably — it ends up finding value in the data, which is big in the digital economy.

This week on NVTC’s Blog, LMI Senior Consultant Daniel DuBravec notes that we need to prepare for personalized medicine and the evaluation of genomic data.

Today’s electronic health record (EHR) systems cannot properly handle genomic data. Interpreting these huge and complex data, particularly in a visual manner, is challenging. Even when EHR systems can access these data, few standards exist for how to structure them to ensure seamless system integration, interoperability, and interpretation. Most medical schools do not teach doctors how to interpret genetic data, and local-level care centers require training on proper data storage and network security.

Precision medicine predicts, prevents, and treats diseases at the patient level. Its growth has created the need for internationally recognized genomic EHR standards and policies, which would protect individuals by ultimately improving patient outcomes. We need to prepare for a future in which medicine is more personalized and better able to evaluate genomic data.

Real, Inspiring Stories

Recently, I met a colleague whose daughter is suffering from a genetic condition known as Stargardt disease. Sadly, her daughter is rapidly losing her vision. This disease, a form of juvenile macular degeneration, can appear in children only when both parents carry the mutated gene. If the gene had been identified at an early stage, medical practitioners would have had more time to investigate new drug therapies and gene-editing technologies to treat my colleague’s daughter. Physicians at research institutions who were also working on her case could then have viewed this critical information as part of her interoperable medical genetic record and collaborated on her care. Hitting close to home, this is one of many stories that inspire us to prepare for the widespread application of precision medicine and genomic data analysis.

Making Genomic Data Useful for Medical Practitioners

The future of patient care requires connecting large external data sets with electronic healthcare records. Precision medicine will customize treatments down to a patient’s genes and behavior. By analyzing genetic data across thousands of people, scientists will discover preventative treatments and cures for challenging health issues.

Given the complexity of health and genomic data, one can analyze the same data in different ways and achieve different outcomes. “Well-designed data visualization could help doctors interpret the data more rapidly, arriving at more challenging diagnoses in less time,” says Erin Gordon, data visualization trainer and graphic facilitator at LMI.

Before developing a framework for integrating and analyzing disparate health data sets, we test our models for validity. “The quality of our medical data models has a direct impact on patient outcomes and daily operations in medical facilities,” says Brent Auble, a consultant with the Intelligence Programs group at LMI. To support LMI’s research into healthcare data management, our team set up a Hadoop cluster, which is a group of servers designed to quickly analyze massive quantities of structured and unstructured data.
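Conceptually, what a Hadoop cluster parallelizes across many servers is the map/shuffle/reduce pattern. A toy in-process sketch, with invented records rather than anything from LMI’s actual cluster:

```python
# Map/shuffle/reduce on a toy log of unstructured records: count how
# often each genetic variant appears. Hadoop runs the same three phases,
# but spread over many machines and much larger data.

from collections import defaultdict

records = [
    "patient:A variant:rs123",
    "patient:B variant:rs456",
    "patient:C variant:rs123",
]

# Map: emit (key, 1) pairs from each raw record.
mapped = [(tok.split(":")[1], 1)
          for rec in records
          for tok in rec.split()
          if tok.startswith("variant:")]

# Shuffle + reduce: group pairs by key and sum the counts.
counts = defaultdict(int)
for key, n in mapped:
    counts[key] += n

print(dict(counts))  # {'rs123': 2, 'rs456': 1}
```

The appeal of the pattern is that both phases parallelize naturally: map runs independently on each record, and the reduce for each key depends only on that key’s pairs.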

Building the Future of Healthcare Analytics

Ultimately, to meet the growth in precision medicine and the use of health data analytics, future EHR systems need to:

automatically generate comparisons of multiple genomes,

identify and match genetic variants based on known diseases,

ensure patient data privacy, and

integrate and search medical publications and scientific research for relevant patient data.
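The first two requirements can be illustrated with a toy sketch. The variant IDs and disease links below are invented placeholders; real systems draw on curated databases such as ClinVar:

```python
# Compare two genomes' variant sets and flag matches against a small
# (hypothetical) table of known disease associations.

known_disease_variants = {
    "rs76157638": "Stargardt disease (ABCA4)",   # hypothetical mapping
    "rs429358":   "Alzheimer's risk (APOE)",     # hypothetical mapping
}

patient = {"rs76157638", "rs1042522", "rs699"}   # invented variant calls
parent  = {"rs76157638", "rs699"}

# Genome comparison: variants both individuals carry.
shared = patient & parent

# Variant matching: flag any patient variant with a known association.
flags = {v: known_disease_variants[v]
         for v in patient if v in known_disease_variants}

print(sorted(shared))  # ['rs699', 'rs76157638']
print(flags)           # {'rs76157638': 'Stargardt disease (ABCA4)'}
```

Real genomes carry millions of variants per person, which is why the remaining requirements, privacy controls and literature search, have to be designed in alongside the comparison itself.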

Preparation is key in order to predict, prevent, and treat disease as medicine evolves.

Dan DuBravec is a senior consultant at LMI, leading IT implementation projects. Mr. DuBravec holds multiple EHR certifications, as well as a BS in product design from Illinois State University and an MS in educational technology leadership from George Washington University.

This week on NVTC’s Blog, Business Development, Marketing & Sales Vice Chair Jenny Couch of member company Providge Consulting shares critical changes to the IT landscape that your healthcare organization needs to have on its radar.

These days, technology seems to advance too rapidly for most of us to keep up. It’s certainly moving too rapidly for organizations to keep up with every single one of the “hot” trends.

In the noisy field of today’s latest tech, it’s all too easy to get caught up in the buzzwords and lists of “This Year’s Hottest IT Trends”, and miss the truly critical changes to the IT landscape that your organization needs to have on its radar.

The healthcare industry is uniquely positioned to be impacted by a convergence of critical IT trends within the coming years. But with budgets decreasing, and resource pools shrinking, it’s more challenging than ever to prioritize IT needs within the healthcare space.

We’ve highlighted the top five technology trends healthcare organizations must have on their radar in 2016.

Cloud computing. Whether it’s a pharmaceutical company needing to store large amounts of data from clinical trials, or a hospital with a newly implemented EHR system, healthcare organizations of all kinds are increasingly turning to cloud computing for a variety of uses. According to Healthcare Informatics, the global healthcare cloud computing market is expected to reach $9.5 billion by 2020. And 83 percent of healthcare organizations are already leveraging the cloud. Only 6 percent of organizations have no plans to take advantage of the cloud in the coming years. If you’re in that 6 percent, it’s time to reconsider your plans. Cloud computing can be used to decrease costs, improve access, and create a better user experience for any healthcare organization. But, it’s critical that your organization take a strategic approach to moving to the cloud. Learn more about how you can leverage the cloud to best support your organization here.

The Internet of Things. Take a look at that FitBit on your wrist. Think about the incredible amount of data that one tiny device is generating constantly. The number of steps you take, the calories you burn, your sleep pattern, the stairs you climbed. These devices get more accurate and more intricate with every passing day. We are not far off from a future when we’ll be able to monitor nearly every aspect of our health, and the health of our loved ones, without setting foot in a doctor’s office. Healthcare organizations will have to find a way to address what will be a tectonic shift in how care is delivered. Communication methods will need to be established to collect the data generated by wearable and mobile devices. Methods for collecting and analyzing the influx of data will need to be developed so patterns can be identified. The manner in which treatment is delivered will have to change as we move away from traditional doctor’s office visits and into a world where a diagnosis can be made by analyzing the information generated through a patient’s mobile device, car, appliances, wearables, etc. And while this future may not quite be a reality, it’s coming soon, and healthcare organizations need to start preparing today.

Data Explosion. Big data. Data analytics. Whatever term you use, the unparalleled rise in the amount and accessibility of data over the past few years is certain to have a massive impact on the healthcare industry. The explosion in big data occurred so quickly that 41 percent of healthcare executives say their data volume has increased by 50 percent or more from just one year ago. Fifty percent in just one year. This incredible increase in data will allow medical professionals to diagnose patients more quickly and more accurately, but as with the Internet of Things, it will require fundamental shifts in how data is managed and how care is administered. Healthcare organizations will need to train or hire a workforce with the right data analysis and medical skill sets. Regulations, processes and platforms will need to be developed or implemented. Healthcare organizations that ignore this trend do so at their own peril. As Accenture notes in a report released earlier this year, for those who take advantage of the wealth of opportunity within big data, “Greater operational excellence and improved clinical outcomes await those who grasp the upside potential.”

Efficiency in IT. If you haven’t heard the phrase “doing more with less” in the past few months, it’s probably time to climb out from under that rock you’ve been living under. With healthcare spending wildly out of control in the United States, every healthcare organization, from physicians’ offices to the largest hospital chains, is being asked to do more with less. IT is a particularly ripe area for cutting costs and resources. In 2016, the emphasis on doing more with less in IT will continue. Expect to see IT departments pursue options such as moving to the cloud, outsourced managed services and bring-your-own-device policies to help decrease IT operating costs.

Cybersecurity. In 2014, 42 percent of all serious data breaches occurred at healthcare organizations. Sadly, this trend is certain to continue its upward trajectory in the coming years. Healthcare organizations that have not adequately upgraded their systems and developed a thorough cybersecurity strategy are especially vulnerable to attack. Now is the time to evaluate your systems, processes and resourcing. Make sure your organization is positioned to proactively protect against attacks where possible, and to identify and respond rapidly to breaches when they do occur.

Planning your 2016 health IT projects and priorities? Looking for a partner that will truly understand the challenges you are facing and the need to ensure success? Get in touch with us today. Our experienced health IT experts know the obstacles you face, and are ready to partner with you to deliver your projects on time, and on budget in 2016 and beyond.

This post was written by Jenny Couch. Couch is a project management consultant, and Providge’s Business Development Manager. She loves efficiency, to-do lists, and delivering projects on-time and on-budget.

This week on NVTC’s blog, Kathy Stershic of member company Dialog Communications shares her final thoughts on her Brand Reputation in the Era of Data series.

Over the past few weeks, I’ve outlined 8 Principles that will help marketers protect and strengthen their brands in an era of radical change, where there is great temptation (and quite likely management pressure) to push boundaries further than ever before. Throughout this time and many preceding months, I’ve had countless conversations with people about the state of their data as well as the modern conveniences upon which they’ve come to rely. I’ve heard a Big Data expert actively advocating for stretching the law (or hinting at crossing the line) for the sake of competitive advantage. I’m sure he is not alone in that opinion. We are, all of us, currently in the Wild West.

While technology is accelerating what’s possible, the ideas outlined in the 8 Principles come back to common fundamental and timeless human needs that will outlast every wave of technology: People protecting what’s theirs, seeking respect and dignity, wanting control of their lives, enjoying freedom and avoiding harm. The brands they will choose for anything more than a one-time experience will be those who understand those concerns, and actively work to enable them.

There is more to brand reputation than being the app of the moment. Not every new thing will be transformational. But businesses that innovate, truly respect their customers and actively work to earn trust stand a far greater chance of longevity than those that rely on buzz about the shiny new object, or that exploit customers to maximum advantage thinking the ‘sheeple’ won’t notice. It will take work. It will take awareness. It will take intention. It will take courage. And it will take leadership.

Eventually today’s Wild West will give way to a more mature market dynamic. Embracing these 8 Principles may help ensure your company is there when that time comes – or even leading the way.

This week on NVTC’s blog, Kathy Stershic of member company Dialog Communications continues her Brand Reputation in the Era of Data series by sharing principle eight: actively demonstrating respect for your customers.

The final of these 8 Principles clarifies a concept implied across the other seven. To become and remain a successful brand, businesses must actively demonstrate customer respect. Just saying ‘We respect our customers!’ is not enough. Prove it.

This can take many forms, from being transparent and honest about data collection and sharing practices to moderating your outreach below the annoyance level to integrating this attitude into your culture and policies – and many other opportunities mentioned through these posts.

Disrespectful practices came up often in the comments I received. One respondent noted, “I want to feel like a vendor respects my data as much as I do.” People do not like bait-and-switch, confusing changes to privacy policies or anything that feels sneaky. They don’t like bearing the burden of responsibility to stop something, like too much email or too many pop-ups. When everyone is tired and busy with their own lives, wearing people down or hoping they won’t notice might produce a short-term win, but not long-term loyalty.

Having a straightforward dialog with your customers – even the ones who are unhappy with you – is another way to show respect. Everyone messes up – own it! Apologize, make it right and move on. If it wasn’t your fault, but there’s a small cost to making someone feel respected anyway – do it! Nordstrom figured this out a long time ago.

Nothing about customers wanting to feel respected and treated fairly is new. What is new is the exponential increase in vendor relationships enabled through technology. With the tremendous choice the modern customer enjoys, utility, benefit, quality and value are now table stakes. A differentiated and trusted experience, that includes feeling respected, is what will stand out. Someone’s choice of your product or service is a privilege. One of the best quotes from the respondent feedback sums it up: “Respect the customer and the customer will respect you.”

This week on NVTC’s blog, NVTC member company Kathy Stershic of Dialog Communications continues her Brand Reputation in the Era of Data series by sharing principle seven: applying technology thoughtfully while preserving customer data.

Recently, Chapman University published the results of its survey America’s Top Fears 2015. Respondents were asked their fear level about different factors ranging from crime to disasters to their personal futures. FIVE OF THE TOP TEN THINGS PEOPLE FEAR ARE RELATED TO MIS-USE OF THEIR DATA! That includes cyber-terrorism, corporate tracking of personal information, government tracking of personal information, identity theft and credit card fraud. That’s out of 88 possible things to be afraid of!

There is a tidal wave of automation being applied to data collection and usage practices. I suggest that just because you can do something doesn't mean you should. We are approaching a tipping point around the creep factor of having everything one does tracked. People are tired of constant advertisements, as witnessed by the growing adoption of ad-blocking technology – especially the robust content blocking for Safari in Apple's recent iOS 9 release, which has been heralded as the potential death of online advertising. As ads get blocked, marketers will need to find other ways to get their message through, such as direct contact with mobile devices. That will require permission from each user, which means delivering a lot of value while also showing some restraint in the level and frequency of contact.

Another interesting wrinkle is the October 6 ruling by the EU Court of Justice striking down Safe Harbor, a policy that allowed U.S. companies to self-certify that their data protection standards were sufficient for EU citizens, who are protected by strict privacy law. Israel followed suit on Oct. 20. What happens next is yet to be determined, but everyone is scrambling to figure out how to protect their international business before the end-of-January grace period expires.

When practices get abused, people fight back or tune out. It's human nature. While e-chatting during a webinar this week with its moderator, big data expert Chris Surdak (whose take on unbridled capitalism was more extreme than any I have ever heard), I raised the subject of privacy; he noted that "The backlash will be epic, if we ever get there." Hmmm. A thoughtful approach to what you collect, how you collect and use it, how long you keep it, with whom you share it and what they do with it will better serve and protect your business and your brand through changes in customer sentiment and the regulatory environment.

This week on NVTC’s blog, NVTC member company Kathy Stershic of Dialog Communications continues her Brand Reputation in the Era of Data series by sharing principle six: comply with all applicable laws and regulations – then exceed them.

There are a LOT of laws and regulations out there governing data handling and privacy, and they vary according to where you conduct business. The European Union has the strictest set of laws, built on the principle of human rights. The United States takes what's called a sectoral approach – that is, different laws for different sectors, like HIPAA for healthcare, Gramm-Leach-Bliley for finance, the Cable TV Privacy Act, the Electronic Communications Privacy Act and so on. In the U.S., 47 of 50 states also currently have data breach notification laws, all of them slightly different. Many Asian countries have both data protection laws and sectoral laws, and many Latin American countries have constitutional guarantees, data protection laws and sectoral laws. Yikes! It's a lot to comply with – and just to keep things fun, laws and regulations are changing and updating all the time.

Realistically, marketers are not going to know every legal requirement that impacts their organization. But you should at least be aware of the basic principles of what’s allowed in the places you do business, then coordinate with Legal (I know, I know!) on how to stay out of trouble. This discovery can also happen through a process called a Privacy Impact Assessment, mentioned in my previous post.

Observing laws and regulations must be standard operating procedure. But merely being compliant isn't enough to enhance your position in a fickle and frenetic market. Think about it this way: do you want your child to just stay out of trouble at school, or to be a leader in the classroom? Where is the attention going to go? You sure don't want to stand out in a bad way – like being one of the 256 app providers that violated the privacy terms of their contracts with Apple.

Going beyond the legal minimum and making the extra effort will help your business differentiate itself as a trusted source. Simplified privacy policy language will help. Minimizing data collection and retention (yes, you CAN get rid of stuff!) will help. So will being transparent at all times about your practices and behaviors. Use creative ways to tell the story to your customers and stakeholders – through vignettes, through messaging, through customer service scripts – put it out there. Earning a trust mark like the TRUSTe seal sends the message that you take data stewardship seriously.

Your customers expect you to comply with the law. They want to feel like you care and are proactive about protecting their data. I firmly believe that the great majority of people want to do the right thing; it comes back to mindfulness and balance between enthusiastic pursuit of business objectives and a bit of thoughtful restraint.