Blog

As “the new oil,” data is an invaluable commodity and a source of power in the modern world. But merely storing massive amounts of it doesn’t make an organization data-driven! For the true value of data to be continuously unlocked, it must be curated across the enterprise to derive meaningful, actionable insights.

Large companies worldwide collect massive amounts of data in various formats. On earlier data storage platforms, data first had to be formatted before it could be loaded into the system. This limits the types of information each platform can store and leads to disparate, disjointed systems. Data lakes, on the other hand, allow raw data to be stored regardless of format and to remain unstructured until needed, so the source data is very ...
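The “remain unstructured until needed” idea is often called schema-on-read. Here is a minimal sketch of the contrast: the lake accepts raw records of mixed formats as-is, and a schema is applied only when the data is read. The record formats and the `read_as_accounts` helper are hypothetical, purely for illustration.

```python
import json

# Hypothetical "lake": raw records of mixed formats, stored as-is.
# A schema-on-write system would reject anything that didn't already
# match a predefined table layout at load time.
raw_lake = [
    '{"name": "Acme", "revenue": 1200}',   # a JSON document
    "Globex,900",                          # a bare CSV line
]

def read_as_accounts(lake):
    """Apply a (name, revenue) schema only at read time: schema-on-read."""
    for record in lake:
        if record.lstrip().startswith("{"):
            doc = json.loads(record)
            yield doc["name"], int(doc["revenue"])
        else:
            name, revenue = record.split(",")
            yield name, int(revenue)

print(list(read_as_accounts(raw_lake)))
# → [('Acme', 1200), ('Globex', 900)]
```

The point of the sketch: nothing was lost or normalized away at write time, so a different reader tomorrow could impose a different schema on the same raw records.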

The GDPR is an overhaul of country-specific rules and regulations that puts personal information back into the hands of the individual. It defines personal data as any information relating to an identified individual, or any information that can identify an individual directly or indirectly.

The regulation was adopted in April 2016 but provides a grace period on compliance that ends on May 25, 2018.

The GDPR is not only a law but also a useful guide to managing and governing data, because it is fundamentally about good data hygiene: knowing what data has been collected, how that data is used, and how to use it the right way.
Another benefit of implementing the GDPR is that companies now need to follow only one set of privacy rules rather than the myriad of different rules imposed by individual countries in the European Union (EU).

The combination of LumenData’s extensive expertise in Enterprise Information Management and Reltio’s Modern Data Management approach provides clients access to clean and enhanced customer and account data for future strategic initiatives. LumenData’s strategic alliance with Reltio can help simplify the process to create an enterprise-wide data strategy and to provide on-going visibility into data management in the cloud, on-premises, or through a hybrid of both.

Reltio sat down with Tarun Batra, CEO of LumenData, to learn more about LumenData’s data mastering solutions.

Planning for the future with data management in mind means you’ll have to deal with the realities of the Internet of Things sooner rather than later. The IoT is likely to become the single biggest source of Big Data, and keeping track of it all will be a major challenge in the years ahead.

If you want to know more about the IoT, the editorial team over at Cloudwards.net has put together an article about what the Internet of Things is. The article discusses what’s possible with the IoT, how it works, as well as its security and privacy implications. We recommend interested people give it a read and think about all the changes in data governance ahead of us.

There is a lot of truth and wisdom in the quote, “The perfect is the enemy of the good.”

The quote definitely applies when trying to get the most out of data. However, in this context, it should really be “Perfect is the enemy of budget and time.”

Pragmatic data governance protects your budget and timelines against perfectionism. The end goal is, of course, to have all issues resolved, but being able to start governing the data before the end is vital. To do this, data governance has to be built around the concept of managing “imperfect” data and incrementally improving that data.
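One way to picture “managing imperfect data and incrementally improving it” is to score records against governance rules instead of blocking them until every rule passes. This is a hypothetical sketch, not a prescribed implementation; the rules and the `quality_score` helper are invented for illustration.

```python
def quality_score(record, rules):
    """Fraction of governance rules a record currently passes."""
    passed = sum(1 for rule in rules if rule(record))
    return passed / len(rules)

# Illustrative rules for a customer record.
rules = [
    lambda r: bool(r.get("email")),         # an email is present
    lambda r: "@" in r.get("email", ""),    # the email looks plausible
    lambda r: bool(r.get("country")),       # a country is filled in
]

record = {"email": "jane@example.com"}      # imperfect: no country yet
print(quality_score(record, rules))         # 2 of 3 rules pass: usable, not perfect

record["country"] = "DE"                    # an incremental improvement
print(quality_score(record, rules))         # now 1.0
```

The record was available for use the whole time; governance raised its score rather than gating its release.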

A key failure of data governance in almost every organization is the desire to reach full agreement before making data available. A far more practical and useful ...

You may find that the higher up in the organization you go, the greater the support you’ll receive for data governance in relation to big data and big data success, but most of the work to manage data is not done at the top of the organization.

Success or failure will actually be determined by how willing the people who deal with the data daily are to adopt and take part in data management. The better defined the value proposition and implementation process are, the greater the level of success your organization will enjoy.

Data management and governance is an evolutionary process, so there is no final process or last step. To continue to maintain the standards that you have ...

Whether your big data management plan for success contains two steps or twenty, the starting point for most successful companies is data governance. Typically, data governance tends to focus only on inbound data feeds with little regard for the metadata, but if attention is not paid to determining the true purpose of the data, to applying and following business rules, and to managing change, downstream applications will most likely continue to receive lousy data.

Data governance for big data can become complicated, so maintaining practicality is critical. Without a pragmatic governance approach, the value of data is easily undermined. Attaining a high degree of data accuracy is not always necessary because the art of pragmatic data governance is to ...

According to recent TDWI research, 89% of organizations still consider big data to be more of an opportunity. Only 11% of organizations view big data as a problem, because few of these “opportunities” yield outcomes that generate revenues higher than the cost of getting the “insights” from big data. Unfortunately, what many organizations don’t realize is that big data equates to big problems for organizations without an effective data management plan in place to handle it.

Many organizations still struggle with “small” data, so for them, more data does not necessarily mean better data. To make data management even more off-putting, organizations that do recognize big data as an adversary characterize it as something new: “Big Data Management” (BDM)—yet another ...

We all want a “zero wait” infrastructure, and this has spurred many organizations to push all data through a real-time infrastructure. It’s important to recognize that “zero wait” means the information is in ready form when a user needs it. So if the user needs information that includes averages, sums, and/or comparisons, there is a natural need for a data set that has been fully processed (e.g., cleaned, combined, augmented, etc.). Building the data infrastructure with this in mind is very important.
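“In ready form” can be sketched concretely: a periodic batch step materializes the averages, sums, and comparisons ahead of time, so the read path returns precomputed answers with zero wait. Everything here (the `orders` events, `refresh_summary`, `get_dashboard`) is a hypothetical example, not a reference architecture.

```python
# Raw events, e.g. landed by an ingest step.
orders = [120.0, 80.0, 200.0]

def refresh_summary(events):
    """Batch step: clean/combine/augment, then materialize aggregates."""
    return {
        "count": len(events),
        "sum": sum(events),
        "avg": sum(events) / len(events) if events else 0.0,
    }

# Runs on a schedule (or on each batch), not per user request.
summary = refresh_summary(orders)

def get_dashboard():
    """Read path: zero wait, because the answers already exist."""
    return summary

print(get_dashboard())  # e.g. {'count': 3, 'sum': 400.0, 'avg': 133.33...}
```

The design choice is the point: the user-facing call does no scanning or arithmetic, so “zero wait” is achieved by moving the processing ahead of the request, not by making every hop real-time.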

The popular point of view is that real-time processing is the “modern” solution and that batch processing is the “archaic” way. However, real-time processing has also been around for a long time, and each mode ...