MIKE’s Integrated Content Repository brings together the open assets from the MIKE2.0 Methodology, shared assets available on the internet and internally held assets. The Integrated Content Repository is a virtual hub of assets that can be used by an Information Management community, some of which are publicly available and some of which are held internally.

Any organisation can follow the same approach and integrate their internally held assets with the open standard provided by MIKE2.0 in order to:

- Build community
- Create a common standard for Information Development
- Share leading intellectual property
- Promote a comprehensive and compelling set of offerings
- Collaborate with the business units to integrate messaging and coordinate sales activities
- Reduce costs through reuse and improve quality through known assets

The Integrated Content Repository is a true Enterprise 2.0 solution: it makes use of the collaborative, user-driven content built using Web 2.0 techniques and technologies on the MIKE2.0 site and incorporates it internally into the enterprise. The approach followed to build this repository is referred to as a mashup.

Feel free to try it out when you have a moment; we’re always open to new content ideas.

Did You Know?
All content on MIKE2.0 and any contributions you make are published under the Creative Commons license. This allows you free re-use of our content as long as you add a brief reference back to us.

One of my favorite books is SuperFreakonomics by economist Steven Levitt and journalist Stephen Dubner, in which, as with their first book and podcast, they challenge conventional thinking on a variety of topics, often revealing counterintuitive insights about how the world works.

One of the many examples from the book is their analysis of the Endangered Species Act (ESA) passed by the United States in 1973 with the intention to protect critically imperiled species from extinction.

Levitt and Dubner argued the ESA could, in fact, be endangering more species than it protects. After a species is designated as endangered, the next step is to designate the geographic areas considered critical habitats for that species. After an initial set of boundaries is made, public hearings are held, allowing time for developers, environmentalists, and others to have their say. The process to finalize the critical habitats can take months or even years. This lag time creates a strong incentive for landowners within the initial geographic boundaries to act before their property is declared a critical habitat or out of concern that it could attract endangered species. Trees are cut down to make their land less hospitable or development projects are fast-tracked before ESA regulation would prevent them. This often has the unintended consequence of hastening the destruction of more critical habitats and expediting the extinction of more endangered species.

This made me wonder whether data governance could be endangering more data than it protects.

After a newly launched data governance program designates the data that must be governed, the next step is to define the policies and procedures that will have to be implemented. A series of meetings are held, allowing time for stakeholders across the organization to have their say. The process to finalize the policies and procedures can take weeks or even months. This lag time provides an opportunity for developing ways to work around data governance processes once they are in place, or ways to simply not report issues. Either way this can create the facade that data is governed when, in fact, it remains endangered.

Just as it’s easy to make the argument that endangered species should be saved, it’s easy to make the argument that data should be governed. Success is a more difficult argument. While the ESA has listed over 2,000 endangered species, only 28 have been delisted due to recovery. That’s a success rate of only one percent. While the success rate of data governance is hopefully higher, as Loraine Lawson recently blogged, a lot of people don’t know if their data governance program is on the right track or not. And that fact in itself might be endangering data more than not governing data at all.

Stop reading now if your organisation is easier to navigate today than it was 3, 5 or 10 years ago. The reality that most of us face is that the general ledger that might have cost $100,000 to implement twenty or so years ago will now cost $1 million or even $10 million. Just as importantly, it is getting harder to implement new products, services or systems.

The cause of this unsustainable business malaise is the complexity of the technology we have chosen to implement.

For the general ledger it is the myriad of interfaces. For financial services products it is the number of systems that need to keep a record of every aspect of business activity. For telecommunications it is bringing together the OSS and BSS layers of the enterprise. Every function and industry has its own good reasons for the added complexity.

However good the reasons, the result is that it is generally easier to innovate in a small nimble enterprise, even a start-up, than in the big corporates that are the powerhouse of our economies.

While so much of the technology platform creates efficiencies, often enormous and essential to the productivity of the enterprise, it generally doesn’t support or even permit rapid change. It is really hard to design the capacity to change into the systems that support the organisation. The more complex an environment becomes the harder it is to implement change.

Enterprise architecture

Most organisations recognise the impact of complexity and try to reduce it by implementing an enterprise architecture in one form or another. Supporting the architecture is a set of principles which, if implemented in full, will support consistency and dramatically reduce the cost of change. Despite the best will in the world, few businesses or governments succeed in realising their lofty architectural principles.

The reason is that, while architecture is seen as the solution, it is too hard to implement. Most IT organisations run their business through a book of projects. Each project signs up to an architecture but quickly implements compromises as challenges arise.

It’s no wonder that architects are perhaps the most frustrated of IT professionals. At the start of each project they get wide commitment to the principles they espouse. As deadlines loom, and the scope evolves, project teams make compromises. While each compromise may appear justified they have the cumulative effect of making the organisation more rather than less complex.

Complexity has a cost. If this cost is fully appreciated, the smart organisation can see the value in investing in simplification.

Measuring simplicity

While architects have a clear vision of what “simple” looks like, they often have a hard time putting a measure against it. It is this lack of a measure that makes the economics of technology complexity hard to manage.

Increasingly though, technologists are realising that it is in the fragmentation of data across the enterprise that real complexity lies. Even when there are many interacting components, if there is a simple relationship between core information concepts then the architecture is generally simple to manage.

In summary, the measure looks at how many steps are required to bring together key concepts such as customer, product and staff. The more fragmented information is, the more difficult any business change or product implementation becomes.

Consider the general ledger discussed earlier. In its first implementation in the twentieth century, each key concept associated with the chart of accounts would have been managed in a master list. Implement the same functionality today and there would be literally hundreds, if not thousands, of points where various parts of the chart of accounts are required to index interfaces to subsidiary systems across the enterprise.
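The "steps to bring concepts together" measure can be sketched as a shortest-path count over a map of systems and their interfaces. The systems and links below are purely hypothetical, invented to illustrate the idea:

```python
from collections import deque

# Hypothetical enterprise: each system lists the systems it interfaces with.
# More hops between the systems that master two concepts = more fragmentation.
interfaces = {
    "CRM": ["ERP", "DataWarehouse"],
    "ERP": ["CRM", "GeneralLedger"],
    "GeneralLedger": ["ERP"],
    "DataWarehouse": ["CRM", "ProductCatalog"],
    "ProductCatalog": ["DataWarehouse"],
}

def fragmentation_steps(graph, source, target):
    """Breadth-first search: how many interfaces must be crossed to join two systems."""
    seen, queue = {source}, deque([(source, 0)])
    while queue:
        node, hops = queue.popleft()
        if node == target:
            return hops
        for neighbour in graph.get(node, []):
            if neighbour not in seen:
                seen.add(neighbour)
                queue.append((neighbour, hops + 1))
    return None  # the two concepts cannot be joined at all

# Joining customer data (mastered in CRM) with product data
# (mastered in ProductCatalog) crosses two interfaces in this toy model.
print(fragmentation_steps(interfaces, "CRM", "ProductCatalog"))  # → 2
```

A rising hop count over time is one concrete signal that each new project is fragmenting the core concepts further.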

Trading simplicity

One approach to realising these benefits is to have dedicated simplification projects. Unfortunately these are the first projects that get cut if short-term savings are needed.

Alternatively, imagine if every project that adds complexity (a little like adding pollution) needed to offset that complexity with equal and opposite “simplicity credits”. Having quantified complexity, architects are well placed to define whether each new project simplifies the enterprise or adds complexity.

Some projects simply have no choice but to add complexity. For example, a new marketing campaign system might have to add customer attributes. However, if they increase the complexity they should buy simplicity “offsets” a little like carbon credits.
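A minimal sketch of how such a credit scheme might be tracked. The projects, deltas, and the ledger API itself are invented for illustration, not a description of any real tool:

```python
# Illustrative only: every project reports a complexity delta
# (positive = adds complexity) and must hold a non-negative credit
# balance before it can close, buying offsets from simplifiers if needed.
class SimplicityLedger:
    def __init__(self):
        self.balances = {}

    def record(self, project, complexity_delta):
        """Positive delta adds complexity; negative delta earns credits."""
        self.balances[project] = self.balances.get(project, 0) - complexity_delta

    def buy_offsets(self, buyer, seller, credits):
        """A complexity 'polluter' buys spare credits from a simplifier."""
        if self.balances.get(seller, 0) < credits:
            raise ValueError(f"{seller} has no spare credits to sell")
        self.balances[seller] -= credits
        self.balances[buyer] = self.balances.get(buyer, 0) + credits

    def can_close(self, project):
        return self.balances.get(project, 0) >= 0

ledger = SimplicityLedger()
ledger.record("marketing-campaign", +3)    # adds new customer attributes
ledger.record("ledger-consolidation", -5)  # retires several interfaces
ledger.buy_offsets("marketing-campaign", "ledger-consolidation", 3)
print(ledger.can_close("marketing-campaign"))  # → True
```

The interesting design question is how the delta is quantified; the fragmentation measure described earlier is one candidate.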

The implementation of a new general ledger might provide a great opportunity to reduce complexity by bringing various interfaces together, or it could add to it by increasing the sophistication of the chart of accounts.

In some cases, a project may start off simplifying the enterprise by using enterprise workflow or leveraging a third-party cloud solution, only to be forced, in the heat of implementation, to make compromises that make it a net complexity “polluter”.

The CIO has a role to act as the steward of the enterprise and measure this complexity. Project managers should not be allowed to forget their responsibility to leave the organisation cleaner and leaner at the conclusion of their project. They should include the cost of this in their project budget and purchase offsetting credits from others if they cannot deliver within the original scope due to complicating factors.

Those that are most impacted by complexity can pick their priority areas for funding. Early wins will likely reduce support costs and errors in customer service. Far from languishing in the backblocks of the portfolio, project managers will be queuing up to rid the organisation of many of these long-term annoyances to get the cheapest simplicity credits that they can find!

You’ve likely read all about them–the massive security breaches and cyber attacks hitting major corporations like Home Depot, Target, and even the New York Times. These damaging attacks have cost these companies millions of dollars in damages, and they’re just a portion of all the security risk stories out there. As a small business owner, you may be tempted to think your company doesn’t have to worry as much about cyber attackers inflicting damage on your operations. After all, compared to a big business, your company has relatively few resources and doesn’t leave nearly as big of a footprint on the market. That belief, however, could leave you and your business vulnerable. A study from the National Cyber Security Alliance shows that one out of every five businesses becomes a victim of cyber attacks every year, with an even larger portion targeted by hackers. Small businesses need to work to improve their network security, because it’s not a question of if a cyber attack happens but when.

Of course, many small business owners are aware of the security risks and would like nothing more than to invest in the features that would help them repel hackers. The problem is that many of these features require money, time, and other resources, and since many small businesses can only barely make ends meet, security issues tend to fall by the wayside. Luckily, there are still several methods you can employ that will increase your network security and keep your small business safe at no extra cost.

One of the first and foremost measures that will provide added protection for your network is having stronger passwords for all your accounts as well as your employees’ accounts. A strong password is long: usually around eight characters or more. The password should contain capital letters, numbers, and symbols, and should not contain simple words or phrases, no matter how memorable they might be to you. In addition to stronger passwords, small business leaders should also keep tight control over who has administrative access, which can be made easier through existing tools already found on many desktops and laptops (the Local Group Policy Editor in Windows 8 is a good example of this).
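The password rules described above can be sketched as a simple check. The eight-character threshold and the required character classes here are illustrative assumptions, not a formal policy:

```python
import string

# Rough sketch of the strength rules above: minimum length plus
# capital letters, numbers, and symbols. Real policies vary.
def is_strong_password(password: str) -> bool:
    if len(password) < 8:
        return False
    has_upper = any(c in string.ascii_uppercase for c in password)
    has_digit = any(c in string.digits for c in password)
    has_symbol = any(c in string.punctuation for c in password)
    return has_upper and has_digit and has_symbol

print(is_strong_password("password"))       # → False (no capitals, digits, or symbols)
print(is_strong_password("P@ssw0rd-2024"))  # → True
```

Note that a composition check like this is a floor, not a ceiling: length and unpredictability matter more than any particular mix of character classes.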

Much of a small business’s network security will depend on the workers. While employees may do great work, they also represent a weakness in a security system. Employees can be susceptible to spear phishing attacks and social engineering, which may introduce malware into the network and infect other systems. The best way to combat this is to educate your employees. Teach them the common methods hackers use to gain access to a network and demonstrate the methods they can practice to prevent these attacks from happening. Small business owners should also make sure every employee’s mobile device is secure, since mobile technology is an increasingly popular target for cyber attacks.

All small business owners should also identify the points in their network that are the most vulnerable. While this can be done with purchased software, it’s well known that wireless routers are a favorite entry point for persistent hackers. In fact, recent research shows that around 80% of the 25 best-selling wireless routers for small and home offices on Amazon had notable security vulnerabilities. Hackers can easily exploit wireless routers, often to damaging effect. There are a number of ways you can make your wireless router more secure for your small business. First, you should never use the default IP ranges that come with the router since attackers can easily predict certain addresses and use them. Second, make sure you turn on your router’s encryption while turning off WPS. And third, as mentioned earlier, make sure your router’s password is particularly strong. Change it from the default password you’re given and turn it into one that’s near impossible to crack.

As always, there are security products out there available for purchase, each with varying degrees of features, quality, and price. While the options are plentiful, you should always take into consideration a number of different factors. You need to ensure you have the staff that is trained to fully utilize the software. You also should make sure the program can be configured quickly and easily to suit your needs. Also, try to get software that will have new capabilities added to it as your small business grows over the years. With these considerations in mind, you’ll be sure to pick a security product that’s right for your business.

Attacks happen, and unfortunately there’s no way to prevent them 100% of the time. The best you can do to protect your small business network is to have the security features that will give you a fighting chance. With improved network security, you’ll be able to grow your business with confidence and a safe outlook for the future.

There are a number of Business Drivers for Better Metadata Management that have caused metadata management to grow in importance over the past few years at most major organisations. These organisations are focused on more than just a data dictionary across their information – they are building comprehensive solutions for managing business and technical metadata.

Our wiki article on the subject explores many of the factors contributing to the growth of metadata and offers guidance on how to better manage it.

Calls for increased transparency and accountability led government agencies around the world to make more information available to the public as open data. As more people accessed this information, it quickly became apparent that data quality and data governance issues complicate putting open data to use.

“It’s an open secret,” Joel Gurin wrote, “that a lot of government data is incomplete, inaccurate, or almost unusable. Some agencies, for instance, have pervasive problems in the geographic data they collect: if you try to map the factories the EPA regulates, you’ll see several pop up in China, the Pacific Ocean, or the middle of Boston Harbor.”
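A basic automated sanity check could catch exactly this kind of error before publication. The sketch below flags facilities whose coordinates fall outside a rough bounding box for the contiguous United States; the facility records and the box itself are illustrative assumptions, not real EPA data:

```python
# Rough bounding box for the contiguous United States (illustrative).
CONUS = {"lat": (24.5, 49.5), "lon": (-125.0, -66.9)}

def looks_misplaced(lat, lon, box=CONUS):
    """True if the point falls outside the expected bounding box."""
    lat_ok = box["lat"][0] <= lat <= box["lat"][1]
    lon_ok = box["lon"][0] <= lon <= box["lon"][1]
    return not (lat_ok and lon_ok)

facilities = [
    ("Plant A", 41.8, -87.6),  # Chicago: plausible
    ("Plant B", 31.2, 121.5),  # positive longitude puts it near Shanghai
    ("Plant C", 0.0, 0.0),     # null-island placeholder coordinates
]

flagged = [name for name, lat, lon in facilities if looks_misplaced(lat, lon)]
print(flagged)  # → ['Plant B', 'Plant C']
```

A dropped minus sign on a longitude, or a zero-filled placeholder, is enough to put a factory in the Pacific; checks this cheap arguably belong in any publication pipeline.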

A common reason for such data quality issues in the United States government’s data is captured in what David Weinberger wrote about Data.gov. “The keepers of the site did not commit themselves to carefully checking all the data before it went live. Nor did they require agencies to come up with well-formulated standards for expressing that data. Instead, it was all just shoveled into the site. Had the site keepers insisted on curating the data, deleting that which was unreliable or judged to be of little value, Data.gov would have become one of those projects that each administration kicks further down the road and never gets done.”

One of the data governance issues Lemieux highlighted was data provenance. “Knowing where data originates and by what means it has been disclosed,” Lemieux explained, “is key to being able to trust data. If end users do not trust data, they are unlikely to believe they can rely upon the information for accountability purposes.” Lemieux explained that determining data provenance can be difficult since “it entails a good deal of effort undertaking such activities as enriching data with metadata, such as the date of creation, the creator of the data, who has had access to the data over time. Full comprehension of data relies on the ability to trace its origins. Without knowledge of data provenance, it can be difficult to interpret the meaning of terms, acronyms, and measures that data creators may have taken for granted, but are much more difficult to decipher over time.”
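The provenance metadata Lemieux describes (creation date, creator, and who has had access over time) could be modelled as simply as the sketch below. The field names are assumptions for illustration, not drawn from any particular metadata standard:

```python
from dataclasses import dataclass, field
from datetime import date

# Minimal provenance record: where the data came from and who has touched it.
@dataclass
class ProvenanceRecord:
    dataset: str
    created: date
    creator: str
    access_log: list = field(default_factory=list)

    def record_access(self, who: str, when: date):
        """Append one entry to the dataset's access trail."""
        self.access_log.append((who, when))

record = ProvenanceRecord("permit_violations.csv", date(2012, 3, 1), "EPA Region 5")
record.record_access("analyst@agency.gov", date(2014, 6, 15))
print(len(record.access_log))  # → 1
```

Even a structure this simple answers the two trust questions Lemieux raises: where did the data originate, and by what path did it reach me?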

I think the bad press about open data is a good thing because open data is opening eyes to two basic facts about all data. One, whenever data is made available for review, you will discover data quality issues. Two, whenever data quality issues are discovered, you will need data governance to resolve them. Therefore, the reason we’re failing to get the most out of open data is the same reason we fail to get the most out of any data.

No doubt you’ve heard of bring your own device (BYOD) already. It’s been nearly impossible to avoid the hype surrounding BYOD and all the benefits supporters say it offers. While companies may be focused on BYOD, another trend has slowly but steadily been catching on. It’s called bring your own network (BYON), and while it might sound similar to BYOD, it’s actually creating even more headaches for IT departments. While BYOD is something businesses can adopt and regulate, BYON mostly operates in the shadows. Bring your own network essentially means employees are using their own mobile devices’ 3G and 4G capabilities to create or access wireless hotspots. This is often done when workers determine that the current business network does not meet the demands of their jobs. As you can imagine, this trend is causing more than its fair share of problems, particularly when it comes to security.

One of the main points of contention with BYON is how it allows employees to completely avoid the corporate network. Perhaps they do it because they think the company’s network is too slow for what they need. Or maybe they use a different network to gain access to sites that have been blocked. It’s easy to see that avoiding network filters and security measures can lead to significant problems for businesses. Security measures, such as firewalls and antivirus protection, are put in place to protect the network and the devices that have been granted access to it. Employees that use their devices to avoid the corporate network may also represent a weak point that hackers can attack and exploit, gaining access to data and business systems should the workers ever connect to the regular network at any point.

When a business enacts a BYOD policy, it can carefully monitor and create controls for all devices being used for work. However, by using BYON, many employees choose to go behind the IT department’s back, using devices that haven’t been outfitted with the sometimes necessary controls that can improve BYOD security. Without these controls, IT workers will have no way to monitor each employee’s device, nor can they install the protective measures that can serve as a deterrent to more security threats. This is especially important because when employees use their mobile devices for work purposes, they also pose the risk of accidentally accessing unauthorized content or downloading malicious apps and malware. Without security controls, IT has no way to detect malware and no way to wipe a device that gets lost or stolen. Perhaps some employees see this as a benefit, but the fact remains that a device without controls is a bigger risk than one with them.

BYON also increases the risks of data leakage. With insecure access points in play, hackers will likely have an easier time infiltrating a mobile device and perusing its contents. If an employee uses the device for work, it may contain company data and other sensitive information that can be used by a hacker to spread further damage. IT departments are normally able to monitor data within the company, but when it comes to devices connected to other networks, IT has no way to ensure data security. Devices that utilize BYON are essentially outside the IT department’s jurisdiction, and that can lead to numerous problems usually not foreseen by the employee.

These security challenges that stem from bring your own network are certainly troubling, but there are solutions that companies can put in place. Many businesses may choose to fully and unequivocally embrace BYOD by establishing clear and precise guidelines over what is permitted while also communicating these policies to employees. Many workers use BYON not knowing it is against company policy, so clear communication can help avoid these problems. Companies should also run business risk assessments to more accurately identify where the weak points are in their network and what data might be in danger of leaking or getting stolen. An outright ban on Wi-Fi hot spots may also be necessary, but that’s for the most extreme cases.

Bring your own network is usually a response to restrictive network policies. Employees want to use their own devices at peak performance outside network restrictions, but the consequences of doing this usually lead to more security problems. Activities outside the network can actually create bigger security threats than what companies see with BYOD, so it’s important for businesses to address BYON problems before they become damaging. An early response can help a company direct its focus to other important matters while keeping networks and systems safe.

Hadoop is an excellent tool for collecting and sorting massive volumes of data, but businesses must also use analytics and visualization tools on top of Hadoop in order to reap the full benefits from big data. Here’s a quick list of apps to successfully manage and leverage the massive amounts of information generated as organizations grow.

1. Roambi: With this application, mobile workers have the ability to access and analyze the same business data they use in the office in order to make smart decisions quickly. Mobile workers need more ways to manipulate data and not be limited by business tools which are often stripped down for mobile use. Enterprise level mobile workers cannot afford to lose any capabilities if they are expected to accomplish business objectives in a timely fashion.

Roambi’s goal is to change the mobile business app landscape by improving the productivity and decision-making of the mobile workforce. The app changes the way people share, interact with and present data from the mobile perspective.

The Phoenix Suns, a professional basketball team, is one such organization that uses analytics for both on-the-court and business decisions. Although skeptical at first, the Suns have found Roambi to be both valuable and easy to use in their business decisions. Utilizing this big data app has boosted sales and marketing while enabling the team to make the best decisions possible on the fly.

2. Datameer: Although it seems basic on the surface, Datameer surpasses Excel and other spreadsheet programs by allowing the user to link to active data sources, import flat files, and join two tabs together into a third, much like joining tables in a database.
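The “join two tabs into a third” idea is essentially an inner join on a shared key. A plain-Python sketch of the concept (this is not Datameer’s actual API; the tables and helper are invented for illustration):

```python
# Two 'tabs' sharing a customer_id key.
customers = [
    {"customer_id": 1, "name": "Acme"},
    {"customer_id": 2, "name": "Globex"},
]
orders = [
    {"customer_id": 1, "total": 250.0},
    {"customer_id": 1, "total": 75.0},
    {"customer_id": 3, "total": 40.0},  # no matching customer: dropped
]

def inner_join(left, right, key):
    """Combine rows from both sides wherever the key values match."""
    index = {}
    for row in left:
        index.setdefault(row[key], []).append(row)
    return [{**l, **r} for r in right for l in index.get(r[key], [])]

joined = inner_join(customers, orders, "customer_id")
print([(row["name"], row["total"]) for row in joined])
# → [('Acme', 250.0), ('Acme', 75.0)]
```

The value of a tool like Datameer is doing this at Hadoop scale, against live sources, without the user writing the join by hand.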

Datameer makes it possible to integrate, analyze and visualize all of an organization’s data, helping it achieve its goals. It is purpose-built for Hadoop, enabling raw data to be turned into new insights quickly using components of the Hadoop ecosystem such as Oozie.

Big Data integration is made possible with built-in connectors to all common structured and unstructured data sources. Datameer eliminates the need for ETL and pre-defined schemas.

Analyzing big data is made simple because all information is easily integrated into Hadoop with Datameer’s data integration wizard. It helps organizations ask the important questions, understand the effect of every transformation made, and make the proper analysis adjustments as more data is processed.

Datameer also makes it possible for infographics to be produced with the WYSIWYG Business Infographic Designer. Images can be imported and videos can be embedded while free-form text can be written to organize big data in an aesthetic way.

3. SAS Visual Analytics: SAS seeks to help organizations find answers to questions quickly and then continue to ask more questions, helping companies achieve their big data goals. This application uses guided exploration, in-memory processing and advanced data visualization to make data clear. SAS aims to be a scalable solution for any organization handling data of any size.

With mobile tethering, reports can be explored without internet connectivity. Mobile workers and executives can easily access and explore dashboards anytime regardless of location. SAS mobile apps are available for iPad and Android.

Social media analytics are made easier with Visual Analytics. It can be used to tap into the millions of tweets sent each day to track customer comments in call logs and identify the “hot topics” of the day. SAS makes it possible to pick up on the buzzwords that could lead to the next round of sales or mitigate a current branding issue.
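At its core, the “hot topics” idea is term-frequency counting over a stream of comments. A minimal sketch, leaving out the stemming, stop-word tuning and sentiment scoring a real product like Visual Analytics would add; the sample comments and stop-word list are invented:

```python
from collections import Counter
import re

comments = [
    "The new checkout flow keeps failing",
    "checkout failing again, third time today",
    "love the redesign but checkout is broken",
]

# Tiny illustrative stop-word list; real systems use much larger ones.
STOP_WORDS = {"the", "but", "is", "again", "new", "and", "a"}

def hot_topics(texts, top_n=2):
    """Return the top_n most frequent non-stop-words across the texts."""
    words = (w for t in texts for w in re.findall(r"[a-z]+", t.lower()))
    counts = Counter(w for w in words if w not in STOP_WORDS)
    return [word for word, _ in counts.most_common(top_n)]

print(hot_topics(comments))  # → ['checkout', 'failing']
```

Run continuously over a tweet stream, even a count this crude surfaces the buzzwords worth a closer look.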

4. Esri ArcGIS: GIS stands for geographic information system, and ArcGIS makes it easy to create data-driven maps and visualizations. Esri ArcGIS sets out to enable organizations to visualize and analyze big data in a way that reveals patterns, trends and relationships that reports don’t. Esri’s technology can pull from disparate sources, streams or even web logs.

Esri seeks to expose patterns through the use of maps, which can prove beneficial to organizations regardless of industry or focus. Retailers can identify where the competition is and where promotions are the most effective. Banks can understand why loans are defaulting, and climate change scientists can see the impact of shifting weather patterns. Esri’s analytics also perform predictive modeling using spatially enabled big data to help organizations develop strategies from if/then scenarios.

Conclusion

This is just a quick overview of some of the big data apps available. As Hadoop and big data adoption becomes more mainstream, more tools for analytics and visualization will likely surface. What big data app have you found effective?

The way networks have been built and managed for years may be about to change. That may not come as a big surprise considering how quickly technology evolves from year to year, but the fact remains that networks have been done a certain way for a long time and it may not be long before things are done differently. One of the more popular topics being discussed of late is that of software-defined networking (SDN). The discussions largely center on the benefits SDN can bring to new networking strategies, but any talk of networks will naturally flow into the issue of security. SDN may be a new approach to building, designing, and managing vast networks, but before it’s implemented on a larger scale, its impact on network security will have to be examined as its benefits and drawbacks are properly analyzed.

A Look at SDN

To better understand what software-defined networking is, it’s best to compare it with traditional networking practices. All traditional networks are composed of a controller, or control plane, and the physical network itself, or the data plane. At the heart of SDN is the separation of these two planes, which allows administrators to optimize each one independently. Supporters of SDN say the main reason to do this is to simplify networking, making it more flexible and agile when dealing with different network flows. Management tasks are simplified, a benefit that can extend to security as well. This is mostly done by borrowing the architectures of cloud computing, along with more reactive resource allocation.
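The control/data-plane split can be sketched in a few lines. The rule schema below is illustrative only — real controllers such as OpenDaylight or ONOS each define their own payload formats — but it shows the division of labor: the controller decides policy centrally, and switches merely execute the rules they are given.

```python
def make_flow_rule(switch_id, match, action, priority=100):
    """Build an OpenFlow-style flow entry that a centralized controller
    would push down to a data-plane switch. The field names here are a
    hypothetical schema, not a specific controller's API."""
    return {
        "switch": switch_id,
        "priority": priority,
        "match": match,    # which packets the rule applies to
        "action": action,  # what the switch does with them
    }

# The control plane decides policy once, centrally...
rule = make_flow_rule("sw-edge-1",
                      match={"ip_dst": "10.0.0.5", "tcp_dst": 80},
                      action="forward:2")
# ...and the data plane simply applies it, packet by packet.
```

In a traditional network this decision logic lives inside each device; in SDN it is lifted out, which is what makes network-wide changes a matter of pushing new rules rather than reconfiguring boxes one at a time.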

A New Approach

This new design requires a different approach from those adopting the SDN model. The traditional method had builders designing the network first, then adding the proper security measures later. With SDN, however, security must be treated as a major component of the network and designed in from the very beginning. SDN security measures then become a foundational element of the network, built directly into the workloads and communications systems. Security isn’t viewed as just another aspect of the network to be dealt with later, but rather as a component to build the rest of the network around.

Benefits of SDN

On the surface, this sounds like a refreshing and effective new approach to addressing network security, and there are certainly benefits that come with it. With the traditional network, firewalls were often difficult to place since network boundaries were ill-defined. Software-defined networking can address this frustrating quirk by actually routing all network traffic through a central firewall. This re-routing also makes data analysis from network traffic much easier, which in turn can be used to detect security threats. SDN also allows for stronger encryption to be used within the designed framework of the network, which can increase the chances of valuable data remaining secure.

There are other ways in which SDN can improve on network security. As mentioned above, an SDN allows for a more dynamic network, which can respond to threats quickly through easy-to-manage network restructuring. SDN also provides some handy security tools and capabilities, such as instantly enacting a quarantine around networks and endpoints that have been infiltrated by outside attackers. Software-defined networking also makes a larger number of security responses available, like emergency broadcasts, tarpits, and reflector nets.
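A quarantine is a good example of why centralized control helps here. The sketch below generates high-priority drop rules isolating a compromised host across every switch at once; the rule format and the "drop" action name are illustrative assumptions, not a particular controller's API.

```python
def quarantine_rules(infected_ip, switches):
    """Generate drop rules that isolate a compromised host on every
    switch in the network. Hypothetical rule schema for illustration."""
    rules = []
    for sw in switches:
        # Block traffic in both directions: from and to the infected host.
        rules.append({"switch": sw, "priority": 1000,
                      "match": {"ip_src": infected_ip}, "action": "drop"})
        rules.append({"switch": sw, "priority": 1000,
                      "match": {"ip_dst": infected_ip}, "action": "drop"})
    return rules

# Two switches x two directions = four rules, pushed in one operation.
rules = quarantine_rules("10.0.0.99", ["sw-1", "sw-2"])
```

Doing the same thing in a traditional network would mean touching each device's ACLs individually; with a central controller the quarantine is a single, network-wide action.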

Weakness of SDN

All of these benefits may sound like implementing SDN is a slam dunk, but it does come with some drawbacks that are worth considering. SDN is still a new and immature technology, which means developers are still hard at work figuring out how best to utilize it. That also means more security vulnerabilities may become evident as time moves on, and since additional security measures can’t simply be bolted on as in the traditional model, some of those vulnerabilities may not be addressed. It may also be easier for hackers to launch a distributed denial-of-service (DDoS) attack, since attackers only need to infiltrate a single device on the network. And due to the nature of SDN, if one part of the network goes down, the entire network goes down with it.

Conclusion

This of course doesn’t mean that all companies should shy away from SDN permanently. More advances will be made with software-defined networking that maximize its benefits while minimizing or even eliminating its weaknesses. As a new technology, there is still a lot of work to be done in optimizing it. Many businesses are pursuing the goal of better protection for their networks, and SDN is just one way this goal may be achieved. Time will tell if the reality of SDN will live up to its potential.

Their view is that individual data quality flaws don’t influence the overall outcome when the data is analyzed because each flaw is only a tiny part of the mass of big data. “In reality,” as Gartner’s Ted Friedman explained, “although each individual flaw has a much smaller impact on the whole dataset than it did when there was less data, there are more flaws than before because there is more data. Therefore, the overall impact of poor-quality data on the whole dataset remains the same. In addition, much of the data that organizations use in a big data context comes from outside, or is of unknown structure and origin. This means that the likelihood of data quality issues is even higher than before. So data quality is actually more important in the world of big data.”
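Friedman’s point — that the flaw *proportion* holds steady even as each individual flaw shrinks relative to the whole — can be sanity-checked with a small simulation. The 1% per-record flaw rate below is a hypothetical figure chosen purely for illustration:

```python
import random

def flaw_rate(n_records, p_flaw=0.01, seed=42):
    """Simulate a dataset where each record independently has a p_flaw
    chance of a quality defect; return the observed flaw proportion."""
    rng = random.Random(seed)
    flaws = sum(1 for _ in range(n_records) if rng.random() < p_flaw)
    return flaws / n_records

small = flaw_rate(10_000)      # a "small data" sample
big = flaw_rate(1_000_000)     # 100x more data
# Each single flaw is now 1/100th as significant to the whole dataset,
# but there are ~100x as many flaws, so the overall proportion of
# flawed records is essentially unchanged.
```

Scaling the dataset dilutes any one flaw but multiplies the flaw count by the same factor, which is exactly why bigger data does not wash out quality problems.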

“Convergence of social, mobile, cloud, and big data,” Gartner’s Svetlana Sicular blogged, “presents new requirements: getting the right information to the consumer quickly, ensuring reliability of external data you don’t have control over, validating the relationships among data elements, looking for data synergies and gaps, creating provenance of the data you provide to others, spotting skewed and biased data. In reality, a data scientist job is 80% of a data quality engineer, and just 20% of a researcher, dreamer, and scientist.”

This aligns with Steve Lohr of The New York Times reporting that data scientists are more often data janitors since they spend from 50 percent to 80 percent of their time mired in the more mundane labor of collecting and preparing unruly big data before it can be mined to discover the useful nuggets that provide business insights.

“As the amount and type of raw data sources increases exponentially,” Stefan Groschupf blogged, “data quality issues can wreak havoc on an organization. Data quality has become an important, if sometimes overlooked, piece of the big data equation. Until companies rethink their big data analytics workflow and ensure that data quality is considered at every step of the process—from integration all the way through to the final visualization—the benefits of big data will only be partly realized.”