This blog examines past and present practices, techniques, and lessons learned from various business intelligence implementations.

Analytics

May 20, 2016

Within seconds, a company executive in the U.S. can know exactly how many parts their global manufacturing plants are producing. A delivery company can tell you exactly to the minute when their truck will be arriving. A utility company can monitor usage across the country and know when it’s reaching a peak. All of this can be done because of the Internet of Things (IoT) and Big Data.

The IoT is basically a collection of Internet-enabled devices and sensors, beyond your computer, that can send and receive data over the Internet. Big data is what you get when all of this information is collected and analyzed.

Devices such as smartphones, scanners, sensors, and GPS can gather and distribute a lot of information. IoT technology allows the input from these devices to be pulled together. Once it's all been collected, companies can utilize big data analytics tools to improve business operations, manage equipment and people, target marketing and make their business run more effectively and efficiently.

The IoT is forcing people and companies to change the way they look at things. Information is arriving fast, in large amounts, both structured and unstructured, and from places we never expected: refrigerators talking to smartphones about a shopping list, fitness trackers measuring the calories you've burned, sensors sending vital signs to doctors so they can monitor patients' health in real time, and anything else you can imagine. Vendors can then use that information to market directly to consumers or to provide better, more timely service. Inventory in stores could soon rely on nothing more than a shelf sensor that indicates when an item needs restocking.

The next step for businesses is to figure out how to make the most of the data pouring in from things like smart meters, devices, and sensors. How is this data going to affect your next business decision and how is it all going to be analyzed?

Companies need to plan for a continuing influx of data as more devices become connected and interconnected. You need the storage capacity to hold the data, the real-time analytical tools to make sense of it, and the ability to monetize it and turn it into something profitable. Without a plan, you could be left behind.
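To make the idea of real-time analysis concrete, here is a minimal Python sketch of one such check: a rolling average over streaming sensor readings that raises an alert when usage nears a peak. The window size, threshold, and sample readings are all invented for illustration; a production pipeline would run this against a live feed rather than a list.

```python
from collections import deque

def rolling_peak_alerts(readings, window=4, threshold=90.0):
    """Return (index, rolling_average) pairs where the rolling average
    of the last `window` readings exceeds `threshold`."""
    recent = deque(maxlen=window)  # keeps only the most recent readings
    alerts = []
    for i, value in enumerate(readings):
        recent.append(value)
        avg = sum(recent) / len(recent)
        if avg > threshold:
            alerts.append((i, avg))
    return alerts

# Usage: simulated meter readings that climb toward a peak, then fall back.
usage = [70, 80, 85, 95, 100, 110, 88, 60]
print(rolling_peak_alerts(usage))  # alerts fire at indexes 5 and 6
```

The same shape of check covers the shelf-sensor case: swap "usage above a ceiling" for "stock below a floor" and the alert becomes a restock signal.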

April 29, 2013

Today’s market is all about data.
Consumers want to capture information relevant to their user experience;
marketers want to capture that information to customize offerings for the
consumer; enterprises want to turn data into business intelligence so as to
secure a core competitive advantage; and data center vendors want to push
virtualization so as to support the massive amounts of data to be captured,
stored, mined and managed.

From an enterprise standpoint, there are a number of opportunities to capture
data. In fact, companies throughout the world are capturing data at every touch
point and market feed, hoping to extract the information they need to improve
their product offerings and their market positioning. Without a clear strategy
in place to direct the capture, organization and management of that data,
however, it does nothing more than consume space on the server.

To truly make the most of the data capture, the enterprise needs to understand
the source and why it’s selected, the type of data they want to capture and
what they hope to do with that data once they have it in hand. Let’s examine a
few possibilities:

The mobile consumer – this individual
is in a position to share an immense amount of information with the enterprise,
including location, purchase history, preferred communication channel and even
the information they want to receive via email, text and social media channels.
When captured, this information should not only be stored with the contact
information, it also can be categorized according to the consumer profile,
compiling the likes, dislikes, habits and preferences of a specific target
customer.

The point of sale – whether in person
or through the contact center, the point of sale is one of the best places to
capture valuable data. Customers will share a wealth of information about their
lives, their preferences, their plans for the future and so much more during
this interaction. When that information is captured in the right format, offers
can be generated that match their preferences perfectly, creating an
opportunity for a cross-sell or upsell conversion. That information should also
be stored in the customer account and associated with the profile so as to
develop broader-reaching solutions in the future.

The free offer – individuals who
respond to the free offer or complete a form for more information provide a
goldmine of consumer data. The first data capture must be short to ensure
completion, but the follow-up call is the perfect opportunity to ask all the
right questions to qualify the person as a lead, promote them to another buying
opportunity or simply move them to a non-sale opportunity. Regardless of the
classification, the point is to classify the individual and their information
so the company can turn that information into intelligent data and potential
opportunities.
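A sketch of that classification step in Python; the fields, thresholds, and category names are hypothetical placeholders, since real qualifying criteria would come from the sales organization:

```python
def classify_lead(profile):
    """Classify a free-offer respondent after the follow-up call.
    The rules below are invented for illustration."""
    if profile.get("budget", 0) >= 1000 and profile.get("timeline_days", 999) <= 90:
        return "qualified lead"
    if profile.get("interest") == "other-product":
        return "alternate buying opportunity"
    return "non-sale opportunity"

# Usage: three respondents with different answers to the qualifying questions.
print(classify_lead({"budget": 5000, "timeline_days": 30}))  # qualified lead
print(classify_lead({"interest": "other-product"}))          # alternate buying opportunity
print(classify_lead({"budget": 200}))                        # non-sale opportunity
```

Whatever the rules are, the point the post makes holds: every respondent ends up in a named bucket, so the data stays usable.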

The sheer volume of data being produced by consumers
and the enterprise is putting significant pressure on today’s businesses to
capture that data and turn it into a business opportunity. Companies must pay
attention to how they capture data, the speed at which they capture it, and how
they organize and then use that data. The core strategy needs to focus on each of
these elements with a clear direction on how captured data will be used to
promote the core competencies of the business. It also needs to ensure data
capture is immediate as sometimes two minutes is too late. With valid channels
to capture the information in real-time, the enterprise is well on its way to
turning big data into business intelligence.

February 22, 2013

In 2011, Saugatuck completed a survey of 200 enterprise IT users and
business leaders and roughly 30 vendors, which found that cloud-based business
intelligence and analytics would be among the fastest-growing cloud-based
business management solutions in the market over the next two years. This
growth represents an 84 percent compound annual rate, but did the prediction ring true?

Among companies that are currently using business intelligence tools and have
been since 2007, the adoption of business intelligence has remained flat. The 2012 Successful BI Survey shows that
approximately 25 percent of the employee base relies on business intelligence
tools, a figure that has not changed in the last five years. Given the adoption
of new technologies and integration into mobile capabilities, this result may
come as a surprise to most.

For others, however, the result of this survey simply demonstrates that the
wrong element is being measured to truly understand what is happening in
business intelligence in 2013. The tools for gathering the data don’t matter
nearly as much as what platform companies are using to access the data and what
they are doing with it once it’s in the data center. This is the challenge for the
enterprise in this next generation, and one that is easily overcome with data
analytics and the strategic use of the cloud.

The stagnant adoption of business intelligence tools in the enterprise and the
small business is not due to a lack of understanding of the value it presents,
but instead the result of significant investments in legacy systems that
demanded a focused approach to every network and data center deployment and
integration. The process was often cumbersome and expensive, which limited
access for a number of potential users. Now, as more companies are embracing
the cloud, the playing field is about to change.

The cloud is expanding business intelligence and analytics to include multiple
users throughout the organization, simplifying access and making business
intelligence and the use of analytics more ubiquitous. The cloud provides one
level for managing the complexities of business intelligence, including the
gathering of analytics components, networking and storage. As big data
continues to play a dominant role in a company’s ability to effectively
compete, it’s no longer enough to simply manage information.

All companies are examining the best way to manage the exponential growth in
unstructured data, forcing key decision-makers to determine the best way to
analyze this data in real-time to support the effective use of this
information. While Gartner is predicting the growth of the business
intelligence market to hit 9.7 percent this year, business analytics in the
cloud is expected to grow three times faster.

Businesses of all sizes are flocking to the cloud for business intelligence and
analytics as it provides vast computing and storage resources without
significant investment. Plus, the ability to gather and act on granular
information is a key competitive advantage and one that is difficult and costly
to achieve without business intelligence analytics in the cloud. As the data
bubble continues to expand, those able to embrace the cloud will enjoy greater
capacity and capability when turning that data into actionable intelligence.

June 22, 2008

Many experts consider Microsoft Excel to be the most commonly used tool for data manipulation and analysis. Why? Because it’s familiar and inexpensive (most users already have the Microsoft Office Suite installed on their desktops, so technically, it’s “free”). In fact, in 2005 Microsoft claimed that there were over 150 million Excel users around the world, most of whom are leveraging the tool for reporting purposes.

However, relying on Excel alone for business intelligence can create several major problems. The first is usability. While financial staff and business analysts may be proficient in the most sophisticated Excel features, which are required for the kind of in-depth analysis needed to support effective strategic decision-making, the average business user is not.

Another issue is consistency. If multiple users maintain spreadsheets on separate PCs, there are bound to be concerns regarding accuracy and latency. Data warehousing and business intelligence guru Claudia Imhoff states that Excel is “largely responsible for ‘single version of the truth’ failures in business intelligence”. This can be particularly detrimental when it comes to Sarbanes-Oxley compliance, where the validity and integrity of financial data is critical.

Over the past decade, business intelligence vendors have recognized the popularity of Excel as a data analysis vehicle, and realized that spreadsheets simply aren’t going away anytime soon. Excel and BI solutions must work together, linking seamlessly to provide everyone – from the executive to the power user to the front-line worker – with timely and valuable insight in the format they’re most comfortable working with.

Today, the various BI applications on the market offer varying degrees of Excel integration. The available Excel-related features can be broken down into three levels:

Simple. Some vendors don’t actually integrate their BI offerings with Excel, but provide an Excel-like interface and spreadsheet-style output. This approach hasn’t been particularly successful, because many die-hard Excel users won’t switch to another tool, no matter how similar and intuitive it is.

Moderate. Many BI solutions provide the ability for users to generate reports in Excel format. This method will certainly satisfy the needs of users who want to analyze information in Excel by allowing them to perform one-time extractions of data into spreadsheets, then save the spreadsheets to their desktops before further manipulating that data. There are some serious limitations, however. For example, there is no central repository of queries and data requests, and no audit trail.

Sophisticated. Some business intelligence applications deliver truly deep Excel integration. The BI tool tracks all extraction and manipulation activities such as ownership, calculations and formulas, formatting, and more. They even support PivotTable output and other advanced features.

But most importantly, Imhoff and other experts agree that it takes more than just seamless integration between a BI environment and Excel to make these two work together the way they should. Internal controls – such as tracking exports, limiting what data can be populated into Excel spreadsheets, and linking the spreadsheets back to the original data source and performing automatic refreshes at periodic intervals – can help guarantee consistency and accuracy in reporting and analysis processes, no matter which tool users prefer.
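As a rough sketch of what two of those controls could look like, the following Python snippet writes report rows as CSV text (a format Excel opens directly), blocks columns that are not approved for export, and records an audit entry for each extraction. The column whitelist, field names, and data are all invented for the example.

```python
import csv
import datetime
import io

ALLOWED_COLUMNS = {"region", "revenue"}  # limit what data may leave the BI system
AUDIT_LOG = []                           # in practice this lives in the BI repository

def export_for_excel(rows, columns, user):
    """Export report rows as CSV text, enforcing the column whitelist
    and recording who exported what, and when."""
    blocked = [c for c in columns if c not in ALLOWED_COLUMNS]
    if blocked:
        raise PermissionError(f"columns not approved for export: {blocked}")
    buf = io.StringIO()
    # extrasaction="ignore" drops row keys that were not requested
    writer = csv.DictWriter(buf, fieldnames=columns, extrasaction="ignore")
    writer.writeheader()
    writer.writerows(rows)
    AUDIT_LOG.append({"user": user, "columns": list(columns),
                      "rows": len(rows),
                      "at": datetime.datetime.now().isoformat()})
    return buf.getvalue()

# Usage: the "cost" field never reaches the spreadsheet because only
# the requested, approved columns are written out.
data = [{"region": "East", "revenue": 120, "cost": 80},
        {"region": "West", "revenue": 95, "cost": 60}]
sheet = export_for_excel(data, ["region", "revenue"], user="analyst1")
print(sheet)
```

The audit log is what the "Moderate" level above lacks: a central record of every query and extraction, attributable to a user.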

Financial services firms have traditionally been known as pioneers when it comes to business intelligence, paving the way for new and exciting uses of reporting and analysis tools. BI has helped many of the leading banks, brokerage firms, and insurance companies to improve operational efficiency, boost revenues, and gain a competitive advantage.

There are a variety of ways that BI can be successfully applied in financial services scenarios to deliver value, including:

A complete view of the customer

Mergers and acquisitions are rampant throughout the financial services industry, thanks to heavy deregulation over the past decade. Companies that once competed aggressively against each other are now all joining forces to form larger, full-service financial conglomerates.

But, until these “mega-companies” fully integrate their disparate environments – initiatives that could potentially take years to complete – business intelligence can provide the most effective way for these firms to combine data from various heterogeneous systems, and empower employees with a single, timely, and fully accurate view of the customer. This is particularly important when it comes to service and support activities, where representatives need to have a complete understanding of client activities across all lines of business.

Increased sales and revenue opportunities

Consolidation has led to increased product diversification, presenting greater opportunities to improve customer value by selling additional products and services.

Business intelligence can help facilitate more successful up-sell and cross-sell initiatives by aggregating and presenting data from siloed CRM and sales force automation systems across multiple platforms, giving sales reps insight into what products and services each client already has, and which ones they may be interested in.

Risk management

Financial services firms face greater risks than companies in other industries. Data breaches, transaction fraud, poor lending decisions, and other threats must be avoided at all costs in order for companies to maintain profitability, protect their reputations, and avoid stiff fines and penalties. Some of the more advanced business intelligence solutions can be effectively applied to enhance all facets of risk management. For example, predictive analytics can be employed for use in credit scoring applications, while a combination of data mining and real-time alert systems can be utilized for fraud detection purposes.
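As a toy illustration of the alerting idea (not a real fraud model), this Python sketch flags transactions that sit far outside an account's own history using a simple z-score; the amounts and cutoff are invented:

```python
import statistics

def flag_suspicious(amounts, z_cutoff=2.5):
    """Return the indexes of amounts more than `z_cutoff` standard
    deviations from the account's mean. A single large outlier inflates
    the sample standard deviation, so the cutoff is deliberately modest."""
    mean = statistics.mean(amounts)
    stdev = statistics.stdev(amounts)
    return [i for i, a in enumerate(amounts)
            if stdev > 0 and abs(a - mean) / stdev > z_cutoff]

# Usage: nine ordinary purchases and one wildly out-of-pattern charge.
history = [42, 38, 55, 47, 51, 44, 39, 50, 46, 2500]
print(flag_suspicious(history))  # only the last transaction is flagged
```

Real fraud systems layer many such signals (merchant, geography, velocity) and learn the thresholds from data, but the shape is the same: score each transaction against expected behavior and alert in real time.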

Performance measurement

As financial entities grow larger, it becomes increasingly difficult to effectively measure and manage performance across all offices and lines of business. Yet, in order to optimize core operations and facilitate sustained business growth, they need to monitor the metrics that matter most. Which products are selling well and which ones aren’t? Which branch offices are on target to meet their sales quotas and which ones are underperforming? Which customers are the most and least profitable? What are average client churn rates? BI systems provide this kind of insight, even across multiple technology environments that have yet to be fully integrated.
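A minimal sketch of one such metric in Python, answering the quota question above; the branch names, quotas, and figures are invented sample data:

```python
def quota_attainment(branches):
    """Return each branch's sales as a fraction of its quota."""
    return {b["name"]: b["sales"] / b["quota"] for b in branches}

# Usage: flag branches that are behind on quota.
branches = [{"name": "Downtown", "sales": 90_000, "quota": 100_000},
            {"name": "Airport",  "sales": 120_000, "quota": 100_000}]
ratios = quota_attainment(branches)
underperforming = [name for name, ratio in ratios.items() if ratio < 1.0]
print(underperforming)  # ['Downtown']
```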

For years, the majority of the business intelligence solutions on the market were developer tools that enabled technical staff to rapidly build and deploy custom reporting applications. But, studies show that some BI projects can take up to a year to complete, and cost millions of dollars in time and labor. So, while this approach was feasible for larger enterprises who had the IT personnel and budgets available to take on large BI initiatives, small and mid-sized organizations were left wondering how they would find the required resources.

More and more frequently, vendors are introducing packaged analytics for a variety of core business functions such as human resources, accounting and finance, supply chain management, production and manufacturing, sales and marketing, call center management, and more. These solutions are designed to accelerate deployment and roll-out by:

Providing a broad portfolio of pre-developed reports and key performance indicators for each functional process they support.

Leading analysts including Gartner expect that in the next several years, corporate buying trends will shift more towards pre-packaged BI applications, and away from “build it yourself” solutions.

This, in theory, makes sense. While activities and workflows vary from company to company, there are many similar processes and metrics that exist in all businesses. For example, all sales organizations need to measure revenues against quota, all human resources teams need to track payroll expenses and attendance, and all call centers need to monitor first call resolution and average length of interaction.

But, several questions come to mind when considering pre-packaged analytics. First, does the BI company that developed the application have in-depth expertise in that specific business area? Some solution providers are experts in technology and reporting concepts – but lack true knowledge and insight into the business issues that drive information needs in the first place.

The second issue is flexibility. Every department within every company will have unique information requirements. Do these pre-packaged applications come with the tools needed to customize existing reports, or to create new ones from scratch? Or are customers simply forced to limit their analysis activities to only those pre-developed reports provided? Industry experts claim that a good pre-packaged reporting system will support 60 to 80 percent of required information needs out-of-the-box, with the remaining 20 to 40 percent requiring custom development.

The truth is, whether or not a pre-packaged BI application can deliver value depends primarily on who it is purchased from. It takes time and effort to find the right vendor with the right domain expertise, who can effectively combine pre-packaged reports with user-friendly customization capabilities. The right provider can deliver the kind of package that can get companies up and running quickly and cost-effectively, and meet the majority of their information needs, so they can rapidly realize the true potential of BI.

May 14, 2008

For years, business intelligence solutions have given companies the power to develop more successful strategies and make better, more informed decisions. By tapping into the data that is locked away in enterprise systems, BI tools give companies greater insight into how their businesses run, and the factors that affect success.

But, do BI systems that access only internal data present the whole picture? Or does additional intelligence exist in other sources beyond corporate walls?

The emergence of Web 2.0 is transforming the face of business intelligence, offering new and more innovative ways for companies to leverage information to boost overall business performance. The ability to take data from back-end business applications, and combine it with unstructured information from sites across the Web, gives new meaning to the word insight.

The concept of combining structured and unstructured data is nothing new. Some of the more advanced BI tools on the market provide the ability to report on information contained in emails or documents, while others can analyze the content of audio files. But, many companies are beginning to realize that the world of Web 2.0 has generated a variety of new information resources, such as blogs, RSS feeds, and wiki pages.

For example, a company is conducting in-depth competitive analysis to identify critical marketplace trends that may impact future sales. It can use its existing BI solution to pull data about competitive deals from its customer relationship management (CRM) and sales force automation (SFA) systems, such as those that were won or lost when another vendor was involved. It may also be able to combine that internal information with the content of analyst reports, in-house research, and other documents.

But Web 2.0 takes it even further, providing a wealth of additional information that can shed new light on the perceptions that potential buyers have about the company and its competition, and their likelihood to buy from one over the other. The online communities and social networking sites that encompass Web 2.0 give companies unhindered insight into what their target audience really thinks, needs, and wants, and as a result, can make traditional business intelligence activities such as competitive research, sales forecasting, and marketplace trend analysis more accurate and more valuable.

Or, perhaps a company wants to better assess customer satisfaction levels. Survey data can be pulled from help desk and CRM systems to better understand if clients are satisfied with the products or services they purchased. But, by combining that data with product reviews, ratings, and other customer-generated content from across the social networking spectrum, organizations can gain the most complete and accurate picture possible.

In fact, the additional context that this approach can add to standard BI is so important that Aberdeen Research claims that more than half of best-in-class corporations are investigating ways to pull data from Web 2.0 sources and combine it with other information assets – while almost one-third already do so.

The need is clear, but the technology is just now catching up. While document and content management solution providers have already begun introducing related capabilities, the BI tool vendors are racing to market with applications that can locate and pull data from Web 2.0-based sources. What the future holds remains to be seen.

May 10, 2008

Business intelligence is a rapidly evolving discipline. And, as companies strive to maximize the value of their enterprise information assets, new technologies and techniques continue to emerge at a rapid pace. The latest BI methodology to spark interest among users and industry experts alike is in-memory analytics.

The primary goal of in-memory analytics is to eliminate standard disk-based BI deployments, which are typically relational or OLAP-based. These traditional implementations come with numerous drawbacks such as poor flexibility, limited scope of analysis, and slow response times. With in-memory analytics, the reporting software performs all needed analytical functions at runtime – including data retrieval and storage, manipulation, calculation, formatting, etc. – within the memory of a 64-bit server.

Why are so many vendors and their clients suddenly embracing in-memory analytics? And what are the benefits of this approach?

The key advantage of in-memory analytics is speed. Because queries and related data reside in the server’s memory, report generation does not require any network access or disk I/O. This will dramatically increase the performance and reliability of the data warehouses and back-end databases in which the required report data exists – particularly when the report in question has a large answer set. Therefore, regardless of the size and complexity of the query, or the amount of information it will return, users who leverage in-memory analytics will get faster answers, without any negative impact to operational systems.

The second key benefit is affordability. In the past, memory costs were prohibitive, and 32-bit architectures offered limited processing power and storage. But today, the costs associated with memory continue to decline, while 64-bit computing delivers much greater memory stores. This makes in-memory analytics a less expensive and more feasible way to operate an enterprise business intelligence environment.

In-memory analytics can also dramatically reduce dependence on IT personnel, because it reduces the data management burden for reporting and analysis purposes. For example, it can potentially eliminate the need to build, deploy, and maintain OLAP cubes. And, it can cut down drastically on data warehouse maintenance.
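A drastically simplified sketch of the in-memory idea in Python: detail rows are loaded once, and every aggregate query after that runs from RAM with no disk or network I/O. The dimensions, measures, and figures are invented, and a real product would use compressed columnar structures rather than Python dicts, but the query-at-runtime pattern is the same.

```python
from collections import defaultdict

class InMemoryCube:
    """Hold detail rows in memory and aggregate at query time."""

    def __init__(self, rows):
        self.rows = list(rows)  # the one-time "load into memory" step

    def total(self, measure, by):
        """Sum `measure` grouped by the dimension `by`, computed at runtime."""
        totals = defaultdict(float)
        for row in self.rows:
            totals[row[by]] += row[measure]
        return dict(totals)

# Usage: each call aggregates on the fly, entirely in memory -- no
# pre-built OLAP cube for either grouping.
sales = [{"region": "East", "product": "A", "amount": 100.0},
         {"region": "East", "product": "B", "amount": 50.0},
         {"region": "West", "product": "A", "amount": 75.0}]
cube = InMemoryCube(sales)
print(cube.total("amount", by="region"))   # {'East': 150.0, 'West': 75.0}
print(cube.total("amount", by="product"))  # {'A': 175.0, 'B': 50.0}
```

Note how the second grouping required no extra preparation; that flexibility is exactly what eliminating pre-built cubes buys.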

The benefits of in-memory analytics are clear, and experts agree that more and more companies will utilize this technique in the coming years. In fact, Gartner anticipates that by 2012, 70 percent of Global 1000 companies will load detailed data into memory as the primary means of improving the performance of BI systems.

Mashups are the latest Internet buzzword. Many experts are touting mashups as the best thing to ever happen to Web applications. But what exactly is a mashup? And what will it mean for Web-based business intelligence in the years to come?

A mashup is sort of like a puzzle. It’s a fully interactive Web application that is “mashed up” from the best pieces of other Internet-based systems and data sources. These components are combined to create entirely new services for end users. An e-commerce site that allows buyers to calculate shipping costs by drawing directly from the functionality of the US Postal Service’s Web application for shipping estimates represents a mashup in its simplest form.

Mashups have been around for quite a while, but have mostly been applied in consumer- or social networking-based environments. The primary reason companies have been slow to embrace mashups is their potential for increased risk.

According to Jason Bloomberg, senior analyst and principal at ZapThink, a service orientation and enterprise Web 2.0 advisory firm, “clearly no business would risk allowing any of its employees to assemble and reassemble business processes willy nilly, with no controls in place.” He stresses the importance of rigid corporate policies and strict governance – guidelines that most organizations currently have yet to implement.

Yet, the value of mashups is undeniable, and companies are quickly realizing that they provide a fast and highly effective way to build powerful new mission-critical Web applications, or add more robust functionality to their existing ones.

Mashups become particularly beneficial in the case of business intelligence, where they can extend reporting and analysis capabilities in ways that never before seemed possible. With mashups, reporting applications can draw directly from any Web-based information source, providing richer and deeper insight by combining intelligence gathered from across the Web with data that resides in enterprise systems. Or, a BI dashboard can be enhanced by incorporating elements of Google Maps or other Web applications into the environment, allowing end users to dynamically present report output through those vehicles.

But perhaps the greatest impact mashups will have is through their ability to enable IT staff to truly embed BI into other applications. Through a mashup of various APIs, end users can conduct more detailed and comprehensive reporting directly from within a CRM or ERP application. Or, report output can become fully embedded within automated workflow or business process management systems, where the generation of a report can dynamically trigger a subsequent event, based on its content.

Imagine this scenario: A warehouse employee pulls a product off the shelf, and logs the transaction in the company’s inventory management system. An updated inventory report is automatically run each hour and sent to the warehouse manager via email. However, the employee’s removal of the product has caused stock to dip below acceptable levels, and the warehouse manager is away from his desk, and therefore, unaware of the problem and unable to correct it. With a mashup, that report – and the issue it indicates – can automatically and instantly trigger a re-order through the company’s purchasing system, replenishing inventory before customer orders go unfulfilled.
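A bare-bones Python sketch of that trigger; the purchasing call is a hypothetical stand-in for a real purchasing-system API, and the SKU, threshold, and quantities are invented:

```python
REORDER_POINT = 10   # minimum acceptable stock level (invented)
REORDER_QTY = 50     # quantity to re-order when the threshold is crossed (invented)

def purchasing_api_reorder(sku, qty):
    """Hypothetical stand-in for the purchasing system's reorder API."""
    return {"sku": sku, "qty": qty, "status": "ordered"}

def log_pick(inventory, sku, picked):
    """Record a warehouse pick; if stock dips below the reorder point,
    trigger a purchase order immediately rather than waiting for the
    hourly report to be read."""
    inventory[sku] -= picked
    if inventory[sku] < REORDER_POINT:
        return purchasing_api_reorder(sku, REORDER_QTY)
    return None

# Usage: one pick drops stock from 12 to 8, below the threshold,
# so a re-order fires with no human in the loop.
stock = {"WIDGET-7": 12}
order = log_pick(stock, "WIDGET-7", picked=4)
print(order)  # {'sku': 'WIDGET-7', 'qty': 50, 'status': 'ordered'}
```

The essential mashup move is the wiring itself: the inventory event calls straight into another system's API, instead of a report passing through a manager's inbox.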

The benefits that can be achieved through the use of mashups, as well as the ways in which they can be applied to extend and enhance BI applications, are virtually unlimited. With the correct policies and procedures in place, companies can fully leverage mashups to ensure that intelligence drives the most effective decision-making and process execution possible.