Thursday, March 27, 2014

Recently, I was browsing www.flipkart.com and www.indianroots.in and had added a few items to the online shopping cart, only to abandon it - i.e., I did not complete the buying process - because of slow site response and incomplete product information.

In the e-commerce world, this is a common problem. An abandoned cart is an online shopping cart where a shopper has begun the purchase process by adding one or more items to the cart, but then leaves the site without completing the purchase.

According to an IBM survey, roughly US $83 billion in sales is lost every year due to poor customer experience. That's more than the total revenue of all retail e-commerce.

Solving this problem is of paramount importance to any e-retailer. So to begin with, product management must have a seat at the business strategy table, "own" the total customer experience, and work towards improving it.

In this context, product managers have to build a better customer experience and increase customer engagement to stem such losses from abandoned carts.

Note: In this article, I have taken e-commerce as an example. However, the concepts explained here are equally applicable to enterprise software products.

A fresh look at the customer experience - and why it matters

To start with, product management should "own" the customer experience and understand customers. Understanding customers in e-retail implies investing in technology and data analytics, building consumer behavior models, and then using this insight to deliver a differentiated customer experience - i.e., tailoring the product to match customer needs.

Technology is just an enabler and must not be mistaken for the solution or for strategy. Technology is used to understand the customer and tailor the entire marketing channel and web interactions.

Let me explain this in detail in the next section. Essentially, technology is used everywhere, but the overall solution depends on strategy - which can be grouped into three distinct steps:

Step-1: Understand Customer Context
Step-2: Act Proactively on the Insights
Step-3: Take a Broader View of Customer Experiences

Understand Customer Context

Understanding customers in the e-commerce world involves a lot more than understanding customer demographics: age, sex, income group, geographic location, user history, user platform (PC, tablet, mobile), and time of browsing/purchase. It also involves knowing which marketing channels the customer has been exposed to or used in the past - i.e., whether the customer used information from an email, a social network, an online advertisement, or a search engine.
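As a rough sketch, the customer context described above can be modeled as a simple record. The field names below are illustrative assumptions, not a standard schema.

```python
from dataclasses import dataclass, field

@dataclass
class CustomerContext:
    """Illustrative snapshot of one customer's context at the moment
    of an interaction. Field names are hypothetical."""
    customer_id: str
    age: int
    geo_location: str
    platform: str                          # "PC", "tablet", or "mobile"
    browse_time: str                       # time of browsing/purchase
    purchase_history: list = field(default_factory=list)
    referral_channel: str = "unknown"      # email, social, ad, search

# Example: a context captured for one mobile shopper referred by email
ctx = CustomerContext(customer_id="C1001", age=34,
                      geo_location="Bangalore", platform="mobile",
                      browse_time="2014-03-27T21:15",
                      referral_channel="email")
```

A record like this, assembled in real time from web logs and marketing systems, is what the rest of the pipeline would act on.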

Knowing all this information that makes up the customer context in real time leads to the next step. Note that the key word here is "real time".

However, understanding customer context in real time is a major challenge. Capturing and understanding the full context of each and every customer interaction across all channels is a gargantuan task - and this calls for heavy investment in technology.

In the world of e-retail, every single interaction is an opportunity to understand customer context. To do this successfully, e-retailers must be hypersensitive not only to behavior, history and preferences, but also to the real-time circumstances customers are facing.

To explain this in simpler terms, think of an old-fashioned physical retail shop. The customer is greeted by a sales associate who reads the customer's demeanor and quickly gauges the customer's interests. He can gain additional insight by asking probing questions such as "Why do you need it?" or "Where do you plan to use it?" In the e-commerce world, we need to use technology to gain such insight and then tailor the buying experience.

Product management can tackle this issue by automating all processes across channels to calibrate the web responses based on the behavior and context of each individual engagement, which is described in the next section.

Act Proactively on the Insights

Once the customer context is understood, the next step is to act proactively on it. In the e-commerce world, this could mean:

1. Adjusting the web site in real time to match the context and creating right-time, right-place offers.
2. Tracking customer interest and communicating it through different channels, both in real time and offline.
3. Providing high-quality online customer engagement while the customer is on your website.
4. Identifying & rectifying customer struggles (such as slow responses) in real time.
5. Automatically creating customer communication portfolios for the future.
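The first of these actions can be sketched as a tiny rule engine that picks an offer from the customer context. The rules, thresholds and channel names here are deliberately simplistic assumptions, not a real offer-management system.

```python
def pick_offer(context):
    """Toy rule engine: choose a right-time, right-place offer
    from a customer-context dict. Rules are illustrative only."""
    if context.get("cart_abandoned"):
        return "10% off if you complete your purchase today"
    if context.get("referral_channel") == "email":
        return "free shipping on your first order"
    if context.get("platform") == "mobile":
        return "mobile-app-only flash deal"
    return "standard homepage promotion"

# A shopper who abandoned a cart gets a completion incentive
print(pick_offer({"cart_abandoned": True}))
```

A production system would learn such rules from behavioral models rather than hard-code them, but the shape of the decision is the same.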

All these actions are based on the insights gained. Real-time analytics can serve as valuable guideposts for creating high-quality customer engagement.

Monitoring, tracking and understanding the quantitative value of customer experiences also provides critical insight that can directly influence marketing actions.

Leading e-retailers invest heavily in understanding what's going on and then use the resulting knowledge to drive action. They measure results, identify what's working and what isn't, and follow up to verify that their efforts are making a difference.

Real-time action is very important and must be coordinated across all channels of customer interactions: Online, Mobile devices, social media etc., to create an integrated and consistent customer experience.

When real-time understanding of customer context is achieved and used to coordinate activities, it becomes possible to create highly sought-after right-time, right-place offers.

To win customers' mindshare, trust and loyalty, it's essential to coordinate activities and messages across all channels at all times and keep them aligned to the customer context. This is a highly dynamic and challenging undertaking. But for the customer, it is a seamless experience that sends the message that the company truly knows and understands them.

Take a broader view of Customer Experiences

Each customer experience is an event in itself. Each event taken individually does not constitute an overall customer experience or customer relationship.

Taking a broader view implies in-depth understanding of the full scope and nature of the engagement with customers and using analytics to guide the future actions and build customer relationships that grow and strengthen over time.

Every single customer touch point creates a customer experience: the website, mobile app, email marketing campaigns, the call center, the Facebook business page, and shipping. Customer experience can be improved by addressing any difficulty tied to any of these touch points, and the details from all of them add up to the overall customer experience.

All leading e-commerce companies maintain complete transaction history, which is then used in the future. Taking a broader view of customer experience helps in enhancing the lifetime value of the customer. Leading e-commerce companies integrate all inbound actions with outbound communications across multiple channels. For example, if a customer has abandoned a cart, a follow-up email is sent requesting the customer to complete the transaction, while online advertisements point to the product that was in the abandoned cart.
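The abandoned-cart follow-up just described can be sketched as a small batch job. The function names, cart fields and message text below are hypothetical.

```python
def follow_up_on_abandoned_carts(carts, send_email):
    """For every cart left incomplete, send a reminder email and
    return the product IDs to retarget in online advertisements."""
    retarget = []
    for cart in carts:
        if not cart["completed"] and cart["items"]:
            send_email(cart["customer_email"],
                       "You left %d item(s) in your cart - complete "
                       "your purchase today!" % len(cart["items"]))
            retarget.extend(cart["items"])   # feed these into ad targeting
    return retarget

# One abandoned cart, one completed purchase
sent = []
ids = follow_up_on_abandoned_carts(
    [{"completed": False, "items": ["P1", "P2"], "customer_email": "a@x.com"},
     {"completed": True,  "items": ["P3"],       "customer_email": "b@x.com"}],
    send_email=lambda to, msg: sent.append(to))
```

The returned product IDs are exactly what the retargeting advertisement would point at.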

Product management, having taken ownership of the customer experience, sets up broad strategic goals such as:

1. Complete channel integration.
All channel activities should be mapped and prioritized to foster integration, creating unified customer communications and touch points.

2. Building an insight-driven organization.
Technology must be used to create a holistic view of the customer relationship, and this insight must be understood and acted upon by the entire organization.

3. Build an enterprise wide focus on customer experience.
It is essential to understand in depth how customers engage with your website to identify opportunities to serve them better. This implies placing the customer at the center of the business.

These are long-term goals with a direct impact on the financial front, but they also come with important challenges. Integration issues will continue to exist as the number of channels and their characteristics change with new technology. Even today, leading e-retailers struggle to integrate mobile and social channels with other campaigns and tactics.

Product management should also have a clear vision of the end goals for each quarter/year. As these goals are tough to achieve, they need to be broken down into smaller goals for each quarter. From a product planning perspective, building a superior customer experience is a journey with several milestones. Product management lays out the plan for the enterprise and shows the way forward.

Putting Customer Experiences First

Enhancing customer experience should be part of the overall strategy. Product management defines this strategy and gets buy-in from all stakeholders. Once the strategy is agreed upon, product management has to break customer experience down into measurable metrics, identify the steps to achieve those goals, and then give direction to each department/group to reach them.

In the initial stage, the goal could be to create technologies to measure those metrics.
The broad strategy can be framed as a set of questions:
1. How can the company orchestrate the engagement of its customers across all its channels?
2. How to understand the context and motivations of each customer engagement?
3. How to differentiate and create highly personalized customer engagement?
4. How to create new and unique communications?

Leading e-retailers across the world are investing to find the answers to these questions and then set about creating superior multi-channel experiences by making full use of every tool available to them.

E-retailers should go beyond each transaction and follow up with a definite action plan to create a very positive user experience. For example:
1. Contacting customers to get their feedback on their experience.
2. Monitoring/tracking delivery commitments to ensure customer fulfillment.
3. Identifying cross-sell and up-sell opportunities.
4. Designing other customized offers for future use.

Consider how Amazon has taken ownership of the entire customer experience by using technology, business analytics, insights and engagement in a carefully orchestrated way. When customers log into the Amazon website, they encounter a seamless blend of live web information and interaction, with relevant web-store displays that are both customized to them and made more convenient by being presented on multiple platforms.

The entire process flows smoothly across channels. For example, if a customer clicks on a web advertisement, he is instantly taken to the relevant product page. The page also shows all other products that are relevant to the customer.

This sends a powerful message that Amazon has invested in building a better customer relationship by making it easier to interact and conduct transactions. This cross-channel integration displays broad understanding of the customer, knowledge of customer context and ability to take systematic action, all brought to life through a single customer interaction.

Facebook's acquisition of Oculus VR indicates the value of the Internet of Things (IoT) and brings a new platform for social networking, taking users of social networking to a new level of virtual reality interactions.

Oculus VR brings multiple technologies to the fore. First is a wearable mobile device, the Oculus Rift headset - a connected device (like Google Glass, an IoT device), but a lot more involved: its virtual reality technology completely immerses the user in a virtual world.

When this technology is merged with social networking, one can create new experiences that were impossible before. For example, a group of friends can play a game of soccer - virtually!

People can meet virtually and yet have a real-life-like experience. The Oculus Rift headset allows you to enter a completely immersive computer-generated environment - a game, a movie scene, or a place far away. This incredible technology makes you feel as if you are actually present in another place with other people. It's unlike anything one has experienced before.

Actually, this level of virtual reality (VR) has long been used in expensive flight simulators and other military training equipment. Oculus brings that technology to a price the general public can afford.

"I'm excited to announce that we've agreed to acquire Oculus VR, the leader in virtual reality technology," said Facebook CEO Mark Zuckerberg in a statement today.

Huge Impact on Facebook

Facebook is no tech slouch. It has made tremendous investments in new technologies - such as its face recognition technology, DeepFace - and is also creating a host of new programming tools, such as its Hack language.

Recently, Facebook acquired WhatsApp, which brings in real-time mobile-to-mobile communication technology. Now, with the Oculus acquisition, Facebook can merge VR technology and WhatsApp's mobile-to-mobile communication into its social networking platform and, with its new programming tools and visual processing technology, create new experiences for its users - in the hope that the youth of the world will continue to use Facebook, stemming the decline in its user base.

"Over the next 10 years, virtual reality will become ubiquitous, affordable, and transformative," says an Oculus blog post. With the Oculus VR platform, Facebook can change social networking forever.

The re-branding exercise is a BIG message. Satya Nadella, the new CEO of Microsoft, is telling the whole world that Microsoft is moving beyond Windows and that he will transform the company in new ways.

Azure Cloud Services

For a long time, Microsoft's Windows Azure cloud service was the second-largest cloud offering, and today Azure is a cloud power to reckon with. During this time, cloud services have grown and matured and are now ready for prime time. In the face of this change, Azure must adopt and embrace an open & extensible cloud infrastructure; otherwise, customers will move to other service providers such as Rackspace, VMware's vCHS, or Verizon's Terremark.

As a product manager at EMC's ASD division, which makes solutions for building private/public cloud services, I have always preached the need for an open and extensible system. To me, it looks like Microsoft is finally embracing open and extensible cloud services, and the re-branding exercise is a very public acknowledgment.

Microsoft Azure - An Open Cloud

Ever since its creation, Azure supported only Windows as the OS on its virtual machines, and restricted customers to Microsoft tools - MS-SQL, .NET, etc.

But now things have changed: customers can run Oracle databases & middleware, Java, PHP, Python, etc. Today, customers can even run Linux as a guest OS on the virtual machines.

As Microsoft embraces an open architecture, I am sure it will allow customers to use the VMware ESX hypervisor and Linux, Android, or Windows as the guest OS. Microsoft will open up Azure to a host of other non-Microsoft technologies as well.

Microsoft Azure - An Extensible Cloud

Azure was always a public cloud, run on Microsoft premises and managed by Microsoft. Now, with the launch of VMware's vCHS - a hybrid cloud - Microsoft will have to follow suit and offer hybrid cloud services as well, where customers can choose to use Azure in their private cloud, the public cloud, or a mix of both. This implies that the Azure platform will have to be made open and extensible enough to interact with customers' private clouds.

I would predict that Microsoft will soon release a new set of cloud development tools to help customers extend Azure cloud services to their in-house private clouds and also enable customers to move data & compute workloads in & out of Azure's public cloud.

Monday, March 24, 2014

The year 2014 marks not just a change of leadership at Microsoft, with Satya Nadella taking over the helm from Steve Ballmer; it is also the year when Microsoft Windows steps down and hands over the crown to Android.

It's been a well-known fact that PC sales have been declining for the last 5 years. And by the end of 2014, Windows will have lost its position in the personal computing space. (Though Windows will continue to be a relevant OS on servers.)

Android is already ruling the end-user compute market with smartphones and tablets (with a market share of over 75%), and in 2014, Lenovo, Acer & HP will release PCs with Android pre-installed.

Sales of desktop and notebook systems are plummeting as consumers choose to spend their money on smartphones and tablets, so PC OEMs have to come up with ways to make PCs sexy again. And one idea that some OEMs are trying is replacing Windows with ... Android.

While there are still a few drawbacks in Android (the major ones being weak multi-tasking and the lack of Adobe Flash support), it is still a complete home computing platform and is ideal for home use.

Why Windows is Dead

Windows 8 is by far the best OS for the PC. When it comes to workloads such as creating audio & video content, graphics, and multi-tasking, Windows 8 is far superior to Android. But most PC users do not create content on their PCs; at most, they may write a few documents, edit pictures or videos, and write emails. The most popular uses of the PC for most users are web browsing, social networking and entertainment: gaming or movies. For these applications, Android and Google Apps are perfect. For example, I can do all my home computing work on an Android mini PC, and best of all, all my work is saved automatically on the web, so I don't have to worry about backing it up or carrying hard drives around.

But the real killer of the Windows PC is not its capability with traditional workloads. The Windows killer is the completely new set of capabilities that comes with the Internet of Things. New compute devices such as smart lamps, car management systems, wearable devices, etc., are being built on the Android platform, and these smart devices do not connect to Windows-based PCs!

The seamless integration of these new smart devices with the Android platform makes the Android PC a very compelling replacement for the old traditional Windows PC.

Dominance of Android

Android already dominates two of the three platforms:

1. Android already rules the Mobile & Tablet Space - where it has 75% market share
2. Internet-of-things is being built on Android
3. Android can unify everything by becoming the OS of choice in the PC platform.

As I write this article, CIOs in various organizations are looking at replacing their PCs with virtual desktops (VDI), and I see eight good reasons to opt for Android-based PCs when deploying virtual desktops.

1. Price
With minimal hardware requirements and free software, the cost of the basic hardware is so low that Windows-based PCs cannot match it.

2. Seamless usage
Android has seamless integration with Google Apps. For most enterprise office requirements, Google Apps is adequate. For other needs, users can subscribe to a SaaS service - such as Office 365, Salesforce.com, and others.

3. Transparent control
IT departments can easily control the SaaS services and virtual desktops. Power users can use virtual desktops running on powerful servers, and the central IT department can roll out just the virtual desktop service, which can be accessed on any Android device. Another major advantage here is that all data is in the cloud - either private or public - making it easier to secure and check for compliance.

4. Cost management
For enterprise IT groups, supporting different PC systems - laptops, desktops, workstations, etc. - is a major cost and operational headache. With Android and BYOD (Bring Your Own Device), enterprises can bring in the efficiency of ITaaS (IT as a Service) and have better control over costs through VDI.

5. Performance
Android is a very simple platform and does not demand high compute resources. Power users can be directed to the right types of VDI, eliminating the need for multiple hardware platforms. Since the heavy computation runs on the server, a simple 1 GHz dual-core ARM CPU with 4 GB RAM gives the same perceived performance as a top-end x86 laptop. And with a boot time measured in seconds, users will love the performance.

6. Browser based Workloads
With cloud computing becoming prevalent, all the work is done through a web browser - so with the Android OS tightly integrated with its browser, users will have the best possible user experience.

7. Apps and add-ons
The Android platform has millions of apps and very good Mobile Device Management (MDM) tools that can control which apps users can access. Users will have all the right tools they need for the future. As the Internet of Things becomes popular, users will find that Android-based apps allow them to seamlessly integrate with the new devices, and new types of business-focused tools can be developed.

8. Elegance
With full touch screens and ultra-high-resolution displays in sleek form factors, users will love their Android PCs. They will rival the MacBook Pro and MacBook Air laptops. The looks, coupled with high performance and efficiency, will make Android PCs the ultimate end-user computing device.

Closing Thoughts

With the failure of Windows 8.x and the confusion created by its Surface tablet, Microsoft has annoyed all its partners and is running short on friends. PC vendors are forced to look for new market opportunities.

In 2014, considering its historical install base, Windows will still be the top OS for PCs. But by 2016, Windows will be relegated to the history books.

Sunday, March 23, 2014

Data forms the foundation for all big data analytics. The first step in a big data project is to define the business objective. The second step is to understand the data needed for that project.

To make the most of big data, you have to start with the data you understand and trust.

To understand this better, let's continue with our retail example.

In many retail operations, there are multiple IDs for a customer: a customer ID, account number, loyalty card number, customer email ID, and customer web login ID. All these IDs are in different data types & formats.

So when the company started a big data project, defining the term "customer ID" was a big challenge. Adding to this confusion, each department had different ways/systems to track customer purchases. This meant that the project team had to spend extra time and effort to identify and document the various data sources and determine which data to use in which reports, while business leaders were hampered by multiple, inconsistent reports built on duplicated or missing data.

To harness the power of big data, companies must ensure that the source of all information is trustworthy and protected. We call this Master Data - it represents the absolute truth and forms the basis of all analytics. Once the master data is created & protected, organizations can trust the analytics and make decisions.

It is equally important to have a common business glossary for key terms and data sets. Once a common terminology is established, the next step is to define where each piece of data will come from and where it will go - i.e., define all sources of data and the destination of that data. This will help everyone involved in the big data project know exactly what each data term means, what each key metric means, where the data should originate, and where the data will be stored - thus establishing a "universal truth" for all the business data.
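One lightweight way to capture such a glossary is a machine-readable mapping from each key term to its agreed definition, authoritative source and destination. The entries and system names below are illustrative assumptions.

```python
# Hypothetical business glossary: each key term maps to its agreed
# definition, its authoritative source system, and its destination
# in the data lake.
GLOSSARY = {
    "customer_id": {
        "definition": "Single master identifier for a customer",
        "source": "CRM master data",
        "destination": "data_lake.customers",
    },
    "loyalty_card_number": {
        "definition": "Card number issued by the loyalty programme",
        "source": "Loyalty system",
        "destination": "data_lake.loyalty",
    },
}

def lookup_source(term):
    """Answer 'where should this data originate?' for any glossary term."""
    entry = GLOSSARY.get(term)
    return entry["source"] if entry else "undefined term"

print(lookup_source("customer_id"))
```

Whether kept in code, a wiki or a metadata tool, the point is the same: one agreed answer per term.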

Trustworthy Data

Business leaders are eager to harness the power of big data. However, as the opportunity increases, ensuring that source information is trustworthy and protected becomes exponentially more difficult. If this trustworthiness issue is not addressed directly, end users may lose confidence in the insights generated from their data— which can result in a failure to act on opportunities or against threats.

To make the most of big data, you have to start with data you trust.

Defining the universal truth is just the first step. The next step is to ensure that the data being generated at the source is indeed "trustworthy". Unfortunately, it is not enough just to establish policies and definitions and hope that people will follow them. To have truly trustworthy data, organizations must be able to trace its path through their systems & have adequate security systems to ensure that the data follows the defined paths only and does not get tampered with.

Data collected by the system must be controlled via some form of access and version control, so that the original data is not lost and any changes made can be tracked. There are lots of tools in the industry to secure & track data.

Defining "good data"

In the case of big data, remember that the volume of data can be really large and can easily overwhelm any data collection system. So the data collection system must be tuned to collect only the "good data" - i.e., data that is useful for analysis, reporting and decision making.

For example, a sensor in a retail shop that counts the number of people at a particular aisle will keep collecting data even when the store is closed. It can also collect lots of other data, such as ambient temperature, which may be irrelevant to the analysis.

So organizations must be able to extract only those bits of data necessary to support a particular objective. This way, unnecessary data can be kept out of the system, avoiding hardware/software costs.
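A simple filter at ingestion time can keep such irrelevant readings out. The field names, store hours and rules below are assumptions for the aisle-sensor example, not a real ingestion pipeline.

```python
def is_good_data(reading, open_hour=9, close_hour=21):
    """Keep only aisle-traffic readings taken while the store is open;
    drop irrelevant metrics like ambient temperature."""
    if reading.get("metric") != "aisle_traffic":
        return False
    return open_hour <= reading.get("hour", -1) < close_hour

readings = [
    {"metric": "aisle_traffic", "hour": 11, "count": 14},
    {"metric": "aisle_traffic", "hour": 2,  "count": 0},    # store closed
    {"metric": "ambient_temp",  "hour": 11, "value": 24.5}, # irrelevant
]
good = [r for r in readings if is_good_data(r)]
```

Only the first reading survives the filter, so only it would be stored and analyzed.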

Data Lake

Once all the good data is identified and collected, it has to be pooled together for analysis tools to work on. Technically, one need not pool all the data into one system; it is enough to know where the good data is stored in safe, trustworthy locations - which together form the data lake, a collection of all valid data.

Please note that Big Data is also characterized by "Variety", "Volume", & "Velocity".

Continuing with our retail operations example, there is a whole variety of data sources that can be tracked.

This variety of data sources also implies different data types: structured transactional data, unstructured video/camera data, and metadata. So you now have multiple sources of data feeding in huge volumes of data at a rapid rate. This implies that your analytics must be able to process this large volume of data in a reasonable time - i.e., deliver analytical results in a meaningful timeframe.

Oftentimes, companies do not want to process all the data in real time, both because of business objectives and because of customer behavior. For example, customers may react to a marketing campaign at different times, and this reaction can be seen on social networking sites, tweets, SMS, email or web comments at different times. So one must collect the data over a period of time for any useful analysis. This calls for a system that can handle a huge volume of data in the data lake.

Data Lake Illustrated

All the data collected could reside in multiple locations, which are logically pooled together to form a "Data Lake".

Data collected in this data lake extends beyond on-premise applications and data warehouses. It may include data from social networking sites, customer tweets, web sites, etc. This type of external data can be harder to collect and analyze than traditional transactional data. Potential insights can be buried in unstructured documents such as user-generated documents, spreadsheets, reports, email and multimedia files.

All this data is collected and secured in this data lake. The secure trusted data in the data lake then forms the basis for the big data analytics project.

Data from multiple sources is collected, sometimes selectively to limit the volume of data, and even processed to make the unstructured data more useful. Even the metadata - i.e., the data that describes the main data - is also collected. For example, in the case of a photograph, the time/date, location, type of camera, ambient conditions at that time, etc., are all metadata. Metadata is very important in understanding unstructured data.
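As a sketch of the photograph example, the metadata can simply travel alongside the raw object; it is the metadata that makes the unstructured object searchable. The keys below are illustrative, not a standard format.

```python
# Hypothetical record pairing an unstructured object with its metadata.
photo = {
    "object": "store_entrance_0427.jpg",      # the main (unstructured) data
    "metadata": {
        "timestamp": "2014-03-27T18:42:10",
        "location": "Store #12, Bangalore",
        "camera": "entrance-cam-3",
        "ambient": {"light": "daylight", "temp_c": 29},
    },
}

def taken_at(record):
    """The metadata lets us query unstructured data by time."""
    return record["metadata"]["timestamp"]

print(taken_at(photo))
```

Without the metadata, the image file by itself answers none of the analytical questions (when, where, under what conditions).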

Data lakes act as repositories of all valid information: Log files, User database, transactional information, behavioral information. Analytics is often run on the data stored in the data lake.

Analyzing big data at rest

By analyzing large volumes of stored data, organizations can discover patterns and insights that allow them to optimize processes and profitability. In environments where you quickly need to gain insight on hundreds of millions or even billions of records, the ability to match master data at big-data scale is a must-have. Examples include reconciling large lead lists or matching citizens or customers against watch lists.

Analyzing big data in motion

The data lake must be capable of handling the high volumes of data that is being generated.

With certain kinds of data, often coming from sensors, there is no time to store it before acting on it because it is constantly changing. Think of fraud detection, health care monitoring, or traffic management. In such cases, gaining real-time insight with high-speed analytics is vital. High-velocity, high-volume data calls for in-motion analytics.

Handling such volumes of data can be daunting. Analyzing hundreds of millions of information bits in real time calls for systems that can analyze data in motion. These in-motion analytics require dedicated systems that process data as it is generated and send alerts in real time. In addition, the processed data is also captured in the data lake for future analysis.

In-motion analytics systems analyze large data volumes with micro-latency. Rather than accumulating and storing data first, the software analyzes data as it flows in and identifies conditions that trigger alerts - such as credit card fraud. Along with the alert, the processed data is also stored in the data lake, where it can be analyzed with other data for better business outcomes.
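A toy version of such in-motion analytics processes each event as it arrives, raises an alert when a condition triggers, and still lands the event in the data lake. The fraud rule here (a simple amount threshold) is a deliberately simplistic assumption.

```python
def process_stream(events, alert, data_lake, limit=100000):
    """Analyze each transaction as it flows in: flag suspected fraud
    immediately, then store the event for later at-rest analysis."""
    for event in events:
        if event["amount"] > limit:          # simplistic fraud rule
            alert("possible fraud: txn %s" % event["id"])
        data_lake.append(event)              # captured for future analysis

# Two transactions arrive; one trips the alert, both reach the lake
alerts, lake = [], []
process_stream(
    [{"id": "T1", "amount": 250},
     {"id": "T2", "amount": 500000}],
    alert=alerts.append, data_lake=lake)
```

Real systems use dedicated stream-processing engines rather than a loop, but the pattern - act first, store alongside - is the same.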

Securing Data Lake

The main advantage of creating a data lake is that it acts as a single repository of all data, which users can trust and use in their analytics. Access to the data lake can be controlled and protected with a unified data protection system. Data lakes will require role-based access, policies, and policy enforcement.

Oftentimes, all data entering the data lake is tagged and metadata is added. Metadata on data security, access controls and end-user rights is attached to the data. Each data element is tagged with a set of metadata tags and attributes that describe the data, who can access it, and how it should be accessed and handled. This tagging is rule-based and can be enforced with data management tools.
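Rule-based tagging like this might be enforced with a check along the following lines. The roles and tag names are made-up examples, not a real policy engine.

```python
def can_access(user_roles, data_element):
    """Allow access only if the user holds at least one role listed
    in the element's access-control metadata tags."""
    allowed = set(data_element["metadata"].get("allowed_roles", []))
    return bool(allowed & set(user_roles))

element = {"value": "customer purchase history",
           "metadata": {"allowed_roles": ["analyst", "data_steward"]}}

print(can_access(["analyst"], element))           # role is allowed
print(can_access(["marketing_intern"], element))  # role is not
```

Because the policy lives in the metadata rather than in each application, the same check protects every consumer of the lake.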

Protecting the entire data lake is often cheaper than securing each individual piece of data. This way, organizations can check, monitor and report on data security, privacy and regulatory compliance. The data lake thus acts as a single secure & trusted repository of information.

This is the third part in the Understanding Big Data series. The first two parts are:

Thursday, March 20, 2014

Business leaders need to clearly understand the Business Objectives before embarking on a Big Data project. The first step is to define the business objectives. In a recent survey by IDC, the most popular business objectives of Big Data projects are:

Customer Centric Outcomes

Operational Optimization

Risk Management

Financial Management

New Business Model

HR Analytics

The objectives have to be well defined from a business perspective, else the big data project will fail. Remember that with big data, you are searching for a needle in a haystack - but to find that needle, you need to know what the needle looks like, behaves like and feels like.

Every business objective of any big data project has four main parts:

1. Context of the project
2. Needs that the project is trying to meet
3. Vision of what success might look like
4. Outcome in terms of how the organization will benefit from the result

All Big Data projects must start by defining the objectives. This could take anywhere from a few hours with one person to months or years with a large team. Even the briefest of projects will benefit from some time spent thinking up front.
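
The four parts above can serve as a scoping checklist. As a light-weight sketch, they map naturally onto a small data structure; the field contents below are hypothetical examples, not prescribed wording.

```python
# A structured checklist for a business objective, with one field per
# part described in the text. The example values are hypothetical.
from dataclasses import dataclass

@dataclass
class BusinessObjective:
    context: str   # who is interested in the results, and why
    needs: str     # the problem better knowledge would fix
    vision: str    # mockup of what success might look like
    outcome: str   # how the organization will use the result

    def is_complete(self) -> bool:
        """An objective is well defined only if all four parts are filled in."""
        return all([self.context, self.needs, self.vision, self.outcome])

objective = BusinessObjective(
    context="Marketing manager measuring social-media effectiveness",
    needs="Choose the more effective of two campaigns",
    vision="Weekly KPI report with graphs and text",
    outcome="Marketing reallocates budget based on the report",
)
```

Forcing every proposed project through such a checklist makes missing parts visible before any analytics work begins.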

Now, let's take a deeper look at these four main parts of business objectives:

Context

Every business objective has a context. It is the defining problem that the big data project must solve. Who are the people interested in the results, and what are they trying to achieve?

Contexts emerge from understanding who is interested - for example, sales managers, marketing managers, or finance managers - and why they are doing what they are doing.

The context sets the overall tone for the business objectives and guides the big data project. Context provides the background to the project and helps define the mission. It gives the project larger goals and helps keep us on track. Contexts include larger relevant details, like deadlines, that help us prioritize our work.

Context comes from people, so over time new contexts emerge as new groups of people come onboard with different missions.

Here are a few examples of contexts:

A marketing manager wants to know the effectiveness of its marketing in social media.

A sales manager wants to predict how much he will sell next week.

A store manager wants to know how often customers visit a particular store.

Needs

Correctly identifying needs is critical. By clearly explaining the needs, the business objectives of a Big Data project can be well defined, planned, and accomplished. The need must be articulated as a definite, well-stated problem.

When we correctly explain a need, we are clearly laying out what it is that could be improved by better knowledge that can be gained by the Big Data project.

Businesses face challenges, and these challenges result in specific needs that could be met by intelligent data analysis. These needs should be presented in terms that are meaningful to the organization. If your method will be to build a model, the need is not to build a model. The need is to solve the problem that having the model will solve.

Big Data analytics is the application of math and computers to solve problems. Business leaders will have to determine which questions can be answered by data analysis.

Continuing with our retail example, here are some fairly common needs:

A business leader wants to expand operations to a new location. Which one is likely to be most profitable?
Some of our customers leave our website too quickly. We don't understand who they are, where they are from, or when they leave, and we need to know how to retain them.
As a business leader, I need to choose between two marketing campaigns and choose the most effective one.

Vision

Vision is a glimpse of what the results of business objectives will look like - even before implementing a big data project. It is like a mockup, describing the intended results.

Think of vision as a model of a house - a mockup without much low level details - but just enough to give a good picture of what the finished house will look like.

A mockup is a low-detail idealization of what the final result of all the work might look like. Keep in mind that a mockup is not the actual answer we expect to arrive at. Instead, a mockup is an example of the kind of result we would expect, an illustration of the form that results might take. Whether we are designing a tool or pulling data together, concrete knowledge of what we are aiming at is incredibly valuable. Mockups, in one form or another, are the single most useful tool for making focused, useful big data analytics work.

Having a good vision is the part of scoping that is most dependent on experience. The ideas we come up with will mostly be variations on things we have seen before. There is no shortcut to gaining experience, but there is a fast way to learn from your mistakes: with big data analytics, you can try out many ideas cheaply. Especially if you are just getting started and working on a smaller data set, you can visualize outcomes even before confirming them with complete analytics.

The vision can take the form of a few sentences reporting the outcome of an analysis, a simplified graph that illustrates a relationship between variables, or a user interface sketch that captures how people might use a tool. The vision primes our imagination and starts the wheels turning about what we need to assemble to meet the need.

Before you start a big data project, you need some vision of where you are going and what it might look like to achieve your goal.

Continuing again with our retail example:

Need: A retailer is trying to measure the success of its email marketing campaign.
Vision: A report presenting key performance indicators on a regular basis, consisting of graphs and text.

Need: A retailer is looking for new locations to expand.
Vision: A report that shows each candidate location and the expected sales for each.

The most useful part of creating mockups is that it lets you work backward to identify the data you are actually looking for. If you are looking for key performance indicators of an email campaign, then you know where to look for information and can come up with metrics and valuation models. This helps all the pieces fall into place faster.
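
Working backward from the email-campaign mockup, the KPIs themselves are simple ratios over the campaign funnel. The metric names and the sample counts below are illustrative assumptions.

```python
# KPIs implied by the email-campaign mockup: each is a simple ratio
# over the campaign funnel. The counts used below are made-up samples.

def email_kpis(sent, opened, clicked, purchased):
    """Compute the usual email-campaign funnel metrics."""
    return {
        "open_rate": opened / sent,
        "click_through_rate": clicked / opened,
        "conversion_rate": purchased / sent,
    }

kpis = email_kpis(sent=10000, opened=2500, clicked=500, purchased=100)
```

Once these metrics are fixed, it is clear which raw data the project must collect: send logs, open and click tracking, and purchase records tied back to the campaign.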

Outcome

Let's assume that we have all the results. We then need to understand how this insight will help the organization, and what will happen once the reports and insights are generated.

How will it be used?
How will it be integrated into the organization?
Who will own these reports/actions?
Who will use it?
How will its success be measured?

If you don't understand the intended use of the reports being produced, they will easily get lost in the corporate jungle and all the hard work will go to waste.

The purpose of the big data project must be established while defining the business objectives.

Figuring out the right outcomes boils down to three things.

First: Who will see the results? List the people who will use the reports/insights. They must have the skills to interpret the results, give valuable feedback to the project team, and explain any modifications to the initial business objectives (if any).

Second: Who will maintain this big data analytics system? Is this a one-time activity, or does the organization need repeated runs of the analytics? If there is a need for continuous runs, then who will own and maintain the system? Who are they, and what are their requirements? What should the project do to build repeatable and sustainable big data analytics?

Third: What will be the business outcome of this analytics? What will change with these reports? How can we verify that the concerned leaders are taking suitable action based on the analytic reports?

It is very important to think through the outcomes before embarking on a project. Therefore the outcomes must be defined as part of the business objectives.

A good, well-defined business objective is one that has defined the context, identified the right needs, articulated a clear vision of what the results might look like, and specified the required outcomes - so that the big data analytics project will produce something useful.

Seeing the Big Picture

Successful Big Data analytics projects start with well-defined business objectives. The business objective needs a coherent narrative about what the organization might accomplish by analyzing big data, and what problem it is solving.

Once the business objectives are defined, the next step is to identify the right tools and methods to use. Just focusing on math or software - without a good business objective - will result in wasted time and energy.

While big data promises the sky with a beautiful rainbow, it is a real challenge to successfully harness its power. To begin with, let's understand what big data is.

Big data is just that - huge volumes of data that current IT systems cannot handle. Organizations do not process this data because it does not fit into current databases or data warehousing technologies.

To understand this better, let me explain with an example.

Consider a retail operation where you would like to know: How did sales this month compare with the previous month, and with the same month last year?

The traditional way of reporting this is to look at sales transactions and sales revenues, and at ERP systems, and report on the merchandise sold. But this does not give the full picture of customers. It does not capture the number of footfalls. It does not tell the number of items customers looked at before buying one product. It does not tell how much time each customer spent in the shop, or what items customers wanted to buy but could not due to lack of inventory in that shop. In other words, traditional IT reporting systems present only a partial view - a tiny slice of the full picture.

Big Data can fill in this picture by capturing data from various other systems.
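
For contrast, the traditional report in the example reduces to a simple lookup over monthly totals. The sales figures below are made-up sample data.

```python
# The traditional sales report: this month versus last month and
# versus the same month last year, over a table of monthly totals.
# The figures are illustrative sample data, not real numbers.

monthly_sales = {
    "2013-03": 118000,
    "2014-02": 130000,
    "2014-03": 142000,
}

def compare(sales, month, prev_month, same_month_last_year):
    """Return the change versus last month and versus a year ago."""
    current = sales[month]
    return {
        "vs_previous_month": current - sales[prev_month],
        "vs_last_year": current - sales[same_month_last_year],
    }

report = compare(monthly_sales, "2014-03", "2014-02", "2013-03")
```

This is exactly the kind of report existing systems already produce well; the point of the section is everything this table cannot show - footfalls, dwell time, abandoned items.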

What is Big Data?

Big data is characterized by four parameters: Variety, Volume, Velocity and Veracity. These are often referred to as the 4Vs.

Variety
Big Data, in simple terms, implies looking at the same parameters - in our example, monthly sales - in different ways, drawing on what is often called a Variety of data sources.

Continuing with our retail operations example, there is a whole "variety" of data sources that can be tracked:

1. RFID enabled shopping carts to tell how much time each customer spent in the shop.
2. RFID readers in the cart to identify which items customer put in the cart and then did not buy.
3. Cameras and sensors that show which areas of the shop were overcrowded, preventing customers from shopping in that aisle.
4. Sensors & RFID readers that show which items customers looked at before selecting an item.
5. Sensor data that tells when each customer came to the shop.
6. Sensor data which tells the time taken at the billing section.
7. Internet search information from each customer at the online store.

This wide variety of data is valuable, yet today's business IT systems cannot handle it or even use it to analyze monthly sales.

A wide variety of data sources also implies that IT systems must be agile enough to accommodate many data types - transaction data, video feeds, sensor data, etc. - and seamlessly integrate with various data processing technologies.

All of this data is stored in a dizzying array of formats. Some gets stored in structured formats in databases or enterprise resource planning (ERP) applications, but video feeds, sensor data, and photos are unstructured data formats that must be managed, stored and processed.

Data doesn't rest once it is in storage. It must be moved from application to application and from system to system so managers and executives can interpret the data and come to meaningful conclusions.

Volume

In traditional IT systems, each customer transaction translated to a few KB of data; but if we consider all the different varieties of data sources, the volume of data generated per transaction will run to several megabytes or even gigabytes!

Increasing data volume is at the heart of the big data challenge. Large data volumes can cause many obvious technical problems, such as excessive batch processing times, bottlenecks and so on.

The sheer volume and complexity of big data mean that traditional methods of managing data do not apply to these new sources. A completely new system has to be built to collect, govern, and process these large volumes of data, with new automation for integration, governance and use.

Researchers at IDC estimate that by the end of 2013, the amount of stored data will exceed 4 zettabytes, or 4 billion terabytes. That's 50 percent more data than the digital universe held at the end of 2012, and four times as much as in 2010.

Velocity

Every second of every day, businesses generate more data. What used to be several MB of data per hour with older systems is now several GB!

In our retail example, data arrives at several megabytes every second - faster than ever. Data from video cameras, RFID readers, and sensors is collected in seconds and microseconds.

The IT systems must be able to understand data as it is streaming in, store that information, quickly process this information, and move data quickly from one application or repository to another, where it can be processed and analyzed.

Unfortunately, many of the older data integration solutions lack the high performance that big data projects require: there is not enough time to collect the data and process it in real time or near real time.

Validity or Veracity

The first three Vs - Variety, Volume and Velocity - define big data.

But the fourth V, veracity, is the most important for business analytics.

Veracity is the validity of data. Once the data is collected and stored, understanding which of the data sets are valid for a particular business analysis is of paramount importance. Running analysis on invalid data produces invalid or useless results.

For Big data to be valuable, the data must be valid.

Often, big data is collected into a vast pool - often referred to as a "data lake". Big data coming into this data lake has to be indexed, categorized, and verified to ensure that it is accurate, current, and complete - before running analysis and making business decisions.
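
A veracity gate of this kind can be sketched simply: before analysis, each record pulled from the lake is checked for completeness and currency. The required fields and the 90-day staleness rule are assumptions made for the example.

```python
# A sketch of a veracity check: accept only records that are complete
# (all required fields present) and current (not too old). The field
# names and the 90-day staleness rule are illustrative assumptions.
from datetime import date, timedelta

REQUIRED_FIELDS = {"customer_id", "amount", "recorded_on"}

def is_valid(record, today, max_age_days=90):
    """Accept only complete records that are recent enough."""
    if not REQUIRED_FIELDS <= record.keys():
        return False                       # incomplete record
    if today - record["recorded_on"] > timedelta(days=max_age_days):
        return False                       # stale record
    return True

records = [
    {"customer_id": 1, "amount": 250, "recorded_on": date(2014, 3, 1)},
    {"customer_id": 2, "amount": 90},                       # incomplete
    {"customer_id": 3, "amount": 40, "recorded_on": date(2013, 1, 1)},
]
valid = [r for r in records if is_valid(r, today=date(2014, 3, 27))]
```

Real pipelines apply such rules at ingestion time and quarantine the rejects, so analysts can trust that whatever reaches them has already passed the gate.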

For any organization to take advantage of the opportunities available with big data, it must have IT systems, processes and solutions that can handle all four Vs.

It starts with discovering different sources of data, setting up systems to collect that data, and building IT systems to govern, process, and store large volumes of it. The IT system must be agile, capable of accommodating a wide variety of data, able to integrate seamlessly with various technologies, and able to automatically discover, protect, and monitor sensitive information as part of big data applications.

Organizations recognize that big data contains valuable information. They are eager to analyze it to obtain actionable insights that help them make better decisions - improving sales and profits and identifying new revenue opportunities.

Forward thinking organizations - such as Flipkart & Myntra in India, Amazon, Wal-Mart, WholeFoods in the USA are already realizing some of these benefits.

While big data promises big advantages, capturing these benefits requires knowing what the business needs and being able to find key items within the larger mass of big data.
One has to start by articulating the business goals, which helps determine:

1. What data to collect
2. When & how to collect
3. How to store & process

Only then can the business generate the analytics and reports necessary to support those objectives, and business leaders can make meaningful decisions.

Monday, March 03, 2014

2014 is the year the Internet of Things becomes a reality and starts disrupting the tech industry.

2014 is shaping up to be an interesting year in the tech industry. Several new and exciting technologies are being introduced. The main focus this year is to move mobile smart devices beyond the cell phone space - and create exciting new products.

The term "Internet of things" emerged as a buzzword over the last year or so to describe the phenomenon of network-connected devices, such as thermostats, that in the past were simple standalone appliances.

The phrase "Internet of things" stands for devices that can be connected wirelessly to the Internet and to other compute devices - such as phones or tablets - or directly to the Internet cloud.

These new connected devices are a gold mine for telecom companies, car manufacturers, software/app vendors and networking equipment makers. But this new technology can also disrupt their existing businesses.

The writing on the wall for tech companies is clear: The Internet of things is coming, and you better disrupt or prepare to be disrupted.

The number of these smart, connected devices will grow rapidly and is expected to reach 500 billion or more by the end of the decade. When we think about such a large number of connected devices generating huge volumes of data, it is clear this will change the very nature of the technology industry.

The financial rewards of this new technology are huge. It will touch every aspect of our lives and disrupt every single industry we know of - just as the Internet did 25 years ago.

To understand how the Internet of Things can change things, consider the game of basketball. Sensors on the players' shoes can tell how well the players are performing and help coaches plan a better game - using real-time analysis of player and ball movements.

GoQii can change how professional athletes train. It can capture movements, exercise, heart rate, calories burnt, sleep patterns, etc., and feed that information to real-time analysis systems that help players and coaches plan better.

The possibilities are endless. Successful companies will have to build easy-to-use, high-value apps, and that will happen over time.

And there are dangers as well. Loss of privacy, digital identity theft, and an ever-widening digital divide are the biggest risks. Legal systems must evolve to deal with this new everything-connected world.

Companies around the world are still grappling with how to take advantage of this new technology; many will perish, while several new ones will emerge.

Closing Thoughts

The benefits are so massive that it is inevitable that the Internet of Things will be everywhere. By connecting devices over the Internet or wireless mobile networks, companies can offer a wide range of new services to their customers.

Leading companies are not waiting for the market to emerge. Google paid $3.2 billion for Nest and its smart thermostat technology! Soon others will follow with big money to explore new opportunities.

As devices - and the circuitry that drives them - get smaller and smarter, we will have more products that integrate into our lives. Soon the question will not be whether you're wearing a piece of smart tech, but why not?