
As always, Amazon Web Services (AWS) made a bunch of announcements at their recent Chicago Summit. The new features have been reported to death elsewhere so I won’t repeat that, but there were a few observations that struck me about them…

Firstly, the two new EBS storage volume types – aimed at high throughput rather than IOPS – are 50% and 25% of the normal SSD EBS price, so are effectively a price cut for big data users. As I’ve commented before, the age of big, headline-grabbing “across the board” cloud price reductions is largely over – price reductions now tend to come in the form of better price/performance characteristics. In fact, this seems to be one of Google’s main competitive attacks on AWS.
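
To put those relative prices in context, here is a quick illustrative cost calculation in Python. The 50% and 25% ratios come from the announcement above; the $0.10/GB-month gp2 rate is an assumed figure for illustration only, so check current AWS pricing for your region before relying on any of these numbers.

```python
# Illustrative monthly EBS cost for a 10 TB big data volume, using the
# relative prices above: st1 at 50% and sc1 at 25% of the gp2 SSD rate.
GP2_RATE = 0.10  # $ per GB-month -- an assumed figure, not current pricing

def monthly_cost(size_gb, relative_price):
    """Monthly cost of a volume priced at some fraction of the gp2 rate."""
    return size_gb * GP2_RATE * relative_price

size_gb = 10 * 1024                  # 10 TB expressed in GB
gp2 = monthly_cost(size_gb, 1.0)     # $1024.00
st1 = monthly_cost(size_gb, 0.5)     # $512.00 -- throughput-optimised HDD
sc1 = monthly_cost(size_gb, 0.25)    # $256.00 -- cold HDD
```

For throughput-bound big data workloads, that halving or quartering of the storage bill is the “price cut” in disguise.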

Of course, I welcome the extra flexibility – it’s always comforting to have more tools in the toolbox. And to be fair, there is a nice table in the AWS blog post that gives good guidance on when to use each option. Other cloud vendors are introducing design complexity for well-meaning reasons also, e.g. see Google’s custom machine types.

What strikes me about this is that the job of architecting a public cloud solution is getting more and more complex and requires deeper knowledge and skills, i.e. the opposite of the promise of PaaS. You need a deeper and deeper understanding of the IOPS and throughput needs of your workload, and its memory and CPU requirements. In a magic PaaS world you’d just leave all this infrastructure design nonsense to the “platform” to make an optimised decision on. Maybe a logical extension of AWS’s direction of travel here is to offer an auto-tiered EBS storage model, where the throughput and IOPS characteristics of the EBS volume type are dynamically modified based upon workload behaviour patterns (similar to something that on-premise storage systems have been doing for a long time). Auto-tiered CPU/memory allocation would also be possible (with the right governance). This would take away some more of the undifferentiated heavy lifting that AWS try to remove for their customers.
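
As a thought experiment, that auto-tiering idea might look something like the sketch below: derive a recommended volume type from observed workload metrics. The thresholds and the `recommend_volume_type` function are entirely hypothetical – this is not an AWS feature – and today you would have to make this call yourself and apply it manually (e.g. via the EC2 ModifyVolume API).

```python
# A hypothetical sketch of auto-tiering: pick an EBS volume type from
# observed workload behaviour. Thresholds are invented for illustration.

def recommend_volume_type(avg_iops, avg_throughput_mbs):
    """Suggest a volume type from workload metrics (illustrative thresholds)."""
    if avg_iops > 10_000:
        return "io1"   # provisioned IOPS for IOPS-hungry workloads
    if avg_throughput_mbs > 200:
        return "st1"   # throughput-optimised HDD for streaming workloads
    if avg_iops < 100 and avg_throughput_mbs < 40:
        return "sc1"   # cold HDD for infrequently accessed data
    return "gp2"       # general-purpose SSD as the sensible default

# e.g. a sequential big data scan: low IOPS, high throughput
print(recommend_volume_type(avg_iops=500, avg_throughput_mbs=400))  # st1
```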

So…related to that point about PaaS – another recent announcement was that Elastic Beanstalk now supports automatic weekly updates for minor patches/updates to the stack that it auto-deploys for you, e.g. for patches on the web server. It then runs confidence tests that you define before swapping traffic over from the old to the new deployment. This is probably good enough for most new apps, and moves the patching burden from the operations team to AWS. I think this is potentially very significant – it sits in that fuzzy area where IaaS stops and PaaS starts. I must confess to having not used Elastic Beanstalk much in the past, sticking to the mantra that I “need more control” and so going straight to CloudFormation. I see customers doing the same thing. As more and more apps are designed with cloud deployment in mind and use cloud-friendly software stacks, I can’t see any good reason why this dull but important patching work cannot be delegated to the cloud service provider, for a significant operations cost saving. Going forward, where SaaS is not an appropriate option, this should be a key design and procurement criterion in enterprise software deployments.
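
For the curious, managed platform updates are switched on via Elastic Beanstalk option settings. The Python sketch below builds those settings; the namespaces and option names reflect the managed-actions configuration as I understand it, but verify them against the current Elastic Beanstalk documentation – and the environment name in the commented-out call is just a placeholder.

```python
# Option settings asking Elastic Beanstalk to apply weekly managed
# platform updates (patch-level only) during a maintenance window.
def managed_update_settings(window="Sun:03:00", level="patch"):
    """Build option settings for weekly managed platform updates."""
    return [
        {"Namespace": "aws:elasticbeanstalk:managedactions",
         "OptionName": "ManagedActionsEnabled", "Value": "true"},
        {"Namespace": "aws:elasticbeanstalk:managedactions",
         "OptionName": "PreferredStartTime", "Value": window},
        {"Namespace": "aws:elasticbeanstalk:managedactions:platformupdate",
         "OptionName": "UpdateLevel", "Value": level},
    ]

# Applying them would look something like this (requires boto3, AWS
# credentials and an existing environment -- "my-env" is a placeholder):
# import boto3
# eb = boto3.client("elasticbeanstalk")
# eb.update_environment(EnvironmentName="my-env",
#                       OptionSettings=managed_update_settings())
```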

Finally, the last announcement that caught my eye was the AWS Application Discovery service – another small nail in the coffin of SI business models based on making some of their money from large-scale application estate assessments. It’s not live yet and I’m not clear on the pricing (maybe only available via AWS and their partners), and probably it’ll not be mature enough to use when it is first released. It will also have some barriers to use, not least that it requires an on-premise install and so will need to be approved by a customer’s operations and security teams – but it’s a sign of the times and of the direction of travel. Obviously AWS want customers to go “all in” and migrate everything including the kitchen sink and then shut down the data centre, but the reality from our work with large global enterprise customers is that the business case for application migrations rarely stacks up unless there is some other compelling event (e.g. a data centre contract expiring). However, along with the database migration service etc, they are steadily removing the hurdles to migrations, making those business cases that are marginal just that little bit more appealing…

Digital can pose a range of risks for a bricks and mortar (B&M) retailer including:

Declining market share as customer loyalty to its established, traditional brand is eroded away by disruptive new on-line entrants and more innovative high street competitors

Poor ROI from implementing new in-store digital technologies because they fail to create a superior personalised customer experience across its physical and online channels

The inability to deliver better inventory management using big data and analytics due to immature organisational capabilities in these areas across its supply chain

So how could scalable retail artificial intelligence in the cloud – Brand AI – potentially turn these challenges into unique opportunities for competitive advantage during the next five years? Here are some disruptive ideas…

Brand AI as a personal human relationship

A retailer could personify its brand as a virtual customer assistant accessible anywhere, anytime using voice and text commands from a mobile device. But unlike today’s arguably bland, soulless smartphone versions that focus on delivering simple functionality, Brand AI would have a unique, human character that reflects the retailer’s values to inform its interactions and maturing relationship with an individual customer. Intended to be more than another ‘digital novelty’, this disruptive form of customer engagement builds on and enhances a B&M’s traditional brand as a trusted long-term friend throughout the entire customer journey by offering compelling, timely presale insights, instant payment processing and effective after sales support and care.

Brand AI as an invisible omni-channel

A customer is empowered to select what personal data they choose to share (or keep private) with the Brand AI to enrich their relationship. Social, location, wearable or browsing and buying behaviour data from complementary or even competing retailers could, potentially, be shared via its cloud platform. The Brand AI can analyse this liquid big data using its machine-learning capabilities to create dynamic real-time personalised actionable insights seamlessly across a customer’s physical and digital experience – it is the heartbeat of the retailer’s invisible omni-channel offering.

Critically, Brand AI can transform every retail store visit into a memorable, exclusive customer experience distinct from anything a competing digital disruptor could offer. For example, the Brand AI can advise in-store sales staff in advance what specific products a customer wants or needs that particular day to help personalise this human interaction, provide on-the-spot guidance and critical feedback about products available immediately to drive a purchasing decision, or tailor in-store digital experiences such as virtual reality or media walls to create genuine moments of customer delight. In addition, the AI can capture, via wearables, the customer’s emotional and physical reactions to these experiences (such as a raised heartbeat when seeing a new product for the first time). Such insights can then be explored later by the customer (including socially with family and friends) using the AI on the retailer’s integrated digital channel to sustain their retention.

Brand AI as an operating model

A further opportunity for using Brand AI is its potential ability to streamline inventory management to improve the customer experience and reduce operating risk. Key processes such as store returns and transfers could benefit from such an approach – not only would the invisible omni-channel AI enable a customer to easily raise the need to return goods, it can also capture the specific reasons why this is happening (rather than this information having to be interpreted by different customer service staff using prescriptive reason codes, for example). Also because the Brand AI has an established personal relationship with the customer it can proactively order a replacement for home delivery or pick up (store or other convenient location) or suggest a suitable alternative product or other cross-sell opportunities to keep the customer satisfied and minimise revenue losses for the retailer.

Managers can also use the AI to help interrogate and identify trends from this complex dataset on returns and transfers. Inventory management reporting and insights are available on demand in a manager or team’s preferred format (such as data visualisation) to support stock purchasing decisions, resolution of supply chain performance issues or investigate region or store specific fraud and theft. And because these analytics are running in the cloud they can be aligned to existing organisational capabilities in this area.

The illustrative benefits for a bricks and mortar retailer using scalable artificial intelligence in the cloud (Brand AI) potentially during the next five years include:

Refreshes the competitive advantages of an established, traditional high street retail brand using new disruptive forms of marketing and customer advocacy

Materially de-risks strategic investment in new in-store digital technologies by explicitly linking these capabilities to a holistic, long-term customer experience

Can improve organisational agility using big data and analytic capabilities to improve existing business processes that directly benefit the retailer and its customers

Digital technology is driving new forms of customer engagement that are rapidly eliminating the functional silos between online and offline retail channels. As a result many high street retailers are already experiencing falling footfall in their physical stores as customers increasingly switch to online competitors for better convenience, choice and prices.

So what might the Liquid Big Data customer experience be like for a global retailer selling ready-to-assemble home furniture, appliances and accessories for example? Here are some ideas…

An eidetic world

Traditionally, high street retailers focus on their brand as a source of differentiation to attract customers to their physical stores. Conversely, digital empowers customers to focus on their specific wants or needs regardless of provider. That’s why their online competitors invest so heavily in user experience design to continually optimise how customers use their channels to browse and buy the products they sell – choice and accessibility as a form of differentiation. To combat this challenge, B&M businesses are increasingly using digital technology (such as touch screens, beacons and virtual reality) to differentiate the in-store experience as something as empowering and seamless as being online.

However, by choosing simply to replicate the online experience in-store, a retailer risks ignoring a source of competitive advantage unique to B&M: a customer’s physical experience with a product and the wider environment.

Using Liquid Big Data, the retail customer experience does not have a beginning or end nor is it location specific – it’s contextual. Powered by a smartphone app provided by collaborating retailers and suppliers, wearable technology (such as a watch or glasses) could capture the people, places and objects an individual customer likes, loathes or loves throughout their entire lives. Even if such encounters are fleeting, these moments are captured with photographic, eidetic clarity in the individual’s private cloud. The customer can then choose which of these experiences to share with the retailer via the app to create a unique, personalised shopping experience in-store every time they visit.

This could be the raised heartbeat of seeing Roman architecture for the first time – could our example global retailer offer this customer discounts on its in-store Baroque furniture offerings? Another customer loves the feel of velvet – could an in-store sales team member suggest some appropriate soft furnishings? One customer really liked the coffee table his girlfriend had at university three years ago – could today’s store visit be an opportunity to find something similar for their new home?

Liquid Big Data enables a high street retailer to use the eidetic physical world as a way to effectively personalise its in-store customer experience using digital technology – enhancing its existing brand as a form of differentiation that can’t be imitated by its online competitors.

Products with personality

Harnessing the power of the eidetic world may not be sufficient long term to differentiate the in-store customer experience versus online. Although it offers a targeted customer experience it doesn’t necessarily make a customer’s relationship any closer or more intimate with the specific products a B&M business sells – a key driver at the heart of the competing online experience.

Yet the customer experience of an online retailer is ultimately a passive, limited engagement typically contingent upon the specific browsing or buying history the customer has with their channel or brand or other self-selecting activity such as social media engagement.

In response, a high street retailer with its partners, suppliers or competitors could use Liquid Big Data to take personalisation to a deeper level – use cloud based artificial intelligence (AI) to create direct relationships between individual customers and the products they sell.

The idea is to personify a product using AI with a user experience similar to smartphone personal assistants or virtual customer service agents. A customer can have a text or voice conversation with the product to explore its suitability for purchase (including reviews or endorsements) and select any desired tailoring or customisation. A customer may also enable the product AI to access his or her eidetic memories or social media profile to help shape and personalise their relationship. The AI can either be used on request or continually available to provide product updates or after sales support. In addition, products may also talk to each other in the cloud as a form of machine learning to identify potential new product designs or opportunities for complements that better meet their individual customers’ needs.

Such insights are then gathered by the retailer and participating stakeholders to inform the customer experience in-store (and beyond), support product development and address any supply chain issues.

For example, our global retailer has found that people across the world keep asking the same question about the performance of a specific brand fridge freezer it sells. Could there be a quality issue with this particular product that needs investigating? Customers in a specific region like the way the product is sold in-store by staff based on their after sales conversation with the AI – how can this be replicated in other regions where demand is falling? The collective cloud AI has also designed a new cooling, energy-efficient feature for the next model – a potential hot seller that could be delivered in collaboration with the supplier?

The potential headline benefits of a high street retailer using the Liquid Big Data customer experience include:

Enables new forms of personalisation and innovation deeper than anything previously available in the market by integrating real life and digital customer experience

Challenges the seemingly unbreakable competitive advantage of online retailer competitors and other digital disruptors (such as platforms and social media channels)

The closer IT expenditure is to the front line of genuine business need, the better the return on investment should be. So the positives arising from the growth in shadow IT – spend on digital applications and services by business teams rather than the IT function – are huge. Estimates suggest that shadow IT expenditure now accounts for over 30% of total spend and 55% of digital spend. And a key driver of this growth is the increasing prevalence of cloud solutions which can be deployed by a business team with minimal support from IT.

But the full scale of benefits will only be realised if risks created by business owners’ unfamiliarity with technology solution governance and inefficiencies generated by distributed decision-making are identified and managed. The traditional IT-led approach to solution governance, based on large ERP or CRM implementations, will not work for shadow IT solutions – it is over-engineered for the rapid evolution demanded by business teams. A new model is required – one that is business-led and balances the need of business functions for speed and flexibility with the assurance that IT teams can provide.

So what risks does business ownership of IT solutions create? Operational risk increases in direct proportion to any gap between the knowledge managers need for effective supervision and the knowledge they actually have. The increasing digital divide between senior managers and their younger, junior tech-savvy colleagues is one such example. And as cloud offerings enable solutions to be deployed by functional teams without IT oversight, the need for digital understanding among senior managers is increasing. Research by the Harvard Business Review Analytics Services concluded “Digital acumen is essential for business leaders in today’s hyper-competitive, technology enabled world. But most companies lack the knowledge and skills needed to succeed in the digital aspects of their business.”

With high risk activities – such as proprietary trading in investment banks – these knowledge gaps can be catastrophic. But most cloud solution deployments will not come into this category. A more relevant analogy can be found in the recent history of data and reporting solutions. These are often owned and deployed by business functions – marketing, finance, risk, compliance, operations and HR – in which case multiple reporting solutions are typically being licensed when one would do, generating inefficiency and excess maintenance costs.

Alternatively the deployment may be centrally owned (by IT) with space in the enterprise data warehouse made available to different functions to do with as they would wish. This typically results in multiple ungoverned cottage industries with no documentation of which marts are being used for what purpose and what would happen if they were removed (and probably multiple versions of the truth as well).

This is the type of trap that business-owned, cloud based applications will fall into if there is a lack of management understanding of how such solutions should be governed. Governance has always created tension between business functions and IT teams, with the former seeing the controls IT teams introduce as being over-engineered and a brake on rapid progression. In the absence of IT involvement, the risk – as we have seen with reporting and analytics solutions – is that such disciplines are ignored.

Obviously a balance is required. With digital implementations, there need to be good enough levels of governance. Our experience with delivering data management and reporting solutions over the past fifteen years has given us relevant insights into what this looks like. As one client put it, ‘you provide enough governance to keep IT happy and not so much as to delay delivery’.

So with that in mind, herewith our primer for business leaders on good enough governance.

Ownership

Every cloud solution should have an owner who maintains a business case for the solution’s continued use as part of their accountability to whoever the budget holder is. Unlike traditional implementations where most of the investment is sunk up front, the rental model for cloud solutions requires a living business case with quantifiable improvements in KPIs the solution is delivering tracked against ongoing and forecast costs (including potential spikes). Such an approach facilitates the solution being swapped out should a new one that will generate greater value become available.
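
As a minimal sketch of what such a “living” business case could track, the Python below compares cumulative quantified KPI benefit against cumulative rental cost, month by month. All figures are invented purely for illustration.

```python
# A living business case in miniature: cumulative quantified benefit
# minus cumulative subscription cost to date. Figures are invented.

def net_benefit(monthly_benefits, monthly_costs):
    """Cumulative benefit minus cumulative cost across the months so far."""
    return sum(monthly_benefits) - sum(monthly_costs)

benefits = [0, 5_000, 12_000, 15_000]     # quantified KPI improvements ($)
costs    = [8_000, 8_000, 8_000, 11_000]  # rental fees, incl. a usage spike

print(net_benefit(benefits, costs))  # -3000: not yet paying back
```

When the running total turns positive the solution is paying its way; a persistently negative trend is the trigger to consider swapping it out for one that generates greater value.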

Monitoring

The business case requires the determination or inference of linkages between the operational metrics that the solution can impact and the strategic goals and financial objectives of the organisation. These metrics and the hypothesised linkages need to be monitored so that both the operational efficacy of the solution and its strategic relevance can be tracked. Hence the second component is the creation of a dashboard to support the living business case. The dashboard also needs to track compliance-related metrics and cover change request progress.

Responsibilities

Effective governance requires a sequence in solution deployment of requirement documentation, solution design, delivery, test, release and support, with the same process applying for subsequent change requests. In the traditional model, these activities are performed by different teams. Cloud solutions typically follow a DevOps model whereby these activities are carried out in rapid sequence by a single business team. Either way, all stages need to be completed, so both the process for managing changes and who will be responsible need to be defined.

Oversight

The governance committee needs to have both business and IT representation – IT teams’ experience of solution design and demand management being particularly important to success. The governance committee needs to meet on a regularly scheduled basis – monthly or quarterly – and focus on organisational matters (e.g. responsibilities), security and the commercial model (to avoid the risk of unbudgeted spikes in costs).

Documentation

There are two facets to the knowledge that needs to be captured in documentation – explicit and tacit. The former includes the business requirements the solution is meeting, process maps for the processes that the solution enables, and the underlying policies and procedures. It should provide all the information required for someone new to operate the solution from scratch under normal conditions. Tacit knowledge covers what to do in abnormal conditions, when problems arise and the process isn’t running smoothly – e.g. who to contact if an important feed is not available, fixes for when the solution doesn’t run as it should, answers to common questions about the outputs generated. Tacit knowledge is typically captured as FAQs and answers. The basic principle should be that a solution SME can’t progress to a new role unless all the necessary knowledge that their replacement will need has been codified and documented.

Integration

Cloud solutions don’t stand in isolation. Typically they require data inputs of some form and generate data outputs. Where does this data come from, how is static data in the solution maintained, what happens with the outputs? All integration points need to be included in the documentation.

Compliance

Cloud solutions need to comply with the organisation’s security policies for access control and data protection. Equally the organisation’s security policies need to evolve to reflect the new cloud-based world – relying on firewalls to lock data in a chamber with one door in and one door out is no longer feasible. Cloud enables and encourages collaborative working practices and the inter-connectivity of system to system processes – data is moving all over the place – and security policies need to evolve to reflect this new reality while still effectively mitigating risk. And the more integrated a cloud solution is, the greater the risk that it opens a gate to other parts of the IT estate, hence controlling access or levels of access is critical. Any data that resides in the solution also needs to be secured (e.g. via encryption or tokenisation) and where that data is hosted needs to comply with data protection legislation and organisational policy.
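
To illustrate the tokenisation option mentioned above, here is a deliberately simplified sketch. A real deployment would use a managed tokenisation service or a KMS-backed vault; the in-memory dictionary below stands in for that secured store purely for illustration.

```python
# Illustrative tokenisation: replace a sensitive value with a random,
# opaque token, keeping the mapping in a separate secured store. The
# dict here is a stand-in for a real vault service.
import secrets

_vault = {}  # token -> original value (would be a secured, audited store)

def tokenise(value):
    """Swap a sensitive value for an opaque random token."""
    token = secrets.token_hex(16)
    _vault[token] = value
    return token

def detokenise(token):
    """Recover the original value (access-controlled in practice)."""
    return _vault[token]

card_token = tokenise("4111-1111-1111-1111")
assert card_token != "4111-1111-1111-1111"   # the token reveals nothing
assert detokenise(card_token) == "4111-1111-1111-1111"
```

The point is architectural: systems downstream of the cloud solution handle only tokens, so the sensitive data itself never leaves the controlled store.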

The rise of cloud requires IT teams to operate differently to how they have historically. Control is no longer an option, collaboration will become the norm. In turn, business owners of cloud solutions need to make the IT function their friend. That will require compromises on both sides – less governance than IT are used to applying, more than business solution owners would like. We believe that addressing the seven factors above will provide the ‘good-enough’ governance required to mitigate operational risk without inhibiting agility and slowing progress to a halt.

With thanks to my colleagues Manoj Bhatt, Mark Howard, Andrea Pesoli and Venkatesh Ramawamy for their contributions to this piece.

Digital Thinking – the ability to disrupt markets using cloud-enabled big data and analytics capabilities – can positively transform how an organisation differentiates itself from competitors and optimises costs.

So how can an organisation maximise the benefits of Digital Thinking? Here are some ideas…

The Mirror Technique

A digital disruptor turns a sector on its head by making it a lot harder for incumbent players to strategically identify, assess and respond to the threats they pose. For example, Uber, Airbnb and TaskRabbit don’t buy or supply the services in the sectors they compete in – they are not direct competitors in a traditional sense. Rather they are convenient intermediaries that offer both customers and suppliers faster, smarter ways to transact with each other using their own monetised cloud services.

Arguably, for many organisations (large and small) Digital Thinking like these outlier disruptors can be challenging, particularly where barriers to entry in their sector are considered to be high. To help validate any threats or opportunities from such market innovation, an organisation can apply “The Mirror Technique” – no matter how impossible or unfeasible it looks, what would be the exact reverse of its current approach to market or operating model? What would be the impact on the organisation and its customers if a direct competitor or new entrant implemented this approach first? And critically, how fast could available cloud capabilities realise these potential disruptive forms of competitive advantage?

This application of scenario analysis can help an organisation identify and assess the risks and opportunities it faces from previously unforeseen cloud-powered Digital Thinking – could it be standing on a burning platform or an untapped goldmine?

Big Data Learning

Gathering large volumes of unique data about customers and operating model performance is emerging as a key source of competitive advantage for many organisations. For example, UK high street retailers are capturing data about customer buying behaviours across their physical/in-store and online channels to better personalise their offerings against the experience delivered by Digital Disruptors like Amazon.

A key challenge is learning how to meld these disjointed, complex data sets into something coherent that delivers effective moments of delight for an individual customer using an efficient value chain. Furthermore, by treating its big data capability as a barrier to entry for competitors, an organisation risks delaying the time to market of any resulting initiatives because it is having to learn by itself, in isolation, how best to use this form of Digital Thinking.

One way to potentially accelerate and de-risk this learning process is for an organisation to collaborate with its partners or suppliers in the gathering and application of big data using a cloud platform. In addition, it may even want to consider co-opetition with its competitors, where participants share their experiences and capabilities for mutual benefit – “Big Data Learning” enables greater competitive advantage for all the more it’s shared.

The face to face test

Digital disruptors exploit analytics to inform the design of new forms of personalised engagement including customer centricity, social media and media content. Combined with their ability to offer competing services with lower (or even zero) switching costs for customers, such Digital Thinking risks rendering an organisation’s established market offerings obsolete.

Another outlier disruptor vividly illustrates the competitive advantage of such a move – Facebook’s $16 billion acquisition of cross-platform mobile messaging service WhatsApp. Not only has WhatsApp decimated telcos’ revenues from text messaging services, these incumbent players will have no direct access to this huge social media channel (estimated 900 million global user base) and the customer insights it generates – Facebook is using Digital Thinking to create an unbeatable barrier to entry as well as using this unique analytical capability as a platform for future growth.

However, unlike many traditional organisations, disruptors arguably have no experience of physically serving customers directly to inform these capabilities. This weakness can be exploited by using “The Face To Face Test”, where an organisation applies its own tacit, historic experience of customer engagement to develop a new disruptive market approach using analytics. This test asks questions about how one of their own real life customers would react if a digital service was delivered to them physically by an employee. Example questions include: what information should your employee intuitively, instinctively know about this individual customer before they start serving them? What things shouldn’t they know or ask about? How can your employee genuinely surprise and delight this long standing customer every time?

By applying analytics as a representation of a physical employee drawing from real world experience, an organisation can arguably personalise their customer engagement approach in ways beyond the reach of digital disruptors.

Accelerated design advantage

In the so-called AI “arms race”, big-name tech, automotive and pharmaceutical companies are reportedly spending billions of dollars annually to develop their own IP in artificial intelligence. A potential strategic implication is that these first movers will create barriers to entry that prevent other competitors (including small or medium-sized enterprises) using AI as a disruptive source of rapid, responsive service design and organisational agility.

A global Liquid Big Data Platform could enable a form of co-opetition between these competitors to realise shared Machine Learning capabilities as a source of competitive advantage that would be unfeasible using their own limited resources. Also, sharing data or insights about their customers or services with each other could lead to forms of innovation that first movers can’t deliver from their silo positions.

Public sector power house

In the UK, health and social care organisations are exploring ways to share Big Data collaboratively to deliver better outcomes for their service users and wider society. A key technical challenge they face is interoperability – the ability of different systems to talk to each other effectively – as their data is often on different legacy networks and applications arguably not originally designed for such a cross-boundary approach.

A cloud-based Liquid Big Data Platform could enable these organisations to overcome these technical barriers to focus on the real value of this business model – joined up preventative and reactive care delivery. Also, if this platform is scalable it could enable organisations with the right analytical capabilities to efficiently power such services in other countries – global collaboration as a source of public service improvement.

Global cost optimiser

Many organisations are migrating their IT assets to cloud to enable cost savings and increased market responsiveness. This includes applications, data and other digital assets that are the source of their competitive advantage. For example, many digital disruptors exploit cloud capabilities to create platforms for services across different countries, or to support the emergence of government transactional services on one shared platform.

An agile Liquid Big Data Platform could continually optimise such benefits by seamlessly moving these assets to whichever geographies or markets offer the lowest costs and best support. For example, it could continually transfer hosting services to countries with the most favourable exchange rates or the deepest pools of skilled technical development resources.

Liquid Big Data is when competitors use Cloud technology and ways of working to openly share and analyse large volumes of data together for their mutual benefit. Yet an organisation engaging in this form of co-opetition risks losing competitive advantage over its peers and increases the threat of new entrants stealing market share. But could the strategic value of such a move outweigh these risks? Here are some ideas…

The customer comes first

Using Liquid Big Data to join up the customer experience across not just an individual organisation’s sales channels but also complementary and even competitor offerings would demonstrate its commitment to personalisation. Customers are already using digital services (such as price comparison websites) to disrupt the siloed experience of individual brands and personalise their own customer experience – how can organisations gain shared competitive advantage by working together to supercharge this form of empowerment? This approach could, for example, help address the falling demand in physical stores being experienced by the UK retail sector.

Agile supply chain performance management

Liquid Big Data can drive greater collaboration between organisations and their large (and small) suppliers, helping to reduce the risk of producing unwanted stock or inventory and to resolve other supply chain issues. For example, sharing real-time sales and operating performance data could enable them to work more closely together on accurate, timely demand forecasting, improving their management of Lean or Agile approaches such as just-in-time manufacturing. In addition, it creates opportunities for both partners to adopt new ways of working that further strengthen the agility of their own supply chains.
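To make the pooled-forecasting idea concrete, here is a minimal sketch of how two partners’ shared sales series might be combined into one demand forecast. The partner names, figures and smoothing constant are illustrative assumptions, not a real implementation.

```python
# Hypothetical sketch: pooling sales data shared by two supply-chain
# partners on a common platform to produce a combined demand forecast.
# All names and figures below are made up for illustration.

def exp_smooth_forecast(series, alpha=0.5):
    """One-step-ahead forecast via simple exponential smoothing."""
    level = series[0]
    for x in series[1:]:
        level = alpha * x + (1 - alpha) * level
    return level

# Weekly unit sales each partner shares on the platform (illustrative).
retailer_sales = [120, 130, 125, 140]
online_sales = [80, 85, 95, 100]

# Pool the shared data into total market demand per week.
total_demand = [r + o for r, o in zip(retailer_sales, online_sales)]

forecast = exp_smooth_forecast(total_demand)
print(round(forecast, 1))  # → 226.9
```

The point of the sketch is that the pooled series gives both partners a view of total demand that neither could forecast from their own channel data alone.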

Data as currency

Given the rise of digital currencies for business-to-customer transactions using the Cryptocurrency approach, could there be an opportunity to extend this model to enable the trading of Liquid Big Data between organisations instead of cash payments? These business-to-business transactions could occur at different speeds – for example, the instantaneous sharing of insights between organisations wanting to sell tailored complementary services to the same customer, or the one-off trading of large volumes of complex service performance data between suppliers to help them build collaborative services for the same client.
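As a purely hypothetical sketch of what a “data as currency” transaction record might look like, the snippet below logs a business-to-business data trade as a ledger entry valued in records exchanged rather than cash, with a fingerprint of the dataset for later verification. The organisation names, fields and hashing scheme are all assumptions for illustration.

```python
# Hypothetical sketch only: a simple ledger entry for a B2B data trade.
# Names, fields and the fingerprinting scheme are illustrative assumptions.
import hashlib
import json
from dataclasses import dataclass

@dataclass
class DataTrade:
    giver: str            # organisation supplying the dataset
    receiver: str         # organisation receiving it
    dataset_digest: str   # fingerprint of the traded data, not the data itself
    records_exchanged: int

def digest(payload: bytes) -> str:
    """Fingerprint a dataset so the trade can be verified later."""
    return hashlib.sha256(payload).hexdigest()[:16]

# One-off trade: supplier A shares performance data with supplier B.
payload = json.dumps({"metric": "on_time_delivery", "rate": 0.97}).encode()
trade = DataTrade("SupplierA", "SupplierB", digest(payload), 1)
print(trade.receiver)  # → SupplierB
```

Recording a digest rather than the raw data keeps the ledger lightweight while still letting both parties prove what was exchanged.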

Increased resilience against cybercrime

Every day, many organisations face the risk of hackers trying to disrupt their digital services or steal large volumes of customer personal data. To mitigate such risks, organisations could collaborate to build and jointly manage secure Cloud services that protect these critical Big Data assets together. Although this approach does not mean these competitors are sharing data with each other, it could enable the creation of a secure Liquid Big Data platform that could be sold as a service to other organisations for their mutual benefit.

Component full life view

Some organisations are trialling the use of IoT sensors in their goods or products to track performance through the supply chain and the customer experience. This approach could also be used to gather data on “long-life components” used in consumer electronics, cars or aircraft. Such Liquid Big Data could then be shared with competitors to help validate sector-wide benchmarks for component longevity, or combined with other information (such as environmental factors) to identify other issues that affect performance.
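A minimal sketch of the benchmarking idea: component-lifetime readings shared by several manufacturers are pooled to derive a sector-wide longevity figure that no single participant could compute from its own data. Manufacturer names and figures are invented for the example.

```python
# Illustrative sketch: pooling component-lifetime readings shared by
# several manufacturers to derive a sector-wide longevity benchmark.
# Manufacturer names and figures are made up for the example.
from statistics import median

# Hours of operation before failure, as reported by each participant.
shared_lifetimes = {
    "MakerA": [9100, 8800, 9500],
    "MakerB": [8700, 9300],
    "MakerC": [9000, 9200, 8900],
}

# Combine all shared readings into one pool.
pooled = [hours for readings in shared_lifetimes.values() for hours in readings]

# Sector benchmark no single maker could validate alone.
benchmark = median(pooled)
print(benchmark)  # → 9050.0
```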