Do you want to become more familiar with how your company can use artificial intelligence (AI) and machine learning (ML) but feel a bit lost amongst the buzzwords and hype?

Driving business outcomes with AI doesn’t need to be overwhelming.

It’s all about exploring which business problems you want to solve, how good predictions can help you achieve those outcomes, and then taking practical steps to get there while implementing an organization-wide AI strategy.

At Amazon Web Services (AWS), we’re seeing customers use AI and ML to drive better decision-making and evolve to provide innovative solutions across their industries. Many are working with AWS Partner Network (APN) Partners that have earned the AWS Machine Learning Competency.

Where to Begin

For many, AI and ML tend to conjure up ideas that seem both fantastical and unrelated to everyday business goals. With so much discussion about AI strategy and ML in the market, it can feel challenging to reconcile the hype with the measurable business benefits you can achieve by implementing an AI strategy within your organization.

AI needn’t be intimidating or out of reach, though. It’s a surprisingly simple concept and can have a positive impact on virtually any use case your business grapples with.

The use of AI has become increasingly relevant not because it’s a new concept, but because today’s software and technological capabilities have made it so much cheaper to make predictions, which are at the heart of AI. Predictions are the basis of every decision, which explains why AI is so pervasive.

In this post, I will explore why you should re-label AI as “cheap predictions.” We’ll also look at the universal use cases that AI can address, and I’ll provide information to help you understand ways that AI can support your business needs.

Using ‘Cheap Predictions’ to Understand the Value of AI

At the heart of AI, which I define as any mechanical process that can be interpreted as human intelligence, are predictions. In today’s world, AI means inexpensive predictions.

So, how do we define predictions?

Simply put, prediction means using the data that you have to generate the data you don’t have. Being able to make good predictions can inform your judgment and help you take actions that address business use cases, overcome challenges, and drive insights leading to new opportunities.

The scope of prediction problems has expanded to many use cases, each of which reduces to a simple question. As the authors of Prediction Machines put it: “Given this information, what would a human do?”

Most business problems across industries fall into three categories:

Regression: Predict a number. Here’s an example: Predict movie revenue given the genre, list of actors, writers, directors, and time of release.

Classification: Predict which group. Here’s an example: Predict whether a customer will be active in six months given various features of the customer—city, broker, data, credit score, age, etc.

Clustering: Discover groups. Here’s an example: Group customers into categories to discover common behaviors and help understand the relationships among them.
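
As an illustration of the third category, here is a toy one-dimensional k-means sketch in pure Python. The purchase counts and starting centers are made up; real clustering libraries handle many dimensions and smarter initialization, but the idea of discovering groups is the same.

```python
def kmeans_1d(points, centers, iterations=10):
    """Cluster 1-D points around the given starting centers."""
    for _ in range(iterations):
        # Assign each point to its nearest center.
        groups = [[] for _ in centers]
        for p in points:
            nearest = min(range(len(centers)), key=lambda i: abs(p - centers[i]))
            groups[nearest].append(p)
        # Move each center to the mean of its assigned group.
        centers = [sum(g) / len(g) if g else c for g, c in zip(groups, centers)]
    return centers, groups

# Hypothetical monthly purchase counts for ten customers:
# two behavior groups emerge without any labels being provided.
purchases = [1, 2, 2, 3, 2, 19, 20, 22, 18, 21]
centers, groups = kmeans_1d(purchases, centers=[0.0, 10.0])
print(centers)  # two discovered group centers
```

Note that no one told the algorithm which customers were “light” or “heavy” buyers; the groups were discovered from the data alone, which is what distinguishes clustering from classification.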

As predictions become cheaper, more predictions will be made and more complements to prediction will arise. “These two simple economic forces drive the new opportunities that prediction machines create,” explain Agrawal et al. in Prediction Machines. “At low levels, a prediction machine can relieve humans of predictive tasks and so save on costs.

“As the machine cranks up, prediction can change and improve decision-making quality—but at some point, a prediction machine may become so accurate and reliable that it changes how an organization does things. Some AIs will affect the economics of a business so dramatically that they will no longer be used to simply enhance productivity in executing against the strategy; they will change the strategy itself.”

Data Science and Other Key Terminology to Have in Your Pocket

The potential for you to evolve your business through AI is substantial and can be approached incrementally. Let’s walk through some key concepts and terminology that will guide you as you discover how AI can help your business, talk about AI, and differentiate between real value and cheap talk.

By their nature, predictions can never be absolutely certain. For this reason, predictions are characterized by an accuracy percentage. Good data is what drives accurate predictions, and I’d estimate that 80-90 percent of the time and money spent on AI goes into working with data.

When you begin to look into what you can do with AI, know that it all comes down to what you can do with data. It doesn’t matter if you’re a Fortune 500 company or a company of five. You can use data to make predictions; in ML, it’s the data, not your developers, that should be writing your code.

If you’re a business executive, then you’ve likely heard how important it is to have a good data scientist on board to begin using AI. While data scientists are integral to an AI strategy, it’s important to know that your business’ AI strategy begins and ends with you, not the data scientists within your company.

A data scientist should be seen as a key collaborator who builds models that use data to uncover the opportunities and outcomes your business seeks. Data scientists make models, and models use data—static and unstructured data or sequential and structured data—to write the code.

If you have data, particularly labeled data, you can feed it into a model that requires little or no engineering. Data scientists are always asking questions and should leave meetings with more questions they seek to answer using data.

While you continue to learn more about building an AI strategy, keep the following definitions handy:

Machine Learning (ML)

AI did not always involve the machines themselves learning from real-world data. For many years, experts were the source of decisions that eventually led to predictions. Of course, there are many problems with this approach.

Machine learning is a subset of AI where predictions are not pre-programmed or determined by experts, but rather, are derived by the algorithmic processing of data. When machines are programmed to create their own representations of data in their own language unbiased by anything but the data that has been presented, we tend to get far better real-world results.

Deep Learning (DL)

Deep learning is a subset of ML where predictive representations of data are learned in successive layers, each layer building on the output of the one before it. DL takes advantage of modern computer architectures that are designed to process long arrays of data that might represent images, audio, or features derived from large databases.
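
As a sketch of what “layers” means here, below is a toy two-layer forward pass in pure Python. The weights, biases, and inputs are made up, and real frameworks do this with optimized vector operations on far larger arrays, but the layered structure is the same.

```python
def relu(values):
    """A common activation: pass positives through, zero out negatives."""
    return [max(0.0, v) for v in values]

def layer(inputs, weights, biases):
    # Each output node sums its weighted inputs plus a bias.
    return [sum(w * x for w, x in zip(row, inputs)) + b
            for row, b in zip(weights, biases)]

# A 3-feature input processed through two layers in sequence;
# each layer builds a new representation of the data.
x = [0.5, -1.2, 3.0]
h = relu(layer(x, [[0.2, 0.4, -0.1], [0.7, -0.3, 0.5]], [0.1, -0.2]))
y = layer(h, [[1.0, -1.0]], [0.0])
print(y)
```

Training a deep network amounts to adjusting those weights and biases so that the final layer’s output matches the data, which is exactly the “data writing the code” idea described earlier.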

Shallow Learning

Shallow learning is a subset of ML where predictive representations of data are learned from simple or heuristic algorithms. These methods can be very effective; however, they do not have the precision, depth, or range of DL and are currently being replaced with DL on a case-by-case basis.

Neural Networks

A neural network is a type of DL structure where layers are arranged as nodes and edges. The design and name are inspired by the fundamental component of the brain: the neuron.

Internet of Things (IoT)

An Internet of Things (IoT) device is any processor, often small, remote, and decentralized, that acts as an independent remote agent. IoT devices frequently house sensors and act on data with ML models, performing predictions and storing data from those sensors. As required by the application, both raw and processed data may be transferred to the cloud to drive action and meaning.

Data Lake

A data lake is a central repository, typically in the cloud, where data of any kind can be aggregated. Data is the energy source of all analytics and machine learning. The cost of collecting data in the cloud is dramatically lower than on-premises, and managing that data is increasingly easy.

Importantly, we are moving away from a world where databases are structured as rows and columns. Unlike a data warehouse, a data lake provides infrastructure for the data to reside in its natural state, whether that be as graphs, time series, or completely unstructured collections.

Proprietary Data

Proprietary data is unique to an entity. It is not bought, shared, or derived from public sources. It has been acquired with the sole intention of fulfilling an organization’s purpose. Owing to its uniqueness and the clarity of its purpose, proprietary data tends to be highly up-to-date and valuable—frequently more valuable than volumes of historical data.

Well-Architected

A Well-Architected software design is one that diligently fulfills the requirements of ensuring privacy and security, high availability, high performance, cost optimization, and operational excellence.

Graphical Processing Unit (GPU)

Graphical Processing Units (GPUs) are found in every computer. They enable software to present what we see on the screen. GPUs frequently work with long arrays of data known as vectors, especially when processing 3D images. Vector processing is a fundamental activity that enables deep learning. Because GPUs are ideal for vector processing, they are increasingly being repurposed to train ML models.

Algorithms

ML algorithms are frequently cited as revolutionary. In practice, they’re quite ordinary. It’s the data, not the algorithms, that makes ML important in our age. Imagine a music score with no musician or instrument. Data makes the music in ML.

That said, some algorithms are more important in applied AI than others:

For Deep Learning—CNNs and RNNs

Convolutional Neural Networks (CNNs) are used mostly for predicting the contents of images.

Recurrent Neural Networks (RNNs) are used mostly for predicting the meaning of language and time series data.

Decision Trees and XGBoost

Decision trees are binary tree structures that deterministically route input data to a likely prediction. They’re widely used on tabular data, that is, the data most companies work with: spreadsheets and the tables that come from a relational database.

XGBoost is a critical algorithm that builds an ensemble of many decision trees. Whereas any individual tree may not predict well, the ensemble plays to the trees’ collective strengths and cancels out their individual errors, producing exceptional predictions.
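
To make the ensembling idea concrete, here is a toy pure-Python sketch of tree-style splits and model averaging. It is not the actual XGBoost algorithm, which fits each new tree to the errors of the previous ones via gradient boosting, and the features and thresholds below are hypothetical.

```python
# Each "stump" is a one-node decision tree: a single binary split
# on one feature, yielding a probability-like score.

def stump_credit(customer):
    return 0.9 if customer["credit_score"] >= 650 else 0.3

def stump_age(customer):
    return 0.8 if customer["age"] >= 30 else 0.4

def ensemble(customer):
    # Averaging several weak models smooths out their individual errors;
    # boosting methods like XGBoost combine trees more cleverly than this.
    scores = [stump_credit(customer), stump_age(customer)]
    return sum(scores) / len(scores)

customer = {"credit_score": 700, "age": 25}
print(ensemble(customer))  # averages the two stumps' scores
```

One stump over-predicts for this customer and the other under-predicts; the average lands in between, which is the intuition behind why ensembles of weak trees predict so well.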

Frameworks

All software builds on the work of others. This means abstracting functions so that as you get further from the hardware, you need to think less about what it does and more about the problem you’re trying to solve. Here are the most popular ML frameworks, for your reference:

MXNet—An open source library for machine learning from the Apache Software Foundation.

TensorFlow—The most popular open source library for ML from Google.

Python—The most popular machine learning programming language. PyTorch, an open source ML framework built for Python, is currently the fastest-growing ML framework.

R—The most popular analytics and statistical programming language; usage is decreasing as new ML techniques are increasingly adopted in Python.

I recommend keeping these terms and definitions top-of-mind as you begin exploring new solutions and working to build an AI and ML strategy for your business.

Evolving Your Business with AI and ML Tools on AWS

Customers across industries are already putting these tools to work. Royal FloraHolland, the world’s largest flower auction company, worked with APN Partner Xebia on a machine learning solution that delivers more accurate trolley predictions, which can lead to greater operational efficiencies, better customer outcomes, and Euros saved.

Royal FloraHolland also uses a deep learning model to check image quality and provide feedback for growers reaching buyers through flower photos, and to create a recommendation engine for buyers using the company’s application.

Video: Xebia & Royal FloraHolland Use ML to Improve Efficiency

Another great example is Lyft, which uses Machine Learning on AWS to detect anomalies in their data at scale and identify business risk. Lyft is one of the world’s leading ride-sharing organizations, and to accurately detect anomalies that could signal larger problems and require immediate attention, they turned to the automation and ML capabilities of Anodot, an APN Advanced Technology Partner with the AWS Machine Learning Competency.

Anodot’s AI-powered time series analytics solution, built on AWS, uses advanced machine learning algorithms to overcome limitations that humans bring to manual data analysis, identifying potential problems in real-time without having to inspect multiple dashboards manually.

Become an AWS Machine Learning Partner

If you’re an APN Partner looking to build a Machine Learning on AWS practice, check out the AWS Navigate Program. AWS Navigate provides a prescriptive path for APN Partners to build a specialized practice on AWS. Through a series of foundational and specialized e-learnings, you can unlock advanced resources and step-by-step guidance for building a successful practice or solution on AWS within the specialized area.