Welcome back to my series of video tutorials on effective machine learning with Python's scikit-learn library. In the first three videos, we discussed what machine learning is and how it works, we set up Python for machine learning, and we explored the famous iris dataset. This week, we're going to learn about our first machine learning model and use it to make predictions on the iris dataset!

Video #4: Model training and prediction

What is the K-nearest neighbors classification model?

What are the four steps for model training and prediction in scikit-learn?

How can I apply this pattern to other machine learning models?

Unlike most other machine learning models, K-nearest neighbors (also known as "KNN") can be understood without a deep knowledge of mathematics. First, you might visualize your training data on a coordinate plane, with the x and y coordinates representing the feature values and the color representing the response class:
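The kind of plot described above can be sketched in a few lines. This is a minimal illustration, not taken from the video itself: it plots two of the four iris features on a coordinate plane, with color indicating the response class (the output filename `iris_scatter.png` is my own choice).

```python
# A minimal sketch: training data on a coordinate plane, where the
# x and y coordinates are two feature values and color is the class.
import matplotlib
matplotlib.use("Agg")  # non-interactive backend, so this runs without a display
import matplotlib.pyplot as plt
from sklearn.datasets import load_iris

iris = load_iris()
X, y = iris.data, iris.target

# Use petal length and petal width as the two plotted features
plt.scatter(X[:, 2], X[:, 3], c=y)
plt.xlabel(iris.feature_names[2])
plt.ylabel(iris.feature_names[3])
plt.savefig("iris_scatter.png")
```

With all four features, a plot like this is no longer possible in two dimensions, which is why the examples stick to feature pairs.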

KNN can predict the response class for a future observation by calculating the "distance" to all training observations and assuming that the response class of nearby observations is likely to be similar. These predictions can be visualized using a classification map:

Watch the video to learn how KNN works in much more detail, and then we'll use the KNN model in scikit-learn to actually make predictions! You'll also see how easy it is in scikit-learn to "tune" your model or to try a different classification model.
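To give a flavor of the "tuning" and model-swapping mentioned above, here is a minimal sketch. Tuning here simply means trying a different value of K, and swapping models means instantiating a different scikit-learn class; the fit/predict interface stays the same. The `max_iter` value is an illustrative choice to ensure convergence:

```python
from sklearn.datasets import load_iris
from sklearn.neighbors import KNeighborsClassifier
from sklearn.linear_model import LogisticRegression

iris = load_iris()
X, y = iris.data, iris.target

# "Tune" the KNN model by trying a different value of K
knn = KNeighborsClassifier(n_neighbors=1)
knn.fit(X, y)

# Trying a different classification model is just as easy,
# because every scikit-learn model uses the same fit/predict pattern
logreg = LogisticRegression(max_iter=1000)
logreg.fit(X, y)

new_observation = [[3, 5, 4, 2]]
print(knn.predict(new_observation))
print(logreg.predict(new_observation))
```

The two models may disagree on the predicted class, which is exactly why model evaluation (the topic of the next video) matters.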

Next time, we'll discuss model evaluation procedures, which allow us to estimate how well our models are likely to perform. Model evaluation is a critical machine learning skill because it helps you to tune your models for best performance and to choose between models.

Going forward, videos will be released every other Wednesday so that I can continue to create high-quality lessons without sacrificing the depth with which I present the content. Please subscribe on YouTube to be notified when the next video is released, and then visit YouTube's Subscription Manager to confirm that email notifications are enabled for the Data School channel.

If you have a question or a comment, I'd love to hear from you below. See you next time!

There's no way (that I know of) to build a classification map that can be easily visualized if more than two features are used.

Keith Perkins

I've looked at a lot of ML information over the past few months. Your videos and ipynbs are some of the best. Please keep them coming.

Vikram

Thanks, this tutorial is very helpful.

Ramya R

Dear Kevin, thank you very much for this tutorial. It is very helpful and easy to understand, as you speak at a slow pace and provide detailed descriptions of concepts. The additional resources are also very useful.