Let's start with a bit of philosophy, and I ask you not to scroll straight to the technical part. There are some big ideas here.

Folks who have worked through the tutorials in this session are already definitely strong and know a lot.

I think this notebook (maybe an article) will be more useful for participants of the next session. A lot of tutorials from the previous course played an important role for me.

Why do I think this course is the best of the best? Two things. First, the excellent presentation of the material. Second, the motivation mechanism. It is awesome:
COMPETITIONS. The competitive process gives rise to the best solutions and motivates the participants. So it was with me.

But what if no new feature for the regression comes to mind anymore?

And the Medium competition has already killed the kernel on Google Colab several times. Has your computer become a brick?

Meanwhile, a guy (or girl) from the other end of the world successfully beats the baselines and reaches the top 1%. So come on, you piece of iron! Let's try to tune some parameters of the model!
Restarting. Oh sh*t... I'll go to sleep. Tomorrow I will optimize:

[ ] code

[ ] calculation

[ ] hardware

[ ] brain

Forget it.

Try out a new approach (hello, n-grams in TF-IDF).
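For the n-gram idea, here is a minimal sketch using scikit-learn's `TfidfVectorizer`; the toy corpus below is made up purely for illustration, not data from the competition.

```python
# A minimal sketch of adding word n-grams to TF-IDF features.
# Assumes scikit-learn is installed; the corpus is a toy example.
from sklearn.feature_extraction.text import TfidfVectorizer

corpus = [
    "google com mail google com",
    "youtube com watch youtube com",
]

# ngram_range=(1, 2) keeps single tokens and adds bigrams on top of them
vectorizer = TfidfVectorizer(ngram_range=(1, 2))
X = vectorizer.fit_transform(corpus)

print(X.shape)  # (documents, unigram + bigram features)
print("google com" in vectorizer.vocabulary_)
```

Widening `ngram_range` grows the feature matrix quickly, which is exactly the kind of step that can kill a memory-limited Colab kernel, so it is worth watching the resulting shape.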

See whether adding Alice's time spent on YouTube gives a gain in cross-validation.

Both of these take time, which (almost always) directly depends on the hardware you launch your model on.

Code or computation optimization: a good choice, the right groundwork for the future.
But deadlines, work, study, family... where to find time for all of this?
Plus, I spend a lot of time in transport (metro, train, transfers) and I want to use that time effectively.
In addition, a good idea often comes in the most unexpected place, and I want to test it immediately.

From the above, a third option appears: rent strong hardware in the cloud and connect to it from any device, always having a powerful beast under the hood. "A rom dom dom" (NFS Underground OST).

In fact, there are many companies that offer cloud computing resources. The important aspects for me are that the service be free of charge, flexible, and complete in the opportunities it provides. Therefore, I chose Google.

For light computations you can work with Colab or Kaggle Kernels. But there is a memory limit.

There may be several reasons for a kernel to die. The most frequent is a lack of computing resources (memory). And the most annoying thing is that all cells now need to be re-run. If you have saved the intermediate objects to txt or pickle files, you are the winner. But only practice makes perfect. At the beginning of the learning path, such kernel restarts, first of all, enrage you; secondly, they take a lot of time.
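The pickle trick mentioned above can be sketched as a simple checkpoint pattern: dump heavy objects to disk once, and after a kernel restart reload them instead of recomputing. The file name and the placeholder computation below are illustrative.

```python
# Checkpointing intermediate results so a kernel restart does not cost hours.
# The file name and the placeholder computation are illustrative.
import pickle
from pathlib import Path

CHECKPOINT = Path("features.pkl")

def expensive_computation():
    # stands in for real feature engineering that takes a long time
    return {"n_features": 50_000, "ngram_range": (1, 2)}

if CHECKPOINT.exists():
    with CHECKPOINT.open("rb") as f:
        features = pickle.load(f)       # fast path after a restart
else:
    features = expensive_computation()  # slow path, runs only once
    with CHECKPOINT.open("wb") as f:
        pickle.dump(features, f)

print(features["n_features"])
```

On the second run the `if` branch takes over and the expensive step is skipped entirely.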

For now we are going to use a nice gift from Google in the amount of \$300.

Register an account (I wonder how many of you, dear readers, don't have one yet?).
There is nothing complicated; everything is standard: mail and phone confirmation. Specify a real phone number, since it will be used for billing confirmation.

Let's go to https://console.cloud.google.com/freetrial/signup
You'll see that they give us \$300 and 12 months. I note that my account decreased by only \$10.53 during this course from Professor Yorko and his colleagues.
You can safely confirm the card (the system will block \$1 on it for about an hour on average).

I think there is no point in giving my instance GPU power, because neural networks are not covered in this course. And if you want one, there is a free GPU in Google Colab. But some of the boosting libraries now support the GPU, so keep a close eye on the changes and choose the instance parameters based on your needs.

I felt an urgent need for additional resources when I started using XGBoost: its documentation says there is GPU support, but when I ran it on the Colab GPUs, it began to kill the kernels.

#### Change the Boot Disk to Ubuntu 16.04. But that is your choice.

#### Configure the firewall and remove the checkbox on the "Disks" tab.

#### Press the Create button, and voilà: we have strong hardware to ride with.
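If you prefer the command line to the web console, roughly the same instance can be created with the `gcloud` CLI. This is a sketch, not the exact configuration from the screenshots: the instance name, zone, and machine type below are assumptions you should adjust to your needs (and the command requires an authenticated Google Cloud account).

```shell
# Sketch: create a VM similar to the one configured above, via the gcloud CLI.
# Instance name, zone and machine type are illustrative; adjust to your needs.
gcloud compute instances create mlcourse-instance \
    --zone=europe-west1-b \
    --machine-type=n1-standard-4 \
    --image-family=ubuntu-1604-lts \
    --image-project=ubuntu-os-cloud

# Then connect to it from any device:
gcloud compute ssh mlcourse-instance --zone=europe-west1-b
```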