My team just finished a Kaggle competition to help Home Depot improve their search results. We tried a few machine learning algorithms along the way: starting with linear and logistic regression, moving on to random forests, and experimenting with XGBoost. In this post I’ll focus on random forests because they’re easy to use and perform well on a wide variety of problems. We placed in the top 5% with a random forest model, though the top finishers combined several models.
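To make that concrete, here is a minimal random forest sketch using scikit-learn. The data here is synthetic (stand-ins for query/product similarity features), not the actual competition features, and the parameter choices are illustrative:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.RandomState(0)
# Fake features standing in for things like query/title overlap scores.
X = rng.rand(500, 5)
# Fake "relevance" target with a known signal plus noise.
y = 2 * X[:, 0] + X[:, 1] + rng.normal(0, 0.1, 500)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# A random forest averages many decision trees, each trained on a
# bootstrap sample of the data with a random subset of features.
model = RandomForestRegressor(n_estimators=100, random_state=0)
model.fit(X_train, y_train)
print(round(model.score(X_test, y_test), 2))
```

The appeal is that this works with almost no tuning: sensible defaults, no feature scaling required, and built-in protection against overfitting from averaging many trees.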

A Markov chain models a series of random transitions from one state to another, with the next state depending only on the current state. This idea has plenty of applications; one of them happens to be generating weird text, and that’s exactly what we’ll do in this post.
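A quick sketch of the idea: treat each word as a state, record which words follow it in some corpus, then walk the chain by repeatedly sampling a successor of the current word. The tiny corpus and function names below are my own, just for illustration:

```python
import random
from collections import defaultdict

def build_chain(text):
    """Map each word to the list of words that follow it in the text."""
    words = text.split()
    chain = defaultdict(list)
    for cur, nxt in zip(words, words[1:]):
        chain[cur].append(nxt)
    return chain

def generate(chain, start, length=10, seed=0):
    """Walk the chain: the next word depends only on the current word."""
    random.seed(seed)
    out = [start]
    for _ in range(length - 1):
        successors = chain.get(out[-1])
        if not successors:  # dead end: no word ever followed this one
            break
        out.append(random.choice(successors))
    return " ".join(out)

corpus = "the cat sat on the mat the cat ate the rat"
chain = build_chain(corpus)
print(generate(chain, "the"))
```

With a real corpus (a novel, a chat log) the output gets weirder and more entertaining, since common words like “the” branch to many possible successors.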

The author of Homebrew was recently asked to “invert a binary tree” during a Google interview. Well, he didn’t get the job, and many a blogger cried foul at the pitiful state of technical interviews. I agree that coding interviews can be arbitrary and asinine – but I couldn’t help myself. I had to take the binary tree challenge to validate my existence as a programmer.
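For the record, one common reading of “invert a binary tree” is: mirror it by recursively swapping every node’s left and right children. A sketch under that assumption (the class and function names are mine):

```python
class Node:
    def __init__(self, value, left=None, right=None):
        self.value = value
        self.left = left
        self.right = right

def invert(node):
    """Mirror the tree by swapping children at every node."""
    if node is None:
        return None
    node.left, node.right = invert(node.right), invert(node.left)
    return node

#     1               1
#    / \    becomes  / \
#   2   3           3   2
root = invert(Node(1, Node(2), Node(3)))
print(root.left.value, root.right.value)  # prints "3 2"
```

Four lines of logic, which is part of why the question drew so much mockery: it tests whether you can write a basic recursion under pressure, not whether you can build a package manager.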