The biggest and perhaps best source of data about what people like to watch on the internet and what they would pay for doesn’t come from streaming giants like Netflix, Amazon Prime Video, or Hulu. It comes from porn.

Recently I found a paper being presented at NeurIPS this year, entitled Neural Ordinary Differential Equations, written by Ricky Chen, Yulia Rubanova, Jesse Bettencourt, and David Duvenaud from the University of Toronto. The core idea is that certain types of neural networks are analogous to a discretized differential equation, so maybe using off-the-shelf differential equation solvers will help get better results. This led me down a bit of a rabbit hole of papers that I found very interesting, so I thought I would share a short summary/view-from-30,000 feet on this idea.
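To make the analogy concrete, here is a minimal sketch (my own illustration, not code from the paper): a residual block's update x ← x + f(x) is exactly one Euler step for the ODE dx/dt = f(x), so stacking blocks amounts to integrating those dynamics, which an off-the-shelf adaptive solver can do instead. The weight matrix `W` and the function `f` below are stand-ins I made up for this sketch.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Stand-in for learned weights; any small matrix works for the illustration
W = np.array([[0.0, -1.0],
              [1.0,  0.0]])

def f(t, x):
    # A tiny "layer": nonlinearity applied to a linear map
    return np.tanh(W @ x)

x0 = np.array([1.0, 0.0])

# Discrete view: N residual blocks = N Euler steps of size h
h, N = 0.1, 10
x = x0.copy()
for _ in range(N):
    x = x + h * f(0.0, x)

# Continuous view: integrate the same dynamics with an adaptive ODE solver
sol = solve_ivp(f, (0.0, h * N), x0, rtol=1e-6)
x_ode = sol.y[:, -1]
```

The two endpoints agree up to Euler's discretization error, which is the observation the paper builds on: treat depth as continuous time and let the solver choose the step sizes.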

Generating long pieces of music is a challenging problem, as music contains structure at multiple timescales, from millisecond timings to motifs to phrases to repetition of entire sections. We present Music Transformer, an attention-based neural network that can generate music with improved long-term coherence.

With the emergence of big data and machine learning technologies, a great deal of data became available that previously had to be deduced or speculated about. Rooted in more credible sources, this data made it possible to apply more complex methods of analysis and deliver value-added benefits for the business.

As a self-taught data scientist or would-be data scientist, deciding what to learn at any given time is important. However, it can become a stressful task in itself because there are a lot of materials and new concepts to cover. For some people, the challenge is that they lack the motivation to keep learning: they quit after a week or two, then start all over again, and have been going in this cycle ever since.