Month: May 2016

With three rounds of Allsvenskan games played since my last post, it’s time for an update. Like last time, I’m just going to throw a few visualizations at you together with my initial thoughts without going too much in-depth.

Starting out with the league table, we see just how close the league has been so far – with eight games played, only four points separate Östersund in 11th place from Malmö in 2nd.

We can also see some interesting streaks since last time, with Norrköping and Elfsborg winning all three games while Hammarby, Gefle and Falkenberg have been struggling. Looking at the early surprise teams we see that Sundsvall have continued to perform well while Jönköpings Södra have dropped in the table.

Looking at shots, we see how Hammarby, Kalmar and Norrköping have all moved into the ‘busy attack, quiet defence’ quadrant, indicating that they’ve played a bit better lately (or faced easier opposition!), while Sundsvall are still stuck in the ‘quiet attack, busy defence’ quadrant.

While Malmö produce a lot of shots, they’re still one of the most ineffective sides up front. Göteborg and Norrköping, on the other hand, are enjoying some effective scoring at the moment.

Sundsvall are still conceding a lot of shots, but at least those shots aren’t converted into goals very often – which in part explains their good results so far. Elfsborg have moved into the ‘formidable’ defensive quadrant, conceding only one goal in the last three games.

Looking at Expected Goals, Malmö are still clearly the best team, with Norrköping improving while Djurgården have dropped a bit. Here we really see the difference between the early surprise teams’ recent performances, as Sundsvall have improved both their attacking and defensive numbers while Jönköpings Södra have done the opposite.

So how would the teams rank xG-wise? Expected Goals Difference should serve well as a measure of skill, and here we again see how the model ranks Malmö as the best side so far, with Norrköping and AIK the main contenders. A bottom three of Helsingborg, Gefle and Falkenberg has also emerged.
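The ranking itself is just a sort on xG for minus xG against. A minimal sketch – the team names match the sides discussed above, but the xG numbers here are made up purely for illustration:

```python
# Rank teams by Expected Goals Difference: xGD = xG for - xG against.
# The numbers below are illustrative placeholders, not the model's actual output.
teams = {
    "Malmö": (14.2, 6.1),
    "Norrköping": (12.8, 7.9),
    "AIK": (11.5, 7.2),
}

# Sort descending by xGD.
table = sorted(teams.items(), key=lambda kv: kv[1][0] - kv[1][1], reverse=True)
for name, (xg_for, xg_against) in table:
    print(f"{name}: {xg_for - xg_against:+.1f}")
```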

Another way of evaluating the teams’ performance so far is to simulate how many points on average each team would’ve received from their games. To do this I’ve used the shots from each game to simulate the result 10,000 times and the teams have then been awarded Expected Points based on the derived 1X2 probabilities.

For example, if the simulation came up with probabilities of 0.5, 0.3 and 0.2 for home win, draw and away win respectively, then the home side would be awarded 0.5*3 + 0.3*1 = 1.8 Expected Points, while the away side would get 0.2*3 + 0.3*1 = 0.9 Expected Points.
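The whole procedure can be sketched in a few lines. This is not my actual model code – it assumes each shot is an independent scoring chance with probability equal to its xG value, and the shot lists below are invented for illustration:

```python
import random

def simulate_match(home_xg_shots, away_xg_shots, n_sims=10_000, seed=42):
    """Simulate a match n_sims times, treating each shot's xG value as an
    independent scoring probability, and return the 1X2 probabilities."""
    rng = random.Random(seed)
    home_wins = draws = away_wins = 0
    for _ in range(n_sims):
        home_goals = sum(rng.random() < xg for xg in home_xg_shots)
        away_goals = sum(rng.random() < xg for xg in away_xg_shots)
        if home_goals > away_goals:
            home_wins += 1
        elif home_goals == away_goals:
            draws += 1
        else:
            away_wins += 1
    return home_wins / n_sims, draws / n_sims, away_wins / n_sims

def expected_points(p_win, p_draw):
    """Expected Points from win/draw probabilities: 3 for a win, 1 for a draw."""
    return 3 * p_win + 1 * p_draw

# Hypothetical shot lists: each number is one shot's xG value.
home_shots = [0.35, 0.10, 0.08, 0.22, 0.05]
away_shots = [0.12, 0.07, 0.30]

p_home, p_draw, p_away = simulate_match(home_shots, away_shots)
print(expected_points(p_home, p_draw), expected_points(p_away, p_draw))
```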

Here’s a table of the teams’ Expected Points so far:

But a team can’t get 1.8 points from a game, only 0, 1 or 3 – so how have the teams performed compared to their Expected Points?

Note: Malmö have been awarded a 3-0 win against Göteborg as the game was abandoned due to home fans throwing pyrotechnics towards a Malmö player. These points have been included.

Here we see how Helsingborg and Sundsvall have taken quite a lot more points than expected, while Falkenberg and Kalmar have done the opposite. This could be the result of some good/bad luck, but it could also mean that the model fails to properly assess the quality of these teams.

Let’s dig deeper and have a look at the Expected Points distribution of each team:

Looking at these distributions we can see just how extreme the results have been for some of the teams so far. In fact, my model estimates that if we re-played Helsingborg’s games 10,000 times, they would get 13 points or more only about 5% of the time!
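That 5% figure falls straight out of the season simulation: count how often the simulated point total reaches the actual tally. A sketch along those lines, assuming we already have per-game 1X2 probabilities (the probabilities below are made up, not Helsingborg's actual numbers):

```python
import random

def points_percentile(game_probs, actual_points, n_sims=10_000, seed=1):
    """Simulate season point totals from per-game (win, draw, loss) probabilities
    and return the share of simulations reaching at least actual_points."""
    rng = random.Random(seed)
    at_least = 0
    for _ in range(n_sims):
        total = 0
        for p_win, p_draw, p_loss in game_probs:
            r = rng.random()
            if r < p_win:
                total += 3          # simulated win
            elif r < p_win + p_draw:
                total += 1          # simulated draw
        at_least += total >= actual_points
    return at_least / n_sims

# Hypothetical (win, draw, loss) probabilities for eight games.
games = [(0.25, 0.30, 0.45)] * 8
print(points_percentile(games, 13))
```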