
It’s been 9 months since my last Wikidata map update, and once again many new noticeable areas have appeared, including Norway, South Africa, Peru and New Zealand to name but a few. As with the last map generation post, I once again created a diff image so that the areas of change are easy to identify, comparing the data from July 2017 with that from my last post in October 2016.

A group of 7 developers worked on the app over a few days; as well as meeting and learning from each other, they managed to work on various improvements, which I have summarised below.

2 factor authentication (nearly)

Work has been done towards allowing 2FA logins in the app.

Lots of the login & authentication code has been refactored, and the app now uses the clientlogin API module provided by MediaWiki instead of the older login module.

In debug builds the 2FA input box will appear if you have 2FA login enabled; the current production build, however, will not show this box and simply displays a message saying that 2FA is not currently supported. This is due to a small amount of session-handling work that the app still needs.
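For flavour, the clientlogin handshake looks roughly like this. This is a sketch in Python against the real MediaWiki clientlogin response format; the helper functions and their names are my own illustration, not the app’s actual code:

```python
# Sketch of the clientlogin 2FA handshake. The "status", "requests" and
# "fields" keys follow the MediaWiki clientlogin response format; the
# helper functions themselves are hypothetical.

def needs_two_factor(clientlogin):
    """True when the first clientlogin POST answers with a UI request
    asking for a one-time token, i.e. the account has 2FA enabled."""
    if clientlogin.get("status") != "UI":
        return False
    return any(
        "OATHToken" in request.get("fields", {})
        for request in clientlogin.get("requests", [])
    )

def continue_login_params(oath_token):
    """Parameters for the follow-up POST that supplies the 2FA code.

    logincontinue=1 tells the API that this request answers the earlier
    UI response rather than starting a fresh login attempt."""
    return {
        "action": "clientlogin",
        "logincontinue": "1",
        "OATHToken": oath_token,
        "format": "json",
    }
```

The session-handling work mentioned above is essentially keeping the cookies from the first POST alive, so that the second, logincontinue request lands in the same login session.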

Better menu & Logout

As development on the app was fairly non-existent between mid 2013 and 2016, the UI generally fell behind. This is visible in forms and buttons as well as the overall app layout.

One significant push was made to drop the old-style ‘burger’ menu from the top right of the app and replace it with a new slide-out menu drawer, including a feature image and icons for the menu items.

Uploaded images display limit

Some users have run into issues with the number of upload contributions that the app loads by default in the contributions activity. The default has always been 500, which can cause memory exhaustion (OOM) and a crash on some memory-limited phones.

In an attempt to fix this and generally speed up the app, a recent-uploads limit has been added to the settings. It limits the number of images and image details that are displayed; the app will, however, still fetch and store more than this on the device.

Nearby places enhancements

The nearby places enhancements probably account for the largest portion of development time at the pre-hackathon. The app has always had a list of nearby places that don’t have images on Commons, but now the app also has a map!

The map is powered by the Mapbox SDK, and the current beta uses the Mapbox tiles; part of the plan for the Vienna hackathon, however, is to switch to the Wikimedia-hosted map tiles at https://maps.wikimedia.org.

The map also contains clickable pins that open a small pop-up pulling information from Wikidata, including the label and description of the item, as well as two buttons to get directions to the place or read the Wikipedia article.
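The label and description lookup can be sketched against the real wbgetentities API module (a Python sketch; the helper names and the hard-coded English language are my assumptions, not the app’s code):

```python
# Sketch of fetching the label and description a pin pop-up needs.
# wbgetentities and its parameters are the real Wikidata API module;
# the two helper functions are illustrative.
from urllib.parse import urlencode

WIKIDATA_API = "https://www.wikidata.org/w/api.php"

def entity_request_url(item_id, language="en"):
    """Build the wbgetentities URL asking only for labels and descriptions."""
    params = {
        "action": "wbgetentities",
        "ids": item_id,
        "props": "labels|descriptions",
        "languages": language,
        "format": "json",
    }
    return WIKIDATA_API + "?" + urlencode(params)

def label_and_description(response, item_id, language="en"):
    """Pull the two strings a pin pop-up shows out of the API response."""
    entity = response["entities"][item_id]
    label = entity.get("labels", {}).get(language, {}).get("value")
    description = entity.get("descriptions", {}).get(language, {}).get("value")
    return label, description
```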

Image info coordinates & image date

Extra information has also been added to the image details view: the date and coordinates of an image can now be seen in the app.

Summary of hackathon activity

The contributions and authors that worked on the app during the pre-hackathon can be found on GitHub at the following link.

Roughly 66 commits were made between the 11th and 19th of May 2017 by 9 contributors.

Back in 2013, maps were generated almost daily to track the immediate usage of the then-new coordinate location within the project. An animation was then created by Denny & Lydia showing the amazing growth, which can be seen on Commons here. Recently we found the original images used to make this animation, starting in June 2013 and extending to September 2013, and to celebrate the fourth birthday of Wikidata we decided to make a few new animations.

The Google Assistant is essentially a chat bot that you can talk to within the new Allo chat app. The Assistant is also baked into some new Google hardware, such as the Pixel phones. During a quick test of the Assistant, I noticed that if you ask it to “tell me an interesting fact” it will sometimes respond with facts from Wikipedia.

The RevisionSlider is an extension for MediaWiki that has just been deployed on all Wikipedias and other Wikimedia websites as a beta feature. The extension was developed by Wikimedia Germany as part of their focus on the technical wishes of the German-speaking Wikimedia community. This post will look at the RevisionSlider’s design, development and use so far.

I start this post not by talking about Facebook, but about Google Photos. Google now offers unlimited ‘high resolution’ image storage within its service, where high resolution is defined as 16MP for an image and 1080p for video. Of course there is some compression here that some may argue against, but photos and videos can also be uploaded at original quality (exactly as captured), and the cost of space for these files is very reasonable. So, it looks like I have found a new home for my piles of photos and videos that I want to be able to look back at in 20 years!

Prior to the Google Photos developments I stored a reasonable number of images on Facebook, and now I want to add them all to Google Photos too, but that is not as easy as I first thought. All of your Facebook data can easily be downloaded, including all of your images and videos, but not exactly as they were when you uploaded them: the EXIF data, such as location and timestamp, has been stripped. This data is actually available in an HTML file which is served with each Facebook album. So, I wrote a terribly hacky PHP-for-Windows script to extract that data and re-add it to the files so that they can be bulk uploaded to Google Photos and take advantage of the timeline and location features.

The PHP code can be found below (it looks horrible, but it works…)


<?php

// README: Set the path to the extracted facebook dump photos directory here

I would rewrite it, but I have no need to (it works). When searching online for some code to do just this I came up short, so I thought I would post the rough idea and process for others to find, and perhaps improve on.
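For anyone wanting to improve on it, the rough process can be sketched like this. This is a Python reconstruction of the idea, not the original PHP; the “Taken:” pattern and the date format are guesses at the dump’s HTML, so adjust them to whatever your download actually contains:

```python
# Sketch: pull dates out of a Facebook album HTML page and convert them
# to the EXIF DateTime format, ready to be written back into the files
# (e.g. with exiftool). The regex and date format are assumptions.
import re
from datetime import datetime

def exif_datetime(facebook_date):
    """Convert a human date string from the album HTML into EXIF's
    'YYYY:MM:DD HH:MM:SS' format (time defaults to midnight)."""
    parsed = datetime.strptime(facebook_date, "%A, %d %B %Y")
    return parsed.strftime("%Y:%m:%d %H:%M:%S")

def photo_dates(album_html):
    """Map each image file mentioned in the album page to its date string.

    The regex is a stand-in for whatever structure the dump really uses."""
    pairs = re.findall(r'src="([^"]+\.jpg)".*?Taken: ([^<]+)<', album_html, re.S)
    return dict(pairs)
```

With that mapping in hand, the last step is just writing the converted timestamp into each file’s EXIF before uploading the batch to Google Photos.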

So, the biggest turnout at a UK-wide referendum, at 72.2% (we have only had 3, though).

It was so close: 27.8% didn’t vote, and thus 34.7% of the electorate voted to remain and 37.5% voted to leave. The pie chart really emphasises this.

As for comparing the 2 referendums: the vote to join the EEC in 1975 saw 17.3 million vote to join, with only 8.4 million against.
With Brexit a similar number (in the scheme of things) wanted to remain, at 16.1 million, but a whopping 17.4 million wished to leave.
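The electorate-share figures follow directly from the turnout and the vote split; using the numbers in the post:

```python
# Share of the whole electorate = turnout * (votes for option / total votes)
turnout = 72.2                # percent of the electorate who voted
remain, leave = 16.1, 17.4    # millions of votes
total = remain + leave

print(round(turnout * remain / total, 1))  # → 34.7 (remain, % of electorate)
print(round(turnout * leave / total, 1))   # → 37.5 (leave, % of electorate)
```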

Interestingly, the 1975 referendum on joining the European Communities and the Brexit referendum both had a higher turnout than any European Parliament election. The highest turnout for the European Parliament elections was in 1994, with 49.4%.

I originally posted about the Wikidata maps back in early 2015 and have followed up with a few posts since, looking at interesting developments. This is another one of those posts, covering the changes from the last post (late 2015) to now, May 2016.

To the naked eye the new maps look very similar, and the new ‘big’ map can be seen below.

So, while at the 2016 Wikimedia Hackathon in Jerusalem, I teamed up with @valhallasw to generate some diffs of these maps, in a slightly more programmatic way than in my posts following the 2015 Wikimania!