Month: October 2014

But if London is such a wonderful place to live, why do so many people want to get out? One reason for wanting to leave is the sheer scale of churn, which makes stable communities increasingly rare. According to the UCL publication London 2062 (edited by Sarah Bell and James Paskins), London’s ‘revolving door’ saw 7.3m inflows and 6m outflows in the period 2002–2011; that means the equivalent of about three quarters of the capital left in the past decade, and almost a whole new city settled in. In around one third of the 33 London boroughs, the equivalent of half the population moves in or out every five years.

This ideology of progress amounts to a chronological form of ethnocentrism. Chronological ethnocentrism, then, is the belief that we now live in a better society than past societies did. Ethnocentrism, of course, is the anthropological term for the attitude that our society is better than any other society now existing, and that theirs are OK to the degree that they are like ours.

Chronological ethnocentrism plays a helpful role for history textbook authors: it lets them sequester bad things, from racism to the robber barons, in the distant past. Unfortunately for students, it also makes history impossibly dull, because we all “know” everything turned out for the best. It also makes history irrelevant, because it separates what we might learn about, say, racism or the robber barons in the past from issues of the here and now. Unfortunately for us all, just as ethnocentrism makes us less able to learn from other societies, chronological ethnocentrism makes us less able to learn from our past. It makes us stupider.

From the same article:

since Thomas Dewey in 1948, no major party candidate with facial hair has ever run for president, and Dewey wore only the smallest of mustaches.

Finally, check out Here And There Along The Echo. Damn those talented bastards; damn them to Hades. Glad it exists; same way I feel about Simogo, same way I feel about 80 Days. Just a sense of genuine admiration.

Gamergate is bullshit, and it’s certainly not about ethics in journalism. Threats and harassment against women in gaming are reprehensible, whatever the supposed justification, and it’s astonishing that the very people who are pushing the boundaries of what gaming can do and express are the ones being attacked.

Now, I had hoped it would go without saying that that’s my opinion, mainly because I was afraid of saying it out loud. That’s right – even though I own and run a company that, by definition, I cannot be fired from, and even though I don’t have any advertisers who can be threatened, I’m still a bit afraid of speaking out. That’s the chilling effect of Gamergate, and it’s what has made me shirk my responsibility to speak out against it. I thought it better to keep my head down – which is terrible.

I don’t have a lot to add to the discussion, other than to say two things.

Firstly, publicly condemning Gamergate is a good start, but it’s not enough. Whoever you are, if you love and enjoy games, it’s your responsibility to support and champion those people who are taking a risk – whether that’s people making games that are not horribly sexist and that preferably promote feminism, or people who critique the portrayal of women in games (among other troubling portrayals). Do what you feel able to – and then some more. You’re on the right side of history, and this is the moment where your actions will count the most.

Secondly, if I were 17 right now, there’s a good chance I would have sympathised with the Gamergaters. I found it difficult to talk to and relate to girls, and that made me resentful and defensive. I felt my life was difficult enough, and the notion that women had it any harder than I did was incomprehensible. I choose the word ‘incomprehensible’ because I really mean it: I had no idea of the level of harassment and unfairness they experienced in life, and I assumed anyone telling me otherwise was obviously mistaken and just attacking me.

Thankfully I didn’t obsess over this, and I had plenty of other, more productive things to occupy my time. But it wasn’t until a few years later – when I mixed with a wider group of people, read more books, and experienced better art – that I gradually comprehended that other people had far bigger and far different problems than my own. So my story is a positive one: it’s possible for people to grow and mature, if they’re helped.

If we declare that the behaviour of Gamergate is not acceptable; if we support and champion the people making games into better art; if we help those who don’t comprehend to comprehend (which will take a lot of time and patience!) – we move forward to a better world, inch by halting inch.

I was fortunate enough to catch a screening of Interstellar tonight, courtesy of BAFTA. Christopher Nolan surprised the audience by introducing the movie with a few words, comparing-but-not-comparing it with 2001.

It’s not as good as 2001 – but you could say that about almost any movie. Is it a great movie, though? No. Is it a good movie? Maybe. If you like Nolan’s other movies and you like science fiction and incredible visuals, it’s certainly worth watching Interstellar; there are many moments and entire sections of the movie which are absolutely stunning. Unfortunately, they’re marred by an often plodding and predictable story, flat characterisation, and confusing cinematography.

There is an excellent 90 minute movie hiding in Interstellar, and probably another very decent 30 minute short film. Unfortunately, you’ll have to sit through the whole 170 minutes to get to them.

Using Adblock on my desktop browser gives me a completely unrealistic view of the internet. Websites magically become temples to content and information; they are unsullied by commercial interests and bias; they place my interests as a reader above all else. I can’t imagine using the internet without it. I realise I’m potentially depriving sites of ad revenue (which is why I subscribe to my favourites) but I confess I don’t find it hard to justify.

You can’t get an adblocker for Safari on the iPhone or iPad. For some reason this hasn’t bothered me in the past, partly because I used Instapaper and RSS readers a lot, and partly because zooming into specific columns of text meant that ads in sidebars tended to be hidden.

But no longer. Cast your eyes upon this travesty:

Captured from my iPad from the Guardian yesterday. The ad animation is artificially lengthened in this video due to the capturing process, but when I timed it separately, it lasted for 30 seconds (15 seconds, looped twice).

The bright colours combined with the motion graphics made it impossible for me to concentrate on reading the article – and because it was in-line with the text, and because the iPad shows quite a lot of text in a single screen, I couldn’t simply scroll past it quickly as I might do on a phone. My only option was to turn on Reader mode, stripping out all the noise from the page.

I dearly hope that distracting ads like this don’t become more common, as I’ll have to start using Reader mode all the time, or investigate using an alternative browser that can block ads (on Android, I believe Firefox can do this through extensions).

A new feature of iOS 8 is Apple’s Health app. It’s a way for users to view any health data that has been collected by built-in sensors in the device itself (such as step counts from the phone’s specialised accelerometers), along with data that can be added by third-party apps (such as your weight, as recorded by a set of smart scales).

The dashboard that Apple supplies is deliberately basic. Everything that can be graphed is graphed, albeit in a very stripped-down way; no scrolling, no trendlines, no regular axis labels, and so on. They’re so spartan that I’m not really sure why Apple included them at all. But the true purpose of Apple Health is not as a pretty dashboard, but rather as a way for apps to share health data with one another.

In Apple’s world, users of a running app wouldn’t have to manually enter their weight in order to calculate an accurate ‘calories burned’ figure; instead, they would authorise the running app to access their Health data so it always has the most up-to-date weight information. Likewise, the running app would synchronise its calorie burn information back to Health so that (for example) a dieting app can have a better view of calories out vs. in.

None of this happens automatically. Developers must specifically build in HealthKit functionality, and while Apple may have hoped that everyone would eagerly jump onto their bandwagon, many of the most popular apps have been dragging their heels. The reason is that while adding HealthKit functionality isn’t particularly difficult from a technical perspective, it poses troubling business issues for some companies. Take Fitbit, for example. Why would they contribute the data their pedometer collects – step count, distance, floors climbed, and soon, heart rate – to Apple Health when it could result in their customers using a non-Fitbit app to view that data?

Not only do they lose control of the customer experience; not only do they lose the ability to sell their Fitbit Premium subscription; but worst of all, they become commoditised. They’re just the same as any other cheap pedometer, because as far as the customer is concerned, all they are is a bit of plastic that sends bits to a phone.

Fitbit has even more cause to worry with the iPhone 5s and iPhone 6, since both phones include specialised accelerometers that allow them to record steps with effectively zero battery consumption. Indeed, the iPhone 6 includes a barometer that allows it to record floor counts. Theoretically, this means that anyone who owns those phones has absolutely no need of any dedicated pedometer device, Fitbit or not.

In practice, Fitbit is still doing fine. People still buy their devices, and I still use my own Fitbit. Firstly, the data appears to be more accurate:

All the screenshots in this post were taken at the same time; you can see that on Saturday 25th October, I walked a lot of steps and climbed a lot of flight-equivalents.

Now, while most health professionals will tell you that consistency is more important than precision when it comes to step counts (i.e. it’s more important to know that you’re doing 20% more steps than yesterday, rather than knowing you did precisely 1000 more steps), it’s still nice to see your steps tick up reliably. As for floor climbing, Apple’s sensors woefully underestimate the true count, which is disappointing. But two things are even more important than precision.

One: The Fitbit is almost always with me, clipped to my belt, while (amazingly) I don’t always carry my phone with me; hence more complete records.

Two: Viewing my step count on my Fitbit takes about three seconds. On Apple Health, it takes more like ten seconds (although I could probably get an app that might accelerate that). So I look at my Fitbit more frequently.

Having said all of that, the Apple Watch will eliminate all of Fitbit’s advantages in terms of accuracy and accessibility (due to its fixed position on my wrist) and I suspect that will be the end of my Fitbit-using days.

A couple of days ago, I sat next to a student on the train who was creating a PowerPoint presentation. She had started on a slide titled “Germany’s Policy of Fulfillment” and was pulling out bullet points from a textbook. Ten words per bullet, four bullets per slide, lots of slides, each on a small question. I’m not sure whether it was for homework or for a presentation.

In any case, the PowerPoint format doesn’t strike me as a very good way of thinking about the causes of World War Two. The low information density of each slide compared to an A4 page, and the inevitably simple structure of bullet points, make it difficult to express complex or subtle arguments; instead, the format encourages a kind of Buzzfeed-ish, “5 top reasons for WW2” listicle.

That said, I still end up writing bullet points in my conference talks about games, partly because it’s expected and partly because it’s easy. So I can’t criticise this student too much.

Part of me wants to do the cooler style of presentation, most skillfully performed by Lawrence Lessig, with slides composed of full-screen photos and single words or sentences. This requires much more preparation, as you can’t simply read the bullet points from your slide (the worst kind of presentation); although I often worry that the images, usually pulled from Flickr or Google Image Search, are just a way to get cheap laughs (e.g. ironic photos, pictures of cats, memes, etc.).

On occasion I’ll do presentations without any slides at all, and just memorise my talk. Despite the fact that this takes just as much preparation as anything else, I get the feeling that my audience ends up dissatisfied, as if I’m not delivering value for money (or time), or making them work harder by having them just listen to me.

Ultimately, I think conference presentations are a pretty terrible way of imparting knowledge. It’s telling that we decry in-person lectures as one of the very worst forms of education at schools and universities – non-interactive, non-personalised, and a pointless exercise in assembling 300 students into a single room – and yet we’re perfectly fine with doing the same at tech or games conferences. Is it because the presentations at tech conferences are so much better? I think not. Some are terrible; and some university lectures are wonderful.

Some information and stories are well-told as lectures; some as videos; some as podcasts; some as books; and some in other ways. My feeling is that good conferences are interesting and enjoyable not so much because of good presentations (because if I’m interested in the topic, it’s rare I learn anything genuinely new) but because of the special atmosphere generated by live experiences shared among hundreds of people in the same space, and the conversations that follow.

A 2011 overhaul of girl scouting programs abandoned the old badge system and adopted a set of three “Journeys.” It also aligned badges and leadership opportunities with 21st-century ideas revolving around social issues, professional opportunities for women, and science, technology, engineering and mathematics, the so-called STEM curriculum.

A girl Discovers her special skills and talents, finds the confidence to set challenging goals for herself and strives to live by her values. This includes being proud of where she came from as well as where she’s going.

A girl Connects with others, which means she learns how to team up, solve conflicts, and have healthy relationships. These skills help her in school right now and prepare her for any career she chooses in the future.

A girl Takes Action and makes the world a better place, learning a lot about her community and the world along the way.

Reverse gamification! Less emphasis on a series of unconnected badges, and more on a (hopefully) coherent and meaningful journey. Although at first glance, I wonder if journeys might end up being too broad? It’s hard to strike a balance.

These days, I rarely pirate anything at all. I subscribe to Spotify and Amazon Prime, and I pay the BBC TV Licence Fee. I buy all my books, apps, and games from Apple and Amazon; these are all unimaginably affordable compared to just a couple of decades ago, when a Nintendo 64 game easily cost £80/$130 in today’s money.

I usually see movies at the cinema but will occasionally buy blurays if it’s something special (plus I get screeners from BAFTA); and because I don’t watch much TV any more, I can get by with intermittent subscriptions to Netflix for the purposes of binge-watching Parks and Rec or similar.

That leaves one major exception: US TV shows that aren’t on Netflix or Amazon Instant Video. I believe the only way of legally watching shows like Game of Thrones, Mad Men, The Walking Dead, Marvel’s Agents of SHIELD, The Flash, True Detective, etc. in a timely fashion in the UK is by subscribing to Sky, which has bought up the rights to the most popular US shows. Sky is not cheap, especially if you’re only using it to watch one or two series a week.

In the absence of any way to buy the episodes outright via Apple or Amazon, I download a couple of shows a week. To assuage my guilt, I try to buy the shows when they finally go on sale in the UK. I suppose I could be more patient and just wait, but we live in a global village these days and I like to understand what my friends in the US are talking about when it comes to popular culture.

Things were different when I was a teenager and at university. There was no Spotify or Netflix, no Amazon Prime or iTunes TV Store. I also didn’t have much money. Accordingly, I pirated pretty much everything other than apps and games, which were a hassle to deal with.

Most of it was such poor quality that I’ve since deleted the files or obtained legal copies; but that doesn’t fix everything. I don’t regard piracy as a particularly bad sin – digital content is non-rivalrous, so the concept of ‘theft’ doesn’t strictly apply – but I do think it shows wilful ignorance at best, and contempt at worst, towards artists.

In the olden days (the 90s and 2000s), you could attempt to justify piracy by claiming – somewhat truthfully – that only a tiny percentage of the sale price actually made it back to the artist. Putting aside the way this devalues the contribution of all the non-artists involved, and the fact that even a tiny percentage is better than zero, marketplaces like Steam, iTunes, and Amazon now give many artists substantially higher cuts, from 35% to 70% and beyond. It’s much less palatable to advocate piracy when there’s no question you’re harming the artist financially.

The other argument was that a lot of desirable content was DRMed or not available in certain regions or on certain platforms. That is still the case for a few things including my beloved TV shows, but it’s much less common. As for DRM, it’s effectively vanished from purchased music, and the rise of tightly-integrated digital ecosystems owned by Apple, Amazon, and Google has taken the sting out of DRMed apps and video, for better or worse. I don’t like books being DRMed, but that’s not a good enough excuse for me to not buy them. Having said that, I feel absolutely no guilt in downloading un-DRMed versions of content I’ve already bought – not for sending to friends, but for consuming on incompatible ecosystems.

A few days ago, 73 scientists signed a letter asserting that brain training games – which typically feature puzzle games and mental exercises on smartphones, tablets, PCs, or handheld devices – do not successfully increase general measures of intelligence or memory.

I have long had my doubts about the efficacy of games like Brain Age in improving general intelligence. Doing simple arithmetic exercises, to my mind, only improves your ability to… do simple arithmetic. Supposedly there are some mental exercises that can improve working memory, such as the n-back task, but these are really quite difficult and not fun to do. Still, I have not been a practising neuroscientist or experimental psychologist for several years, so I didn’t feel qualified to comment.
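For the curious, the n-back task is easy to describe in code. Here’s a minimal sketch in Python (the function names and parameters are my own, purely illustrative): the player watches a stream of items and must respond whenever the current item matches the one shown n steps earlier – holding that sliding window in mind is what makes the task so taxing.

```python
import random

def generate_stream(length, n, letters="ABCDEF", match_rate=0.3, seed=None):
    """Generate an n-back stimulus stream, forcing roughly match_rate deliberate matches."""
    rng = random.Random(seed)
    stream = []
    for i in range(length):
        if i >= n and rng.random() < match_rate:
            stream.append(stream[i - n])  # deliberate n-back match
        else:
            stream.append(rng.choice(letters))
    return stream

def nback_targets(stream, n):
    """Indices where the player should respond: stream[i] == stream[i - n]."""
    return [i for i in range(n, len(stream)) if stream[i] == stream[i - n]]

def score(stream, n, responses):
    """Compare a set of response indices against the true targets."""
    targets = set(nback_targets(stream, n))
    responses = set(responses)
    return {
        "hits": len(targets & responses),
        "false_alarms": len(responses - targets),
        "misses": len(targets - responses),
    }

# Example: a 2-back round over a hand-written stream
stream = ["A", "B", "A", "C", "C", "C", "B"]
print(nback_targets(stream, 2))  # → [2, 5]
```

In the real task the stream is paced (one item every couple of seconds) and the difficulty is tuned by raising n, but the scoring logic is exactly this comparison of responses against the n-step-back matches.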

I suggest you read the letter in full, or failing that, the Guardian’s summary (which also handily includes responses from game developers), but there are some important excerpts worth considering:

It is customary for advertising to highlight the benefits and overstate potential advantages of their products. In the brain-game market, advertisements also reassure consumers that claims and promises are based on solid scientific evidence, as the games are “designed by neuroscientists” at top universities and research centers. Some companies present lists of credentialed scientific consultants and keep registries of scientific studies pertinent to cognitive training. Often, however, the cited research is only tangentially related to the scientific claims of the company, and to the games they sell.

Too many times have I seen apps and games that use the badge of being ‘designed by neuroscientists’ as a mark of efficacy and quality. It makes me sick. I don’t doubt the sincerity of their intentions, but they are being misleading. Just as often, I see game designers trot out a long list of papers of varying quality that are barely relevant to the actual experience being offered. This also makes me sick.

…we also need to keep in mind opportunity costs. Time spent playing the games is time not spent reading, socializing, gardening, exercising, or engaging in many other activities that may benefit cognitive and physical health of older adults. Given that the effects of playing the games tend to be task-specific, it may be advisable to train an activity that by itself comes with benefits for everyday life.

Another drawback of publicizing computer games as a fix to deteriorating cognitive performance is that it diverts attention and resources from prevention efforts. The promise of a magic bullet detracts from the message that cognitive vigor in old age, to the extent that it can be influenced by the lives we live, reflects the long-term effects of a healthy and active lifestyle.

People shouldn’t play sudoku or solve crosswords or go to the bingo in the belief that it will make them smarter. They should do these things because they’re fun. If you want to improve your cognitive health, do a range of mental tasks and be physically active – there is plenty of good research demonstrating that this works. Unfortunately, it is more time-consuming and tiring than sitting at home playing on a smartphone, and thus a harder sell.

Do not expect that cognitively challenging activities will work like one-shot treatments or vaccines; there is little evidence that you can do something once (or even for a concentrated period) and be inoculated against the effects of aging in an enduring way. In all likelihood, gains won’t last long after you stop the challenge.

On a related note, another thing that makes me sick is the pseudoscience apps I regularly see in the top Health and Fitness category these days, including “Hypnotic Gastric Band” and the endless apps that promise to reduce your stress and anxiety. In some ways, these are no worse than the self-help books that have been with us forever; but I think the veneer of science and professionalism lent by the App Store and by the whole ‘quantified self’ industry is encouraging people to believe in effects that have not been proven to exist. More on this another time.

Quite apart from the fact that even a big TV can’t replicate the ultra-widescreen experience required to properly appreciate 2001, I think that most normal people – myself included – are incapable of paying sufficient attention to the movie unless forced to do so in a dark cinema. It’s not just that I’d want to check my phone during some of the slower bits (which, to be fair, is most of the movie); it’s that it’d be near-impossible to avoid interruptions like noise from outside, or phones ringing, or people coming and going, and so on. So, see it at the cinema. Also, live in the UK, because if you don’t, you’re out of luck.

2001 is one of the two movies that I rewatch every year or two. Specifically, the flight to the space station, and then to the Moon:

(didn’t I tell you not to watch this at home?)

What’s the other movie? Master and Commander: The Far Side of the World. Here’s the opening sequence:

It’s beautiful, and slow. The movie features few battles other than those against the weather. Like 2001, there is much “competence porn” wherein smart and experienced people concoct clever plans. Like 2001, it is a journey into the unknown, on board a state-of-the-art vessel with serious technical problems.

They’re both – mostly – contemplative movies punctuated by moments of sheer terror, providing an enjoyable mix of ASMR-like relaxation with adrenaline that keeps me awake. And once I’ve finished, I feel like I’ve grappled with weighty questions that concern the future of humanity. What more could you want?