I’m doing documentation, which involves drawing lines, measuring things with the marquee, typing measurements. Sometimes I have the temerity to try and draw a marquee and oh god I’ve got the wrong tool. It’s a text layer! Disaster! Such a catastrophe! Photoshop must warn me! Twice! Every single time.

Like, whatever.

I know, small problems, right? Some idiots would call this a first world problem, as if nothing short of being shot at, tortured or starving to death can ever be complained about. Fuck it. This is annoying.

This is my thing, the thing that I’m into! I’m emotionally invested in this product or service so if there’s another product or service then I have to come up with reasons why it’s not as good or isn’t needed or lacks a thing I need or is ideologically doomed to failure.

But maybe I decide I like that new thing better and so make a big deal about how This Solves My Problems Better, and that since we were always at war with Eastasia it was inevitable I’d switch over when this prophets-foretold Better Thing came along. Of course, to maintain an appearance of balance and to head off any criticism that I’m frivolous or impulsive about things, I’ll say the Old Thing wasn’t bad, it was very good at the time, but things simply move on and they didn’t keep up with the game and they missed a trick and misjudged the market and That’s Just Business.

But it isn’t that. I’m emotionally involved in This Thing, and therefore anything else that exists is a competitor to it, and everything ever in the whole world is a zero-sum game and there can be Only One Winner. Only one. That is a This-Killer. This is a That-Killer. If your thing succeeds, my thing must fail, and that would be an attack on me, so I must attack you.

And this is why my interest in ‘technology’ (as a subject that gets written about) has pretty much died a death.

I read this by Andy Clarke, and then this by Luke Wroblewski, about the crap passwords people use on various things. It got me thinking about the discussions I’ve read and taken part in, where the core argument ends up something along the lines of “People choose crap passwords! They need to do better!” In other words, “People don’t understand the system! They need to learn how to use the system!”

It’s pretty obvious where I’m going with this, but bear with me, I need to vent. For passwords, we’re still in a blame-the-user mentality, seemingly because the whole thing of online identity is a non-trivial and long-standing problem. Passwords seem to be the only solution that works, so we’re stuck with them, and users need to, must, learn to adapt to the system, or they lose. They lose their data, their security, their money, the provability of their own identity. That’s a pretty high cost.

In UI design we minimise hurdles, barriers, blockers, and try to make things easy and pleasant. Problem is, passwords are dull. Coming up with a good one seems hard. So we present the process as a minor thing - OK, you have to enter it twice, and it’s starred out, but it’s on a par with entering your email address. It’s presented so that the important thing seems to be to make sure you typed it right, not to come up with something secure. Don’t scare the user, make it seem less important, let’s keep the mood cheerful, guys!

None of that means the other extreme is a good aim either. We all know the insane password requirements of this or that other site, the one that demands we type something that looks like the cat walked over the keyboard, and that doesn’t work either, because you’ll just write it down somewhere.

I whinged about this problem on twitter, and of course that XKCD cartoon came up, which presents a much better idea for passwords (should the site allow spaces!). I don’t have a problem with the password method, but with the idea that ‘we have trained users’ to come up with a certain kind of password. We haven’t. We’ve trained users to regard passwords as an irritation, an annoying test that needs to be cheated to get around, so users type ‘password1’ because that meets the requirements of an algorithm. They don’t see, because they don’t know, and may not actually care about, the importance of passwords. Passwords are annoying! We’ve trained users to cheat, not to come up with passwords of any sort.

And that’s because passwords aren’t the important thing. We’re spending time trying to fix the problems we’ve got with a bad solution to a problem, not the problem itself.

So I guess my point here is: stop bitching about users and their crap passwords, and start bitching about how we’ve not figured out a better solution than passwords yet. It’s really not a trivial problem, and I would never claim it is, but that’s not the users’ fault.

I read this post by Paul Boag. I empathise with his position. I can see his point, and know that he’s complaining about something that is genuinely unfair, and is wrong. But I think he’s complaining about the wrong people, and the wrong thing.

In claiming a religion as your own, you must be aware of what the religion stands for. For an organised religion with creed, and leaders who represent the faith, you must know how that religion presents itself to the world. Unless you’ve been hiding under a rock, you can hardly claim ignorance of the PR, in other words.

When other co-religionists, and more importantly, the leaders of your faith are loudly and consistently advertising their vile and disgusting beliefs as fundamental to the faith, then can you complain when people start believing them? When you then claim to be of that faith, should you be surprised if they judge you according to what they’ve heard of your faith?

Perhaps you should complain more about those misrepresenting your faith than those who judge you because of it.

I’ve just started using Lightroom, and in the process of pointing it at my photos I rediscovered all my old photoblog pictures, going back to 2003. I’d got the idea of doing a photoblog from Cathy (thank you!), who was using her camera phone (a Sharp G10i) to take pictures. The pictures it took were tiny, barely thumbnails by today’s standards, and it was an absolute arse to get them off the phone, but it was doable. So I went out and got the same kind of phone and got started. I kept the photoblog going, using a variety of cameras, until 2009.

Looking at the pictures got me thinking. To a greater or lesser degree, I could remember where I was, who I was with, what I was doing, and what I was thinking when I took each picture. These tiny little images still had the power of memory. I did have a camera that could take decent photos, I even had a reasonably-OK digital camera, but I don’t have many pictures taken with either. They also have memories attached, but they’re generally of things I remember anyway, of events or places important enough to take the ‘real’ camera to. The camera-phone shots are of day-to-day stuff, things I noticed while walking home, out with friends, at work, just little things that have interesting colours and textures, things that offer a nice composition, or things that are plain unusual.

From left: the first in the series (the Brighton Marina car park), beer cooling at post-Pride BBQ at John & Liz’s, a toasting fork I bought at the South of England Show, at work, a walk on East Brighton golf course (never found out what these were), and the first day of Civil Partnerships in the UK - I got married here 8 years later.

Thinking of all this, I realised this is exactly what I’m using Instagram for. It’s also why I find the bitching about Instagram so irritating. Yes, the pictures have filters applied, they are relatively low resolution, they aren’t ‘serious’ photography, but that’s the point, it’s all about enjoying taking pictures, making them look nice, and sharing them. If the best camera is the one you have with you, then the best photos are the ones you actually took. It doesn’t matter if your SLR could have taken the perfectly exposed 12 megapixel shot if you didn’t think of using it, but you might just take a photo of your lunch using Instagram, and years later you see that photo, remember that lunch, the people you were with, and it might just bring a smile to your face and lift your day. I have full backups of all the pictures I’ve taken with Instagram, and I’ve got an IFTTT action set up to keep the backup up to date, and eventually I’ll have the whole lot printed out. Even if the record is incomplete, i.e. without the comments and captions I’ve added to my own pictures, I know that each one will be a link to the memory of me taking it, and the person I was then.

There’s a thing going on about responsive images on websites: making sure the person looking at the page is delivered the right pixel and file size of image for their device and conditions. I’ve read around the subject and done some experiments, and to my mind we can’t fix it properly for a couple of reasons, but there are some interesting other things we can do instead. So, first off, the couple of reasons:

The first is that we can assume some things, but not all things. A small screen doesn’t necessarily mean low bandwidth, in the same way that using an iOS or Android device doesn’t actually mean you’re mobile (I use my iPad on the sofa, a mere 2 metres from the wifi hub). We can do clever things with guesswork, but there are few things more frustrating online than visiting a website and being given a degraded experience because of incorrect assumptions. I rage about such things. So to sum that all up, we need the browser, the OS and the network to cooperate and tell the page (either at the client or server end) what the situation is, i.e. is it a good network connection, is there much processing or memory capacity on the device, and what can the browser understand. Can the thing be downloaded (at all? quickly?), can it be loaded and can it be interpreted properly? We needed this years ago. Really.

The other reason is an extension of the first. Does it matter at the moment? This article puts it rather well. We can’t make it perfect, but giving the user something is better than nothing. We can to an extent trust that OSs and browsers can handle chunky and demanding pages, even if they don’t do it well. Of course we know the page shouldn’t be chunky or demanding unless you’ve got a damn good reason for it, but assuming you have that reason, best to deliver it and hope for the best rather than ACCESS DENIED (download our app!). Which isn’t friendly.

Anyway, I was thinking of all of this as I was updating my awesome new portfolio, which is awesome (hire me), I think you’ll agree. (Sorry). I wanted to include pictures of my projects that worked on large and small screens, but didn’t want to lose the vertical rhythm of the page (such as it is) or to have poky hard-to-make-out images at one scale and dirty great pixels at the other. I also needed something that would work with the diversity of projects I’ve worked on: illustration, IA, UI design, branding, HTML & CSS and so on. A simple screenshot wouldn’t work here.

I had of course read the hoo-hah about the picture tag vs. srcset (written up nicely by Jeremy Keith here), and it gave me some ideas. I know the picture element isn’t meant to be a container tag for lots of pictures you’d show all at once, but what if it was? The orthodox way of it is that there’s one image you show (say) when space is limited, another when there’s more space, yet another when it’s a high-res display, and so on, with only one of the set being shown at any one time. What if you could show all of them at once, in some form?

So that’s what I did. There’s an image tag which always displays, and I’ve made that image the one that fits on pretty much all screens. There’s a background image too, because there are a couple of container elements around the image - a div and a p. So I put a background on the div. Then, as the display gets larger, I add an image using :after to the p, inside a media query. Then, as the space increases, I put a background on the p. I could use :before as well for another image, and attach some to the container div in the same way, if I wanted.
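In case that’s hard to picture, here’s a rough sketch of the arrangement. The class names, image files and breakpoints are all made up for illustration; the real ones depend entirely on your layout:

```html
<!-- The img always displays; the div and p around it
     give us hooks for extra images at larger sizes. -->
<div class="project">
  <p class="shots">
    <img src="project-small.jpg" alt="Project overview">
  </p>
</div>

<style>
  /* A background on the div, behind everything else. */
  .project {
    background: url(project-bg.jpg) no-repeat;
  }

  /* More space: add an image via :after on the p.
     Browsers generally skip downloading images inside
     media queries that don't match, which is what makes
     this responsive rather than just hidden. */
  @media (min-width: 40em) {
    .shots:after {
      content: url(project-medium.jpg);
      display: block;
    }
  }

  /* More space still: a background on the p too. */
  @media (min-width: 64em) {
    .shots {
      background: url(project-large.jpg) no-repeat right top;
    }
  }
</style>
```

The same trick extends exactly as described above: :before on the p, and :before/:after on the div, give you up to three more slots for the collage.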

So it’s a bit of a faff to arrange, but the results are exactly what I was after: a collage of images to give an overview of the project, that responds to the viewport and doesn’t download images it isn’t going to show. In other words, a responsive picture. Ithangyou.

Facebook bought Instagram. For a lot of money. We know this. Some people seem to think everyone’s reaction is about the money, but I’m not sure it is. I know mine isn’t. I really enjoy using Instagram, and before yesterday I enjoyed it rather more because my husband and more of my friends were using it. Today it’s still fun, but it’s missing these people, so it’s not as fun, and that’s not because Instagram was bought, it’s because Facebook bought it.

I don’t like Facebook. A lot of people don’t like Facebook. Millions do like it. Millions love it. Fine, they can do their thing on Facebook. I was doing my thing on Twitter and Instagram (and Flickr, and my own sites…) and now something I was using is part of that evil empire, that thing that has a known history of not being honourable or ‘playing nice’ with its users’ data. So yes, I’m upset about it, but this isn’t actually one of those oft-assumed entitlement rants; Instagram can sell itself to whoever it fucking chooses and I fully accept I have not the slightest say in that, but I do have my own opinion, and I will voice it. Making your opinion known is not an automatic demand that something be stopped, or something be done, or to be consulted on anything that happens, or even to be listened to. To these people who insist on pointing out that this is how startups work, or that the internet wasn’t made for my personal use, or that offered a billion dollars I might consider things I normally wouldn’t, I say do please piss off and tell someone who doesn’t know this already, which is fucking no-one. But then, you don’t have to do what I say. You’re just expressing your opinion too.

A few years ago I was a bit of a gamer. I was a ‘PC enthusiast’ too. I built many PCs, played with many operating systems, deciphered many impenetrable interfaces and networked several flats. It seemed like a kind of fun at the time, in a way, until it just… didn’t, anymore. The same shitty interfaces, the same having to learn loads of stuff about something you don’t want to know to fix what shouldn’t be broken; the benefits of the technical win didn’t seem to be enough of a compensation for all that effort. It’s not like it was bleeding edge stuff – basic networking? Come on. I had other things to do. If I’m going to solve technical problems I want them to be interesting, not ones already solved and on the shelf of Dixons for £9.99 including VAT.

Since then, I realise I’ve developed a bit of an aversion to tech specs. “Does it do the job? OK then.” That kind of thing. I still play games, though mostly on the iPad or the PS3. The games work, work well, and I don’t need to think about whether they’ll work on my device before trying them.

So, having said all that, I was looking at the Mac App Store this morning. I saw two games I recognised, Arkham Asylum and Duke Nukem Forever. I had a look just to see what the screenshots were like, and noticed on the descriptions of both games a set of ‘important requirements’. Actual system requirements. Both are below.

Lists of numbers. Reference to ‘video cards’. Integrated video chipsets. ATI X1xxx series. I only know what these mean from fairly long experience. It should be obvious just what bullshit jargon terms they are to dump on someone who just wants to play a game. Without support, they’d have to find out what a video card is, or a video chipset. Volumes formatted as what? What volume? The space inside my computer? Do I have to move bits around? Searching for X1xxx will never return anything particularly helpful either.

But what do you mean it’s all in “About This Mac, then click More Info… then System Report and, oh my!” It’s all utter, utter, bullshit. It’s a contemptuous arrogant put down of the potential buyer – you don’t have the technical knowledge to play a game. Go away and write your shopping lists, little person. Why should technical knowledge of the workings of a computer be a requirement for gaming? Strategic and tactical thinking, yes; problem solving, yes; all sorts of things, yes; knowledge of video chipsets, no. Solving the problems of finding this stuff out is not a game, it is not fun.

But it’s not really the game studios’ fault. They’re often doing something with the game that does require specific hardware. Like it or not, for a lot of games it’s just not worth making it run on everything. What’s needed is some clear bullshit-free way of letting the buyer know whether this game will work on this computer. There was some thing talked about by Microsoft (and others) a few years ago of having a ‘standard’ spec for computers at specific times, so you’d get the “2008 High End” computer and the “2007 Mid Range” computer and so on. Game boxes (for this was before digital download was big) would say what they worked on, and you’d of course know what your machine was (so the magical thinking went), and would buy accordingly. Of course it came to nothing. The strength (and limitation) of the PC gaming market is the flexibility and variety of PCs, and besides you’d just be replacing one lot of jargon with another, albeit friendlier set.

But wait, I’m on a Mac. I’m looking at these games using the Mac App Store, an app that’s running on my computer. It knows what the spec of my machine is. Why should I need to know what the lists of numbers are? That’s the computer’s job!

The Mac App Store can (but doesn’t) tell me if my machine is capable of running the game well. There are loads of ways to do it that would be friendly and informative and yet be face-saving and PR-fluffy enough for both Apple and the game studios – from a “This game was designed for a bigger/better/faster/more computer than yours” to just not featuring games that won’t run.

Yes, it’s more work for Apple to maintain the App Store, a little harder to publish ‘featured app’ lists, but surely that’s better than just dumping it on the poor sod who just wants to play a game?

For a while now I’ve been thinking I must ‘do something’ about my reliance on Google stuff. I use Gmail, Google Apps, Google Search, Google Analytics, Google Reader and so on. Writing it down like that, it seems ridiculous and frankly insane to trust one company (and one set of sign in details) with so much of my online life. And now there’s a Gmail app.

Weirdly it’s this that might push me over the edge and into real Google alternatives, and it comes down to how I’ve ended up splitting my computing and online life. I use my iPad a lot. In many ways it’s my main machine (if such a thing makes any sense these days) – I use it for everything I do when I’m not working. It gives me a psychological separation of the computers in my life – I like the different experiences on the different platforms and, to sound a bit pretentious for a moment, use them to moderate my thinking and actions. If there’s something that’s not convenient (or possible) on the iPad then I have to ask myself whether it’s something I should be trying to do at that point, or whether I should just set a reminder to myself for when I’m next ‘at work’ at the laptop. I think it’s a fairly good system. It works for me.

The biggest difference in my ‘usual stuff’ of things to do has been with email. I use the native Apple Mail app on iOS, and the Gmail web interface on the laptop. There’s a difference in tone and function between the two things – it’s not that either is better or worse for work or social email, it’s just that I maintain that separation. I’m not dogmatic about it, I have to be practical, and I use the Gmail web interface on the iPad from time to time, but it’s not a polished experience, it doesn’t feel quite right, and so it reminds me of my self-imposed separation, so I don’t do too much ‘in the wrong place’.

And so, back to the point. With a Gmail app on my iOS device (how could I not try it?) then maybe it’d get too easy. And that makes me think how much of the rest of my life is on Google’s servers. And maybe I shouldn’t do that. I think I need something based somewhere else, preferably in a country with decent data protection laws. I don’t want to be a systems admin, but I’d like some control over things, maybe a managed server somewhere with email and calendaring on it. I know I won’t want to swap it for another megacorp’s thing (like Yahoo, Hotmail or iCloud) but I dunno, what are your alternatives to Google’s hegemony? Answers on a tweet, gratefully received.

I can understand why you’d want a nice generic tag to encode various numbers and whatever, but I don’t see the point in getting rid of a really useful and unambiguous tag like time. The arguments for replacing it seem to be about fairly specific use cases, dimensionless numbers, ranges of numbers, temperatures and so on. Fine, I can see that. It might be useful. What I object to is this sense of a ‘grand scheme’ creeping back in for HTML to become more like XML - a nice neat encoding of all the world’s data so that it can be easily indexed and rationalised by big systems. There’s much talk in that thread of microformats and machine indexing as if that was a major goal in making websites. I dunno, I figured it was about letting people read stuff.
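And that unambiguity is exactly why time earns its keep. A made-up example (the date and wording are mine, not from any spec):

```html
<!-- Humans read the visible text; machines (browsers,
     feed readers, indexers) read the datetime attribute. -->
<p>Posted on <time datetime="2011-11-02">the 2nd of November</time>.</p>
```

The visible text can be as scruffy and human as you like, because the machine-readable version lives in the attribute. That’s the cowpath already paved.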

What I liked about the development of HTML5 was the idea of ‘paving the cowpaths’ - looking at how people use HTML and either making that ‘official’ or adding stuff to make it easier. What we ended up with was a reasonably sensible set of tools that were nice and generic, salted with ones for a few specific use cases, like dates and times and form inputs or whatever. Yes, you’d be putting the tags in for a machine to understand them, but that machine would be a browser, something to present what you’d done - the need for marking up your content was obvious and understandable.

I liked the balance between neat and scruffy that HTML5 represented but this idea seems to be going too far back to the neat end, it smacks of XHTML2. You neats, you’ve got your XML, go and play with that, and let us be scruffy with our HTML.