I often have the problem of needing to convert a folder full of video files to a format compatible with my Freebox, without spending ages in front of my machine doing each video individually. This often happens when I have several takes from my mobile phone or camcorder as mp4 files. Here’s how I worked out how to do it. These instructions apply to Windows 7, and should also work on other versions of Windows.

I have tailored these instructions for a generic encode that will work on Freebox (ADSL triple-play box in France) but since the resulting files stick to standards they should work on most set-top boxes and portable players, including phones and tablets except iPhone/iPad, which read mp4 natively anyway… The typical conversion required is from mp4 to mkv or avi; I prefer mkv for the Freebox as it has worked in more cases for me.

I’m assuming the video codec doesn’t need changing. Most video I encounter is h264 or XviD which are Freebox compatible. This makes converting much faster, because re-encoding video takes ages. I do force audio encoding though, because I’ve had problems with AAC audio tracks (which the Freebox v5 doesn’t like very much at all) or raw PCM and other audio formats that don’t play nice.

Be careful with some Android phone video as it can have dynamic framerates. See advanced notes on how to change codec or force a framerate on your video which will help keep audio and video in sync. If you need to edit after you’ve converted your takes, you’ll be better off with a constant framerate.

Tools required:

AviDemux – http://avidemux.sourceforge.net/ a capable video editor / converter. Open Source and free. This also works on Mac and Linux, and the simple batch file below is adaptable for bash shells which would cover Mac and Linux. Work it out for yourself.

Files required:

1. The script file for AviDemux. This is a simple file used to provide instructions to AviDemux to save the mp4 video file you have as an mkv, and re-encode audio to Lame MP3 @ 128kbps. It provides a generic display width of 1280 pixels, but this may not be used if the source video is different (we’re not re-encoding video here).

Set line 1 of the batch file (test.bat in the download) to point to the path where your avidemux.exe resides.

Copy both files into an empty folder.

Copy your .mp4 files to convert into that folder too.

Double-click test.bat and the script will take each mp4 file one-by-one and send it to AviDemux to encode/mux it for you. AviDemux will launch a window each time, but you should not need to interact with it at all. Once it has finished, you will have a bunch of .mkv files alongside your mp4 files.
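If batch files aren’t your thing, here’s a rough Python sketch of what the batch loop is doing. The AviDemux path, the script filename and the command-line flags here are assumptions based on AviDemux 2.5’s CLI; check them against your own install before relying on this:

```python
import subprocess
from pathlib import Path

# Assumed paths -- adjust to where your avidemux.exe and script file live.
AVIDEMUX = r"C:\Program Files\Avidemux 2.5\avidemux.exe"
SCRIPT = "convert-to-mkv.js"  # hypothetical name for the AviDemux script file

def build_command(source: Path) -> list[str]:
    """Build one AviDemux command line: load the mp4, run the script, save as mkv."""
    target = source.with_suffix(".mkv")
    return [AVIDEMUX, "--load", str(source), "--run", SCRIPT,
            "--save", str(target), "--quit"]

def convert_folder(folder: Path, dry_run: bool = True) -> list[list[str]]:
    """Queue every .mp4 in the folder; actually run the conversions if dry_run=False."""
    commands = [build_command(f) for f in sorted(folder.glob("*.mp4"))]
    if not dry_run:
        for cmd in commands:
            subprocess.run(cmd, check=True)
    return commands
```

As with the batch file, changing `"*.mp4"` to `"*.avi"` (or any other extension) changes which files get picked up.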

Advanced notes

You can do all sorts of encodings with this method. The batch file can be changed to look for avi, mkv or other video file extensions. Just change (*.mp4) to (*.avi) or whatever.

If you want to re-encode video, you can create a specific conversion file from AviDemux by loading a single video to convert, setting up your parameters, then using File > Project Script > Save As Project to create a scripting file. Then edit that file to remove specific lines (like the filename and segment lengths) but to keep the video and audio codec information, and reference that new file on line 3 of the batch file.

The great thing about physical objects is wear and tear. The places you touch most wear down, an ideal indicator for the key functions in your remote interface. This type of observation has even led to abstract art such as the traces left by different iPad apps. You can tell a lot about how something is used by investigating these physical traces. Doing the research to collect that data can be fun, using heat maps, click tracking or even a screen cover and paint. The results of that research can be visually persuasive too. It’s pretty obvious from the photo on the left that play and fast-forward are the most used buttons. ON DEMAND is bigger, screaming at you to spend money on content… why isn’t the really useful button (play) in that simple central place? Did the big “on demand” button increase rentals?

“Pick up any modern TV remote and you’ll immediately see the problem of experience rot. On/Off, volume and channel selectors are no longer enough. We need to switch devices, control captions, have a text capability for on-screen editing, a thumbs-up and thumbs-down for ratings, pause, record, slow motion, rewind, 30-second rewind” [from Jared M. Spool’s Experience Rot]

All too often, interfaces do not have enough affordance in their design for what we really need to use. The best web applications use progressive disclosure. Google UX principles (no longer on their corporate site) stated their applications would “provide a natural growth path for those who are interested”. With a remote control unit, it’s a bit difficult to allow an interface to evolve. It does however make great sense to provide most used buttons (analogous to website features) in the best place, ergonomically easy to reach when holding the unit comfortably in your hand. Many remote control units get this wrong. Websites too.

With the increasing ubiquity of mobile phones (and multi-touch screen technology in general), interface affordance can become interactive. Why not hide away all the crap you don’t need more than once or twice a year, and leave just the most used controls? Freemote, an Android app for French Freebox ADSL triple-play boxes, does just that.

In an increasingly touch and gesture based world, we could swipe, slide and select to move onto a screen that includes extra options we need occasionally, just like the freemote app allows you to do. Intelligent remote control units can improve our lives by making the most used features quick and easy to find. Nest have developed a learning thermostat which can even save you money. Any interface can save you time, if the design strategy keeps focus on keeping it simple.

Where it all goes wrong is when, because it’s “the web”, you keep iterating on new releases and keep adding features to interfaces ad infinitum. Sooner or later there may be a push to do a big new feature release or a rebranding / graphical redesign. How often is there a big push to prune core features or simplify the incumbent interface?

Designers and developers alike dream of rebuilding from the ground up. Sadly, that’s what nascent competitors do. Within an organisation the best you can do is state the principles you wish to work to, and then live them. Every day. Fight for simplicity, restructure the interface at every opportunity, and make sure that when you benchmark the best examples you see every day, you show the examples to as many people as you can.

Mobile design for small screens is a great way to imagine a useful set of things to click (buttons, form elements, product options) that form a coherent screen. Use the mobile revolution to improve the way your website works too. Indeed, “mobile” is becoming semantically redundant: people are often on mobile devices while sitting in their bedroom, watching TV and texting friends.

“A huge chunk of our users don’t even have a personal computer in the first place, so why are we removing content and functionality to which they’ll then never have access?” [From SuAnne Hall’s Three Reasons We’ve Outgrown Mobile Context]

So why not take a look at your website click stats and heat maps? Look where your wear and tear is happening. Maybe you could get rid of some of the distractions, improving the usability of your website or application, which will make it friendlier. The better you do this, the easier it might then be to transition from a PC/screen/keyboard/mouse paradigm to a universal application for your readers or customers. Everybody can gain from keeping the lesser used stuff out of the way. They’ll look for it if they need it. If they’re not aware it’s there, there are plenty of tools – including marketing, in-app tutorials, gamification – that can help feature discovery. Seldom used features shouldn’t be pushed too hard for the sake of it either. Users who engage enough with your brand or product will spend time, occasionally, checking out option screens or creating / enriching their account to make their on site / in app experience more fluid. They won’t waste time discovering all your glorious complexity if their initial experience isn’t almost instantly useful or pleasurable.

I believe we are entering into a new age of minimalism driven by the requirement to reduce noise as much as possible, because the digital noise the new generation are wading through means your signal is going to be harder to detect. Matt Gemmell, writing about simplifying blogging, said

“I bet you could simplify your blog in some way without detracting from the reading experience.”

I have simplified that further. You could even say that simplification, rather than detracting from the original experience, actually improves it.

I’ve just got back from the international interactive design festival, or WIF (from the original Webdesign International Festival). It’s a meeting of designers linked to the web and interaction (HCI) from around the world who come to talk about the latest trends and ideas.

I was on the jury for the design competition which gave student and corporate teams 24 hours to design a concept and create an interactive prototype. This year the subject was “the school of the future”. The entries were fascinating and most were impressive, especially given the limited time available.

Unusually for events like this, it takes place in Limoges, a town in the centre of France which is better known for porcelain and quiet country life than for international events. This gives it a different feel from the London and Paris roadshow corporate events, though it limits somewhat the audiences. Overall the WIF was a great experience and I hope to be able to attend next time it’s organised in an active capacity.

While the teams were busy sweating, I had some time to attend the many workshops and conferences (talks, really) and meet some very interesting design professionals. I thought I might share my notes, including my poor sketches which are nothing like the wonderful Sketch Notes by Eva Lotta Lamm.

Bad Science by Ben Goldacre is a fascinating read. I’ve intermittently followed his Bad Science column on the Guardian and it’s often thought provoking and uplifting too. I’m thus no stranger to the placebo effect, the inflated claims of homeopathy, and the commercial interests of all these companies selling diet plans and nutrition guidance. This book distils all that and more into one overview, with a level of scientific detail and rigour which should leave you without doubt that you must be prudent about what you read in the papers, especially when related to carcinogens and/or wonder pills.

I recently bought the book for my Kindle. Pictured is the book on my phone (with Kindle app), happily synchronised with my Kindle which stays on my bedside table. It’s fantastic to be travelling home on the train and to simply pick up where I left off the night before, on a different device. Bad Science makes for fascinating – and frightening – reading. Sometimes there is complexity in the discourse, but this appears necessary to expose the fluff and pseudo-science which at face value seems reasonable or seems to prove the efficacy of a proposed wonder pill. This complexity is thankfully rare for the less scientifically inclined and certainly doesn’t get in the way of perfectly readable and understandable prose for the most part.

“For sheer savagery, the illusion-destroying, joyous attack on the self-regarding, know-nothing orthodoxies of the modern middle classes, Bad Science can not be beaten. You’ll laugh your head off, then throw all those expensive health foods in the bin.” Trevor Philips, Observer

The guiding principle which Bad Science seeks to expose is bamboozlement. Hiding data, over-emphasising positive results, and other quackery doesn’t stop people from believing in products. The worst of it is that this belief heightens the placebo effect and does nothing to vitiate the claims of alternative medicine. They’re not totally wrong, therefore, to claim that their products have usefulness. Once you read this book the tables may turn though, as you will realise that trials that are not blinded (where testers and participants alike are not aware of what they are actually giving/getting – real medication or placebo) are meaningless. You may as well just believe that something you like will make you better (a Wine Gum a day, a glass of wine a day) and eat your greens, stop smoking and do a bit of exercise of course.

This bamboozling is summarised by one of the most highlighted passages in the book (eBooks certainly have this advantage over their dead tree siblings):

This process of professionalising the obvious fosters a sense of mystery around science, and health advice, which is unnecessary and destructive. More than anything, more than the unnecessary ownership of the obvious, it is disempowering.

Don’t be blinded by science or misled that obvious things, said in veiled (and often fake) scientific terms, are suddenly new and above your head. Read this book. Feel better.

WordPress is a fantastic platform, with an excellent plugin mechanism and the most usable admin interface I have seen. I know and have used several others including Joomla, Zope, Drupal, and old stuff you may not have heard of. The problem with being popular though is that you are likely to be a victim of more attacks. There’s a strange pharmaceutical spam attack out there, and it got me too. I first found out about it when Google emailed me with a possible hacking notice. Links like /valium-high were appearing in the Google results for this site, yet when I tried the links they were giving me a 404 (page does not exist) result. The sneaky thing is that the hack is cloaked: the link /valium-high did in fact work, but only if accessed via a search engine spider (or search bot / Googlebot). So Google sees a strange page selling valium, whereas regular visitors see a boring “page not found”. Spammers use these techniques to help their own strange pages rank in Google.

Using “Fetch as Googlebot” in Google webmaster tools allowed me to confirm the cloaking issue. To clean the hack, and simulate a search crawler without resorting to publishing tests live to my domain, I used my own server and tested using a search engine crawler simulator on a custom subdomain.
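If you suspect the same kind of cloaking, one quick first check you can script yourself is to fetch the suspect URL twice – once with a browser-style User-Agent and once with Googlebot’s – and compare the responses. This is only a sketch (the URLs and the fake fetcher are illustrative, and note that some cloaks check the crawler’s IP address rather than the User-Agent, so a clean result here isn’t proof of innocence):

```python
import urllib.request

# The Googlebot string is the well-known crawler User-Agent;
# the browser one is just a plain desktop-style UA.
GOOGLEBOT_UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
BROWSER_UA = "Mozilla/5.0 (Windows NT 6.1) Gecko/20100101 Firefox/10.0"

def fetch(url: str, user_agent: str) -> bytes:
    """Fetch a URL while presenting the given User-Agent header."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    with urllib.request.urlopen(req) as resp:
        return resp.read()

def looks_cloaked(url: str, fetcher=fetch) -> bool:
    """True if the page served to 'Googlebot' differs from the normal page."""
    return fetcher(url, GOOGLEBOT_UA) != fetcher(url, BROWSER_UA)
```

The `fetcher` parameter is there so you can swap in your own download function (or a stub for testing) instead of hitting the live site.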

After a lot of searching, including various scripts like lookforbadguys and advice on checking the database I still couldn’t find the bad code. I gave up forensics and just reinstalled a clean version of WordPress (often the best recourse if you can’t find the hack quickly). It then took me a while to get a few other files I needed (my theme, images, custom scripts) from the old install and make sure they were working correctly.

Since I was making updates, I finally brought this WordPress site up to date with a few changes to CSS to take full advantage of screen real estate. This humble template was less than 800 pixels wide. I am now using a 960 pixel grid which is a de facto standard on the web given larger screen resolutions. I hope you find it a little easier to read.

Is anyone else concerned that the Internet is becoming a walled garden on Facebook, encouraging people never to leave the Facebook site? People are more likely to read the Guardian now it’s a Facebook app. No doubt this is due to having to install the app to read content “read” by others – frictionless sharing as they call it. It means a lot more traction gained for Facebook, and a less neutral web experience.

Net neutrality is already wishful thinking, now that Google & Facebook dominate so much – do you even have a separate Instant Messaging / email app outside of Outlook at work? Are you aware that most of what you listen to and read will be shared automatically with your friends?

‎”As well as increasing traffic, the app is making our journalism visible to new audiences. Over half of the app’s users are 24 and under – traditionally a very hard-to-reach demographic for news organisations. The Facebook app is one of a number of successful launches by the Guardian in recent months as our ‘digital first’ strategy gains momentum. We’re delighted with the results.”
– Andrew Miller, chief executive officer of Guardian Media Group

I must be an old grumpy git, since being on Facebook is frighteningly efficient at appealing to the younger demographic. I do get nostalgic about plain-text email with properly nested quoting wrapping at 74 characters, web pages that are visible anywhere on any device, and music that comes from analogue encoding on physical objects. Will appealing to the younger net users without embedding your content on Facebook be possible soon?

Five years ago, I published an article for our fifth wedding anniversary. So if I have got my head on straight, that makes it our tin – 10 years – anniversary today.

How time flies. When we first got married our wedding site had a guestbook I cooked up in PHP. Five years on, a blog post was where a few friends gave their comments. Ten years on, and it’s Facebook where all the reactions have come from. So from DIY PHP/MySQL to WordPress (also PHP/MySQL) to Facebook (PHP too) things keep on changing.

Here’s to ten more! No doubt the next anniversary post will happen somewhere else entirely. Any predictions?

Every time a major site with a big audience changes, there are always going to be detractors. Especially a site like Facebook. People spend a lot of time there, so interface changes are almost tantamount to moving stuff around in their lounge/den.

I think there are a number of issues with the new Facebook homepage. I’ve seen it before. It’s called feature creep. Lots of stuff all clamouring for your attention. Chat, realtime updates, top stories, the rest of the news, adverts, suggestions for friends, app updates, messages (FB-ized email) and notifications.

Clever use of AJAX saves FB a heavy, slow loading user experience. Unfortunately it also allows stuff to get very busy. Progressive loading is an interesting technique and FB has evangelised it well. They’ve added a new design pattern I’m less of a fan of into the mix, inspired no doubt by a pattern I’ve seen on mobile terminals: revealing scrollbars on mouse over.

For some time you have been able to set overflow:auto on <div> elements so that scrollbars – regular, OS managed scrollbars that look different depending on whether you’re on a Mac or a PC – appear if the content goes outside the bounding box as defined for the div. FB are presenting to the masses a funky new way of doing it. A grey rounded scrollbar, as seen on your iPhone / Android terminal when you touch a screen full of text that can scroll, appears when you hover your mouse around each of the blocks of content on the right hand side (and in some other cases too).

I have a big beef with this, because scrollers are now everywhere on the page. You still have your regular scrollbar if you’re still using a desk/lap-top machine to access the site. Very close to it you now have other scrollable elements that don’t follow the same rules. They don’t work the same as OS scrollbars. If like me you often scroll with the keyboard once the area has focus, they’re a PITA. The target for scrolling is not very wide. The screen looks a mess if you leave all the different boxes scrolled at different points. It’s not intuitive to know which zones will really scroll if the text inside them aligns perfectly with the edge of the scrollable zone. So you have to mouse over them, which isn’t good for an addict of the PgUp, PgDn, Ctrl, Shift and arrow keys like me. I only click to give a zone focus or to position the cursor far from where I am currently.

Another bad karma effect for me: I was assaulted with little bubbles and tutorial messages when the version change happened. Not a discreet “learn about what’s new” that I could easily dismiss, but (IIRC) something like 3 or 4 different notifications all around the screen which meant I had to dismiss them all before getting back to my usual FB timewasting / networking activity.

There is a personality type that resists change, and with group effects in play this gets amplified. I’ve already seen groups campaigning to get the old back. I couldn’t care less about that; by all means go and change and organically improve. Just be careful about the overriding experience, because a site as popular as FB has a real responsibility to keep design patterns sound, so that people don’t start getting used to bad practices. FB may have more reasons than most to cram stuff into a central page, but what they’re doing is making a one-page experience as they remove more and more reasons to leave the main page. You can now comment on people’s walls straight from the home, and read comments not currently on screen with new fly-outs from the right hand column. I can imagine other sites doing the same, and how difficult they’d be to navigate.

And of course I can hear them now in boardrooms around the world. “Why don’t we just make that a scrolly box, and stick like four of them together in the right column…”. As if somehow the page no longer scrolling has solved the old page fold debate, and instead lots of individual blocks will be scrollable. Or not.

On holiday this summer in the Vendée region (near the Loire valley), I was pleasantly surprised by my till receipt for my holiday shopping. Instead of a list in simple order of items scanned by the cashier, the receipt was both grouped by department, and ordered by highest priced item first. At a glance, you can see which items from each department are the most expensive, and which departments you bought the most goods from.

In the past, till receipts were printed line by line first mechanically – possibly with mechanical tabulation (addition of next item to subtotal) inside the machine – then by fairly dumb electronic calculators which would do much the same. More recently, bar code scanning meant the machines queried a database for the item price. Later, the item name would be queried and printed (initially a few characters per item), and yet the basic running totals and chronological ordering have yet to change in many supermarkets and other stores where you buy a lot of items.

Behind the scenes, no doubt accounting has been done by department for some time. It’s a relatively small jump from a flat database lookup to allowing classification by groups of products. This, to my knowledge, is standard practice for any self-respecting supermarket manager / category manager. I’m fascinated to see just how long it has taken to expose these groups to the customer in a useful way. Buffering the data scanned and making a single printout at the end has surely long been within the technical capability of many point-of-sale devices, since back office equipment and even individual tills have been able to do it for some time. Fast printing of hundreds of lines of text has been possible for well over a decade.
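The transformation itself really is that small a jump. As a sketch (the basket, department names and prices below are invented for illustration), grouping a chronological scan log by department and sorting each group by price, highest first, is a few lines of code:

```python
from collections import defaultdict

# A scanned basket in chronological scan order: (item, department, price).
basket = [
    ("Milk 1L", "Dairy", 0.89),
    ("Shampoo", "Hygiene", 3.20),
    ("Butter 250g", "Dairy", 2.10),
    ("Camembert", "Dairy", 1.95),
    ("Toothpaste", "Hygiene", 1.40),
]

def group_receipt(items):
    """Group scanned items by department, each department sorted by price descending."""
    by_dept = defaultdict(list)
    for name, dept, price in items:
        by_dept[dept].append((name, price))
    return {dept: sorted(lines, key=lambda line: line[1], reverse=True)
            for dept, lines in sorted(by_dept.items())}
```

Cancelled scans simply never make it into the buffered list, which is why the final printout can stay free of ugly correction lines.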

I see several advantages to this approach. Customers get a clear receipt, even if errors are made while scanning or products are cancelled (these need not show on a final receipt, but will create ugly correction lines on “print a line after every item is scanned” receipts). If any goods are bought in multiples, they need not be scanned together at the same time and relevant multi-buy discounts can be neatly added in the same place. Most of all, you leave the store with a feeling that you might want to keep the receipt a bit longer and look at your purchases a bit more carefully. Perhaps even in store, you might notice that you have been billed twice for expensive items because of a scanning error. You may be on a tight budget, and readable receipts surely help money management at the end of the month. Above all, errors at tills are commonplace but it’s terribly difficult to spot them while stressing to pack up your shopping. Kudos to Super U (France) for this little innovation which I think is a real customer pleaser. Now if only the database people could put decent descriptions in for products, instead of “ENM PAST 32%MG U BIO 250G”.

I love the series of sketch notes from Eva-Lotta Lamm, someone who attends a lot of conferences and makes notes with amazing visual impact.

I could have chosen one of many different images that she has uploaded, but this one is recent, colourful and contains perhaps a few things that are less technical – though you probably need to work in a company with an active website to really “get” the overall message. I’d love to know if you get anything out of reading them if you’re completely outside of web marketing / user experience / web project management.