Thursday, 14 February 2013

So, there's this big discussion going on in my hobby group at the moment, and the main conversation has been happening on Facebook - mainly because that's where I was first asked to set up a discussion, and it took all of five minutes to get the page up and running.

However - now that discussions are progressing, there are a number of people *outside* the facebookiverse who have raised quite reasonable objections to the discussion happening there. Not everyone is on Facebook, not everyone *wants* to be on Facebook, and to be honest, a Facebook group kinda sucks for searching and archiving really important discussions.

Thus it has been requested that I copy all the posts and comments to Someplace Else, to make them available for more general consumption.

At first I balked at this request - 24 posts and 250 comments to be individually copied/pasted??? Who has that kind of time?

Of course, when I actually sat down to think about the problem seriously, it took far less time than I thought to solve it. So here's what I did, including the quick-and-dirty ruby script that will massage the Facebook feed into something resembling a readable format. It ain't pretty - but it'll pass for government work.

Step 1: get the feed from the API

I'll assume you are actually a member of the group you're after. You will need to be.

First you need to create an access token to get the data out of Facebook, using the Graph API Explorer tool on the Facebook developers site. This is essentially the same as one of those "allow application to access my data" steps you click through when you add a new app. In this case, you're allowing the Graph API Explorer to access *your* group data - to prove that you have access to the feed of the group, and to allow it to fetch out all the posts for you.

Click the "Get access token" button

Select the "user_groups" checkbox

Click the new "Get access token" button

Follow any prompts (if this is the first time you've used this API, you'll get the "allow access for this application" process - but it may not happen on subsequent attempts)

You should now have a long encoded token in the box at the top of the page.

Next up we need to tell it which feed to use. There's a drop-down labelled "GET" which you should leave as-is. In the text box next to it, type the ID of the group in a URL format like this: /1234567890?fields=feed and then click "Submit". The "fields=feed" part tells the API to actually go and fetch the feed of posts and comments.

At this stage, you should be able to see a huge hash full of posts and comments in the box on the right-hand side of the screen. Copy and paste that into a file.
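Incidentally, if the copy/paste step feels fiddly, the same request can be made straight from Ruby. Here's a minimal sketch - the group ID and token are placeholders you'd substitute with your own, and it just saves the raw response for step 2:

```ruby
require 'net/http'
require 'uri'

# Build the Graph API URL for a group's feed.
# The group ID and access token are placeholders -- paste in your own.
def feed_uri(group_id, token)
  uri = URI("https://graph.facebook.com/#{group_id}")
  uri.query = URI.encode_www_form(fields: "feed", access_token: token)
  uri
end

# Fetch the feed and save it to feed.json, ready for step 2.
if __FILE__ == $0 && ARGV.length >= 2
  response = Net::HTTP.get_response(feed_uri(ARGV[0], ARGV[1]))
  File.write("feed.json", response.body)
  puts "Saved #{response.body.bytesize} bytes to feed.json"
end
```

Run it as something like `ruby fetch_feed.rb 1234567890 YOUR_ACCESS_TOKEN` (both arguments hypothetical), and you end up with the same hash in a file without touching the clipboard.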

Step 2: massage it into shape

Now you've got your feed data, you just need to play with it and spit it out in a nicer format. In our case, I decided to go for a rough html format that shows what the posts were, what comments were attached, and who said what. My script is posted below, and can serve as a starting point for whatever you'd like to see done.

This script accepts an input filename and an optional output filename (or it just jams '.html' on the end of the input filename). It'll generate a really rough-and-ready html file containing the posts and comments (with names and datetimes), plus some of the links (if present).
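For the flavour of it, here's a minimal sketch of such a script. It assumes the JSON shape the Graph API Explorer returned at the time - a top-level "feed" key whose "data" array holds posts, each post carrying "from"/"name", "message", "created_time", an optional "link", and nested "comments" - so the field names may need adjusting if the API has changed:

```ruby
#!/usr/bin/env ruby
require 'json'
require 'cgi'

# Render one post (plus its comments) as a rough html fragment.
# Field names ("from", "message", "created_time", "link", "comments")
# follow the Graph API feed structure described above.
def post_html(post)
  html = "<div class='post'>\n"
  html << "<h2>#{CGI.escapeHTML(post.dig('from', 'name').to_s)} " \
          "<small>#{post['created_time']}</small></h2>\n"
  html << "<p>#{CGI.escapeHTML(post['message'].to_s)}</p>\n"
  html << "<p><a href='#{post['link']}'>#{post['link']}</a></p>\n" if post['link']
  comments = post.dig('comments', 'data') || []
  comments.each do |c|
    html << "<blockquote><b>#{CGI.escapeHTML(c.dig('from', 'name').to_s)}</b> " \
            "<small>#{c['created_time']}</small><br/>" \
            "#{CGI.escapeHTML(c['message'].to_s)}</blockquote>\n"
  end
  html << "</div>\n"
end

# Turn the whole saved feed into one html page.
def feed_to_html(json_text)
  feed  = JSON.parse(json_text)
  posts = feed.dig('feed', 'data') || feed['data'] || []
  "<html><body>\n" + posts.map { |p| post_html(p) }.join("<hr/>\n") + "</body></html>\n"
end

if __FILE__ == $0 && !ARGV.empty?
  infile  = ARGV[0]
  outfile = ARGV[1] || infile + '.html'
  File.write(outfile, feed_to_html(File.read(infile)))
  puts "Wrote #{outfile}"
end
```

The only real subtlety is remembering to escape the message text - people paste angle brackets into Facebook comments more often than you'd think.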
