As applications open for the 2015 round of OpenNews fellowships, I reflect on my time in the first round. The process was very different. We weren’t OpenNews then; we were just Knight-Mozilla fellows. We had no idea what we were getting ourselves into. But we made friends, immediately.

Looking back, what surprises me is how quickly we became comfortable with each other. Conversation, conferences and visits were so easy. I miss them and I miss the working dynamic, if you could call what we did when we got together “work”. It was so much fun.

I also recall meeting one of my best friends during my fellowship at The Guardian. She is a programmer to the core and a role model of mine.

I have met some of the subsequent fellows and they are of the same oddball ilk. People you want in the community. People who make the community, give it the right flavour.

I was at a round-table discussion recently held by Undercurrent about building diversity in the workforce. One thing that stuck with me was the power of switching one little word to change the ethos of a company. When hiring, instead of asking yourself whether the candidate will be a cultural fit, you should ask whether this candidate will be a cultural add.

This, I believe, is the beauty of the OpenNews programme. It does not look to fit fellows to a news partner. It looks to add to it in ways those partners do not yet realise they can gain from.

Similarly, what the fellows get from the fellowship is not an easy fit but a brave addition to anyone’s life journey.

A quick word about what I’ve been up to: after finishing my fellowship at The Guardian I am now a Data Journalist at The Times & Sunday Times, working full time scraping, parsing and analysing data as part of a data-driven investigative unit. We’ve had front-page stories in both titles.

Having spent two days in a theatre-cum-town-hall-cum-hack-space with 150 developers, journalists, designers and students, here are three thoughts regarding hacking, developing and the future of news.

Play to win or play to learn

Whether it is a one-day hack day or a two-, three- or four-day hackathon, there is never enough time to do something truly innovative. Don’t get me wrong, I love hack days, but this is a sad fact that is better known beforehand. Hack days are meant to be shallow. They are about mashups and remixing. Organisers want something simple with a clear output which they can easily implement, so they can say “look, this is what we got for all the effort we put in”. Disruption does not take place at hack days.

Which is why you should play to learn and not play to win. I was on the Times Digital team at newsHACK and we won our category, “Best Journalism Tool”. This was despite the fact that we built proof-of-concept prototypes for an incredibly complex and difficult-to-implement concept. The overall winners were the FT team, who took Dictionary of Numbers and made it more customised to the user. Clear, simple and easy to implement. That was the reason the judges gave for awarding them the overall prize. Interestingly, we were told that our hack was the most hotly debated amongst the judges.

There was a 4:3 split in the judges’ backgrounds: the majority came from digital management positions, three had strong editorial backgrounds. I believe the debate was caused by editorial backing our project and digital backing the FT. I also believe they were right to choose the FT. Our hack consisted of natural language processing with machine learning and a gamification skin to crowdsource the training of the algorithm. It was ambitious, too ambitious to win the hack day game.

But we didn’t set out to win; we set out to challenge ourselves and push each other to gain experience in new skills, which we couldn’t afford to do in a working environment. Which is why, I feel, we won. We got the most out of the two-day sprint.

It’s a team sport

My favourite moment of newsHACK was the look in our designer Mario Cameira’s eye when our developer, Aendrew Rininsland, was showing him how to edit the CSS of the Drupal instance he had spun up the night before. By the way, Aendrew should get a prize for the quickest build of a Drupal site. Everyone on our team gave 150% and tried new things and, better yet, came back with a list of new things they want to do to take the project forward. We worked so hard to realise our idea, not just to win. We wanted to show how machine learning can be applied to news. It was a rather lofty challenge but in the end, the news hacks and hackers running the show at the BBC, Sky, The Guardian and the FT got it.

My approach for a hack day is to imagine the most ambitious, most innovative hack and see how far we get towards realising it. It is not a winning formula; in fact it is a losing formula. But it gets every team member thinking and contributing their imagination and expertise. It pushes the boundary for everyone and brings to the fore the collective creative paths we can pursue. We are a new team at The Times with relatively little combined experience in the news world. By being recognised amongst a large audience of digitally forward media houses, we were punching well above our weight. What I’ve learnt is that playing to learn results in a better outcome for the team, and for each member individually, than playing to win.

FartBanana was a hack presented at Hacked.io. It consisted of two electroluminescent wires, each connected to a banana skin, that made a fart noise when they came into contact. It was hilarious to behold. It was absolutely useless but added so much to the event. Every time I think of the inventor with black rotting banana skins in his sitting room making fart noises, it brings a smile to my face. So never underestimate the power and importance of whimsy. I think if news application teams had outbreaks of spontaneous whimsy every now and again it would add to overall productivity and team morale. Whimsy as a management tool: think about it.

Now that the OpenNews fellowship programme is beginning to recruit its third round of fellows, I thought it a ripe moment to reflect on my time as a fellow, an alumna and the only alumna still working in the news industry.

To get an idea of what I did and with whom I worked at The Guardian, here is a video:

I am now a Data Journalist at Times Newspapers Limited. I have the role and the job title I wanted when I decided to learn to code. Most importantly, I consider myself a programmer. It took me a long time to get that into my head. I was at a D3 workshop recently and I realised I was more knowledgeable and more capable than most of the programmers who have ‘developer’ in their job title. It came as a shock, but I think it has allowed me to take even longer strides in my data journalism.

Even though I can no longer do whatever I want, I still have immensely more creative freedom and self-governance than anyone else with ‘journalist’ in their job title. Because I am self-taught, because I don’t need any instruments in a news organisation to produce everything they can, because I hunt for ideas on GitHub and Source and from DataMiners, I cannot be managed. And the digital team at Times Newspapers Limited are OK with that.

I see myself as a disruptive force and have been told to “keep on doing that”. I am working with a young, creative and determined team. We have huge challenges ahead of us and that makes us a team. We are a motley crew with the most diverse skills ever seen at News Corp. So my time as a fellow has carried on in spirit. You can’t undo a realisation. So I will always go where I can code stories.

Dan Sinker announcing the first ever OpenNews Fellows

And with me will always be Mark, Laurian, Dan and Cole. I was recently asked to describe the best team I ever worked with. I said the OpenNews fellows. Even though we weren’t in the same newsroom and in most cases not even in the same country, I still feel they were my fellows. We hacked, we taught, we drank whiskey together.

So to the next round of fellows I say: work open, make news, cherish the fellows.

We got around 70 women (and 10 men) in a room for 3 days talking, learning and building. The result was not the product ideas, the stories generated or the visualizations built but the sense of fun, community and excitement. The chicas went away brimming with enthusiasm and overcome with determination. Although it was too early in the process to get them all coding (which I am wont to do), the key moment for me was when I was accosted in the ladies by a participant who said she wanted to be like me, to do what I do. We received gifts, we received thanks, we received praise. But what we really gained was a retort to the online threats against women.

This is Giannina Segnini. She is an investigative journalist at La Nacion Costa Rica. She has put two presidents in jail. She has been targeted and shot in her own home. She takes the government to court to get data. She works with designers and developers. She is tough as nails. If any web troll dared threaten her to her face, she would make his penis retract.

We work in what is considered a “man’s world”. We may be subject to unappetizing remarks, slimy trolls and inappropriate comments. But mostly we do our job, we do it well and we get respect, from men and women alike. Often men are fed up with the brogrammer mentality and want to work in the environments we help foster. I can say for myself that I have never been threatened or maligned on the web. My presence tends to be gender-neutral, and that’s by my own choosing rather than out of fear of becoming a target.

Here I am at the launch of HacksHackers San Jose. The room was packed beyond capacity. I had 10 minutes, so I got out my laptop, sat down on the stage and coded (HacksHackers diva Chrys Wu held the mic while I typed). As Costa Rica is known for its bird life, I made a web app for tweeting out the songs of threatened, vulnerable and critically endangered birds in Costa Rica. All the code for it can be found in this GitHub repository.

As a woman whose presence online is just as important in forming my identity as my presence offline, I don’t feel I need a report button. I don’t necessarily believe that the most important thing for a woman getting online is to feel safe and secure but to feel included and empowered. That can sometimes be dangerous. So what I need is more Chicas Poderosas, more Girls Who Code, more <write/speak/code>. More women being of the web and not just on the web.

Preaching DDJ In The Netherlands (21 May 2013)

Towards the end of April I had the privilege of being invited to Utrecht University to teach a one-day workshop in Advanced Data Journalism (i.e. scary code). Here is a quick video they made to give people a quick idea of what ddj is about:

Teaching journalists to code is never easy and impossible in one day, but I wanted the participants to get a flavour of what is possible and where the journalistic pitfalls are. Being a new sphere for innovation in media, data journalism comes with its challenges. Luckily I only had one person run away after the “coding bit”. Others took up the baton and started scraping the very next day. One such person is Arno Kersten who works for the Dutch-Flemish Association of Investigative Journalists and publishes the ddj website Medialab.

Get Busy Building Before You Get Busy Funding (26 March 2013)

I spoke at the latest HacksHackers London meetup along with three other OpenNews Fellows. Also speaking were John Bracken from the Knight Foundation and Bobbie Johnson, who successfully started Matter on Kickstarter. Chronic blogger Martin Belam has a good round-up (see links) where he mentions my “rallying cry”:

“You have no excuse not to be making things now. You don’t need funding. You don’t need to be put in the newsroom. Start with the basics, expand it, and make something fun as a prototype.”

I was always told by developers that you only learn through making. Which is why I believe so many developers are not formally trained in computer science. My fellows are not. I would liken it to the gap between ‘studying’ journalism in college and what you learn by being part of a newsroom. The former is where you get the basics; the latter is where you build your craft.

I have applied to the Knight Foundation Prototype Fund to build a platform and will go to Kickstarter if that fails. And if that fails I will just build in my spare time. Entrepreneurism and start-up life is about, as my fellow fellow Stijn described his journey, a series of failures. I am not afraid to fail because I will just learn and build it myself. Even if I succeed with Knight I plan on having a Kickstarter project this year. I have also applied for funding for a big data driven investigative story. I am not looking to build a body of work but a body of knowledge. You gain knowledge from trying and failing.

When I started an internship at CNN, all those many years ago, the first thing Richard Quest asked all of us fresh-faced interns to do was pitch a story. Here I learnt there was an art to pitching and a successful pitcher became a successful journalist. The beauty of digital journalism is that you don’t need to be in a bustling newsroom to pitch. You can pitch to your audience. You can pitch for funding. You can build your newsroom online and gauge the success.

In that sense, you have no excuse not to be pitching now. Not to be having ideas. You don’t need to be in a newsroom. Build a platform. I have recently bootstrapped my WordPress theme. I installed this free theme and modified the code base. Tinker with products, tinker with code and tinker with the stories you can produce.

I’ll let you know how my funding and project go. In the meantime I will be on a panel at the Polis Journalism Conference speaking about trust in data journalism on 5th April. Sign up here, it’s free!

Don’t use it for what it’s made for, understand its functionality and use that

I said this over a year ago when I was first using social media tools to gather, filter and verify news. I’m sure it is one of the reasons The Guardian chose me to be its 2012 Knight-Mozilla OpenNews Fellow. And I stand by those words just as strongly today.

Gone are the days when a hugely popular blog and thousands of Twitter followers would pave your way into the newsroom. Now they allow you to be part of the brand, at the periphery. You could blog as part of a news organisation, working freelance and communicating every now and again with the editor of that section. Your battle to shape the news, pitch stories and drive the agenda is made more difficult by the “new media” silo you find yourself in.

Blogging is a new breed of the medium. It is the same species and it functions in pretty much the same way. The key to extending the news sphere is cross-breeding. Find a species bred for an entirely different purpose, understand its functionality and use that. For instance, I came across this graphic by The New York Times’ graphics editor and resident statistician, Amanda Cox:

As you can tell from the topic, this was published in 2008, long before data journalism became such a trend. I love this because something like this cannot be fathomed by a journalist alone. Amanda Cox sees data but she also sees a tool for analysis, one that can shed light on what factors affected whether Obama or Clinton won a state. I like this as well because I have learnt to do it (from a free online data analysis course run by Johns Hopkins University). Using R:

setwd("directory_your_data_is_in") # you need to set your working directory to where your data file is located

my_data <- read.table("name_of_data_file", header = TRUE, sep = ",", quote = "\"", dec = ".") # for instance, for a csv file whose first row holds the column names, whose entries are wrapped in " when they contain "," and which uses "." as the decimal point. Or you can just use read.csv()

install.packages("tree") # installing the tree package so you can access its functions
library(tree) # loading the functions into R

# To predict, say which Candidate (column name) wins from all the other columns in the dataset
predict_tree <- tree(Candidate ~ ., my_data)

# To print out the prediction tree
plot(predict_tree)
text(predict_tree)

# I think the data in the NYT example needed a lot of manipulation, turning numerical 1's and 0's into factors for example. Also finding the percentage splits on things like education involves looking at that subset as its own branch

Rather than being creative with the medium, why don’t we start thinking outside of the box in terms of the message? Analytical journalism is something Twitter, Facebook and Google can’t compete with. They can be faster, they can be more diverse, they can be more diverting. But your friends and followers are not going to take the time and effort to analyse raw facts. That is where the value of new journalism lies.

Linear regression could be used, for example, to see which more strongly predicts levels of school truancy: social or educational factors. But thinking outside the box needn’t be as demanding as a university course in data analysis. Instead of a static graphic or even an exploratory interactive, why not make an animation? With HTML5 video, 4G networks and Popcorn.js (which ties video to web elements) upon us already, the web is fast becoming a playing field for broadcast journalism. Here is a brilliant example of building a data narrative through animation:
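To make the regression idea concrete, here is a minimal sketch in Python using NumPy. The numbers and variable names are entirely invented for illustration; the original piece names no dataset.

```python
import numpy as np

# Toy data: truancy rate "explained" by a social factor and an
# educational factor. All figures are made up for this example.
rng = np.random.default_rng(0)
social = rng.normal(size=200)
education = rng.normal(size=200)
truancy = 2.0 * social + 0.3 * education + rng.normal(scale=0.5, size=200)

# Fit truancy ~ social + education by ordinary least squares.
X = np.column_stack([np.ones(200), social, education])
coef, *_ = np.linalg.lstsq(X, truancy, rcond=None)

# Because both predictors are on the same (standardised) scale,
# the larger coefficient marks the stronger predictor.
print(coef)
```

With real data you would also want standard errors and diagnostics, but the comparison of coefficients is the core of the "which factor predicts more strongly" question.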

For those who play with code, why not build robots to help you report? You can build a web scraper to monitor a data source and tweet/email you if certain conditions are met. The possibilities are endless.
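A sketch of such a reporting robot in Python: watch a page and flag anything that crosses a threshold. The URL, the pound-amount pattern and the threshold are all placeholders I have invented; a real monitor would also run on a schedule and send the alert by email or Twitter.

```python
import re
import urllib.request

# Hypothetical target: a council spending page. Placeholder values only.
WATCH_URL = "http://example.com/spending.html"
THRESHOLD = 50000

def payments_over(html, threshold):
    """Pull out figures like '£62,500' and keep those above the threshold."""
    amounts = [int(m.replace(",", ""))
               for m in re.findall(r"£([\d,]+)", html)]
    return [a for a in amounts if a > threshold]

def check(url=WATCH_URL):
    """Fetch the page and report any flagged payments."""
    html = urllib.request.urlopen(url).read().decode("utf-8")
    flagged = payments_over(html, THRESHOLD)
    if flagged:
        # Here you would email yourself via smtplib, or tweet.
        print(f"Alert: {len(flagged)} payments over £{THRESHOLD}")
    return flagged
```

Run `check()` from a cron job and the robot reports to you, not the other way round.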

If you are interested in news and journalism, look outside the reporting sphere, gain analytical skills, ask yourself “how does it function?” and use that.

DataMinerUK: What I Do And How (7 March 2013)

What I Do

Data journalism has sprawled into an all-encompassing term meaning new digital journalism, excluding social media/community. In other words, news generated from a newsroom which doesn’t solely consist of printed text. However, there is a wide variety of fields interwoven and interrelated under the term data journalism.

So when asked about getting into data journalism, one has to clarify which branch. The first split from the trunk has to be between the broader areas of investigative data journalism and data visualisation. As newsrooms moved from print to digital, few of them had the chance or expertise to experiment, so the whole tree fell under the data journalism canopy. Thus teams were made up of full-stack developers (most were designers and developers with an editorial leader, such as The New York Times team run by Aron Pilhofer).

Now data journalism is reaching its tipping point. A data journalist can no longer dabble with Excel, Google Fusion Tables, HTML and JavaScript and expect to be in the game. I was not lucky enough to be able to attend this year’s National Institute for Computer Assisted Reporting (NICAR) conference, #NICAR13, but following the Twitter feed revealed this shift.

Seeing resumes of people who can do a little of everything, but not having specialty or small set of specialities worries @pilhofer #NICAR13

I specialise in the backend of data journalism: investigations. I work to be the primary source of a story, having found it in data. As such my skills lean less towards design and JavaScript and more towards scraping, databases and statistics.

What I Need To Do It

There is no clear way to get into data journalism. Firstly you must decide what area of this wide field you are interested in. You must find where your passion and curiosity lie. You can never get to a level of training where you feel you can satisfactorily do your job. That’s the difference between journalism and data journalism. The web evolves in three-month cycles, so you are always playing catch-up, whether that’s retrieving data or making the latest unique interactive.

The most important thing you need is drive, because it is hard. I have learnt an object-oriented programming language, Python, primarily for scraping; a database language, MySQL (or Elasticsearch if the data is very large); and a statistical analysis language, R. I want to get at a story, I want to find an interesting pattern, a tantalising trail. For that, I need to be able to get, clean, sort and analyse data in order to understand it.

The problem of “what skills do you need to know?” depends on what you want to do, says @markhorvit. Understanding data is big part. #NICAR13

But it’s not just the skills that are an important part of the journey. That’s just what you need to get ready. It’s what you learn along the way that is your greatest asset.

How I Do It

I work in a virtual world. Literally. The only software I have installed on my machine is VirtualBox and Vagrant. I create virtual machines inside my machine. I have blueprints for many virtual machines. Each machine has a different function, i.e. a different piece of software installed. So to perform a function such as fetching the data, cleaning it or analysing it, I have a brand-new environment which can be recreated on any computer.

I call these environments “Infinite Interns”. In order to help journalists see the possibilities of what I do, I tell them to think about what they could accomplish if they had an infinite number of interns. Because that’s what code is. Here are a couple of slides about my Infinite Interns system:

To use this system I have another repository called Skel. This is the skeleton layout for all my data-driven projects. Every time I start a new project I download Skel. It has folders for the various components of my investigation: the source data, the transformed data, the analysis, etc. With each intern I bring to life, I transform the data and put the results on my own machine. After that I kill it. A benefit of having virtual interns is that no labour laws apply!
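A sketch of what such a project skeleton might look like in Python. The folder names here are illustrative guesses, not the actual layout of the Skel repository.

```python
from pathlib import Path

# Illustrative layout only; the real Skel repository may differ.
FOLDERS = ["source_data", "transformed_data", "analysis", "scrapers", "notes"]

def new_project(name, base="."):
    """Create a fresh investigation skeleton and return its root path."""
    root = Path(base) / name
    for folder in FOLDERS:
        (root / folder).mkdir(parents=True, exist_ok=True)
    return root

project = new_project("my_investigation")
print(sorted(p.name for p in project.iterdir()))
```

Keeping the same layout for every investigation means any script can assume where the raw data lives and where results go.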

All my processes are coded, as are my environments, so I can write a script that will automatically run the investigation from start to finish in one command. In that way it is completely transparent and reproducible.

Why I Do It

It has taken me a long time to develop this process. It’s built on what I’ve come to learn as an OpenNews Fellow at The Guardian. I use code to do journalism, to find and report stories. Thus my process of development is not built around agile or responsiveness or all the other ways developer teams are run. My process is built around transparency and reproducibility. It is meant to stand up in court.

I practise this at every scale. For instance, I have a three-step process when I scrape. For a large enough dataset, there will be a search function on a webpage which will pull out sections of the database. I write a scraper to pull out every section, paginate through the entries and store the URLs for each of the individual entries. Then I write another scraper to go to all the URLs and store the HTML for each page. All of it. Only after I have done that do I write a third and final scraper to collect all the necessary details and store them in a structured database.
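The three-step process might be sketched like this in Python. The URLs, regexes and field names are placeholders; a real scraper would paginate through the index pages, throttle its requests and handle errors.

```python
import pathlib
import re
import sqlite3
import urllib.request

ARCHIVE = pathlib.Path("raw_html")

def step1_collect_urls(index_html):
    """Step 1: pull the individual entry URLs out of a search-results page."""
    return re.findall(r'href="(http://example\.com/entry/\d+)"', index_html)

def step2_archive(url):
    """Step 2: store the page's full HTML, untouched, as evidence."""
    html = urllib.request.urlopen(url).read().decode("utf-8")
    ARCHIVE.mkdir(exist_ok=True)
    name = url.rstrip("/").rsplit("/", 1)[-1] + ".html"
    (ARCHIVE / name).write_text(html)
    return html

def step3_parse(html, db="entries.db"):
    """Step 3: only now extract structured fields into a database."""
    titles = re.findall(r"<h1>(.*?)</h1>", html)
    conn = sqlite3.connect(db)
    conn.execute("CREATE TABLE IF NOT EXISTS entries (title TEXT)")
    conn.executemany("INSERT INTO entries VALUES (?)", [(t,) for t in titles])
    conn.commit()
    conn.close()
    return titles
```

The point of keeping the steps separate is that each stage's output is saved before the next stage touches it.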

Having recently taught this process to a group of developers at Al Jazeera English, one of them asked me why I have this second, interim step. Surely it wastes time and computing space? You need it just as a journalist needs their notes. When you scrape a website you are fetching information from a site that has fetched it from a server. The entity you are investigating has access to that server and can change that information. It will change on the site, and what proof do you have that you did not fabricate the data or retrieve it incorrectly? If you have the HTML, you have proof of what the website served at that point in time.
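One way to harden that interim step further (my own suggestion, not part of the process described here) is to save a small "receipt" alongside each raw HTML snapshot, recording when it was fetched and a cryptographic digest, so the snapshot can later be shown to be unaltered:

```python
import datetime
import hashlib
import json
import pathlib

def archive_with_receipt(url, html, folder="raw_html"):
    """Save the raw HTML plus a JSON receipt (fetch time, SHA-256 digest)."""
    root = pathlib.Path(folder)
    root.mkdir(exist_ok=True)
    digest = hashlib.sha256(html.encode("utf-8")).hexdigest()
    stem = digest[:12]  # short, content-derived filename
    (root / f"{stem}.html").write_text(html)
    receipt = {
        "url": url,
        "fetched": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "sha256": digest,
    }
    (root / f"{stem}.json").write_text(json.dumps(receipt))
    return receipt
```

Anyone can recompute the hash of the stored HTML and check it against the receipt, which strengthens the "these are my notes" argument.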

I am a journalist by training and am learning developer skills. But I am creating my own processes based on what I need as a journalist, not as a developer. It is an ever-evolving process, but that insight is the result of two years delving into the world of data journalism. As long as you have an insight of your own, a unique way of getting done what you need to get done, then you are a data journalist.

Data Journalism – Where Is It All Going? (29 January 2013)

Firstly, I’m going nowhere, as I am having operations on my feet. The Internet, however, is my only freedom. Painkillers are my cage.

…how many data journalists are working today? How many will be needed? What are the primary tools they rely upon now? What will they need in 2013? Who are the leaders or primary drivers in the area? What are the most notable projects? What organizations are embracing data journalism, and why?

I spent last year as a Knight-Mozilla OpenNews Fellow at The Guardian, working within the Interactive Team, in the hope of getting to where a journalist in the age of big data needs to go. It was not just about learning to code, which I started a year previously at ScraperWiki, but understanding the digital story cycle, the editorial decisions and the internal politics. Most importantly, my journey was and still is the road less travelled. I had to completely retrain myself, with no syllabus, no certificate and no notion of what would be needed to don the robes of the data journalist. So here is my two cents.

How many data journalists are working today?

That depends on your definition of a data journalist. After the dawning of the era of the citizen journalist, we no longer have a clear definition of journalist! This is a moot point, but I would say it is more of a legal one. Rather than egotistically heralding the bastions of truthfulness, impartiality and objectivity, I would say a journalist is someone who can argue a case for public interest defence in court. A data journalist is therefore anyone who can argue a case for public interest if he/she is brought to court upon producing content for the web.

Typically a journalist could argue a case for public interest upon publication, traditionally in the form of print, radio and TV. And traditionally, the process toward publication began with collecting documents and documenting the process which transformed those documents into their resultant mediated format. Now, documents are data. We live in a digitized world. Therefore all journalists are becoming data journalists. It’s just journalism adapting to old media going out and new media coming in.

To answer the question (whilst not really answering it), the number of data journalists today is the number of journalists working in an organization which will survive the digital transition and find a stable business model, and who will not be fired before these things happen. The number of data-driven investigative journalists will be one or two per survivor successful enough to afford specialist desks. I make this distinction because all documents are data and all journalism is built atop documents; however, journalism in its simplest form involves mediating already-processed data, whereas data specialists will be processing and analyzing raw data in order to get to the stories within.

How many will be needed?

That depends on who survives the digital cull. Journalists will need to produce for the web first. Once news organizations settle on a content management system, I believe very few computer skills will be needed; training to use the CMS would be more than adequate. But I believe that the big stories, the special features and editorials are going to need that extra pizzazz. As such, small investigations coupled with video, graphic and/or interactive features will become desirable, especially amongst those editors who want to boast at conferences.

To create a long-form digital piece you of course need a multi-skilled team. For these teams to work efficiently requires each member to be familiar with all the processes and to have a vast range of skills. So a highly specialized data journalist with coding skills will be desirable, but only at a select few organizations. And I believe supply will be very low, as few journalists are interested in this route and even fewer institutions teach computer-assisted reporting to a standard adequate for the job requirements.

What are the primary tools they rely upon now?

Currently there is no standard and tools vary between institutions. The very basics are spreadsheets and Fusion Tables. Web standards are HTML and CSS. The more data-intensive (and ambitious) interactive desks work with R. This is a legacy of statistical training, as statisticians and data scientists coming out of university should be R-trained. Also, R is free. Assume everything they use will need to be free. One object-oriented programming language, Ruby or Python, is used. Some say either is fine but others, such as ProPublica, state they are Ruby houses. If a team dictates which one will be used as standard, this is most probably a good sign: it means they are sharing code and plan on reusing it.

The trend I think you will see this year and for years to come is an evolution towards JavaScript and HTML5 houses. These will be the hard skills looked for. Simply put, everything will move towards being browser-native so that there can be quality control across all devices. Responsive design rather than singular apps. Third-party tools will be phased out very quickly the higher up the digital ladder you climb. The web changes too quickly to rely on them. The tools that will be relied upon most will be people-based: creativity, problem-solving, ingenuity, design architecture, accuracy, etc. All the things for which we are yet to make an algorithm.

Who are the leaders or primary drivers in the area? What are the most notable projects? What organizations are embracing data journalism, and why?

These are all sort of the same question. The primary drivers are the organizations embracing data journalism, and they are creating the most notable projects: The Guardian, The New York Times, The Boston Globe, the Texas Tribune, the BBC, ProPublica, etc. The majority of the big players in the US are in the game, playing mostly to swap and/or steal people between them rather than fostering their own personnel or attempting a different approach. The UK has a couple of major players and some crouching at the starting line. The US has a much longer history of CAR and a more go-get-’em attitude that has allowed Aron Pilhofer and Scott Klein to adopt their own team management and structure.

Even though “data journalism” is on the rise (as a term, anyway), getting to where Aron and Scott are is extremely difficult, and being successful at it even more so. Aron and Scott were given the opportunity to form a team when news organizations were going through a frantic burst of evolution, straining to cling on to life. ProPublica is one of the more elegant species to arise from this turbulent time. Now, those who feel they survived the crunch are battening down the hatches and hoping to stave off the winter by being conservative. Because managers and directors have now heard the terms “big data”, “interactives” and “data journalism”, they feel someone has figured it out and all they need to do is steal them or, if that fails, find a cheaper copy.

Sadly, most news organizations are embracing data journalism because The New York Times has. Managers feel they should be spouting buzzwords at meetings and having coffee with Google reps. They need to be seen to be embracing the future, even if they have no idea what that entails. Many send editors to conferences but few send journalists to training courses. Most want to find out what free tools they can use, who they can partner with and who is the next Twitter. At the core of digital journalism development is incentive. It still doesn’t make money, but it can make careers. It has to stop being about the newsroom ‘ooooh’ factor and more about getting the right story.

What will they need in 2013?

Experience. They will need people attempting these projects on a small scale, so they need non-developers to have some skills and a thirst to learn new ones. They need more people and more models in order to experiment, as “data journalism” is not a solid thing yet. They need a stable internal structure in order to allow the team the time it takes to mature. They need the money to pay developers, good ones. They need the wider institution to understand what it is they are doing and to adopt the same fluidity.

They need to keep experimenting.

Hacktivities at the Mozilla Festival 2012

13 November 2012

Last weekend (9-11 November 2012) saw the second Mozilla Festival. Over a thousand people came through the doors of Ravensbourne College, packing 9 floors and hacking to their hearts’ content. Digital journalism superstars Aron Pilhofer, Brian Boyer, Scott Klein, Miranda Mulligan and the Guardian Interactive Team were in attendance.

I optimistically decided to run two learning labs: one teaching non-coders to hack, the other teaching hackers some media-literacy hacktivities and interactive video with Popcorn.js. So there is something for everyone! These will be put into a hacktivity kit, but until I find a home for turning my workshops into online lessons I’ll be putting all the links right here.

First, a note: if you’re attending my workshops or learning labs, bring a laptop and not a tablet. I’m very hands-on!

HTML for Journalists

Using Mozilla’s new webmaker tool, Thimble, you can code and see what the browser sees. In this hacktivity you mark up and style a news article, learning the basics of HTML and CSS. For those of you already web savvy there is a media literacy game suggested. Try it out!
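A sketch of the kind of markup the hacktivity covers (the article text and styles here are placeholders, not taken from the actual lesson):

```html
<!-- A news article marked up with meaningful elements, then styled -->
<article>
  <h1>Headline goes here</h1>
  <p class="byline">By A. Journalist</p>
  <p>The opening paragraph of the story sets out the who, what and when.</p>
  <blockquote>“A pulled quote from a source.”</blockquote>
</article>
<style>
  h1 { font-family: Georgia, serif; }
  .byline { color: grey; font-style: italic; }
  blockquote { border-left: 3px solid #c00; padding-left: 1em; }
</style>
```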

Location-based Storytelling Using Popcorn Maker and Popcorn.js

This was really fun to make and teach. Popcorn is the project that is going to get me to further hone my JavaScript skills. It is the most applicable webmaker project to journalism. Here non-coders use Popcorn Maker to replicate a BBC interactive, and those who code for the web can recreate it easily using Popcorn.js. Choose a track and try it for yourself. For non-coders I suggest you do the HTML course first, go through the web fundamentals and JavaScript fundamentals tracks on Codecademy and then have a crack at the Popcorn.js assignment!
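To give a flavour of the Popcorn.js approach, here is a minimal location-based sketch using the googlemap plugin. The file names, element ids and timings are all made up for illustration, and it assumes a Popcorn build that bundles the plugins:

```html
<video id="interview" src="interview.webm" controls></video>
<div id="map"></div>
<script src="popcorn-complete.js"></script>
<script>
  var pop = Popcorn("#interview");
  // Show a map of the story's location between seconds 5 and 20 of the video
  pop.googlemap({
    start: 5,
    end: 20,
    type: "ROADMAP",
    target: "map",
    location: "London, UK",
    zoom: 10
  });
</script>
```

The idea is that events are pinned to the video timeline, so the map (or a footnote, tweet, or article) appears and disappears in sync with the footage.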

Besides the above workshops, which were great fun and drew many journalists, all the OpenNews 2012 fellows gave a one-minute presentation on something they got up to this year. Watch it here (25 minutes in). The 2013 fellows were also announced. It was great to meet almost all of them and I look forward to following their amazing journey next year. It’s mind-blowing to think that such talented and diverse people are offering their skills to newsrooms around the globe. Counter to what most people would think, I believe now is an incredible time to be in journalism. People are telling their own stories, instantly and to the world. The fabric with which media people now work is truly intricate, interwoven through the public sphere. Now we should be experimenting with new types of stories that can be told.