The project reached its $5K goal on May 14 after being on Kickstarter just over a week

A hacker-journalist from the Los Angeles Times has launched an online project to capture the front pages of news websites on the hour, every hour, preserving them for future generations.

PastPages, which was launched by Ben Welsh earlier this month, takes an image snapshot of the world’s biggest news sites every hour and makes them available as an archive.

At the LA Times, Welsh specialises in using his skills as a computer programmer to assist his reporting.

To help maintain the website, which is currently costing him $60 a month out of his own pocket, Welsh turned to US-based crowdfunding platform Kickstarter.

Kickstarter allows people to pledge money to projects they want to fund in exchange for rewards. For $20, a backer can have their name credited on the PastPages website; for $250, a news organisation can have its site added to the archive; and for $5,000, a sponsor can have their name or logo featured on every PastPages page and be named as the site's sponsor.

The project passed its funding goal of $5,000 on May 14 after being on Kickstarter just over a week and still has over a month left to get more funding.

"Last year when Mubarak stepped down in Egypt I did a little tour of the web and took screenshots of all the major news websites and created a little gallery and it just sort of struck me that this is something we should be doing systematically, this is history," Welsh told Journalism.co.uk.

"In the same way people look at newspaper front pages as a snapshot in time, the front pages of the web are like a digital version of that.

"I just started hacking around at it as a weekend project. Then, earlier this year I went to a journalism conference in Austin and there were some academics there talking about content analysis they had to do and the difficulties of doing it.

"So I thought to myself I should finish this thing. I ironed out the last few little bugs it had and just put it out there to see what people would think."

Currently the only British news websites archived on his platform are the BBC, Daily Mail and Guardian, but Welsh has plans to expand this.

"I currently have 80 or so sites being archived at the moment, I want to double-triple-quadruple that."

With the fundraising a success, he also intends to expand the functionality of the website significantly.

Currently the site just takes an image snapshot of the front pages but in the future PastPages will scrape and host all the HTML, images and code running on the website.
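The core of that workflow is simple: visit each site on the hour and file the capture away under a predictable path. The Python helper below is a purely illustrative sketch, not PastPages' actual code (which is open source); the path-naming scheme is an assumption for the example:

```python
from datetime import datetime
from urllib.parse import urlparse


def archive_path(url: str, captured_at: datetime) -> str:
    """Map a homepage URL and capture time to a storage path.

    Hypothetical naming scheme for illustration only; the real
    PastPages source may organise its archive differently.
    """
    # Use the hostname (without "www.") as the top-level folder,
    # then nest captures by year/month/day/hour.
    host = urlparse(url).netloc.replace("www.", "")
    return f"{host}/{captured_at:%Y/%m/%d/%H00}.png"


# e.g. archive_path("http://www.bbc.co.uk/", datetime(2012, 5, 14, 9))
# -> "bbc.co.uk/2012/05/14/0900.png"
```

A scheduler (such as cron) would then call a screenshot tool once an hour for every site in the list and save the result to this path.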

This will create an archive searchable by keyword. There are also plans to create an API that would allow other programmers to build new projects and mash-ups with the site's data.

Welsh has no plans to profit from the website; instead he sees it as an important public resource. The source code the website runs on has been open-sourced so that anyone can build their own archive, answering the question Welsh poses: "what if I get hit by a bus?"

"I would view the site as a success if someone was studying the media coverage of the US election and came to me and said, 'could you give me the database of everything you have?' That would be really satisfying."
