Just as you can get a snapshot of Wikipedia and store it on your PC (or even PDA), it occurred to me that a snapshot of the OESF forum would be a very useful thing if combined with free-text search.

There's a huge amount of wisdom in it (a lot of which should be in the wiki, but isn't), so an archive would be a real asset. The low-graphics version would of course be best, but even the full version would be OK provided the archive was created without the attachments, as there'd be only one copy of the site.

I did try a speed-throttled wget once, but it wasn't too satisfactory.

Any chance of doing this on the server itself and making a monthly snapshot downloadable as a .zip or .tar.bz2? Some of us could burn CDs or DVDs to send to people without broadband.
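For anyone who wants to try the client-side route in the meantime, a throttled wget run along these lines is roughly what I attempted (the URL is a placeholder and the exact flag set is a sketch, not a tested recipe):

```shell
# Sketch of a polite, rate-limited mirror run (URL is a placeholder).
URL="http://example.org/forum/lofiversion/"
WGET_OPTS="--mirror --no-parent --wait=2 --limit-rate=20k --convert-links --html-extension"
# Printed as a dry run; remove the 'echo' to actually crawl.
echo wget $WGET_OPTS "$URL"
```

The --wait and --limit-rate options are what keep the load on the server reasonable.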

This is a terrific idea and I also wish it were possible. I think a forum like this is based on a database, so it shouldn't be too hard to dump the database content into plain text files (or maybe HTML).
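Assuming the board sits on MySQL (which I don't know for sure), a compressed dump on the server would be a one-liner; all the names and credentials here are made up:

```shell
# Hypothetical monthly dump of the forum database (all names are placeholders).
DB="forum_db"
DUMP="forum-$(date +%Y-%m).sql.gz"
# Printed as a dry run; drop the 'echo' to actually run it on the server.
echo "mysqldump --user=backup --password=SECRET $DB | gzip > $DUMP"
```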

If it isn't possible on the server side, there is still the "lo-fi" version of the forums, which consists of simpler HTML pages and is probably easier and faster to collect via a tool like wget or Plucker. So that may be another option.

Another nice feature would be to install an NNTP gateway or a POP/IMAP server in order to make it possible to download the forum contents into client programs such as email or Usenet clients. If it were also possible to add content (replies, new topics) from these clients, that would be even better.

But this would probably be a major effort, and since we can be thankful that some nice people run this forum without profit, I totally understand if this never happens :-)

QUOTE

This is a terrific idea and I also wish it were possible. I think a forum like this is based on a database, so it shouldn't be too hard to dump the database content into plain text files (or maybe HTML).

If it isn't possible on the server side, there is still the "lo-fi" version of the forums, which consists of simpler HTML pages and is probably easier and faster to collect via a tool like wget or Plucker. So that may be another option.

I've now found a program that will do the job, and am trying it out... the good thing is that it lets me download only URLs with "lofiversion" in them and throttle the bandwidth. The problem with wget is that it doesn't fix links, whereas this program does.

When it's finished, I will see what the results are like and offer a download on my website.

QUOTE

Another nice feature would be to install an NNTP gateway or a POP/IMAP server in order to make it possible to download the forum contents into client programs such as email or Usenet clients. If it were also possible to add content (replies, new topics) from these clients, that would be even better.

But this would probably be a major effort, and since we can be thankful that some nice people run this forum without profit, I totally understand if this never happens :-)

daniel

I did ask about an RSS feed, but I think this forum doesn't have the facility.

speculatrix, I tried using the program you listed to make an offline copy of this wiki site: http://www.uesp.net/wiki/Oblivion:Oblivion

It downloaded about 9,000 pages, resulting in 120MB; it took 1 hour on my ADSL connection. This is just for a computer game...!? Is there any way to know beforehand how much data is going to be downloaded? I was thinking about putting this wiki on my Zaurus, but it ended up being way too big for just one game. I decided to cancel, because it was also downloading 8 previous versions of the Elder Scrolls game wikis, which are linked in the left-hand frame.

I had previously just been saving individual web pages that interested me from this wiki.

Make sure you only include HTML files and CSS, don't allow it to leave the domain, ensure you specify the correct URL to match just the wiki, and so on. There are countless options!
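As a rough translation of those options into an HTTrack command line (URL is a placeholder, and you should verify the exact flags against `httrack --help` before trusting this):

```shell
# Sketch: one site only, HTML and CSS only, rate- and connection-limited.
URL="http://example.org/forum/lofiversion/"
# Printed as a dry run; drop the 'echo' to actually crawl.
echo httrack "$URL" -O ./forum-mirror "-*" "+*.html" "+*.css" --max-rate=12500 --sockets=1
```

The filters exclude everything ("-*") and then re-include HTML and CSS; --max-rate is in bytes per second.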

QUOTE(desertrat)

Don't the --convert-links and --html-extension options cover it?

Hmm, possibly, but I was struggling to get it to convert the query strings, fix everything, and stay on the lofiversion URLs. This Windows program just made it easier to do.
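For the record, wget can at least be kept on the lo-fi pages with a directory filter, though the query-string filenames still need --html-extension to become browsable; this is a sketch and the path is a guess:

```shell
# Restrict the crawl to the lofiversion subtree (URL and path are placeholders).
URL="http://example.org/forum/"
# Printed as a dry run; drop the 'echo' to actually mirror.
echo wget --mirror --convert-links --html-extension --include-directories=/lofiversion "$URL"
```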

I don't think it's a very smart idea to download forums with a tool like HTTrack. This forum is based on PHP scripts that interface to a database, and downloading it with HTTrack makes the forum engine generate thousands of pages, resulting in high server load and high traffic.

It shouldn't be too hard to make the database available for download regularly in a compressed format. Then one could write scripts to serve the database locally on the Z...

QUOTE

I don't think it's a very smart idea to download forums with a tool like HTTrack. This forum is based on PHP scripts that interface to a database, and downloading it with HTTrack makes the forum engine generate thousands of pages, resulting in high server load and high traffic.

HTTrack has nice features to limit the load it puts on servers; in fact, the latest version restricts bandwidth use to 100kbps.

I made sure it restricted the number of parallel page fetches too, so it took quite a while to download.

I've got building work going on at home and they've cut the power again, so I can't access my fileserver to do the upload at the moment.

I've just had a thought... what's the limit on the number of files in a single directory on a FAT32 memory card? Would the forum archive actually be storable on flash?

I don't think this should really be a problem, since you should probably put it into a squashfs archive anyway and mount that on the Z. That way you'll save a lot of space, because this is mostly text.
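Creating and mounting such an image looks roughly like this (the directory, image name, and mount point are examples; you need mksquashfs installed and squashfs support in the Z's kernel):

```shell
# Pack the mirrored tree into one compressed, read-only image, then loop-mount it.
# Printed as a dry run; drop the 'echo's to run for real (mount needs root).
echo mksquashfs forum-archive/ forum.sqfs
echo mount -t squashfs -o loop forum.sqfs /mnt/card/forum
```

A single image file also neatly sidesteps any per-directory file-count limit on FAT32.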

hmm, well, it's a pretty damn big file when zipped up - over 60MB! It took a long time to spider the forum with the rate throttled right down - 21,000 files or so!

I'm uploading the file now as a .zip to my website at http://www.zaurus.org.uk/downloads.html , because that way people who want a cramfs/squashfs mountable archive can create one, and people with Windows can unpack it and use a local search tool like Google Desktop to index it.
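On Linux (or the Z itself), a crude free-text search over the unpacked archive needs nothing more than grep; the directory and file names below are just a stand-in so the example is self-contained:

```shell
# Build a tiny stand-in archive so the example runs anywhere.
mkdir -p forum-demo
echo "how to flash the kernel on a C3100" > forum-demo/topic1.html
echo "battery life tips" > forum-demo/topic2.html
# Case-insensitive, recursive, filenames only:
grep -ril "kernel" forum-demo/
# → forum-demo/topic1.html
```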