The current full re-upload, overwriting all tracks on SST, will probably take another 5 weeks.
Until it is finished we will keep seeing "skipped tracks" in the Q.

My suggestion: in addition to the slow upload of all files, build an "on-demand upload".
Any track that is already in the Q would get priority, so we would stop the skipped-tracks problem in about an hour instead of at the end of March.

It could be accomplished with something like a cron job that asks the database what is in the Q but not yet updated, and then fetches just those files, e.g. via scp from JERIC's backup at home.
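To make the fetch step concrete, here is a minimal PHP sketch of what the cron job would run per missing track. The host name, directories, and function name are placeholders I made up, not the real backup layout:

```php
<?php
// Build the scp command that pulls one missing track from the backup.
// "backup-host" and both directories are hypothetical placeholders;
// escapeshellarg() guards against spaces and odd characters in names.
function build_fetch_command(string $track): string
{
    $remote = escapeshellarg('/backup/tracks/' . $track);
    $local  = escapeshellarg('/srv/radio/tracks/');
    return "scp backup-host:$remote $local";
}
```

The cron job would loop over the queued-but-not-yet-updated filenames from the database and pass each command to system() or exec().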

And who, pray tell, would tend to that "on-demand upload list" and sift through terabytes of tracks to make it happen? I trust that you realize that a human or twenty would be required to do it manually...
_________________
diginferno

.::If my answers frighten you then you should cease asking scary questions::.

I am talking about a script running on SST and the other stations.
It would run on the 24/7 server and periodically check whether all the tracks that are soon to be played (already in the Q) are present and their checksums match the backup.
If not, it would upload the missing/damaged track.

Computing checksums for a terabyte of files takes less than 24 hours, so how long can checking them all really take?

Well, we have PHP, and it works on Linux and Windows.
It's not ideal, but it's fine in this case, since we need some binding to the database to get the current playlist,
and then a function like "$hash = md5_file($filename)" can work on the problem on the server side.
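A minimal sketch of that server-side check, assuming the expected hash for the backup copy is already known (the function name is ours, not an existing API):

```php
<?php
// Compare a local track's MD5 against the hash recorded for the
// backup copy. A missing file counts as damaged, too.
function track_is_intact(string $path, string $expected_md5): bool
{
    if (!is_file($path)) {
        return false;
    }
    return md5_file($path) === $expected_md5;
}
```

On a false result, the cron job would queue just that one file for re-upload.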

I would prefer doing these things on Linux too, but we can make do with what we have.

Having hashes of all the music files on the server and in the backup could save us a lot of time, since only about a third to a tenth of them are broken or missing and need fixing.
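With both hash lists in hand, finding the files worth re-uploading is a simple comparison. A sketch, assuming each list is a filename-to-MD5 map:

```php
<?php
// Given two filename => md5 maps (server vs. backup), return only the
// tracks that are missing on the server or whose hash differs --
// everything else can be skipped by the upload.
function tracks_to_fix(array $server, array $backup): array
{
    $fix = [];
    foreach ($backup as $file => $md5) {
        if (!isset($server[$file]) || $server[$file] !== $md5) {
            $fix[] = $file;
        }
    }
    return $fix;
}
```

If only a third (or a tenth) of the library differs, the result list is correspondingly short, and the re-upload touches just those files.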