Thousands of Australian websites have irretrievably lost their data and email files following a malicious security hack on Australian domain registrar and web host Distribute.IT.
The company has been scrambling to save data and get customers back online or moved to safe servers since the security breach occurred over a week …

That's what I was pondering.

I assume they had offsite backup servers in case of natural disaster, and just left the links up and live... not the best, not the worst.

At least all their clients can just restore from their own backups to a new provider, since they did do regular backups of their code and databases, right... right?

Painful lessons learnt. I hope some people reading this are going into their hosting panel right now and grabbing copies of the DB and whatnot, or urging their customers to pay up for a regular backup service :P
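For anyone who wants somewhere to start, here's a rough sketch of the kind of thing I mean: pull a dated dump of the database down to your own machine so there is always a local copy that the host can't take with it. It's Python, and the host name, credentials, database name and paths are all made-up placeholders, so adjust for your own setup:

    # Rough sketch only -- host, credentials, DB name and paths are placeholders.
    import datetime
    import subprocess
    from pathlib import Path

    BACKUP_DIR = Path.home() / "site-backups"   # local copy, NOT on the host
    BACKUP_DIR.mkdir(parents=True, exist_ok=True)

    stamp = datetime.date.today().isoformat()
    dump_file = BACKUP_DIR / f"mydb-{stamp}.sql"

    # Pull a dump of the remote database down to this machine.
    with open(dump_file, "wb") as out:
        subprocess.run(
            ["mysqldump", "-h", "db.example-host.com",
             "-u", "backup_user", "-pEXAMPLE_PASSWORD",  # placeholder creds
             "mydb"],
            stdout=out,
            check=True,
        )
    print(f"wrote {dump_file} ({dump_file.stat().st_size} bytes)")

Stick something like that in a daily cron job (or scheduled task) on a machine the host can't touch and you're already in better shape than most of the poor sods in this story.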

Here's some Fail for you, and Fail for you, and you and you.

From another report of this mess... "I think I'm in shock ... I have lost everything .... I couldnt possibly replicate all those years of work again ... my whole lifes work is gone down the drain," wrote one.

How does someone entrust another party with their life’s work, with no copies of it themselves?

Yes,

TFA does rather suggest that the backups were on disk, hot-connected to the servers involved, which does seem a little careless. (In my experience, when a machine dies, Windows Server does occasionally take even connected USB sticks down with it.)

Hmmm

Secure, I mean really secure. No, really.

We have a team of Malaysian students who meticulously copy all our data down on reams of paper in binary format, and then photocopy those pages, and store them in climate-controlled rooms on two separate sites, so if we are ever hacked and lose our data we can reconstruct it.

Of course, the team are currently 200-strong and about 3 years behind with the transcription process, but it's still a lot better than this newfangled fancy-dancy "cloud" rubbish.

Who is the villain?

Something's wrong here. The hacker appears to have highlighted a big hole in the host's backup policy, and that is unforgivable. It's very hard to keep a server safe; that's exactly why backups matter so much.

The worst a hacker should be able to achieve is wiping the server and possibly poisoning the last backup or two. That's why you should always archive backups: then you can work your way back to a known-safe position and minimise the loss.
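A minimal sketch of what I mean by archiving (Python, and it assumes dated dump files like the ones further up are landing in a local directory; the names are placeholders): keep a rolling set of dated copies instead of overwriting a single "latest" file, so even if the most recent backup or two has been poisoned, the older ones are still there to roll back to.

    # Rough sketch: keep the newest N dated backups, never overwrite in place.
    from pathlib import Path

    BACKUP_DIR = Path.home() / "site-backups"   # same hypothetical directory as above
    KEEP = 30                                   # e.g. a month of daily dumps

    dumps = sorted(BACKUP_DIR.glob("mydb-*.sql"))   # ISO-dated names sort chronologically
    for old in dumps[:-KEEP]:
        old.unlink()
        print(f"pruned {old.name}")
    print(f"{min(len(dumps), KEEP)} archived copies retained")

The point isn't the script, it's the retention: one overwritten "backup" is exactly what a patient attacker wants you to have.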

To all of the above: tera(peta?)bytes.

So if I were to want to do something like this, perhaps I would come at the "problem" bass ackwards. After compromising the main system, I'd poison only the backups, over enough time to "get" them all, and only then take down the main system.

One more backup rant

So the hosting company had no disconnected/offline backups or offsite tapes. Many of the customers had no personal local copies of their own data, the data that so many people's livelihoods entirely depended on. Are you freaking kidding me?! We're going to see more and more of this as budget "cloud" services appear all over the place.