Posted by Zonk on Saturday August 04, 2007 @06:30PM from the collaboration-makes-things-easier-you-don't-say dept.

BobB writes "Two universities — Bowdoin in Maine and Loyola Marymount in Los Angeles — have entered a unique arrangement under which they are backing up each other's web sites, email and servers on different ends of the continent. They say this could be a disaster recovery model all sorts of organizations could follow. From the article: 'When Bowdoin switched over to Exchange e-mail, so the schools would have similar e-mail infrastructure, LMU staffers were their guides and advisers. "We implemented that pretty quickly," says Davis, the Bowdoin CIO. "When we launched Exchange, we had just eight calls to our help desk." And the shared experience of the infrastructure components then forms a kind of informal help desk, where managers and staff can reach out for advice, brainstorm and troubleshoot problems with their colleagues a continent away.'"

I'll admit I have not used Notes from the admin side, nor used any version of it for about four years. The places I used it were a few retail locations (I was never personally impressed with the program), so possibly it is preferred for supporting a lot of remote users, but this was before RPC over HTTP in Outlook as well. The people I work with in IT used Notes several years ago and often use it as a humorous comparison to some other piece of software, essentially referring to how much they disliked the program.

Having been an administrator of both systems, there are only two areas in which Notes exceeds Exchange/Outlook. First, in Notes, when I set an item on my calendar as "out of office", it asks if I want to set an out-of-office message for that time. Fantastic, but that's available in E/O 2007, so it's only an edge if you're still on older versions of the pair. Second, I can set the number of people a conference room will hold so that users can't overbook it.

Wow, "simply install"? That's gotta be a freaking joke. Any email software that requires a bare minimum of THREE servers is so insanely not-simple. My single GroupWise server is a bit old, but it still runs GroupWise for quite a few clients (granted, not 14,000) and I don't get any calls for it except two people who managed to remove their "sent items" folder (and, as you said, spam-block checking).

Yes, so you spend all your time building one box and make it "perfect", then push the image to the other servers. You still had to build that one Notes box. A few hours to build the first, then 10 minutes to build the remainder. Your original point seemed to be the number of servers involved and the time it takes to set it up. My point was the additional servers do not take a significant investment in time. The 16 servers that make up our environment were all built in two days. After the initial build

No, but you can ghost a base OS image, then have it sysprep on first boot and install Exchange in its proper role. It was a bit more work than necessary for that many servers, but when a server goes down, I just have to plug my 2G flash key into any convenient machine, edit one of about 50 text files each describing a different server config and put the name and IP of the machine in. Then, plug it into the server and reboot. 20 minutes later, the server's up and running like it never went down.

Actually it requires three different 'server roles' that can be installed on separate servers or all on the same one. (There are two more optional roles too; one of these, the Edge Transport role, must be installed on a separate server, preferably outside the domain for security purposes.) Whether you need multiple servers to handle the processing load depends on the number of clients you have. And as far as simplicity goes, you can just hit "Typical Installation" and it will install the required three roles and only ask for basic information.

From what I understand, this is pretty common in higher ed -- in fact, the college that I work for is currently setting up something similar with another college in the area. Not cross-continent redundancy, true, but enough to keep things going should there be a smaller disaster in the area. If all of Western New York is wiped out, I don't really care if people can get their email.

This really came to the forefront with the beating the New Orleans area colleges took during Katrina; from what I recall, Loyola and Tulane were really unprepared and suffered for it.

I am not familiar with the specifics of these agreements, so perhaps you can tell me: is security jointly administered (blanket policies/configs, etc.), does the host institution have oversight, or does the institution that originates the data have oversight of the remote servers?

Not that it makes a huge difference... my sister had all of her data stolen (and consequently her credit was hijacked) through infiltration of a Bay Area college by ID thieves. No off-siting involved.

It doesn't surprise me either. It would surprise me if my university system (Wisconsin) wasn't doing this by copying to other campuses within the state. From what I've gathered from our IT department, we have IPv6 backbones running throughout the UW system (but oddly, not used to reach anywhere outside of it). Unfortunately, they have not taken them further than the server rooms. The only way I could get IPv6 to work when I lived on campus was through tunnelling.

My company has 30 sites and so it was easy for us to install (Linux) servers at multiple locations and arrange overnight rsync backups of data, server-located 'My Documents' folders, email & Intranet redundancy etc. for business continuity. I am a school governor for my son's local primary school and their backup procedure comprised a disk-to-disk copy from their main student server to another Windows-based server on the network, with an occasional dump to a removable hard disk.
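An overnight rsync job like the one described can be as simple as a single crontab entry. Paths and hostnames here are made up for illustration:

```shell
# crontab fragment (hypothetical paths/hosts): push the day's changes to the
# peer site at 02:30. -a preserves permissions/times, -z compresses over the
# WAN, --delete keeps the mirror from accumulating files removed at the source.
30 2 * * * rsync -az --delete /srv/homes/ backup.peer-site.example:/srv/mirror/homes/
```

Because rsync only sends changed blocks, the nightly run after the initial full copy is usually small even on slow links.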

When the school decided to improve their backup (after a disk failure and realisation that their backup process had not been working for a while, naturally!), they approached their incumbent IT supplier for a recommendation - which turned out to be a new main server with Windows 2003 Server, enough CALs for the children, dual Xeon processors, SCSI-based RAID 5 and a removable tape drive - very functional, very corporate and very expensive (approx £6,500) for a school that teaches 5-11 year olds!

The school approached me for my comments, and we are now looking at a two-way peering arrangement with the local secondary school comprising two Linux-based servers with SATA RAID 1 (the school is only using the server for low-volume file and print services, so Samba and CUPS are just what's needed), and an overnight backup strategy through the education WAN. Total cost is approx £750 for the two servers.

The only thing that may not make this fly will be County Hall red tape.

Eh, this is hardly unique. The university IT department I work for has a similar arrangement in place with three other universities. We offer colocation space for each other and in one case even make each other's IP address space available on both sides. If something really bad happens, our colleagues can bring up our web site at their location and vice versa. In addition, we also use SunGard [sungard.com] so that the administration is able to keep on running even if our campus falls off the world.

After that, you should be able to copy just the changes and the new files.

We use software that does exactly that with block-level backups to NetApp filers. Taking a wild guess here, I would say they have quite a bit of new data daily on the Exchange servers of two large universities.

After that, you should be able to copy just the changes and the new files. It is amazing.

Even better, use rdiff-backup, which uses rsync to transfer minimal deltas and preserves a complete history on the backup server. I do this with a number of systems that I back up and it works very, very well. When I first set it up I put extensive effort into building a script to automatically clean out backups so that I could keep dailies for a week, weeklies for two months, monthlies for a year, etc., but in practice I just keep dailies forever, because all backups except the most recent are stored as space-efficient reverse diffs.
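The setup described can be sketched as a couple of cron entries; hosts and paths here are hypothetical:

```shell
# crontab fragment (hypothetical paths/hosts): nightly rdiff-backup run.
# rdiff-backup keeps the newest backup as a plain mirror and older versions
# as reverse diffs, so restoring the latest state is just a copy.
0 3 * * * rdiff-backup /srv/data backuphost.example::/backups/data

# optional monthly pruning step (the kind of cleanup the parent describes but
# ended up never needing); --force is required when removing several
# increments at once.
0 4 1 * * rdiff-backup --force --remove-older-than 1Y backuphost.example::/backups/data
```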

The speed of light is more of a limiting factor for latency, and not throughput.

And throughput is affected by latency. Which was the original poster's point. A huge round-trip time will affect the number of Megabits/sec that you can get through a pipe regardless of how big the pipe is.
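This is the bandwidth-delay product at work: with a fixed TCP window, throughput tops out at window divided by round-trip time, no matter how fat the pipe is. A quick back-of-the-envelope check (the window and RTT numbers are illustrative):

```shell
# Max TCP throughput = window / RTT, independent of link capacity.
# Example: the classic 64 KB window over a 70 ms coast-to-coast RTT.
awk 'BEGIN { printf "%.1f Mbit/s\n", 65536 * 8 / 0.070 / 1e6 }'
```

That works out to roughly 7.5 Mbit/s even on a gigabit link, which is why long-haul transfers need window scaling or parallel streams.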

That said, it's not even a big deal for latency -- light travels at 186,282 miles/second. New York to LA is approximately 2,800 miles.
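Running the arithmetic on those figures:

```shell
# One-way speed-of-light delay, New York to LA, vacuum best case
awk 'BEGIN { printf "%.1f ms\n", 2800 / 186282 * 1000 }'
```

That is about 15 ms one way, so roughly 30 ms round trip before fiber and routers add their share.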

Light travels through fiber at a slower speed (roughly two-thirds of c), so you'll never see data moving through it at the vacuum figure.

And throughput is affected by latency. Which was the original poster's point. A huge round-trip time will affect the number of Megabits/sec that you can get through a pipe regardless of how big the pipe is.

Not necessarily. Given that these are pretty reliable links, you can set the transmission window relatively high without incurring very many penalties. That way, even if there is significant latency in the connection, you can maximize bandwidth.
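On Linux, letting the window grow on a high-latency link is mostly a matter of raising the TCP buffer limits; a sketch of the relevant sysctls (the values are illustrative, not a tuning recommendation):

```shell
# /etc/sysctl.conf fragment: allow large TCP windows on high-BDP paths
net.ipv4.tcp_window_scaling = 1           # RFC 1323 window scaling
net.core.rmem_max = 16777216              # max socket receive buffer (16 MB)
net.core.wmem_max = 16777216              # max socket send buffer (16 MB)
net.ipv4.tcp_rmem = 4096 87380 16777216   # min/default/max autotuned receive buffer
net.ipv4.tcp_wmem = 4096 65536 16777216   # min/default/max autotuned send buffer
```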

I never used to think the speed of light was much of a limiting factor in my day to day life, but once I was living somewhere without access to wired broadband and so I looked into satellite internet. Then I found that geostationary satellites are about 22,200 miles above the surface of the earth, and that the best-case round trip ping would be about 500 ms. Which is awful for gaming or working via remote terminal.
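The 500 ms figure falls straight out of the geometry: the signal has to go up and down once in each direction, so four legs of ~22,200 miles at the speed of light:

```shell
# Best-case round trip to a geostationary satellite: up and down, twice
awk 'BEGIN { printf "%.0f ms\n", 4 * 22200 / 186282 * 1000 }'
```

Roughly 477 ms before any terrestrial hops or processing delay, which matches the ~500 ms real-world pings.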

The difference between a cross-continental trip and shooting a signal up and down again to a geostationary satellite is a factor of about 20. And even then, satellite communication is perfectly usable for applications where latency isn't a huge deal, as long as your transmission protocols are properly tweaked for a high-latency connection and you're not sharing it with a whole lot of customers; the big drawback is that it's rather expensive (for some rather obvious reasons) to have a large portion of the satellite's capacity to yourself.

There have been times when the media have reported on a network called "Internet2." This is misleading, since Internet2 is in fact a consortium and not a computer network. "Internet2" is sometimes used, albeit as a misnomer, to refer to the Abilene Network.

This list [internet2.edu] [PDF] is better, and while it confirms that this particular college isn't on Internet2, University of Maine is a member. Your sleepy home state isn't entirely left out of the fun it seems.

I actually do see speeds like this. We pay $450/mo for a 3Mb guaranteed/100Mb burst pipe at our Vericenter cage in Boston, and about that much for a backup facility in Dallas. We have about 300MB of new data an hour to transfer, which we send in hourly batches, and it usually completes at something like 37Mb/s. Now this certainly isn't a guaranteed rate, and I wouldn't want to run an application that depended on that kind of throughput, but it shows you what's possible across Sprint's backbone for the money.

...that you might have to accept the legal responsibilities of the site that is being backed up. It's not just a simple exchange of corresponding services. Take it down to a personal level: who would you trust to use your personal computer as a backup server (in a reciprocal manner)? No one who doesn't have your full and complete trust, is my guess. Encryption would provide some protection, but this isn't just about data backup; it's about service fallback.

So unless you have some kind of legal agreement covering your actual risks, it's not for everyone. But for large-scale organisations with real legal clout, like universities, it might make sense. Not for individuals, though.

I don't know... companies that offer "online backup" still don't actually take responsibility for the integrity of your data. Also, you'd find out pretty quickly whether the person on the other end of your backup is doing a good job. Try accessing your data, running checksums, etc.

After all, you don't really need to make sure that the person you have the deal with never once loses a piece of data, only that the chances are remote of him losing a piece of data at the same time you lose that same piece.

If that is a worry to you, use my program cryptosync [vanheusden.com]. It compresses and encrypts each file and also encrypts the file and path names. All files stay individual files, so you don't need to transfer a tar file of gigabytes.
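The per-file idea can be sketched with stock tools as well. This is not cryptosync, just an illustration of the approach (and unlike cryptosync it does not hide the filename): each file is compressed and encrypted individually, so a sync tool can still transfer changed files one at a time instead of re-sending one giant encrypted tarball.

```shell
# Hypothetical sketch: compress and encrypt a single file so it can be
# synced to a semi-trusted peer without exposing its contents.
f="report.txt"
gzip -c "$f" | openssl enc -aes-256-cbc -pbkdf2 -salt \
    -pass pass:changeme -out "$f.gz.enc"

# Restore: decrypt, then decompress.
openssl enc -d -aes-256-cbc -pbkdf2 -pass pass:changeme \
    -in "$f.gz.enc" | gunzip > "$f.restored"
```

In practice you would loop this over a directory tree and feed the `.gz.enc` files to rsync.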

It's always nice to see something from Maine featured in technology news. We're a tiny state population-wise, but there are many high-tech companies here.
Hell, even in my own town there is a company that developed and makes the MK47 Advanced Lightweight Grenade Launcher as well as a bunch of other armaments for the military.
Sure, this post is off topic and my karma blows already, but I always have to say "Go Maine!" whenever I read about my little state on any technology forum/site.

The summary mentions a university using Exchange successfully. Anathema, I say! Anathema! Have we no sense of decency? Must we sit idly by and let such disinformation blot out a beacon of truth here on the Internet? Let us take up our arms, brothers, and march onward. Er, roll onward, since all our computer chairs have wheels on them. We shall destroy the enemy while sitting upright in an ergonomic fashion! We shall upend the world from the depths of the basement!

The change has been rather surprisingly successful. I had been very happy running our previous Sun servers and the control I had over everything via LDAP, but now all email changes are under the control of the Microsoft systems. However, the new email has been relatively easy to use, with fewer problems (though the web client does not do nearly enough justice for non-IE users compared with the old Java interface), it still supports all the features I actually use, and if it helps in this hot-site effort, so much the better.

that my mail is being watched by 'the authorities' (you know, the ones which are keeping us safe from terrorism so successfully these days), and that if I should ever lose any I can just make a baseless accusation against myself to the police and have them restore it from their backups.

"They say this could be a disaster recovery model all sorts of organizations could follow."
For private businesses maybe, but I'm sure hosting backups on other organizations' hardware is not acceptable under SOX.

This is common for survivalist and preparedness minded folks. You and a trusted relative or friend exchange backup critical gear/necessities/copies of records, etc. In case of catastrophic loss of either abode, the other person has a decent "backup" for you to fall back on. Arrangements like this have been quite common for some decades now; usually they include mutually assured lodging, should full long-term evacuation be required. IMO, it is quite a sound idea.

Remember on the news, you see the same scene all the time, those scenes from... take your pick: fires, floods, hurricanes or whatever. The newsies always zero in on those folks who are all freaked out and sad, and EVERY time they say "We lost EVERYTHING!" Well, there's no need for that if you take the time in advance to preposition enough of your gear so it doesn't fall into the "everything" category. The situation will still suck, but having a nice set of backup everything will sure help mitigate things and make the situation suck *less*.

As to what to exchange/store, use your imagination: what would you like to have as a backup if for some reason your home just got wiped out? Spare sets of clothes for everyone, favorite toys for the kids, some electronic gear, tools, sporting goods, books, other media of importance to you, family photos, household records, personal mementos, etc. Salt to taste there. Even just a stuffed closet is good enough, that and the place to evacuate *to*.

That, and what we call BOBs, or "bug out bags", are good ideas. A "bob" is a backpack or other container (backpacks are good in case you get stuck on foot) that has enough critical essentials to keep you alive for a week or so, likely enough to get you out of the disaster area even on foot. It's called a bug-out bag from the old army term, and it is designed so that if you have zero notice (you hear on the radio that the local railroad has a tanker car full of chlorine leaking, or a nasty forest fire is heading your way, and it's close), you can grab it and go, out the door in less than one minute. Very high-speed emergency evacuation. The deal is, you hope you never need it, but if you do, it literally could save your life.

Interesting subject, and although it is not directly related to the main parent IT topic, the concept is very similar.

BTW, back in 1993, we used to hold annual DR swaps with another federal agency that had systems similar to our own. We also had an MOU to mirror critical data and use each other's facilities in case our primary site was hosed (or is that swarming with the walking undead?).

So does this mean that if one University doesn't give up information to the RIAA, then the RIAA could just try to get it from the "backups" from another University ?
Not that I would ever suggest that the RIAA would ever do such a thing.

Exchange isn't so problematic because it is written poorly. The problem is that it is so frequently administered poorly. It's also problematic because Exchange experts are few and far between. But then again, how many sendmail or qmail experts are there?

I wonder, though: are they using Exchange just for e-mail? Or are they using it for scheduling, shared folders, etc.? I can't see implementing shared schedules university-wide and only receiving 8 help desk calls. You'd think more than 8 people would be calling.

And there was me thinking the problem was that the Exchange Jet storage engine is the biggest pile of dino turd going. We know from PostPath that Exchange really is a pile of junk: it gets five times the throughput on *IDENTICAL* hardware, talking the same binary protocol down the wire. I remember back when Exchange first came out, everyone said it just chewed through CPU cycles and IO bandwidth compared to other solutions. Now we know that it really is down to rubbish programming on Microsoft's behalf.

I'm not sure what's special about this.
Replace University with DataCentre and this happens all the time.
With SANs and dark fibre, you can get machine A at datacentre A writing data directly to tapes in an automated library in datacentre B, and vice versa.
And they generally back up a hell of a lot more than email and websites.

I've experimented with it (an earlier version, though) and it was pretty slick. I didn't ever finish setting it up and getting it running, but I can't remember off the top of my head what problem I must have run into. I think I got distracted by the prospect of setting up ext3cow, which is an automatic timeshifting/versioning filesystem (but it's pretty early/beta, not for production), and generally ADDed out.

I should go back and take another look at it. One of the things I really liked was that it was developed

Excluding religious points, why not? Exchange is nowadays a VERY MATURE collaboration system and the de-facto standard for business in many places. What's the difference? Use Exchange, GMail, POP3 or whatever you want. It's all about freedom, isn't it?

/late night rant on/ Actually there is a large number of individuals who are supporters of the free/open-source movement for ideological reasons: "it's all about freedom" they cry, "let us all decide what to use", "information wants to be free", "yadda yadda"... And then, when somebody chooses to use Exchange or Vista or whatever, they are the first to jump and cry foul... Wasn't it about freedom after all? Well, they made their choice, so what's the freaking problem then? It's actually very amusing. Bytes are bytes.

The University I work for has used Exchange for the last 4 years. From my point of view as a user, it's been excellent. There hasn't been more than a handful of hours of downtime in my entire term there (that I've noticed). I'd like to see Microsoft broken up the way AT&T was a few decades ago, but for real. Not, however, because their Exchange Server sucks. Vista is a different story, of course, and is a real dog. But by automatically being critical of every product, the "I hate everything Microsoft" crowd undermines its own credibility.

It is usually not about each individual's freedom, but every individual's freedom... One person deciding that an entire university will use Exchange and nothing but (thus exercising their freedom) severely limits the freedom of the students and faculty at that university to choose email clients. Luckily my undergrad university also offered IMAP access (although they very strongly discouraged it); otherwise I'd have been stuck either using Windows and Outlook or using the webmail interface (evolution conne

Unless you use Firefox. I use all three (Outlook, IE and Firefox) to read my mail and manage my calendar. Exchange in Outlook and IE is very similar, but in Firefox it's missing most of the basic functionality, like flagging a message and decent search.

On the other hand, when not using Firefox for it, it actually has several good features that I haven't found elsewhere; the fact that it's easy to sync the calendar between the Exchange server and a Pocket PC is one of them.

Exchange is nowadays a VERY MATURE collaboration system and the de-facto standard for business in many places.

The same can be said about Windows, can it not? Certainly "mature", right? And with 95% of desktops running it, there is no argument that it is anything but a standard.

Just like the rest of Microsoft's products, Exchange is very appealing on the surface and from the start. Then the real problems start creeping in, and soon you can't buy new hardware fast enough to keep the piece of crap running.

The warm, fuzzy, feel-good parent article talks of two organizations able to share data files, something that has been going on for as long as "off-site storage" has been around. I am baffled: what is the point of this article?

Who modded this? Aside from the post being more or less irrelevant (it's not about a multi-peered architecture), his comparison to his LAN using his parents' system should have been a good reason to rule out ANY enterprise architecture expertise whatsoever.

That said, as a system admin whose business does not have any kind of secondary solution (no hot/hot, no hot/cold, etc.), I'd still be leery of trusting my data or my livelihood to a peer and an admin team I didn't know. Maybe this works better in academia.

I think dumping an encrypted file onto someone else's network over a secure connection isn't such a big deal. I mean, you're not an idiot, so you'll be using a pretty decent encryption tool on the data before handing it over to the semi-trusted peer. One box and a VPN is a lot cheaper than a colo. I think it's kind of an interesting experiment. You do make a valid point, though, about how it probably will not catch on in corporate America.

Exchange is great... For calendaring and such. But for mail? Yech... If people just must use outlook (we like to call it LookOut where I work), please let your outward facing MTA be a hardened *nix host running (e.g.) postfix with a lot of anti-asshat countermeasures (spam/nigerians/viruses/whatever) and let that secure box forward the defanged mails to the happy-dappy leaky-as-a-sieve exchange server. It will save you a lot of headaches.

Exchange used to be a great steaming pile of rubbish you could not back up properly without shutting down everything for the duration of the tape run (yes, there were hacks, but not really good enough for bare-metal recovery). It has improved a lot since then; personally, I still prefer just about anything else to something with big, weird, slow databases you can't read with anything else and that change format between versions. So long as things are kept carefully in step (i.e. same versions and patches so no b

Yes, Exchange has not been difficult to back up in some time. The only version I ever used that would corrupt itself was 5.5, and that's well over 9 years old now. The only issue now is with poorly trained admins who still try to run brick-level backups or use ExMerge as their backup tool. MS has repeatedly told admins not to do this since Exchange 2000, and there are still backup programs that tell you to do it this way. You *will* break something using that method. It's akin to backing up a 500-table database one table at a time.