Posted
by
timothy
on Thursday March 13, 2003 @02:54PM
from the relating-phenomena dept.

Michael Buhrley writes "I felt a pretty good earthquake this afternoon in Tokyo. I immediately went to the Japan Weather Association earthquake information page to see if it had registered the quake, which it had not (the ground was still shaking at this point). Twenty seconds later, when I refreshed the page, the server had slowed to a crawl.
I had been looking at traffic graphs for one of my servers earlier and thought it would be neat to correlate the traffic data with the seismic data for the event.
I wonder how quickly a noticeable traffic spike could be detected and what other information could be gleaned from the web behavior. Lots of traffic = big quake or quake in big city.
The U.S.G.S. Pasadena Field Office has a page that compares this phenomenon to the Slashdot effect."
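The correlation the submitter wants can be sketched with nothing more than an access log. Here's a minimal stab at it (the bucket size, baseline, and spike factor are illustrative assumptions, not anything from the USGS): count requests per minute and report the first minute that clears some multiple of normal traffic.

```python
from collections import Counter
from datetime import datetime, timedelta

def requests_per_minute(timestamps):
    """Bucket request timestamps (datetime objects) into per-minute counts."""
    return Counter(t.replace(second=0, microsecond=0) for t in timestamps)

def spike_onset(counts, baseline, factor=5):
    """Return the first minute whose request count exceeds factor * baseline,
    or None if no spike shows up."""
    for minute in sorted(counts):
        if counts[minute] > factor * baseline:
            return minute
    return None
```

Feed it timestamps parsed from your server logs plus a quiet-hours baseline; per the USGS comment below, the onset should land within a minute or so of the quake itself.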

I'm the system geek for the USGS Pasadena office, and I wrote the article referenced. I've found that traffic on our servers starts increasing within one minute of the earthquake. The peak traffic for California earthquakes is always ten minutes after the event. You can set your watch by it, it's that dependable.

As for the USGS servers having problems after earthquakes, we've been served through Akamai EdgeSuite since late 2001. So for the most part, our servers have been doing better. We've had a co

Now that you've analyzed the time relationship of the responses, how about doing analysis of the spatial response? Get geographical data on all the IP's that made web requests and display the result as an animated map.
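A rough sketch of that spatial idea, assuming you have some IP-to-coordinates lookup on hand (GeoIP databases exist; the `lookup` callable below is a stand-in, not any specific product): group geolocated hits into time slices that a mapping tool could then render as frames of an animation.

```python
from collections import defaultdict

def frames_for_animation(requests, lookup, frame_seconds=60):
    """Group geolocated requests into per-frame coordinate lists.

    requests: iterable of (unix_timestamp, ip) pairs from the access log
    lookup:   callable ip -> (lat, lon) or None; stands in for a GeoIP
              database, which this sketch assumes you have
    """
    frames = defaultdict(list)
    for ts, ip in requests:
        coords = lookup(ip)
        if coords is not None:
            frames[int(ts) // frame_seconds].append(coords)
    return dict(frames)
```

Each frame is one still of the animated map; the actual plotting is left to whatever mapping tool you like.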

So let me get this straight: you felt an earthquake, and while it was still going on you went to a web site to see if you were really feeling an earthquake, instead of seeking some kind of shelter? That's way more geek than me.

The Richter Scale isn't really used anymore. Its original purpose was to measure vertical ground motion, and it loses accuracy above something like an 8.5.

What is used now is something called the Moment Magnitude Scale, which actually computes the amount of energy released in an earthquake. It gives fairly similar numbers to the Richter Scale when you compare magnitudes of earthquakes. Obligatory linkage [ualr.edu]. One thing to note, though, is that each whole step up in the MMS is roughly a 30-fold increase in the amount of energy an earthquake releases. So a 6.0 releases about 30 times more energy than a 5.0.

Basically what this comes down to is that people will think, "Oh good! We had a 6.0 on the San Andreas Fault! That should release some of the built-up energy!" The SAF is capable of producing about an 8.0 here in Southern California, so it would take roughly 900 6.0's to equal the energy of an 8.0 :)
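For the curious, the "about 30" figure comes from the Gutenberg-Richter energy relation, where released energy scales as 10^(1.5·M); the exact per-step ratio is closer to 32, and two whole steps works out to 1000 (the parent's 900 is the round-number version, 30 squared). A quick check:

```python
def energy_ratio(m_small, m_large):
    """Ratio of seismic energy released between two moment magnitudes,
    per the Gutenberg-Richter relation log10(E) = 1.5 * M + const."""
    return 10 ** (1.5 * (m_large - m_small))

print(energy_ratio(5.0, 6.0))  # one whole step: ~31.6x
print(energy_ratio(6.0, 8.0))  # 6.0 vs 8.0: 1000x
```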

It seems that people who live in fairly seismically active areas are pretty keen on these web pages. The USGS has a site showing earthquakes in California, located here [usgs.gov]. We had a decent-sized earthquake here a few weeks ago in Big Bear at about 4:20 in the morning. It woke me up near the end, and I couldn't be sure whether there had been an earthquake or not. So I got up, turned on the computer, and sure enough, the site showed the earthquake immediately.

One of the cool things about that site is that you can report what you felt in your area, and they create a shake map based on the reports. Within 10 minutes there were already about 15,000 reports, and that number climbed quite a bit as the morning went on.
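The aggregation step behind a map like that can be imitated in a few lines. This is a crude stand-in for whatever USGS actually does, assuming only that each submitted report carries a location and a felt-intensity number: average the intensities within each grid cell.

```python
from collections import defaultdict

def shake_map(reports, cell=0.1):
    """Average reported intensity per lat/lon grid cell.

    reports: iterable of (lat, lon, intensity) tuples
    cell:    grid cell size in degrees (0.1 deg is an arbitrary choice)
    """
    cells = defaultdict(list)
    for lat, lon, intensity in reports:
        key = (round(lat / cell) * cell, round(lon / cell) * cell)
        cells[key].append(intensity)
    return {k: sum(v) / len(v) for k, v in cells.items()}
```

With 15,000 reports in 10 minutes, even this naive version would give a usable intensity picture; the real product presumably weights and filters the reports far more carefully.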

Part of my reason for logging onto the site after an earthquake is curiosity. I want to know where it was centered and how big it was. I think that has to do with a lot of other people's reasons for logging on as well.

I'm sure if you looked at cnn.com's traffic logs for 2001-09-11, they would look pretty similar to what you'd see for something like an earthquake. Actually, I would really like to see that in graph format, or even Slashdot's traffic logs for that day.

You can read Taco's account of manning the Slashdot servers [slashdot.org] through 9/11. In summary: normal Slashdot traffic is about 20 pages/sec and 1.4 million pages per day. That spiked to about 60-70/sec on 9/11, with nearly 3 million for the day.

For what it's worth, I read that traffic on CNN was about four times normal that day. Earthquake-driven traffic spikes are about that size overall, but they are very sharp. After the Nisqually earthquake near Seattle [2/28/2001] traffic on the Earthquake Hazards Program [usgs.gov] web site went up by 300x in 25 minutes.
It went from about 2 hits/sec to over 700/sec. I wrote a small article about it at
http://bort.gps.caltech.edu/spikes/28feb2001 [caltech.edu]

I found the way that the USGS down in LA ended up implementing load balancing even more informative than the fact that Michael went to check the information on the website. After all, in the many quakes I've felt, I've always gone to the USGS website [usgs.gov] once during and a number of times afterwards to find out both the epicenter (one was too damn close) and the magnitude. And in two cases the website was updating the start and end of the quake while I was reloading.

While we call this the Slashdot effect, lots of other portals do the same thing. For instance, when the Elizabeth Smart case broke yesterday, most Utah news organizations had their servers thrashed by over-access. That was in part because the Drudge Report (and likely others) linked to local news stories.

As for utilizing this, I suppose you could set up a script that monitors such sites in a manner akin to ping, although I think most administrators would prefer you didn't. Get a bunch of such clients going and you effectively have an accidental denial-of-service attack.
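If you did want to try it despite those objections, a polite version would poll very slowly and compare each response time against its own running history. Everything below (the URL, the five-minute interval, the 5x-median threshold) is an illustrative assumption, not a recommendation:

```python
import time
import urllib.request

def probe(url, timeout=10):
    """One fetch; returns latency in seconds, or None if the request failed."""
    start = time.monotonic()
    try:
        with urllib.request.urlopen(url, timeout=timeout):
            pass
    except OSError:
        return None
    return time.monotonic() - start

def monitor(url, probe_fn=probe, interval=300, samples=12):
    """Poll slowly (default: every 5 minutes) and yield any latency far above
    the running median. Polling much faster starts to resemble the accidental
    denial of service described above."""
    history = []
    for _ in range(samples):
        latency = probe_fn(url)
        if latency is not None:
            history.append(latency)
            median = sorted(history)[len(history) // 2]
            if len(history) > 3 and latency > 5 * median:
                yield latency
        time.sleep(interval)
```

Comparing against the site's own median is also a partial answer to the baseline problem raised below: the threshold adapts as the server gets faster or slower.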

Further, such monitoring only works on servers that aren't designed for high traffic. Put another way, what would cause a slowdown for your local paper is likely very different from what would cause a slowdown for CNN. And as servers are upgraded, you lose your "baseline."

So there is an effect, and the effect correlates with what is "significant" to the readers of that site. But doing much with the information would be hard.

Having said that, though, someone joked about an early warning system built by checking the Drudge Report (a popular American news portal). It probably is a good idea. When there is breaking news, most people go to the Drudge Report because he typically links to the best information about the story. He also tends to put important breaking stories in red.

I thought from the headline that the impact of vibrations on the hardware would be discussed. I can imagine that a hard drive might perform a little slower if it were rattled violently. Does anybody mount their servers with this type of shock absorption in mind?

The CDC looks for similar stuff: spikes in reported illnesses, blah, blah, blah.

The way they really get any idea of what's going on is massive correlation of many databases: school absenteeism jumps in an area, combined with a local increase in NyQuil purchases, or something like that.

Did you know that when it rains, sales of umbrellas go up? And when it's sunny, sales of ice cream go up too? When the adverts come on the telly, electricity usage surges? These are not mysteries.

Now that Slashdot has linked to the site, I expect it will experience a lot of hits. As a result of the spike, they will assume that there has been another earthquake (since a lot of page hits = earthquake!). Panic will result, since they will know there is an earthquake but nobody can locate it. I name this the Slashquake effect.