Posted
by
timothy
on Friday June 24, 2011 @06:10PM
from the let's-make-y2k-feel-dumb dept.

hawguy writes with an AP story about upcoming tests of greater allowed variation in the frequency of the current carried on the U.S. electric grid: "A yearlong experiment with the nation's electric grid could mess up traffic lights, security systems and some computers — and make plug-in clocks and appliances like programmable coffeemakers run up to 20 minutes fast."

and doesn't understand what happens when you're even a bunch of *degrees* out of sync, much less a few decihertz. We don't have *near* enough HVDC intertie to make this not matter, and I can't imagine how they think this is gonna work -- nothing at all on NERC's website to say what's *really* gonna happen, either.

Being out of sync is BAD but once you tie generators together they will keep each other in sync, so it's really only a concern when you first tie them together.

The thing is, at the moment they tweak the grid for no reason other than to keep the average number of cycles per second very close to 60 over the long run, so clocks stay in sync. It sounds like they are planning to stop doing that.

and doesn't understand what happens when you're even a bunch of *degrees* out of sync, much less a few decihertz. We don't have *near* enough HVDC intertie to make this not matter, and I can't imagine how they think this is gonna work -- nothing at all on NERC's website to say what's *really* gonna happen, either.

Love all the warning, too.

I think the organization that's responsible for the reliability of the entire USA power grid has some idea of the need for frequency stabilization when connecting new power sources to the grid. Not that it's relevant for what they are proposing - power plants already know how to sync up their generators to the grid and they don't care if it's 60.001 Hz or 60.002 Hz, they'll take that into account.

The magnitude of this frequency deviation is tiny: 20 minutes/year is about 0.004% - the power grid can fluctuate much more than that on a daily basis, but until now, it's always been corrected to keep the overall frequency at 60 Hz.

Believe it or not, the engineers that operate the network actually know what they are doing.
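For scale, that percentage can be checked with quick arithmetic (a sketch; the 20 minutes/year worst case is the figure from the summary):

```python
# Worst-case accumulated clock error from the summary: 20 minutes per year.
minutes_per_year = 365.25 * 24 * 60             # ~525,960 minutes
drift_fraction = 20 / minutes_per_year          # fractional frequency error
print(f"{drift_fraction * 100:.4f}%")           # prints 0.0038%
print(f"{60 * (1 + drift_fraction):.5f} Hz")    # an average of ~60.00228 Hz
```

A few parts in 100,000, which is indeed far smaller than the grid's short-term swings.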

The flow of power between tied AC networks is determined by phase, not voltage. To adjust the phase between your generator and that of a neighbor to whom you wish to send power you must run faster than he for long enough to accumulate the desired phase difference. Such adjustments are going on constantly throughout the network and conflict with the requirement to keep the average frequency at exactly 60Hz. Relaxing the latter requirement will make network operations easier and more reliable.
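The phase/power relationship described above is the classic power-angle formula; here is a sketch of it for an idealized lossless tie (the voltages and reactance are made-up illustrative numbers):

```python
import math

def power_transfer(v_send, v_recv, reactance, delta_deg):
    """Real power across an idealized lossless tie: P = V1*V2*sin(delta)/X."""
    return v_send * v_recv * math.sin(math.radians(delta_deg)) / reactance

# Illustrative (hypothetical) numbers: 345 kV at both ends, 100 ohm reactance.
for delta in (5, 10, 20):
    mw = power_transfer(345e3, 345e3, 100, delta) / 1e6
    print(f"phase angle {delta:2d} deg -> {mw:6.1f} MW")
```

More accumulated phase lead, more power flow toward the neighbor, which is why operators tweak frequency to steer power.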

Why? Timing isn't an issue [nist.gov]. The drift in phase due to the thermal expansion and contraction of the materials carrying the power is a bit of a nuisance, but using better-grade materials (making behaviour more predictable and more controllable) would solve some of that, and substations are quite capable of handling the marginal extra complexity of preventing errors from accumulating.

The added complexity is needed anyway as virtually every major blackout in history (including all the ones in recent times) have

Are they graded for -40 to +120 degrees Celsius, variable humidity, high EMI resistance? Are they guaranteed to retain accuracy in extreme conditions? Are they available on common commercially available SBCs that have the specific set of features we need as well? Alternatively, do they use one of the common protocols like SPI or CAN, without the need to add a whole lot of glue logic (both hardware and software) to make them readable from embedded Linux? Is the price comparable to, say, a decent GPS-based c

and doesn't understand what happens when you're even a bunch of *degrees* out of sync, much less a few decihertz.

They understand very well. This isn't about allowing generators to drift out of sync with each other in the short term. It's about not correcting the long-term variations in the grid as a whole.

Household clocks and coffeemakers seem unlikely to be a problem. Most of them nowadays aren't synced to house current, but use quartz oscillators. More likely problems would be old systems which have never been replaced because they've never needed to be; traffic light controllers are a reasonable example.

It will be a sad day when no one cares enough about language and communication to politely correct someone's grammar mistakes, and when those who try are shouted down by an angry, ignorant mob who are so insecure that they can't handle simple mistakes being pointed out.

Speaking of which, if you're really trying to help someone with these kinds of grammar issues, you should consider offering some easy ways to know when to use which version. 'Whom' is basically the same as 'him', while 'who' is basically the same as 'he'. By swapping in 'him' or 'he', most people can figure out which one is correct.

I find a sentence with incorrect grammar harder to read. I may just not understand it or may misinterpret it. Your computer will do the same when you program it. A basic principle of interface design is not to do something unexpected (like suddenly changing the frequency!).

The point being made is that the claim that the east coast runs several cycles per day faster than the west coast is bogus. If they're intertied without frequency conversion, they don't slip cycles at all. The east coast might run with a phase shift. But if it slips a cycle, this immediately precipitates the phase-thrash catastrophe that finished bringing down the east coast grid during the first "great northeast blackout".

The links convert AC to DC and then back to AC in order to eliminate the synchronous connection. My recollection is that the conversion process was close to 99% efficient. A more recent alternative to DC links is GE's phase-shifting transformer, which allows transfer of reactive as well as real power.

All of my clocks that matter synch themselves every few hours with the nearest WWV signal.

Many of my other timekeeping devices get their time hack from the net.

Anything free-running probably only has about a 30-second-per-day accuracy anyway (I don't own any Omegas, yet) and I really don't much care, because picking up a watch you haven't worn in a few months and setting it is part of the point of continuing to own analog technology at a time when I could put a solar-powered, radio-synchronized device on my

Casio sells a nice radio-synchronized digital watch for $38USD. Got one a couple of years ago and I love it. Automatically adjusts itself for DST. I use it to set the clocks in my house that don't automatically adjust for DST (mostly those on kitchen appliances).

The PET 3032 (back in 1978), free-running and unsynchronised, was capable of 30-seconds-per-year accuracy on a decent, clean power supply. That was, admittedly, about the absolute limit, but you could do it. A modern computer runs around 4 billion times as many cycles per second. More if you supercool then overclock it. A modern computer also has up to 16 cores per node and fairly typical clusters can have 64 nodes.

As for analog watches, the high-end mechanical watches you can buy off-the-shelf have a drift of around 1 second per day (30 times better than your estimate and 3 times better than any computer is capable of doing if the power supply will induce 3 seconds a day error). For free-running digital devices, a typical Casio quartz digital watch is around six nines accuracy (0.1 seconds drift a day), no synchronization required. Which means you can actually buy a cheap wristwatch that's 30x more accurate on timing than the best home computer you can get.

Sorry if I find the incompetence of hardware engineers a little hard to accept, I just prefer standards that, y'know, improve over time, not regress. 3 seconds a day drift is what vintage Swiss watches could do. I prefer modern technology to do better than the stuff that Huygens could do, not merely equal it.
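One way to put all these drift figures on a common footing is to convert them to parts per million (simple arithmetic, using the numbers from the posts above):

```python
def drift_ppm(seconds_per_day):
    """Express a daily timekeeping drift as a parts-per-million rate error."""
    return seconds_per_day / 86400 * 1e6

for name, drift in [("30 s/day estimate", 30.0),
                    ("good mechanical watch (1 s/day)", 1.0),
                    ("quartz watch (0.1 s/day)", 0.1)]:
    print(f"{name}: {drift_ppm(drift):.1f} ppm")
```

0.1 s/day works out to about 1.2 ppm, so "around six nines" is in the right ballpark.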

WWV is weak on the West Coast. My watch attempts to sync to WWV every night, but it is only successful about one day in three. This was initially frustrating because the device is supposed to adjust for DST but would fail to make the switch because it could not receive the signal on that day. Now the changeover day is different from what is programmed in, so it doesn't work anyway.

Whoever designed that watch did a crummy job. There is a bit in the WWVB packet that tells the clock the current DST status: http://en.wikipedia.org/wiki/WWVB [wikipedia.org]

I've designed a WWVB nixie clock before; there are definitely some serious design constraints. For starters, the signal is 60 kHz, which is "longwave". You need a lot of antenna to pull in a longwave signal with any real success. I used a very long loopstick antenna, and even then the antenna is directional, so the direction the clock is oriented _matters_. Additionally, lots of things generate noise (QRM) on these frequencies, so watch out for CRT televisions and computers and (in the case of nixie clocks) high-voltage switching supplies and multiplexed nixie tubes ionizing and de-ionizing neon hundreds of times a second. I built this thing because I am a ham and also into nixie tubes... but the truth is that WWVB is obsolete. Nowadays, the best way to get accurate time is via GPS. You can buy a GPS module to pull the time from for about $25-$35 to build into your clock.

But I am sad about this line frequency change. In the United States, one of the most accurate clock references is the 60 Hz power line. It's accurate to within about a minute a year at present. That is a LOT more accurate than a standard crystal. TCXOs (temperature-compensated crystal oscillators, which have temperature sensors in them and dynamically recalibrate) get to about a few seconds a year when they are brand new and then degrade from there with age. So, the long and the short of it is that if this change happens, and if it is a pretty noticeable hit to my clocks' accuracy, I'll be bodging little TCXO-controlled 60 Hz sine-wave generators into all of my clocks..:(

Irrelevant. Once a day the $10 clock I purchased at the drug store wirelessly synchronizes itself to the radio time signal (WWVB) emitted by the U.S. Atomic Clock in Fort Collins, Colorado. I can't believe this feature isn't in every clock -- Oh well, live and learn.

My PCs (and servers) all synchronize clocks over the network time protocol (NTP) and are connected to uninterrupted power supplies (UPS) which regulate the voltage and Hz. I can't believe anyone still connects computers directly to wall ou

It's really too bad that WWVB isn't broadcast with a cryptographic signature so that the time signal cannot be spoofed, thus allowing public clocks to be updated to a time signal that is verifiably correct. I can't believe anyone still trusts data that isn't cryptographically signed -- Oh well, live and learn.

Personally, I'd like to see some WWVB-style relays, for better signal strength in buildings and other areas that don't normally get good signal (particularly during the day).

I know that some places use CDMA radio receivers as a time source for NTP servers, as CDMA signals can penetrate buildings better than GPS and the WWVB signal (it's particularly useful when one can't get roof access) and CDMA spec requires time to be in sync with a very small error (10 microseconds, if I recall correctly, but I'm quite

Such a small change can have such a big impact.
I never really thought about how digital clocks keep track of time. This is a very interesting issue.
Of course, it could also turn into a boon for the industry, having everyone buy a clock that doesn't rely on "power timing".

I'm not sure what your point is? Of course it uses VOLTAGE, that is how electronic devices work, they use ELECTRICITY. Hence the name ELECTRonic.

The point of the crystal is that its accuracy depends mostly on the physical properties of the crystal, not the input voltage. Ok, voltage DOES have some effect, but it's minimal, like in the order of 1 part per billion per 1V change.

When the public power grid was being established, a clock manufacturer petitioned successfully to have the mains time kept in perfect 60 Hz synchrony for clocks to keep time off of. This was viewed by everyone as a Big Win. After that, all you needed to make a clock was an AC motor; really nobody needed to actually bother with a real clock anymore except the people at the power station, so "the grid was the clock" the way "the network is the computer".

Most digital clocks use a quartz oscillator as their frequency source. The mains power is not directly used for timing.

Most line-powered digital clocks use the line for the frequency reference and run from the quartz crystal reference only when there's a power outage. That's because the quartz crystal, absent oven stabilization and expensive calibration (or even WITH it), will drift by minutes per year while the line frequency has been kept stable by reference to the national bureau of standards. The osci

Even quartz-crystal line-powered clocks use the line for reference and the crystal for backup during power outages.

The line frequency has been kept stable by comparing it to the national standard clock and adjusting it when it has accumulated small errors. This makes it far more stable than any inexpensive quartz crystal with no oven.

A one part-per-million crystal oscillator will accumulate over half a minute of error per year. The power grid has been good for a fraction of a second in the time since it w
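The arithmetic behind that half-minute figure, assuming a 1 ppm oscillator:

```python
seconds_per_year = 365.25 * 86400   # ~31.6 million seconds
ppm = 1e-6                          # 1 part-per-million rate error
error_s = seconds_per_year * ppm
print(f"{error_s:.1f} seconds of accumulated error per year")  # ~31.6 s
```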

For the El Cheapo clocks, it's less expensive to couple the 60Hz from the power transformer, thru a resistor, into a pin on the clock IC, than to provide a quartz crystal & capacitors to said chip. Even if it's only a difference of $0.10 for each unit, multiply that by millions. Remember, just follow the money. Cheaper = more profit.

I'm much more concerned about my laptop power supply and the several hundred dollars I might have to shell out if this insanity fries my laptop. Ditto for the TV and the other appliances. The other appliances belong to the landlord; but it's still no fun to have to be around and have some tech service them.

If your laptop power supply is anything like all the ones I've owned, it won't care. According to the label (and testing done while I travel), mine works just fine on nominally 50-60Hz mains power. I imagine it wouldn't really care if you went from 45-65Hz, though I suspect it might get a bit annoyed if you were to go to 400Hz or something extreme.

When I was with the Military Sealift Command, all the "salty dogs" told me to invest, quite specifically, in a small UPS for my stateroom. They were quite adamant about never plugging your electronic gear straight into the outlets.

The first time I saw the overhead lights doing Saturday Night Fever, I was grateful for the advice. All my gear survived.

Most switching power supplies immediately convert AC power into DC with a simple bridge rectifier, then chop it with the switching transistor at tens to hundreds of kilohertz before running it through the power supply's transformer. You could run a switching power supply on just about any frequency, and even on DC power.

While you *could* design a switching power supply to run on nearly any voltage AC or DC, in practice, real-world computer power supplies tend to be pretty sensitive to power frequency and quality. I tried to run a small office off of a generator, and went through several generators including a Honda 2000W inverter model, a 10KVA gas generator (non-inverter), a 15KVA gas generator, and it wasn't until we got up to a 20KVA diesel generator that it gave stable enough power to run all of the computers. The dies

You need to do double power conversion. That's what I do for my small data center in order to be able to use a cheap generator; it is much more cost effective. I only do double conversion while on generator, since my grid power stability is well above average.

I have the generator recharging the batteries so it doesn't directly feed the hardware. A 12 V DC to 110 V AC inverter feeds the hardware.

Um, I think you need to narrow that down to "cheap electromechanical alarm clocks", unless I've seriously overlooked something. "Cheap" alarm clocks (from China, in particular, as though the distinction even matters anymore) now basically consist of a backlit LCD module glued to a piece of plastic, with a piezo buzzer for the alarm itself. The really, *really* hardcore-cheap ones don't even plug in -- they just ship with a coin cell, and aren't back

LED alarm clocks still use an LM8560. Go buy one, take it apart, and find the IC with the weird pin spacing (not standard 0.1"). That's the same IC that's been in use for over 30 years. And it still runs on mains frequency (it has a pin to select 50/60 Hz operation).

I'm much more concerned about my laptop power supply and the several hundred dollars I might have to shell out if this insanity fries my laptop. Ditto for the TV and the other appliances. The other appliances belong to the landlord; but it's still no fun to have to be around and have some tech service them.

There's little risk of damage to any device because of the frequency changing slightly. The article didn't mention any expensive electronics because line frequency has no effect on them whatsoever. They all use DC internally, so their power supplies must rectify the line current anyway.

And it's only a test/phase-in to see who complains. Only very old and cheap devices used the power line to clock themselves. If you really need those devices to be more accurate, they are easy to retrofit: externally with a filter brick, or internally by changing the mechanism to use a crystal, or by replacing the innards altogether.

Most clocks are not mains-timed. Most run on DC with a crystal oscillator providing the timebase, so the frequency of the AC line that powers them is irrelevant. Only electromechanical electric clocks might be in error.

And what makes you think that? The fact that it's 2011 and it's all microcontrolled now?

Go buy a brand new LED alarm clock. You will find it strangely similar to the one your dad (or grandpa, or you), had in the 1980s. Big LED display, snooze button, 9V battery compartment. Let me know if you find a crystal inside of those. You will find an LM8560 or one of its clones, and a wire from one of the transformer's legs through a diode to one of the chip's pins. Guess what?
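The counting scheme those chips rely on is simple enough to sketch. This is a hypothetical illustration, not the LM8560's actual internals; it just shows why a fast line makes a fast clock:

```python
def mains_clock_seconds(cycles, line_hz=60):
    """A line-referenced clock assumes every `line_hz` AC cycles is a second."""
    return cycles / line_hz

# One real day (86,400 s) on a line averaging 60.01 Hz instead of 60.00 Hz:
cycles = 60.01 * 86400
shown = mains_clock_seconds(cycles)
print(f"clock gains {shown - 86400:.1f} s per day")  # 14.4 s fast
```

The clock has no idea the line is fast; it just counts cycles, which is exactly why the grid's long-term average matters.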

I learned this as an army brat when my dad was stationed in Italy. Firstly, you had to use these shoebox-sized heavy transformers (that were passed on as soldiers moved back stateside) to transform their 220 V power to 120 V. But since they're on 50 hertz instead of 60 like here in the States, clocks would run noticeably slower. I suppose I could've asked my parents for a new clock, but I learned how to calculate the time offset and would reset the time (not the alarm) for when I needed to get up.

How many people pop right up at 0'dark thirty in the morning and start getting ready for school/work/drinking without any signal? Yeah, me neither. The alarm clock allowed for the suburbs by letting employees not have to cluster around public alarm clocks (church bells, factory whistles, etc.). If my alarm clock is late, I'm late; if it's early, I lose precious moments in bed (not as bad as the first case, but still irksome).

I hope this gets swatted down on behalf of every person who has to wake up before

How many people pop right up at 0'dark thirty in the morning and start getting ready for school/work/drinking without any signal?

Usually I can manage this very well. I have an alarm clock that functions as a backup in case I fail to wake up, but probably 9 times out of 10, I wake up on my own within about 5 minutes of when the alarm would go off if I left it on. This happens even though there is a variance of up to 2 hours or so in the exact time I typically go to bed on a night before I am working.

The real question is why do devices add the additional circuitry to count pulses off the mains grid rather than add additional circuitry to actually keep time?

A highly accurate crystal costs on the order of $1 in single quantities. An RTC costs $1-10 depending on feature set. If you already have a microcontroller, you don't need the RTC either. Why are clocks reliant on an external signal to keep time? How do they keep time when they run on battery, which is a common backup in every $5 alarm clock you can get?

As for streetlights... Really? How is this not a system which gets timing from some other central authority. I don't know much about street lights, but is this something that will only affect old small town streetlights, or do the shiny new modern LED powered ones in the city act independently enough that they aren't capable of contacting an NTP server?

I guess that this is a US issue, since you guys run a 60 Hz grid. Getting a correct sync from the European 50 Hz is probably harder/more expensive than using a crystal, because all the clocks that I have seen over here use quartz crystals to keep the time.

A highly accurate crystal costs on the order of $1 in single quantities.

And gains or loses perhaps a minute per year - while the grid has been good for a fraction of a second (adjusted when the powerhouse clocks drift more than that from the national standard committee of atomic clocks).

So that's why line-powered clocks use the line for the primary reference and the crystal oscillator to avoid having to reset them after a power failure (and to ensure you get your wake-up alarm). And why most appliances don't bother with a crystal at all. (Why spend extra to make them LESS accurate?)

Keeping accurate time is HARD. Distributing it by the power grid is EASY.

You said, "...why do devices add the additional circuitry to count pulses off the mains grid rather than add additional circuitry to actually keep time?"

Because adding a SINGLE RESISTOR from the power transformer to a pin on the clock chip is far cheaper than a quartz crystal and load/calibrating capacitors. Follow the money. When you're making a million units, even a few pennies, each, adds up to some big dollars.

What exactly is the benefit here? I kept waiting to see that somehow variable frequency power would travel farther or be more efficient, or at least save power companies some money (which I'm sure it does, or this wouldn't be happening).. but I can't imagine why or how.

Without explaining the benefit, this makes as much sense as ICANN opening up the TLDs.

Load on the grid shows up as mechanical resistance to the big spinning generators that control the frequency. If there is more load than generated supply, the generators slow and the frequency drops; more supply than load and the turbines spin the generators faster. Maintaining a balance of power is done by keeping the frequency at 60Hz.

That was easy enough when all power came from big generators, with predictable loads. But if you mandate photovoltaics and wind and other forms of power which vary in output, then things are a lot harder. The wind dies and a major wind farm drops a few hundred megawatts? The big generators can't respond quickly enough to keep frequency within its regulated range, so power companies have to install very expensive systems that can react faster.

Utilities are often legally mandated to buy power from renewable sources, but those renewable sources aren't held to any of the grid stability requirements. This ends up shifting an enormous burden of cost onto the utilities, who aren't happy with it. Loosening the grid frequency requirements is a way to make renewable but unreliable power less expensive.
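The load/frequency coupling described above can be illustrated with a toy swing-equation integration (all numbers invented; real grids have governors, droop control, and much more machinery):

```python
def frequency_after_loss(f0, h_seconds, p_gen, p_load, s_base, dt, steps):
    """Crude Euler integration of df/dt = f0 * (Pgen - Pload) / (2 * H * S)."""
    f = f0
    for _ in range(steps):
        f += f0 * (p_gen - p_load) / (2 * h_seconds * s_base) * dt
    return f

# Hypothetical island grid: 1000 MW base, inertia H = 5 s, suddenly
# 100 MW of wind drops out with no governor response.
f = frequency_after_loss(60.0, 5.0, 900e6, 1000e6, 1000e6, dt=0.1, steps=20)
print(f"{f:.2f} Hz after 2 s")  # frequency sags well below 60 Hz
```

In the toy model the frequency falls by over a hertz in two seconds, which is why fast-reacting reserves get expensive when supply is volatile.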

And that gentlemen is: a new stimulus package. Start re-buying all your crap.

BTW, I still do not get how an Ethernet port is still not an option on kitchen/home appliances; this whole problem would be gone if they could adjust their time from a time server. Of course an RTC module could help with this specific problem too. :)

In the 70's I developed a system to control the application of the light-sensitive coating onto 35mm rolls of film. This ran on a PDP-11 that used the mains cycle to keep time (20 ms interrupts with the UK's 50 Hz supply) and measured the coating by the amount of x-rays reflected by the silver halide in the coating each second... there were continual errors in the accuracy of the coating as the time approached midnight.

It turns out that the National Grid was legally required to maintain a 50Hz average from midnight to midnight and would add or subtract cycles in the last minutes of the day in order to meet this requirement.

Five or so years later I was working in the National Grid Control Centre and saw the 2 clocks, one with an independent time source and one running from the mains frequency. The aim of the controllers each night was to adjust the mains frequency to bring the two clocks in sync at midnight.
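The controllers' nightly task amounts to solving for the frequency that zeroes the accumulated clock error by midnight. An illustrative sketch (the numbers are invented):

```python
def correction_frequency(nominal_hz, time_error_s, seconds_remaining):
    """Frequency that cancels an accumulated mains-clock error
    (positive error = grid clock running fast) over the remaining interval."""
    return nominal_hz * (1 - time_error_s / seconds_remaining)

# Grid clock 4 seconds fast with 2 hours left: run slightly slow.
print(f"{correction_frequency(50.0, 4.0, 7200):.4f} Hz")  # 49.9722 Hz
```

Running 0.028 Hz low for two hours delivers 200 fewer cycles, which the mains clock reads as exactly 4 seconds.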

This is no big deal. What they are talking about here is the additive cycles in a day and not worrying about the compensation process for that.

Some basics:

Anything connected to the 60 Hz power is at 60 Hz; you cannot connect a 61 Hz generator to the grid. In addition, when you connect a generator to the grid, you have to adjust its phase as you bring it on line. If the phase angle does not line up, you get into a "tug of war" between multiple generation sources, and that doesn't work.

The sine wave coming out of one generator has to line up with the sine waves from the other generators.

What the article is talking about is the adjustment of the generating stations on the grid so that at the end of the day you get that exact number of cycles across the grid, not one more not one less. It is "really close" without tweaking but not exact.

It costs money to do those tweaks, to get the numbers on the money. That tweak right now really doesn't serve much purpose anymore.

Nothing exciting or interesting here; this is not Y2K nonsense. Move along...

It looks like this method for timekeeping was common in the 1930s. I work in electronics and never in my life have I seen a clock that works like this. I've been dismantling old equipment since I could hold a screwdriver. 35 years.

No. If you take apart a clock radio with 4x7-segment LED display, chances are that it has a 9v battery compartment at the bottom, and an LM8560 inside. It's an IC that's been used for over 30 years and still in production. No matter how cool, modern looking, flashy blue LED display it is, it has the same IC a brown 1980s clock with red LEDs had. It could be a clone or have a different name, but it is that chip.

Guess what: it takes voltage from the transformer, before rectification, into one of the pins. It also has another pin to set 50/60 Hz operation. And a SHITTY RC circuit for running off battery (useless, it's off by several minutes every hour).

Another thing: most electric things CAN'T be plugged in anywhere now. My grey-market XBOX 360 has a 120V power brick (I live in a 220V country). If you live in the USA, take a look at how much of the electronic stuff in your house doesn't even have a 220/110V switch. The only things you can pretty much plug in anywhere are chargers. Most other stuff can't, either by design (things with motors, or appliances you don't carry around) or by cost (most electronic stuff without a 110/220V switch).

I work in electronics and never in my life have I seen a clock that works like this. I've been dismantling old equipment since I could hold a screwdriver. 35 years.

Wow. So you've never seen a "classic" alarm clock: an analog clock with time-set knobs on the back, and usually a plunger that you push in or pull out to shut up/arm the annoying buzzer? Never seen an electric timer, a little box that plugs into the outlet that you plug something else into, with a big round wheel with mechanical detents that y

Honest question. How hard/expensive is it to design an electronic timekeeping device that isn't directly based on electrical current? If the issue here is that some devices are poorly/cheaply made, it would seem, on the face of it, that these clocks should be designed better, rather than designing the electrical grid around the clocks. Bit of the tail wagging the dog?

Or is this just a straw man that the electric utilities want to put forward in order to accomplish a change that has a more insidious effect on consumers?

It's easy and relatively cheap to make new devices use their own time base, but there's a huge installed base of devices that do use powerline frequency because up until now, powerline frequency has been adjusted to keep it very close to 60 hertz. So it's not a matter of the tail wagging the dog - the grid intentionally guaranteed a stable time base so clockmakers took advantage of it. It's more like the dog decided that it doesn't want to wag the tail anymore so he's having it removed.

It's not so much that electric grids are designed around clocks as much as it is that a good deal of electronic devices, including, most notably, alarm clocks, that use AC power have been designed around the fact that household current frequency is extremely uniform, and has been so for many years.

I'd be willing to lay bets that when they start messing with this, they are going to find all kinds of devices they never imagined could be affected to start failing... some quite catastrophically (ie... i

and they all show different times, some are already off by 20 minutes.

this is 2011, every one of these devices should be able to connect to the internet and synchronize time just like my PC does (should) so I can be on my merry little way.

I bought a z-wave thermostat thinking it would get its time from the controller automagically; not the case. Royally pisses me off.

Do you really want to apply firmware upgrades to all of your devices every time Congress decides to change Daylight Saving Time?

Would many people really take the time to program their wireless access point's WPA key into their coffee maker so it can sync the time?

Maybe extracting the time signal from cellular GSM signals would be easier and nearly ubiquitous. Apparently the local cell phone tower knows what timezone I'm in, so there'd be no need for devices to know, though I don't know how well that works o

I still have quite a few clocks that work like this in my home. I mean, which manufacturer is going to install a crystal in his device when he can get away with using the power line, counting AC cycles to mark off the seconds?

You can test the clocks you own by plugging them into a cheap power inverter (12 V DC to 110 V AC) that you can plug into the lighter socket in your car.

Get back to me when you are done. You should be surprised unless you specifically bought all the devices that have a clock in your h

I'm sure you or your dad or someone in your family had a LED alarm clock somewhere. Probably you even have one now. Take it apart, and show me where the crystal is. Nowhere. Google for LM8560, and stop assuming things.

And yes, they still make those clocks. ANY LED alarm clock you buy now WILL have that chip. And no crystal.

The North American Electric Reliability Corp. runs the nation's interlocking web of transmission lines and power plants. A June 14 company presentation spelled out the potential effects of the change: East Coast clocks may run as much as 20 minutes fast over a year, but West Coast clocks are only likely to be off by 8 minutes. In Texas, it's only an expected speedup of 2 minutes.

My question is - will West Coast clocks run 8 minutes fast, or 8 minutes slow? My guess is that they'll be slow.

The power meter in your home essentially measures electrons going back and forth in the wires, a.k.a. current.

If the period (the 60 Hz cycle) changes a bit, it shouldn't affect your bill. On the other hand, voltage variations might have more impact, since charge moved at a lower voltage carries less energy, and you are usually billed by the kWh.

Of course in the end, it depends on the internal workings of your meter, there are different types.
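The frequency-independence of the bill can be seen from the formula itself (idealized resistive load; a sketch, not how real meters are built):

```python
def energy_kwh(v_rms, i_rms, hours, power_factor=1.0):
    """Billed energy: RMS voltage x RMS current x power factor x time.
    Line frequency does not appear anywhere in the formula."""
    return v_rms * i_rms * power_factor * hours / 1000

# A 120 V, 10 A resistive load for 2 hours bills the same at 59.98 or 60.02 Hz.
print(f"{energy_kwh(120, 10, 2):.2f} kWh")  # 2.40 kWh
```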

Older B/W TV sets used 60 Hz as the vertical sync frequency, but the receiver synchronizes itself to the incoming TV signal, not the local powerline. The master sync signal source at the transmitter was a high-stability quartz oscillator, which generated the synchronizing signals for all the cameras and other studio equipment, as well as the transmitted sync signals.

When color came along, the vertical sync frequency shifted ever so slightly, to 59.94 Hz (and the horizontal shifted from 15.75 kHz to 15.734
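The exact color-era NTSC rates come from dividing the original 60 Hz and 15.75 kHz by 1.001:

```python
field_rate = 60 / 1.001      # vertical sync: ~59.9401 Hz
line_rate = 15750 / 1.001    # horizontal sync: ~15734.266 Hz
print(f"{field_rate:.4f} Hz vertical, {line_rate:.3f} Hz horizontal")
```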

Never heard about a line-frequency based alert system like that. Would be interested in reading more about it.

At one time there was the CONELRAD system, in which AM broadcast stations on designated frequencies would temporarily drop their carrier to trigger alarm receivers. These alarms were most commonly used by ham radio operators, who were required to go off the air immediately in the event of a civil defense alert.