I’ve used a cell phone with a lithium ion battery for a number of years and until today I was never aware of the very serious and potentially dangerous limitation of this battery technology.

If you charge a lithium-ion cell at temperatures below the freezing point of water, metallic lithium plates onto the cell’s anode. If the battery is then subjected to a sharp impact or a rapid charge rate, it can go into a thermal runaway condition, ending in an explosion or fire. Even if the battery never reaches thermal runaway self-destruct mode, the plating will still ruin it.

Do not charge your lithium ion battery if the temperature is below freezing!
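For what it’s worth, the rule is simple enough to express as the kind of guard a charger’s firmware might apply. This is a sketch with made-up names, not any real charger’s code; real chargers also taper current at low but above-freezing temperatures.

```python
# Hypothetical guard a charger controller might apply before enabling charge.
# The 0 C cutoff reflects the rule above; all names here are illustrative.

FREEZING_C = 0.0

def may_charge(cell_temp_c: float) -> bool:
    """Return True only if it is safe (per the rule above) to charge."""
    return cell_temp_c > FREEZING_C

for t in (-5.0, 0.0, 25.0):
    print(f"{t:+.1f} C -> charge allowed: {may_charge(t)}")
```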

A mysterious dust coating much of the Puget Sound region has baffled many people. Some have said it had a “volcanic” quality, similar to when Mt. St. Helens erupted. That speculation had me a bit nervous, since I’ve had dreams of massive volcanic activity along the Cascades all of my life.

My eyes have been bothering me a lot the last few weeks, gooped up, as have my nose and sinuses. That was another mystery.

Well, it turns out the mysterious white powder is tree pollen, which explains my eyes being gooped up. That’s something I usually get to deal with during tree pollen season, but usually slightly later in the year. I hope this also means it will be over sooner and I won’t be miserable through mid-June.

It has long been known that very small bits of matter, less than a few tens of thousands of atoms, can exhibit properties quite different from those of the bulk material. Only recently have we had the ability to create structures from materials that are substantially smaller than optical light wavelengths. We call these new materials nano-materials because they involve structures measured in nanometers. Visible wavelengths run from about 700–800 nm for deep red down to about 400 nm for violet; nano-materials have structures that may be considerably smaller than this.

Early applications of nano-materials mostly involved random assemblages of small structures. A good example is nano-particle based lithium-ion batteries, which can withstand much higher charge and discharge rates because of the high active surface area that nano-particle construction makes possible. Another is the new ultra-capacitors, capacitors with energy storage capacity rivaling batteries.

A newer class of material, which makes use of ordered arrangements of nanometer-scale structures, is now making possible materials with very exotic properties. These are referred to as meta-materials because they possess properties that no natural material could.

So far the most interesting properties of meta-materials are their optical ones. Because it is possible to create structures much smaller than the wavelength of visible light, materials can be made that appear homogeneous to the incident radiation, yet contain structures that act on that radiation like artificial atoms with very unusual properties.

Materials made up of a series of harmonically coupled resonators can exhibit a negative index of refraction: they cause light to bend the “wrong way.” With these materials it is possible to make a flat lens. What is more, it is possible to create a lens that exhibits super-resolution, the ability to resolve features smaller than the wavelength of the illuminating light.
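To see what “bending the wrong way” means numerically, here is a quick Snell’s-law sketch. The indices are illustrative values, not those of any particular meta-material:

```python
import math

# Snell's law: n1*sin(t1) = n2*sin(t2). With a negative-index medium the
# refracted angle comes out negative, i.e. the ray emerges on the SAME side
# of the normal ("the wrong way").

def refraction_angle_deg(n1: float, n2: float, incidence_deg: float) -> float:
    """Refraction angle in degrees for a ray hitting the n1/n2 interface."""
    t1 = math.radians(incidence_deg)
    return math.degrees(math.asin(n1 * math.sin(t1) / n2))

print(refraction_angle_deg(1.0, 1.5, 30.0))   # ordinary glass: positive angle
print(refraction_angle_deg(1.0, -1.0, 30.0))  # negative index: -30 degrees
```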

Another feature of lenses created from these materials is that they do not have an axis like a conventional curved lens, and as a result they can have an unlimited depth of field. This also has significant implications for photolithography. Presently, when you expose a silicon wafer, the image is sharp near the center but becomes increasingly blurry toward the edges, so on a wafer of CPUs, those nearest the center will be the highest quality, with defects increasing toward the edge. With these new lenses it will be possible to expose the entire wafer in correct focus.

I believe we’re just seeing the tip of the iceberg and more exciting developments can be expected.

I am uncomfortable with the plans to attempt to create black holes in particle accelerators. It is believed that, if created, they will be safe because they will rapidly evaporate via Hawking radiation.

The idea behind Hawking radiation is basically this: at the event horizon, pairs of virtual particles pop into existence such that one falls inside the horizon while the other heads out. Because the two cannot reunite within the time limit imposed by the Heisenberg uncertainty principle, the virtual particles become real. The one outside the horizon escapes, and the theory has it that the black hole decreases in mass by a corresponding amount.
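For scale, the standard (and, again, untested) Hawking formula gives an evaporation time that grows as the cube of the mass. Here is a sketch of that formula; the accelerator-scale mass plugged in is an illustrative assumption, not an actual collider figure:

```python
import math

# Hawking's predicted evaporation time for a black hole of mass M (kg):
#   t = 5120 * pi * G^2 * M^3 / (hbar * c^4)
# A sketch of the standard formula, not a claim that the effect is proven.

G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
HBAR = 1.055e-34   # reduced Planck constant, J s
C = 2.998e8        # speed of light, m/s

def evaporation_time_s(mass_kg: float) -> float:
    return 5120 * math.pi * G**2 * mass_kg**3 / (HBAR * C**4)

# An accelerator-made hole would have roughly the mass-energy of the
# colliding particles; 1e-23 kg is an illustrative TeV-ish scale.
print(f"1 kg hole:     {evaporation_time_s(1.0):.2e} s")
print(f"1e-23 kg hole: {evaporation_time_s(1e-23):.2e} s")
```

If the formula holds, accelerator-scale holes vanish almost instantly; the whole worry above is about what happens if it doesn’t hold.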

Now I do understand the argument that the black hole has to decrease in mass in order for conservation of mass-energy to hold, but I don’t understand how the virtual particle outside the event horizon becomes real, with mass, while the one inside subtracts from the mass of the black hole.

Given that this theory has never been tested and looks somewhat dicey at best, it seems less than wise to gamble the entire existence of our planet, and everybody on it, on the theory being right.

If Hawking radiation does not manifest but black holes do, then instead of evaporating, a black hole will suck up any particle within its gravitational grasp. For the first few nanoseconds that might be few, but with every particle it swallows, its mass, and thus its gravitational pull, will increase, and very shortly thereafter we’ll have a tiny black hole orbiting the Sun where the Earth, and every living thing upon it, had been a few seconds earlier.

Sometimes I wonder if alien versions of this experiment aren’t the reason the SETI radio dial is so dark. Actually, though, I know it’s not. The SETI protocol is designed in such a way that it is essentially impossible to detect and confirm the presence of an alien radio signal unless that signal is intentionally beamed at the Earth for several days continuously.

This is so because the SETI protocol requires that the signal be received repeatedly, but the distances involved are such that, even with enormous power, a signal transmitted by a distant civilization will only be received here on Earth if, by chance, a large directional antenna on their planet is aimed directly at a large directional antenna on our planet, which is aimed directly back at them.

Because of the spin of both planets, the orbits of both planets around their parent stars, and the motion of the stars within the galaxy relative to each other, there is essentially zero chance of this happening accidentally. So the fact that we have no confirmed radio signals from extraterrestrial civilizations is no real surprise and doesn’t rule out their existence.
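A back-of-envelope sketch of that argument: the chance that, at a random moment, a distant dish happens to point at us and ours at them. The beamwidth is an assumed illustrative figure; a large dish at microwave frequencies has a beam a small fraction of a degree across.

```python
import math

# Fraction of the sky a narrow antenna beam covers, and the chance both
# ends of the link are aligned at once. Beamwidth is illustrative only.

def beam_fraction_of_sky(beamwidth_deg: float) -> float:
    """Fraction of the full sphere covered by a cone of given full width."""
    half = math.radians(beamwidth_deg / 2)
    solid_angle = 2 * math.pi * (1 - math.cos(half))  # steradians
    return solid_angle / (4 * math.pi)

f = beam_fraction_of_sky(0.1)   # assumed 0.1-degree beam
both_aligned = f * f            # both ends must line up simultaneously
print(f"one dish on target: {f:.2e}")
print(f"both at once:       {both_aligned:.2e}")
```

Even before adding the repeated-confirmation requirement, the odds of an accidental alignment are vanishingly small.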

So maybe other civilizations have survived the black hole experiment, or maybe their civilizations wisely told their scientists: if you want to try this, go try it on an uninhabited planet in the outer reaches of your solar system, not on our home planet.

I am not surprised by these results. I’ve been of the opinion that Dark Matter, Dark Energy, Inflation, and the Cosmological Constant are all just fudge factors to try to make “the big bang theory” conform to observations. It’s like using a complicated lens to make a square peg look round so you can explain how it must have fit into the round hole.

Doppler measurements of Voyager I and II (and, most famously, Pioneer 10 and 11) showed an anomalous deceleration as they left the solar system. Because the Sun’s gravity acts upon them, they are expected to slowly decelerate as they leave the solar system, and they are, but not quite at the rate expected.

The mysterious force that caused this was attributed to yet another cosmological fudge factor, dark energy.

I suspect there may be a much more mundane explanation. To understand how the velocity of these craft is measured, one has to be familiar with Doppler shift. When a train blows its whistle as it approaches you, passes, and then heads away, you first hear the whistle at a higher pitch, then, as it moves past you and away, a lower pitch. The same is true for electromagnetic radiation.

The velocity of these craft is measured by measuring the Doppler shift of their transmitters. The transmitters are designed to be extremely stable to allow Doppler shift to be measured precisely.

In order to measure the Doppler shift precisely, you have to measure the frequency of the received signal precisely. This means you have to measure how many cycles occur in a specific period of time. This in turn requires that you can very accurately measure out a segment of time. That in turn requires a highly accurate clock.
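Here’s a sketch of that measurement chain. The downlink frequency and the size of the clock error are illustrative assumptions, not actual Deep Space Network figures:

```python
# Received frequency is inferred by counting cycles over a clock-timed gate,
# so a fractional clock error eps shifts the measured frequency, and hence
# the inferred velocity, by roughly c*eps. All numbers are illustrative.

C = 2.998e8          # speed of light, m/s
F_TX = 2.3e9         # Hz, assumed S-band downlink frequency

def inferred_velocity(f_received_hz: float) -> float:
    """Recession velocity implied by a one-way Doppler shift (non-relativistic)."""
    return C * (1 - f_received_hz / F_TX)

v_true = 17_000.0                   # m/s, roughly Voyager-like
f_rx = F_TX * (1 - v_true / C)      # what actually arrives

eps = 1e-10                         # assumed fractional clock error
f_measured = f_rx * (1 + eps)       # a fast clock makes frequency read high

print(f"true velocity:     {v_true:.3f} m/s")
print(f"inferred velocity: {inferred_velocity(f_measured):.3f} m/s")
print(f"bias ~ c*eps =     {C * eps:.3f} m/s")
```

The point: a clock error of one part in ten billion already biases the inferred velocity by centimeters per second, which is the general scale of the anomaly people argue about.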

The United States maintains a time standard and provides the reference to the world from the National Institute of Standards and Technology (NIST), formerly the National Bureau of Standards. There, highly accurate atomic clocks tick off the nanoseconds.

NASA, in order to measure frequencies accurately, as well as to time launches and other time-critical operations precisely, has to have accurate clocks, so they maintain their own time reference.

On the other side of the country, NewNet, an IRC network I founded in 1995, was trying to get a couple of servers to link up. IRC uses a protocol known as the Time Stamp protocol to determine, after a server split, who owns a particular nick or channel if there is a conflict.

In other words, let’s say you have two servers, A and B, on a network, and normally they communicate in real time. User “MyNick” connects to server A and is talking to people. Then a split occurs (server A temporarily loses communications with server B), and during that time another user logs into server B with the nick “MyNick”. Now the split is resolved, both servers resume communications, and there are two people on the network with the same nick. The servers use the Time Stamp protocol to determine which user had the nick first and force the other user to change his nick.

In order for the Time Stamp protocol to work, both servers must agree on what time it is. This requires that the servers be synchronized to some accurate external time source. If for some reason the clocks on two servers do not agree, they will not link.
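A minimal sketch of that rule. Real ircd TS implementations handle many more cases (equal timestamps, channel ownership, and so on), and the names here are illustrative:

```python
# The timestamp rule described above: after a rejoin, the user whose nick
# registration carries the OLDER timestamp keeps the nick.

from dataclasses import dataclass

@dataclass
class NickRecord:
    nick: str
    server: str
    timestamp: int  # seconds since epoch when the nick was taken

def resolve_collision(a: NickRecord, b: NickRecord) -> NickRecord:
    """Return the record that keeps the nick (the older claim wins)."""
    return a if a.timestamp < b.timestamp else b

user_a = NickRecord("MyNick", "server.a", 1_000_000)   # took the nick first
user_b = NickRecord("MyNick", "server.b", 1_000_050)   # joined during the split

winner = resolve_collision(user_a, user_b)
print(f"{winner.nick} stays with the user on {winner.server}")
```

And you can see why clock agreement matters: if server B’s clock were running even a few seconds behind, its timestamps would lie and the “wrong” user could keep the nick.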

So here I am working with another site trying to link their server to mine and they will not connect. The admin of the other server says to me, “Your clock must be off because mine is sync’d to NASA”. I say to him, “Mine is sync’d to NIST so it must be yours that is off.”

We both test our servers and find that, indeed, our servers’ clocks agree with the sources we synced them to. Then I checked NASA’s time server against NIST’s: there was a five-second difference.

Now, getting back to Voyager I and II… if the clock you are using to measure the received frequencies of their transmitters is inaccurate, then those frequency measurements will be inaccurate, and so will the velocities calculated from the resulting Doppler shift.

Maybe it’s just me, but before I would introduce a whole new fudge factor into cosmology, I’d check the clocks. Apparently that hasn’t occurred to NASA, however, and I guess this explains why shuttles explode, less than half the Mars missions actually make it to Mars, etc.

There is something I haven’t been able to get regarding Hawking radiation, and I think the scientific community, by trying to create black holes in a particle accelerator, might be making a serious mistake. Hawking radiation isn’t a proven phenomenon, it is only theoretical, and if it turns out not to exist, or not to work as expected, then we’re potentially in a world of hurt.

Here is what I don’t get about it. Hawking radiation works like this: a virtual particle pair forms right at the event horizon. One particle in the pair is sucked into the black hole; the other escapes and becomes real. This results in radiation from the black hole and, the theory has it, decreases the mass of the black hole.

There seem to me to be some assumptions here that I’m not so comfortable with. The first is that if a virtual particle in a virtual particle/anti-particle pair can’t unite with its mate, it becomes real. The second is that the other virtual particle, sucked into the black hole, will somehow decrease the mass of the black hole.

I understand the thinking: in order for conservation of mass-energy to be upheld, since radiation, that is matter/energy, is “leaving” the black hole, the hole must decrease in mass. But nothing actually “left” the black hole; what left was a virtual particle created at the event horizon.

So scientists are counting on these tiny black holes they hope to create in a particle accelerator evaporating via Hawking radiation. But what if one of these assumptions is wrong? What if they don’t lose mass, or Hawking radiation doesn’t happen? Then they’re going to keep sucking up surrounding matter and growing in mass until the whole planet is sucked in and we’re all dead. This sounds like a pretty high-risk scientific experiment to me, the kind of experiment that, if we have to do it, we should be doing somewhere in deep space.

I don’t know if it’s true or not, but I’ve read that when scientists set off the first hydrogen bomb, they really weren’t sure it wouldn’t detonate all the hydrogen in the water vapor in the atmosphere. We’re all here, so apparently it didn’t. I’m all for the advancement of science, but I question the wisdom of risking the entire human race on these types of experiments.

I have been advocating building advanced nuclear reactors capable of burning actinides to eliminate long term radioactive waste rather than attempting to store it in a national repository for 20,000 years.

Now a new problem emerges. It has been found that the radiation given off by radioactive waste breaks down the minerals containing it much faster than originally believed. This makes long-term storage essentially non-viable. Vitrification (glassification) of waste won’t work for the long term either, because the alpha particles emitted by the decaying radioactive elements break the glass down in just a few hundred years.

This means that the radioactive waste will not be contained at the site for more than a few hundred years. If we store radioactive waste instead of destroying it, we are creating a huge mess for future generations.

Building actinide-burning reactors to fission these elements will not only eliminate the long-term radioactive waste, leaving only fission products that will be safe in several hundred years rather than 20,000, it will also extract 20–30x as much energy as the initial Uranium or Plutonium yielded when it was first run through a fission reactor. Given our energy situation, this is another good reason to build them.

If we don’t build them, then commercial reactors will continue to fission Uranium and generate Plutonium, which is the most problematic element in nuclear waste. The other actinides, elements heavier than Uranium, are long-term waste problems as well. Plutonium-239 can be used as fuel in conventional thermal reactors (where neutrons are moderated to thermal speeds), but most of the other actinides can only be fissioned by fast neutrons, which is why special reactors are needed to destroy them.
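The decay arithmetic behind the several-hundred-versus-20,000-year comparison is simple: fraction remaining = 0.5 raised to (elapsed time / half-life). Here it is with two representative isotopes, a fission product and an actinide:

```python
# Cs-137 (a fission product) vs Pu-239 (an actinide) after various storage
# periods. Half-lives are standard published values.

def fraction_remaining(half_life_years: float, elapsed_years: float) -> float:
    return 0.5 ** (elapsed_years / half_life_years)

CS137_HL = 30.1       # years, fission product
PU239_HL = 24_100.0   # years, actinide

for elapsed in (300, 1_000, 20_000):
    cs = fraction_remaining(CS137_HL, elapsed)
    pu = fraction_remaining(PU239_HL, elapsed)
    print(f"after {elapsed:>6} y: Cs-137 {cs:.2e} remains, Pu-239 {pu:.3f} remains")
```

After 300 years the cesium is down by a factor of about a thousand, while more than 99% of the plutonium is still there, which is the whole storage problem in two numbers.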

Now, to be sure, there are safety issues associated with fast-flux reactors, and in general maintaining stability is more difficult than with a thermal reactor. In a thermal reactor, neutrons have to be moderated (slowed) before they can efficiently be absorbed by another nucleus and cause a fission, which lengthens the neutron lifetime and provides a built-in delay that limits the rate at which a reaction can ramp up. That built-in delay is not present in a fast-flux design.

There are other methods of providing stability in a fast-flux design. Fast flux reactors generally have negative temperature coefficients. That is, as the temperature rises, the reaction rate slows. This has the effect of providing negative feedback on the reaction rate thus stabilizing it.
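A toy iteration of that feedback loop, with all constants made up for illustration rather than taken from any real reactor design:

```python
# Negative-temperature-coefficient feedback: reactivity falls as temperature
# rises, so inserted excess reactivity drives power up only until heating
# cancels it and the system settles. All constants are illustrative.

RHO_INSERTED = 0.001   # excess reactivity inserted at t=0
ALPHA_T = 1e-5         # reactivity lost per degree of temperature rise
GAIN = 0.5             # power response per unit net reactivity per step
HEATING = 0.01         # degrees gained per step per unit power
COOLING = 0.05         # fractional heat removal per step
T_COOLANT = 300.0      # coolant temperature, kelvin

power, temp = 100.0, T_COOLANT
for step in range(20_000):
    net_rho = RHO_INSERTED - ALPHA_T * (temp - T_COOLANT)
    power *= 1.0 + GAIN * net_rho
    temp += HEATING * power - COOLING * (temp - T_COOLANT)

# Settles where heating has cancelled the inserted reactivity, i.e.
# temp - T_COOLANT -> RHO_INSERTED / ALPHA_T = 100 degrees.
print(f"power {power:.1f}, temperature {temp:.1f} K")
```

Instead of running away, the power climbs and then levels off at the point where the temperature rise has eaten the inserted reactivity.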

However, we can mitigate these dangers to a large degree by building fast-flux reactors in the Yucca Mountain facility intended for waste storage.

The bigger problems aren’t safety, they’re economic. It is simply cheaper to build conventional thermal reactors and keep running Uranium through them one-pass and generating huge amounts of waste. Uranium is cheap and since the waste isn’t being dealt with, nobody is presently bearing that expense in real terms.

If we are realistic about expense, we must build these reactors and burn these actinide wastes because there simply is no safe way to store them long term.

I think it is undeniable that the carbon dioxide that we are pumping into our atmosphere is having a warming effect on our planet. However, I do not believe it is the only cause. I do believe we should stop screwing our atmosphere up and not add to the natural warming.

The other planets are heating up as well. Pluto (well, it used to be a planet… poor Pluto) is heating even though it’s on a portion of its elliptical orbit taking it farther from our Sun. Mars is warming up. Jupiter is warming up. Some regions of Saturn (polar, just like Earth) appear to be warming. Neptune’s largest moon, Triton, is warming up. Neptune herself is warming up, but it’s spring on Neptune, so that’s to be expected. Uranus is warming up, but it’s also spring on Uranus. So many of the bodies farther from the Sun than Earth are also warming.

This last solar cycle was the most active in recorded history. We know from the past that solar minimums, extended periods of no sunspots, produce a substantial cooling effect. Is it then not reasonable to expect a highly active cycle to produce substantial warming? This next solar cycle is projected to be even stronger than the last.

Now, I have an idea of what might be causing all of this, and I’m not speaking of Richard Hoagland’s hyper-dimensional physics, which I don’t discount either, since I do not at all understand Mr. Hoagland’s theory. No, my theory is much less complex. It goes like this…

Neutrinos, it turns out, aren’t massless after all. They have a small mass and thus are affected by gravity. In the galactic plane, one would expect a higher neutrino flux, both because there are more stars and other nuclear reactions taking place in the plane than above or below it, and because gravity would concentrate neutrinos there to some small degree as well.

Richard Feynman said that all nuclear reactions are reversible. You can fuse four hydrogen nuclei into helium and get some energy in the form of gamma rays and neutrinos in the process. Or you can add energy, and neutrinos, and split a helium nucleus back into hydrogen, at least theoretically. The important point here is that neutrinos, in addition to being produced by nuclear reactions, should also be able to catalyze them.
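The energy bookkeeping for that fusion works out like this, using standard atomic masses: the mass difference times c² is what comes out as gamma rays and neutrinos, and reversing the reaction would require putting that same energy back in.

```python
# Mass defect of "four hydrogens into one helium", in unified mass units (u).
# 1 u = 931.494 MeV/c^2.

M_H1 = 1.007825    # hydrogen-1 atom, u
M_HE4 = 4.002602   # helium-4 atom, u
U_TO_MEV = 931.494

mass_defect = 4 * M_H1 - M_HE4
energy_mev = mass_defect * U_TO_MEV
print(f"mass defect: {mass_defect:.6f} u")
print(f"released:    {energy_mev:.1f} MeV per helium made")
```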

The Sun (and its planets) are nearing the galactic plane presently. The Sun bobs up and down through the galactic plane, and with every crossing it passes through a higher flux of neutrinos and other high-energy particles.

These particles in turn influence the rate of fission reactions in the earth and fusion reactions in the sun causing general heating of the entire solar system.

There seems to be some debate about how often this happens, every 62 million years, every 65 million years, different sources give different figures, but whatever the cycle is, we’re approaching the galactic plane now, and this could be part of the reason for solar-system-wide heating.

I’m not saying we should go out and burn more dead dinosaurs, not unless we like the climate on Venus, and while I like it warm, when lead melts, that is too warm.

A long, long time ago in a place far, far away, well, maybe not so far away, but it was a long time ago, I read a book entitled “Understanding Einstein.” I don’t remember who the author was, but it was a very interesting book because it went through Einstein’s various thought experiments and how they led to his theories of relativity.

After reading the book, and I at least thought I had comprehended it, I thought to myself: this is the same logic that led people to believe faster-than-sound travel was impossible, until we did it.

Moreover, things appeared to be mainly a perspective problem. That is, if you were able to watch the clock of someone receding from you, of course their clock would appear to be running slow, because the time it takes light to travel from the clock back to you grows ever longer as they travel away.

I ran into a bit of a logical non sequitur, because as I understand it, Einstein’s special relativity only cares about the speed of the motion relative to you; it doesn’t matter if it’s coming or going, the clock still runs slow as observed from your frame of reference. Trying to visualize it the way I was would have the clock running faster if someone was speeding toward me at a significant fraction of the speed of light.
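One way to put numbers on the coming-versus-going puzzle is to separate what is seen through a telescope (which includes the light-travel delay, i.e. relativistic Doppler) from the time dilation that remains once that delay is removed. A sketch using the standard formulas:

```python
import math

# What you SEE includes light-travel delay (relativistic Doppler); time
# dilation is what remains after removing it. An approaching clock visually
# appears fast even though it is dilated.

def gamma(beta: float) -> float:
    """Lorentz factor for speed beta (fraction of c)."""
    return 1.0 / math.sqrt(1.0 - beta * beta)

def visual_tick_rate(beta_toward_you: float) -> float:
    """Apparent clock rate including light travel time (Doppler factor)."""
    b = beta_toward_you
    return math.sqrt((1.0 + b) / (1.0 - b))

beta = 0.87  # fraction of c, the book's example speed
print(f"dilated rate (either direction): {1.0 / gamma(beta):.3f}x")
print(f"seen while approaching:          {visual_tick_rate(beta):.3f}x")
print(f"seen while receding:             {visual_tick_rate(-beta):.3f}x")
```

So the approaching clock really does look sped up, yet once you subtract the shrinking light-travel time, what’s left is the same slow-by-gamma rate in both directions, which is what the books mean.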

So I concluded I really couldn’t make sense of things at all. I read a bunch of physics books, and they all raised interesting implications as well.

You see, as I understood relativity, based upon my reading of Understanding Einstein, I was under the impression that if I watched someone moving relative to me at .87c, I would see their clock run at half speed, their ruler contract to half its length, and so on. But from their perspective, their clock would be running at normal speed and their ruler would be the correct length, while my clock would appear to be running slow and my ruler short.

I read various physics books and articles, like Alice in Quantumland, and they give the impression that the moving person experiences their own clock running slow and their own ruler short. If I was running into a non sequitur before, now I’m really lost. That makes no sense at all, because then everybody’s clocks would be out of whack equally and it would be as if nothing had happened. Alice in Quantumland also asserts that one second of time is equal to 186,000 miles of physical space. I don’t understand how that logic follows any more than one second of time being equal to about a fifth of a mile of physical space based upon the speed of sound.

Think about the classical example: someone takes a trip to a distant star system at a significant fraction of the speed of light. Say Alpha Centauri, 4.3 light-years away, at .99c. Measured from Earth the trip takes just over 4.3 years, but the traveler’s clock runs slow by a factor of about seven at that speed, so the traveler experiences only around seven months. Stretch the trip out far enough and the traveler returns to an Earth where generations have gone by, because relative to us their clock was going way slow.
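Running the standard time-dilation numbers for that trip, nothing exotic, just the usual formulas:

```python
import math

# One-way trip to Alpha Centauri at 0.99c: Earth-frame duration vs the
# traveler's elapsed (proper) time, which is shorter by the Lorentz factor.

def gamma(beta: float) -> float:
    """Lorentz factor for speed beta (fraction of c)."""
    return 1.0 / math.sqrt(1.0 - beta * beta)

DISTANCE_LY = 4.3
BETA = 0.99

earth_years = DISTANCE_LY / BETA
traveler_years = earth_years / gamma(BETA)
print(f"gamma:          {gamma(BETA):.2f}")
print(f"Earth frame:    {earth_years:.2f} years")
print(f"traveler frame: {traveler_years:.2f} years")
```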

But wait a minute, motion is relative, there is no absolute frame of reference. So someone traveling to Alpha Centauri at .99c is exactly equivalent to that someone remaining stationary while the Earth, Alpha Centauri, and the rest of the stars move toward the traveler at .99c. In that case, all the clocks on Earth would be the ones running slow compared to the traveler’s.

So if that is the case, if there is no absolute frame of reference, and I was under the impression that the Michelson-Morley experiment proved that, then from each other’s perspective, each other’s clocks ran slow, so everyone is really still in sync and nothing magical happened in terms of time going bonkers. Most of the traveler’s friends and relatives would still be alive, and hundreds of years would not have passed.

You see, none of this is really making good sense to me right now. My latest physics read, “The Dancing Wu Li Masters,” gets me back to “the person in motion relative to me: their clock is going to appear to run slow to me, their ruler short, their mass heavy,” but to them everything is normal. And since everything is relative, my clock, ruler, and mass will likewise appear affected in the same manner from their perspective.

Can someone help me make sense of this? I’m going to have a hard time sleeping now.