Heavier nuclei are less stable—that's something we all learned in school. Adding more nucleons (protons and neutrons) makes atoms more likely to break apart. It's one reason why elements heavier than plutonium haven't been found in nature. However, new nuclear physics calculations predict an "island of stability," where heavier nuclei are long-lived, but at much higher numbers of nucleons than exist naturally. The details of this island depend on the shell structure of the nucleus, something that is difficult to calculate directly.

Through experiments on nobelium (No) and lawrencium (Lr) nuclei described in a new Science paper, researchers have measured more details of high-mass nuclear structure, thus helping to pin down this shell structure. E. Minaya Ramirez and colleagues performed the first precision mass measurements, which allowed them to find the energy binding the nucleus together. While neither nobelium nor lawrencium is completely stable, they are relatively long-lived for superheavy elements, placing them near the island of stability. Knowing how they are put together reveals much about the very heavy elements, supporting predictions that the center of the island of stability lies at nuclei with 184 neutrons.

The number of protons (labeled Z) determines the chemical element, and the number of neutrons (written as N) determines the isotope. Thus, nobelium has Z=102 and lawrencium has Z=103, but each may have a wide range of N values. And both Z and N contribute to the stability of a nucleus. The shell model depicts this as a series of concentric layers or shells (in a simplified version at least). If a shell is incomplete, the nucleus is more likely to give up or accept a nucleon—or may be unstable, spontaneously decaying into another type of nucleus. Certain magic numbers of nucleons are particularly stable, such as the isotope of lead containing 126 neutrons (208Pb); these correspond to closed shells in the nucleus.

Isotopes are labeled with the symbol for the element from the periodic table (which reveals the number of protons) and the total number of nucleons A = Z + N as a prefix. So, the most common isotope of carbon on Earth is carbon-12, consisting of 6 protons and 6 neutrons, and is labeled 12C. The radioactive isotope used in carbon dating has two more neutrons, so it is labeled 14C.
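The labeling convention above can be sketched in a few lines of code. This is just a toy illustration; the symbol table only covers the elements mentioned in this article.

```python
# Sketch of isotope labeling: mass number A = Z + N as a prefix to the
# element symbol. Only the elements used in this article are included.
SYMBOLS = {6: "C", 82: "Pb", 102: "No", 103: "Lr"}

def isotope_label(Z, N):
    """Return a label like '12C' for a nucleus with Z protons and N neutrons."""
    return f"{Z + N}{SYMBOLS[Z]}"

print(isotope_label(6, 6))     # carbon-12
print(isotope_label(6, 8))     # carbon-14 (two more neutrons)
print(isotope_label(82, 126))  # lead-208, with its magic neutron number
```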

Beyond lead (elements with Z greater than 82), however, no known element is completely stable, nor is any expected to be from theoretical models. One way to think of this is like an inverted pyramid built of children's blocks: each complete layer (which keeps the pyramid balanced) represents a closed shell, but the higher the structure stands, the more likely it is to collapse, even if every layer is complete.

In this sense, stability in the superheavy elements (SHE, Z > 100) is relative: if an element is long-lived compared to isotopes of similar composition, it's stable enough. For example, one isotope of nobelium (with N=157) has a half-life of 58 minutes, while other isotopes may not last even 5 seconds. The island of stability is a region of the periodic table where elements' half-lives increase after a large group of elements with very short decay times.
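To put those half-lives in perspective, the fraction of a sample surviving after time t is 2^(−t/T½). A quick sketch comparing the 58-minute nobelium isotope with a hypothetical 5-second neighbor:

```python
# Fraction of a radioactive sample remaining after time t,
# given its half-life (both in the same units, here minutes).
def fraction_remaining(t_min, half_life_min):
    return 2 ** (-t_min / half_life_min)

# After one hour: a 58-minute isotope vs. a 5-second isotope.
print(fraction_remaining(60, 58))      # roughly half the sample survives
print(fraction_remaining(60, 5 / 60))  # effectively nothing survives
```

The contrast is stark: one hour is about one half-life for the long-lived isotope, but 720 half-lives for the short-lived one.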

However, due to the difficulty in producing SHEs and their short lives (even in the island of stability), it's hard to measure their masses. The mass of a nucleus is not simply the sum of the masses of the nucleons; instead, some of the mass is converted to binding energy (via Einstein's formula E = mc²). The amount of binding energy is related to the shell structure, so measuring it, even for isotopes whose shells are open (as is the case for nobelium and lawrencium), reveals what the magic number should be to close the shell.
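The mass-defect idea above can be made concrete with a light nucleus whose masses are well known. A minimal sketch, using carbon-12 as a check case (for superheavy isotopes like 255No, this is exactly the quantity the precision mass measurement pins down):

```python
# Binding energy from the mass defect: the mass of the separated parts
# minus the mass of the bound atom, converted to energy via E = mc^2.
# Masses are in atomic mass units (u); 1 u is equivalent to ~931.494 MeV.
M_H = 1.007825      # mass of a hydrogen atom (proton + electron), u
M_N = 1.008665      # mass of a free neutron, u
U_TO_MEV = 931.494  # energy equivalent of 1 u, MeV

def binding_energy_mev(Z, N, atomic_mass_u):
    """Total nuclear binding energy in MeV for an atom of the given mass."""
    mass_defect = Z * M_H + N * M_N - atomic_mass_u
    return mass_defect * U_TO_MEV

b = binding_energy_mev(6, 6, 12.0)  # 12C has an atomic mass of exactly 12 u
print(round(b, 1), "MeV total;", round(b / 12, 2), "MeV per nucleon")
```

About 0.8% of the constituents' mass has been converted to binding energy; it is small shell-driven variations in exactly this quantity that the No and Lr measurements probe.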

In the current experiment, the researchers confined the nuclei temporarily in a Penning trap, which stores moving charged particles using magnetic and electric fields. The motion within the trap reveals the mass (in a similar way to the mass spectrometry used in ordinary chemical applications). The authors measured the masses of 255No, 256Lr (both with N = 153), and 255Lr (N = 152) for the first time, revealing the details of the shell structure for those and similar isotopes. In particular, they found a gap in the shell structure at N = 152, meaning there is a jump in the amount of binding energy if another neutron is added.
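The core of the Penning-trap technique is that an ion's cyclotron frequency, ν_c = qB/(2πm), depends only on its charge, the field, and its mass, so a frequency measurement is a mass measurement. A rough illustration (the 7 T field is an assumed, typical value, not taken from the paper, and real experiments measure frequency ratios against a reference ion rather than absolute frequencies):

```python
# Illustrative cyclotron-frequency calculation for a trapped ion.
# nu_c = q * B / (2 * pi * m): heavier ions circle more slowly,
# so the frequency pins down the mass.
import math

Q = 1.602176634e-19  # elementary charge, C
U = 1.66053907e-27   # atomic mass unit, kg
B = 7.0              # magnetic field, T (assumed typical trap field)

def cyclotron_freq_hz(mass_u, charge=1):
    """True cyclotron frequency for an ion of given mass (u) and charge state."""
    return charge * Q * B / (2 * math.pi * mass_u * U)

print(f"{cyclotron_freq_hz(255):,.0f} Hz")  # a 255-u ion such as 255No+
```

A one-part-in-a-million shift in frequency corresponds to a one-part-in-a-million shift in mass, which is why the method resolves binding-energy differences of well under one nucleon mass.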

With precision mass measurements and knowledge of the shell structure, the researchers have made significant progress toward mapping the SHE region of the periodic table, including the island of stability. Their method can be extended to other similarly heavy isotopes, even ones that are produced in tiny quantities (like lawrencium). In turn, these results can be used to improve theoretical models, and lend strong support to the nuclear shell model.

What I would like to know is, is there a practical use for these or is the work mostly for testing theoretical models?

I was wondering the same thing...my suspicion is that there isn't a practical use, except for learning more and refining the models (which someday, who knows, may present something "useful"). From the article:

Quote:

In this sense, stability in the superheavy elements (SHE, Z > 100) is relative: if an element is long-lived compared to isotopes of similar composition, it's stable enough. For example, one isotope of nobelium (with N=157) has a half-life of 58 minutes, while other isotopes may not last even 5 seconds.

58 minutes is a "long" time for this element, but probably not long enough for us to do anything useful with the material itself. Plus, as "stable" as this element is, I suspect we wouldn't want to be standing next to it during that hour...

As a complete layman: considering the small number of particles involved (200-300 protons and neutrons) and the well understood strong nuclear force, isn't this something that could be well computer simulated? I'm sure the answer to this is "no", otherwise they would do it, so the question then becomes what makes this too complex to simulate? What other kinds of interactions and forces are involved?

Odds are that any knowledge gained either directly from these experiments or from refining the models they're addressing will be useful in things like fusion power research. If we can understand what's happening as we fuse nuclei and how the energy is flowing, we can much more finely direct our efforts at building devices to induce and control such reactions.

What I would like to know is, is there a practical use for these or is the work mostly for testing theoretical models?

Why do you care if there's a "practical use"?

I hate the phrase "practical use" in discussions like this...it always makes me think you just see dollars spit into the wind like so many helium atoms, electrons and photons...

I believe you misunderstood my phrasing of the question, although I tried to make it plain. I view testing of theoretical models as a practical thing to do. How about: are there any uses for these superheavy stable elements that people envision, beyond the testing of current theoretical models?

Perhaps you should be more generous in interpreting people's questions.

What I would like to know is, is there a practical use for these or is the work mostly for testing theoretical models?

Well, if we can bind these super heavy elements together with themselves or in an alloy that stabilizes them we might have a newer, thinner (although maybe just as heavy or heavier) armor or structural material. Or maybe it'll be something that can withstand super high heat & pressure so we make a deep earth exploration machine, or an even more advanced re-usable heat shielding on space vehicles.

As a complete layman: considering the small number of particles involved (200-300 protons and neutrons) and the well understood strong nuclear force, isn't this something that could be well computer simulated? I'm sure the answer to this is "no", otherwise they would do it, so the question then becomes what makes this too complex to simulate? What other kinds of interactions and forces are involved?

The math involved in QCD has no closed-form analytic solutions. Instead, we are stuck with really nasty numerical approximation methods.

Along with weather modeling and protein folding this is one of the most computationally difficult areas of science out there.

As a complete layman: considering the small number of particles involved (200-300 protons and neutrons) and the well understood strong nuclear force, isn't this something that could be well computer simulated? I'm sure the answer to this is "no", otherwise they would do it, so the question then becomes what makes this too complex to simulate? What other kinds of interactions and forces are involved?

Here's an explanation using my rough understanding: each proton and neutron is actually made up of three tightly bound quarks, which interact via the strong force. The shell model is a way of approximating what happens, but to do simulations to discover whether the shell model is right would require testing not just configurations predicted by the shell model, but myriad others that might work as well. And you can't just simulate the positions of the 600 to 900 quarks; you have to simulate the interactions between them. The quarks interact via gluons, and the gluons can interact with each other as well. And for any sort of reasonable accuracy, the simulations have to step time on roughly a femtosecond time scale (give or take a few orders of magnitude), all for the purpose of looking for stability that stretches into minutes and beyond.

And to give a sense of how hard all that is: calculating the mass of a single proton from first principles has only recently become feasible.

So the number of calculations required from first principles for the heavier elements is far beyond anything currently practical, thus the need for an approximate model and verification thereof.

What I would like to know is, is there a practical use for these or is the work mostly for testing theoretical models?

Well, if we can bind these super heavy elements together with themselves or in an alloy that stabilizes them we might have a newer, thinner (although maybe just as heavy or heavier) armor or structural material. Or maybe it'll be something that can withstand super high heat & pressure so we make a deep earth exploration machine, or an even more advanced re-usable heat shielding on space vehicles.

How about cannon shells that go super-critical on impact? Californium bullets anyone?

As a complete layman: considering the small number of particles involved (200-300 protons and neutrons) and the well understood strong nuclear force, isn't this something that could be well computer simulated? I'm sure the answer to this is "no", otherwise they would do it, so the question then becomes what makes this too complex to simulate? What other kinds of interactions and forces are involved?

Actually, there are far, far more particles involved: the protons and neutrons consist of quarks that also react to the strong force. There are also a large number of virtual quarks in the nuclei. Then there is the large number of gluons that mediate the strong force between the mix of real and virtual quarks, and these gluons also react with each other. On top of that, all interactions seem to follow quantum mechanics, with probabilities and non-locality. This gets difficult so quickly that we can't even simulate simple nuclei.

Interesting. During the late '90s I had an obsession with atomic physics. I very clearly remember reading several times about different theories hinting at the possibility of an "island of stability" for ultra-heavy atoms, presumably in the 400-500 u region. If only I were still halfway knowledgeable in this stuff today...

And when someone asks me "what is it useful for" about newly emerged theories or discoveries in mathematics and physics, I always answer: ask an engineer some decades from now. It's happened so often that mathematicians found some interesting structure, or had some interesting idea, that nobody had any use for. Some time later, physicists found it tremendously useful for describing some obscure, newly found phenomenon or some batshit-crazy theory. Another couple of years or decades later, that batshit-crazy weirdo stuff was common knowledge and started to get interesting for some high-end applications. Another couple of decades later, people have it in their toasters.

Just take a look at that couple-of-years-old BMW and search for the knock sensor on the engine. Oh, it is using ion-flow measurements on the spark plugs instead... Or just look at that wrist watch of yours, which hopefully is a classic Casio. I am pretty sure that the first time someone described his experiments on changing the orientation of some crystals by applying a voltage, nobody thought that LCDs would come out of Time's ass if you feed it that discovery.

As a complete layman: considering the small number of particles involved (200-300 protons and neutrons) and the well understood strong nuclear force, isn't this something that could be well computer simulated? I'm sure the answer to this is "no", otherwise they would do it, so the question then becomes what makes this too complex to simulate? What other kinds of interactions and forces are involved?

Other people have approached this from a physics/maths PoV, I'm going to do it from a computer science point of view.

It's not just that there are 200-300 nucleons; it's the fact that they all interact with each other in quite complex ways. This means that an accurate simulation would probably require polynomial time per tick. At a reasonable n^2, there are (for 250 nucleons) 62,500 pairwise interactions to evaluate per tick.

While that doesn't seem slow, each tick needs to be incredibly short in order to get an appropriately accurate simulation, think "the amount of time it takes a photon to cross the nucleus", which is on the order of femtoseconds. 1 minute of femtoseconds is 6*10^16 fs.

So let's say you have 250 nucleons, a simulation tick of 10 fs, and a computer capable of 2 PFLOPS. If each interaction requires 1000 floating-point operations (or equivalent) on average (which is laughably tiny), it would take about 31 ns per tick, and therefore 3,125,000 real seconds per simulated second, or ~36 days per simulated second. Which means an hour-long simulation would take ~130,200 days, or ~356 years.

Clearly not feasible. Even if you make the tick-length 100 times longer, you are still looking at 3.5 years of calculation time, on a supercomputer that sits comfortably near the top in world rankings.

Of course, as many people have pointed out, the nucleus of an atom is far more complicated anyway. If each nucleon has 3 sub-atomic particles, then you are looking at ~3000 years of calculation time. And then there is having to do all sorts of weirdness with the way different particles interact and all that... it's a mess...
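Redoing that back-of-envelope estimate in code, under exactly the same assumptions (250 nucleons, n^2 pairwise interactions at 1000 FLOPs each, a 10 fs tick, and a 2-PFLOPS machine):

```python
# Back-of-envelope cost of a naive nucleon-level simulation.
nucleons = 250
flops_per_pair = 1000         # assumed cost of one pairwise interaction
machine_flops = 2e15          # 2 PFLOPS
tick_fs = 10
ticks_per_sim_second = 1e15 / tick_fs  # 1 s = 1e15 fs

flops_per_tick = nucleons**2 * flops_per_pair
seconds_per_tick = flops_per_tick / machine_flops
real_per_sim_second = seconds_per_tick * ticks_per_sim_second

print(f"{seconds_per_tick:.2e} s of wall-clock time per tick")
print(f"{real_per_sim_second / 86400:.0f} days per simulated second")
print(f"{real_per_sim_second * 3600 / 86400 / 365:.0f} years per simulated hour")
```

Every input here is a stated assumption, so the point is the scaling, not the exact numbers: shaving a factor of 100 off the tick count still leaves years of supercomputer time for one simulated hour.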

As a complete layman: considering the small number of particles involved (200-300 protons and neutrons) and the well understood strong nuclear force, isn't this something that could be well computer simulated? I'm sure the answer to this is "no", otherwise they would do it, so the question then becomes what makes this too complex to simulate? What other kinds of interactions and forces are involved?

Computer simulations can only accurately describe things you understand well enough to have an accurate model for. If you aren't sure if your model is correct, simulation won't help.

What I would like to know is, is there a practical use for these or is the work mostly for testing theoretical models?

Why do you care if there's a "practical use"?

I hate the phrase "practical use" in discussions like this...it always makes me think you just see dollars spit into the wind like so many helium atoms, electrons and photons...

I believe you misunderstood my phrasing of the question, although I tried to make it plain. I view testing of theoretical models as a practical thing to do. How about: are there any uses for these superheavy stable elements that people envision, beyond the testing of current theoretical models?

Perhaps you should be more generous in interpreting people's questions.

Until an appreciable quantity of these elements is actually produced, their chemical and physical properties are speculative. Unless the properties were near-magical (which would mean new physics), practical applications would most likely be limited to refinement of existing theories and techniques. It's impossible to say what effect that might have, though.

What I would like to know is, is there a practical use for these or is the work mostly for testing theoretical models?

Well, if we can bind these super heavy elements together with themselves or in an alloy that stabilizes them we might have a newer, thinner (although maybe just as heavy or heavier) armor or structural material. Or maybe it'll be something that can withstand super high heat & pressure so we make a deep earth exploration machine, or an even more advanced re-usable heat shielding on space vehicles.

How about cannon shells that go super-critical on impact? Californium bullets anyone?

This is indeed not a new idea. This is just another incremental step towards sussing out the potential. Here is an article from a few years back ( http://phys.org/news173028810.html ).

Note that minutes, or tens of minutes, may not be the ceiling for half-lives. A much longer half-life, in years or decades and up, could make it much easier for them to be used in things like metallurgical applications.

As for their chemical properties: even with similarly filled shells, there are some minor but important variations that can occur as the whole atom scales up in size. We see this, for example, in the noble gases, as the heavier ones are able to chemically bond (in relatively rare examples) while the lighter noble gases are fully inert.

As a complete layman: considering the small number of particles involved (200-300 protons and neutrons) and the well understood strong nuclear force, isn't this something that could be well computer simulated? I'm sure the answer to this is "no", otherwise they would do it, so the question then becomes what makes this too complex to simulate? What other kinds of interactions and forces are involved?

The strong nuclear force is not well understood at all. QCD is strongly coupled at low energy (where "low" is very much a relative term here; I'm talking "not in a particle accelerator"), which is a fancy way to say that the series expansion describing the strong force at nuclear energies diverges badly (higher-order terms are more important than low-order terms). This is not to say our theory is broken (for example, the series expansion of e^(-1/x) around x = 0 does not converge to the function itself, but the function itself is obviously perfectly well behaved), but that our normal mathematical technique for extracting predictions from it, perturbation theory, fails. In contrast, the other two forces of the Standard Model are "weakly coupled" at all accessible energies and present no such difficulties.

No one has ever found a way to solve the equations of QCD in this "non-perturbative" regime. Feynman himself spent years making no progress before his death. It is one of the Holy Grails of theoretical particle physics and an automatic Nobel Prize. Compounding this, the strong force in the nucleus is actually a residual force, a van der Waals equivalent, making it even harder to simulate than direct quark-quark interactions.

Beyond lead (elements with Z greater than 82), however, no known element is completely stable, nor is any expected to be from theoretical models.

If the field has now started to come down on the side of quasi-stability, which would predict the absence of these elements in nature, my excitement starts to flag.

It is of course nice to have a sand box to develop nuclear physics in.

However, we already have plenty of cheaper radioactive elements, even if they aren't all perfectly suited to our technologies. (See the problems with RTGs for missions beyond Jupiter.) And synthetic chemistry can now make synthetic superheavy elements by clustering smaller atoms, IIRC.

But I'm guessing the jury is still out on the absolute stability question, the models are still work in development.

redtabsco wrote:

Well this is possibly the most pretentious tripe I've ever read on Ars, and that's really saying something.

Why "pretentious"? This is a venerable area of research, see the comments on the many decades of efforts, with plenty of progress, see the image of the special accelerator technology developed, and no one is pretending otherwise.

Actually, there are far, far more particles involved: the protons and neutrons consist of quarks that also react to the strong force. There are also a large number of virtual quarks in the nuclei. Then there is the large number of gluons that mediate the strong force between the mix of real and virtual quarks, and these gluons also react with each other.

So here goes: in his series of posts Strassler points out that the often used image of nucleons = 3 quarks + gluons had to be abandoned. Observations show that a nucleon is 3 unpaired quarks, a fluctuating number of quark-antiquark pairs, and gluons.

All these particles are relativistic, by the way, adding to the model difficulties. However, it can be phenomenologically modeled with "structure functions" (IIRC). This comes out of articles linked in the link above.

On the "virtual particle" view, straight from the article:

"It is often useful in technical calculations to think of the quark/anti-quark pairs as virtual particles — which are not particles at all, but fluctuations in the quark fields — associated with gluons (analogous to fluctuations in the electron field associated with photons — see Figures 5 and 6 of this article), and then to think of the gluons as fluctuations in the gluon field created in the vicinity of quarks (the way electrons create disturbances in the photon [i.e., electromagnetic] field — see Figures 3 and 4 of this article.).

One might try to take this technical point to be a physical one, and suggest that all the quark/anti-quark pairs and the gluons arise as virtual particles associated to one of the three quarks intrinsic to the proton.

But as I’ve emphasized in earlier posts and answers to comments, it isn’t really meaningful to establish a clear difference between virtual particles (which aren’t particles) and real particles (which are nicely behaved ripples in a quantum field) when you’re trying to understand something as complicated as a proton’s interior.

In any case, it turns out that it is hard to get this line of thinking to work for all of the gluons, but you might get away with thinking about the quark/anti-quark pairs this way. I can’t tell you this is entirely excluded, because theorists cannot calculate the proton’s interior well enough to be sure such a picture is false. Nor is there any clarifying measurement you could make to check whether this picture made sense. So you should view this as an unsettled point, a caveat to the article I wrote giving you evidence that the proton’s interior is complicated.

However, even if the minority view turned out to be (at least in some sense) right, my main point for you still would stand: that you have to think about the gluons and quark/anti-quark pairs inside the proton — whatever their origin — in order to understand anything about Large Hadron Collider [LHC] physics. At the LHC (or even many earlier experiments) there’s no point in worrying about whether you can or can’t someday find some way of thinking about the proton as only three quarks.* Any physics you do at the LHC will involve the gluons and the quark/anti-quark pairs in a big way; nothing of any interest arises merely from those three quarks of lore." [Strassler's italics, links removed, my * note]

Now you may argue that this is very specifically tied to "virtual particles out of the three unpaired quarks" and/or relativistic protons in accelerators (see my * note) instead of "virtual particles any which way". But I don't think so.

Instead, I think Strassler's view is this: virtual particles are short-lived vibrations in a particle field that didn't quite match the energy of the more long-lived resonances ("ripples"), aka particles. However, particle-antiparticle pairs fluctuate in and out of a vacuum, or, as here, inside an energetic nucleon, under uncertainty, and have a different dynamics.

For example, I think it would take a different amount of energy distribution to make a virtual particle real and stable (unpaired momenta) than particles out of fluctuating particle-antiparticle pairs (paired momenta). So when you equate them, they are equivalent in some ways but not in others.

-------------------------
* This was in reference to the Q&A starting that segment:

"Is there not perhaps some sense in which the proton should be thought of as three quarks, with all the gluons and quark-antiquark pairs being just an artifact, or an effect, of having accelerated the proton to high speed?

Well, although I have argued against this in my posts, and given you strong reasons, I should still tell you that actually there almost is such a sense, and this is part of why the issue of the proton’s structure has been debated for so long. I don’t think this is the right way to think about the problem, but I cannot promise you 100% that I’m right, and so, in fairness, I should present the dissenting minority view."

Hold on, NEW nuclear physics calculations? I heard about a predicted "island of stability" over a decade ago.

Especially since the "stability" of this island is only relative. When all the isotopes around this "island" have lifespans of only nanoseconds, then a handful of isotopes having lifespans of minutes is indeed noteworthy, but not useful in a practical sense. It is of high scientific interest, though.

Or just look at that wrist watch of yours, which hopefully is a classic Casio. I am pretty sure the first time someone has described his experiments about changing the orientation of some crystals by applying a voltage, nobody had thought that LCDs would come out of Time's ass if you feed it that discovery.

I laughed when I read that. I'm stealing this example from here on out. +1