How's that work out in stops? Are we gonna have a 6D or a D4 with 1 million ISO?

Don't hold your breath. That 500x number is certainly a "marketing number": in other words, it's probably true under certain special conditions, but wildly optimistic for most applications.
Today's camera sensors are already quite efficient. A perfectly efficient sensor, generating one electron for each incident photon, would have a QE (Quantum Efficiency) of 1.0 (100%). Today's sensors are as much as 45% efficient (Reference, see Table 2), so a bit over 2x (about 1 stop) is the theoretical maximum improvement that can be achieved through sensitivity alone.
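To put a number on that headroom, here's a quick back-of-the-envelope sketch (the 0.45 QE figure is the one cited above; the function name is just for illustration):

```python
import math

def stops_of_headroom(qe):
    """Max sensitivity gain, in stops, from raising a sensor's QE to a perfect 1.0."""
    return math.log2(1.0 / qe)

print(round(stops_of_headroom(0.45), 2))  # ~1.15 stops: barely more than one stop
```

So even a miracle sensor only buys you about one stop over today's best.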
The improvement might matter more in solar power, where better broad-band QE (IR as well as visible) means capturing and converting a far greater portion of the power falling on the chip, and rejecting a much smaller fraction of it as waste heat.

Geez....what are you, some kind of mechanical engineer? Oh, yeah...you are.

It's scary....I kind of get what you're saying.

What if this (and I'm just being hypothetical; I don't know anything about black silicon): what if the spikes mentioned in the article make the sensor more efficient, still keeping to the QE 1.0 limit, by somehow gathering the light more effectively than standard silicon does?

In other words, what if the spikes are kind of like a net that captures the light more efficiently, whereas the old silicon version skips light like a rock on water?

Does this make sense?

Yep, that's exactly what happens. But remember, if you are SpongeBob, trapping jellyfish with a net, the greatest number of jellyfish you can net is the total jellyfish in the swarm.
Same with photons; there are only so many photons to be captured, and a QE of 1.0 means you are capturing them all... there are no more left to capture. Today's sensors capture nearly half the total photons available, and that is quite a feat.
One real area left for improvement is getting rid of the color filter array. If we don't filter, we have to differentiate between colors *somehow* and that could be done a number of ways. Foveon currently does it by leveraging the fact that light of different wavelengths penetrates silicon to different depths. They use detectors positioned at different depths to do this. In effect, they are still filtering, but they can capture and use some of the "filtered" photons. The results have not measured up to the best Bayer pattern sensors, though.
Another, much more sophisticated way to do this would be to measure the energy of each photon detected, rather than just counting arrivals. The energy of the photon is its "color." This is *much* more difficult than it sounds.
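For what it's worth, the "energy is color" relationship is just Planck's E = hc/λ. A tiny sketch (constants rounded; the function name is mine):

```python
# A photon's energy determines its "color": E = h*c / wavelength
H = 6.626e-34   # Planck's constant, J*s
C = 2.998e8     # speed of light, m/s

def photon_energy_ev(wavelength_nm):
    """Energy of a single photon, converted from joules to electron-volts."""
    joules = H * C / (wavelength_nm * 1e-9)
    return joules / 1.602e-19  # J -> eV

# Blue photons carry more energy than red ones:
print(round(photon_energy_ev(450), 2))  # blue, ~2.76 eV
print(round(photon_energy_ev(650), 2))  # red,  ~1.91 eV
```

A sensor that could resolve per-photon energies that finely would need no filters at all.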

How do you calculate/measure the number of photons striking the sensor?

Given the measured intensity of light (in lux or some other quantifiable unit), the number of photons per unit area per second can be calculated. Measuring light intensity is like measuring the speed with which a bucket fills up in a downpour, for example 5 cm per hour. Knowing that rate, you can calculate how many drops per second are required to fill it at that rate. The drops per second is like the rate of photon arrival.
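To make that concrete, here's a simplified sketch. It assumes monochromatic light at 555 nm, where by the definition of the lumen 683 lux corresponds to 1 W/m², so lux converts to watts and watts to photons per second (real daylight is broadband, so this is only a rough estimate):

```python
H = 6.626e-34   # Planck's constant, J*s
C = 2.998e8     # speed of light, m/s

def photons_per_m2_per_s(lux, wavelength_nm=555.0):
    """Rough photon arrival rate, assuming monochromatic light.

    At 555 nm (the eye's peak sensitivity) 683 lux = 1 W/m^2,
    so we can convert lux -> watts -> photons per second.
    """
    watts_per_m2 = lux / 683.0
    energy_per_photon = H * C / (wavelength_nm * 1e-9)  # joules
    return watts_per_m2 / energy_per_photon

# Bright daylight (~100,000 lux) on one square metre:
print(f"{photons_per_m2_per_s(100_000):.2e}")  # roughly 4e20 photons/s
```

Scale that down to a single photosite a few microns across and a 1/1000 s exposure, and the counts get small enough that their randomness starts to show.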
We can take the raindrop analogy further... if we stick a note card out in a light rain shower for a short time, then pull it in and count the raindrops that have hit it, then repeat the test many times (each with the same "exposure time"), we'll find that the number of drops is not the same on all cards, but varies randomly. Photon arrival works *exactly* the same way; in fact, the statistical distribution of raindrop counts follows the same rules as photon counts do!
Now imagine that you expose an array of note cards to the rain shower and count the drops on each card. Same result as before: the counts vary. This is the same effect as "photon arrival noise," which is a very large component of the noise we see in very high ISO images... not the "fixed pattern" noise, but the noise that's different from frame to frame.
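You can reproduce the note-card experiment with a small simulation (pure Python; the card and drop counts are arbitrary). Scattering drops at random over many cards gives counts whose spread is about the square root of the mean, which is exactly the behavior of photon shot noise:

```python
import random
import statistics

random.seed(42)

# Scatter raindrops at random over a grid of note cards,
# then count the drops that landed on each card.
n_cards = 10_000
n_drops = 1_000_000            # an average of 100 drops per card
counts = [0] * n_cards
for _ in range(n_drops):
    counts[random.randrange(n_cards)] += 1

mean = statistics.mean(counts)
sd = statistics.stdev(counts)
print(f"mean {mean:.1f}, std dev {sd:.1f}")  # std dev ~ sqrt(mean), i.e. ~10
```

That sqrt relationship is why shot noise dominates at high ISO: fewer photons per pixel means the fluctuation is a larger *fraction* of the signal.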