
Moving from the present world to one where renewable power dominates our energy economy is going to require some additional technologies. These may include storage, enhanced grid management, and demand-response power management, but they could also include something entirely new. Recently, a paper took a look at a technology I hadn't realized even existed.

The paper evaluates the potential of what its authors are terming "nighttime photovoltaic power," and the simplest way of thinking about it is "running solar panels in reverse": generating electricity by radiating energy away into space. The efficiency is nowhere near that of standard photovoltaics and can only approach it under unusual circumstances. But as the name implies, it can keep generating power long after the Sun goes down.

Photovoltaics at night

The easiest way to understand this tech is to think of a photovoltaic device in equilibrium with its environment. Here, incoming photons will occasionally liberate an electron, leaving behind a positively charged hole. These can then combine, radiating a photon back out of the device. When operating as a photovoltaic device, there's a large excess of photons coming in, producing a corresponding excess of electrons and holes that can then be harvested as electricity.

Now envision setting up the reverse situation: there are no incoming photons, and instead we ensure that an excess of photons is radiated away as infrared radiation. An excess of electrons and holes will now form, and those can again be harvested. Except in this case, we don't need any incoming light—instead, we need something to ensure that photons are extracted from the device at high efficiencies.

This latter requirement is something that researchers in a seemingly unrelated field have been working on. There's a "window" of infrared wavelengths that aren't absorbed by any of the gases in the atmosphere. A carefully crafted material can absorb radiation at a variety of wavelengths but only radiate it back out at wavelengths that fit through this atmospheric window. People have been building devices that operate this way as a means of passively cooling buildings, and there are even ways of treating wood so that it operates in this manner.


Potentially, by linking a photovoltaic material that emits photons at matching wavelengths to a passively radiating device, it's possible to build a "thermoradiative cell" that can extract some electricity while radiating photons out into space. In contexts where there's a large cooling demand or a lot of waste heat, this could be an added benefit to the cooling that needs to be done anyway. But can it actually work in any practical way?

Running the numbers

That's what Tristan Deppe and Jeremy Munday decided to find out. Their paper analyzes everything from the semiconductor bandgaps needed for this to work to the wavelengths normally present in the atmosphere.

The atmosphere is an issue because no material is completely transparent to all the wavelengths of light that are typically present in our atmosphere. That means that all the materials involved—the one radiating away the infrared and the one generating spare electrons—will absorb some of the stray photons, decreasing the overall efficiency of the device. The bandgap is an issue because the infrared wavelengths needed for this to work are relatively low-energy, so they need a much smaller bandgap than the silicon used in most photovoltaic cells. The researchers suggest that a family of mercury-cadmium-telluride (HgCdTe) compounds appears to have the right properties.
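As a rough sanity check on that bandgap requirement, photon energy and wavelength are related by E (eV) ≈ 1.24/λ (μm). A minimal sketch in Python, assuming an 8-13 μm atmospheric window (the paper works with the real transmittance spectrum, not a square window):

    # Photon energies across an assumed 8-13 um atmospheric window,
    # via E (eV) ~ 1.24 / wavelength (um).
    for lam_um in (8.0, 13.0):
        print(f"{lam_um:>4.0f} um -> {1.24 / lam_um:.3f} eV")
    # Roughly 0.10-0.16 eV, an order of magnitude below silicon's ~1.1 eV gap,
    # which is why a tunable narrow-gap alloy like HgCdTe is attractive.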

For all their initial calculations, Deppe and Munday assume that the Earth will provide a constant source of 27°C (about 80°F) heat to drive the system. Under these circumstances, they estimate that these systems built with current technology can produce somewhere in the neighborhood of two to eight watts per square meter, with the value changing a bit based on season and location.
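For a feel for the energy budget behind that estimate, here's a minimal sketch, assuming an ideal blackbody emitter and a square 8-13 μm window (neither of which is the paper's actual model), of how much flux a 300 K surface can radiate through the window at all:

    import numpy as np
    from scipy.integrate import quad

    h, c, kB = 6.626e-34, 2.998e8, 1.381e-23  # Planck, speed of light, Boltzmann (SI)

    def planck_exitance(lam, T):
        # Blackbody spectral radiant exitance, W/m^2 per meter of wavelength
        return (2 * np.pi * h * c**2 / lam**5) / np.expm1(h * c / (lam * kB * T))

    flux, _ = quad(planck_exitance, 8e-6, 13e-6, args=(300.0,))
    print(f"~{flux:.0f} W/m^2 radiated through the 8-13 um window at 300 K")

Roughly 150 W/m^2 leaves as heat under these assumptions; the paper's two to eight watts of electricity per square meter works out to a few percent of that flux.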

Compared to a photovoltaic panel, which typically hits in the area of 200 watts per square meter, that's pretty pathetic. But this could potentially generate 24 hours a day, which a photovoltaic panel most certainly cannot, so that offsets some of the disadvantage. Since much of the infrastructure would be identical for photovoltaic and thermoradiative cells, they also did an estimate of the benefits of a system that could automatically switch panels when the Sun goes down. They calculate that this would increase the power output of a solar farm by 12 percent.
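As a rough back-of-envelope version of that comparison, with illustrative numbers of my own rather than the paper's model:

    # Illustrative day/night energy comparison (assumed values, not the paper's)
    pv_daily = 200 * 5              # Wh/m^2/day: 200 W/m^2 for ~5 full-sun hours
    tr_lo, tr_hi = 2 * 12, 8 * 12   # Wh/m^2 over a 12-hour night at 2-8 W/m^2
    print(f"PV: {pv_daily} Wh/m^2/day; thermoradiative: {tr_lo}-{tr_hi} Wh/m^2/night")
    print(f"Relative gain: {tr_lo / pv_daily:.0%} to {tr_hi / pv_daily:.0%}")

That lands at a few percent up to roughly 10 percent extra, the same ballpark as the paper's 12 percent.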

Low, low prices

For those sorts of incremental gains, the thermoradiative cells would need to have incredibly low prices to make economic sense. But when there are co-benefits, like cutting down the cooling bills for buildings in hot climates, the economics might shift a bit.

The really interesting part, however, comes when Deppe and Munday look into what can be done with waste heat. With a starting heat source of 170°C, a thermoradiative cell could produce roughly the same number of watts per square meter as a solar panel. While that temperature is hot—it's well above the boiling point of water—it's within the range of what might be considered waste heat for industrial plants and steam generators. And if linked to things like geothermal or solar-thermal plants, it would provide a way of extracting more energy from an existing renewable resource.
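To see why temperature matters so much, the same window-flux sketch as above can be rerun at the waste-heat temperature (again assuming an ideal blackbody and a square 8-13 μm window):

    import numpy as np
    from scipy.integrate import quad

    h, c, kB = 6.626e-34, 2.998e8, 1.381e-23  # Planck, speed of light, Boltzmann (SI)

    def planck_exitance(lam, T):
        # Blackbody spectral radiant exitance, W/m^2 per meter of wavelength
        return (2 * np.pi * h * c**2 / lam**5) / np.expm1(h * c / (lam * kB * T))

    for T in (300.0, 443.0):  # ~27 C and ~170 C
        flux, _ = quad(planck_exitance, 8e-6, 13e-6, args=(T,))
        print(f"T = {T:.0f} K -> ~{flux:.0f} W/m^2 in the 8-13 um window")

The radiated flux grows several-fold between ambient and 170°C, which is why waste heat moves the potential output into solar-panel territory.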

For now, this is all purely hypothetical; many of the numbers in the paper are from idealized situations, and we have no idea how close to those we can get in the real world or how cheaply these devices could be manufactured. But the idea is a clever extension of existing technology that could potentially integrate with a variety of existing systems.

136 Reader Comments

The consolation is that if the cells are subject to environmental degradation, nobody will be poisoned by the heavy metals, because the units will stink so badly that nobody will come near them, or downwind of them. The lead miners in the high tellurium mines (Hello Telluride Colorado) used to smell so bad that the local ladies of the night charged a hefty premium for servicing those with "tellurium breath".

Huh, I was just handling an HgCdTe device yesterday. Didn't notice any particular odor. Perhaps the manufacturing site stinks, though?

The process involves dimethyl telluride, which smells similar to garlic.

But garlic smells good. The only reason people are offended by garlic breath is jealousy, a reminder of the good food they didn't get to eat.

I spent many years making sulfur containing compounds, and I worked a bit with selenium compounds. As long as it is volatile enough to have an odor, if it contains SH or S-methyl it will smell somewhere between bad and aggressively revoltingly disgusting. The few selenium analogues I worked with were an order of magnitude more nauseating than their sulfur counterparts. I never worked with tellurium compounds, but they have the reputation for being selenium "kicked up a notch".

Read about this somewhere else yesterday evening. Didn't understand how it worked. This article is much better for a layman science geek, who's bad at higher math.

The basic science is surprisingly easy to demonstrate. Get ahold of a common infrared thermometer, take it outside. Confirm that it's working by pointing at anything solid nearby. It should read the same as a regular thermometer for objects in the shade. Then point it at a patch of clear sky. It'll show a temperature that is off-the-scale below zero. Clouds read somewhat below freezing on the thermometer I use in my kitchen.

CMB is very similar to blackbody radiation. It was thermal emission from the primordial plasma once upon a time. It's now been stretched to microwave wavelengths. But the peak in the spectrum and the overall shape is not too dissimilar from a blackbody radiating at 2.8 K.

Granted, most of those hand-held guns aren't measuring microwaves. They're looking at IR. And if they are measuring anything at all, the ratios between short and long IR would send it off-the-scale low.

I spent many years making sulfur containing compounds, and I worked a bit with selenium compounds. As long as it is volatile enough to have an odor, if it contains SH or S-methyl it will smell somewhere between bad and aggressively revoltingly disgusting. The few selenium analogues I worked with were an order of magnitude more nauseating than their sulfur counterparts. I never worked with tellurium compounds, but they have the reputation for being selenium "kicked up a notch".

And I thought heptane smelled bad (think of a room full of people that just had a lot of chili).

Is it possible to have a panel that can run in "normal" mode at normal efficiencies during the day and then switch to this new mode at night?

Rotate it after sun down and put them on the back lol.

Given how many farms seem to be on trackers, that seems like an ideal idea...mount them back to back and just keep tracking the sun past the other side of the earth all night...

Newer solar panels are two sided; depending on ground reflectivity, that can be a 30% increase in efficiency. Could use them to recover energy from the cooling systems of a battery farm, but what's the advantage of this over, say, a Stirling engine to do the same thing?

Is it possible to have a panel that can run in "normal" mode at normal efficiencies during the day and then switch to this new mode at night?

It sounds like they may be suggesting that with this quote from above:

"Since much of the infrastructure would be identical for photovoltaic and thermoradiative cells, they also did an estimate of the benefits of a system that could automatically switch panels when the Sun goes down. They calculate that this would increase the power output of a solar farm by 12 percent."

Not sure if this means the tech could be added on top of existing tech or would have to be a separate farm with separate panels.

Rotate it after sun down and put them on the back lol.

Even simpler at medium or higher latitudes: place them in the shadow of the solar panel, pointing at the sky. So they don't block the panel, but they're still active all day.

Why do they need to point at the sky? If they don't require a direct light source, they could just be on the underside of normal panels, pointing at the ground.

The ground will be emitting IR photons, so you won't get net positive emission. When aimed at the sky, in bands that aren't absorbed (and hence emitted) by water, the photons can leave without an equal number coming back.

You don't have to aim as critically as a solar panel. You have the entire sky.

Likewise in the desert where the night time temperature is over 100F...

Know how I can tell you've never lived in the desert?

When I first moved to AZ, I flew into PHX around 10PM in the middle of June, and it was still 100F outside. You can imagine how dismayed I was in that moment, thinking "what have I done?!" But that's not endemic to the desert and has more to do with the heat island effect.

I wonder how its efficiency compares with other technologies that do this, like gas turbines, steam turbines, Peltier devices and Stirling engines.

The efficiency probably isn't as high, but the total power is probably higher.

The thermal efficiency of an engine is referenced to a Carnot engine between T-hot and T-cold. Having to radiate to space while still in the air makes that difference very, very small. (The panel can't get very cold).

The efficiency of a solar panel compares the electrical power vs. the incoming photon power, so the two efficiencies aren't on the same scale. Watts per square meter may be the more meaningful figure for the process here at the low temperatures described.

Know how I can tell you've never lived in the desert?

I did. For over 10 years. Night-time temps in the triple digits? No. Not even in Death Valley. But night-time temps around 80F? That happens from time to time during the summer. But more regular is mid 60s to 70s. Granted, during the summer, you won't see those temps until midnight or later. On a triple-digit day, it'll still be near 90 by 10pm.

Note: temperature gradients and rate of change depend on *where* in the desert you happen to be. Middle of nowhere, temps drop faster. Middle of a city? They drop much, much more slowly.

A nuclear powered aircraft carrier could dump its waste heat into these thermoradiative panels to harvest electricity, where there is no means to 'tap into the Earth'.

Likewise in the desert where the night time temperature is over 100F, and you're running AC, meaning your compressor can get to be 200F or 300F easy depending on how large it is and how much it runs.

So at night time you can harvest this heat to get a little bit of electricity and cool the condenser.

"in the desert where night time temperature is over 100F"

Rarely.

That is true. It takes the heat island effect, a smog layer trapping heat, and the Earth acting as a thermal battery after absorbing weeks of July sun for August to hit those temperatures, occasionally, and then only for a week.

That's why Hot August Nights was tolerable, when the weather is still warm enough at night for an outdoor festival. Through early July it got too cold at night.

It gets down to price to implement and maintain. Fixed panels are usually easy to maintain. Turbine systems... less so. Recovery of waste heat is a big topic, but even when it's "economical" it's sometimes too complicated or too far removed from a core business to want to jump into. That's why you don't see an algae plant at every refinery, or a distilled water plant at every nuclear reactor.

Actually, it looks like even regular thermocouples are going to be more efficient at this temperature differential.

Also, nuclear reactors are not really good places to produce distilled water for various reasons.

So, using the typical sky 10 W/m^2 from the paper and the 162,494 TWh 2017 energy production from Wikipedia, it would take ~1.8 million km^2 of this stuff to power the Earth. That's approximately the surface area of Texas, Montana, and California combined.
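The arithmetic checks out, roughly, assuming production is averaged over the year's 8,760 hours:

    energy_twh = 162_494                          # 2017 world energy production, TWh
    avg_power_w = energy_twh * 1e12 / 8760        # average power in watts
    area_km2 = avg_power_w / 10 / 1e6             # at 10 W/m^2; 1 km^2 = 1e6 m^2
    print(f"~{area_km2 / 1e6:.1f} million km^2")  # -> about 1.9 million km^2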

Well you're not powering the Earth. You're only powering the greatly reduced power consumption that occurs overnight. The idea is to offset the storage requirement typically associated with solar.

Isn't that why if you have any reasonable kind of scale you use a combination of wind and solar?

Wind doesn't usually occur at night either, so storage would still be needed for both solar and wind.

Most power generated from wind is generated at night.

Wow, then what's all the talk about sunrise creating thermal currents? I know there is some wind at night, but I thought in general it was stronger during the day. Maybe this only refers to surface winds? Good thing I'm not a weather guy.

Actually, it looks like even regular thermocouples are going to be more efficient at this temperature differential.

At what temperature differential? These "night sky" coolers are good for maybe 20K difference from ambient. A TEG is going to operate at a whopping 1% efficiency across that.
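A quick sanity check on that figure, assuming a thermoelectric figure of merit ZT ≈ 1 and the standard TEG efficiency formula:

    T_hot, T_cold = 300.0, 280.0  # ambient vs a plate night-sky-cooled by ~20 K
    carnot = 1 - T_cold / T_hot   # thermodynamic ceiling, ~6.7%
    ZT = 1.0                      # assumed figure of merit for a decent TEG
    m = (1 + ZT) ** 0.5
    eta = carnot * (m - 1) / (m + T_cold / T_hot)
    print(f"Carnot: {carnot:.1%}, TEG at ZT=1: {eta:.1%}")  # -> about 1.2%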

Rotate it after sun down and put them on the back lol.

Why would you have to rotate it?

At night time it's just as dark under the panel as on top.

No it's not. Not in the IR band you need. The ground has thermal emission in that band while the (cloudless) sky does not.

Unless I am misunderstanding the concept, by picking a part of the spectrum that air is mostly transparent to, these proposed panels are effectively radiating to space.

Which is pretty neat. If it works as theorized.

That's not theorized. That part actually works. The new idea here is that instead of using it as a passive cooling method matched to a heat engine (getting abysmal efficiency at that tiny differential), they're going to run it as a photovoltaic panel and produce electricity directly from the emission. Thus, they're no longer restricted by the Carnot limit.

Even simpler at medium or higher latitudes: place them in the shadow of the solar panel, pointing at the sky. So they don't block the panel, but they're still active all day.

Better than that. Solar panels are good at gathering solar radiation but most of it results in waste heat which then reduces the efficiency of the solar panel. So the solar panel would be a ready source of waste heat during the day. And having some radiative cooling would make the solar panel more efficient.

So you could just connect the two panels at an angle with something that provides heat transfer. After all, it makes no difference which part of the sky the heat radiates towards, so a fairly low angle might be fine. That would allow use even close to the equator.

And I thought heptane smelled bad (think of a room full of people that just had a lot of chili).

Heptane would not even make the JV team. Volatile phosphorus compounds, the ones which are not pyrophoric, have a sort of sweet edge to add to the nauseating, and volatile isocyanides have their own retch-inducing edge which is difficult to describe in words. A well-known chemist once noted that these small odious molecules all bind well to cuprous copper and theorized that smell detector proteins with a copper ligand would be found. I do not think he was proven correct, unfortunately.

When I first moved to AZ, I flew into PHX around 10PM in the middle of June, and it was still 100F outside. You can imagine how dismayed I was in that moment, thinking "what have I done?!" But that's not endemic to the desert and has more to do with the heat island effect.

Right. Most of the time, outside the city, after the sun sets the temp drops very sharply. Radiative cooling in full force.

I think what may be confusing a few people here (including myself) is that there seems to be some conflation of the benefits of radiative cooling and the benefits of using a photovoltaic material to convert infrared photons into electrical current.

I am actually struggling with what role the atmosphere plays here, which means that I am probably misunderstanding what is happening. The way I interpret this, you have a heat source that is emitting IR radiation at certain wavelengths that are tuned to be able to excite a photovoltaic material that can be used to generate a current. This way, you get to harvest some of the IR emissions that would previously go into the atmosphere and use them for electric current generation from the photovoltaic.

I can see where not having the wavelengths tuned to match the atmosphere's transparent windows would result in the atmosphere just absorbing the IR and emitting it back at the heat source and photovoltaic, which would result in far less cooling. But, why would this affect the power generation? Wouldn't it just increase the opportunity for the photovoltaic material to potentially emit another electron? Not good for cooling, but perhaps actually good for power generation.

In fact, the concept that this has much higher output at very high temperatures sounds like it would be a good thing to get that IR back to the source if your goal was simply more efficient power generation.

If I was correct in my understanding (not saying that I am), then you could just plant these things on the back of a regular solar panel and it wouldn't matter where it was pointing. You could also put them on any hot object. This makes me think that this is depending on the heat flux through the material, which would require one side to be cooler than the other, but I can't read the paper so I'm not sure and I didn't catch it from the article. (Please educate me how I'm wrong here. I'm legitimately interested in what is going on!)

I am actually struggling with what role the atmosphere plays here, which means that I am probably misunderstanding what is happening.

In normal PV, light strikes the cell, an electron is popped loose, and you get a potential in one direction. Alternatively, an electron and hole recombine, a photon is emitted, and you get a potential in the opposite direction. Obviously, you want only one of these scenarios to be happening at any given time, because both would cancel out and get you nowhere. The traditional PV operates in the first manner. Higher intensity light and lower heat increase the ratio of this first event versus the second one, increasing efficiency.

This PV operates in the second manner. It is tuned so the emissions happen in a certain IR wavelength (3.5-4μm, ~750K?). That wavelength is part of an infrared "window", in which there is very high atmospheric transmittance, which in turn means very low atmospheric emission. All other higher frequencies get reflected by a filter on the front of the cell, preventing absorption, while lower frequencies have too little energy to be of consequence. Thus, you have lots of events of the second type, few events of the first type, and your PV is essentially running in reverse.
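A quick Wien's-law check on that parenthetical guess; ~750 K is what you'd infer if 3.5-4 μm were the emission peak, though a 300 K cell still emits there, just well off its peak:

    b = 2898.0  # Wien displacement constant, um*K
    for T in (300.0, 750.0):
        print(f"T = {T:.0f} K -> peak emission near {b / T:.1f} um")
    # 750 K peaks near 3.9 um; a 300 K body's peak sits near 10 um, but its
    # spectrum still has a tail down at 3.5-4 um.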

In fact, the concept that this has much higher output at very high temperatures sounds like it would be a good thing to get that IR back to the source if your goal was simply more efficient power generation.

That's beginning to sound awfully like perpetual motion. If there's something on the opposite side that will absorb that frequency, then there's similarly something on the opposite side that will emit that frequency, and foil your plans.

I am actually struggling with what role the atmosphere plays here, which means that I am probably misunderstanding what is happening.

You have to think of the material being in equilibrium under most circumstances. If it is, nothing happens. If the atmosphere is reradiating/backscattering the wavelength that you are employing, the system will rapidly equilibrate, and even if you get an initial current it will die down as you rapidly get to the point where each time you emit a photon of the said frequency, you also absorb one (probabilistically speaking). So, overall nothing happens. If your emitter is outside the atmospheric windows, it emits just like all of the other wavelength emitters would, and it is fully capable of absorbing the bountiful photons at the same wavelength, at which point it will soon be in equilibrium, a null condition.

However, the particular wavelengths chosen are not absorbed by the atmosphere, which means that the atmosphere does not (re)emit them, so the local source of photons at that wavelength is very weak, and a lot of photons are emitted by the device for each one absorbed at that wavelength from the atmosphere. As people have pointed out, the total flux from stars is very low, because they are a very long way away, and deep space is much too cold to emit at these wavelengths, so the absence of atmospheric radiation is the deciding factor here.

I still struggle with this. I think part of my struggling is with this sentence from the article:

"Now envision setting up the reverse situation: there are no incoming photons, and instead we ensure that an excess of photons is radiated away as infrared radiation. An excess of electrons and holes will now form, and those can again be harvested. Except in this case, we don't need any incoming light—instead, we need something to ensure that photons are extracted from the device at high efficiencies."

I just don't think I ever realized that an electric potential could be created involving photon emission rather than absorption. I had always thought of photon emissions occurring only when an electron's energy state is reduced, such as it moving into a lower-energy shell or filling an outer shell hole. I guess, in this situation, heat is ultimately getting converted into both a displaced electron and an emitted IR photon, and that the problem comes from the fact that when you heat the atmosphere enough, it is emitting IR back at you at the same rate, which is then absorbed as both heat and displaced electrons in the opposite direction. Is that an even remotely correct interpretation?

That's not theorized. That part actually works. The new idea here is that instead of using it as a passive cooling method matched to a heat engine (getting abysmal efficiency at that tiny differential), they're going to run it as a photovoltaic panel and produce electricity directly from the emission. Thus, they're no longer restricted by the Carnot limit.

You're still limited by Carnot, but in this case, it's between the temperature of the radiator and empty space. That's a good delta.

One thought that occurs to me is that the emission of photons that create energy will go up if you get warmer. And that's what would happen if you used a heat cycle (or thermocouple) to extract some energy from the temperature difference between the "cold" radiator and ambient. The emission would go as T^4 in Kelvin, so it's a reasonable increase in photons. If the quantum efficiency stays constant, you could get more electrons from the PV and some from the thermal potential.
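For scale, the T^4 sensitivity is easy to see with the Stefan-Boltzmann law (total emission; the in-window fraction shifts with temperature too, which this ignores):

    sigma = 5.67e-8  # Stefan-Boltzmann constant, W/m^2/K^4
    for T in (280.0, 300.0, 320.0):
        print(f"{T:.0f} K -> {sigma * T**4:.0f} W/m^2 total blackbody emission")
    # Each 20 K step around ambient changes total emission by roughly 30 percent.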

I just don't think I ever realized that an electric potential could be created involving photon emission rather than absorption. I had always thought of photon emissions occurring only when an electron's energy state is reduced, such as it moving into a lower-energy shell or filling an outer shell hole. I guess, in this situation, heat is ultimately getting converted into both a displaced electron and an emitted IR photon, and that the problem comes from the fact that when you heat the atmosphere enough, it is emitting IR back at you at the same rate, which is then absorbed as both heat and displaced electrons in the opposite direction. Is that an even remotely correct interpretation?

Emission happens as a consequence of thermal conditions. There are holes opening and electrons being promoted to eventually fill those holes all the time. There are two ways to make an electrical potential: an excess of electrons or an excess of holes (negative and positive charge, respectively). So in this case, they're tuning a semiconductor gap to the photons that are out of balance. That makes for a convenient way to make a bunch of holes.