This also relates to SETI and various speculation about aliens. We are still finding objects larger than Pluto in our own solar system, and we are still trying to pin down the size of our own galaxy to within an order of magnitude.

This energy scale is slightly off, in that the Milky Way's energy production comes in at a little over 3 × 10**47 joules per year. Note the point at 10**41 joules. A Type III civilization with 10**47 joules, one year's worth of energy, could accelerate one million Earth-sized planets to half the speed of light.

Probably the largest motivation for the study of hardware and software technologies aimed at actually implementing reversible computing is that they offer what is predicted to be the only potential way to improve the energy efficiency of computers beyond the fundamental von Neumann-Landauer limit of kT ln 2 energy dissipated per irreversible bit operation, where k is Boltzmann's constant of 1.38 × 10**-23 J/K, and T is the temperature of the environment into which unwanted entropy will be expelled.
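As a quick numerical check of the Landauer bound, the room-temperature value of T = 300 K below is an assumption, not a figure from the text:

```python
import math

k = 1.380649e-23   # Boltzmann's constant, J/K
T = 300.0          # assumed room-temperature environment, K

# Minimum energy dissipated per irreversible bit operation (kT ln 2)
e_min = k * T * math.log(2)

# Maximum irreversible bit operations per second at one watt
ops_per_watt = 1.0 / e_min

print(f"{e_min:.2e} J per irreversible bit operation")   # ~2.87e-21 J
print(f"{ops_per_watt:.2e} ops/s per watt")              # ~3.5e20
```

This is where the irreversible-computing figure of roughly 3.5 × 10**20 operations per second per watt comes from.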

But a wide variety of proposed reversible device technologies have been analyzed by physicists:
* With theoretical power-performance up to 10-12 orders of magnitude better than today's CMOS! [Note: the best that we are currently capable of theorizing as possible should be taken as the minimum capability of a galactic Kardashev Type III civilization.]
o Ultimate limits are unclear.

So instead of 3.5 × 10**20 operations per second per watt for irreversible computing, a likely minimum reversible computing capability for an advanced civilization is 10**29 operations per second per watt. And that civilization could well have figured out more tricks to go a lot higher.

If 10**19 operations per second is a likely upper bound to simulate a human mind, then one watt could be used to simulate 10 billion human mind equivalents. This would be simulating all the minds on earth for one watt using reasonably advanced reversible computing.
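The minds-per-watt claim is simple division, using the two figures above (both of which are the article's assumptions):

```python
REVERSIBLE_OPS_PER_WATT = 1e29   # assumed reversible-computing floor, ops/s per watt
OPS_PER_MIND = 1e19              # assumed upper bound to simulate one human mind, ops/s

minds_per_watt = REVERSIBLE_OPS_PER_WATT / OPS_PER_MIND
print(f"{minds_per_watt:.0e} simulated minds per watt")   # 1e+10, i.e. 10 billion
```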

A watt is one joule of energy per second.

3 × 10**47 joules per year for the galaxy.
31,556,926 seconds per year.

About 10**40 joules per second (watts) for the galaxy. (Note: Wikipedia has this at 10**37 watts, a watt also being a joule per second. The Wikipedia estimates predate the upward adjustments to the size of the Milky Way.)
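The division works out as follows:

```python
GALAXY_JOULES_PER_YEAR = 3e47    # the article's figure for Milky Way energy production
SECONDS_PER_YEAR = 31_556_926

galaxy_watts = GALAXY_JOULES_PER_YEAR / SECONDS_PER_YEAR
print(f"{galaxy_watts:.1e} W")   # ~9.5e39 W, on the order of 10**40
```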

Because all the simulated minds would run on a reversible computing basis, everything in the advanced simulated existence would, in time, be reversible. Anything could be undone and redone or altered in the high-fidelity simulated environment.

Carl Sagan suggested adding another dimension: the information available to the civilization. He assigned the letter A to represent 10**6 unique bits of information (less than any recorded human culture) and each successive letter to represent an order of magnitude increase, so that a level Z civilization would have 10**31 bits. In this classification, 1973 Earth was a 0.7 H civilization, with access to 10**13 bits of information. Sagan believed that no civilization has yet reached level Z, conjecturing that so much unique information would exceed that of all the intelligent species in a galactic supercluster, and observing that the universe is not old enough to effectively exchange information over larger distances. The information and energy axes are independent, so even a level Z civilization would not need to be Kardashev Type III.

Funding Tens of Millions to Develop sub-systems, Prove and Develop Concepts
The Skylon single stage to orbit spaceplane will cost $10-30 billion to develop.

LAPCAT II is a follow-on to the successful LAPCAT I program aimed at technology development for long range hypersonic civil flight. Two vehicle concepts (the precooled turbojet powered Mach 5 and scramjet powered Mach 8) are retained in the new program. LAPCAT II will be completed over a 4 year period and involves 16 partners. The total budget is €10.4M with the EU contributing €7.4M. Reaction Engines is managing the Mach 5 A2 work package which will include intake, nozzle, combustion chamber and vehicle structure studies.

The Skylon reduces the required mass ratio by improving the engine specific impulse, operating in an airbreathing mode in the early stages of the flight – up to around Mach 5.5 and an altitude of 25 kilometres – before the engine switches to a pure rocket mode to complete the ascent to orbit. This makes a very significant difference: a pure rocket needs to achieve an equivalent velocity of around 9200 m/sec (7700 m/sec orbital speed and 1500 m/sec in various trajectory losses), whereas the airbreathing mode absorbs about 1500 m/sec of the orbital speed and 1200 m/sec of the trajectory losses, so the pure rocket phase needs to provide only 6500 m/sec. This increases the minimum mass ratio (final mass over initial mass) from 0.13 to 0.21. Even with the extra engine mass required for airbreathing operation, this is a far more achievable target.
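These figures can be sanity-checked against the Tsiolkovsky rocket equation. A minimal sketch, assuming a vacuum Isp of about 450 seconds for the rocket mode (my assumption; the text gives no engine figures):

```python
import math

g0 = 9.80665          # standard gravity, m/s^2
isp = 450.0           # assumed vacuum Isp for the rocket mode, s (not from the text)
v_e = g0 * isp        # effective exhaust velocity, m/s

def final_mass_fraction(delta_v):
    """Tsiolkovsky: m_final / m_initial = exp(-delta_v / v_e)."""
    return math.exp(-delta_v / v_e)

print(final_mass_fraction(9200.0))   # pure rocket: ~0.12 (text quotes 0.13)
print(final_mass_fraction(6500.0))   # with airbreathing assist: ~0.23 (text quotes 0.21)
```

The small differences from the quoted 0.13 and 0.21 reflect the assumed Isp and trajectory-loss bookkeeping.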

Pyrosic, from Pyromeral Systems in France, is the leading candidate material for the skin of the vehicle. Pyrosic is a structural high-temperature composite which retains good mechanical strength at temperatures as high as 1000 °C (1800 °F). The material is incombustible and does not release smoke or gas when exposed to heat or fire.

The Skylon development is estimated to take 9.5 years and cost €9,518 M (2004 prices). The development programme will produce a vehicle with a life of 200 flights, a launch abort probability of 1% and a vehicle loss probability of 0.005%. Assuming a production run of 30 vehicles, each vehicle would cost about €565 M. In operation it should be capable of achieving a recurring launch cost of €6.9 M per flight or less.
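As a rough sanity check (the straight-line amortization is my framing, not the report's accounting), the development and production costs can be spread over the fleet's lifetime flights:

```python
DEV_COST_M = 9518.0        # development cost, M (2004 prices)
VEHICLES = 30              # assumed production run
FLIGHTS_PER_VEHICLE = 200  # design life
VEHICLE_COST_M = 565.0     # production cost per vehicle, M

dev_per_flight = DEV_COST_M / (VEHICLES * FLIGHTS_PER_VEHICLE)   # ~1.6 M per flight
airframe_per_flight = VEHICLE_COST_M / FLIGHTS_PER_VEHICLE       # ~2.8 M per flight

print(f"dev amortization: {dev_per_flight:.2f} M/flight")
print(f"airframe amortization: {airframe_per_flight:.2f} M/flight")
```

Both amortized costs together are below the quoted €6.9 M recurring launch cost, so the figures are at least mutually plausible.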

Precooler Heat Exchanger
The basis of the Reaction Engines manufacturing expertise was a research project conducted by the University of Bristol (completed in 2000), leading to successful testing of a heat exchanger with a heat transfer of nearly 1 gigawatt per m**3, well within the required performance of the Sabre precooler. This work has been extended with the successful manufacture of tubes in Inconel 718 which have a 0.88 mm bore and 40 µm wall thickness, which ensures good heat exchange properties without compromising physical strength. The tubes have been successfully creep tested at 200 bar and 720 °C, and for oxidation over 1,800 hours. The other key technology is the method of brazing the fine tubes into the feeder headers, which has also been successfully demonstrated. The precooler is designed to cool the engine airflow (about 400 kg/s) from intake recovered conditions (up to 1000 °C at Mach 5) down to about -140 °C prior to compression.

The objective of the test programme was to explore the flow stability and behaviour in an unusual rocket nozzle known as an Expansion Deflection Nozzle. In theory these should allow very large expansion nozzles suitable for operating in the vacuum of space to also perform stably and efficiently within an atmosphere. If so, then the performance of single stage to orbit launch vehicles like Skylon could be significantly improved.

Dr Taylor said: “Test programmes like this usually take years and cost hundreds of thousands of pounds, but we’ve done this in 18 months and on a relative shoestring. The whole team has done a really good job, but the guys from Airborne Engineering, who designed, manufactured and assembled the test rig, have worked near miracles.”

The STERN engine burns hydrogen and air, the same as Skylon’s Sabre engines when in air breathing mode. To maximise the engine’s life the test firings were held below the engine design values with measured thrust between 1500 and 2000 Newtons (1/5 tonne). Each firing was restricted to less than a second, as any longer and the (un-cooled) chamber walls could start to melt. This still provided sufficient time for the flow to stabilise, and all the required data to be obtained.

The initial results have confirmed that the flow within Expansion Deflection nozzles is stable across a very wide range of pressure ratios. In addition, broad agreement with computer simulations of their behaviour has also been achieved.

Static Test Rocket Incorporating Cooled Thrust Chamber (STRICT)

Project STRICT is the follow-on to Project STERN. This engine will be water cooled so that it can be run continuously (the STERN engine can only be fired for short durations). The objective of Project STRICT is to explore the stability of the exhaust flow and the heat input to the engine walls. Formal planning for Project STRICT started in the latter half of 2008.

The STRICT engine's design features are still being established, but its propellants and thrust level are likely to be similar to the STERN engine. The nozzle type is also the subject of a series of cold flow tests before a decision is made on whether to have an ED nozzle or a dual bell nozzle.

Contra-rotating turbines can reduce mass and increase efficiency when the speed of sound in the turbine working fluid is significantly greater than in the compressor. A four-stage contra-rotating turbine has been designed for the Scimitar Mach 5 cruise engine, which employs high-pressure helium in the power loop. The turbine aerodynamics were optimised by a genetic algorithm supported by CFD and FEM analysis. A full-linear-scale contra-rotating turbine rig has been designed and built which operates with reduced inlet temperature and pressure and a high-molecular-weight working fluid (argon). The rig generates 0.5% of the full-scale turbine power, while the flow Reynolds numbers are about 30% of full scale. The Reaction Engines B9 test facility has been modified to supply up to 12 kg/s of gaseous argon at 5 bar and 350 K to enable blowdown runs of about 5 minutes' duration. Turbine testing was scheduled to start in September 2008.

Old C Design Specs and New D Design

The technology programs carried out over the last two decades have shown that the technology assumptions in the HOTOL/Skylon projects are achievable. In many cases experimental investigation has led to further development in new areas so that greater performance may be available when the final design for Skylon is undertaken.

This extra performance can be used both to increase the system margins, reducing the technical risk, and to increase performance, with a consequent reduction in specific launch costs.

Skylon C specs are above

The next stage is a final set of research projects with substantially increased funding and a wider range of industrial partners. This will give high confidence in the technology assumptions used in the final design. Skylon configuration C1 is a relatively old design, and work is underway to incorporate various improvements into a new baseline: configuration D.

The new D1 vehicle will be slightly bigger, with a 25% increase in payload mass (from 12 tons to 15 tons to LEO). The payload bay is being resized and there is a revision to the mounting provisions and other payload support features. The new configuration will include the result of a number of technology development programmes almost certainly including an Expansion Deflection Nozzle in the Sabre Engine following the successful STERN Engine test programme.

A follow-on project called STRICT (Static Test Rocket Incorporating Cooled Thrust-chamber) will be started in early 2009. This engine will be water cooled, allowing extended firing runs (STERN is limited to half a second), and will explore heat transfer within the nozzle.

A competitive advantage lies with the Sabre airbreathing engine technology combined with the Skylon optimised airframe. A hybrid airbreathing/rocket engine, Sabre represents a huge advance over LACE technology. The design of Sabre evolved from liquid-air cycle engines (LACE), which have a single rocket combustion chamber with associated pumps, preburner and nozzle utilised in both modes. LACE engines employ the cooling capacity of the cryogenic liquid hydrogen fuel to liquefy incoming air prior to pumping. Unfortunately, this type of cycle necessitates very high fuel flow.

Cost to Orbit

The Skylon vehicle has been designed with the aim of achieving not less than 200 flights per vehicle. This seems a reasonable target for a first generation machine. Various scenarios have been examined but the uncertainty lies with assumptions on traffic growth.

At present the true launch cost of a typical 2-3 tonne spacecraft is about $150 million. Actual costs paid by customers vary from about one-third to one half of this due to the hidden subsidies on vehicle development, range maintenance, range activity and support infrastructure. For Skylon, if no growth occurred and all operators flew equal numbers of the current approximately 100 satellites per year using 30 in-service spaceplanes from 3 spaceports, the true launch cost would be about $40 million per flight [$1200/lb to LEO].

They expect mission costs to fall to about $10 million per launch for high-product-value cargo (e.g. communications satellites), $2-5 million for low-product-value cargo (e.g. science satellites), and costs per passenger to fall below $100k for tourists, once orbital facilities exist to accommodate them.

As high volume flights are performed the 15 ton payload to LEO orbit would be $2-10 million per launch which would be $66/lb to $330/lb.
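A quick check of the cost-per-pound figures, assuming the 15-ton payload is measured in short tons (30,000 lb), which is what makes the quoted numbers come out:

```python
PAYLOAD_LB = 15 * 2000   # 15 short tons to LEO, in pounds (assumed unit)

def cost_per_lb(launch_cost_usd):
    """Launch cost divided by payload mass in pounds."""
    return launch_cost_usd / PAYLOAD_LB

print(f"${cost_per_lb(2e6):.0f}/lb")    # ~$67/lb (the text rounds to $66)
print(f"${cost_per_lb(10e6):.0f}/lb")   # ~$333/lb (the text rounds to $330)
```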

Efficient hydrogen fuel with the Skylon design could achieve a specific impulse of 2,000-3,000 seconds, 4 to 7 times better than most chemical rockets.
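The "4 to 7 times" multiple checks out against an assumed typical chemical-rocket Isp of about 450 seconds (my assumption, roughly a hydrogen/oxygen upper stage):

```python
CHEMICAL_ISP = 450.0   # assumed typical chemical-rocket Isp, s (not from the text)

# Ratio of the quoted Skylon airbreathing Isp range to a chemical rocket
ratios = [isp / CHEMICAL_ISP for isp in (2000.0, 3000.0)]
print(f"{ratios[0]:.1f}x to {ratios[1]:.1f}x")   # ~4.4x to ~6.7x
```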

Rotovator tether systems or laser arrays could make the Mach 8+ segment of getting into orbit more efficient and cheaper.

REL has also recently completed an internally funded study into the launch aspects of Solar Power Satellites using Skylon. A solar power satellite (SPS) collects sunlight and transforms it into electrical energy. This is electromagnetically beamed back to Earth where it is collected by a large array of receivers for conversion to electricity.

Unlike conventional renewable energy (wind, wave), SPS is scalable to very high power levels (>5 GW) and can provide baseload power (load factor >90%). Although first proposed in 1968, the idea has not been implemented, due mainly to the very high cost of current expendable launchers. However, the report shows that the low cost, high reliability and rapid turnaround characteristics of Skylon would overcome this problem. The launch cost would be reduced by a factor of 50-100 compared to today's expendable launchers, and by about 5 compared to reusable TSTO rockets.

Development of the Skylon is fighting entrenched interests in ESA for support of Arianespace and existing expendable launchers.

I have concerns that companies purely reliant on a succession of research grant funding from a small number of agencies are inherently vulnerable to changes in bureaucratic climate. The primary "product" of Reaction Engines is test data and engineering R&D reports. The primary product of XCOR and Armadillo is fully working propulsion systems that customers can install and use in their own vehicles. UK NewSpace companies need tangible product that they can sell to paying customers NOW (not in ten years' time), and then reinvest the sales profits in developing new systems.

In some ways, the Bristol Spaceplanes/David Ashford solution of a two-stage winged RLV is "better" in that it affords more intermediate stages which can be monetized before developing a full-scale orbital RLV. The operability and performance may be somewhat lower, but the technical hurdles are also significantly lower (some questions remain about e.g. vehicle separation at supersonic/hypersonic speeds). However, Bristol Spaceplanes doesn't even have the resources to cross the first hurdle (the Ascender suborbital spaceplane) which IMO is desperately sad.

According to some unofficial reports the laptop will feature 2GB of memory, WiFi, fixed Ethernet, expandable memory, and consume just 2 watts of power. India could be swimming in cheap silicon within the next 6 months if the project can keep to schedule.

The Bionic Energy Harvester can produce enough power from a one-minute walk to juice a cell phone for 30 minutes. The Knee Generator can generate 7 watts, over three times more than the $10 laptop will need. At more than three pounds, the generator, called the Bionic Energy Harvester, is cumbersome. But thanks to lighter gears and a framework made of lightweight materials such as carbon fiber, the latest model, which is expected in the next year or so, should weigh closer to one pound. A microcomputer will replace the standalone computer that is wired to the unit in the current prototype.

One knee brace-wearing subject generated 54 watts of power by running in place.

The Energy Harvester will cost a lot more than the $10 laptop. However, the price for energy harvesters and low power devices will converge for interesting applications.

The $10 laptop has come out of the drawing board stage due to work put in by students of Vellore Institute of Technology, scientists in Indian Institute of Science, Bangalore, IIT-Madras and involvement of PSUs like Semiconductor Complex. “At this stage, the price is working out to be $20 but with mass production it is bound to come down,” R P Agarwal, secretary, higher education said.

Apart from questioning the technology of $100 laptops, the main reason for the HRD ministry's resistance to Negroponte's One Laptop Per Child (OLPC) project was the high and hidden cost, which worked out to $200.

The mission launch would also see demonstration of e-classroom, virtual laboratory and a better 'Sakshat' portal that was launched more than two years ago.

Larry Rome of the University of Pennsylvania has created the Lightning Pack, a backpack that captures energy from the natural up-and-down movement of your hips. As you walk, a bag bounces on a spring, which connects through gears to an electrical generator. Wires carry the electricity to your batteries or gadgets. The output is impressive: 20 watts, enough for nearly all portable devices, Rome says. But the bag is impractical for most people because it needs to weigh 80 pounds to generate 20 watts. (The heavier the load, the more mass that oscillates up and down, and the greater the kinetic energy potential.) The U.S. Marine Corps, however, is interested and has commissioned a pack for soldiers.

And that's the slightly creepy part. All that wireless power was there already. The TV transmission radio waves streaming past us all, pretty much all the time, contain that much energy. And that's not to mention the sea of FM radio transmissions, Wi-Fi signals, GSM signals from cellphones and towers, Bluetooth headsets...

The bending of the knee during walking is identified as one of the more promising opportunities to harvest energy from the body, because the leg muscles work against the motion of the leg for part of the gait cycle (while the leg is falling), during which time energy is turned into wasted heat. The authors estimate that up to 50 W could be harvested this way with little impact on the gait, although a large device would be required with well separated attachment points.

One study determined the available power from a specific implementation of an inertial microgenerator powered by human walking motion. Acceleration data were collected from male subjects walking on a treadmill and fed into a time-domain model of the generator in order to determine the available power. For a proof mass of 1 g and an available internal displacement of 5 mm, power outputs as high as 200 µW were calculated; this would appear to assume ideal harvester performance.

Ampere or Amp: An Ampere or an Amp is a unit of measurement for an electrical current. One amp is the amount of current produced by an electromotive force of one volt acting through the resistance of one ohm. Named for the French physicist André-Marie Ampère. The abbreviation for Amp is A, but its mathematical symbol is "I". Small currents are measured in milli-Amps, or thousandths of an Amp.

Amp Hour or Ampere-Hour: A unit of measurement of a battery's electrical storage capacity. Current multiplied by time in hours equals ampere-hours. One amp hour is equal to a current of one ampere flowing for one hour. Also, 1 amp hour is equal to 1,000 mAh.

Convert Watt to Amp: You can also convert Wh to Ah by rearranging the earlier equation: amp-hours equal watt-hours divided by volts.
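The equation this glossary entry originally pointed at did not survive into this copy; a minimal sketch of the standard Wh-to-Ah relationship (the 3.7 V example voltage is mine, a common lithium-cell nominal):

```python
def wh_to_ah(watt_hours, volts):
    """Since power P = V * I, energy Wh = V * Ah, so Ah = Wh / V."""
    return watt_hours / volts

def ah_to_mah(amp_hours):
    """1 Ah = 1,000 mAh."""
    return amp_hours * 1000.0

# Example: a 37 Wh pack at a nominal 3.7 V
capacity_ah = wh_to_ah(37.0, 3.7)
print(capacity_ah)              # ~10 Ah
print(ah_to_mah(capacity_ah))   # ~10,000 mAh
```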

Graphane crystal. This novel two-dimensional material is obtained from graphene (a monolayer of carbon atoms) by attaching hydrogen atoms (red) to each carbon atom (blue) in the crystal. Credit: University of Manchester

What is huge:
1. Graphene already has a lot of great properties. Strongest material. Very conductive.
2. Now graphene can be chemically modified to tune those properties even more. Being able to make something highly conductive or highly insulating means all kinds of electrical devices are possible.
3. This is opening the door to even more chemical modification.
4. Graphene has already been turned into proof-of-concept liquid crystal display devices (single pixel), quantum dots and transistors.

Professor Geim and Dr Novoselov have used hydrogen to modify highly conductive graphene into a new two-dimensional crystal - graphane.

The addition of a hydrogen atom on each of the carbon atoms in the graphene achieved the new material without altering or damaging the distinctive one-atom-thick ‘chicken wire’ construction itself.

But instead of being highly conductive, like graphene, the new substance graphane has insulating properties.

The researchers say the findings demonstrate that the material can be modified using chemistry - clearing the way for the discovery of further graphene-based chemical derivatives.

“Graphene is an excellent conductor and is tipped for many electronic applications,” said Dr Novoselov. “However it was tempting to look at ways to gain additional control of its electronic properties through the use of chemistry.

“Our work proves that this is a viable route and hopefully will open the floodgates for other graphene-based chemical derivatives. This should widen the possible applications dramatically.”

Professor Geim said: “The modern semiconductor industry makes use of the whole periodic table: from insulators to semiconductors to metals.

“But what if a single material is modified so that it covers the entire spectrum needed for electronic applications?

“Imagine a graphene wafer with all interconnects made from highly conductive, pristine graphene whereas other parts are modified chemically to become semiconductors and work as transistors.”

You take six pictures of your mixed up Rubik's Cube using the iPhone's camera — one photo per side. If you have an iPod Touch, you can also tap in the color combos manually. CubeCheater is able to recognize the placement of each colored square and generate a map of your cube. It then figures out the quickest path to solving the puzzle and gives you a set of easy-to-follow, step-by-step instructions.

The current guru of Rubik's computer algorithms is Herbert Kociemba, creator of the open source Cube Explorer software program. Kociemba's solver software is currently used by computer science students at universities to build cube-solving robots, some of which also use a camera and image-recognition tech to figure out the color patterns.

Rubik cube solution algorithms and instructions have been around almost as long as the cube. The algorithms have been improved over time. The breakthrough is using image recognition and a device and interfaces to make it far easier for people to use. The "human enhancement" is made more widely available and accessible and easy to implement.

There is also an iPod application for snipers. It is like a sniper caddy: it gives adjustments for wind and other conditions based on distance and direction to the target. This is normally the role of the spotter when snipers operate in pairs.

Breakthroughs in training physical techniques such as martial arts (like in The Matrix – Neo: "I know kung fu"), dance, gymnastics, swimming, etc. could come from advanced exoskeletons, robotics and prosthetics. An exoskeleton that can guide a person's movements fairly precisely and provide some resistance could be used to accelerate the development of proper form and muscle memory for the technique.

There is already motion capture training of robots to record and repeat tasks. Motion capture could be used and then loaded into robotic exoskeletons to provide human training assistance. A person could be moved like a puppet (go along for the ride) in performing the task.

Prosthetics are getting more advanced and connecting to the nerves and muscles of people.

January 28, 2009

Over the past two decades, nuclear power plants have achieved increasingly higher capacity factors with the same or greater levels of safety. The average capacity factor for U.S. plants in operation in 1980 was 56.3 percent; in 1990, 66 percent; and in 2007, 91.8 percent.

* Countries already at the top of the capacity factor league table – Finland, Germany, Belgium and Spain – are holding steady at around 90% or above.

* US capacity factors are now pushing up towards the best of the European fleet.

* The US capacity factor improvement in the 1990s has added the equivalent of 23 1000 MWe plants to the grid.

* Russian and Ukrainian reactors have made noteworthy output improvements.
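The "equivalent of 23 plants" figure can be roughly reproduced, assuming a US nuclear fleet of about 100 GWe of installed capacity (my assumption; the exact equivalence depends on the fleet size and the capacity factor assumed for the notional plants):

```python
FLEET_CAPACITY_GWE = 100.0   # assumed US installed nuclear capacity (not from the text)
CF_1990 = 0.66               # 1990 average capacity factor
CF_2007 = 0.918              # 2007 average capacity factor

# Extra average output from running the same fleet at a higher capacity factor
extra_avg_output = (CF_2007 - CF_1990) * FLEET_CAPACITY_GWE
print(f"~{extra_avg_output:.0f} GWe of average output gained")   # ~26 GWe
```

Counted as notional 1,000-MWe plants, that is in the same ballpark as the quoted 23.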

French capacity factors continue to be lower than technical and operational standards would indicate, due to the surplus nuclear capacity in France and the limitations on exporting the surplus to neighbouring countries.

So the reasons for higher capacity factors: better components, and the whole industry learning and sharing the best ways to run things. Unless your reactor design limits what you can do, getting to 85% is definitely doable, and that is why 90% is the average.

There was learning but the whole industry learned. Plus there was the spread of quality management programs throughout business. There was also component improvement as noted. Just as there has been improvement in the quality and life of components in cars and TVs since the 70s and 80s.

1967: 2 years and 24,000 miles
1970: reduced back to 1 year and 12,000 miles
1981: 2 years and 24,000 miles; Chrysler 5 years and 50,000 miles
1987: 3 years/unlimited miles to 6 years/60,000 miles
2003: up to 10 years/100,000 miles; average 4.7 years, 55,000 miles

A modern/new nuclear plant does not have to reinvent the wheel, and all of its people and management are not starting with a blank slate and 1975 tech and practices.

Michael Ignatieff, the new and assertive leader of the opposition Liberals, outlined a series of conditions on Wednesday for backing the budget, ensuring the document was likely to be adopted by Parliament.

The Liberals' confident tone is remarkable, given that in last October's election, under previous leader Stephane Dion, the party put in one of its worst ever performances.

One of the many reasons for the poor showing was the fact that the Liberals -- short of money, dispirited and keen to avoid a new election while trailing in the polls -- often backed the Conservatives on key confidence votes, giving an impression of weakness.

Although Ignatieff has much less time for compromise, he, like Dion before him, is again backing the government. The leaders of both other opposition parties mocked his decision and predicted Ignatieff would continue to find reasons to support Harper.

The Fusion Development Facility (FDF) Mission: Develop Fusion's Energy Applications. The Fusion Development Facility could also be the basis for a steady-state neutron source for transmuting nuclear waste from nuclear fission reactors.
• Develop the technology to make:
– Tritium
– Electricity
– Hydrogen
• By using conservative Advanced Tokamak physics to run steady-state and produce 100-250 MW fusion power:
– Modest energy gain (Q<5)
– Continuous operation for 30% of a year in 2-week periods
– Test materials with high neutron fluence (3-8 MW-yr/m2)
– Further develop all elements of Advanced Tokamak physics, qualifying them for an advanced-performance DEMO

The first main blanket fusion system could be sufficient for a 100 MW, 30-50% availability fusion neutron source. The second main blanket definitely seems good enough for transmutation purposes. So 8-13 years to get a fusion transmutation system seems possible. The new Super X divertor success in negating the need for better containment walls could accelerate the schedule.

Researchers at the University of East Anglia (UEA) have carried out the first comprehensive assessment of the relative merits of different geoengineering schemes in terms of their climate cooling potential. Their paper appears in the journal Atmospheric Chemistry and Physics Discussions. [H/T Green Car Congress] Using a mix of these geoengineering techniques and mitigating production of CO2 (by shifting energy production away from coal and oil to nuclear and renewables) are both required. Some have noted that the different geoengineering techniques should also be analyzed and compared on how well they help avoid disastrous changes in the chemistry of the ocean. Biochar sequestering would be such an ocean-chemistry-friendly method.

Climate geoengineering proposals seek to combat the effects of climate change—in particular to counteract the effects of increased CO2 in the atmosphere. There are two basic approaches proposed: reducing the atmospheric absorption of incoming solar (shortwave) radiation, or removing CO2 from the atmosphere and transferring it to long-lived reservoirs, thereby increasing outgoing longwave radiation.

A number of schemes have been suggested including nutrient fertilization of the oceans, cloud seeding, sunshades in space, stratospheric aerosol injections, and ocean pipes.

The critical metric for a geoengineering scheme is its effectiveness in cooling the climate; Tim Lenton and Nem Vaughan at UEA quantified that effectiveness in terms of radiative forcing potential.

Among their findings:

Enhancing carbon sinks could bring CO2 back to its pre-industrial level, but not before 2100—and only when combined with strong mitigation of CO2 emissions. Carbon cycle geoengineering carries less risk associated with failure.

Stratospheric aerosol injections and sunshades in space have by far the greatest potential to cool the climate back to pre-industrial temperatures by 2050. However, they also carry the most risk because they would have to be continually replenished and if deployment was suddenly stopped, extremely rapid warming could ensue.

Existing activities that add phosphorous to the ocean may have greater long-term carbon sequestration potential than deliberately adding iron or nitrogen.

On land, sequestering carbon in new forests and as bio-char (charcoal added back to the soil) have greater short-term cooling potential than ocean fertilization as well as benefits for soil fertility.

Air capture and storage shows the greatest potential, potentially combined with afforestation/reforestation and bio-char production.

Increasing the reflectivity of urban areas could reduce urban heat islands but will have minimal global effect.

The beneficial effects of some geo-engineering schemes have been exaggerated in the past and significant errors made in previous calculations.

Without mitigation, anthropogenic climate forcing could reach ~7 W/m^2 on the century timescale and remain greater than ~7 W/m^2 on the millennial scale. Even in a strong mitigation scenario, anthropogenic forcing will remain >1 W/m^2 for the rest of the millennium, exceeding 3 W/m^2 on the century timescale.

Climate geoengineering is best considered as a potential complement to the mitigation of CO2 emissions, rather than as an alternative to it. Strong mitigation could achieve the equivalent of up to -4 W/m^2 radiative forcing on the century timescale, relative to a worst-case scenario for rising CO2. However, to tackle the remaining 3 W/m^2, which are likely even in a best-case scenario of strongly mitigated CO2, a number of geoengineering options show promise.

* Shortwave options either increase the reflectivity (or albedo) of the Earth or block some percentage of incoming sunlight. These include megascale projects like orbiting mirrors and stratospheric sulphate, as well as more localized and prosaic methods like white rooftops and planting brighter (that is, more reflective) plants.

* Longwave options attempt to pull CO2 out of the atmosphere in order to slow warming. These include massive reforestation projects, "bio-char" production and storage, various air capture and filtering plans, and ocean biosphere manipulation with iron fertilization or phosphorus.

Most effective (again, strictly in terms of radiative impact) over this century would be space shields, stratospheric injection, or increasing cloud cover with seawater. Any one of these, combined with aggressive carbon emission reductions, could actually be enough to counteract global warming.

Next would be increasing desert albedo (essentially putting massive reflective sheets across the deserts of the world) or direct carbon capture and storage (ideally captured from burning biofuels). These would slow a global warming disaster, but wouldn't necessarily be enough to stop it. Biochar, reforestation, and increasing cropland and grassland albedo come in third, roughly half as effective as the previous proposals; the remaining methods would be less effective still.

Andy Revkin of the New York Times offers a "bathtub model" for too much CO2 and geoengineering.

Imagine the climate as a bathtub with both a running faucet and an open drain. As long as the amount of water coming from the faucet matches (on average) the capacity of the drain, the water level in the tub (that is, the carbon level in the atmosphere) remains stable. Over the course of the last couple of centuries, however, we've been turning up the water flow -- increasing atmospheric carbon concentrations -- first slowly, then more rapidly. At the same time, one consequence of our actions is that the drain itself is starting to get clogged -- that is, the various environmental carbon sinks and natural carbon cycle mechanisms are starting to fail. With more water coming into the tub, and a clogging drain, the inevitable result will be water spilling over the sides of the bathtub, a simple analogy for an environmental tipping point catastrophe.

With this model, we can see that simply slowing emissions to where they were (say) a couple of decades ago won't necessarily be enough to stop spillover, if the carbon input is still faster than the carbon sinks can handle.

That said, our efforts at stopping this catastrophe have -- rightly -- focused on reducing the water flowing from the faucet (cutting carbon emissions) as much as possible. But the flow of the water is still filling the tub faster than we can turn the faucet knob (we're far from getting carbon emissions to below carbon sink capacity). Without something big happening, we're still going to see a disaster.

The shortwave geoengineering proposals, by blocking some of the incoming heat from the Sun, are the equivalent here to building up the sides of the tub with plastic sheets. The tub will be able to hold more water, although if the sheeting fails, the resulting spillover will be even worse than what would have happened absent geoengineering.

The longwave geoengineering proposals, by increasing carbon capture, are the equivalent here to clearing out the drain, or even drilling a few holes in the bottom of the tub (let's assume that just goes to the drain, too). The water will leave the tub faster, but you may have to drill a lot of holes to have the impact you need -- and drilling too many holes could itself be ruinous.
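The bathtub analogy above is a stock-and-flow model, and it can be sketched in a few lines of code. All numbers here are illustrative, not climate data; the point is only that a growing faucet plus a clogging drain guarantees overflow unless the faucet is turned down hard:

```python
# Toy stock-and-flow model of the bathtub analogy: water level = atmospheric
# carbon, faucet = emissions, drain = carbon sinks. Illustrative numbers only.

def year_of_overflow(years, faucet=10.0, drain=8.0,
                     faucet_growth=0.2, drain_clog=0.05, rim=100.0):
    """Return the first year the tub overflows, or None if it never does."""
    level = 0.0
    for year in range(years):
        level = max(0.0, level + faucet - drain)   # net inflow raises the level
        faucet = max(0.0, faucet + faucet_growth)  # emissions keep growing
        drain = max(0.0, drain - drain_clog)       # sinks slowly clog
        if level > rim:
            return year
    return None

print(year_of_overflow(100))                      # business as usual: overflow
print(year_of_overflow(100, faucet_growth=-0.3))  # hard mitigation: no overflow
```

Negative `faucet_growth` models cutting emissions; even with the drain still clogging, the tub never fills. The shortwave proposals correspond to raising `rim`, the longwave ones to increasing `drain`.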

January 27, 2009

One of cancer's cleverest tricks is its ability to hide from the immune system. A new approach to cancer treatment called immunotherapy could spare patients at least some of the grueling battery of chemotherapy treatments by retraining the body's own defenders--the cells of the immune system--to recognize and destroy tumors. Now researchers at Harvard University have developed a simple way to do this inside the body: a polymer implant attracts and trains immune-system cells to go after cancer.

The experimental approach has shown great success in animal studies, increasing the survival rate of mice with a deadly melanoma from 0 to 90 percent. The implant could also be used to treat diseases of the immune system such as arthritis and diabetes, and, potentially, to train other kinds of cells, including stem cells used to repair damage to the body.

Currently, when dendritic cells are trained outside the body, most of them die when transplanted.

First, the implant attracts dendritic cells by releasing a kind of chemical signal called a cytokine. Once the cells arrive, they take up temporary residence inside spongelike holes within the polymer, giving them time to become highly active.

The polymer carries two signals that serve to activate dendritic cells. In addition to displaying cancer-specific antigens to train the dendritic cells, it is also covered with fragments of DNA, the sequence of which is typical of bacteria. When cells grab on to these fragments, they become highly activated. "This makes the cells think they're in the midst of infection," Mooney explains. "Frequently, the things you can do to cells are transient--especially in cancer, where tumors prevent the immune system from generating a strong response." This extra irritant was necessary to generate a strong response, the Harvard researchers found.

When implanted just under the skin of mice carrying a deadly form of melanoma, the polymer increased their survival rate to about 90 percent. By contrast, conventional immunotherapies that require treating the cells outside the body are 60 percent effective, says Mooney.

Mooney developed the polymer systems with more than melanoma in mind, however. He hopes to develop similar implants for treating other types of cancer, which should simply be a matter of changing the antigen carried by the polymer. But the approach could also be used to treat other kinds of immune disorders. For example, different chemical signals could dampen immune cells' activity in order to prevent transplant rejections and treat autoimmune diseases such as type 1 diabetes and rheumatoid arthritis, which result when the immune system attacks normal tissues. Mooney also hopes that the polymer system can train a different class of cells altogether. Just as fragile dendritic cells seem to respond better to being trained inside the body, this might be a more effective way to recruit and reprogram stem cells.

If proved in people, the cell-training polymers might also bypass some of the regulatory hurdles and expense faced by cell therapies, since devices are more readily approved by the Food and Drug Administration. Indeed, Mooney predicts that the therapy will move quickly through safety tests in large animals (the next step before human trials), and he expects to bring the cancer immunotherapy to clinical trials soon. "All the components are widely used and tested, and shown to be safe," he says.

Single atom quantum dots created by researchers at Canada’s National Institute for Nanotechnology and the University of Alberta make possible a new level of control over individual electrons, a development that suddenly brings quantum dot-based devices within reach.

It is demonstrated that the silicon atom dangling bond (DB) state serves as a quantum dot. Coulomb repulsion causes DBs separated by 2 nm to exhibit reduced localized charge, which enables electron tunnel coupling of DBs. Scanning tunneling microscopy measurements and theoretical modeling reveal that fabrication geometry of multi-DB assemblies determines net occupation and tunnel coupling strength among dots. Electron occupation of DB assemblies can be controlled at room temperature. Electrostatic control over charge distribution within assemblies is demonstrated.

Research project leader Robert A. Wolkow described the potential impact saying, “Because they operate at room temperature and exist on the familiar silicon crystals used in today’s computers, we expect these single atom quantum dots will transform theoretical plans into real devices.”

The single atom quantum dots have also demonstrated another advantage – significant control over individual electrons by using very little energy. Wolkow sees this low energy control as the key to quantum dot application in entirely new forms of silicon-based electronic devices, such as ultra low power computers. “The capacity to compose these quantum dots on silicon, the most established electronic material, and to achieve control over electron placement among dots at room temperature puts new kinds of extremely low energy computation devices within reach.”

Zyvex Labs today announced the award of a $9.7M program funded by DARPA (Defense Advanced Research Projects Agency) and Texas' ETF (Emerging Technology Fund). The goal of this effort is to develop a new manufacturing technique that enables "Tip-Based Nanofabrication" to accelerate the transition of nanotechnology from the laboratory to commercial products. Starting with the construction of 'one-at-a-time' atomically precise silicon structures, the Consortium initially plans to develop atomically precise, 'quantum dot' nanotech-based products in volume at practical production rates and costs.

This site had previously looked at non-direct electric uses for nuclear fusion, and transmutation was one of them. Transmutation is over three times easier to achieve than fusion for electricity, because the fusion part does not have to generate net energy. Electricity is supplied to the device, which is viewed as an "energy-using neutron generator." The neutrons convert the uranium back into a usable isotope, or into plutonium, that a nuclear fission reactor can use as fuel. The fusion neutron generator only has to be available about half the time.

The CFNS would provide abundant neutrons through fusion to a surrounding fission blanket that uses transuranic waste as nuclear fuel. The fusion-produced neutrons augment the fission reaction, imparting efficiency and stability to the waste incineration process. One hybrid would be needed to destroy the waste produced by 10 to 15 LWRs (light water reactors), so seven to eleven hybrids would be needed to match up with the existing 104 nuclear reactors in the United States. Thirty to forty-five would be needed to match up with the world's existing nuclear fission reactors. The process would ultimately reduce the transuranic waste from the original fission reactors by up to 99 percent.
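The hybrid counts quoted above follow directly from the one-hybrid-per-10-to-15-LWRs ratio. A quick check (the ~440 world reactor count is my assumption, chosen to be consistent with the thirty-to-forty-five range in the text):

```python
import math

def hybrids_needed(reactors, lwrs_per_hybrid):
    """One hybrid services this many LWRs; round up to whole devices."""
    return math.ceil(reactors / lwrs_per_hybrid)

# 104 US reactors, at one hybrid per 15 or per 10 LWRs:
print(hybrids_needed(104, 15), hybrids_needed(104, 10))  # 7 and 11
# Roughly 440 reactors worldwide (assumed figure):
print(hybrids_needed(440, 15), hybrids_needed(440, 10))  # 30 and 44
```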

The subcritical FFTS (Fusion Fission Transmutation Scheme) acquires a definite advantage over the critical FR (fast reactor) approach because of its ability to support an innovative fuel cycle that lets the cheaper LWR do the bulk (75%) of the transuranic transmutation via deep burn in an inert matrix fuel form. This cycle is not accessible to the FR approach because the remaining marginally fissionable, long‐term radiotoxic and biohazardous transuranics cannot be stably and safely burned in critical reactors. The fission part of the Hybrid consists of standard FR components; a sodium‐cooled, metal‐fueled lattice featuring geometry similar to that of the Generation‐IV Sodium Fast Reactor (SFR) is proposed. The critical milestone in the development of the Hybrid lies in the realization of the CFNS as a relatively inexpensive, high source density fusion neutron source.

Super X Divertor

The CFNS is based on a tokamak, which is a machine with a "magnetic bottle" that is highly successful in confining high temperature (more than 100 million degrees Celsius) fusion plasmas for sufficiently long times.

The crucial invention that would pave the way for a CFNS is called the Super X Divertor (SXD), a new magnetic configuration designed to safely exhaust the enormous heat and particle fluxes peculiar to compact, CFNS-like devices. It would enable the CFNS to produce large amounts of neutrons without destroying the system.

"The intense heat generated in a nuclear fusion device can literally destroy the walls of the machine," says research scientist Valanju, "and that is the thing that has been holding back a highly compact source of nuclear fusion."

The scientists say their Super X Divertor invention has already gained acceptance in the fusion community. Several groups are considering implementing the Super X Divertor on their machines, including the MAST tokamak in the United Kingdom, and DIII-D (General Atomics) and NSTX (Princeton University) in the U.S. Next steps will include performing extended simulations, transforming the concept into an engineering project, and seeking funding for building a prototype.

Waste Destruction System

The scientists' waste destruction system would work in two major steps.

First, 75 percent of the original reactor waste is destroyed in standard, relatively inexpensive LWRs. This step produces energy, but it does not destroy highly radiotoxic, transuranic, long-lived waste, what the scientists call "sludge."

In the second step, the sludge would be destroyed in a CFNS-based fusion-fission hybrid. The hybrid's potential lies in its ability to burn this hazardous sludge, which cannot be stably burnt in conventional systems.

"To burn this really hard to burn sludge, you really need to hit it with a sledgehammer, and that's what we have invented here," says Kotschenreuther.

The process would ultimately reduce the transuranic waste from the original fission reactors by up to 99 percent. Burning that waste also produces energy.

The CFNS is designed to be no larger than a small room, and far fewer of the devices would be needed compared to other schemes being investigated for similar processes. In combination with the substantial decrease in the need for geological storage, the CFNS-enabled waste-destruction system would be much cheaper and faster than other routes, say the scientists.

A new magnetic geometry, the Super X divertor (SXD), is invented to solve severe heat exhaust problems in high power density fusion plasmas. SXD divertor plates are moved to the largest major radii inside the TF coils, increasing the wetted area by a factor of 2-3 and the line length by a factor of 3-5. 2D simulations show a severalfold decrease in divertor heat flux and plasma temperature at the plate. A small high power density device using the SXD is proposed, either 1) for useful fusion applications using conservative physics, such as a Component Test Facility, or 2) to develop more advanced physics modes for a pure fusion reactor in an integrated fusion environment.

The heat flux is up to 5 times less with the SXD than with a regular divertor. Without the lower heat, super-materials from the EU or Japan would be needed in a few decades, because the USA is no longer developing advanced fusion materials. With the SXD, current materials appear to be good enough.
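The heat-flux benefit follows from simple geometry: peak divertor heat flux scales roughly as exhaust power over wetted area, so spreading the same power over several times the area cuts the flux by the same factor. A sketch with hypothetical numbers (the 50 MW and 2 m² figures are illustrative, not from the paper):

```python
# Rough divertor scaling: heat flux ~ exhaust power / wetted plate area.
def heat_flux(power_mw, wetted_area_m2):
    return power_mw / wetted_area_m2  # MW/m^2

conventional = heat_flux(50, 2.0)   # hypothetical 50 MW onto a 2 m^2 target
sxd = heat_flux(50, 2.0 * 5)        # SXD spreads the same power over ~5x the area
print(conventional, sxd)            # the flux drops by the same factor of 5
```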

Explosions can shatter bones into what is called a non-union fracture. Such fractures generally will not heal in a timely manner and can lead to amputation.

If fracture putty proves successful, injured soldiers could regain full use of their legs in a much shorter period of time. It could also be used in emergency rooms to treat civilians injured in traffic accidents and other traumatic events.

Success on even a small part of the project has the potential to revolutionize orthopedic medicine. “It could give people with serious leg injuries an opportunity to regain full use of limbs that now require amputations or the use of permanent implants,” Ferrari said. “We’re creating a living material that can be applied to crushed bones. The putty will solidify inside the body and provide support while the new bone grows.”

DARPA Program Manager Mitchell Zakin, Ph.D., said: “This undertaking represents the ultimate convergence of materials science, mechanics and orthopedics. I look forward to the first results, which should present themselves in about a year or so.”

Ferrari’s team will begin the pre-clinical study by testing the mechanical and biological properties of candidate compounds in mathematical models and in vitro systems. Afterward, the compounds will be tested in several animal models. The study, “BioNanoScaffolds for Post-Traumatic OsteoRegeneration,” runs through December 2010.

Ennio Tasciotti, Ph.D., a research assistant professor in Ferrari’s lab, said the putty will include a material called nanoporous silicon that was developed in Ferrari’s lab, which will give the putty the strength it needs to support the patient’s weight while new bone tissue is being regenerated.

Developing a new way to repair long bone injuries is extremely challenging. According to Tasciotti, “This problem will require the contributions of a team of the best scientists in the fields of nanoporous silicon, bio-mimetic peptides, bio-polymers, stem cells and adhesives. The solution will come from the integration of nanomaterials with unique properties in a smart composite substance that can mimic bone structure and function.”

He added, “The fracture putty will serve as a bioactive scaffold and will be able to substitute for the damaged bone. At the same time, the putty will facilitate the formation of natural bone and self-healing in the surrounding soft tissue through the attraction of the patient’s own stem cells. The putty will have the texture of modeling clay so that it can be molded in any shape in order to be used in many different surgical applications including the reconnection of separated bones and the replacement of missing bones.”

Tasciotti said the fracture putty could one day be used to address injuries involving the spine, skull and facial bones.

FURTHER READING

Osteomyelitis (OM) is a bone infection caused by various bacteria, and usually occurs in severe fractures when bone is exposed to open air. Current antibiotics often kill a strain of bacteria responsible for a disease, only to create a vacuum quickly filled by related strains. The widespread overprescribing of antibiotics and the speed of bacterial evolution have greatly increased the likelihood that the strains most able to resist antibiotics will thrive. Multi-drug resistant (MDR) bacterial strains are now widespread in all hospitals. While bone cements laced with antibiotics against staph and strep infections (e.g. vancomycin) are common, no group had previously developed a bone cement treatment using colistin against A. baumannii.

To begin the process of providing such a treatment for soldiers, a team of orthopaedic, military and pharmaceutical researchers came together to conduct the current study, the results of which argue for a human clinical trial with colistin-laced bone cement, University of Rochester Medical Center researchers said.

In order to meet energy demand in China, the high temperature gas-cooled reactor-pebble-bed module (HTR-PM) is being developed. It adopts a two-zone core, in which graphite balls are loaded in the central zone and fuel balls in the outer zone, coupled with a steam cycle. The outer diameter of the reactor core is 4.0 m and the height of the core is 9.43 m. The helium inlet and outlet temperatures are 250 and 750 C, respectively. An earlier design of the reactor had a thermal power of 380 MW. Preliminary studies show that the HTR-PM is feasible technologically and economically. To increase the reactor thermal power of the HTR-PM, some efforts have been made, including increasing the height of the reactor core, optimizing the thickness of the fuel zone, and better selection of the scheme for the central graphite zone.
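A back-of-envelope check on those numbers: treating the whole core cylinder as the active volume (ignoring the central graphite zone) gives a very low power density, which is what makes pebble-bed passive safety plausible. The MW/m³ figure is my derivation from the dimensions quoted above, not a number from the designers:

```python
import math

# Core power density of the earlier 380 MW(th) HTR-PM design, from the quoted
# dimensions: 4.0 m outer diameter, 9.43 m core height.
radius_m = 4.0 / 2
height_m = 9.43
volume_m3 = math.pi * radius_m**2 * height_m
power_density = 380 / volume_m3  # MW(th) per m^3, whole cylinder as active volume
print(round(volume_m3, 1), round(power_density, 2))  # ~118.5 m^3, ~3.21 MW/m^3
```

By comparison, a typical PWR core runs near 100 MW/m³; a power density around thirty times lower is one reason the reactor can shed decay heat passively, as in the no-control-rods shutdown test noted below.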

They figure that by the time they build their 13th pack of eight 200 MW electrical plants, the cost will come down to 60-70% of the first one.

They are targeting 90% of the cost of a pressurized water reactor (PWR); in the worst case, the HTR works out to 120% of the PWR's cost.

China plans to go to a very high temperature reactor design before 2020 and make hydrogen, switch to gas turbine and super-critical power cycle, and use spent fuel.

In 2006, they changed from a 458 MW thermal reactor design to two 250 MW thermal reactors. The single larger design had technical uncertainties for the annular core:

* Dynamic annular core: reactivity control, helium outlet temperature, fuel flow demonstration, etc.
* Solid annular core: graphite replacement, pressure drop, fuel flow at the bottom, etc.

Cost analysis indicates the difference between the specific capital costs of 1×458 MWt and 2×250 MWt is limited. It is estimated that the specific costs of a ready-to-build 2×250 MWt modular plant will be only 5% higher than the specific costs of one 458 MWt plant. Considering the technical uncertainties of the latter, a 2×250 MWt modular plant seems more attractive.

The Chinese high temperature reactor powered down within 8 minutes without control rods.

Uranium Distribution in the Earth's Crust

The figures below are from Deffeyes & MacGregor, "World Uranium Resources," Scientific American, Vol. 242, No. 1, January 1980, pp. 66-76. The total abundance of uranium in the Earth's crust is estimated to be approximately 40 trillion tonnes. The Rossing mine in Namibia mines uranium at an ore concentration of 300 ppm, at an energy cost 500 times less than the energy it delivers with current thermal-spectrum reactors. If the energy cost increases in inverse proportion to the ore concentration, shales and phosphates, with a uranium abundance of 10-20 ppm, could be mined with an energy gain of 16-32. The total amount of uranium in these rocks is estimated to be 8000 times greater than the deposits currently being exploited.
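The energy-gain scaling in that paragraph is a one-line proportion, anchored at the Rossing figures; here it is made explicit (the inverse-proportionality assumption is the article's, not a physical law):

```python
# Energy gain of uranium mining, assumed to scale inversely with ore grade,
# anchored at the Rossing mine figures quoted above.
ROSSING_PPM = 300    # Rossing ore concentration
ROSSING_GAIN = 500   # energy delivered / energy spent mining

def energy_gain(ore_ppm):
    return ROSSING_GAIN * ore_ppm / ROSSING_PPM

print(round(energy_gain(10), 1), round(energy_gain(20), 1))  # ~16.7 and ~33.3
```

The result closely matches the 16-32 range quoted for 10-20 ppm shales and phosphates.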

Vapor Fuel Technologies hired an independent laboratory, California Environmental Engineering (Santa Ana, Calif.), which is certified by both the U.S. Environmental Protection Agency and the California Air Resources Board. The tests showed performance comparable to a stock Ford F-150 test vehicle, while achieving fuel economy better by more than 30 percent and emissions about 30 percent lower.

Piggyback electronics and an add-on "vapor chamber" could be used to increase the mileage of existing automobile and truck engines by almost one-third while lowering emissions. Electronic modules from Unichip (Hillsboro, Ore.) are typically used to boost the performance of existing vehicles by intercepting signals from sensors and modifying their values before delivery to an engine control unit. Instead of increasing horsepower, a module being created by Unichip for Vapor Fuel Technologies will modify data flowing to and from the stock control unit to accommodate the super-heated vaporized fuel mix that provides increased fuel economy and lower emissions.

VFT has decided to introduce the system in stages based on engine platforms. The first retrofit packages will be for the larger 8-cylinder trucks and SUVs. We believe this will make the largest immediate impact on gas prices and environmental issues.

Can my vehicle be used for testing? VFT is not accepting individual vehicle donations for testing of the product. We are, however, seeking fleet vehicles for testing to begin in 2009.