Within a decade, we could be routinely interacting with machines that are truly autonomous – systems that can adapt, learn from their experience and make decisions for themselves. Free from fatigue and emotion, they would perform better than humans in tasks that are dull, dangerous or stressful.

Already, the systems we rely on in our daily lives are being given the capacity to operate autonomously. On the London Underground, Victoria line trains drive themselves between stations, with the human “driver” responsible only for spotting obstacles and closing the doors. Trains on the Copenhagen Metro run without any driver at all.

While our cars can’t yet drive themselves, more and more functions are being given over to the vehicle, from anti-lock brakes to cruise control. Automatic lighting and temperature control are commonplace in homes and offices.

The areas of human existence in which fully autonomous machines might be useful – and the potential benefits – are almost limitless. Within a decade, robotic surgeons may be able to perform operations much more reliably than any human.

Smart homes could keep an eye on elderly people and allow them to be more independent. Self-driving cars could reduce congestion, improve fuel efficiency and minimise the number of road accidents.

But automation can create hazards as well as removing them. How reliable does a robot have to be before we trust it to do a human’s job? What happens when something goes wrong? Can a machine be held responsible for its actions?

Researchers have found a way to create a new generation of tiny microchips that use DNA – rather than traditional silicon – to achieve potentially revolutionary advances in computing.

A team based at IBM’s Almaden research facility in San Jose, California, has found a method for building chips that they believe could eventually replace the current standards for creating electrical circuits using silicon wafers.

The technique, which was developed in conjunction with the California Institute of Technology, creates tiny microchips using strands of DNA and carbon nanotubes – microscopic cylinders constructed from carbon molecules.

In a paper published in the journal Nature Nanotechnology, the team describes a method that uses so-called “DNA origami” – pieces of genetic material which can be arranged into patterns similar to those used in the microchips common in computers and other electronic devices.

Once a scaffold of DNA has been created, nanotubes are inserted into the design to build a microchip that is several times smaller – and therefore faster – than anything that today’s most advanced techniques can achieve.

“This is the first demonstration of using biological molecules to help with processing in the semiconductor industry,” IBM research manager Spike Narayan told Reuters.

“Basically, this is telling us that biological structures like DNA actually offer some very reproducible, repetitive kinds of patterns that we can actually leverage in semiconductor processes.”

It always seemed a little too good to be true, especially for anyone who has endured a beach holiday huddled under an umbrella. This year was to be different, we were told. A “barbecue summer” – from no less an authority than the Met Office itself.

Yesterday, though, the Met Office conceded what Britons have seen with their own eyes over the last few weeks: apart from a fortnight in June, the summer has been more soggy than sizzling.

And it’s not likely to get much better in August, a prediction that will disappoint, if not entirely surprise, millions of “staycationers” who booked a holiday in the UK to enjoy the sunshine and beat the recession.

A Met Office forecaster, Helen Chivers, said today the summer was still on track to be slightly warmer than usual, with average or slightly above average rainfall, and showed no sign of the washouts witnessed in 2007 and 2008. But there were no more promises of a hot summer.

Instead, more familiar language was used to describe what the UK can expect in the coming weeks. “The weather will remain unsettled… with similar patterns of sunshine and showers, and occasional longer spells of rain,” she said.

Leading thinkers in technology, design and science are gathering in Oxford to share their ideas about the future. TED Global (Technology, Entertainment and Design) is the European cousin of an already established top US event.

The invitation-only conferences are dedicated to “ideas worth spreading” and have seen talks by former US presidents and Nobel Laureates. This year’s event will explore questions in neuroscience, astrophysics and economics.

“It is about all the hidden, invisible, not yet discovered or fully explored parts of our lives, society and the world,” said Bruno Giussani, European director of TED. “For example, the human brain; how do you make sense of what I am thinking?”

Other questions to be explored include whether life is a mathematical equation, where motivation comes from and whether it is possible to design the air that we breathe.

The invited speakers, who are each given 18 minutes in front of the audience, are drawn from eclectic backgrounds. This year’s line-up includes an aphorist, a wireless electrician, an underworld investigator and a high-altitude archaeologist.

On 20 July 1969, Neil Armstrong and Buzz Aldrin took their first small steps on to the surface of the moon. Forty years later you can join them, thanks to a new release from Google.

Moon in Google Earth brings the lunar landscape to your desktop, complete with photos, video and guided tours provided by the astronauts themselves.

Downloading the new Google Earth software allows users to roam the moon in full 3D for the first time. You can visit the historic Apollo landing sites to see the astronauts at work, or fly above the surface hunting for your favourite crater.

“Forty years ago, two human beings walked on the moon. Starting today, with Moon in Google Earth, it’s now possible for anyone to follow in their footsteps,” said product manager Michael Weiss-Malik. “We’re giving hundreds of millions of people around the world unprecedented access to an interactive 3D presentation of the Apollo missions.”

Street View-style panoramic photos show the flags and footprints left behind by Apollo astronauts, and satellite imagery depicts the landing sites in detail. Many points on the moon’s surface have been annotated by NASA with information and anecdotes from the Apollo landings.

Previously unreleased footage of the six missions has also been made available, along with narrated tours from Aldrin and Jack Schmitt of Apollo 17.

The Turin Shroud was faked by Renaissance artist Leonardo da Vinci using pioneering photographic techniques and a sculpture of his own head, a television documentary claims.

A study of facial features suggests the image on the relic is actually da Vinci’s own face, which could have been projected on to the cloth.

The artefact has been regarded by generations of believers as the face of the crucified Jesus who was wrapped in it, but carbon-dating by scientists points to its creation in the Middle Ages.

American artist Lillian Schwartz, a graphic consultant at the School of Visual Arts in New York who came to prominence in the 1980s when she matched the face of the Mona Lisa to a Leonardo self-portrait, used computer scans to show that the face on the Shroud has the same dimensions as that of da Vinci.

“It matched. I’m excited about this,” she said. “There is no doubt in my mind that the proportions that Leonardo wrote about were used in creating this Shroud’s face.”

The claim was made in a Channel Five documentary, shown on Wednesday night, that described how da Vinci could have scorched his facial features on to the linen of the Shroud using a sculpture of his face and a camera obscura – an early photographic device.

A detailed simulation of a small region of a brain built molecule by molecule has been constructed and has recreated experimental results from real brains.

The “Blue Brain” has been put in a virtual body, and observing it gives the first indications of the molecular and neural basis of thought and memory.

Scaling the simulation to the human brain is only a matter of money, says the project’s head. The work was presented at the European Future Technologies meeting in Prague. The Blue Brain project launched in 2005 as the most ambitious brain simulation effort ever undertaken.

While many computer simulations have attempted to code in “brain-like” computation or to mimic parts of the nervous systems and brains of a variety of animals, the Blue Brain project was conceived to reverse-engineer mammal brains from real laboratory data and to build up a computer model down to the level of the molecules that make them up.

The first phase of the project is now complete; researchers have modelled the neocortical column – the repeating unit of the part of the mammalian brain known as the neocortex, which is responsible for higher brain functions and thought.