Pages

Friday, March 28, 2008

"All men are caught in an inescapable network of mutuality, tied in a single garment of destiny. Whatever affects one directly affects all indirectly."

Back in 1985, I worked on a gig to celebrate Martin Luther King Day. The event was promoted by Wilf Walker (lovely fellow). Richard Adams (or maybe Mikki Rain) did the badge and poster designs. Marie Seton did a beautiful silk banner that was hung above the stage and then mysteriously disappeared by the end of the gig. I wrote the words for the leaflet that was distributed at the gig and helped out on the night. I think such a celebration was one of the first in London; the gig was reviewed for the NME by Danny Kelly.

I didn't realise before that the idea of making Martin Luther King's birthday a national holiday came from the labor unions, in recognition of King's activism on behalf of trade unionists. Rep. John Conyers introduced the bill in Congress. The King Center lobbied corporations and promoted the idea to the general public - an effort boosted in 1980 by the release of Stevie Wonder's single 'Happy Birthday'. Six million signatures were collected for a petition to support the bill; it's claimed to have been the largest petition in support of an issue in US history.

Although he was at first opposed to the idea, Ronald Reagan signed it into law in 1983 and it was first observed in 1986.

American Scientist is an excellent magazine, full of fresh and interesting information from many different scientific frontiers, guaranteed to get the brain fizzing.

I was particularly taken with the cover story 'The Rise of Coffee' by Fernando E. Vega, which told me a great many things I didn't know, including the fact that Puerto Rico - Vega's native land - is the sole provider of coffee to the Vatican. A research entomologist, Vega is finishing a book called 'Coffee: A Pictorial History.'

*Coffee is the second most heavily traded commodity in the world, after petroleum products, with an estimated retail value that exceeds $70 billion.

*Coffee is grown in more than 50 countries, in plantations which cover 10 million hectares of the earth's surface.

*Only two of the 100 known species of the genus Coffea are commercially traded - Coffea arabica (which accounts for 70% of the world's consumption) and Coffea canephora, which grow naturally in the equatorial lowland forests of Africa.

*Coffea arabica is endemic to the highlands of Ethiopia, Sudan and Kenya. There are conflicting theories about how, at some point, the plant spread to the Yemen where, by 1450, it was recorded as being used by Sufis. From there, it reached Cairo, then Damascus, then Istanbul. In the process, the coffeehouse, as a meeting place for news, ideas and political debate, was born.

*The first coffeehouses in Europe opened in Venice in 1645 and Oxford in 1650. There were 80 coffeehouses in England by 1663 and 3,000 by 1715.

*The first coffeehouse in the US - the London Coffee House - opened in Boston in 1689; the first in New York was The King's Arms in 1696.

*The Dutch started growing coffee in Java in the 1690s, having obtained the seeds from the port of Mocha in the Yemen. Plants from Java were then sent back to the Amsterdam Botanical Garden, from which a plant was sent to France in 1713, where the first scientific description of the plant was carried out. In 1720, a French naval officer took two plants (one of which died) to Martinique, and from there the plant spread to all the other Caribbean islands. The Dutch took other plants from Amsterdam to French Guiana and Brazil in the same period.

*More than 800 volatile compounds can be detected in the aroma of roasted coffee.

*Professional testers in the coffee industry are called "cuppers".

*One of the most exclusive coffees in the world is known as civet coffee or kopi luwak, which sells for $120-$150 a pound. It is made from coffee seeds found in the droppings of a small mammal called the common palm civet, which lives in Indonesia.

Thursday, March 27, 2008

This is a most astonishing, beautiful and intriguing image, taken from the recently published book 'Built by Animals: The Natural History of Animal Architecture' by Mike Hansell [Oxford University Press. 2008]

You're not going to believe this but it is in fact an intricate portable shelter, built from grains of sand by a species of amoeba called Difflugia coronata, which carries it around for protection. To give you a sense of scale, this little "house" has a diameter of 150 micrometers. Amoebas are microscopic single-celled organisms without any nervous system.

This little picture is triggering so many thoughts in my mind.

As a boy at boarding school, we used to spend a lot of our free hours pond-dipping at the local lake - Wiston Pond - and catching, amongst other things, caddis-fly larvae which, as you know, build themselves little protective sheaths made of grit, pebbles and sand. This to me was a fascinating thing.

Secondly, I'm thinking about the picture in the last post - that extremely sophisticated, condensed and stylish piece of Swedish technology - the Pacemaker. Compare and contrast. Which is the more beautiful? Thirdly, I'm trying to get my head round all the hows and whys of an amoeba building a house.

I discovered that I was not the only one entranced and perplexed by this image. 'Northern Light' is a blogger living in the Great Northwest of the United States. His blog, Searching for Bright Light, is a delight. He writes:

Now, WHY would a single-celled "animal" make a "house" with a scalloped door? What are those pointy "fins" in the back that look like a '57 Chevy? How does the amoeba smooth the grains of sand into this oval shape? Where in its single cell does it keep the information or instinct that "tells" it how to make this house? Do these amoebas just live in colonies of little finned, scallop-edged sand-balls? ("Hey Joe, look at Charlie’s new digs!") These guys are ONE CELL, for goodness sake! Am I the only one who is just flabbergasted? How do atheists explain this...random chance? After all, the amoeba could just have likely made his/her door square with zig-zag edges, right?

Here is an extract from a review of the above-mentioned book in The Telegraph by Helen Brown. Full text here:

'We like "building creatures" because we see them as sharing some of the attributes that make us "special". We anthropomorphise, and associate construction with cognitive complexity.

'In fact, creatures of very little brain can prove excellent builders. I was astonished by the magnified photograph of the intricate portable home made by the single-celled Difflugia coronata: a sphere made of hundreds of tiny stones with eight defensive stone cairn "spikes" at one end and frilled "door" at the other.'

Then, as if to complete a virtuous circle, I discovered an essay by Mike Hansell [right], in which he describes his memories of the occasion when he made a presentation of Animal Architecture to three hundred delegates at a conference organised in England by the Chartered Institute of Building Services Engineers. He began by flashing up on the screen the following message:

'You don't need brains to be a builder.' There was instant applause.

He writes:

'This audience response took me quite by surprise because I had not actually thought of my message as a joke, but rather as the most fundamentally important message that I wanted to convey. To illustrate it I give you this structure: It is a sphere composed of a few hundred stones cemented together, on top of which there are seven or eight sturdy spikes, each a cairn of stones, larger ones at the base, smallest at the tip creating a sharp point. At the bottom of the sphere there is a large circular hole ornamented with a pleated collar of particles too small to be distinguishable from the cement that binds them. The diameter of the whole dwelling, for that is what it is, is about 150 thousandths of a millimetre. Smaller than the punctuation mark at the end of this sentence. It is the portable home of Difflugia coronata, a species of amoeba.

'An amoeba, as you very likely know, is a single celled organism. The one cell does everything. It feeds, excretes, moves and reproduces and, in this species, it also builds a home. The cell has no nervous system at all, let alone a brain. Can its extraordinary achievement really be said to be building? Well, as the organism's amorphous bulk glides gently round the bottom of some pond, engulfing food particles and growing, it also picks up tiny sand grains that accumulate as a mass inside it. When it grows to a certain size, the cell then reproduces by dividing its body equally into two. One of these inherits the ancestral home; the other is left the bundle of building material. These stones, we know not how, are then moved to the body surface and arranged to create the distinctive architecture of this species. Just enough particles of the right sizes, big and small, have been picked up to accomplish this.'

'The Pacemaker, priced at €520, or about $760, and created by a small Swedish company called Tonium, reduces the basic DJ equipment of dual players and mixer significantly. The Pacemaker has a 120-gigabyte hard drive, fits in the palm of the hand, runs on batteries and has a built-in mixer to layer tunes seamlessly so the music never stops.'

Tonium began shipping Pacemakers in February 2008. The company is focusing first on customers in the European Union, Japan and South Korea, and then on the United States. American consumers will be able to order through www.pacemaker.net

So we return to the consistently interesting magazine Monocle and their important story, in the March 2008 issue, on the future of photography, seen through the lens of the Japanese market. Here can be seen clearly the devastation of film and film cameras that the digital revolution has wrought.

'From a peak of nearly 40 million in 1997, sales of 35mm and single-lens cameras have plummeted. The situation is stark: if sales continue at this rate, film cameras and film itself will disappear within a few years.'

Last year Japan produced 94 million digital cameras and just 800,000 film cameras. Canon stopped developing film cameras in 2006. Nikon still makes some but its focus is digital. Konica was sold to Sony in 2005. Over 50% of the 1,800 members of the Japan Professional Photographers Society are working in digital now. Bulk memory cameras are now standard in Japanese mobile phones.

Japan's greatest photographer, Hiroshi Sugimoto, is said to be stockpiling film in freezers; Sebastião Salgado is reported to have begged Fujifilm to manufacture film for him in bulk.

Fujifilm's President Shigetaka Komori is a fervent supporter of film photography. The company released the Klasse W, the one new film camera to come on the market since 2006, and, says Monocle, 'the consensus among photographers is that as long as Komori is in charge, the film is safe.' But he's 68 and will not be in the job for ever.

Taushi Horokawa is one of the most prominent of a group of pro photographers arguing for the value of film. He told Monocle: 'It doesn't matter how many pixels there are. Digital cameras can never become like film cameras - they will always run in parallel, but never the same. There will be developments in digital technology and artists will use digital cameras in their work but film is something else, it has its own culture.'

3-D FROM A DOUGHNUT. Photographing a person's face with a cone-shaped mirror in front of the lens creates a distorted, doughnut-shaped image (left). The cone provides two extra perspectives of the face on opposite sides of the center point, providing enough information to construct a 3-D model (right). Image: Computer Vision Lab, Columbia University. From an excellent article in Science News.

If you've recovered from that thought stream, get ready for another photographic future: welcome to the new field of COMPUTATIONAL PHOTOGRAPHY. An excellent introductory primer on the subject, written by Brian Hayes, was published in the March/April 2008 issue of American Scientist. It begins:

'The digital camera has brought a revolutionary shift in the nature of photography, sweeping aside more than 150 years of technology based on the weird and wonderful photochemistry of silver halide crystals. Curiously, though, the camera itself has come through this transformation with remarkably little change. A digital camera has a silicon sensor where the film used to go, and there's a new display screen on the back, but the lens and shutter and the rest of the optical system work just as they always have, and so do most of the controls. The images that come out of the camera also look much the same—at least until you examine them microscopically.

'But further changes in the art and science of photography may be coming soon. Imaging laboratories are experimenting with cameras that don't merely digitize an image but also perform extensive computations on the image data. Some of the experiments seek to improve or augment current photographic practices, for example by boosting the dynamic range of an image (preserving detail in both the brightest and dimmest areas) or by increasing the depth of field (so that both near and far objects remain in focus). Other innovations would give the photographer control over factors such as motion blur. And the wildest ideas challenge the very notion of the photograph as a realistic representation. Future cameras might allow a photographer to record a scene and then alter the lighting or shift the point of view, or even insert fictitious objects. Or a camera might have a setting that would cause it to render images in the style of watercolors or pen-and-ink drawings.'
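The dynamic-range idea Hayes mentions can be sketched in miniature: merge several bracketed exposures of the same scene, weighting each pixel by how well exposed it is, so that blown highlights and murky shadows contribute least. The toy function below is my own illustration of the principle, not anything from the article; real exposure-fusion algorithms also weight factors such as local contrast and colour saturation.

```python
def fuse_exposures(exposures):
    """Blend aligned grayscale exposures (pixel values 0.0-1.0).

    Each pixel is weighted by its closeness to mid-grey (0.5), so
    well-exposed pixels dominate and clipped ones are suppressed.
    """
    fused = []
    for pixels in zip(*exposures):
        # Weight = 1 at mid-grey, falling to 0 at pure black or white.
        weights = [1.0 - 2.0 * abs(p - 0.5) for p in pixels]
        total = sum(weights)
        if total == 0:  # every exposure fully clipped: fall back to the mean
            fused.append(sum(pixels) / len(pixels))
        else:
            fused.append(sum(w * p for w, p in zip(weights, pixels)) / total)
    return fused

# A dark and a bright exposure of the same four-pixel scene (hypothetical data).
dark = [0.05, 0.10, 0.40, 0.02]
bright = [0.45, 0.60, 0.95, 0.30]
result = fuse_exposures([dark, bright])
```

The point of the sketch is only that the merged image borrows shadow detail from the bright frame and highlight detail from the dark one, which is the "preserving detail in both the brightest and dimmest areas" that Hayes describes.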

On May 23-25, 2005, a Symposium on Computational Photography and Video was held at the MIT Computer Science and Artificial Intelligence Laboratory.

Abstract:

Research breakthroughs in 2D image analysis/synthesis, coupled with the growth of digital photography as a practical and artistic medium, are creating a convergence between vision, graphics, and photography. A similar trend is occurring with digital video. At the same time, new sensing modalities and faster CPUs have given rise to new computational imaging techniques in many scientific disciplines. Finally, just as CAD/CAM and visualization were the driving markets for computer graphics research in the 1970s and 1980s, and entertainment and gaming are the driving markets today, a driving market 10 years from now will be consumer digital photography and video. In light of these trends, the time is right to hold a symposium on computational photography and video. The area is old enough that we understand what the symposium is about, young enough that we can still argue about it, old enough that its practitioners can fill an auditorium, and young enough that they still fit in one.

This was my cover story and centre-page spread for 'Connected', the digital supplement of The Daily Telegraph, Tuesday, January 7, 1997. The article was an extended feature based on a newly published book of the time: HAL's Legacy: 2001's Computer as Dream and Reality, edited by David G. Stork, foreword by Arthur C. Clarke (MIT Press, £16.59), published to celebrate the 30th anniversary of the creation of HAL. Entitled 'Hal's heir - the quest for artificial intelligence', it asks: 'How close are we to building anything like the famous cinematic computer?' Full text reads as follows:

"I am a HAL Nine Thousand computer, Production Number 3.1. I became operational at the HAL plant in Urbanu, Illinois, on January 12, 1997."

Sunday marks the true birthday of the most famous computer in cinematic history.

In Stanley Kubrick's film adaptation of Arthur C. Clarke's novel 2001: A Space Odyssey, HAL was born on January 12, 1992; but it is the date given in the novel — 1997 — that is being celebrated by researchers as an opportunity to evaluate progress — or lack of it — in the field of artificial intelligence (AI) in that time. Where are the thinking, talking, chess-playing, lip-reading computers like HAL — or preferably, since he also committed murder, not like HAL?

One of the prime movers behind the celebration is David G. Stork, chief scientist and head of the Machine Learning and Perception Group at the Ricoh California Research Centre. He has edited a stimulating collection of essays by luminaries from the computer, perception and AI communities — HAL's Legacy: 2001's Computer as Dream and Reality — to be published, in print and on the Web, for the event. Each asks questions about our progress towards creating intelligent machines, telling us much not only about HAL and 2001 but also about ourselves.

Kubrick's film was released in 1968 — the year of the assassinations of Martin Luther King and Robert Kennedy, and the first photograph of the whole Earth from space, taken by Apollo astronauts on the way to the Moon. Computers at that time were not a daily reality for the ordinary person. Most were huge machines that ran on solid-state micro-electronics and used punched cards and tape to input data. The keyboard and video display monitor were new developments. The personal computer, the mouse and the software explosion lay in the future, and the Internet was merely a twinkle in the eyes of a handful of American researchers.

HAL is a child of these times and his conception underlines the folly of predicting the future by extrapolating from the present. Even so, 2001, and HAL in particular, continue to fascinate, despite the anachronisms and misconceptions.

Stork writes: "2001 is, in essence, a meditation on the evolution of intelligence from the monolith-inspired development of tools, through HAL's artificial intelligence, up to the ultimate (and deliberately mysterious) stage of the star child."

The consensus in the late Nineties, however, is that HAL — reflecting ancient dreams and nightmares — will not be ready by 2001. Beyond that, opinions diverge. Some believe it is only a matter of time before intelligent computers emerge; others that it will never happen because the whole concept is flawed. In many fields we have made great strides, in others pitifully small steps. Artificial intelligence, says Stork, "is a notably hazy matter that we don't even have a good definition for". It is also "one of the most profoundly difficult problems in science".

One of his major contributors is one of the godfathers of AI, Marvin Minsky, who believes that while good progress was made in the early days, the researchers became overconfident. They prematurely moved towards studying practical AI problems such as chess and speech recognition, "leaving undone the central work of understanding the general computational principles — learning, reasoning and creativity — that underlie intelligence".

"The bottom line," says Minsky, "is that we haven't progressed too far toward a truly intelligent ma­chine. We have collections of dumb specialists in small domains; the true majesty of general intelligence still awaits our attack." He believes that if we work really hard, we can have such an intelligent system in four to 400 years.

Stephen Wolfram, the principal architect of the Mathematica computer system, believes the answer to building HAL lies in the domain of systems in which simple elements interact to produce unexpectedly complex behaviour. He uses the example of the human brain, in which the relatively simple rules governing neurons have evolved into a complex cognitive system.

Ray Kurzweil, who developed the first commercial large-vocabulary speech-recognition system, believes the way to tackle the task is to reverse-engineer the brain, scanning an entire brain down to the level of nerve cells and their interconnections. We would then need merely to encode all the information into a computer to make a virtual brain every bit as intelligent.

David J. Kuck, a distinguished computer scientist, believes that given the rapid increase in computing power, we could soon build a computer the size and power of HAL. "If automobile speed had improved by the same factor as computer speed has in the past 50 years," he writes, "cars that travelled at highway speed limits would now be travelling at the speed of light."

He believes progress in the 21st century will be slower, with gains coming from software and parallel processing, which is used in the human brain. To give some comparison, the brain has between a thousand billion and 10 thousand billion neurons, plus many more interconnecting synapses. The fastest computer at present has 100 billion switches — 10 per cent of the brain's capacity — but Kuck believes that in the future, the physical capacity of computers will match that of the brain.

The only manufacturers that could at present build HAL are IBM or Intel. "However," Kuck writes, "it is not obvious that a HAL-like system will ever be sufficiently interesting to induce governments to fund its development."

HAL's voice is a holy grail for many researchers. Making computers produce natural-sounding speech is remarkably difficult. We have developed programs that work adequately for short utterances or single words, but in sentences machines cannot yet convey the human subtleties of stress and intonation. The greatest problem is the machine's inability to comprehend what it is saying or hearing. And while we have made several important strides in speech recognition, no system remotely approaches HAL's proficiency at speechreading (lipreading) in silence.

A successful automatic speech-recognition system requires three things: a large vocabulary, a program that can handle any voice and the ability to process continuous speech. We have the first two — and will get the third by early 1998, the book predicts.

Making computers see has also proved to be extremely difficult. There has been success in what researchers call "early" vision — edge and motion detection, face tracking and the recognition of emotions. Full vision would include the ability to analyse scenes.

Success has, however, been marked in chess. There are more possible combinations in the game than there are atoms in the universe. Humans play chess by employing explicit reasoning, linked to large amounts of pattern-directed knowledge. The most successful chess computers use brute force, searching through billions of alternative moves.
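The brute-force approach described above can be sketched in a few lines. What follows is a generic negamax search over a toy game tree, my own illustration rather than anything Deep Blue actually ran; a real engine adds alpha-beta pruning, an evaluation function and, in Deep Blue's case, purpose-built hardware.

```python
# A minimal sketch of brute-force game-tree search. A "tree" here is
# either a number (a leaf evaluation from the point of view of the
# player to move) or a list of subtrees, one per legal move.

def negamax(tree, depth):
    """Return the best achievable score for the side to move."""
    if depth == 0 or not isinstance(tree, list):
        return tree
    # Try every move and keep the best outcome. The opponent's best
    # reply scores -negamax for us, hence the negation: this is the
    # "search through every alternative move" idea in miniature.
    return max(-negamax(child, depth - 1) for child in tree)

# A hypothetical position: two legal moves, each met by two replies.
position = [[3, -2], [-5, 1]]
best = negamax(position, depth=2)  # best score the first player can force
```

Even this toy version makes the scaling problem vivid: the number of leaves grows exponentially with depth, which is why examining 200 million positions a second still buys only a dozen or so plies of lookahead.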

The first machine to defeat a grandmaster in tournament play was IBM's Deep Thought, which began playing in 1988. The current champion computer is its successor, Deep Blue, which is capable of examining up to 200 million chess positions a second. Murray S. Campbell, a member of the team that built it, says Deep Blue is actually a system of 32 separate computers (or nodes) working in concert, with 220 purpose-built chess chips, running in parallel.

Garry Kasparov first played against IBM's chess machines in 1989, facing Deep Thought in a contest he viewed as "a defence of the whole human race". He lost to a machine for the first time last year. Campbell believes man-machine contests will end some time next century. It is only a matter of time before the world's best chess player is a machine, he says, but concedes that "until computers possess the ability to reason, strong human chess players will always have a chance to defeat a computer".

Stork's primary motivation for the book was aesthetic, he says, likening the exercise to that of art historians providing fresh insights into a subtle painting. 2001 illustrates many key ideas in several disciplines of computer science.

"The Internet and the World Wide Web have changed the way people view communication and technology," says Stork. "2001 expressed the anxiety [of the Six­ties] of what computers were and what their potential was. Like much science fiction, it was a metaphor for the salient issues of the present."

He believes the biggest mistake made by early AI researchers was "not to cast the problem as more of a grand endeavour to build useful intelligent machines. The search raises the deepest human questions since Plato."

When Stork saw 2001 in the year of its release, he was "awed. It was overwhelming, and supremely beautiful. It was also mythic and very confusing". The film "shows us and reminds science that it is part and parcel of the highest human aspiration. It also raises the question: is violence integral to the nature of intelligence? It is thus related to Kubrick's A Clockwork Orange, which merges violence and aesthetics. It suggests the link can be severed — but at a terrible cost."

The computing pioneer Alan Turing predicted in the Forties that by early next century, society would take for granted the pervasive intervention of intelligent machines. By the end of this century, scant years away, we will be talking to our PCs and, by 2010, working with translating telephones. Our most advanced programs today may be comparable with the minds of insects but the power of computation is set to increase by a factor of 16,000 every 10 years for the same cost.

Many of HAL's capabilities can already be realised; others will be possible soon. Building them all into one intelligent system will take decades. If we are to achieve that, we must give computers understanding; but to program them with understanding, we must first understand the nature of our own human consciousness. That could take some time.

Tuesday, March 25, 2008

Clarke, the author of more than 100 books, including "2001: A Space Odyssey", died early Wednesday, March 19, 2008, after suffering from breathing problems. He was 90.

A COSMIC COINCIDENCE

This powerful stellar explosion - a bright Gamma-Ray Burst detected March 19 by NASA's Swift satellite - has shattered the record for the most distant object that could be seen with the naked eye. The image shows the X-ray afterglow as seen by the X-Ray Telescope (left) and the bright optical afterglow as observed by the Ultraviolet/Optical Telescope on board Swift. Credit: NASA/Swift/Stefan Immler

On March 19, 2008, the NASA satellite Swift observed four separate Gamma Ray Bursts, the most powerful explosions in the Universe, each the signature of a massive star reaching the end of its life and exploding. Never before had Swift seen four bursts in one day.

"Coincidentally, the passing of Arthur C. Clarke seems to have set the universe ablaze with gamma ray bursts," said Swift science team member Judith Racusin of Penn State University.See full report here

Friday, March 21, 2008

One of my favourite posters, designed and produced some 10 years ago or more by Lewes designer Andy Gammon.

Increasing pressure on China over Tibet

Published: Tuesday 25 March 2008 17:38 UTC

Paris - There is increasing international pressure on China to resolve the Tibet issue peacefully. On Tuesday, French President Nicolas Sarkozy said he was considering a boycott of the Olympic Games' opening ceremony. He said he was appealing to the Chinese authorities' sense of responsibility regarding the unrest in Tibet.

The German government today again called on China to enter into dialogue with the Dalai Lama, but Berlin said that a boycott would be counterproductive. US President George W Bush said he would attend the Beijing Olympics' opening ceremony as planned.

The Tibetan government in exile says that 140 people have been killed during the protests of the past few weeks, whereas the Chinese government says 20 people lost their lives.

Both taken from the marvellous 'Jackets Required: An Illustrated History of American Book Jacket Design 1920-1950' by Steven Heller and Seymour Chwast [Chronicle Books. 1995]

In a recent essay in the New York Times - 'Great Literature? Depends Who Wrote It' - Charles McGrath looks at 'the assumption that genre fiction - mysteries, thrillers, romances, horror stories - is a form of literary slumming.'

'These kind of books,' writes McGrath, 'are easier to read, we tend to think, and so they must be easier to write, and to the degree that they're entertaining, they can't possibly be serious.'

He says the 'distinction between highbrow and lowbrow - between genre writing and literary writing - is actually fairly recent. Dickens wrote mysteries and horror stories, only no one thought to call them that.'

[Digression: In my book 'Curious Facts 2', it says 'the phenomenon whereby social structure affects taste was dubbed 'Highbrow' and 'Lowbrow' in a 1914 essay by Van Wyck Brooks.' I have lost my source for this, and the statement now seems unlikely. He no doubt discussed the issue in his writings, but the 3rd Edition of the Shorter Oxford English Dictionary dates 'highbrow' (meaning 'intellectually superior') to US 1908 and 'lowbrow' to 1913. The answer undoubtedly lies in 'Highbrow/Lowbrow: The Emergence of Cultural Hierarchy in America' by Lawrence Levine [Harvard University Press], which I have yet to read. The blurb says that Levine traces 'the emergence of familiar categories as highbrow and lowbrow at the turn of the century.' Further digression: Just discovered 'From Lowbrow to Nobrow' by Peter Swirski.]

McGrath goes on to talk about 'that interesting category of novels that are said to "transcend" their genre'. 'This is false praise,' says McGrath. 'To transcend its genre, a book has to more nearly resemble a mainstream novel.'

The above thoughts provide a prelude to THE GENERALIST's round-up of the best of the genre fiction I have read in the last 12 months. A few of these books are brand new; others have been published in recent years. They are all intelligent, intriguing and interesting books which will provide you with hours of valuable 'diversionary reading' - taking your mind off worldly and personal worries and concerns. They are thus a valuable and absorbing strategic resource.

'In 1909, Sigmund Freud, accompanied by his then disciple Carl Jung, made his one and only visit to the United States, to deliver a series of lectures on psychoanalysis at Clark University, in Worcester, Massachusetts. The honorary doctoral degree that Clark awarded him was the first public recognition Freud had ever received for his work. Despite the great success of this visit, Freud always spoke, in later years, as if some trauma had befallen him in the United States. He called Americans 'savages' and blamed his sojourn there for physical ailments that afflicted him well before 1909. Freud's biographers have long puzzled over this mystery, speculating whether some unknown events in America could have led to his otherwise inexplicable reaction.'

Thus begins Jed Rubenfeld's masterful work, which invents a fictional explanation of this real-life conundrum, in which Freud is drawn into the investigation of the savage murder of a stunning debutante. It's a great read in itself, but what makes it doubly intriguing are the author's end notes, which scrupulously itemise how much fact lies behind his fiction and exactly where the fiction differs from the known historical record. It will surprise you.

[The book had the effect of sending me back to my collection of Jung books and led me to purchase what is, I believe, the best modern biography - 'Jung: A Biography' by Deirdre Bair [Little, Brown. 2004] - and read the first 200pp of it. (The book is a vast 647pp, with a further 200pp-plus of detailed notes and references and an excellent index.) It's a totally fascinating story which I hope to return to in due course.]

'The Chatelet Apprentice' is the first of a series of novels featuring the character Nicolas Le Floch in Paris in the 1760s, written by Jean-François Parot, a diplomat and historian. These books have been celebrated in France since first publication in 2000, but it was only last year that this first adventure was made available to readers in an English translation by Gallic Books in London. (The second, 'The Man with the Lead Stomach', is due for publication in April 2008.) Parot uses all his professional skill to paint an accurate portrait of the sights and sounds of the period, full of telling incidental detail. The central crime is suitably dark and convoluted and involves a cast of characters full of subtlety and substance. Le Floch is an engaging central figure, full of uncertainties, who in this book undergoes a trial by fire as he struggles to recover missing documents of vital importance to the King while dead bodies proliferate around him.

Fred Vargas is the pseudonym of a French female academic archaeologist, and this delightful and quirky book, first published in English in 2003, is one of a number of her works now available in translation. The mystery starts with a Breton town crier who, three times a day in a small Parisian square, reads out to his small but devoted audience the local news and adverts people have posted in his box. All is well until a series of disturbing messages start appearing, warning of death and pestilence, followed by the appearance of strange markings on apartment doors. The case comes to the attention of Detective Commissaire Adamsberg, whose eccentric techniques enable him eventually to unmask the true secrets behind a dark and diabolical plot. Pitched just on the right side of the unbelievable, its creepy themes pick up contemporary resonances with the paranoid times we live in.

The extraordinary phenomenon of 'The Da Vinci Code' has spawned an ocean of imitators, eager to emulate the success of the original. The only one I've been drawn to read is Michel Benoit's 'The Thirteenth Apostle' - a satisfying, well-written mystery built out of time-honoured elements: a young Benedictine who, following the mysterious death of a colleague, begins searching for lost biblical texts that place Jesus in a fresh context, one that would threaten the teachings of the established church. Naturally, there is a dark cabal inside the Vatican determined that he will not succeed.

[Of course the Godfather of this genre must be Umberto Eco with his landmark books 'The Name of the Rose' and 'Foucault's Pendulum' - both huge tomes stuffed with long sections of arcane knowledge welded to a gripping plot that I remember finding genuinely thrilling and unique at the time.]

Translated from the German, this No 1 Swiss bestseller is one of a number of Eurocrime novels published by Arcadia Books. Set in the claustrophobic community of an out-of-season hotel in a remote Alpine village in the Swiss Engadine, its main character is Sonia Frey, escaping a violent husband and a bad acid trip, who gets a job there as a physiotherapist, hoping that fresh landscapes, Alpine air and a calm and ordered existence will heal her soul. Bad move, as it turns out. Suter expertly creates an atmosphere of menace and intrigue, in which real-life violence appears to be following the plot of ancient superstitions.

Considered Italy's leading crime writer, Massimo Carlotto tells in his first book, 'The Fugitive', the novelised real-life story of the years he spent on the run from the Italian authorities. It's an extraordinary tale.

Born in 1956 in North-Eastern Italy, Carlotto first got interested in far-left politics at the age of thirteen. He became an activist with Lotta Continua and began getting involved in investigative and counter-information work. In 1976, after discovering the body of an acquaintance who had been brutally murdered, he was falsely accused of the murder, arrested and put on trial. Acquitted and then convicted (there is no double-jeopardy law in Italy), Massimo, on the advice of his lawyers, fled abroad to avoid imprisonment.

From 1982 to 1985, he lived under a series of borrowed identities in Paris and then moved to South America. During these years of exile he was supported and sheltered by the international community of political refugees and worked in a number of capacities (pizzaiolo, translator, academic researcher) whenever he was able. In Mexico, he was betrayed by a lawyer, underwent torture following a case of mistaken identity, and then returned to Italy and to prison. In 1986, Massimo Carlotto became the focus of an international defence campaign that won wide backing: the South American novelist Jorge Amado and the eminent Italian philosopher Norberto Bobbio were among his supporters. In 1993 he was finally released from prison with a pardon from the President of Italy. He had been tried a total of eleven times and had amassed 96 kilos of court proceedings.

After his release, Massimo quickly turned to writing. 'The Fugitive' (Il fuggiasco), his first and most autobiographical novel, relates the almost eighteen years between his arrest and his presidential pardon. A film version of Il fuggiasco, directed by Andrea Manni and starring Daniele Liotti, was released in 2003 and has won many awards.

Since then, Massimo has written eight other novels, several plays, and countless newspaper articles and essays. Film versions are currently being made of two of his most recent novels ('The Goodbye Kiss' and 'Death's Dark Abyss'), and in January 2005 he signed a contract with his Italian publishers for five more novels. He also continues to act as a consultant to criminal lawyers, assisting them in cases involving organised crime, political intrigue and state intelligence.

The books mentioned above are all published by Europa Editions, a fantastic imprint which specialises in publishing contemporary European writing.