shoor writes: It seems like the topic of good archiving, via tape, hard drive, optical media or whatever, is always coming up with a lot of debate and no good conclusions. There certainly seems to be a need, and I wouldn't think the problem is all that hard. (Too hard for me, but not for a small research department with an engineer, materials scientist, and appropriate lab equipment.)

To me, an archival system is write-once because, from personal experience, I've lost as much stuff that I wanted to keep by overwriting it as by any other method. The main factor is pure longevity, of course, but other factors are ruggedness (could it withstand moisture? a small fire? getting knocked off a shelf in an earthquake?), compactness (DVDs and tape are sure a lot better than punched cards or floppy disks), and cheapness. I would say it doesn't have to be particularly fast, but it should have random access ability, which leaves out tape.

The way the medium is written doesn't have to be the way it's read. Trying to think how I might do it if I had the scientific/engineering chops, I conceptually start with old-fashioned photographic film. The negative is exposed to light when the picture is taken, but it remains very fragile until it gets chemically fixed. After that it can safely be 'read' (exposed to light) while an indefinite number of positives are made. In a hypothetical computer data archiving system, writing could work as in photography: some effect (mechanical, electrical, photonic, ...) causes chemicals to react, or tiny nanoparticles to accumulate in a region or disperse, or perhaps molecules/particles just rotate slightly in one direction or another due to a magnetic field or the polarisation of light. A fixing operation would then be triggered during or immediately after writing, whether mechanically or in response to heat, UV radiation, a magnetic/electrical charge, or something exotic that I haven't thought of. What matters is that the end state is stable and non-destructively detectable.

It doesn't really seem like it should be all that hard to do, so what's the problem? Not a big enough market? Not glamorous enough? Are the current solutions just considered good enough? Or is it actually a much tougher problem than I imagine?

shoor writes: I recently saw a BBC documentary, one of a series on supernatural science,
about ESP. They mentioned "ganzfeld experiments", which seemed to be the
most reproducible demonstrations of ESP. I've always been very skeptical myself
about ESP, but this particular episode of this series seemed less skeptical
about the possibility than about other subjects it's tackled (zombies,
levitation, etc). So it occurred to me that it shouldn't be that hard to
write software to allow do-it-at-home ganzfeld experiments. Basically,
the computer chooses randomly from a set of pictures; the 'transmitter'
stares at the chosen picture, maybe tries to draw it, for a set amount
of time; then the 'receiver', who has been isolated, is shown a set of
pictures, one of which is the target, and tries to pick the correct one.
The computer could keep track of the statistics of success and so on.
Here's a link to a website that offers a little more background (and
skepticism) than the documentary: http://skepdic.com/ganzfeld.html
If I wrote up such a program, would there be a chance of copyright or
patent flak over it?
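The trial procedure described above maps pretty directly onto code. Here is a minimal Python sketch, assuming the standard four-image ganzfeld design (25% chance baseline); the image names and the `guess_fn` callback are illustrative placeholders, not from any existing package:

```python
import math
import random

def run_trial(pool, guess_fn, rng=random):
    """Run one trial: pick a random target, show the receiver the
    shuffled pool, and record whether their guess matched the target."""
    target = rng.choice(pool)
    choices = list(pool)
    rng.shuffle(choices)  # receiver must not infer the target from order
    return guess_fn(choices) == target

def hit_rate(results):
    """Fraction of trials in which the receiver picked the target."""
    return sum(results) / len(results)

def p_value(hits, n, p=0.25):
    """One-sided binomial probability of at least `hits` successes in
    `n` trials by pure chance (the statistic the computer would track)."""
    return sum(math.comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(hits, n + 1))

# Example session: a 'receiver' who always takes the first option shown,
# which should score near the 25% chance baseline.
pool = ["waterfall", "pyramid", "sailboat", "tiger"]
results = [run_trial(pool, lambda choices: choices[0]) for _ in range(100)]
print(f"hits: {sum(results)}/100  p = {p_value(sum(results), 100):.3f}")
```

Shuffling the pool before showing it to the receiver matters: without it, any fixed ordering would leak information about how the target was chosen, and the p-value computed against the 25% baseline would be meaningless.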