The name Duct Tape Jedi seems rather appropriate given the topic of maintaining old equipment. Also, I believe you forgot a reference to one of the following: A) Obama = Osama, B) Dog Penis, C) Football player by-product.

Once again, airplane manufacturers have been giving serious consideration to offering Internet access in the skies. Back in 1994, Boeing considered equipping each seat with a serial modem. Laptop users could hook up to the modem and dial out. (Dial-up was the primary means of connecting to the Internet back in those days.)

We chuckled at the thought of attaching the serial cable and getting a Plug-and-Play pop-up message:

If you're thinking "Hey, we managed to run desktops on those too," then yes. If you're thinking anything along the lines of "scientific calculations," then that extra computing power would be very, very handy. Those are the people who never, ever seem to run out of a need for more and faster processors, and I doubt these guys are any exception. Anything they can process onboard or compress better for sending back down to us would cut down on things that are probably a lot more scarce like bandwidth (if not directly, then the power to operate the antenna probably draws more than the processor does).

I would have assumed that the telescope does very little processing on the image and simply sends the raw data down to be processed on the ground.

Most people learn at quite a young age that the word 'better' doesn't really mean anything on its own. Better at what? Better at supporting non-RGB colour spaces? Better at supporting RGB with more than 8 bits per colour, or even floating-point values? Storing multiple images in a single file? No, PNG supports none of these things that TIFF does. If you're creating computer graphics for UIs, websites, etc., PNG is probably a better choice, as that's more what it's designed for, but there are many other uses for storing images outside this scope that TIFF fits much better than PNG. As far as compression's concerned, PNG supports DEFLATE, which existed before PNG did, and the same goes for TIFF and its supported LZW compression (not that there's anything stopping you compressing either with either).
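That last point is easy to demonstrate: DEFLATE is a general-purpose codec that exists independently of PNG, and you can apply it to any pixel buffer yourself. A toy sketch using Python's stdlib zlib (the same DEFLATE that PNG wraps), with made-up gradient data standing in for an image:

```python
import zlib

# Fake "image": 64x64 grayscale gradient, one byte per pixel.
raw = bytes((x + y) % 256 for y in range(64) for x in range(64))

# DEFLATE is just a general-purpose codec; PNG happens to wrap it,
# but nothing stops you applying it to any raw pixel buffer.
compressed = zlib.compress(raw, level=9)
restored = zlib.decompress(compressed)

assert restored == raw          # lossless round trip
print(len(raw), len(compressed))
```

Structured data like this compresses well; the point is only that the codec and the container format are separate things.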

Hubble's not gonna be wasting its precious CPU time running calculations for scientists on Earth; they can do that themselves here on much faster processors, rather than divide up processor time onboard a satellite. Hubble will, however, need processing power for alignment: controlling rocket burns to get it pointing the right way, controlling motors to position mirrors, that kinda stuff, which doesn't need huge amounts of processing power. Just a decent, real-time, predictable core + software, without things like FDIV bugs or the huge amounts of heat that Pentiums and later give off.

Anything they can process onboard or compress better for sending back down to us would cut down on things that are probably a lot more scarce like bandwidth (if not directly, then the power to operate the antenna probably draws more than the processor does).

That's really not the case. Being so close to the earth, Hubble can broadcast with tiny amounts of power (far less than to run a CPU) and NASA's gigantic 65 meter dishes can pick up the faint signal very easily. Radio power consumption becomes a notable issue only with substantial distances from the earth, as it has with Voyager I/II.

Bandwidth is certainly not scarce for such applications; this is very low-power, highly directional, line-of-sight communication...
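A rough sense of scale, using the fact that free-space path loss grows with distance squared. The distances below are my own rough assumptions (Hubble at roughly 550 km altitude; Voyager 1 around 1.6e10 km from Earth at the time), not figures from the thread:

```python
import math

# Signal power falls off as 1/d^2, so the extra path loss between two
# distances is 20*log10(d2/d1) in decibels.
d_hubble = 550e3        # metres (assumed low-Earth-orbit altitude)
d_voyager = 1.6e13      # metres (assumed Voyager 1 distance, ~107 AU)

extra_loss_db = 20 * math.log10(d_voyager / d_hubble)
print(f"Voyager's signal arrives ~{extra_loss_db:.0f} dB weaker than Hubble's")
```

Roughly 150 dB of extra path loss, which is why radio power budgets bite at interplanetary distances but barely matter in low Earth orbit.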

With all the upgrades to other important parts of the system, like RAM and the system bus, in some ways it is a completely different architecture. I'm sure if you tried to load Vista onto an old, average 486 box (assuming it's possible), you might agree.

Back in 1993, I think, I replaced my 486SX 33MHz with a 486DX 66MHz, and I remember paying $600 for the thing!

Hmm, I think I paid about $1400 for a system based on an AMD 486DX 40MHz chip, in 1993 or 1994. The DX2-66 was the screamer, and I couldn't quite justify the price premium, even for all the added speed. I also helped the engineers I interned with set up a network in early 1994, and the server was a DX2-66 with something crazy like 32MB RAM. The 486DX-40 sure beat the pants off my parents' 386SX-1...

I sent my Amiga 500 into orbit in 2001 using a homemade trebuchet (granted, quite a large one) and a very high mountain. It broadcasts the Pinball Dreams high score list every two hours on the hour. The problem is, the last time I went up to do some improvements (long story) I had forgotten a few vital 68000 assembler directives, so I was unable to make the transition from antiquated late-80s desktop computer to cutting-edge ASAT weapon. Too bad, now the 10kT warhead I attached to it is probably just sitting there, twiddling its sub-critical materials.

I'll tell ya, I wouldn't mind unloading this thing; it's a bitch loading and saving my CV from/to cassette these days - it's difficult to find cassettes! It takes 15 minutes to load the word processor I found in COMPUTE magazine back in 1982, another 15 minutes to load/save the CV, AND it's even more fun printing to the Timex/Sinclair 2040 roller-tape thermal printer. But it makes a really great server since it can't be hacked, and moreover, it uses very little energy! I just creatively tape two rolls of thermal paper onto an 8.5" x 11" sheet and make a Xerox of the CV - fools most experts into thinking I did this with MS Office or OpenOffice! When they hear how I did everything, I've clinched the JOB!

"Many of NASA's long-running missions rely on antiquated systems -- the Voyager probes each have about 32k of memory -- but the scientists say they can manage."

It would be nice if the submitter would add a proposed remedy, like simply sending a service probe out to add some more RAM.

Oh, wait.

Well, I guess when they send a space probe out into the furthest reaches of the solar system, most scientists would expect that they will have to deal with whatever hardware was on board at the time of the launch for the duration of the mission.

It's amazing what can be done in such a small space by a good programmer. Most systems today are so encumbered by having been built with toolkits built on toolkits built on metalanguages ad nauseam that a simple "hello world" program can now run to hundreds of K of memory.

My compliments to the programmers who still know how to get the most out of the little resources they're working with on these scientific probes.

That's the point. Does it need to run Vista? I think not. I have a box with a 486 in it, and it still does what it was supposed to do. (Yes, Linux.)
I doubt there are any NASA engineers lusting for a dual-core whoopie-doo. They just want their backup to come alive after all these years.
The original deserves a medal for service above and beyond, and a pension. Perhaps it could run for president.

Run Vista? Can you just imagine what the Hubble Telescope's desktop picture of the second is?

I had a box with a Pentium OverDrive in it, and got tired of seeing the F00F bug patch every six months when it needed to be rebooted, so I pulled the POD and put in a DX-66, and it still worked the same. (It did go from 2% CPU utilization to 3% CPU utilization.)

The original programmers deserve medals. They did NOT use Windows/MS buggy crap.

Part of the trouble NASA is encountering while fixing the Hubble Space Telescope comes from the fact that it's been up there for nearly two decades, and therefore carries computer systems long outdated here on Earth.

Which "part" of the "troubles" and according to who?

Only the Popular Mechanics article even SUGGESTS that age and technological obsolescence might (maybe-sorta-kinda slightly) contribute: "But perhaps finding a few problems should come as no surprise--not only have Hubble's backup systems sat idle ..."

This is a bullshit article. Unfortunately, that has become the norm for Popular Mechanics.

The Intel 486 is hardly some arcane CPU that's so old that nobody knows how to program it. Anybody who can write assembly for modern PCs can write assembly for the 486. And anybody who wants to write in a higher-level language can -- because all the 486 development tools are still easily available.

If you read the article, you'll find that it presents no evidence whatsoever for its assertion that the Hubble's use of a 486 makes it harder to repair. In fact, it reads more like, "The Hubble has a 486, and damn that seems outdated to me! Maybe that's why it's so hard to fix!" Really, that's about the level of the 'logical' argument that you'll find in the article.

Yep, and it's shielded and certified for space use. The Space Shuttle has a few too. So do some of the Mars rovers, IIRC.

I don't know what the author expects. Some big Hollywood-esque GUI controlling the Hubble? I think a typical desktop user (like the author) would be shocked at how little power embedded systems really need.

The 486 was officially the only space-rated hardware for a very long time. The problem is that when you create a smaller transistor, it becomes far more sensitive to ionizing radiation... the older the die, the larger the features, and thus the less likely to be affected by radiation. More "modern" processors require more shielding.

They need to have the chips hardened for radiation. I'm not sure what the process entails, but they don't seem to do it with chips younger than 10 years or so. /. did a pretty good article on this a while back, I think.

They need to have the chips hardened for radiation. I'm not sure what the process entails

I would hope it involves putting everything in a radiation-shielded box. I could see how smaller chip architectures might be more susceptible to radiation, but a decade is enough time to figure that out and use exterior shielding instead of hardening. Sure, that might be much more difficult, but if you can't handle difficult, don't work at NASA. Of course, with a Hubble-sized budget, there is no excuse for not having ...

I actually don't think you can realistically shield effectively against some types of high-energy particles. Nuclear reactors use 6 ft of concrete to shield against neutrons. There are higher-energy particles than neutrons in space.
I'm sure that external shielding plays a large role in it, but there's probably more to it. The Wikipedia article on radiation hardening is actually very good. http://en.wikipedia.org/w/index.php?title=Radiation_hardening&oldid=235697687 [wikipedia.org]

One of the reasons particles like neutrons are hard to stop is that they have no charge and don't react with the electromagnetic fields that bind matter together. You basically need a collision between the neutron and an atomic nucleus to stop it.

A particle that doesn't interact electromagnetically, however, is (if I'm not mistaken) less likely to interfere with electronic equipment. Which is not to say hard-to-stop radiation like neutron radiation does no damage at all, but I'd be curious to know whether it's a concern at all for satellites.

There are much better neutron shields, but they are very exotic and expensive: borated polyethylene, or any other material with large numbers of hydrogen atoms present (water being one of the better ones) to slow the neutrons, plus absorbers like cadmium or hafnium.

Definitely; the point being that you get the most scattering in collisions between objects of nearly equal mass. The closest mass to a neutron among neutral particles is a hydrogen atom (with its approximately one electron associated with one proton), and so the important oil-well petrophysical measurements ...

At $10,000 a pound to launch on the shuttle, weight reduction is most important. Sending up lead computer cases because hardening a processor is hard is not an option when plastic weighs several pounds less.

Also, up until 3-4 years ago the Hubble was going to be shut down in the next year or two, and its mission was only extended later. Unlike the Mars rovers, the Hubble's life won't magically extend.

Why bother with heavy shielding when you can just make the transistors big enough not to be flippable by single stray particles? Thick shielding might prevent 99.999% of dangerous bit-flipping radiation from getting through, but what about that last tiny bit? You're going to need extra circuitry to detect errors in the processor's circuitry... and everything starts getting more complicated, and you end up back where you started. In space, simpler is better.

Or simply have a dozen chips doing all the calculations in lockstep, and then take the median as the result for all numbers. If a shower of neutrons screws that up, it will likely screw up any shielding anyone comes up with anyway.
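The lockstep-and-vote idea above is basically modular redundancy with a majority/median voter. A toy Python sketch (nothing to do with Hubble's actual electronics; the function names are made up) that injects a single bit flip into one replica and shows the voter masking it:

```python
import random

def replica_compute(x, flip_bit=None):
    """One replica of the computation; optionally inject a bit flip
    (a stand-in for a radiation-induced single-event upset)."""
    result = x * x + 1  # the "real" computation
    if flip_bit is not None:
        result ^= 1 << flip_bit  # corrupt one bit of the result
    return result

def voted_compute(x, n_replicas=3):
    """Run the computation on several replicas in lockstep and take
    the median, so a single corrupted replica is outvoted."""
    results = []
    for i in range(n_replicas):
        # Corrupt exactly one replica to simulate a single upset.
        flip = random.randrange(16) if i == 0 else None
        results.append(replica_compute(x, flip))
    results.sort()
    return results[len(results) // 2]  # median of the replicas

# The corrupted replica is always outvoted by the two clean ones.
assert voted_compute(12) == 12 * 12 + 1
```

The catch the comment alludes to: a wide enough particle shower can upset several replicas at once, which is why voting is combined with error-correcting memory and watchdogs rather than relied on alone.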

Actually, some sorts of shielding make things worse. Moderate amounts of shielding just end up providing targets for the really high energy particles, which releases a big cloud of moderate energy particles on impact. The secondary radiation is both more abundant and more likely to interact with the stuff on the inside, and so causes a bigger problem. For space applications, there are intermediate amounts of shielding that will actually *increase* the total dose. (This is the case for cosmic rays, not solar flares; the latter can be fairly effectively shielded against, but is frequently less of a concern.) If you're not willing to put *large* amounts of mass around the thing to be shielded, it's often impossible to improve things all that much.

Hardening often consists of simple changes that are nonetheless expensive because they involve changes to the whole production line -- things like rating all the transistors for a noticeably higher voltage, to reduce the likelihood of a radiation-induced latchup event. As chip voltages get lower, this gets harder. Other changes include things like using isotopically pure boron in your dopants -- boron comes in two common isotopes, 10B and 11B. 11B is relatively immune to cosmic radiation, but 10B will fission when hit -- releasing secondary ionizing particles that cause a much greater problem than the cosmic ray by itself would. So rad-hard chips end up made with (expensive) depleted boron.

Combine these, and you see why it's difficult to find a decent selection of rad-hard chips, and also why an up-to-date radiation hardened CPU can cost over $100k each -- and also why you nonetheless need them, and can't really substitute anything short of a few tons of shielding.

What you want may well be impossible. There are no magical materials right now to do what you want. Cosmic rays in the TeV range can't be stopped with a box that could be affordably launched, much less fit into the satellite. It's easier to use chips that are designed to handle them.

NASA already has a backup computer, on which there are two independent circuits to do the same thing. Side "B" is handling things right now, after side "A" quit working.

NASA is putting the last of their spare parts on the Hubble right now, after which there are no more, short of restarting production, which isn't going to happen affordably. They made a lot of replacement parts, which were gradually used up over the servicing missions.

Why use a heavy metal box to stop the cosmic rays or solar flare protons? They are both positively charged. Just put a positive charge around the computer box, and negative charge around a few "lightning rods" a few feet away and let magnetic forces do the rest. You don't have to stop the high energy particles, you just have to convince them to miss the few square inches of delicate electronics. Launch weight radiation shielding is something that NASA is going to have to tackle soon enough anyway if we ever want to leave our magnetosphere for more than about a week. Why not test it on a modern Hubble CPU, while keeping the remaining legacy chip as a back up?

Young man, in this forum we respect the laws of physics.

Go and find out how strong a magnetic field is required to deflect a proton with 1GeV of kinetic energy by 1 cm over a distance of, say, 2 m. Since you're obviously technically literate, that shouldn't be too difficult.

Hint: the answer is, "An impractically strong field is required, by a couple of orders of magnitude." Ever wondered why CERN uses helium-cooled magnets which weigh tens of tons in their beamline?
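For anyone who wants to check the arithmetic, here's a back-of-the-envelope sketch using the standard magnetic-rigidity relation B[T] = p[GeV/c] / (0.2998 * r[m]) and the small-angle deflection y = L^2 / (2r), with the 1 GeV / 1 cm / 2 m numbers posed above:

```python
import math

# Deflect a proton of 1 GeV kinetic energy by 1 cm over a 2 m path.
m_p = 0.9383                    # proton rest mass, GeV/c^2
T = 1.0                         # kinetic energy, GeV
E = T + m_p                     # total energy, GeV
p = math.sqrt(E**2 - m_p**2)    # momentum, GeV/c (~1.70)

L = 2.0                         # length of the field region, m
y = 0.01                        # required lateral deflection, m
r = L**2 / (2 * y)              # implied bending radius, m (= 200 m)
B = p / (0.2998 * r)            # required field, tesla

print(f"required field ~ {B * 1000:.0f} mT over the whole 2 m path")
```

Note this is the field needed along one straight 2 m path; surrounding the electronics against particles arriving from every direction is a far heavier engineering problem, and no field at all deflects the neutral particles.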

External shielding is often a bad idea for space hardware. The shielding is heavy (as it must be to stop particles), and itself becomes radioactive over time as it is exposed to wonderful effects like gamma rays. Here on Earth we get free shielding from miles of atmosphere, and we get even more shielding from solar radiation half of the day. (The technical term for this is 'night-time.') That Earth shielding also gets rid of a lot of the more intense interstellar radiation as well.

Well, they also have to do a test of it before most outfits will try the chip out on a serious mission. NASA isn't about to assume that a given chip really is ready for the radiation environment around the Earth when it comes to a major project like Hubble until it's been demonstrated on a less-expensive satellite. So you have to find someone willing to fly the beast and verify that it's OK for whatever duration people require to feel safe. That adds more time onto the turn-around.

That statement about it taking '10 years or so' to get a space-suitable chip has to be false. How else do you explain the 486 chip up there? Unless it was 'upgraded' from God knows what back in the late '90s.

They use what are called rad-hard devices: basically the same technology to make the device, but the substrate is shielded with a silicon-oxide layer. The difficulty lies in getting the layer a few nanometers from the surface without busting the silicon crystal. You usually make a lot and get a few that work.
The big issue with making a substrate is isolating the substrate from the actual device, so that when radiation hits the chip, it doesn't flip your zeros to ones. Quite a number of military chips are ...

The little snag with radiation hardening, if I'm not wrong, is that it multiplies the price tag by somewhere between 100x and 1000x.

Also, the physical silicon die is larger; and while here on Earth smaller is better (less heat, etc.), up there it's the other way around. The larger the features, the less (signal) damage a cosmic ray will do.

I would imagine it's a little more difficult than simply popping out the CPU and putting in a new one. If you were tasked with upgrading a 486 here on Earth, how many components do you think you'd be able to recycle into the new machine? You'd end up replacing the whole thing, maybe keeping the HDD around just long enough to get your data off it.

I'd keep um... Nothing. I wouldn't even bother with the case. Everything is using ISA cards, CD-ROMs had to be supported through the Sound Blaster Sound Cards and the Power Supply is outdated for the connections to the motherboard. (I'm using a semi-working 486 I have for reference for this)

I'd imagine you are correct - but it does raise the issue of whether future space tech could be designed to be upgraded. It's a pretty trivial task to swap components in PCs these days - why not make the telescopes, etc., of the future more plug-and-play? I could almost imagine an automated service vehicle carrying out an upgrade.

I work for a rather large international retailer, in the US division's in-store IT department.

We have a good number of registers that are 486 based, although we're encouraging the stores to get rid of them.

The published minimum hardware spec for the last software release of our dominant POS application was a Pentium (not P-II, P-III, etc.) with 8M of memory. The reality is that it will run on the 486 register systems too, we just won't support it if it starts acting odd.

Actually, the P5 OverDrives had the F00F bug. You need "reliable"-type tech.

You do not need a significant increase in computational power. You need to increase reliability. If your OS goes bad, just re-read the whole thing from ROM. If a large portion of the program/OS is in ROM, you don't need a lot of RAM, just enough to store variables.
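The re-read-it-from-ROM idea can be sketched as a checksum watchdog: keep the known-good image in unchanging ROM, run from a RAM copy, and reload when corruption shows up. A toy illustration, not anything from Hubble's actual flight software (the names and sizes are made up):

```python
import zlib

# Pretend program image held in ROM, plus its known-good checksum.
ROM_IMAGE = bytes(range(256)) * 16
ROM_CRC = zlib.crc32(ROM_IMAGE)

ram = bytearray(ROM_IMAGE)        # working copy executed from RAM

def watchdog_check(ram):
    """Reload RAM from ROM if its checksum no longer matches."""
    if zlib.crc32(bytes(ram)) != ROM_CRC:
        ram[:] = ROM_IMAGE        # recover from the good copy
        return True               # a reload happened
    return False

ram[100] ^= 0x40                  # simulate a radiation-induced bit flip
assert watchdog_check(ram) is True
assert bytes(ram) == ROM_IMAGE    # image restored
```

Variables then live in a small scratch area that's either checkpointed or cheap to recompute, which is what keeps the RAM requirement low.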

Most rad-hardened CPUs are RISC (PowerPC, SPARC); there are very few options for x86-based rad-hardened CPUs. Mil-spec-wise, Intel is doing well with their newer stuff (dual-core, etc.), but none of it has made it to the rad-hardened world yet.
The RAD750 [baesystems.com] is pretty much state of the art, running at 166MHz.

Replacing an old 486 with one of these would require rewriting / recompiling all the code running on it. Probably not enough of a performance gain in relation to the cost / risk of basically rewriting the ...

IIRC, the 486 was chosen specifically for the physical size of the data paths? Or the dies that cast the chips themselves? Either way, they were large enough that passing radiation would be less likely to corrupt data than it would on the newer, smaller Pentium-based chips.

A quote from the famous "Real programmers don't use Pascal" article written in 1983.
Some of the most awesome Real Programmers of all work at the Jet Propulsion Laboratory in California. Many of them know the entire operating system of the Pioneer and Voyager spacecraft by heart. With a combination of large ground-based Fortran programs and small spacecraft-based assembly language programs, they are able to do incredible feats of navigation and improvisation-- hitting ten-kilometer wide windows at Saturn after six years in space, repairing or bypassing damaged sensor platforms, radios, and batteries. Allegedly, one Real Programmer managed to tuck a pattern matching program into a few hundred bytes of unused memory in a Voyager spacecraft that searched for, located, and photographed a new moon of Jupiter.
The current plan for the Galileo spacecraft is to use a gravity assist trajectory past Mars on the way to Jupiter. This trajectory passes within 80 +/- 3 kilometers of the surface of Mars. Nobody is going to trust a Pascal program (or Pascal programmer) for navigation to these tolerances.
If you have never read it, it's still a great read (at least for us old-timers).
http://www.pbm.com/~lindahl/real.programmers.html [pbm.com]

You and your fancy floppies. I remember having to cobble together media with Saran wrap, iron filings, a Quaker Oats container top, and a hot glue gun. Then I had to repeatedly rub my feet on the carpet and zap spots to lay down the formatting data.

Assembler? Bah. Us Real Programmers use a floppy diskette, a needle and a horseshoe magnet.

Bloody kids and their magnetic media. Some of us have used easily repaired, human-readable punched cards (IBM 360), which never seemed to have hanging chads. Then there was good old paper tape (PDP-8).

Paper tape needed repairing more often than the punched cards. There was always some good sticky tape and a couple of round hole punches available to repair breaks in the paper tape. Breaks tended to occur a few times per furlong when reading a freshly written tape, but were very rare when writing. Breaks ...

From reading the article, it didn't sound like they could do upgrades even if they wanted to (although I suppose they probably could salvage the mirror and build a new system around it). That actually surprises me a bit, since they knew this would be a long-running mission and it is within range to be worked on. I know these days, as a computer engineer, my bosses are always telling me to design for the future with upgrades in mind, but maybe that wasn't the thinking back then.

I had the opportunity to work on a radiation-hardened computer for a satellite. This was in the late '90s, and it was probably much like the Hubble telescope's processor. We went through endless simulations and scenarios to try to second-guess every conceivable error or fault. Bit flips were easily handled, but double-bit failures were not. When we had double-bit flips we had to reload all the software and data anew.

The equipment we used to design and support the onboard system was modern (for that time) ...

I'm willing to bet the algorithm timing was based on how long instructions took to execute, and not on an outside clock. In other words, a change in the execution time, not just the clock speed, will mess up the software.
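A concrete version of that worry is the classic calibrated busy-wait: a delay loop tuned by instruction count for one chip. The cycle count and clock speeds below are made-up round numbers, just to show the arithmetic:

```python
# A busy-wait delay assumes a fixed cost per empty loop iteration.
CYCLES_PER_ITERATION = 4            # assumed cost of one loop pass

def delay_seconds(iterations, clock_hz):
    """Wall-clock time an instruction-counted loop actually takes."""
    return iterations * CYCLES_PER_ITERATION / clock_hz

# Loop count calibrated once for a 25 MHz 486 to give a 10 ms delay...
iters = int(0.010 * 25e6 / CYCLES_PER_ITERATION)

# ...silently shrinks to under 4 ms on a 66 MHz part, and drifts
# further on any chip where per-instruction timing changes too.
print(delay_seconds(iters, 25e6))   # -> 0.01
print(delay_seconds(iters, 66e6))   # -> ~0.0038
```

That's why swapping in a faster CPU can break software that "worked fine" for years: the timing assumptions are baked into the code, not read from a clock.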