Posted
by
timothy
on Saturday February 11, 2012 @03:40PM
from the disintegrated dept.

acadiel writes "Matthew H from the AtariAge.com TI-99/4A forum has finalized a design of a TMS 9918A replacement (with VGA out) for classic computer systems such as the ColecoVision, TI-99/4A, SpectraVision, MSX1, SpectraVision 128, and Tomy Tutor Home computers. This hardware project replaces the native video controller on these classic systems and enables them to have VGA output for the first time." (It's just under $100 to order one.)

Except that you're still upconverting a signal from 240p to 480p. By going directly to VGA you're at least getting a crisp 480p image (i.e. 640x480). And no, doing this after the signal has been produced at the composite outputs is not going to be as pretty.

And no, doing this after the signal has been produced at the composite outputs is not going to be as pretty.

Unless you're using a program that relies on the artifacts in a particular video chip's composite output. The NES PPU's architecture was heavily inspired by the TMS9918, and I know a lot of NES games rely on interactions between luma and chroma to give the backgrounds more texture.

And in fact, NES artifact colors have been emulated for a long time. The key is to emulate the way the original chip generated a composite video signal and then emulate a TV by decoding that back to RGB. But then that's the same thing as using the original chip with a composite-to-VGA adapter, unless of course you want to add the original CPU and I/O and put the whole console on the FPGA.
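For what it's worth, the encode-then-decode approach is simple enough to sketch. Here's a toy model in Python (the one-subcarrier-cycle-per-pixel ratio and 8x oversampling are simplifying assumptions, not any real chip's timing) that modulates each pixel's chroma onto a carrier and then demodulates it back, TV-style:

```python
import math

SAMPLES_PER_PIXEL = 8     # composite samples per pixel (assumption)
CYCLES_PER_PIXEL = 1.0    # chroma subcarrier cycles per pixel (assumption)

def encode_line(pixels):
    """Encode (luma, phase, saturation) pixels into composite samples."""
    out = []
    for x, (luma, phase, sat) in enumerate(pixels):
        for s in range(SAMPLES_PER_PIXEL):
            t = (x + s / SAMPLES_PER_PIXEL) * CYCLES_PER_PIXEL
            out.append(luma + sat * math.sin(2 * math.pi * t + phase))
    return out

def decode_line(samples, n_pixels):
    """TV side: average for luma, quadrature demodulation for chroma."""
    per = SAMPLES_PER_PIXEL
    decoded = []
    for x in range(n_pixels):
        window = samples[x * per:(x + 1) * per]
        luma = sum(window) / per
        i = q = 0.0
        for s, v in enumerate(window):
            t = (x + s / per) * CYCLES_PER_PIXEL
            i += (v - luma) * math.cos(2 * math.pi * t)
            q += (v - luma) * math.sin(2 * math.pi * t)
        sat = math.hypot(2 * i / per, 2 * q / per)
        phase = math.atan2(2 * i / per, 2 * q / per)
        decoded.append((luma, phase, sat))
    return decoded
```

Artifact colors fall out of exactly this machinery: when high-frequency luma detail lands in the chroma band, the demodulator "recovers" color that was never deliberately put there.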

Putting the whole console on an FPGA seems like the best option, to be honest. If you're emulating the video chip, which is probably really poorly documented, why not emulate the CPU too? The Z80 and 6502 are really well understood by reverse engineers these days. There was a CCC talk about a transistor-level simulation of the 6502 which is absolutely perfect. The idea was that if you find some weird code that works on a real 6502 but fails on your emulator, you can work out what you're missing.

Composite video did have some neat hacks that are more difficult to reproduce today, and there are emulators still trying to match the composite look. First, the low DPI of the TV made low resolutions less pixelated: all the pixels had rounded corners, making low-res images look softer. Another cool hack was using particular color offset combinations to create colors beyond what the video card could officially support.

Best video output possible? That sounds an awful lot like marketing department FUD unless the source resolution of the video signal is actually being increased. I'm sure it might be interesting for a few enthusiasts, but it seems awfully hard to justify the $100 price tag just for a format shift.

It's not a question of resolution or format shift, but of color. The chroma channel of composite NTSC video gets smeared all over the place by design, along with other issues (such as dot-crawl) that come with bad interlacing implementations -- especially with composite sync.

S-Video would help some of these problems without increasing resolution, but VGA does not have these problems at all, which is what makes it useful.

To be clear: it's into the realm of esoterica. But in this area of stuff, we also have

You're completely, absolutely, out of your mind deluded. Sorry. At work I use some test instruments, made by Tektronix and HP, where the date codes on the chips are all in the 70s. They work beautifully, and I regularly "hack" on them. They are anywhere between 30 and 40 years old at this point. There's nothing fussy and temperamental about those systems, and some of them are so complex that a consumer-grade microcomputer or game console can't hold a candle to them. I'd say that all of the consumer systems this chip replacement would go into are comparably simple. If you really would have a problem with them, then it's your problem, not a general one. If you want complex, take any modern PC and try replacing a BGA chip in it. I'd take a 30 year old piece of gear any day; I could probably do chip swaps in those blindfolded.

Probably the biggest problem is going to be all the old school electrolytic capacitors. I know my TI-99/4A is a bit flaky, and I suspect that's why. The VDP was running at the edge of process technology in those days (5.37MHz!) and it wants nice, clean clocks and nice clean supply rails. The rest of the machine runs a fair bit slower, with possible exception of the 256 byte SRAM that the TMS9900 CPU stores its "registers" in.

Thankfully, those big old electrolytic cans are easy to spot and easy to solder in replacements for.

Electrolytic cans are one thing; another problem is probably the abysmal power distribution (long, winding tracks) and poor decoupling techniques. A 5MHz "clean" clock has useful harmonics up to about 50MHz, so a decoupling cap that's far away from all the circuit points where the clock goes/comes from may be useless -- and you have to measure along the traces, not as the crow flies. When you design with general purpose electrolytic caps, you pretty much assume that for digital logic decoupling purposes they are

Electrolytic capacitors only fail very rarely. What you find is that disc ceramics fail and pull the +5V rails down (they go leaky, and there are dozens of them used to decouple supply rails), and tantalum beads just plain explode when they fail.

You might find that very very old electrolytics have dried out a bit and lost capacitance, but that should be obvious because the supply rail will ripple badly. Leaky disc ceramics will get warm.

I'm sorry you apparently fail at reading comprehension, so allow me to break it down. What you have was NOT consumer crap; what you have was designed for business and engineering, which got waaaaay better quality parts. The caps and chips used in CONSUMER grade crap were, then as now, simply not up to the quality of professional instruments, which is why we have workstations and desktops, with the desktops having significantly lower quality caps, PSUs, fans, etc.

We're talking about stuff built for kids in the early 80s, and you are talking about HP back when they were THE scientific brand. I'm sorry, pal, but you couldn't be any more off base if you actually tried, and those that marked you interesting obviously don't know how big a quality difference there was between HP and brands like Coleco, whose other claim to fame was fricking Cabbage Patch dolls.

I have an Atari 400 made in 1980, an Atari 600XL made in '83 and a 130XE made in '85, along with a desktop DEC MicroVAX of mid-80s vintage. All still work just fine with no flakiness. Clean the cartridge ports and take care of them and they'll outlive you. The 8-bits may have been low cost but they were QUITE well engineered. At least the Atari machines and Commodores were great; not sure about TI or Coleco.

They made much higher quality caps then, and used lead solder and much thicker PCBs with more copper.

Well, we know now that the caps from the late 90s through the mid 00s were crap due to industrial espionage, where they stole only part of the formula and boned the recipe. But Atari before the Warner buyout (and for a year afterwards as they cleared out existing stock) frankly wasn't consumer grade quality; they actually put care and love into those. Compare them to someone like Coleco or Mattel, who were toy companies that simply jumped on the electronics bandwagon, and it's night and day. With the Coleco you had

Actually the original PS1 isn't so bad, neither is the Dreamcast. I've worked on both. There's actually a fairly common issue with controller ports failing on the Dreamcast that's relatively easy to fix. The XBox and the 360 are nightmares.

It is quite possible to repair newer electronics using modern techniques, with hot air reflow soldering and surface mount components. The techniques are different, and with silver solder the melting point is closer to the failure point of the components, but it is still quite doable.

While it is true that the Pi could probably make a hell of an HTPC, frankly there just isn't that much more you can do with the thing, not unless you are an expert coder with intimate knowledge of assembly so you can squeeze every last drop of performance out of the chips. With that Athlon I can make an HTPC, nettop, file server, jukebox, firewall, workstation; hell, depending on the board I can even slap a good cheap PCIe card into it and make it a render box for video transcoding, because of the generic nature of the platform.

For the record, I have a 12 year old 450 MHz PIII Dell desktop that continues to do duty as a network monitor, with 100% uptime except for reboots for software updates (and the fact that it won't boot unless a keyboard is plugged into it).

I could replace it, but why? It's been perfect so far, and recently took an upgrade to 32 bit RHEL6 without a hitch.

Usually it's not about parts, but about the quality of the design of the circuit (and parts, too, of course). It takes a whole lot of money to open a chip fab; no one will decide a priori, "hey, we're making a fab for poor quality chips!" A, say, 6502 or Z80 CPU used in one of those "consumer crap" devices is not graded for consumer crap (except for temperature range, and they avoid the use of the term "consumer crap"); it's the same one that went, at the time, into industrial and T&M equipment.

Let me see if I have this straight; feel free to correct me if I'm wrong. You first have to have a working one of these machines

Yes. Personally, I have three...

we are talking consumer level quality of the early 80s

Yes. One of mine has no color output, only B&W, because the 9918 is fried... Thus, I intend to get one of these chips and make that one into a VGA-out unit. Neat! :)

The chip is even socketed from the factory on the 99/4A (unlike the monster double-wide 64-pin DIP CPU), so you don't even need to desolder it. If I remember right there's even a little aluminum heatsink on it attaching it to the chassis. It really was a pretty powerful little computer.

"Do you know how damned fussy and temperamental some of these machines were to start with?"

No, I don't Hairyfeet. Why don't you enlighten us? Name a few, and what was fussy and temperamental about them? And also please, define precisely what you meant by those terms. The computer not responding when you talk into the mouse doesn't count.

Please consider that just thirty-odd years ago, one could own a computer that wasn't the university's or corporation's. Whether one came fresh to it or from mainframe milieu, there was an immediacy, a power, a whole new realm of discovery. One no longer had to submit their deck of cards to an acolyte to the high priests of a Burroughs or CDC Behemoth only to get back a core dump due to an errant comma. Some, even now, for reasons of nostalgia or fun, continue their interest and enthusiasm - vibrant 8-bit micro communities are but a search away.

The TI-99/4A offered, amongst other things, 16 sprites with built-in collision detection. At the time this was nigh magical. Sprites were effectively independent of screen - they were a 'floating' layer above it and allowed for some interesting game and simulation possibilities. SCREEN itself was a defined device; one could PEEK and POKE 'most anywhere, and PUT and GET to any device. An entire screen could be represented with a string in memory, its contents readily changed on the fly. One could read data for a string from a DATA statement in program code or from (eventually) floppy; with several strings screen-swapping, almost animation, could be done. Graphics could accompany text adventures. Add sprites? Oh, my. And now with VGA?

Actually, it was 32 sprites, with a limit of 4 to a line. It had collision detection but it was rarely useful. It had a single bit to tell you that any sprite hit any other sprite. To figure out what hit what, you'd have to walk the descriptor list and do the actual computation yourself. (Or, in the case of TI Extended BASIC, the interpreter had to do it for you.)
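To make the "do the actual computation yourself" part concrete, here's roughly what walking the list amounts to once the coincidence bit fires. This is a sketch in Python rather than 9900 assembly, and it only checks bounding boxes of assumed 8x8 unmagnified sprites; the real VDP's coincidence bit is pixel-exact:

```python
def find_collisions(sprites, size=8):
    """sprites: (y, x) pairs in sprite-attribute-table order.
    Returns the index pairs whose bounding boxes overlap -- the detail
    the VDP's single coincidence bit doesn't give you."""
    hits = []
    for a in range(len(sprites)):
        for b in range(a + 1, len(sprites)):
            ya, xa = sprites[a]
            yb, xb = sprites[b]
            if abs(ya - yb) < size and abs(xa - xb) < size:
                hits.append((a, b))
    return hits
```

On real hardware the expensive part isn't this loop; it's that each (y, x) pair has to be fetched from VDP RAM through the I/O ports first.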

On the TI-99/4A, that meant actually accessing VDP memory, since there wasn't much other RAM in the system. That itself was pretty slow, because it wasn't memory mapped for the CPU. You have to write to the VDP's address register, and then do repeated reads after it fetched the byte. Depending on the display mode, that could be as long as 8us during active display (Graphics II mode -- everybody's favorite "bitmap" mode.). Fortunately, the address pointer auto-incremented, so if you were accessing a contiguous structure like the sprite descriptor list, at least you didn't have to keep reloading the address.
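In emulator terms, that "pin hole" looks something like the following minimal Python model of the two-byte address setup and the auto-increment. The real chip's mode bits and read-ahead timing are deliberately omitted here; this is just the access pattern:

```python
class VDP:
    """Minimal sketch of the TMS9918A's port-based VRAM window."""

    def __init__(self, size=16 * 1024):
        self.vram = bytearray(size)
        self.addr = 0
        self.latch = None   # holds the first (low) byte of an address write

    def write_control(self, byte):
        if self.latch is None:
            self.latch = byte                      # low 8 bits of the address
        else:
            # second byte carries the high bits (top bits select read vs.
            # write setup on real hardware; ignored in this sketch)
            self.addr = ((byte & 0x3F) << 8) | self.latch
            self.latch = None

    def read_data(self):
        value = self.vram[self.addr]
        self.addr = (self.addr + 1) & 0x3FFF       # auto-increment
        return value

    def write_data(self, value):
        self.vram[self.addr] = value
        self.addr = (self.addr + 1) & 0x3FFF
```

The auto-increment is why walking a contiguous structure like the sprite list only costs one address setup, as described above.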

Not that TI Extended BASIC was necessarily able to do that, of course. (Read up on the abomination that was GPL. Not the license, but the interpreted language that much TI software was written in, including TI BASIC.) But if you wrote your own assembly code, you could make that optimization, which is probably how Parsec was able to do its soft-scrolling in the time allotted.

(Actually, VDP RAM isn't memory mapped on any platform that I know of. But other systems have CPU-addressable memory that you could store a shadow copy of data in at least. The paltry 256 bytes on the TI-99/4A, though, are far from enough in many cases.)

Thanks, not only for correcting my paltry memory, but also for the clarification and further exposition. Had I mod points, you'd get some.

It was my brother-in-law's TI; I didn't spend much time with it, rather my own Atari 800. On that, I really never got down into the hardware, but did manage a few things during vertical and horizontal blank interrupts. The more heavy-duty technical aspects of all that stuff, then and even more so now, are way over my head.

Actually, VDP RAM isn't memory mapped on any platform that I know of. But other systems have CPU-addressable memory that you could store a shadow copy of data in at least. The paltry 256 bytes on the TI-99/4A, though, are far from enough in many cases.

On the contrary, many 8-bits had memory mapped video: Commodore (VIC/64/Amiga), Sinclair (Spectrum, ZX81, ZX80, distributed by Timex in the US), Atari, the VT100... etc., etc. Not that there weren't machines with distinct video RAM; the Commodore PET had specific video memory, though it was still mapped into the address space like a modern PC video card. Having the video RAM in an inaccessible (I/O bus) location was rare.

The reason was simple, at that point in time only a relatively small amount of RAM was needed for the machines and it ended up being faster than the CPU. So much so that you could assign 50% of the RAM bandwidth to the video subsystem without impacting the speed of the processor at all.
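The arithmetic behind that is worth spelling out. With illustrative numbers (say a 1 MHz 6502 and ~300 ns DRAM; not any specific machine's figures), two full memory cycles fit inside one CPU clock period, so the video circuitry can take every other RAM slot for free:

```python
def interleave_is_free(dram_cycle_ns, cpu_clock_mhz):
    """True if CPU and video can alternate RAM cycles with no CPU stalls:
    two full memory cycles must fit in one CPU clock period."""
    cpu_period_ns = 1000.0 / cpu_clock_mhz
    return 2 * dram_cycle_ns <= cpu_period_ns
```

At 1 MHz the interleave works (600 ns fits in a 1000 ns period); crank the CPU to 4 MHz and it no longer does, which is essentially the situation the following paragraphs describe.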

The ZX81, however, was a bit of a foreshadowing of things to come. It was built really, really, cheaply and they used really cheap DRAM. This cheap RAM wasn't fast enough to feed both the CPU and the video at the same time so the CPU was basically turned off when the video was being displayed (In fact it was physically used as a counter chip by the ultra cheap video controller).

Nowadays people want astronomical quantities of RAM, so it basically has to be the cheapest design possible; this type of RAM can't keep up with just one CPU, let alone multiple CPUs and a video controller. So the video controller has to reduce the performance of the main CPUs by stealing cycles, or it gets its own RAM.

Note: there are several Intel video controllers for PC clones that use main memory as the video RAM; they get added as a cheap motherboard video controller. Because they're using slow RAM and stealing cycles from the CPU, these are rightfully seen as very low performance.

On the other side, a common mistake in designs with distinct video RAM is assuming the CPU only ever needs to write to this RAM; unfortunately, the problem is frequently only recognised in production.

Narishma already said it here [slashdot.org], but I was referring specifically to machines that used the TMS9918/28/29(A) VDP, often just referred to as "the VDP." So far as I know no system that uses the VDP was able to memory map and dynamically multiplex CPU and VDP access to the 4K or 16K of DRAM connected to it. And, that walled off access to the DRAM was a particular drawback for machines that used the VDP, which is why I pointed it out.

But that does make me wonder why people would be so enamoured with this chip. A video controller that can't (it appears) share its memory with some other DMA/CPU chip, and forces all accesses to its memory to be made through a 'pin hole' of a couple of I/O ports, really strikes me as a bit of a deal killer, whatever other features it might have.

Well, it did do a lot for you. And the large RAM with flexible descriptor tables meant that in practice, you could avoid doing too many writes over to the VDP most of the time. And, the separate dedicated-RAM architecture does guarantee no cycle stealing, unlike, say, the VIC and VIC-II chips in the Commodore computers, or the need to wait for horiz/vert refresh to avoid "sparkles" like the old CGAs.

Then just implement DVI-D over HDMI, so you don't have to bother with the DRM. Check out the Wikipedia article; I imagine there are some kinks, but HDMI is mostly backwards compatible with DVI-D.
And I have real world evidence as well: the Pandaboard does DVI-D for the very reasons you mentioned, and it works just fine with a couple of TVs I have tried. That said, it was picky on one monitor I tried, though that seemed to be a kernel issue, as one kernel would boot up on HDMI and the other on DVI, so it had nothing to do with it.

Just about every LCD TV that I've seen in (U.S.) stores has a VGA input. It might be the case that you live in Europe and your local TVs include a SCART port instead. I'd bet the actual video processor in such TVs can sync to both 480i SCART and 480p-1080p VGA.

Of all the chips, the one in the Commodore 128/128D is a pain to convert to anything modern, as it uses the old CGA/RGBI interface. All the CGA adapters I've found don't handle the intensity signal; they are more RGBA compatible.

But this is not a big problem -- there are dead-simple passive analog circuits (e.g. [google.com]) to do a passable conversion, and if you want to fix the dark-yellow/brown issue, that's not hard either.

RGBI signals in CGA are TTL, so converting to analog RGB is as simple as connecting them to the address lines of a suitable 8-bit PROM (or SRAM, in which case you'll want a battery to retain memory) programmed with appropriate RGB values, and three 2-bit DACs on the output (0 = 0V, 3 = 0.7V for VGA).
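The PROM contents themselves are easy to generate. A sketch (the IRGB bit ordering and the 2-bit output levels here are assumptions about the wiring; the classic brown fix is halving green on the dark-yellow entry):

```python
def rgbi_prom(fix_brown=True):
    """Build the 16-entry table a PROM would hold: 4-bit IRGB code in,
    2-bit-per-channel RGB out (0..3, where 3 maps to 0.7 V on VGA)."""
    table = []
    for code in range(16):
        i = (code >> 3) & 1   # intensity  -- assumed bit order: I R G B
        r = (code >> 2) & 1
        g = (code >> 1) & 1
        b = code & 1
        rgb = [channel * 2 + i for channel in (r, g, b)]
        if fix_brown and code == 0b0110:   # dark yellow -> brown: halve green
            rgb[1] = 1
        table.append(tuple(rgb))
    return table
```

Burn those six output bits into the PROM and the "programming" half of the conversion is done; the resistor-ladder DACs handle the rest.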

I believe I saw one that did that and did not have the brown problem, but it used a GAL chip, which most likely did the value change before the signal was sent to a DAC. It was way more expensive than the solution listed, which has more features, because it was such a niche device. This was 1998-99 and it was $350.

It's a composite to RGBI to VGA converter. ;) http://home.comcast.net/~kkrausnick/c128-vga/ [comcast.net] This is pretty nifty, and they have a workaround for the intensity problem. The price is now higher for the parts mentioned, about $190. One of the companies listed has a dead website; a DNS problem, maybe.

Most home computers have analog circuits too -- most notably for reading paddle controls, which FPGAs can't handle. And yes, folks do use paddles (this also applies to some mice, and mini tablets like the KoalaPad).

Even more surprising than that: There's an active TI-99/4A group? Really? Is Bill Cosby a member? That was my first home computer and so it'll always have that special place in my memories, but that thing wasn't very useful when it was still current. I can't imagine trying to do anything useful with it now.

Yes, it was... Real 16-bit, even, though only the stock 256 bytes of system RAM (yes people, 256 BYTES) is on the 16-bit bus, but it's SRAM and runs at full processor speed (like the L1 cache in today's processors)... I have an extra 32K of RAM installed on one of mine directly on the 16-bit system bus... That would have cost a fortune back then; now it's just a few chips from the junk box! :)

That was my first home computer and so it'll always have that special place in my memories, but that thing wasn't very useful when it was still current. I can't imagine trying to do anything useful with it now.

You must not have had a PEB... With the Peripheral Expansion Box the 99/4A was capable of similar performance to the early PCs.

Even more surprising than that: There's an active TI-99/4A group? Really?

Yes, there are those of us still active in various old computer projects, building IDE disk interfaces, etc. to allow easy use of these fun old boxen.... I've built my own CROM/GROM emulator for the TI (someone else designed an IDE interface but I haven't built one for myself yet) and I'm currently designing and building an IDE disk interface for the old WANG 2200 minicomputer line

OK, it's the same chip just with RGB output, though the Coleco didn't use RGB, which has really confused me. Most of the computers used the '18, which spits out composite, where RGB would be preferred; the Coleco used an RGB chip and summed it together into composite. Talk about ass backwards.

It's not really RGB output, but rather Y, R-Y and B-Y luma/color difference signals -- actually frightfully close to S-video. But I'm pretty sure they had an app note back in the day that showed how to sum those to get RGB almost trivially.
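The summing really is almost trivial: two straight adds, then solve the luma equation for green. A sketch using the standard NTSC luma weights (I don't have the original app note, so take the exact coefficients as an assumption about what it used):

```python
def color_diff_to_rgb(y, r_y, b_y):
    """Recover R, G, B from Y, R-Y, B-Y component outputs.
    Uses Y = 0.299 R + 0.587 G + 0.114 B (NTSC luma weights)."""
    r = y + r_y                              # (R - Y) + Y = R
    b = y + b_y                              # (B - Y) + Y = B
    g = (y - 0.299 * r - 0.114 * b) / 0.587  # solve the luma sum for G
    return r, g, b
```

In hardware that's two summing amplifiers and one weighted-difference stage, which is presumably why the app note called it trivial.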

The reason they went with the 9928A (and later 9128A) was to avoid the "rainbow effect" that is so prominent on the 9918A. See, the 9918A didn't flip the chroma carrier field-to-field, which leads to reinforcing chroma errors. That's also why you couldn't use the EXT VIDEO input on the 9918A to mix with arbitrary video sources (say, for a video overlay), but you could use it to daisy-chain VDPs to get more sprites and such.

You know, recycled stock of these things comes up all the time for less than 10 bucks.

Oh, I know... I just never bothered, because I have three 99/4As and it still works, just not in color, and it could be useful for parts if one of the ones I actually play with (one is original, one is modded) has a problem. Now if I buy one of these, it's $10 saved by not buying a 9918. :) I had always thought of getting one of the later Yamaha chips like the one used in the Geneve; you could do a little board for one of those with 128K or 192K of video RAM and do, whatever it was, 512x512 graphics... I don't have a Myarc so

The VBXE video board for Atari 8-bit XL and XE machines will do 15 kHz RGB and VGA out, and coexists with and extends the original video coprocessor chips (ANTIC and GTIA), providing a blitter and extending the color palette. Enhanced sprites too, and more stuff. The Atari graphics chipset was much more programmable and flexible than this thing, though every machine deserves to still have modern video output options. The Atari 8-bit is kinda like a baby Amiga in ways.

In Democracy in America (1840), Alexis de Tocqueville noted the rise of planned obsolescence in the United States:
"I accost an American sailor, and I inquire why the ships of his country are built so as to last but for a short time; he answers without hesitation that the art of navigation is every day making such rapid progress, that the finest vessel would become almost useless if it lasted beyond a certain number of years."

Uh, if you have ever used one of these shitty chi-co video samplers, you would find the video quality was worse than plugging the composite signal into the antenna input of a 1977 black and white portable.