I can't understand why anyone would want a memory mapped graphics engine, where the CPU is used to draw the bloody graphics. In the case of a 68k it certainly doesn't make a heap of sense, really.

A GPU, or Graphics Processing Unit, is most certainly the way to go ... and of course something that takes ALL graphics away from the CPU ... and basically puts the entire AES into a GPU, so that the CPU doesn't have to do jack shit except tell the graphics engine to "draw" and "fill", basically speaking. Drawing text should work the same way ...

I'd say both are required. Of course you're right, I also believe an external GPU and VRAM will be the way to go. Let's not forget that 1024x768 at 8bpp equals 768k of RAM. Double that for high colour, triple it for true colour.
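That arithmetic generalises to any mode; a trivial sketch in plain C (nothing Atari-specific assumed):

```c
/* Framebuffer size in bytes for a given mode. */
static unsigned long fb_bytes(unsigned long width, unsigned long height,
                              unsigned long bits_per_pixel)
{
    return width * height * bits_per_pixel / 8;
}

/* fb_bytes(1024, 768,  8) == 786432  ->  768k, as stated above
 * fb_bytes(1024, 768, 16) == 1572864 -> 1536k (high colour)
 * fb_bytes(1024, 768, 24) == 2359296 -> 2304k (true colour)  */
```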

But if you want backwards compatibility as well, if only to avoid swapping displays all the time, you would need to emulate the standard graphics modes too. And that requires RAM access.

Or maybe you could grab the original RGB signal and route it through the new graphics solution when in "original mode". If this involves line doubling, it could also remove the need for multiple displays (or converters).
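The doubling step itself is conceptually simple once a captured frame sits in a buffer; a minimal sketch, assuming the frame has already been grabbed into RAM (the buffer layout here is purely illustrative):

```c
#include <string.h>

/* Emit every captured scanline twice, so that e.g. a 15 kHz colour
 * picture gains enough lines to be shown on a 31 kHz (VGA-class)
 * display. src holds `lines` scanlines; dst must hold twice that. */
static void line_double(const unsigned char *src, unsigned char *dst,
                        unsigned bytes_per_line, unsigned lines)
{
    unsigned y;
    for (y = 0; y < lines; y++) {
        memcpy(dst + (2u * y)      * bytes_per_line,
               src + y * bytes_per_line, bytes_per_line);
        memcpy(dst + (2u * y + 1u) * bytes_per_line,
               src + y * bytes_per_line, bytes_per_line);
    }
}
```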

Ingo

“Perfection is achieved, not when there is nothing more to add, but when there is nothing left to take away.” - Antoine de Saint-Exupéry

oehansen wrote:I can't understand why anyone would want a memory mapped graphics engine, where the CPU is used to draw the bloody graphics. In the case of a 68k it certainly doesn't make a heap of sense, really.

If you want to use existing software then it certainly does make sense to use a memory mapped graphics engine. Yes, you can of course use a GPU; I have done so since 1997. But you can't use a GPU alone. Only applications that use nothing but VDI and AES calls will work, and that puts a lot of limitations on what the software can do. Even what appear to be clean GEM applications use sorcery and voodoo behind the scenes to be able to use non-palette bitmaps, because the VDI does not have the necessary features.

Realistically speaking, all new hardware will have to work with 20-year-old software, because no-one will create replacements for it.

Ah, a ROM port solution. I have had such an idea on my mind for many years: a simple, cheap (small FPGA + SRAM based) plug&play solution that would work on every Atari from the very first 260ST to the TT and F030. Even the driver could be loaded directly from the device at boot. However, the hard part would be to come up with a clever way to transfer data to the ROM port as fast as possible, to address a suitable amount of video RAM (1MB at least) and, of course, to write the driver (probably fVDI based, because it is open source).
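The "clever way to transfer data" is the crux: the cartridge port is read-only from the CPU's side, so ROM port peripherals traditionally smuggle data out through the address bus instead; the driver performs dummy reads from (base + payload) and the device decodes the low address bits. A hypothetical sketch of that idea; CART_BASE is the real ST cartridge window, but the one-byte-per-read encoding and the simulated latch are assumptions for illustration, not a real device's protocol:

```c
#define CART_BASE 0xFA0000UL  /* ST cartridge ROM window (read-only) */

/* Device-side latch, simulated here so the idea can be exercised on a
 * host machine; on real hardware the FPGA would decode the address
 * lines itself (and would use A1 and up, since A0 does not exist on
 * the 68000's 16-bit bus). */
static unsigned char device_latch;

static unsigned char cart_read(unsigned long addr)
{
    device_latch = (unsigned char)(addr & 0xFFu); /* payload in low addr bits */
    return 0; /* the data bus contents returned to the CPU are irrelevant */
}

/* Driver-side "write": one dummy read per byte sent to the device. */
static void cart_write_byte(unsigned char value)
{
    (void)cart_read(CART_BASE + value);
}
```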

OR, we could take the ROM port only as a base and pass the missing lines (the rest of the address signals, R/W, etc.) from the inside of the computer to the ROM device via a ribbon cable to make a complete 68k bus. Or both.

Ah, a ROM port solution. I have such an idea on my mind for many years. A simple, cheap (a small FPGA + SRAM based) plug&play solution that would work on every Atari from the very first 260ST to the TT and F030. Even the driver could be loaded directly from the device at boot. However, the hard part would be to create a clever way how to transfer data to ROM port as fast as possible and also to address a suitable amount of videoram (1MB at least) and, of course, to make the driver (probably fVDI based, because it is open source).

Or both.

ODIN didn't connect to the ROM port - it connected to the ST monitor port and the VGA monitor. I don't know whether later driver versions improved performance/compatibility.

Would it be possible to replace the Shifter with a FPGA with additional features?

penguin wrote:Would it be possible to replace the Shifter with a FPGA with additional features?

Everything is possible... But the real question is, is it a realistic solution?

The answer would be no. There are all sorts of problems with replacing the Shifter; limited memory bandwidth is one of the bigger obstacles, and adding secondary video memory would make the finished build require more work and soldering than just about anything else you can think of.

The most viable solution, imho, would be an alt-RAM board such as the MonSTer, using an FPGA to interface the fast RAM to the CPU, and using the same FPGA to house a SuperVidel or similar solution; the RAM would double as video/fast RAM.

Another solution would be using the cartridge port; the upside would be easy installation. With bank switching and other tricks you should be able to achieve decent resolutions. The downside would be the speed: without internal modifications you would not be able to benefit from accelerators etc.
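The bank-switching idea in a nutshell: the cartridge window is only 128 KiB (0xFA0000-0xFBFFFF), so a bank register on the card would select which slice of a larger video RAM is currently visible. A host-side simulation of the scheme, purely illustrative; the register names and sizes are assumptions:

```c
#define WINDOW_SIZE (128UL * 1024)   /* cartridge window: 0xFA0000-0xFBFFFF */
#define VRAM_SIZE   (1024UL * 1024)  /* e.g. 1 MB of video RAM on the card */

static unsigned char vram[VRAM_SIZE];   /* simulated card memory */
static unsigned long current_bank;

/* Latch the bank register (on real hardware, an access to some magic
 * location that the card's FPGA decodes). */
static void set_bank(unsigned long bank)
{
    current_bank = bank % (VRAM_SIZE / WINDOW_SIZE);
}

/* Store one byte at a linear VRAM offset through the banked window. */
static void vram_write(unsigned long offset, unsigned char value)
{
    set_bank(offset / WINDOW_SIZE);
    vram[current_bank * WINDOW_SIZE + offset % WINDOW_SIZE] = value;
}
```

Every bank crossing costs an extra register access, which is part of why the cartridge-port route is slow compared to a true bus connection.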

penguin wrote:ODIN didn't connect to the ROM port - it connected to the ST monitor port and the VGA monitor. I don't know whether later driver versions improved performance/compatibility.

ODIN looks like yet another ISA-port adapter using an ISA graphics card.

Since the picture only shows the topside of the ISA adapter with the graphics card stuck in, I can only speculate how it was connected.

But judging from the "tilt" and that it is supposedly STE only, my guess is that it plugged into the CPU socket. Purely speculation though.

Sorry, external solution for all STs... I read too fast. No, I can guarantee you it was not only connected to the monitor port. Likely the cartridge port as well, if not the CPU socket inside via ribbon cable.

Greenious wrote:No, I can guarantee you it was not only connected to the monitor port. Likely the cartridge port as well, if not the CPU socket inside via ribbon cable.

... and from the (correct) picture you can in fact see that it was indeed only connected to the monitor port. It also explains how it is done: One high-res frame was stitched together from up to five frames output via the monitor port. Of course this seriously reduces the update rate of your screen in high-res mode.

czietz wrote:... and from the (correct) picture you can in fact see that it was indeed only connected to the monitor port. It also explains how it is done: One high-res frame was stitched together from up to five frames output via the monitor port. Of course this seriously reduces the update rate of your screen in high-res mode.

Yes, I can see that now. And tbh, besides a horrible update rate, the screen must have been prone to various constant glitches as well. It is already demanding to fill a completely fresh screen every VBL as it is, and this solution counts on exactly that for the stitching to work...
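The stitching scheme can be pictured like this: the card buffers up to five successive monitor-port frames, each carrying one horizontal strip, and assembles them into a single tall frame, so the effective refresh rate divides by the number of strips (roughly 71 Hz / 5, i.e. about 14 Hz, in the worst case). A sketch with the capture side simulated as plain buffers:

```c
#include <string.h>

/* Assemble one tall output frame from n successive source frames,
 * each contributing one horizontal strip of strip_bytes bytes. */
static void stitch_frame(const unsigned char *strips[], unsigned n,
                         unsigned long strip_bytes, unsigned char *out)
{
    unsigned i;
    for (i = 0; i < n; i++)
        memcpy(out + i * strip_bytes, strips[i], strip_bytes);
}
```

If any one of the n source frames is torn (not fully redrawn within its VBL), the corresponding strip of the stitched image glitches, which matches the behaviour described above.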

How does such a graphics card solution (if it works) behave with old ST-high (640x400 mono) software, i.e. C-Lab Notator? Does the software see the ST as just being in ST-high and display its windows in mono as before, or won't it work at all (i.e. will it only handle "modern" software designed for hi-res colour)? I'm suggesting some sort of compatibility because I think I saw a mention somewhere of setting the graphics card to 640x400, but in colour mode.

And with a "modern" desktop replacement such as Thing, would it then be possible to use hi-res colour icons, but still be able to use traditional mono-only software?

I believe I also read somewhere that you need a multisync monitor with a switch to select between the ST and graphics card outputs (I assume nothing will show on the graphics card output until a driver in the AUTO folder has run, so perhaps once everything is configured properly it's OK to just use the graphics card output).

There will be a solution which can be plugged directly onto the CPU. It is not only a graphics card. It is the thing many of you don't believe will ever happen. But it will happen this year, and you will love it. "The others" have it already.

If I am to play the guessing game, I hope that whatever it is, it'll give USB keyboard/mouse support in addition to IDE (or CF/SD card) storage, flash ROM, ALT-RAM and a graphics card solution which will be fully ST compatible. One can dream.

When you say this year, do you mean "very soon" or late this year? By "the others" I suppose you're talking about the Amiga scene, but it still doesn't give me any clues. Anyone else care to take a guess?

Considering that 1st1 is a big fan of the Vamp---ahem, "card that must not be named"--- one can indeed take a guess.

Anyway, even if that card was available anytime soon, your questions would still be relevant. I cannot say about Notator in particular, but cleanly written GEM software will work on a third-party graphics card. This does not necessarily mean "modern" software, since there's also software from the 1980s that properly uses VDI & AES and as such works on a graphics card, even if that happens to display 1024x768 pixels with 256 colors.

I believe Notator used a few "tricks" to make it work more efficiently, so in that case it might not work. From what I hear this was a pretty common thing to do back in the day with MIDI software. But it would be interesting to hear if someone with an add-on graphics card has tried running Notator, Creator, Logic etc.

Just out of curiosity: what would happen if some software (i.e. C-Lab Notator) was run on such a system without being compatible? A simple crash? Video output diverted to the ST monitor output connector instead of the graphics card output? Is there some sort of "fallback" setting where in such a situation the graphics card would just switch over to 640x400 in mono (ST high), while in other situations (software which is compatible with it) the graphics card would display colours at a higher resolution? I've never tried an ST with a graphics card before, so excuse my ignorance.

Wouldn't this test also apply to the TT030 with its higher resolution modes? I think most proper GEM-based programs wouldn't have an issue. But just because it looks 'GEM-like' doesn't mean it's actually using GEM. I assumed this with Phantasie 1-3, but they don't quite work right on the Falcon; granted, that could be due to other things as well.

It could already have been on the way for a year. What I can tell you is that it is now under testing. It will still need its time, but we will see it. Now you hate it, you don't believe in it; then you will love it. And those who are doing that work now will be your heroes.