Can GroovyMAME be used on an nVidia card? I know it won't be able to do the generated resolutions, but I want to use it for the timing and screen-tearing tweaks that it uses.

Yes it can, although the timing/tearing fixes mostly come from it actually changing the modelines through the ATI drivers. It's still useful, though, and can be combined with Soft15kHz to do basically what CabMAME does, so the setup can essentially be done that way with an nVidia card.

Well, here's the thing: I have tried that, and it keeps saying that the ROM files are missing, even though I have pointed the ini file to c:\mame\roms

Even if I just type 'mame' on the command line without a ROM file name after it, it loads MAME's own built-in GUI, and it even lists the ROMs I have in the ROM folder, but when I select one it says that a file or CHD is missing!?

It appears to be both, but I have just downloaded some more ROMs and these work, so I think it might be old ROM versions that are the problem, not MAME. I'm running Win7 64-bit.

Yeah, probably Win7 uses a different registry path for the video settings, so it fails when trying to access the hardcoded one. We'll need to fix that, or at least handle that error, since for non-ATI cards there's no need to read the registry anyway.

ATI drivers already support up to 60 custom video modes with their own custom timings without any hacking, compared to nVidia's limit of 32 custom modes.

Anyway the most important advantage is that ATI allows dynamic modification of these custom timings on the fly, although this feature is undocumented afaik. That's how GroovyMAME can use recalculated modelines for each game.
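To illustrate what recalculating a modeline per game involves, here is a minimal sketch (not GroovyMAME's actual code; the blanking figures are assumed values) of how a mode's pixel clock follows from its total geometry and target refresh rate:

```python
# Sketch only: relation between a modeline's geometry and its dotclock.
# htotal/vtotal include blanking and sync, not just the active area.

def modeline_dotclock(htotal, vtotal, vfreq_hz):
    """Pixel clock (MHz) needed to scan htotal*vtotal pixels vfreq_hz times/sec."""
    return htotal * vtotal * vfreq_hz / 1e6

# Example: a 320x240 game at 60 Hz with assumed blanking overhead.
htotal, vtotal = 408, 262
print(round(modeline_dotclock(htotal, vtotal, 60.0), 3))  # 6.414 (MHz)
```

Note the resulting horizontal rate, dotclock / htotal, lands near 15.7 kHz, which is why such modes suit a standard arcade monitor.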

Unfortunately, nVidia drivers do not support this feature. I've done different tests these days using similar methods with registry modelines, but I'm afraid the new information is not used by the driver until the system is restarted.

However, semi-dynamic modelines will hopefully be possible for nVidia cards too in the near future, by using the Powerstrip API.

I recently acquired a MAME PC (configured with Soft15kHz, MALA and GuiMAME) with NVIDIA (AGP card) graphics but as I'm a very curious guy, I'd like to try GroovyMAME instead of the GuiMAME. Also, it's 0.126 and I want to run the PGM games added recently.

So, if I just set it up to be run from MALA, will the correct settings for each game be used immediately, or what else do I need to do?

Yeah, GroovyMAME should use the modes set by Soft15kHz out of the box. However, as you're using an nVidia card, GroovyMAME won't be able to recalculate each modeline, so it will run with the predefined ones. It won't be as good as running it with an ATI card, but it can't be worse than using regular MAME. And of course you can force the -syncrefresh and -soundsync options to have perfect scrolling/sound in all games.
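The two options mentioned above can also be made permanent in mame.ini rather than passed on every launch (a sketch; check that your GroovyMAME build actually accepts these option names, e.g. with mame -showusage):

```ini
# mame.ini fragment (sketch): force the options mentioned above for all games
syncrefresh               1
soundsync                 1
```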

and got fullscreen in a game (Ketsui, PGM hardware @ 224x448, 59.17 Hz) that's normally not fullscreen (a perfect modeline isn't available), but it is blurry. If I use your modeline generator and create the resolution needed, I should be able to add it by running Soft15kHz?

It's blurry because it's stretched onto an interlaced mode. Bear in mind you need 448 lines to render that game without artifacts, and that's not possible on a standard arcade monitor. What kind of monitor are you using? If it's a multisync monitor, you can use a 31kHz modeline for that game. VMMaker can calculate that resolution if your monitor supports it; then you can use Soft15kHz to add it.
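The arithmetic behind this is simple to sketch: the horizontal scan rate a monitor must sustain for a progressive mode is roughly the refresh rate times the total line count. The blanking line count below is an assumed figure for illustration:

```python
# Sketch: horizontal scan rate needed for a given progressive mode.

def hfreq_khz(vtotal, vfreq_hz):
    """Horizontal frequency (kHz) = total lines per frame * frames per second."""
    return vtotal * vfreq_hz / 1000.0

# Ketsui: 448 active lines @ 59.17 Hz; assume ~38 lines of blanking.
print(round(hfreq_khz(448 + 38, 59.17), 1))  # 28.8 kHz: well beyond 15kHz
# An interlaced mode halves the lines per field, which fits 15kHz:
print(round(hfreq_khz((448 + 38) / 2, 59.17), 1))  # 14.4 kHz
```

That ~29 kHz figure is why a 31kHz-capable multisync monitor can show all 448 lines progressively, while a standard 15kHz arcade monitor can only do it interlaced.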

Well, it's a 15kHz game and I've run the actual board just fine on my NANAO MS9 monitors (that's what I'm using now btw).

Quote from: MAWS

A flexible cartridge based platform some would say was designed to compete with SNK's NeoGeo and Capcom's CPS Hardware systems, despite its age it only uses a 68000 for the main processor and a Z80 to drive the sound, just like the two previously mentioned systems in that respect. Resolution is 448x224, 15 bit colour. Sound system is ICS WaveFront 2115 Wavetable midi synthesizer, used in some actual sound cards (Turtle Beach).

For most games I've tried with GroovyMAME, just using the rotation built into MAME's GUI menu has worked fine. I'll try this though.

No, you actually need to use the command-line parameter, otherwise the modeline won't be calculated properly in the first place. Similarly, you need to launch games from the command line: if you launch them from MAME's built-in UI they won't get the right video modes in GroovyMAME.

Yes, at the moment the Powerstrip option only works with the current desktop resolution; that's why I recommended it for LCDs only. Of course, if you use a PC CRT without resolution switching it's much the same thing. It mostly ignores your 'monitor_specs' settings except for the line and frequency limits. What it does is grab the current video port values and tweak the dotclock and the total number of lines in order to adjust the vertical frequency, so it's a rather hacky method.
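The retuning step described above can be sketched like this (a simplified model, not the actual GroovyMAME/Powerstrip code; the starting timings are assumed VGA-like values):

```python
# Sketch: keep the current mode's geometry, then pick a new vtotal and
# dotclock so the vertical refresh hits the game's rate.

def retune(dotclock_hz, htotal, vtotal, target_vfreq):
    """Return a (dotclock, vtotal) pair whose refresh equals target_vfreq."""
    hfreq = dotclock_hz / htotal            # keep the horizontal rate ~fixed
    new_vtotal = round(hfreq / target_vfreq)
    new_dotclock = target_vfreq * htotal * new_vtotal
    return new_dotclock, new_vtotal

# Desktop mode: 640x480-like raw timings (assumed), retargeted to 59.17 Hz.
dc, vt = retune(25_175_000, 800, 525, 59.17)
print(vt, round(dc / 1e6, 3))  # 532 25.183
```

Since vtotal is an integer, the dotclock has to absorb the rounding, which is exactly where the dotclock accuracy problems discussed below come in.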

The idea is to fully integrate the Powerstrip thing into the modeline engine. There's a weakness to this method, and it's the dotclock accuracy and stability, which in my experience varies a lot depending on the chipset you use, and is always inferior to what you get with ATI driver-based modelines.

Yes, I highly doubt that the Powerstrip method will ever compare to what we get with ATI drivers, where we're used to almost deterministic behaviour. I'm not even sure it's a good idea to fully implement this, because it will encourage people to get the wrong video cards.

The fuzziness lies in the method Powerstrip uses to program the video card's dotclock. We can only request a given dotclock from PS, and PS will produce the closest possible stable dotclock.

Part of the problem is that the dotclock granularity can vary hugely across the video card's operational range, so you can have a lot of options for low resolutions but just a few for higher resolutions. We can't know this beforehand, so we have a problem if we want to create new modelines in real time.
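The effect of coarse granularity is easy to see with a toy model (the 25 kHz step size here is purely an assumption for illustration, not a measured value for any chipset):

```python
# Sketch: if the hardware can only synthesize dotclocks on a coarse grid,
# the achieved refresh rate drifts from the requested one.

def snap(requested_hz, step_hz):
    """Closest dotclock on an assumed synthesis grid of step_hz."""
    return round(requested_hz / step_hz) * step_hz

htotal, vtotal = 408, 262
wanted = 59.17 * htotal * vtotal            # ideal dotclock for 59.17 Hz
got = snap(wanted, 25_000)                  # assume 25 kHz granularity
print(round(got / (htotal * vtotal), 4))    # ~59.1697 Hz instead of 59.17
```

A fraction of a hertz off is enough to produce a slow tearing/stutter cycle when running with -syncrefresh, which matches the "works today but not tomorrow" behaviour described below.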

This varies a lot depending on the video card, so while I can get excellent results with this laptop for any desired resolution and refresh (GeForce Go 7400), things are not perfect in other systems I've tested.

Second (and worse) is the fact that results are not always consistent for some cards and situations. I've seen some modelines work today but not tomorrow, and my guess is that it has to do with dotclock stability.

As for your case, you may get better results by lowering your desktop resolution to something closer to the ones the VGA monitor setting is picking. Probably around those resolutions PS finds more stable dotclocks to choose from.