Once you're emulating the S-PPU1, S-PPU2, S-SMP, and S-DSP, you might as well emulate the CPU and DMA too and make a Super Famiclone.

Like I said, people already emulate the NES PPU for the HDMI mods. Hasn't hurt sales any.

I'm probably the first proponent for just using a full emulator. That's what I've spent nearly a decade working on, after all.

But there's always a market for people who want to pay $300 for each loose game cartridge to run on their LCD TVs. And if you want to make some money, there's plenty of market demand here and absolutely no competition.

> Once you're emulating the S-PPU1, S-PPU2, S-SMP, and S-DSP, you might as well emulate the CPU and DMA too and make a Super Famiclone.

> Like I said, people already emulate the NES PPU for the HDMI mods. Hasn't hurt sales any.

The NESRGB and Hi-Def NES are functionally the same as ripping out a TurboGrafx-16's HuC6260 VCE (color encoder) while keeping the authentic HuC6270 VDC (which generates pixels as digital offsets into the palette). There's no analogous clean functional separation between S-PPU1 and S-PPU2 that I'm aware of. This means you'll have to emulate a lot of the S-PPUs' functionality, more than you would have had to with the NES PPU, and you might end up with something that's at least as far from the behavior of the normal 1/1/1 or 2/1/3 consoles as the 1CHIP is.

Is it really worth going through that effort rather than just sampling the analog RGB directly? The functionality is the same from the user's perspective (install mod in SNES, new HDMI port in the back), and there are some sampling/filtering tricks you can do to get damned close to what pure digital would look like (see HD Retrovision's proposed SHVC to 1CHIP processing for an example).

Obviously in the NES a pure digital approach made sense since there was no native RGB signal to start with, but in the SNES there is, and emulating most of the hardware inside the SNES just to get what could end up being an imperceptible improvement in quality doesn't make sense to me.

EDIT: It would also be a heck of a lot easier to install than an NESRGB or HiDef NES, seeing as how you wouldn't need to desolder anything, just solder R/G/B/S/L/R/VCC/GND wires.

- They're still passing 240p, and many displays don't handle 240p as well as they should. They typically try to deinterlace it poorly, thinking it's 480i.
- They're not digital, which means many displays will add additional latency as they digitize and upscale (often using very blurry/distorting interpolation).
- They're not HDMI, and some televisions these days only have HDMI inputs.

The cables are a great option to have: some televisions don't have any of the problems that I've mentioned above, and they're going to be way cheaper, and many more televisions have component input than the increasingly rare composite. But some people will just want the convenience of a standard HDMI plug that outputs a clean 1080p signal.

So, my argument is that what's being discussed (re-implementing most of the SNES on an FPGA) is not going to provide meaningfully better video quality either, not compared to a good RGBSLR to HDMI solution that is targeted specifically at the SNES and the quirks of each hardware revision. You can do a lot better than the framemeister in that regard. Compared to the analog solution, the digital approach is going to be an enormous amount more effort to design, and is going to cost a whole lot more to build. And if it doesn't go through the expansion port, it's going to be a whole lot harder to install, too.

EDIT: I meant to say that you can take advantage of some known facts about the SNES, including the fact that each channel is intended to be only 5 bits, and that you can model exactly how much intensity bleeds into the next pixel on the same scanline. Conveniently, HD Retrovision has access to basically every revision of the SNES ever, and has already done something like this with a mind towards their HDMIzer project.
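To illustrate the "intended to be only 5 bits" point: once you've sampled the analog RGB, you can snap each sample back onto the nearest of the 32 levels the PPU could have output. This is just a sketch under the assumption that the samples are normalized to 0.0-1.0 and that the DAC levels are evenly spaced (real PPU2 DACs may well not be, which is exactly the per-revision characterization HD Retrovision would bring):

```python
# Sketch: snap an ADC sample back onto a 5-bit-per-channel palette level.
# Assumes normalized 0.0-1.0 samples and evenly spaced DAC steps --
# both simplifying assumptions, not measured SNES behavior.

def snap_to_5bit(sample: float) -> int:
    """Return the nearest 5-bit level (0-31) for a normalized sample."""
    level = round(sample * 31)
    return max(0, min(31, level))  # clamp against noise overshoot

def restore_channel(samples):
    """Quantize one color channel's worth of samples."""
    return [snap_to_5bit(s) for s in samples]
```

The clamp matters: analog noise can push a sample slightly above full-scale or below black, and you want that folded back into the legal range rather than wrapping.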

One of the big problems with the SNES and NES is that they generate NTSC-shaped video at 60.1Hz, and many HDMI sinks will reject (or cause visible tearing, or drop frames from) sources that aren't much more precisely 60.000Hz.

The best solution to prevent those problems is to slow down (i.e. replace) the NES/SNES's master clock, as tepples outlined above.

Is the clock source not provided by a crystal that can easily be replaced such that you end up at 60Hz? I'm not sure how that would be a roadblock, unless I'm misunderstanding the problem, or it's crazy hard to desolder them. I'm not sure why you'd need to re-implement all the hardware from scratch to change the master clock.

Alternatively, you could have your scanline buffer grow gradually over time and drop a frame whenever it fills up, which I think would be every 10 seconds. Less annoying than tearing.
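The "every 10 seconds" figure is easy to sanity-check: at the usual NTSC master clock and the standard dot/line counts, the console outputs roughly a tenth of a frame per second more than a strict 60.000 Hz sink consumes, so the buffer fills (and a frame gets dropped) about every ten seconds:

```python
# Back-of-the-envelope check of the "drop a frame every ~10 seconds" claim.
# Uses the stock NTSC SNES master clock and the dot counts quoted in this
# thread (341 dots * 4 clocks * 262 lines - 2 per progressive frame).

MASTER_CLOCK = 21_477_272.7             # NTSC SNES master clock, Hz
CLOCKS_PER_FRAME = 341 * 4 * 262 - 2    # = 357366

snes_fps = MASTER_CLOCK / CLOCKS_PER_FRAME  # ~60.0988 Hz
surplus = snes_fps - 60.0                   # extra frames per second
drop_interval = 1.0 / surplus               # ~10.1 seconds between drops

print(f"{snes_fps:.4f} fps -> one dropped frame every {drop_interval:.1f} s")
```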

One cannot easily get the requisite 21.44196 MHz crystal. You'll probably have to synthesize it with a PLL (in practice, the PLL in an FPGA).

But in any case, my real point is: You can get RGBSc/LR directly from the A/V port on the SNES. If your plan is "just ADC all that stuff", there's no need for a modification at all, just a special A/V cable.

BUT if you have a SNES that has a visible vertical bar due to the DRAM refresh, the magnitude of that noise is often more than 1LSB of PPU2's video DACs, and that's why it might be worth figuring out what exactly it is that PPU2 is doing and emulating that.
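For a sense of scale on "more than 1 LSB": assuming (hypothetically) a 0.7 V peak-to-peak active-video swing, one step of a 5-bit DAC works out to a bit over 20 mV, so refresh noise of that magnitude can't simply be quantized away by an analog capture:

```python
# Rough size of 1 LSB on a 5-bit video DAC. The 0.7 Vpp swing is an
# assumed typical analog-video range, not a measured SNES figure.

VIDEO_SWING_V = 0.7   # assumed active-video range, volts
LEVELS = 2 ** 5       # 5-bit DAC -> 32 levels, 31 steps

lsb_mv = VIDEO_SWING_V / (LEVELS - 1) * 1000  # ~22.6 mV per step
print(f"1 LSB ~= {lsb_mv:.1f} mV")
```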

No, the plan is not to build a Super Famiclone inside a SNES shell. The idea is to avoid re-implementing key hardware if possible, and gain the quality improvements via targeted processing rather than a pure-digital path.

> Is the clock source not provided by a crystal that can easily be replaced such that you end up at 60Hz?

You need to clock it slightly faster in games that use interlaced video, such as RPM Racing. These are the frequencies that the FPGA would have to synthesize in order to generate video at precisely 60 Hz and audio at precisely 48 kHz (assuming 1.5x resampling):

Audio: 24576000 Hz
224/239 lines: (341*4*262-2)*60 = 21441960 Hz
448/478 lines: (341*4*525)*30 = 21483000 Hz

And possibly other frequencies during transitions among 224, 239, 448, and 478 line modes, as the PPU centers the 224-line mode by moving the vsync pulse up rather than by moving the picture relative to vsync.
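The two video-clock figures above can be checked in a couple of lines:

```python
# Verify the synthesized master-clock arithmetic from the post:
# the clock needed for exactly 60.000 Hz progressive and 30.000 Hz
# interlaced output, given the dot/line counts quoted above.

progressive = (341 * 4 * 262 - 2) * 60  # 224/239-line modes
interlaced = (341 * 4 * 525) * 30       # 448/478-line modes

print(progressive)  # 21441960 Hz
print(interlaced)   # 21483000 Hz
```

Note that the stock 21477273 Hz clock sits between the two targets, which is why a single fixed replacement crystal can't serve both modes.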

Quote:

Alternatively, you could have your scanline buffer grow gradually over time and drop a frame whenever it fills up, which I think would be every 10 seconds. Less annoying than tearing.

That's what my DVD recorder does, with its full frame buffer. It needs the frame buffer for MPEG-2 compression anyway, but it does add lag. And unless the upscaler runs 480i in an all-bob mode, it'll need a full frame buffer in order to weave 480i together.

Well, you need to look at how bad the problem actually is: as far as I know, only a small number of games use interlaced modes at all, and even then all but RPM Racing seem to use them just for menus, where frame drops aren't really an issue and nobody would ever notice. If you just adjust things to work for 224/239, then yeah, RPM Racing won't work, but we're talking about only one game out of 783 with sub-optimal performance.

As for transitions, if they're generally just happening going into menus, then dropping a frame isn't an issue, although if you explicitly don't support the interlaced timing, then there isn't any transition anyhow. And keep in mind that the framemeister just blacks out video for multiple seconds when the transition happens, so dropping a frame is a massive improvement.

The expansion port <> HDMI adapter would definitely eliminate the light vertical bar in the middle of the video output completely.

Still kind of surprising no one really knows for sure what causes it or how to fix it. The best bet is that it's related to DRAM refresh, but people have had varying degrees of success 'fixing it' by replacing the Vreg and/or caps on the board.

> And keep in mind that the framemeister just blacks out video for multiple seconds when the transition happens

Yeah, much as I love the space saving of an LCD ... I've gotta say I'm really not impressed with the XRGB-Mini compared to my PVM-2030. The latter gives a far better picture, with much richer contrast (obvious benefit of a CRT), has lower latency, hides the light vertical bar better, etc.

I also get this strange fade-in effect with the XRGB. When I go from a really dark screen to a bright screen, the bright screen is somewhat dim and fades in over 1-2 seconds. Definitely not there on the PVM.

If we were going all-out with an expansion port adapter, I'd want to tie it to something like adaptive sync so we could perfectly handle both video modes without ever dropping a frame.

...

Another thing I'm surprised no one's really worked on would be a simple 15.5 kHz -> 31 kHz analog RGB adapter. The PVMs set you back $300 each plus another $200 for shipping from California. But there's millions of PC CRT monitors out there with VGA connections. I picked up a beautiful Viewsonic 23" for $20 from a local used computer store.

I know there's professional equipment that costs a fortune (and often can't make sense of 240p content), but this doesn't seem like it would be all that challenging to build inexpensively.
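The core of such an adapter really is simple. Here's a toy model of the idea, under the assumption of a straightforward line doubler: buffer one incoming 15 kHz scanline, then read it out twice at double the line rate. Real hardware would use a dual-port line RAM and an ADC/DAC pair; the function below is illustrative, not any actual product's design:

```python
# Toy model of a 15 kHz -> 31 kHz line doubler: every incoming
# scanline is emitted twice, turning 240p into 480p with one
# scanline of buffering latency. Purely illustrative.

def double_lines(frame):
    """frame: list of scanlines (each a list of pixel values).
    Returns the frame with every scanline repeated."""
    out = []
    for scanline in frame:
        out.append(list(scanline))  # first readout of the line buffer
        out.append(list(scanline))  # second readout of the same buffer
    return out
```

Since each output line is just a replay of the buffered input line, latency is bounded by the buffer depth, which matches the couple-of-scanlines figure quoted for existing doublers.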

Isn't that exactly what Marq's scanline doubler does, although that's outputting 480p HDMI and not RGB? IIRC it has two scanlines of latency and handles both modes just fine. Costs far less than the framemeister too.
