It really depends on the TV; if it's an LCD TV, then my palette generator with the default settings ought to suffice, though I've discovered that a white point of D55 (0.3324, 0.3474) looks even closer to what my CRT TV displays. To input that into the palette generator, click on "get current preset" and put that coordinate in for Wx and Wy. For you, it might look better or it might look worse; this is just what worked best for me.

If you're using a CRT TV with your emulator, I'm not sure what to recommend, because I haven't tested that. I guess it comes down to how warm or cool the grays are.

A few questions, if I may, from someone who has spent way too much time on the NES palette himself:

How are you normalizing video levels? Are you just treating color 0x0E as black and colors 0x20 and 0x30 as white?

Are you using the color burst amplitude as an amplitude reference for the chroma signals?

What does a hue of -0.25 actually mean? How many degrees is that? I notice that 360 degrees (meaning no change) is at about 31.4, so it seems to be related to radians. It would be better to specify it in degrees, because that is what the television literature uses.

Are you taking US NTSC's 7.5% setup into account? It seems you do, given that the default brightness and contrast settings are different from zero, though to a larger extent than 7.5% setup alone would account for.

Suggestions:

I assume that "colorimetry" refers to the colorimetry of the emulated ("source") display, and that the target display is always sRGB. Since wide-gamut displays are quite widespread these days, an option to select the target display's colorimetry might be desirable as well.

There are at least two more white points that were (and to some extent are) in widespread use: 9300K+27MPCD (xWhite 0.281, yWhite 0.311) and CIE D93 (xWhite 0.285, yWhite 0.293). Should you decide to add these, you might want to separate the RGB primaries from the white point. Note that when trying to simulate D93 white on sRGB, the blue value will become greater than 1, so everything must be scaled accordingly. Right now that is not done when entering these as custom values.

Quote:

How are you normalizing video levels? Are you just treating color 0x0E as black and colors 0x20 and 0x30 as white?

I'm using the normalized values from here, which indeed normalize as you say.
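That normalization scheme can be sketched in a few lines. This is a hedged illustration, not the generator's actual code: the voltage figures below are placeholders standing in for whatever measured levels the linked table provides.

```python
# Hypothetical sketch of video-level normalization: the level of color $0E
# maps to 0.0 (black) and the level of colors $20/$30 maps to 1.0 (white).
# The default voltages here are made-up placeholders, not measured data.
def normalize(level, black=0.518, white=1.962):
    """Map a raw composite signal level onto the 0.0-1.0 video range."""
    return (level - black) / (white - black)

print(normalize(0.518))  # black reference -> 0.0
print(normalize(1.962))  # white reference -> 1.0
```

Anything below the black reference (blacker-than-black) simply comes out negative under this mapping.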

Quote:

Are you using the color burst amplitude as an amplitude reference for the chroma signals?

I'm not, because I've read documentation somewhere that says the color burst amplitude has absolutely no effect on the resulting picture. Whether or not that's true seems to depend on who you ask, so I don't have a clear answer as to what to do, nor if any kind of DC bias on the colorburst signal makes any difference.

Quote:

What does a hue of -0.25 actually mean? How many degrees is that? I notice that 360 degrees (meaning no change) is at about 31.4, so it seems to be related to radians. It would be better to specify it in degrees, because that is what the television literature uses.

Yeah, the hue tweak is in radians, and that's just because that's how most programming languages handle trig functions. I probably should change it to degrees, now that you mention it, if only because radians kinda suck.
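For what it's worth, the conversion is trivial; assuming the hue tweak really is a plain angle in radians, a setting of -0.25 works out to roughly -14.3 degrees:

```python
import math

# Convert a hue tweak given in radians to degrees for display.
def rad_to_deg(hue_rad):
    return hue_rad * 180.0 / math.pi

print(round(rad_to_deg(-0.25), 1))  # -> -14.3
```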

Quote:

Are you taking US NTSC's 7.5% setup into account? It seems you do, given that the default brightness and contrast settings are different from zero, though to a larger extent than 7.5% setup alone would account for.

I'm not actually sure what that is, to be honest. The reason the brightness is lower is because I noticed the darkest row of colors didn't look right until I lowered it, and when raising the brightness on my TV, it takes a while for 1D to actually start lightening, so I figured the brightness setting is actually supposed to be lowered. The contrast setting is to compensate for the reduced brightness.

Quote:

I assume that "colorimetry" refers to the colorimetry of the emulated ("source") display, and that the target display is always sRGB. Since wide-gamut displays are quite widespread these days, an option to select the target display's colorimetry might be desirable as well.

Your assumption is correct. I can add the target colorimetry at some point, but I'd be doing it without understanding wide-gamut displays or why it's necessary.

Quote:

There are at least two more white points that were (and to some extent are) in widespread use: 9300K+27MPCD (xWhite 0.281, yWhite 0.311) and CIE D93 (xWhite 0.285, yWhite 0.293). Should you decide to add these, you might want to separate the RGB primaries from the white point. Note that when trying to simulate D93 white on sRGB, the blue value will become greater than 1, so everything must be scaled accordingly. Right now that is not done when entering these as custom values.

I started thinking this was the case, given that neither C nor D65 looked correct for me. The custom colorimetry was a quick afterthought, so it doesn't do any scaling or anything like that. I'm not even sure how I'd need to properly "scale" anything.

Thanks for the literature! I knew the YIQ->RGB matrices had to be different from what the FCC specified, if only because I couldn't get anything to look exactly right; I could only get it "close". (Trying out D55 like I mentioned before helped me, which opened me to the possibility that TVs may not all be using D65.)

Are you using the color burst amplitude as an amplitude reference for the chroma signals?

I'm not, because I've read documentation somewhere that says the color burst amplitude has absolutely no effect on the resulting picture. Whether or not that's true seems to depend on who you ask, so I don't have a clear answer as to what to do, nor if any kind of DC bias on the colorburst signal makes any difference.

DC offset during colorburst definitely doesn't matter, but it is "supposed" to be 0 IRE. This is how Macrovision works: VCRs have AGCs that sample for the "darkest value on a scanline" during colorburst, and Macrovision adds a large positive offset during colorburst.

Atari's 2600, as initially released, was specifically designed to take the nominal chroma scaling into account by attenuating the colorburst to the nominal ~40 IRE. Later revisions removed the connection for cost savings, because it was found that basically no televisions cared. Most don't even scale luminance through composite, simply assuming that the sync depth is "good enough"; the only AGC is on OTA, because the signal is amplitude-modulated and so has to be adjusted for distance from the broadcaster.

Quote:

Quote:

Are you taking US NTSC's 7.5% setup into account? It seems you do, given that the default brightness and contrast settings are different from zero, though to a larger extent than 7.5% setup alone would account for.

I'm not actually sure what that is, to be honest.

Nominally, US NTSC TV (but not Japanese NTSC TV) defines "black" as 7.5 IRE, and values below that as blacker-than-black. Most early video game consoles don't support this, but the end result is that a console that doesn't compensate for this will produce a slightly darker, higher-contrast picture in the US than in Japan.
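The effect of setup on the decoded luma can be sketched directly. This is a simplified illustration of the idea above, not a model of any particular TV: a decoder that expects setup treats 7.5 IRE as black, so a console outputting black at 0 IRE lands in blacker-than-black territory.

```python
# Sketch of 7.5% (7.5 IRE) setup: a decoder expecting setup treats
# 7.5 IRE as black and 100 IRE as white, so levels below 7.5 IRE
# come out as blacker-than-black (negative luma, clipped in practice).
def ire_to_luma(ire, setup=7.5):
    """Map an IRE level to 0.0-1.0 luma, treating `setup` IRE as black."""
    return (ire - setup) / (100.0 - setup)

print(round(ire_to_luma(7.5), 3))    # US black -> 0.0
print(round(ire_to_luma(0.0), 3))    # no-setup black -> -0.081
print(round(ire_to_luma(100.0), 3))  # white -> 1.0
```

This is also why a Japanese-spec (no-setup) signal shown on a US-calibrated set looks a touch darker and more contrasty, as described above.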

I'm using the normalized values from here, which indeed normalize as you say.

The problem is that no television set will do it this way, because it has no way of knowing at what video level the console wants its black and white to be. Consider this post of mine, which might explain why you need brightness values lower than -0.075 to replicate a particular television set.

lidnariq wrote:

Atari's 2600, as initially released, specifically was designed to take the nominal chroma scaling into account by attenuating the colorburst to the nominal ~40 IRE. Later revisions removed the connection for cost savings, because it was found that basically no televisions cared.

With a baseband composite connection, my multi-standard Sony CRT uses the color burst amplitude as an amplitude reference for chroma with PAL signals and as an amplitude reference for the entire signal with NTSC signals.

lidnariq wrote:

the only AGC is on OTA because it's amplitude modulated and so has to be adjusted for distance from the broadcaster.

This is very important, and implies that the same television with the same console would produce pictures of different brightness between a baseband composite connection and an RF-modulated connection.

Drag wrote:

I can add the target colorimetry at some point, but I'd be doing it without understanding wide gamut displays nor why it's necessary.

Wide-gamut displays provide a more saturated picture, so normal sRGB images will appear oversaturated on them. Most web browsers nowadays can be made to color-manage these images, that is, to convert their values from sRGB to the monitor's native primaries. I have not seen any NES emulator provide that functionality, so it would be useful to specify the target monitor's primaries and generate the correct colors directly. Another advantage is that on these monitors, saturated reds and greens can be shown without needing to be clipped (as much).

Drag wrote:

The custom colorimetry was a quick afterthought, so it doesn't do any scaling or anything like that. I'm not even sure how I'd need to properly "scale" anything.

Convert 100% white (R=G=B=1.0) to the target colorspace, take the largest value, and divide everything by it; i.e. if you get R=1.0, G=1.1, B=1.2, then divide everything by 1.2. All this with linear values (the ones you use for color space conversion), not gamma-corrected values.
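The scaling step described above is only a couple of lines. A minimal sketch, using the R=1.0, G=1.1, B=1.2 example from the post (the function names are made up; the actual colorspace conversion that produces the white value is out of scope here):

```python
# Scale linear RGB so the converted reference white fits inside the
# target gamut: divide every channel by the largest channel of white.
def scale_to_gamut(rgb, white_in_target):
    peak = max(white_in_target)
    return [c / peak for c in rgb]

# e.g. D93 white expressed in sRGB may come out with blue > 1.0:
white = [1.0, 1.1, 1.2]
print(scale_to_gamut(white, white))  # largest channel becomes exactly 1.0
```

Every palette entry would get divided by the same peak, so relative colors are preserved and nothing exceeds 1.0.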

By the way: if you wanted to get really crazy, you could emulate differential phase distortion as well. Basically, your current hue setting is amplitude-independent, so it influences the hue shift at zero amplitude. You could add a second hue setting that is multiplied with Y, resulting in a total hue shift for any pixel of baseHue+Y*diffHue. While it does replicate what is definitely going on in some devices, it might be a bit far out there for emulation purposes.
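The baseHue+Y*diffHue idea can be sketched as a rotation of the chroma vector. This is a hedged illustration of the suggestion above, not anyone's actual implementation; the parameter names are invented, and both angles are taken to be in radians:

```python
import math

# Amplitude-dependent hue: the total phase shift for a pixel grows
# (or shrinks) with its luma Y, on top of a constant base shift.
def total_hue_shift(y, base_hue, diff_hue):
    return base_hue + y * diff_hue

def rotate_iq(i, q, shift):
    """Rotate the (I, Q) chroma vector by `shift` radians."""
    c, s = math.cos(shift), math.sin(shift)
    return (i * c - q * s, i * s + q * c)

# A mid-luma pixel with base_hue=-0.1 and diff_hue=0.2 nets out to zero shift:
shift = total_hue_shift(0.5, base_hue=-0.1, diff_hue=0.2)
print(round(shift, 3))  # -> 0.0
```

Setting diff_hue to zero recovers the current amplitude-independent behavior.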

This is very important, and implies that the same television with the same console would produce pictures of different brightness between a baseband composite connection and an RF-modulated connection.

Yes, but the usual AGC is on sync depth, normalizing it to -40 IRE. There's no clearly correct thing to do if given contradictory gains needed to normalize both colorburst and sync depth.

On my Magnavox CRT SDTV, I do get a noticeably brighter picture with composite out than with RF out. I noticed this when I decided to use RF out while recording composite with my DVD recorder, to prevent lag during gameplay.

{-0.25, 0.8, -0.2, 1.0, 1.0}

This is a slightly darker palette, but it represents the colors a lot more faithfully; for instance, I can actually tell what hues the lightest colors are, just like I can on my TV. If you're looking to design some graphics and you really need an accurate representation of the colors and how they contrast against each other, this is what I'd recommend. Even though I said this palette is darker, you won't even notice unless you have something BRIGHT WHITE open next to your emulator.

This palette is pretty awesome. One question: is there something I can do to make the palette retain the green color from the border on the second stage of Contra for the NES? It is supposed to look like this:

but instead it is just one single gray color, hardly any green. When this palette is used in conjunction with the NTSC filter, it really shines; just a few little tweaks and it will be almost perfect. I just figured I would ask since you know more about the settings on the palette generator page than I do. I always remember the NES colors being darker and grittier than most emulators show them. Do you know what I can do to get this green back and possibly lighten things up just a tiny hair?

Anyway, this palette generator is awesome; thanks for all you have put into it. I'm definitely getting use out of it.

I apologize for bumping this topic. I am sure many are aware of my open-source project Retro Graphics Toolkit http://forums.nesdev.com/viewtopic.php?f=21&t=9894. If you don't mind, would it be okay if I used your code for palette generation? It was easy to port to C++. I have not committed the code yet because I have not implemented the de-emphasis bits, which I believe are a necessary feature. If you have any pointers on how to implement the de-emphasis bits, I would greatly appreciate it.
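Not speaking for the palette generator's author, but one rough starting point for the emphasis (de-emphasis) bits is the common RGB-space approximation in which each set bit attenuates the two channels it does not emphasize. The 0.75 factor below is a ballpark figure often seen in emulators, not a measured value, and a signal-accurate implementation would instead attenuate the composite waveform during the phases each bit selects:

```python
# Rough RGB-space approximation of the NES emphasis bits: each set bit
# attenuates the two channels it does NOT emphasize. The factor is an
# assumed ballpark value, not derived from hardware measurements.
ATTENUATE = 0.75

def apply_emphasis(r, g, b, emp_r=False, emp_g=False, emp_b=False):
    if emp_r:
        g *= ATTENUATE
        b *= ATTENUATE
    if emp_g:
        r *= ATTENUATE
        b *= ATTENUATE
    if emp_b:
        r *= ATTENUATE
        g *= ATTENUATE
    return (r, g, b)

print(apply_emphasis(1.0, 1.0, 1.0, emp_r=True))  # -> (1.0, 0.75, 0.75)
```

With all three bits set, every channel gets attenuated, which matches the overall dimming that real hardware shows in that case.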
