On a well-formed 480i NTSC signal, there are exactly 227.5 × 262.5 = 59718.75 chroma cycles per vertical refresh, so it actually takes 4 fields for the chroma phases to cancel out again.
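The 4-field period falls straight out of the fractional part of that product. A quick sketch in Python, using the standard NTSC ratios (227.5 = 455/2 chroma cycles per line, 262.5 = 525/2 lines per field):

```python
from fractions import Fraction

# Standard interlaced NTSC: 227.5 chroma cycles/line, 262.5 lines/field.
cycles_per_field = Fraction(455, 2) * Fraction(525, 2)   # = 59718.75

# Subcarrier phase (in cycles, mod 1) at the start of each successive field:
phases = [float((n * cycles_per_field) % 1) for n in range(5)]
print(phases)   # [0.0, 0.75, 0.5, 0.25, 0.0]
```

The start-of-field phase steps by 3/4 of a cycle each field, so it only returns to its starting value after 4 fields.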

With a 240p signal, it's a bit harder. The most common, and most problematic, approach was to use 228 chroma cycles per scanline (CGA, Apple II, Master System, Genesis). Crosstalk artifacts then occur at fixed X positions on every scanline, independent of which scanline it is. This produces a stable image, but very visible artifacts whenever anything moves. As nocash says, "perfect ugliness".

The simplest thing to do would be to have the right hsync rate – 227.5 chroma cycles per scanline – with some odd number of scanlines.

In fact, the NTSC VIC-20 does exactly that. It uses a pixel clock of 260÷227.5 times the chroma carrier and generates 261 total scanlines, for a total of 59377.5 chroma cycles per vertical refresh, so every scanline and every field starts exactly opposite in phase to the previous one.
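The VIC-20 numbers check out the same way: a half-cycle fraction per line and per field means alternating phase. A sketch, assuming the figures above (227.5 cycles per line, 261 lines per field):

```python
from fractions import Fraction

cycles_per_line = Fraction(455, 2)            # 227.5
cycles_per_field = cycles_per_line * 261
print(float(cycles_per_field))                # 59377.5

# A fractional part of exactly 1/2 means each line -- and each field --
# begins 180 degrees out of phase with the previous one:
assert cycles_per_line % 1 == Fraction(1, 2)
assert cycles_per_field % 1 == Fraction(1, 2)
```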

The NTSC C64 apparently came with two revisions of the VIC-II. Both used an 8 2/11 MHz pixel clock, but one apparently used 224 chroma cycles per scanline and 262 scanlines (...?), generating stable but highly visible artifacts; the later revision generated 227.5 chroma cycles per scanline and 263 scanlines, achieving the same objectives as the VIC-20.
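Those per-line figures follow from the VIC-II's timing: the 8 2/11 MHz dot clock gives 8 pixels per CPU cycle, and the two revisions used 64 vs. 65 CPU cycles per line. A sketch of the arithmetic (the 8 2/11 MHz dot clock and 315/88 MHz subcarrier are the standard figures; the 8-pixels-per-cycle assumption is standard VIC-II behavior):

```python
from fractions import Fraction

dot_clock  = Fraction(90, 11)    # 8 2/11 MHz VIC-II dot clock
subcarrier = Fraction(315, 88)   # 3.579545... MHz NTSC color subcarrier

def chroma_cycles_per_line(cpu_cycles_per_line, pixels_per_cycle=8):
    pixels = cpu_cycles_per_line * pixels_per_cycle
    return pixels * subcarrier / dot_clock

print(float(chroma_cycles_per_line(64)))   # 224.0  -- early revision
print(float(chroma_cycles_per_line(65)))   # 227.5  -- later revision
```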

The NTSC NES and SNES both start with 227 1/3 chroma cycles per scanline. The video generated by the NES dramatically exceeds the chrominance bandwidth that would be transmitted over-the-air, and their designers decided that having the chroma interference pattern repeat every 3 fields wasn't desirable. By removing one pixel, i.e. 2/3 of a chroma cycle, every other vsync, they got something close to the desired 30 Hz stability.
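The arithmetic behind the dropped pixel can be sketched as follows (assuming the usual NES PPU figures: pixel clock at 1.5× the subcarrier, so one pixel spans 2/3 of a chroma cycle; 341 pixels per line and 262 lines per field; in real hardware the dot is only skipped while rendering is enabled):

```python
from fractions import Fraction

cycles_per_pixel = Fraction(2, 3)                  # pixel clock = 1.5x subcarrier
cycles_per_line  = 341 * cycles_per_pixel          # 227 1/3
even_field = 262 * cycles_per_line                 # 59561 1/3 cycles
odd_field  = even_field - cycles_per_pixel         # one pixel dropped

# Without the dropped pixel, the phase would only repeat every 3 fields:
assert even_field % 1 != 0 and (3 * even_field) % 1 == 0
# With it, two consecutive fields sum to a whole number of chroma cycles,
# so the artifact pattern repeats every 2 fields (~30 Hz):
assert (even_field + odd_field) % 1 == 0
```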

Well, at least the NES's designers weren't as lazy as the Apple II's, which uses an integer number of color cycles per scanline. This makes cross-color artifacts consistent from line to line and field to field. In fact, since the pixel clock is also a multiple of the color subcarrier frequency, the cross-color artifacts are all the color there is: the Apple II itself does not generate any color at all, other than the color burst in the back porch!

I would not call Steve Wozniak lazy when his design for the Apple II's mainboard (almost totally a one-man effort) was the first practical, consumer-affordable way to display color graphics on a regular TV without specialized hardware. It was a cool hack of the NTSC decoding system, reliable and cheap enough to power the first color consumer computer.

However, the Tandy CoCo doesn't have the excuse of coming first. The CoCo's composite artifact colors are random at boot: "pixel 0" may be orange or blue depending on where in the clock cycle the system starts up, and you would have to reset the system until you got the "right" colors. That doesn't happen on an Apple II; there you know even lines will be purple/blue dominant and odd lines green/orange dominant.

You can think of the four bits in a chroma period of the Apple II's (or CGA's) output as literally being the four samples [Y+U] [Y+V] [Y-U] [Y-V] (or maybe the reverse). Doing so even makes it easy to see why you get two identical greys out of Apple II or CGA artifact colors: they're the patterns where the +U and -U samples match and the +V and -V samples match, so all the chroma cancels.
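A toy decoder makes the grey argument concrete. This is a sketch, not the actual hardware: treat each 1-bit pixel in a chroma period as one of the four samples above and demodulate by differencing opposite samples.

```python
def demodulate(bits):
    """bits: the 4 pixels within one chroma period, e.g. (1, 0, 1, 0).
    Treat them as samples [Y+U, Y+V, Y-U, Y-V] and recover (Y, U, V)."""
    s0, s1, s2, s3 = bits
    y = (s0 + s1 + s2 + s3) / 4
    u = (s0 - s2) / 2
    v = (s1 - s3) / 2
    return y, u, v

print(demodulate((1, 0, 1, 0)))   # (0.5, 0.0, 0.0) -- grey
print(demodulate((0, 1, 0, 1)))   # (0.5, 0.0, 0.0) -- the identical grey
print(demodulate((1, 1, 0, 0)))   # (0.5, 0.5, 0.5) -- a colored pattern
```

The two "grey" patterns have the same 50% luma and zero chroma, which is exactly why they are indistinguishable on screen.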

The 64-cycle NTSC VIC-II was a mistake, and only the very early revision NTSC 64s had the chip. Most of those were the "Sparkle" C64s, and most of the 64-cycle chips are gone (they replaced the VIC-II when they replaced the CHARGEN). 65 cycles per line is the standard.

Yeah, before the VGA card, in the US, there wasn't really a widespread standard video connector other than baseband CVBS via RCA jack. Even S-Video on the MiniDIN-4 was introduced at the same time as VGA - 1987.

So does modulating to RF. It's not about complexity of the generator; it's about being able to take advantage of the monitors (televisions) that people already had rather than requiring that they buy a new monitor.

Correct. A composite output for use with the composite monitor that one already owns is less circuitry than an RGB output and a bespoke monitor.

Ninja'd twice while trying to post the following:

Before 1987: Most home computers came with composite output of some sort.

1987 through 2007: The dark ages of most people not thinking to connect a personal computer to a television. Because processing, storage, and network communication weren't yet up to the task of SD full-motion video, computers were seen as tools to display large amounts of text. It was possible to use a scan converter to downscale VGA to composite or S-Video, but these were expensive and hard to find, and few PC applications were designed to work with standard definition. This gave the console makers a monopoly on the living room, apart from a minority of geeks, and large game publishers a cartel on single-screen multiplayer.

2007 to present: Most TVs come with HDMI input, which can display HDMI or DVI-D output. Many also come with VGA input. It became even easier in fourth quarter 2015 when Valve introduced the Steam Link thin client, as the PC no longer has to be in the same room as the TV.

2012: The successful crowdfunding of the OUYA console shows demand for indie games on a TV, even if the end product underwhelms. Over the next few years, this causes console makers to open their developer programs to smaller studios in order to keep living room PCs from taking over.

Did people use to use TVs as computer monitors a lot? It's weird how many old computers have composite video.

Yes, almost as "the standard"; if you had a monitor, you were a rich kid. You are used to the modern "TVs cost nothing" world. Back in the late 70s and 80s, TVs were expensive, furniture-like items. For example, a VIC-20 cost you $99, but a monitor for it would cost you $395. So if you had to have a monitor with the VIC-20, the computer cost you $494; if you could use the TV you already owned, it cost you $99.

By '88 it seems you could get a 1084 for $289, while a C128 would cost you $244. Meanwhile, you could get a C64 for $154 and use the TV you already had. You could add an 1802 to the 64 for a better picture for $164. That was basically the tipping point; by the time you get to the IIgs, the machine is $999 and a monitor is still $499.

A disk drive for the C64 was $164. So you could have a C64 + monitor, or a C64 + disk drive + game, for the same price. If you already had a TV, you would use it and take the disk drive.

I find it pretty funny how a lot of Europeans online say that skin colors on NTSC appear as either green or purple, yet I've never seen that happen on any of the TVs I've had. If these Europeans actually saw North American TV sets, they must be either way older than me, or they watched TV with one of those people who just leave the tint control in a funny place.

Saturation and black levels were always my pet peeve with NTSC. They were always inconsistent from one channel to the next.
