First-Hand:My Recollections: Development of Football's Virtual First Down Line

Revision as of 15:08, 16 November 2012

Contributed by J.R. Gloudemans

Whose Idea Was It?

I was working for Shoreline Studios in 1996, when I first heard the idea of adding a first down line to the broadcast video. Shoreline Studios helped to develop the glowing hockey puck for Fox Sports and I think the idea came from Fox. But the original idea came from an inventor named David Crane, who patented it in the 1970s and pitched it to numerous television executives. I suspect it was a nearly impossible project back in the '70s and not worth the expense even if it was possible.

In 1997 I joined a startup called SportVision, founded by Stan Honey and Jerry Gepner. The first product we completed was called AirFX, a camera system used to measure and display the height that basketball players jumped during the game. We made it to air a few times, but the results were not exciting, and the broadcasters and viewers were not impressed. So we kicked around some other ideas, and the first down line came up. I asked my boss, Marv White, to let me experiment with what we considered to be the toughest and most uncertain aspect of the idea -- the keying problem. When you see the yellow line, you see the players walk over the line. It's done using a glorified green screen, essentially like the weatherman's green screen in the background. The players obviously aren't walking over a green screen; they're walking over dirt and grass. At the time, with the available processing power, we were very nervous about whether we would be able to sample and test the pixels and whether the results would look any good.

Secret Testing

In the spring of 1998, Marv let me write some code to test some different approaches to the keying problem and to see if the SGI O2 computers would be fast enough. I was supposed to be working on AirFX, so we kept this test project quiet. I started with a video clip from a football game and tried to turn the field a different color. It worked better than expected, but we would have to be clever to keep up with the video frame rate. The approach I settled on involved creating boxes in color space to decide which colors to replace. After experimenting with a number of color spaces, it turned out that YCrCb (used for television) worked the best. The keying worked by sampling a pixel from the screen in YCrCb space and looking up the transparency value in a 256x256x256 table. The table was built from user-defined inclusion and exclusion boxes. The boxes also had a transparency taper zone to reduce flicker. Patent 6,229,550 has more information about the keying system.
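
The lookup-table keyer described above can be sketched in a few lines. This is only an illustration of the idea, not SportVision's code: the box coordinates and taper width are invented, and a 64-bin grid stands in for the full 256x256x256 table to keep the example small.

```python
import numpy as np

def build_key_table(inclusion_boxes, exclusion_boxes, taper=8, bins=64):
    """Transparency lookup table over quantized YCrCb space.

    Each box is ((y_lo, cr_lo, cb_lo), (y_hi, cr_hi, cb_hi)) on a
    0..bins-1 grid. Inside an inclusion box the key is fully on (255);
    outside, it fades linearly to 0 over `taper` grid steps -- one way to
    realize the flicker-reducing taper zone described in the text.
    Exclusion boxes force the key off.
    """
    axes = np.arange(bins)
    table = np.zeros((bins, bins, bins), dtype=np.uint8)

    def box_alpha(lo, hi):
        # Per-axis distance to the box (0 when inside on that axis).
        dists = []
        for ax in range(3):
            shape = [1, 1, 1]
            shape[ax] = -1
            coord = axes.reshape(shape)
            dists.append(np.maximum(np.maximum(lo[ax] - coord, coord - hi[ax]), 0))
        # Chebyshev distance to the box, tapered to zero over `taper` steps.
        d = np.maximum(np.maximum(dists[0], dists[1]), dists[2])
        return np.clip(1.0 - d / taper, 0.0, 1.0)

    for lo, hi in inclusion_boxes:
        table = np.maximum(table, (box_alpha(lo, hi) * 255).astype(np.uint8))
    for lo, hi in exclusion_boxes:
        table[box_alpha(lo, hi) >= 1.0] = 0  # hard off inside the box
    return table

# Hypothetical boxes: "grass" colors become keyable, a jersey-like
# color region overlapping them is excluded.
table = build_key_table(
    inclusion_boxes=[((10, 20, 12), (30, 34, 24))],
    exclusion_boxes=[((24, 30, 20), (28, 33, 23))],
)
```

Keying a pixel is then a single lookup, `table[y, cr, cb]`, which is what makes this kind of approach fast enough to keep up with the video frame rate.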

Prototypes

After proving that keying would work, we were confident that the rest of the project was feasible. All of the camera tracking, field registration and graphics insertion problems were similar to the hockey puck tracking project that many of us had worked on. The next step was getting a broadcaster to commit to the project and for that we needed a demo video. So I cheated and hand-digitized the line placement for a couple of plays, but I did use the test keying software to place the line under the players. The results looked great and as a football fan I thought the line would really add to the broadcast. Based on that video tape, ESPN wanted the line for their Sunday night NFL football broadcast. There was one big problem: it was already June and they wanted the system for their first regular season game in September -- we had a really tight deadline.

I initially proposed red as the color for the first down line because it showed the least amount of movement. Today the line is solid as the camera moves around, but in the beginning we had a lot of issues with line movement. I experimented with a number of line colors, and red seemed to work the best. In the end, it was Jed Drake at ESPN who chose yellow as the color that is now used in the First-and-Ten line.

Development

The original system comprised seven computers and a lot of video gear. There was a computer at each of the three main camera locations monitoring the pan, tilt, zoom and focus encoders and relaying that information down to the production truck. There was also a computer (called "Gather") in the truck packaging and timestamping the camera data. Rick Cavallaro wrote the code for these computers. The three other SGI computers handled drawing the line ("Render"), figuring out the key ("Matte") and determining which camera was on air ("Tally"). The hardware development was handled by Stan Honey and Terry O'Brien.

The original system was downstream of everything, including the switcher, so we needed a method to tell which camera was on air. We tried inserting identification data into the video at each camera, but the switcher would occasionally remove the data. So we resorted to a big hack to make this work: the three up cameras and the on-air feed were combined into one feed, and Marv wrote image recognition code to figure out whether any of the three cameras matched the on-air feed. This was the Tally subsystem. One strange byproduct of this approach was that if the commentators used the telestrator, the line would disappear because the on-air video no longer matched the raw camera feed.
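
A toy version of that camera-matching idea, assuming a simple mean-absolute-difference comparison over coarsely downsampled frames (the actual metric and thresholds SportVision used are not described here):

```python
import numpy as np

def which_camera_on_air(camera_frames, on_air_frame, threshold=4.0):
    """Guess which isolated camera feed matches the on-air frame.

    A much-simplified stand-in for the Tally matching described above:
    compare small luma frames by mean absolute difference and pick the
    closest camera, or None if nothing is close enough -- the situation
    that arose when the telestrator altered the on-air picture. The
    metric and threshold are illustrative guesses.
    """
    errors = [float(np.mean(np.abs(f.astype(np.int16) - on_air_frame.astype(np.int16))))
              for f in camera_frames]
    best = int(np.argmin(errors))
    return best if errors[best] < threshold else None
```

Returning "no match" rather than a stale camera index would explain why the line simply disappeared, instead of being drawn wrongly, when no feed matched.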

I wrote the code for the Matte subsystem, which ran on an SGI O2 computer that displayed the on-air video feed and provided a user interface for selecting inclusion and exclusion color boxes. Selection histograms and the results of the current key were also displayed. The key information was passed on to the Render computer. One design flaw in the original system was the inability to change the video feed coming into Matte. This resulted in some frantic key changes because we could not preview the three main cameras.

Walter Hsiao and I developed the Render subsystem using an SGI Octane computer. This computer received the camera data from Gather and did the math to figure out where the line should be placed. It also kept a copy of the key table that was updated by the Matte computer. The computer was not fast enough to look up every pixel in the line. So we divided the line into polygons and only checked the key transparency value at the corners of each polygon. There were some artifacts in the line rendering due to this compromise. Several years after the debut, computers got fast enough to check every pixel in the line. Render also contained the user interface to perform the initial camera registration and field modeling.
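
The corner-sampling compromise can be illustrated for a single scanline: sample the key only at segment endpoints and interpolate in between, trading a few sub-pixel artifacts for far fewer lookups. The function shape and names here are hypothetical, not the Render code:

```python
def render_line_scanline(key_at, x0, x1, y, segments=8):
    """Rasterize one scanline of the line, sampling the key transparency
    only at segment corners and interpolating linearly between them.
    Returns a list of (x, alpha) pixels; `key_at(x, y)` stands in for
    the per-pixel key lookup that was too slow to do everywhere."""
    corners = [x0 + (x1 - x0) * i / segments for i in range(segments + 1)]
    alphas = [key_at(int(x), y) for x in corners]  # segments+1 lookups, not x1-x0
    pixels = []
    for i in range(segments):
        xa, xb = int(corners[i]), int(corners[i + 1])
        for x in range(xa, xb):
            t = (x - xa) / max(xb - xa, 1)
            pixels.append((x, alphas[i] * (1 - t) + alphas[i + 1] * t))
    return pixels
```

For an 80-pixel span this performs 9 key lookups instead of 80; where the true key changes sharply inside a segment, the interpolated alpha is wrong, which matches the rendering artifacts mentioned above.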

The challenge of synchronization set our schedule back a few weeks, and we didn't get the first game on air. When we did a preseason game, most of the system ran well, but we had timing issues. I originally thought we could get by with open-loop timing; in other words, we could figure out what the delay was, dial it in at the beginning of the game, and it would work for the entire game. As it turned out, that wasn't a good assumption, so we had to go back and revamp the system. We inserted a timecode (VITC) into the video signal. This solved our timing issues because we could exactly match the time on the video frame to the timestamp on the camera data. But timing did come back to haunt me during the first game.
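
With VITC in the picture, synchronization reduces to a lookup: for each video frame, find the camera-sensor sample whose timestamp is nearest the timecode stamped on that frame. A minimal sketch, where the data layout and names are assumptions:

```python
import bisect

def camera_data_for_frame(frame_timecode, samples):
    """Return the camera data whose timestamp is closest to the VITC
    timecode on the frame. `samples` is a list of (timecode, data)
    tuples sorted by timecode -- the stream Gather would produce.
    This closed-loop pairing replaces the fixed open-loop delay."""
    keys = [t for t, _ in samples]
    i = bisect.bisect_left(keys, frame_timecode)
    candidates = [j for j in (i - 1, i) if 0 <= j < len(samples)]
    best = min(candidates, key=lambda j: abs(keys[j] - frame_timecode))
    return samples[best][1]
```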

First Game On Air

Our first game on air was in Baltimore, a couple of weeks into the regular season. During the first quarter everything looked great and the response from ESPN was positive. But toward the end of the first half, the yellow line started to jiggle more than it should have. Viewers at home probably didn't even notice it, but we were hypersensitive to any motion. We scrambled to figure out what was going on. During halftime we rebooted the computers, but the problem got worse and worse as the game progressed. We got through the game, and the yellow line was declared a critical success. But the line jiggling had me stumped. In the days that followed, I dug through all the code, and it turned out I was using a single-precision float for the timecode. During testing we never ran a full game without resetting the timecode generator, so the problem did not show up. The line jiggled because we ran out of precision on the float: the step size was small compared to the magnitude of the timecode. To this day I cannot imagine why I ever made it single precision (lack of sleep?). It is one of those bugs that you just look back on and wonder, "Why?"
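
The failure mode is easy to reproduce. A single-precision float has a 24-bit significand, so its resolution degrades as the stored value grows; if a timecode is accumulated in, say, milliseconds (the unit here is an assumption for illustration), the representable steps eventually become comparable to a frame period and anything derived from frame-to-frame timing jitters:

```python
import struct

def to_f32(x: float) -> float:
    """Round a Python double to the nearest IEEE 754 single-precision value."""
    return struct.unpack("f", struct.pack("f", x))[0]

FPS = 29.97
FRAME_MS = 1000.0 / FPS  # ~33.37 ms between NTSC frames

def f32_step(t_ms: float) -> float:
    """Apparent frame-to-frame time step after storing timestamps in float32."""
    return to_f32(t_ms + FRAME_MS) - to_f32(t_ms)

step_early = f32_step(10 * 60 * 1000.0)    # 10 minutes in: ~33.375 ms, fine
step_late = f32_step(30 * 3600 * 1000.0)   # 30 h of unreset timecode: snaps to 32 ms
```

Above about 2^23 the float32 grid is coarser than a millisecond, so consecutive frame timestamps land on wrongly spaced grid points -- consistent with the jiggle getting worse the longer the timecode generator ran without a reset.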

On The Road

It took a whole truck to house all the hardware. We used the same truck that was used for the puck tracking system; we bought it from Fox and reconfigured it for this project. Early in the project, we told ESPN: "We can't do this game in Philly. It's a Thursday night game and we can't physically get the truck out there." And they said, "Okay." But when the Yellow Line became a big hit, ESPN really pushed us to do the Philly game. I recall a rumor that ESPN was even considering getting one of those big Russian transport planes to ship the truck to Philly and back. At that point we thought to ourselves, "We're not charging enough for this stuff if they're actually thinking of doing that." Instead we hired two drivers to get our truck cross-country and lived with the reduced setup time. ESPN used a different production truck for this game, and we had to set up some extra video converters on a card table outside the truck. The game went well given all the trouble of interfacing with a new production truck. After the game ended, one of the truck drivers, in his haste to get on the road, started to pack up the gear on the card table. Unfortunately, the postgame broadcast was still on air, and since we were downstream of everything, we took them off air. You should have heard the screaming on that one. It was kind of sad because the truck driver felt so bad.

Lighting and weather conditions added challenges to the keying. We got lucky the first season because we only did night games (ESPN Sunday Night Football), so the lighting was consistent. Snow and rain caused problems, and on one occasion there was a torrential downpour at a Kansas City game. Walter Hsiao was doing the keying, and he had to work very hard to keep the line visible and not draw on the players. Another tough stadium was Candlestick in San Francisco. The Niners wore pants of a sort of brownish color, and back then, when baseball was also played on that field, the dirt from the infield was a close color match to the uniforms. So keying in those circumstances was very tricky.

During the first season, the user interface was pretty quirky and the system was tricky to operate, so only Walter and I were trusted to run it on air. Walter and I switched off every game that year, and we were on the road a lot. The really frustrating thing was that, after a weekend road trip, we would come back to the office and face a list of bugs that needed to be fixed. But the critical response has been great, and I am proud of the work everyone did to get on air under a tight deadline.

This article was posted by the Administrator on behalf of the author. It is an exact copy of the text sent to the Administrator by the author in an email on 29 Oct. 2012