“One of the biggest challenges during an F1 race is calculating strategy. With so many variables taken into consideration, a simple calculation of ‘should we pit this lap if there’s a safety car’ requires amazing computational power. Over the years teams have shipped increasingly powerful servers to each event around the globe to perform these operations. Even with the processing power available today, these calculations can take between 30 and 40 seconds to return the decision. With laps at some circuits taking as little as a minute and a half, this time is precious.”

“Currently there is an F1 regulation in place that prevents teams from using the cloud for these real-time, live calculations. For a sport that leads the way in technology, this feels incredibly antiquated. The sport is already a battle of who has the best technology and data, and adding cloud infrastructure to the mix is a natural extension.”

I would have thought that real-time data was already collated and that some of the very compute-intensive calculations and scenarios were already transmitted to “home” data centers for processing. Some natural fits I see with cloud computing:

1) Computing and data resources are elastic: teams would only need to pay for these services on race weekends. When the computing resources are not needed, teams would not have to pay for them, so this could actually represent a cost savings.

2) Unlike “home” data centers that are located at, say, European bases, rented cloud computing resources can be obtained from data center geographies around the world, depending on where that weekend’s race is being held. This addresses delays due to network round-trip times – imagine data being sent to the compute cloud, processed, and the results sent back to the track. Network times are bounded by the speed of light, which in fact DOES play a factor! A theoretical photon can only travel around the world about 7.5 times in a second.
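To put numbers on that speed-of-light point, here is a back-of-envelope sketch. The distances are hypothetical (roughly a Melbourne race weekend reaching a European data center versus a nearby cloud region), and this is a theoretical floor – real networks add routing, fiber refraction, and processing overhead on top:

```python
SPEED_OF_LIGHT_KM_S = 299_792        # speed of light in a vacuum, km/s
EARTH_CIRCUMFERENCE_KM = 40_075      # equatorial circumference, km

# How many times light could lap the globe in one second
laps_per_second = SPEED_OF_LIGHT_KM_S / EARTH_CIRCUMFERENCE_KM
print(f"Light circles the Earth ~{laps_per_second:.1f} times per second")

def round_trip_ms(distance_km: float) -> float:
    """Theoretical best-case round-trip time in milliseconds."""
    return 2 * distance_km / SPEED_OF_LIGHT_KM_S * 1000

# Hypothetical distances: track to a European "home" data center
# vs. track to a cloud region in the same part of the world.
print(f"Distant data center: {round_trip_ms(16_000):.0f} ms minimum round trip")
print(f"Nearby cloud region: {round_trip_ms(700):.1f} ms minimum round trip")
```

Even the worst case here is a tiny fraction of a 30 to 40 second computation, which suggests the bigger win from a nearby region is bandwidth and reliability rather than raw latency – but every recovered millisecond helps when a lap is only ninety seconds long.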