Really a novel and inspiring use of Blender. However, multiple camera angles are needed to reliably track the flying objects through 3D space. The tracking shown here takes place essentially in 2D screen space, although one can make estimates of relative velocity towards or away from the camera.
So even if the featured piece of debris was estimated to fly at 180 km/h perpendicular to the camera, the actual speed may really be much greater if it's flying at a 45-degree angle to the camera. That is why you need to track the same piece from multiple angles.
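The geometric correction being described can be sketched like this (a hypothetical helper, not anything from the video): screen-space tracking only measures the velocity component perpendicular to the line of sight, so the true speed is the apparent speed divided by the sine of the angle between the flight path and the line of sight.

```python
import math

def actual_speed(apparent_speed_kmh, angle_deg):
    """Estimate true speed from the screen-space (apparent) speed.

    angle_deg is the angle between the debris's flight path and the
    camera's line of sight: 90 degrees means flying perpendicular to
    the camera (apparent == actual); smaller angles hide speed.
    """
    return apparent_speed_kmh / math.sin(math.radians(angle_deg))

print(actual_speed(180, 90))  # perpendicular flight: no correction
print(actual_speed(180, 45))  # 45-degree path: ~255 km/h, ~40% faster
```

At shallower angles the correction blows up quickly, which is exactly why a single camera can't pin the speed down.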

"So even if the featured piece of debris was estimated to fly at 180 km/h
perpendicular to the camera, the actual speed may really be much greater
if it's flying at a 45-degree angle to the camera." He did take this into account by plotting the exact path of the debris with the help of Google Maps and Street View. He also talked about harnessing all the technology of the tornado chasers network.
This video seems to be just an illustration of how it works in principle, but in reality he seems to use several techniques, both within Blender (tracking both the debris and the camera movement to calculate the zoom) and outside it (multiple video clips, Google Maps/Earth, and Street View).

No, he did not track the exact path of the debris. Notice that the Blender tracking path was in a curve around the camera instead of around the tornado... because Blender does not know how far away it was. So some way of knowing the exact distance of the debris over time is needed.

He got a guess of where the camera was from Street View, and from the National Weather Service he got where the tornado was: 0.4 miles away. But the debris was closer, probably much closer...
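This is the crux of the distance problem: a tracked feature only gives angular motion, so the linear speed estimate scales directly with whatever distance you assume. A toy sketch (all numbers hypothetical):

```python
import math

def linear_speed_kmh(angular_speed_deg_per_s, distance_m):
    """Convert tracked angular speed to linear speed for an assumed distance.

    Screen tracking only yields angular motion; the linear speed
    scales linearly with how far away you assume the debris is.
    """
    omega = math.radians(angular_speed_deg_per_s)  # rad/s
    return omega * distance_m * 3.6                # m/s -> km/h

# Hypothetical numbers: the same tracked motion, two distance assumptions.
tornado_dist = 0.4 * 1609.34   # ~644 m (the 0.4-mile guess)
debris_dist  = 0.1 * 1609.34   # ~161 m (if the debris is much closer)
print(linear_speed_kmh(5.0, tornado_dist))  # speed if it's at the tornado
print(linear_speed_kmh(5.0, debris_dist))   # 4x smaller if it's 4x closer
```

So a piece of cardboard at 0.1 miles moving across the frame just as fast as debris at 0.4 miles is really moving four times slower.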

GPS is a start, but not accurate enough.

So yeah, multiple camera angles, and/or some kind of calibrated stereo photography, would probably be better. Perhaps a tool could be made to automatically position and motion track storm chaser footage... then we'd have something.
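A toy version of what calibrated multi-camera tracking would buy you, in a top-down 2D view (all positions and bearings made up for illustration): two cameras at known locations each give a bearing to the debris, and intersecting the two rays yields an actual position, hence distance, hence true speed.

```python
import math

def triangulate_2d(cam_a, bearing_a_deg, cam_b, bearing_b_deg):
    """Intersect two bearing rays (top-down view) to locate a point.

    cam_a/cam_b: (x, y) camera positions in metres.
    bearings: compass-style angles (0 = +y, 90 = +x) of the tracked
    debris from each camera.
    """
    ax, ay = cam_a
    bx, by = cam_b
    ta = math.radians(bearing_a_deg)
    tb = math.radians(bearing_b_deg)
    dax, day = math.sin(ta), math.cos(ta)  # ray direction from cam A
    dbx, dby = math.sin(tb), math.cos(tb)  # ray direction from cam B
    # Solve cam_a + s * da = cam_b + t * db for s.
    denom = dax * dby - day * dbx
    if abs(denom) < 1e-9:
        raise ValueError("rays are parallel; no unique intersection")
    s = ((bx - ax) * dby - (by - ay) * dbx) / denom
    return (ax + s * dax, ay + s * day)

# Two hypothetical cameras 500 m apart, both seeing the same debris.
print(triangulate_2d((0, 0), 45.0, (500, 0), 315.0))  # -> ~(250, 250)
```

Repeating this per frame gives the position over time, which is exactly the missing z information a single camera can't provide.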

For the debris in this clip he used data provided by the National Weather Service. I assume they use up-to-date equipment.
And he talked about using all of the data available from the storm chasers network. Data from one single GPS-equipped camera may not be accurate enough, but the combined data from a bunch of those cameras should be.
But I'm whining a bit, I think. And this is a very interesting idea. If it works, it could open up a whole lot of other possibilities.

I agree it's interesting and could open up more possibilities, but the person doing it must be scientific about it.

Without knowing the margin of error of the data, it's useless. The more I look into this, the larger the margin of error appears. http://www.srh.noaa.gov/srh/ssd/mapping/ says "This map is meant to show the general path of the tornadoes based on a few points. If you zoom down to the neighborhood level, the tracks will likely not match the exact path." I'm assuming these paths were provided by storm chasers and/or Doppler radar, so we're dealing with the limited resolution of the radar (which gets lower the farther away the storm is) and the accuracy of GPS (~5 to ~50 meters), which may be hindered by the storm. And still we're only getting the center of the tornado, not the debris flying in front of it (which might be a piece of cardboard flying 0.1 miles away instead of 0.4).

So the speed of the debris might be 120 km/h ± 100 km/h. Who knows.
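That ± figure can be made slightly more principled: since the speed estimate is angular speed times distance, the relative error in the assumed distance carries straight into the speed. A sketch with made-up numbers:

```python
def speed_with_error(angular_speed_rad_s, distance_m, distance_err_m):
    """Estimated speed v = omega * d, so the relative distance error
    (distance_err / distance) is also the relative error in v."""
    v = angular_speed_rad_s * distance_m          # m/s
    return v, v * (distance_err_m / distance_m)   # (speed, error) in m/s

# Hypothetical: debris assumed at ~640 m, but it could plausibly be
# anywhere from ~160 m out to that distance, i.e. uncertain by ~480 m.
v, err = speed_with_error(0.087, 640, 480)
print(f"{v * 3.6:.0f} km/h +- {err * 3.6:.0f} km/h")
```

With a 75% distance uncertainty, the error bar is nearly as big as the estimate itself, which matches the "who knows" above.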

I do enjoy this kind of problem solving though. Blender is a great tool for trying out ideas...

This technique has potential. Great job! Hopefully it will inspire other creative uses for Blender.

But we need to know exactly how far away the debris is over time to get an accurate speed. (The camera just gives us x and y, not z.)
He figured that the camera was about* 0.4 miles from the tornado, but the debris had to be much closer. And like Jarl Arntzen said, we need to know the angle to the camera, which we would know if we knew the distance to the camera over time.
* It was not a perfect match from Street View, and we don't know how accurate the GPS information on the Street View cam is, or how accurate the tornado track from the National Weather Service is. I wouldn't depend on those.
I see promise in using multiple cameras, and perhaps other automated tracking tools, to capture multiple vectors, hopefully including subtle ones like the edges of clouds. Blender might not be the best tool for this job, but this is clever thinking that could be valuable... And Blender's great for prototyping ideas like this!
Can Blender track points from multiple angles?