August 1, 2013 AT 11:00 am

Photogrammetry has a long and fascinating history stretching back over a century before digital stereo photogrammetry reached the level of automation that allows amazing projects such as Autodesk 123D Catch to exist. Still, it is not every day that a hardcore coder/VFX maker rolls up his sleeves and braves the murky mathematics and coding nightmares to sort out a route other than the black boxes of the commercial products.

So a few weeks ago, Dan Short showed me 123D Catch. It was awesome. You feed in some pictures from your iPhone and they get uploaded to the cloud, where they’re turned back into a textured 3D model you can explore on your phone or download to your computer. 123D Catch (henceforth also referred to as “Catch”) is part of Autodesk’s consumer line of 3D technologies, which includes products for 3D modeling on the iPad and producing 3D prints.

Until Dan showed me some models he generated from exhibits at the AMNH I didn’t really get the point of Catch…so what, you have a model of your water bottle…but what Dan showed me was that it worked incredibly well on environments too: the Hall of African Mammals or even the penguin diorama from the infamous whale room! This is all done using a process called photogrammetry, or Structure from Motion (SFM), in which the computer finds features common to multiple images and reconstructs a 3D form from how those overlapping features shift between viewpoints.
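The geometric heart of that trick shows up even in the simplest two-camera case: once the same feature is spotted in two overlapping photos, its depth falls out of the parallax between the views. Here’s a minimal Python sketch of that idea (a rectified stereo pair with a made-up focal length and baseline — an illustration of the principle, not how Catch or any real SFM package implements it):

```python
# Depth from two overlapping views -- the core idea behind SFM.
# Assumes a rectified stereo pair: two identical cameras separated by a
# horizontal baseline. The focal length and baseline are hypothetical.

def project(f, camera_x, point):
    """Project a 3D point (X, Y, Z) into a camera offset by camera_x."""
    X, Y, Z = point
    return (f * (X - camera_x) / Z, f * Y / Z)  # pixel coordinates

f = 800.0        # focal length in pixels (hypothetical)
baseline = 0.1   # distance between the two camera centers, in meters

point = (0.25, 0.05, 2.0)                 # a feature 2 m from the cameras
x_left, _ = project(f, 0.0, point)        # feature as seen by left camera
x_right, _ = project(f, baseline, point)  # ...and by the right camera

# The same feature lands at different pixels in each image; that shift
# (the disparity) encodes depth: Z = f * baseline / disparity.
disparity = x_left - x_right
depth = f * baseline / disparity
print(depth)  # recovers 2.0, the point's true distance
```

Real SFM pipelines generalize this: the camera positions are themselves unknown and get solved jointly with the 3D points (bundle adjustment), but the triangulation principle is the same.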

…I wasn’t thrilled by 123D Catch’s black box nature, and wondered whether there were any open source alternatives without as many limitations. Surely, a lot of power was being sacrificed by its fully automatic workflow? Here are some other ways that 123D Catch is quite limited:

Photo limits: the iPhone app seems to allow a maximum of 40 images. (The non-mobile version for Windows is limited to 70.) There’s no technical reason why there should be any limit. Plus, capturing larger 3D scenes like detailed environments will require more than 70 pictures pretty quickly.

Down-rezed photos: I’m pretty sure that, in order to speed the photos’ “ascent” into the cloud, the pictures are scaled down, limiting their detail and usefulness as a high-rez texture during the projection phase.

Limited texture map size: in my few tests, the texture maps returned from 123D Catch’s automatic processes come back at a fixed size…you have no control over how big a map you want.

Total black box: no controls to guide the 3D reconstruction or manipulate the results. This process is pretty intricate, and having no controls seems a little scary.

Lightweight output by design: Autodesk’s 123D line (Catch included) isn’t meant to be a professional solution: this is meant to 3D-ize trinkets and give you something to 3D print with, not create high-rez models.

If I wasn’t going to use 123D Catch, I had to find some alternative. Since I’m not into pirating software, and I wanted to see how far I could get with $0.00, I decided to investigate what was available and open source. Lots has been written on the pros and cons of open source software, but I’ve learned over time that the FOSS (Free and Open Source Software) tools are often on par with or exceed their commercial brethren. Since the emphasis is never on turning a profit, FOSS software is rarely intentionally crippled à la 123D Catch’s photo size and quantity limits.

I started to do research on FOSS alternatives, but getting them up and running took a bit of time. Over time, I discovered Bundler (which was actually created from open sourced components of the Photosynth project!) and RunSFM. Both of their homepages pointed to another product, VisualSFM, which was more up to date and represented the state of the art in FOSS photogrammetry technology. (For a list of SFM software, free and commercial, check out this list. Special shout out to the Python Photogrammetry Toolkit (PPT), which looks promising…check this out here.)

After finding some FOSS solutions and after TONS of research, math, and picture taking (and seeing other people’s results like these!), I’d like to present a totally FOSS pipeline for converting images into textured 3D models. This document represents hours of work and covers all the steps of creating textured 3D geometry from unregistered images…

Every Thursday is #3dthursday here at Adafruit! The DIY 3D printing community has passion and dedication for making solid objects from digital models. Recently, we have noticed electronics projects integrated with 3D printed enclosures, brackets, and sculptures, so each Thursday we celebrate and highlight these bold pioneers!

Have you considered building a 3D project around an Arduino or other microcontroller? How about printing a bracket to mount your Raspberry Pi to the back of your HD monitor? And don’t forget the countless LED projects that are possible when you are modeling your projects in 3D!

The Adafruit Learning System has dozens of great tools to get you well on your way to creating incredible works of engineering, interactive art, and design with your 3D printer! If you’ve made a cool project that combines 3D printing and electronics, be sure to let us know, and we’ll feature it here!