Pages

Thursday, July 23, 2015

I think this brings things full circle. Two weeks ago there was a barrage of baby pill bugs, last week we saw death occur as a hungry beetle larva happened upon a slug, and this week we have millipede mating!

What we can see occurs over a period of 6 hours, and at some point it looks like three millipedes are involved. It's difficult to tell whether the same millipedes are participating throughout.

I had originally incorrectly titled this post "Centipede Mating". You can see between 16:02:01 and 16:57:02 that each body segment has two pairs of legs. If these were Centipedes you would only see one pair per segment. Assuming these are mating Millipedes we might get to see a slew of babies in a few weeks! Apparently they start off with only three pairs of legs and only after a few molting sessions do they appear as we're used to seeing them.

Amazing piece of info: Millipedes are Myriapods (literally "many feet"), and are among the earliest known air-breathing animals to have walked on land! And last, these creatures are significant contributors to decomposition, that crazy process that results in this wonderful pile of material we call soil that keeps you and me fed. Next time you're eating anything or simply going somewhere on a walk, thank the Millipedes.

Image stabilization issues
Some of the videos I compile tend to shake left and right, a result of the scanner occasionally starting at a slightly different position. It always scans the same distance, and looking at one image next to another it's difficult to see any difference. Play them back in a video, though, and it becomes very apparent that the images don't line up.

I've used FFmpeg's deshake filter to smooth out this jitter, but it's difficult to remove all of it. Going through each image one at a time is a bit too time consuming, so we need to either resolve the initial problem that causes it, or figure out a more accurate workaround.

A few thoughts on resolving / working around this:

1. Figure out how the scanner returns home.
- At 600 DPI, even a millimeter of offset results in a shift of roughly 24 pixels (600 / 25.4 ≈ 23.6 pixels per mm). When viewing the whole 8.5x11" area it's not very obvious; when zooming into a smaller area it is. The manufacturer's intended use for these scanners likely did not include building timelapse videos : )

2. Can we compare the first twenty rows of pixels from left to right of sequential images?
- Once we find a certain percentage match, trim all rows to the left off.

3. Apply a ~5mm solid white border to the glass scanning plate
- Use imagemagick to locate the border and crop within the boundary.
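Idea 2 can be sketched in plain Python. This is a rough sketch under my own assumptions (grayscale rows as plain lists, a mean-absolute-difference score); a real version would read pixel rows out of the scans. For scale: at 600 DPI one pixel is 25.4/600 ≈ 0.042 mm, so a 1 mm drift is roughly a 24 pixel shift.

```python
# Rough sketch of idea 2: find the horizontal offset between two scans by
# comparing their top rows of pixels. Grayscale rows are plain Python lists
# here; a real version would read them out of the scanned images.

def row_offset(ref_row, row, max_shift=30):
    """Return the shift (in pixels) that best aligns `row` to `ref_row`,
    scored by mean absolute difference over the overlapping region."""
    best_shift, best_score = 0, float("inf")
    for shift in range(-max_shift, max_shift + 1):
        overlap = [
            abs(ref_row[i] - row[i + shift])
            for i in range(len(ref_row))
            if 0 <= i + shift < len(row)
        ]
        # Skip shifts where the rows barely overlap.
        if len(overlap) < len(ref_row) // 2:
            continue
        score = sum(overlap) / len(overlap)
        if score < best_score:
            best_shift, best_score = shift, score
    return best_shift

# Tiny demo: the second "scan" is the first one shifted right by 3 pixels.
ref = [0, 0, 10, 200, 180, 30, 0, 0, 0, 0, 5, 90, 220, 40, 0, 0]
shifted = [0, 0, 0] + ref[:-3]
print(row_offset(ref, shifted))  # 3
```

Once the offset is known, each image could be cropped by that many columns before being fed to the video encoder, which would remove the jitter at the source instead of smoothing it afterwards.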

Saturday, July 18, 2015

A week ago a dying worm slowly crawled into a small pocket of space. I was interested at the possibility of seeing what the decomposition process would look like. A few days in and this happens:

I didn't expect to see something so violent! The timelapse video shows in 10 seconds what in reality took nearly 12 hours. Maybe the speed at which it plays back is what makes it feel so violent?

It would be great to develop a system that allows for real time viewing, but that will have to wait.

I'm not sure what type of slug was eaten; the predator in this case appears to be some sort of Beetle Larva. You can also make out the worm in the first few seconds of the video, though the view is quickly obscured as the Beetle Larva makes its way in.

Friday, July 10, 2015

A few days ago a couple small explosions of tiny white bugs appeared. Zooming in shows what appear to be hundreds of tiny Pillbugs (Armadillidium vulgare, roly-polies : )! Action begins approximately 9 seconds in.

According to this article these young crustaceans have already spent 10 to 14 weeks riding around in the pouch of a female Pillbug. I wonder if we could look back at previous video and identify any of them?

The area represented in the video is approximately ~8.1cm x 4.5cm (3.2" x 1.8"). This puts the young Pillbugs at ~1.5mm (1/16") in length.

Most of the Pillbugs quickly disperse beyond the viewable area within a few seconds. A few slow down and some can be seen feeding, or being fed on.

So much happening in such little space, so much more happening that's barely visible, and I can only begin to imagine how much is going on that isn't visible at all at this level.

Sunday, June 28, 2015

Talking with George Albercook and Greg Austic got me wondering (in very rough terms) how the natively captured scanned images compare with a camera.

Scan DPI: 600
Physical area: ~11.7 x 8.5"
Pixels: 7015 x 5076

Equivalent to a 35 megapixel camera. Pew pew megapixels! Unfortunately, the videos being posted are processed down to ~1920x1080.

It would be interesting to see the timelapse at 4k on an appropriate monitor, but it's more interesting to start looking closer. Since we're capturing the images at a higher res than being displayed, we can crop out a 1920x1080 section of the native res image and turn that into a video. Here's what that looks like over the course of a week:

The video above is of a ~3.2" x 1.8" splice of the earth. One week = 604800 seconds. This plays back in ~66 seconds. Life at ~9163x!
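The arithmetic above is simple enough to sanity-check in a few lines of Python:

```python
# Sanity-check the numbers in this post.

width_px, height_px = 7015, 5076      # native scan resolution at 600 DPI
megapixels = width_px * height_px / 1e6
print(round(megapixels, 1))           # ~35.6 MP

week_seconds = 7 * 24 * 60 * 60       # one week = 604800 seconds
playback_seconds = 66
speedup = week_seconds / playback_seconds
print(round(speedup))                 # ~9164x
```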

The scanner being used can capture images at 2400 DPI, and scanners that can capture at 9600 DPI aren't terribly expensive (though they do get bulkier). Capturing a full image at that resolution is probably too much for the Raspberry Pi to handle, and the storage requirements would be excruciating.

But capturing even a square inch of space at that DPI would be fascinating, and I think doable : )

For now, I wonder what square inch would be most interesting to capture at 2400DPI?

Also, latest images are being posted at the "Latest Image" tab above, with a roughly 10 minute delay.

Tuesday, June 16, 2015

Ann Arbor saw between 1 and 3 inches of rain on Sunday, June 14th. It's interesting to see what the rain does to freshly turned soil (turned about two weeks ago).

Playing around with imagemagick a bit more. I'm splicing a 100x100px section and getting the 'average brightness' of it. Scans are taken every 5 minutes, but to keep things simple I'm only sampling every third image, or roughly every 15 minutes.

Then I repeat that process 100px down, continuing until we get to the bottom of the image. Imagemagick spits out a pile of numbers like so:

Reading Date  Rainfall (in.)  0-100    100-200  200-300  300-400  400-500  500-600  600-700  700-800  800-900
0:00:00       0               9598.33  13316.9  10462.3  10667.8  10865.2  10513.7  10696.4  10353.2  10342.7
0:15:00       0               9460.71  13131.9  10450.9  10675.1  10881.2  10513.1  10743.1  10371.1  10354.6
0:30:00       0               9237.39  12896.3  10400.8  10698.7  10877.8  10542.1  10744.6  10392.1  10368.6
0:45:00       0               9155.23  12850.5  10315    10650.4  10851.8  10505.7  10729.3  10370.2  10382.1
1:00:00       0               9487.32  12982.9  10343.6  10668.1  10847    10512    10734.3  10374.6  10384
1:15:00       0               9519.43  12857.8  10288.2  10630.3  10810.6  10464.2  10685.2  10365.9  10339.7
1:30:00       0               9623.34  12937.3  10331.2  10609.4  10802.6  10485.5  10736.8  10393.2  10421.1
1:45:00       0               9516.39  12664.1  10087.5  10552.4  10765.2  10437.9  10715.7  10369.7  10345.9
2:00:00       0               9497.97  12638.1  10084.9  10480.2  10656.9  10417.9  10720.8  10366.6  10347.4
2:15:00       0               9586.58  12693.7  10088.4  10481.7  10597.6  10347.9  10712.9  10380.3  10333.1
2:30:00       0               9770.57  12844.1  9968.46  10395.3  10489.6  10198.7  10651.9  10324.4  10265.1
2:45:00       0               9660.3   12739.6  10046.8  10414.7  10511.4  10188    10619.8  10355.5  10314.7
3:00:00       0               9605.28  12698.8  10024.2  10384.2  10493.6  10182    10594    10318.9  10277.6
3:15:00       0               9569.22  12469.8  10023.2  10386.9  10474.1  10120.6  10551.8  10289.6  10249.6
3:30:00       0               9575.44  12534.9  10039.6  10409    10488.7  10132.1  10541.7  10279    10242.2
3:45:00       0               9509.7   12525.6  10005.7  10370.4  10443.8  10094.8  10502.8  10236.6  10190.7
4:00:00       0               9598.07  12539.1  9968.55  10371.2  10460.4  10099.1  10510.9  10232.1  10183.5

Reading Date and Rainfall (in.) come from elsewhere. The first 100px band is mostly black; that part of the scanner is above ground, and since it's night there is little for the scanner's light to reflect off. A graph of the 400-500px region from 0:00:00 to 4:00:00 looks like:

The numbers above unfortunately represent the day prior to the rainfall. Whoops. In the morning I'll have numbers for the proper day and be able to compare them with data from the City of Ann Arbor's Rain Gauges. Some time later this week I'll post the results.

I'm wondering if I can show, with some degree of reliability, how far and how quickly the rain is penetrating into the soil with this setup. I guess I should build/buy some soil sensors at this point to compare : )

Also, I'm currently using Imagemagick's identify -format '%[mean]' command to infer "image brightness", but I'm really not sure what the command is doing / how it comes up with the numbers it spits out...
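My best guess (worth verifying against the ImageMagick docs) is that %[mean] averages every channel value in the image and reports it on the build's quantum scale (0-65535 on a common Q16 build), which would explain values around 10000 for mostly dark soil. The band sampling itself is easy to sketch in Python, with a made-up image in place of the real scans:

```python
# Sketch of the band-brightness sampling. A made-up "image" (a list of rows
# of grayscale values) stands in for the real scans; identify -format
# '%[mean]' reports the average pixel value, and this does the same per
# 100px-tall horizontal band.

def band_means(image, band_height=100):
    """Average pixel value of each `band_height`-row horizontal band."""
    means = []
    for top in range(0, len(image), band_height):
        band = image[top:top + band_height]
        values = [v for row in band for v in row]
        means.append(sum(values) / len(values))
    return means

# Made-up 300-row, 4-column image: dark band, brighter band, mid band.
image = [[1000] * 4] * 100 + [[13000] * 4] * 100 + [[10500] * 4] * 100
print(band_means(image))  # [1000.0, 13000.0, 10500.0]
```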

So much to learn!

Updated:
Comparison of rainfall vs. image brightness for June 14th. This is a 100px snapshot ~4 inches below the surface. Both the rainfall and image brightness values were remapped from 0 - 100.

I'm doing a couple things here that I'm pretty sure are bad ideas.
1. I'm analyzing the jpeg, not the original tiff file.
2. I'm analyzing a section of a copy of the original jpeg, more loss : )
3. I have no real clue what I'm doing with the math. I think I used the same method I used to remap and constrain light sensor values on an arduino project from years ago for this (return (x - in_min) * (out_max - out_min) / (in_max - in_min) + out_min;)...
4. Wheeeee!
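For the record, that remap formula is a straight linear interpolation (the same math as Arduino's map(), minus the integer truncation). A minimal Python version:

```python
def remap(x, in_min, in_max, out_min, out_max):
    """Linearly map x from [in_min, in_max] onto [out_min, out_max]
    (same formula as Arduino's map(), without integer truncation)."""
    return (x - in_min) * (out_max - out_min) / (in_max - in_min) + out_min

# e.g. remap a Q16 ImageMagick mean (0-65535) onto a 0-100 scale:
print(remap(9598.33, 0, 65535, 0, 100))   # ~14.6
print(remap(0.5, 0, 1, 0, 100))           # 50.0
```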

Thursday, June 11, 2015

ImageMagick is a wonderful suite of tools that lets people do lots of interesting things with images. Combining it with very minimal scripting knowledge lets one easily automate the processing of large batches of images.

I used ImageMagick's compare tool to highlight in red the difference between 286 sequential images captured over 24 hours. I then used IM's convert tool to remove all other colors, create a transparent background, and finally stack each image onto the next.

Avconv was then used to turn these into a short 10 second video.
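The actual pipeline here used ImageMagick's compare/convert and avconv, but the core idea (flag the pixels that changed between sequential frames, then stack those masks into one overlay) fits in a few lines of Python. This is a toy sketch with nested lists standing in for images, not a reproduction of the real commands:

```python
# Toy sketch of the diff-and-stack idea: compare sequential "frames",
# keep only the pixels that changed, and accumulate them into one overlay.

def changed_pixels(a, b, threshold=10):
    """Set of (row, col) positions where two frames differ by > threshold."""
    return {
        (r, c)
        for r, row in enumerate(a)
        for c, _ in enumerate(row)
        if abs(a[r][c] - b[r][c]) > threshold
    }

def stack_changes(frames, threshold=10):
    """Union of changed pixels across every pair of sequential frames,
    roughly the 'worm path' overlay described above."""
    overlay = set()
    for prev, curr in zip(frames, frames[1:]):
        overlay |= changed_pixels(prev, curr, threshold)
    return overlay

# A "worm" moving one pixel per frame along the middle row of a 3x3 image:
frames = [
    [[0, 0, 0], [200 if c == t else 0 for c in range(3)], [0, 0, 0]]
    for t in range(3)
]
print(sorted(stack_changes(frames)))  # [(1, 0), (1, 1), (1, 2)]
```

The real version paints those positions red over a transparent background and lets the video encoder do the stacking, but the bookkeeping is the same.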

It's not terribly clear, but it does highlight the path(s) worms are taking. The large blob of red that shows up at the top is sunlight penetrating the first inch or two of topsoil/debris.

Very little attention has been paid to any sort of detail here. This was a terribly fun distraction from documenting the actual setup - which I should really finish up in the next week or two : ) Regardless, I am constantly amazed at the amount of blood, sweat, and tears that are freely available in the software world.

Sunday, May 31, 2015

I took a portable version of the scanning setup to the beaches of Lake Michigan last week. It was interesting; not as successful as I would have liked, but I did learn a bit, took some notes, and have a better idea of where things can be improved.

The home setup continued to work fine while I was gone. The biggest issue I have to resolve is storage; switching to a portable USB drive should solve that, but I'll miss the joy of uploading everything to a remote server, crunching all the images into a video, and uploading to YouTube / wherever at high speed. It doesn't make sense at this point to spend $50 / month when a 1TB drive can be purchased for $70.


Aside from the technical fun, it looks like we've had a creature of some sort dig a hole in front of the scanner! No good images of the animal itself; the limited depth of field only gives us blurry images of anything more than 1 or 2 mm away from the scanner.

- - This version assumes everything happens on the Pi - lots of pros & cons with this : )

- Still need to compile a section on device setup (hub/scanner/wifi) along with pros & cons of different device choices.

I would love to spend more time with someone discussing and either building or linking to information that helps myself and others understand what can be seen in the images.

- Root Growth

- Insect life / processes that we're able to see

- Soil type

- Seasonal differences

Last, another objective has been to get a battery powered version running. I spent a couple hours today setting up a battery (on loan from the wonderful AADL). At roughly 500mA idle and 900mA while scanning (figuring 1 min/scan), the 27,000mAh battery should provide plenty of excess energy to document for a solid day.

Though my current understanding is that I cannot charge the battery AND power the setup at the same time. The specs for the battery specify a max of 1 Amp output, and with a WiFi dongle attached the power draw hits 1007mA for roughly 30 seconds at a time. It was recommended I avoid exceeding the max amperage, even if just a little.
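As a back-of-the-envelope check on those battery numbers, assuming one 1-minute scan per 5-minute cycle and ignoring conversion losses and the WiFi spikes (my assumptions, not measurements):

```python
# Rough battery-life estimate: one 1-minute scan at 900mA per 5-minute
# cycle, idling at 500mA the rest of the time. Ignores voltage conversion
# losses and the WiFi dongle's 1A spikes.

battery_mah = 27000
idle_ma, scan_ma = 500, 900
scan_min, cycle_min = 1, 5

avg_ma = (scan_ma * scan_min + idle_ma * (cycle_min - scan_min)) / cycle_min
hours = battery_mah / avg_ma
print(round(avg_ma), round(hours, 1))  # 580 46.6
```

Call it nearly two days of runtime under ideal conditions, which is comfortably more than the "solid day" target.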

Monday, April 27, 2015

Slowly documenting things! I'm aiming for minimal documentation that communicates as much as possible. This is proving to be difficult : ) This also meant burying another scanner : )

I currently plan on breaking documentation into three sections:
1. Setting up the Raspberry Pi to automate scans
- Process documented, needs testing and significant clean up before starting rough draft

The last two sections are pretty straightforward. The first section needs a couple dedicated afternoons cleaning up the existing documentation, testing from scratch, and finding a balance of what+how much to communicate. I would love to see if the entire project could be done by a group of young students, and that brings up a number of questions.

What and how much to communicate really depends on the goal of whoever's perusing it. It's somewhat amusing that the easiest way to document is to document everything in excruciating detail, which is probably a buzz kill for most... Trying to take into account things like:
- Would it help to include automated publishing of images + videos?
- - Do we cover rsync, backing up, ssh, etc.? Or discuss GoogleCL?
- Is local access to images and videos preferred, and then they can decide what to do with it?
- - Do we discuss methods on Windows & Apple to process images and video?
- What's a good price point to assume? Minimal: ~$100. Ideal: ~$150.
- - This changes how much data can be recorded and methods of storage.
- Do we keep it to the command line environment?
- - Aside from viewing videos it's all been SSH. This can be a huge barrier for some.
- Do we discuss open source software, hardware?
- - This whole project would likely not have happened without it.
Mmmmm, food for thought! Sounds like I need to spend some time communicating with people who might want to use it, to learn what their goals and desires might be.

Saturday, January 3, 2015

For the past 24 hours the scanner has not been functioning. Initially the top half of the image displayed the same repeating line of pixels, and a power cycle resulted in the device no longer appearing (lsusb) on the Pi.

Both problems have separately occurred before. The former happened when the scanner was first buried: dirt compacted too forcefully against the scanner meant the sensor was unable to complete a full pass, resulting in a similar image. The latter happened with the device reappearing after a day or two and several power cycles and un/replugs, cause unknown.

Beyond the above, there are two variations on this project that I would like to pursue:

1) Building a cylindrical scanner. Scrap the frame/body of an existing scanner, mount it to a glass cylinder, and have it scan 360 degrees. I think it would be easier (physical constraints) to get it to scan from the outside in, but I think it might be more interesting to get it to scan from the inside out (buried in the ground). Similar to existing Mini-Rhizotrons, but with a 360 degree view.

2) Building a webcam version. Similar to existing Mini-Rhizotrons, but significantly cheaper. This is how this project originally started. The image quality and viewing area would be reduced, but we'd get a significantly higher frame rate and nearly real time (ms delay) viewing. Taking an image once a day seems fine for tracking root growth, but tracking the path of certain bugs below ground seems to call for something closer to 1 frame / minute.

It would be wonderful to include additional data for both build options, e.g.:
- Moisture
- Humidity
- Temperature