Galleries

There has never been a better time to come up with creative ideas in ag that involve hardware and software, thanks to the mobile phone, the maker movement, open source and the enormous amount of free education online. Just about anyone with passion, grit and time (the hardest of all) can build just about anything.

I am not thanking the mobile phone because it gives us communication and access to apps – that’s all pretty boring and taken for granted now. I thank the mobile phone because it makes everything so damn cheap. Think mass production of GPS, cameras, wifi, gyros and more – the economics work in our favor.

The maker movement basically takes advantage of this situation, packaging up cheap parts especially for curious hobbyists. This makes sourcing parts easy for hands-on prototyping, even in Australia. It also gives us access to cool technology like 3D printing.

Open source is generally something able to be copyrighted – such as computer code, a 3D model or information – that is licensed in a way that anyone can use it for free as long as they adhere to the licence terms. Large, successful open source projects often have thousands of people in their ‘community’ and sometimes hundreds of contributing developers. What this means is that in many situations a lot of the complex core work is already done. The APM autopilot code base is a fantastic example. Build a drone or rover OR EVEN A SUBMARINE and there are tested, working autopilots ready to control them for free, with great communities to help you out.

Now all this is really neat because people with qualifications in, say, agronomy, or experience in farm management – people with real problems to solve – can upskill and develop an overlapping understanding of agriculture and the technology they are interested in.

Once you have an understanding of what is out there (or know where to look), it doesn’t take long before stacking together some cheap hardware with open source software can turn ideas into reality.

Here are some ideas that may be more simple to build:

Vehicle tracking devices

Moisture probes

Grain flow meters

Weather stations

More ambitious ideas floating around in my head:

RTK (repeatable, centimeter accurate GPS) is getting very cheap, and we have had high quality long range radios for a while (for sending data to and from base to rover). We could build an open source autosteer controller for tractors as an alternative to options from Trimble and John Deere. We could use all open formats and generic hardware. See what Matt Reimer built using the APM autopilot controller. The end result is about as incredible as the support he got on the DIY Drones discussion forum. I expect RTK GPS prices to fall even further in the next couple of years.

Using our RTK GPS and rover version of the APM we could build small autonomous vehicles to do spot spraying. I took my first step a few years ago by building a small prototype. I have not got round to building a larger one (yet! – there is still time).

We could mount WeedSeeker or WeedIT cameras on our open source autonomous vehicles, or we could build our own! Using computer vision and AI to ‘learn’ what objects look like is all the rage at the moment, with some good open source software packages such as Caffe. Rather than being limited to whether an object contains chlorophyll, which is all a vegetation index tells us, how great would it be to determine whether it is grass or broadleaf? On board computers are fast and cheap, and cameras are probably not as expensive as you think. This is getting very complex, but not impossible.

Let’s take it one step further. Why not have a drone tethered to the rover? It could sit up high, constantly scanning for weeds, and use some sort of travelling salesman algorithm to find the most efficient way to spot spray all the weeds with the ground vehicle. Now this is getting somewhere, albeit very ambitious.
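To make the routing idea concrete, here is a minimal sketch of a nearest-neighbour heuristic – a rough, non-optimal stand-in for a proper travelling salesman solver. The function name and coordinates are my own, purely for illustration:

```python
import math

def nearest_neighbour_route(points, start=0):
    """Greedy nearest-neighbour heuristic for visiting all weed locations.

    points: list of (x, y) weed coordinates in metres.
    Returns the visiting order as a list of indices. Not optimal,
    but a simple starting point for ground-vehicle route planning.
    """
    unvisited = set(range(len(points)))
    route = [start]
    unvisited.remove(start)
    while unvisited:
        cx, cy = points[route[-1]]
        # pick the closest remaining weed to the current position
        nxt = min(unvisited,
                  key=lambda i: math.hypot(points[i][0] - cx,
                                           points[i][1] - cy))
        route.append(nxt)
        unvisited.remove(nxt)
    return route
```

A real system would refine this with a proper TSP solver, but greedy routing already beats spraying weeds in the order they were detected.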

Solution of a travelling salesman problem: the black line shows the shortest possible loop that connects every red dot. Source: Wikipedia 2017

All sound exciting? Where do you start?

Remember – small steps and time.

I learned to write computer code at Udacity for free by doing Intro to Computer Science and Web Development. Both courses are a few years old now but still fantastic. edX now exists too, which looks exciting for high quality free online learning. I’d love to do a course in software/machine integration, computer vision or AI next.

Maybe hop straight to it and start working with Raspberry Pi or Arduino projects. My first project was a fixed wing UAV.

Putting this post together I discovered Farm Hack – a website that basically shares this idea that it’s a great time to innovate in ag.

Do some reading, Google searches and have a think. You may just be able to make that thing you have been dreaming of yourself!

Today I thought I would pull together some of the information that has been produced as part of my GRDC sponsored Nuffield Scholarship on the use of unmanned aerial vehicles in the grains industry. It has been an amazing experience and I encourage farmers from around the world to check out the Nuffield organisation to see if a scholarship would suit them.

Satamap is a web based satellite imagery service for precision agriculture. It’s available at satamap.com.au. This is a project I am part of so the following is not an independent review, just a quickly written explanation of this innovative app. I understand my audience is fairly schooled in most things precision agriculture so I’ll skip the marketing talk and get straight to the point.

Today we are launching Satamap. This is a brand new service making up to date satellite imagery available to everyone. Our focus is on agriculture, so all imagery is paired with a vegetation index called the Satamap Vegetation Index (SVI). It is similar to NDVI, but we believe it is better at showing variability in high biomass crops and less impacted by soil colour. The colour ramp we use to represent SVI values, while in your face at first, is designed to show biomass variability in all crops, at all stages of crop growth, at all times of year. The colours remain consistent year round so that, for example, blue always represents the same biomass level and red the same, no matter the location or time of year. This is important because the Satamap slider allows any two image dates to be laid one over the other, with the ability to slide between the two for a direct comparison. The same can be done with the standard colour imagery as well.

Satamap screenshot

This service does not require drawing in of paddock boundaries or limit you to a small area of interest. Subscriptions are based on a 3 million plus hectare tile. It takes 5 minutes to subscribe and you have access to the whole area and an archive back to winter 2013. Imagery is captured at a 16 day interval. Cloud can get in the way at times which can be frustrating but we are working on increasing our imagery availability to reduce cloud impacts. The colour imagery has a resolution of 15 m and the SVI is 30 m. We cover all major cropping regions of Australia.

Satamap works best on an iPad or similar tablet device, but functions equally well on a desktop computer. Other standard features in Satamap include custom markers, area measurement tools, imagery export and GPS location on the map. All these features could warrant an article of their own, but it is best to just watch the video to see some of them in action.

Satellite imagery has been available to agriculture and related industries for decades, and those that have invested the time and money will attest to the value and significance of this technology – but will also admit that the time and money required are often the biggest hindrance. We are aiming to solve these problems with Satamap and bring out the potential of satellite imagery for agriculture. Agronomists, grain traders, farmers, suppliers and more can all benefit from rapid, cost effective access to up to date satellite imagery.

We are in constant development. We are working on offering higher resolution imagery, ground truthing data points, exporting with post-processing and more. Satamap is currently only available in Australia, but very soon we will be opening up to other parts of the world. Thanks for checking in.

After just finishing a post on some of the applications for UAS in agriculture, I thought I would share what I believe to be some of the biggest challenges the industry faces. As I continue my Nuffield Scholarship studies (thanks to GRDC) I am learning that unmanned aerial systems and the data they produce are increasingly complex. In fact, many people will study a PhD on just one aspect of these systems alone (e.g. flight characteristics or remote sensing). Some of the more obvious challenges include:

1. Repeatability

If you were to go out and map a paddock at 11am and then again at 2pm, the resulting pixel values would be different. Even using the Normalised Difference Vegetation Index (NDVI), which by definition gives a normalised value, the data will be different. This is probably because the atmosphere and clouds do not block, transmit and reflect all wavelengths equally. We see this effect in satellite imagery where the image is affected by cloud shadow. The table below shows the red and NIR pixel values from two points in the same barley paddock, which has minimal variability.

Cloud shadow vs no shadow NDVI, Landsat 8, barley

Despite crop growth activity and biomass being very similar, these areas produce significantly different NDVI values due to the cloud shadow.

The same applies to UAS imagery. The atmosphere, clouds and sun angle are constantly changing throughout the day, affecting the reflectance of each wavelength differently. Therefore, for data to be compared like for like, it needs to be calibrated.
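As a rough illustration of the effect, here is the NDVI formula in Python with made-up reflectance values for a sunlit and a cloud-shadowed point. The numbers are hypothetical, but they show how unequal attenuation of red and NIR shifts the index even over identical crop:

```python
def ndvi(red, nir):
    """Normalised Difference Vegetation Index from red and NIR reflectance."""
    return (nir - red) / (nir + red)

# Hypothetical reflectance for two points in the same barley paddock.
# Cloud shadow attenuates red and NIR by different amounts, so the
# "normalised" index still shifts between the two points.
sunlit = ndvi(red=0.05, nir=0.45)
shadow = ndvi(red=0.04, nir=0.25)
```

If attenuation were perfectly proportional across both bands the ratio would cancel out, but in practice it is not, which is why the two points disagree.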

2. Calibration

This section follows on nicely from repeatability. If data is going to be used for more than just scouting then some sort of calibration will need to take place. As discussed in a previous blog post, calibration of NDVI could potentially be achieved using an active ground device such as a GreenSeeker. There are also new payloads that are true multispectral and have an upward looking sensor to measure irradiance, which could be used to calibrate data for true reflectance. Other forms of calibration/ground truthing include biomass cutting and weighing, tissue testing – the list goes on.

3. Algorithms & Indices

When it comes to remote sensing and vegetation, NDVI is the most famous index used, and for good reason, but it is not without its concerns. If you look on Wikipedia, the authors cover some of the issues with NDVI:

Users of NDVI have tended to estimate a large number of vegetation properties from the value of this index. Typical examples include the Leaf Area Index, biomass, chlorophyll concentration in leaves, plant productivity, fractional vegetation cover, accumulated rainfall, etc. Such relations are often derived by correlating space-derived NDVI values with ground-measured values of these variables. This approach raises further issues related to the spatial scale associated with the measurements, as satellite sensors always measure radiation quantities for areas substantially larger than those sampled by field instruments. Furthermore, it is of course illogical to claim that all these relations hold at once, because that would imply that all of these environmental properties would be directly and unequivocally related between themselves.

Thankfully we are not pigeon-holed to NDVI. Agribotix claim they get better results using the Difference Vegetation Index (DVI). Another example is Soil Adjusted Vegetation Index (SAVI).
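For the curious, the three indices mentioned are all one-liners. A minimal sketch in Python/NumPy – the example reflectance values are invented, and the 0.5 soil factor is just a commonly quoted default for SAVI:

```python
import numpy as np

def ndvi(red, nir):
    """Normalised Difference Vegetation Index."""
    return (nir - red) / (nir + red)

def dvi(red, nir):
    """Difference Vegetation Index - the simple band difference."""
    return nir - red

def savi(red, nir, L=0.5):
    """Soil Adjusted Vegetation Index; L dampens soil-background influence."""
    return (1 + L) * (nir - red) / (nir + red + L)

# invented reflectance for a healthy pixel and a sparse/stressed pixel
red = np.array([0.05, 0.20])
nir = np.array([0.45, 0.30])
```

Because these are plain arithmetic on arrays, they apply pixel-by-pixel to a whole mosaic just as easily as to two sample points.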

4. Position

Without ground control points, the positional accuracy of data will be mediocre at best. Expect XY accuracy of a few meters and even more on the Z axis. GPS will record the position at which each frame is captured (+/- delay error), but the pitch, yaw and roll of the UAS, which affect how the image is framed on the ground, are determined by an inertial measurement unit (IMU). The quality of the IMU will have a bearing on the positional accuracy if the processing software takes these variables into consideration. Expect to have to lay out a minimum of 4 ground control points for high accuracy data.

UAS image processed with no GCP

5. Reliable data collection

The process of collecting data with a UAS involves several factors that affect how reliably data can be collected. The process usually involves the UAS following a set lawnmower style track, conducting swaths up and back as it moves across the area of interest. As the vehicle flies, it captures an image every few seconds depending on its speed. A forward and side overlap greater than 50% is required. Although processing software can handle images with some angle to the ground, extreme pitch and roll will affect the overall product.
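The overlap geometry above is simple to compute with a pinhole camera model. A sketch, assuming rough S100-like optics (7.6 mm sensor width and 5.2 mm focal length are illustrative numbers, not a manufacturer spec):

```python
def ground_footprint(altitude_m, sensor_dim_mm, focal_mm):
    """Ground distance covered by one image dimension (pinhole model)."""
    return altitude_m * sensor_dim_mm / focal_mm

def line_spacing(footprint_across_m, side_overlap):
    """Distance between adjacent flight lines for a given side overlap."""
    return footprint_across_m * (1 - side_overlap)

def trigger_distance(footprint_along_m, forward_overlap):
    """Distance flown between shutter triggers for a given forward overlap."""
    return footprint_along_m * (1 - forward_overlap)

# e.g. 120 m altitude with the assumed optics above:
across = ground_footprint(120, 7.6, 5.2)   # roughly 175 m across track
spacing = line_spacing(across, 0.7)        # 70% side overlap
```

Note how quickly line spacing shrinks as overlap increases – going from 50% to 70% overlap nearly doubles the number of flight lines for the same area.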

During data collection, if the UAS is hit by a gust of wind it may put 2 or 3 images off target while the autopilot makes adjustments. This can lead to a hole in the data set, which is not uncommon. Some UAS manufacturers allow you to rapidly transfer the raw data off the vehicle to a laptop in the paddock to check data coverage and quality in the field (e.g. Precision Hawk).

So remember when someone claims their machine can fly in high winds, it is probably true, but the quality of the data being collected may not be of much use.

Sample flight path and image footprints

6. Data processing

Reliably collecting the data is just the first step in the process. Once all these images have been collected they need to be stitched together. It is amazing that stitching 300+ 12MP images together, all taken at slightly different angles to the ground, is even possible – even more so that 3D surface models can be constructed from these 2D images. Given the complexity of this task it takes large amounts of computing power and time (think several hours for 100ha). For this reason there are several cloud based platforms which offer this service (e.g. DroneMapper and PrecisionMapper). Processing on your own desktop computer versus using online services both have their pros and cons. A downside to the cloud services is the internet bandwidth required to first transfer the raw data to the server and then retrieve it once it has been processed. A downside to the desktop solution is the upfront cost of hardware and software, and that the required skill set may not be available in house.

7. Storage & sharing

Once the data is processed it needs to be stored somewhere and somehow distributed. Often one scene can be more than a gigabyte. If the processed data is not cut into tiles it can require a powerful machine just to view it. This is where online solutions come into play, though the same bandwidth issues as above apply. At some point, if the data is going to be used for more than just viewing, it will most likely need to be transferred onto a local machine. This is an online map of one of the first areas I mapped with my DIY Finwing Penguin UAS. If for some reason the data is to be printed, it needs to be formatted as such, which takes time and software.

8. System integration

Integrating data generated from a UAS into existing precision agriculture software should be possible, but not likely at its highest available spatial resolution. Software such as Farmworks and SST was not designed for the intense data sets that UAS sensors can generate. Resampling data to 0.5m resolution may be required.
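Resampling to a coarser grid can be as simple as block averaging. A minimal NumPy sketch of the idea (not how any particular package implements it):

```python
import numpy as np

def block_mean_resample(raster, factor):
    """Downsample a 2-D raster by averaging factor x factor blocks,
    e.g. factor=10 turns 5 cm UAS pixels into 0.5 m pixels."""
    h, w = raster.shape
    h2, w2 = h - h % factor, w - w % factor   # trim edge pixels that don't fit
    trimmed = raster[:h2, :w2]
    return trimmed.reshape(h2 // factor, factor,
                           w2 // factor, factor).mean(axis=(1, 3))
```

Real workflows would use a GIS tool that also resamples the georeferencing, but the underlying operation is the same averaging shown here.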

9. Safety and legal

A whole other post could be written on this, but basically in Australia we have the Civil Aviation Safety Authority (CASA). They require anyone who wants to fly UAS (they call them remotely piloted aircraft or RPA) commercially to have an Operator’s Certificate and Controller’s Certificate. This, among other things, requires a theory component equivalent to a private pilot’s licence, comprehensive manuals, and a stringent approvals process to ensure your vehicle is fit for service and your piloting skills are sufficient. CASA is currently reviewing this process and should hopefully have a revision out by the end of the year. The process seems strict, but it is important to keep the UAS industry safe and professional.

We can talk about how amazing this new technology is all we want but what most people are beginning to ask now is ‘how does it make me more money?’ I thought I would put together a list of some of the more common applications for data collected from small fixed wing UAS, particularly in broad acre agriculture.

1. Scouting

Probably the most talked about, and easiest to apply, is the ability to aid in scouting paddocks. Think about the perspective you get when you fly over farm land. The process is something like: fly over the paddock with a UAS using a NIR and visual capable sensor, stitch the images together to form a georeferenced mosaic, then calculate the Normalised Difference Vegetation Index (NDVI) or Difference Vegetation Index (DVI). The resulting map shows the variability of crop health over the entire paddock and allows you to concentrate on areas of poor health. In addition, human driven variation is very obvious, such as planter problems, compaction, chemical application etc.

To make this process even easier there are a couple of iPad and Android tablet applications (e.g. PDF Maps) that allow you to import the map and use your GPS position to locate areas on the map you are looking for. You are then able to add notes and photos by putting down place marks on the maps. Below is a great demonstration of PDF Maps by Crop Tech Consulting.

2. Site specific weed control

In a similar process to scouting – making sure you use a sensor with high spatial resolution – the resulting map could be used to identify where large individual weeds are located in a paddock. This could work in a fallow or in crop situation. In our farming environment we face glyphosate resistant weeds that require high chemical coverage to kill. Again, an app like PDF Maps could be used to find single weeds in a paddock to mechanically or chemically eradicate.

Feathertop Rhodes Grass in a tidy fallow

3. Variable rate applications

Precisely placing inputs where they are most needed, rather than blanket applications, should increase yield and reduce wastage. Variable rate spreaders, sprayers and air seeders have been around for a while now but uptake has been less than many expected. A rapidly developed georeferenced NDVI or DVI map from a UAS, in combination with in paddock examination of what is causing variability, puts you in a good position to generate a suitable VR fertiliser application map. Agribotix discuss their method for producing VRA maps here.

Variable rate application map

4. Insurance/Drift/Environmental regulations

I spent time at a few UAS conferences while traveling through the US & Canada and there seemed to be a consistent presence from insurance companies. I can understand why, especially in North America where insurance is such a major part of farming. If an adjustor from an insurance company is able to rapidly access a map that correlates closely with what they see on the ground then they are able to adjust the area much more accurately than a ground assessment alone, which is better for everyone involved.

Accurately mapping areas of drift damage and areas of environmental concern has similar benefits.

5. 3D/DEM/DSM

Employing the structure from motion technology, digital surface models can be generated from still images collected with a UAS. Trimble discuss the surveying potential of this technology at length in a white paper available here. In summary, with ground control points, survey grade information can be rapidly generated in their photogrammetry software with data collected from a UAS. Agisoft Photoscan and Pix4D offer similar functions.

Bezmiechow airfield 3D Digital Surface Model data from Pteryx UAV

6. Plant stand

With corn being a pillar crop in the mid-west USA, many are talking about the ability of UAS data to determine site specific plant stands in row crops, the application being to decide whether or not to replant and to evaluate planter performance.

Future applications to get excited about

Above are applications for UAS data that you can reliably apply right now for a reasonable amount of money. Some future applications include:

Over the last 6 weeks during my Nuffield travels (sponsor GRDC) I have had the opportunity to meet with several different unmanned aerial system (UAS) manufacturers and discuss their products. I am looking at the potential for UAS in agriculture, particularly broad acre grains. Some systems I have had a closer look at include:

senseFly: eBee

The eBee is probably the least intimidating system. It is almost a true turnkey system – you just provide a laptop for the ground control station and a PC for the data processing software. It is a compact, lightweight (0.7kg) flying wing design which is hand launched. All packaged up, the case is small enough to be classed as carry-on baggage. There are a few payload options including the typical Canon S110, but more interestingly the Airinov Multispec 4C is also available (read here). The included processing software is top notch, as it is based on Pix4D.

Interesting facts about the eBee:

The motor is turned off whenever an image is being captured to reduce image blur from vibration

Multiple eBee planes can be operated from the one ground control station with an automated collision avoidance system

Precision Hawk: Lancaster

The Precision Hawk Lancaster platform uses a traditional fixed wing design. Two processors running Linux handle flight management and any other in flight processes such as real time data assessment. The platform is easily hand launched and weighs about 1.4kg not including the payload. Precision Hawk offer several sensors, ranging from the humble Canon S series camera through to many of the Tetracam options, and the platform can also carry thermal, LiDAR and hyperspectral equipment. To Precision Hawk, the Lancaster platform is just a small part of the workflow. Data processing and sharing is an even bigger part of their business. They offer Precision Mapper, a cloud based system which allows raw data from any UAS to be uploaded, processed and shared.

Interesting facts about the Lancaster:

Precision Hawk plan to have an API system for sensor integration so 3rd parties can integrate their own sensor into the platform

The Lancaster creates its own flight plan after it has been launched and determined weather conditions

Precision Mapper is excellent value at about 25 cents a hectare! (Hope this lasts)

Farm Intelligence/Fourth Wing: Vireo

The Vireo is marketed as a tool to provide high quality data for the online farm management platform WingScan, but in its own right it is still a UAS worth looking at. It is a sort of hybrid between a flying wing and a traditional plane, with no control surfaces on the tail. It is hand launched and weighs 1.4kg total. The whole system including laptop packs into a provided travel case. They claim it can fly for an hour or more. The Vireo does not use a modified point and shoot; instead it carries a dual-imager sensor payload which captures NIR and visual (RGB) in a single pass at 10MP.

Interesting facts about the Vireo:

You can go onto the Fourth Wing online store to price their products. They sell the dual-imager sensor separately.

The Vireo does not use any foam in its construction, only carbon fiber and Kevlar

Swift Radio Planes: Lynx

The Lynx is the largest of all the UAS mentioned here and also the system you would part with the least cash for. It weighs in at about 4.5kg, but is still hand launched. The Lynx will fly comfortably for 90 minutes. The plane is controlled by an APM 2.6 but has the ability to completely isolate itself from the autopilot as well for full manual control. Despite the in flight size of this system it packs down into a single case. Swift Radio Planes have developed a roll stabilised camera mount with sensor options from Sony, Canon and Tetracam.

Interesting facts about the Lynx:

It has the unique ability to deep stall, meaning it can land in very tight spaces

The plane is entirely encased in Kevlar

Swift Radio Planes offer a server based platform for data processing and sharing

AgEagle

AgEagle’s UAS is a flying wing that is launched from a slingshot style launcher to ensure a consistent take-off every time. They pride themselves on a system that is tough in design, built especially for agriculture. It comes standard with a modified Canon camera, and will soon be available with a true multispectral camera. AgEagle supply Agisoft Photoscan Standard and AgPixel in the standard package to process imagery.

Interesting facts about the AgEagle flying wing:

AgEagle describe a simple process of exporting a non geo-referenced JPEG from Photoscan, through AgPixel, into SMS for variable rate application maps

AgEagle are establishing a dealer network throughout the US and even sell their system in Australia

Trimble: UX5

Most people would recognise the brand Trimble as they are well established in the surveying and precision agriculture market place. Trimble offer the UX5 UAS, a flying wing design weighing in at 2.5kg, made from EPP foam and carbon fiber, and catapult launched. At the time I looked at the system they offered both standard and modified versions of a Sony mirrorless 16MP point and shoot camera. The system comes with a rugged handheld computer for the ground control station. Trimble provide their own software for data processing, a Photogrammetry Module for their Trimble Business Center office suite. This integrates with existing surveying processes but the link to Trimble’s ag products does not seem as complete (yet!).

Interesting facts about the UX5:

Trimble have published a white paper discussing survey accuracy of their photogrammetry software available here

The UX5 uses reverse thrust when landing to allow more predictable and accurate landings

Event38 / 3DR

Event38 and 3DR are separate companies but use similar components. They offer a much cheaper solution that is capable of performing many of the functions of the above UAS. The reality is that they do take some more learning to become familiar with their operation, but as far as value for money is concerned these are good products.

Agribotix: Hornet

Agribotix are worth a mention. They build a UAS based on similar technology to Event38 / 3DR, so it can be built in house quickly and cheaply. They believe that too much attention is given to the flying machines and not enough to application of the data. Agribotix offer a drone lease structure where the UAS is essentially free to use and the cloud based data processing is what incurs a fee – therefore minimal capital outlay to get UAS up and running. A truly unique model.

Interesting facts about Agribotix:

Agribotix are extremely generous with information – their online blog has information on a lot of what they have learned in getting to where they are now

These are just some of the small UAS systems available on the market now. I have not included prices as they are always changing and each product is generally packaged up differently (e.g. processing software included or not). The alternative is to build your own UAS, read about my experiences here.

Traveling through the USA & Canada as part of my Nuffield Scholarship (thanks to GRDC) I have heard the word data more times than I could count. In this post I am going to set aside the platform side of the unmanned system and focus on sensors and the data they provide; a later post will cover how we can use that data. I also plan to cover some concepts not widely discussed in the current UAS environment.

The reason an unmanned system is flown is to collect data, then turn that data into information to help monitor, assess and ultimately make timely, cost effective decisions. The data collected needs to be of good quality, and it is important not to confuse data quality with data type. For example, many tend to gravitate straight to the amount of megapixels a sensor captures, neglecting its spectral accuracy.

If we consider what our target is when collecting data from a UAS in a grain farming situation, it will most commonly be vegetation (not always, but let’s focus on that). Collecting spatially referenced data of vegetation is by no means a new endeavour. This information has been collected as broadly as Landsat satellite imagery and as specifically as a GreenSeeker. Generally, for vegetation, similar bandwidth reflectance is measured irrespective of proximity to the target. The same is true for sensors used in UAS. Why is this the case? Well, you can read the long answer here (Remote Sensing of Biomass). The short answer is photosynthesis. In a plant that is photosynthesizing, the chlorophyll will absorb large amounts of ‘visual’ light, particularly blue and red, and reflect near infrared (NIR) light. The more photosynthetic activity, the more NIR light is reflected and the more visual light absorbed. Conversely, inactive vegetation will reflect more visual and less NIR light.

The best contrast is between red and NIR light which is what is generally used when calculating the Normalised Difference Vegetation Index (NDVI). NDVI is a good indicator of plant health and measure of biomass. Consequently, most sensors used to determine NDVI look into the red and NIR bands – some more accurately than others. The chart below shows the reflectance curve of green grass over different wavelengths. Below the X axis is a rough spectral guide to some of the well-known sensors available to us.

Reflectance of green grass & sensor spectral bands

What is most notable is the wavelength spectrum, or band, at which each of the sensors reads. If we consider the GreenSeeker, it is extremely specific at capturing a certain wavelength in the middle of the red spectrum and similarly specific in the NIR. At the other end of this comparison you can see that the modified S100 camera has a very broad spectrum for each channel that it reads. Consider the S100’s ‘red’ channel (as it was before modification), which with the modified filter reads roughly from 0.67um to 0.76um. Post modification, this channel is renamed NIR and measures reflectance in a range that covers red right through to NIR. The S100 modification retains the blue and green channels, one of which replaces red when calculating NDVI. Another significant point that this chart does not show is the interference that can occur between the different bands in a point and shoot camera. Check out the S100 reflectance chart about half way down the page in this link, which shows some NIR response in the blue and green channels.

It has to be noted that it is hardly fair to compare the S100 and GreenSeeker in a practical sense for several reasons with the main one being that you would not mount a GreenSeeker on a UAV as it needs to be close to the target (the GreenSeeker is an active sensor meaning that it emits light and measures that reflectance, the S100 is a passive sensor just reading reflectance from the sun). In addition, the GreenSeeker measures only one point whereas the S100 collects data on 12 million pixels. The reason I do compare them is because they can both be used to produce an NDVI map. In fact despite the spectral differences from each of these sensors and the proximity to the target, RoboFlight claim from their tests that NDVI data collected from a GreenSeeker and modified S100 correlate in a linear fashion very closely (r squared > 0.9). So we know that the two sensors correlate well but the correlation will never be a fixed formula because sunlight reflected will always be different based on sun angle, atmosphere conditions, cloud etc. The S100 and GreenSeeker would probably work best as tools which complement each other. For example, map a large area with the S100 on a UAS. The resulting dataset could be calibrated using GreenSeeker data collected in the field at the same time as the flight. Potentially if S100 data is always calibrated against the GreenSeeker, inter-paddock and inter-season comparisons can be made.
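The calibration workflow just described can be sketched as a least-squares line fit. The paired values below are invented, but the steps – collect GreenSeeker readings during the flight, fit a line, then apply it to the camera NDVI raster – are as described above:

```python
import numpy as np

def fit_calibration(camera_ndvi, greenseeker_ndvi):
    """Least-squares line mapping camera-derived NDVI onto GreenSeeker NDVI.

    Refit for every flight - sun angle, atmosphere and cloud shift the
    relationship, so no fixed formula will hold between days."""
    slope, intercept = np.polyfit(camera_ndvi, greenseeker_ndvi, 1)
    return slope, intercept

# hypothetical paired samples collected during one flight
cam = np.array([0.30, 0.45, 0.60, 0.75])   # modified S100 NDVI at sample sites
gs = np.array([0.35, 0.50, 0.66, 0.80])    # GreenSeeker NDVI at the same sites
m, b = fit_calibration(cam, gs)
calibrated = m * cam + b   # in practice, apply to the whole S100 NDVI raster
```

With the reported r squared above 0.9, a simple linear fit like this should capture most of the relationship for a single flight.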

We are starting to see a new wave of sensor development designed specifically for UAS and the agricultural and environmental industries. An available example is the Airinov Multispec 4C. This sensor captures 4 distinct, narrow spectral bands with no interference: green, red, red edge and NIR. What makes this package special is that not only does the sensor look down at the vegetation; it is also looking up, measuring sunlight with a lux meter. This should allow us to generate data that can be calibrated without the need for ground truthing with a GreenSeeker or similar ‘active’ device. Another feature of this sensor is its global shutter, which means all pixels in a photo are captured at exactly the same time, eliminating any motion blur. The 4C has much less spatial resolution than the S100 (1.2MP vs 12MP). Expect to pay over US$10,000 for this sensor package, not including the UAV or processing software.

In summary, this article aims to explain how there is more to a UAS sensor than just megapixels. It is important to understand the spectral response of vegetation and how this can impact your sensor choice. A modified Canon camera such as the S100 is a great option for UAS but its limitations must be understood. Work needs to be done to analyse the results and accuracy of the new sensors such as the Multispec 4C.

S100 mounted in a DIY Finwing Penguin build

* Further notes: The most common sensor used in small UAS (mid 2014) is a Canon S100 or similar variant. This camera was never designed to be flown in an UAS but the internal GPS, fast shutter speed (1/2000), relatively large sensor (12.1 MP 1/1.7″ Canon CMOS), low weight (198g), ability to be modified to detect NIR, CHDK compatibility for intervalometer, and low cost (<$500) all contribute to a well suited sensor for this application. Flown at 120m this camera can provide a ground resolution of 3.5cm.
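The ground resolution figure above comes straight from the pinhole camera model. A sketch with approximate S100-like numbers – the sensor width, focal length and pixel count here are rough assumptions, which is why it lands near 4 cm rather than exactly the 3.5 cm quoted (the exact figure depends on the true optics):

```python
def ground_sample_distance(altitude_m, sensor_width_mm, focal_mm, image_width_px):
    """Metres of ground covered by a single pixel (pinhole camera model)."""
    return altitude_m * sensor_width_mm / (focal_mm * image_width_px)

# Assumed S100-like numbers: 7.6 mm sensor width, 5.2 mm focal length,
# 4000 px image width. Flown at 120 m this gives roughly 4 cm per pixel.
gsd = ground_sample_distance(120, 7.6, 5.2, 4000)
```

Halving the altitude halves the ground sample distance, which is the main lever for trading coverage area against detail.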