LINUX BOARD SUPPORT PACKAGES

We “bring up” Linux on new embedded systems by adopting, adapting and integrating custom BSPs, I/O device drivers and operating system kernels for ARM, Intel x86 and MIPS CPU architectures.

EMBEDDED SOFTWARE DEVELOPMENT

We work closely with semiconductor and original equipment manufacturers to enable their new MCU/CPU-based embedded systems and processors, covering Linux, RTOS and bare-metal programming.

NETWORK PACKET PROCESSING ACCELERATION

We work with IT/telecom OEMs and service providers to achieve higher network traffic throughput rates in their Linux-based enterprise servers by integrating DPDK and virtualization software technologies, e.g. Docker containers and Virtual Network Functions (VNFs).

INTERNET OF THINGS SOLUTIONS

We enable IoT in embedded-system applications by integrating the technology required to connect them to device management and data storage/analysis services such as Amazon Web Services.

Farming Drone

"Farming Drone", winner of Emutex's Software Innovator Of The Year Scholarship 2015.

This year the competition focused on creating solutions to real-world problems. We were expecting some very innovative ideas from colleges throughout Ireland, but were completely surprised when a secondary school student submitted this fantastic idea. Christopher Kelly, a Leaving Certificate student at Crescent College Comprehensive SJ in Limerick, had an idea that could provide cost savings to farmers while also helping the environment.

Motivation

After reading an article online about how near-infrared technology is used in farming in Japan, Christopher began to think of how such technology could be adapted to Ireland. Living in a rural area, he saw the challenges that farmers faced on a daily basis. His interest in finding solutions to these challenges, along with his new-found knowledge of near-infrared technology, resulted in the idea that he submitted to the Emutex Software Innovator Of The Year Scholarship Competition.

The project, titled "Farming Drone", focused on providing useful feedback to farmers when fertilising their crops. As farmers may not possess information on how well their crops are growing, it is normal practice to fertilise all crops to ensure that growing conditions are optimal. Not only is there a significant cost associated with this, but the fertilisers can also negatively impact the environment and local wildlife. Christopher recognised a potential solution to this issue that could provide assistance to farmers throughout the country.

Theory

Photosynthesis is the process by which plants convert light into chemical energy. In effect, plants use visible light as food. However, they do not use all wavelengths equally. For example, plants reflect away much of the green light, which is why they appear green to the human eye. Plants also reflect near-infrared light, which is not visible to the human eye; this light is reflected because it cannot be processed chemically by the plant. Although near-infrared light cannot be captured by the human eye, it can be captured using a near-infrared camera, and the amount of it reflected by a plant provides very useful information on how well the plant is photosynthesising.

As an example, below is an image first captured using a near-infrared camera and then analysed.

The image can be interpreted by a farmer using the normalised difference vegetation index (NDVI). This is a scale ranging from -1 to 1 by which the colours shown in the analysed image can be read. The NDVI scale is as follows:

This scale runs from -1, indicating no growth, to 1, indicating excellent growth. Healthy plants typically have an index ranging from 0.1 to 0.9 on the scale. Farmers would aim to keep the index as high as possible. This theory is the main principle on which this project is built.
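The NDVI calculation behind this scale is a simple ratio of near-infrared to red reflectance. The sketch below is an illustrative Python implementation, not code from Christopher's project; it assumes per-pixel reflectance values for the near-infrared and red bands as inputs:

```python
def ndvi(nir, red):
    """Normalised difference vegetation index for one pixel.

    nir and red are reflectance values for the near-infrared and
    red bands. The result lies in [-1, 1]; healthy vegetation
    typically falls between 0.1 and 0.9.
    """
    if nir + red == 0:
        return 0.0  # avoid division by zero on completely dark pixels
    return (nir - red) / (nir + red)
```

For example, a vigorously photosynthesising leaf that reflects strongly in the near-infrared (say `nir=0.5`) and absorbs most red light (`red=0.08`) scores around 0.72, well inside the healthy range.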

The Full System

The "Farming Drone" can be broken down into three components:

The Blue Box

The Host

A Quadcopter

The first two components provide the real technical aspects of the system. The first component, named simply the Blue Box, has the following functions:

Capture near-infrared photographs.

Capture and log the GPS sensor information.

Capture and log motion and environmental sensor information.

Transfer images and sensor information to the host.

The functions of the second component, the host, are as follows:

Capture images from the Blue Box.

Capture sensor information from the Blue Box.

Analyse the captured images to provide feedback to the user.

Create a live graph of captured motion and environmental sensor information.

The initial idea included a quadcopter, which would elevate the Blue Box so that the system could take aerial-view photographs of the area to be analysed. However, it was decided that this was not required for the initial prototype of the system, as the quadcopter would merely serve as a transportation device.

With the full system constructed, the user, in this case the farmer, would control a quadcopter using the host. Real-time positional information would be captured by the "Blue Box" and provided in graphical form to the farmer. This would aid the farmer in controlling the quadcopter. The captured images would also be provided to the farmer to either be analysed on location or saved for later analysis. Once the images are analysed, the farmer could then make an informed decision on what crops or sections of crops actually require fertilisation.

System Breakdown

As mentioned previously, the Blue Box is the main technical component of the project. It captures and analyses sensor information in real time whilst also making this information available to the host. The system was built entirely from off-the-shelf components. Christopher had much of the hardware himself, as he is clearly an experienced maker; anything he didn't have was provided by Emutex as part of the competition.

One of the requirements of the competition was that the project would incorporate a Linux-based embedded software and hardware platform. In this case, Christopher chose the Raspberry Pi Model B+ for his system. To interface with the different sensors, he also decided to use an Arduino Uno R3, as the sensors were compatible with this device. The R-Pi and Arduino Uno did not communicate directly with each other; instead, each had its own communication channel to the host.

It was required that the system could be accessed wirelessly, so Christopher attached a wireless module to the R-Pi. This allowed the Pi to connect to a Wi-Fi hotspot, where it could then be accessed by any device connected to the same hotspot. Once this was in place, Christopher enabled the SSH server on the R-Pi and set up a VNC server. These allowed the user to connect to and interact with the device wirelessly.

The Arduino used an APC220 wireless transceiver to communicate sensor information to the host. The GPS module used a similar system, so the baud rates of the two links were set to different values to ensure there was no interference between them.

As the image capture device, Christopher used an Adafruit Infragram module, as it is designed specifically for plant analysis and is compatible with the Motion software package. He used Motion to capture images from the camera and to host those images on the R-Pi, allowing them to be accessed by the host. The camera was connected to the Pi, with images captured at a rate of 1 to 2 frames per second. Each frame was served at an IP address accessible to the host while also being stored locally. The images could then be uploaded to an online analyser on publiclab.org, whose output provides valuable feedback to the farmer.
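Motion's behaviour is driven by a plain-text configuration file. The fragment below is a hypothetical sketch of the kind of settings involved, not taken from the project; option names vary between Motion releases (this uses Motion 4.x names), and the port and directory shown are assumptions:

```
# motion.conf (fragment, hypothetical values)
framerate 2               # capture 1-2 frames per second
width 640
height 480
output_pictures on        # store each captured frame locally
target_dir /home/pi/frames
stream_port 8081          # serve frames over HTTP to the host
stream_localhost off      # allow access from other devices on the hotspot
```

With a configuration along these lines, the host can fetch the latest frame from the Pi's IP address on the stream port while the stored copies remain available for later analysis.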

The position-tracking sensors all interface with the Arduino Uno over I2C. For GPS positioning, an Adafruit GPS and SD logger shield was used; this provided GPS coordinates and logged them to an SD card for later use if required. The remaining sensor information was obtained from a 10-degrees-of-freedom (10 DOF) IMU module from dfrobot.com. For position information, this module provides sensors that measure acceleration on the X, Y and Z axes as well as the rotation of the system in degrees.
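GPS modules such as Adafruit's typically emit standard NMEA 0183 sentences over serial. As an illustration (not code from the project), a $GPGGA fix sentence can be converted into decimal-degree coordinates like this:

```python
def parse_gpgga(sentence):
    """Extract (latitude, longitude) in decimal degrees from a $GPGGA
    NMEA sentence, or return None if the sentence carries no fix.

    NMEA encodes latitude as ddmm.mmmm and longitude as dddmm.mmmm,
    so the minutes field must be divided by 60.
    """
    fields = sentence.split(',')
    if not fields[0].endswith('GGA') or fields[2] == '':
        return None
    lat = int(fields[2][:2]) + float(fields[2][2:]) / 60.0
    if fields[3] == 'S':
        lat = -lat
    lon = int(fields[4][:3]) + float(fields[4][3:]) / 60.0
    if fields[5] == 'W':
        lon = -lon
    return lat, lon
```

For example, the standard sample sentence `$GPGGA,123519,4807.038,N,01131.000,E,...` decodes to roughly 48.117° N, 11.517° E.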

The 10 DOF module also included a barometric sensor providing pressure and temperature information. This information was recorded and provided to the user as extra feedback.

Once the sensor information had been captured by the host, Christopher wrote Python scripts to interpret the data and present it in a graphical format to the user. He also created a simulated Blue Box which replicated the movement of the system using the positional and rotation sensor information. This would be used to assist the user in flying the drone by providing visual feedback of its orientation. This can be seen below:
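The orientation feedback used by the simulated Blue Box can be derived from the accelerometer readings. The following is a minimal sketch of one common approach (assuming the sensor is near-static so that gravity dominates the reading); it is illustrative only, not taken from Christopher's scripts:

```python
import math

def tilt_angles(ax, ay, az):
    """Pitch and roll in degrees from a 3-axis accelerometer reading.

    (ax, ay, az) are accelerations in any consistent unit (e.g. g).
    Valid only while the sensor is roughly stationary, so the measured
    vector is dominated by gravity.
    """
    pitch = math.degrees(math.atan2(-ax, math.sqrt(ay * ay + az * az)))
    roll = math.degrees(math.atan2(ay, az))
    return pitch, roll
```

A level sensor reading pure gravity on the Z axis, `tilt_angles(0.0, 0.0, 1.0)`, gives zero pitch and roll, while tilting the X axis into the gravity vector drives the pitch towards ±90 degrees.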

Blue Box


Analysed near-infrared image


Christopher and Mark preparing for take-off


System in Action

Christopher demonstrated his project in a controlled lab environment to our group of judges, and he really impressed them. All of the components worked together to provide a complete solution in which real-time image analysis was shown. Once he was announced as the winner of the competition, we decided it would be great to see the project in a real-world environment. As mentioned previously, his setup would ideally be attached to a quadcopter; however, we went with the next best thing and sent Christopher and his Blue Box up in a plane to capture some aerial-view photographs. Mark Burkley, Emutex CTO, took Christopher for a flight to really test his system out.

Luckily, it was a clear day and Christopher was able to get some fantastic images proving, even further, what a great concept this project is.

As seen in the image above, the NDVI is quite high in many of the fields. Some areas have average to low index values, and this is where the farmer could potentially focus fertilisation. Below is another image taken from the test flight.

Conclusion

For a project developed within just a few months by a second-level (Leaving Certificate) student, this really was a fantastic achievement for Christopher. Designing, constructing and testing a system like this takes a huge amount of innovation, patience and determination, and it is clear that Christopher possesses all of these attributes. We are sure that the farmers he was inspired to help will be looking to test this project for themselves.

We would like to thank everyone who was involved in this project. Once again, the Emutex team were more than happy to give up their time to contribute to such an exciting project. Finally, we would especially like to thank and congratulate Christopher on such a fantastic achievement. This Leaving Certificate student has a bright career ahead.