Traditional communication solutions such as cellular telephony require extensive infrastructure and resources, which makes rapid deployment, on the time-scale of a few minutes or hours, impossible. They are therefore difficult to restore when existing communication infrastructure is destroyed or damaged (e.g. during disasters), and their deployment costs are too large to be amortized where resources are scarce (e.g. rural India). LifeNet is designed as a communication solution primarily for such scenarios, where communication infrastructure does not exist or has been destroyed. LifeNet allows rapid service delivery on a WiFi-based ad hoc networking platform formed by off-the-shelf, compact, end-user hardware such as laptops, smartphones and routers.

As part of the ongoing collaboration with the Tata Institute of Social Sciences (TISS), India, a field trip was conducted in the flood-affected areas of Odisha, primarily with the aim of familiarizing the developers of LifeNet with on-field realities. The field survey was conducted on the 21st, 22nd and 23rd of October 2011; I participated from the LifeNet development team, and Dr. Shibu Mani and Santosh Kumar participated from TISS.

Situation on field

Odisha is considered one of the most under-developed states in India. The eastern parts of Odisha are highly prone to floods and cyclones, whereas drought is a common phenomenon in its western regions. More than 90 percent of the population in rural parts of eastern Odisha engages solely in agriculture. Due to its position at the mouths of the rivers Brahmani, Baitarani and Mahanadi, the area under survey is blessed with highly fertile land; paddy is the main crop, followed by jute and vegetables. In coastal regions, fishing is also an important occupation. Most rural folk communicate in their native language, Odia, though one can also get along in Hindi.

Due to its unique multitude of problems, Odisha attracts many NGOs. Apart from big players like UNICEF and UNDP, many small NGOs are able to thrive in Odisha, particularly in the coastal belt, where floods and cyclones are commonplace.

Starting from the first week of September, the eastern coast of Odisha fell prey to severe flooding on three of its major rivers: the Mahanadi, Brahmani and Baitarani. Flooding continued until the first week of October. Heavy rainfall in the neighboring state of Chhattisgarh was identified as the root cause of the flooding.

The flood happened in three stages. The first stage consisted of flash floods in the Brahmani and Baitarani river basins. The Mahanadi river basin was flooded in the second stage, whereas the third stage again brought severe floods in the Brahmani and Baitarani basins. The flood-affected areas of Odisha are demarcated as shown in the Figure. Most of these areas were surveyed during the field trip. The images shown below give some idea of the magnitude of destruction to physical infrastructure.

Survey: Operational Description

Arrangements for our survey were made by a TISS alumnus, Mr. Gobinda Ballavi Dalai, who is now a member of the Youth Development Foundation (YDF), an NGO which supports emerging social movements on social equity, livelihood and youth issues leading to mass conscientisation. Two other NGOs also participated in the field trip, each with its own agenda. Mr. Suresh Kumar led the team from Goonj, an NGO known for its 'Cloth for Work' program. Mr. Niranjan Sahoo from Joy Bharati Saathi Samaj (JBSS) coordinated the on-field operations. JBSS is a local NGO which works in the flood-affected Kendrapada district of Odisha.

It was decided that on day 1 we would visit the flood-affected villages of the Brahmani-Baitarani basin; on day 2, the flood-affected villages of the Mahanadi basin; and on day 3, Radio Namaskar, a community radio station in Konark, near Bhubaneshwar.

Day 1 - The first on-field orientation meeting was held at the JBSS office in a small village called Kanipada in the Kendrapada district of Odisha. Kendrapada district, which spans the basins of all three major rivers (Mahanadi, Brahmani and Baitarani), was affected by the floods the most. Mr. Niranjan Sahoo, founder of JBSS and a local resident of Kendrapada, was well-versed with the situation. As an orientation exercise, he explained the geography and demographics of the area along with the causes, coverage and impact of the floods. We then surveyed most of the villages in the Brahmani-Baitarani belt. As we visited different villages, we (the LifeNet and TISS team) interacted with local residents (farmers, villagers) and tried to develop a clear understanding of the situation pertaining to communication and related issues. The villages we visited were Aul, Rajkanika, Rajnagar, Damodarpur, Bara Dahumunda, etc. (red block of Figure 1). On our way back to the JBSS office, at the close of the day's work, we got a chance to meet Mr. Pradeepta Kumar Patnaik, the Collector of Kendrapada district. Though the meeting was more relevant to the other NGOs involved in the field trip (YDF, Goonj, JBSS), the Collector seemed interested in trying out LifeNet when the idea was put forth very briefly.

Day 2 - We surveyed the flood-affected villages in the Mahanadi river basin. Mahakalpada, Diha Buspur, Noindipur, Talka pada, Madhupur and Jambu Dweep were the villages surveyed on this day (green block of Figure 1). We also got a chance to meet Mr. Bipra Mohanty, Block Development Officer of Garadpur. His block was one of those worst affected by the flood in the Kendrapada district, and he had extensively documented the relief operations. He freely shared all his data with us and expressed keenness to test something like LifeNet, which would improve communication during disasters and otherwise. He insisted that in every disaster, and in normal times too, the poorest of the poor suffer the most and should be the primary target of interventions.

Day 3 - We returned to Bhubaneshwar early in the morning and visited Radio Namaskar, the first and only community radio station in the state of Odisha. We had a session of detailed interaction with Mr. Ansari, the founder of Young India, the NGO responsible for running Radio Namaskar. Radio Namaskar has ICT4D as its major focus area and provides related services and shows. It also plays a crucial role in providing timely cyclone updates and market-rate updates to fishermen.

Findings

(I) Impact of floods on infrastructure

1. Most of the villages were completely submerged in water during peak floods.

2. Power supply was cut off for more than a week in most places; in some places, for more than two weeks.

3. Cellular networks were non-functional for more than two weeks in the entire flood-affected zone.

4. Power supply remained intermittent even after several days had passed.

5. Concrete or tar roads were broken in some places rendering the flood affected areas inaccessible to vehicles.

6. In many cases embankments were completely destroyed. Embankments were the only road to many villages, and their destruction made those villages completely inaccessible.

7. Many mud houses were uprooted or destroyed completely.

8. Public buildings such as schools were severely damaged or partially destroyed.

(II) Communication preparedness

The preparedness to communicate efficiently during any emergency situation can be assessed by looking at the following parameters:

Type of emergency

Resource availability

State of available infrastructure

Effectiveness of the early warning protocol

State of available technology

Since floods and cyclones are commonplace in Odisha, the State authorities were better prepared to respond to the situation, in terms of keeping regular contact with the respective departments of water and meteorology for early warning. Villagers and local residents were also found to be more alert.

Resources can be of the following kinds: (1) human resources, (2) monetary resources and (3) backup storage capacity. In Odisha, human resources are abundant because of frequent disasters and the presence of several NGOs. Government officials are required to remain on alert. Backup storage capacity is reasonable, and the government does maintain provisions for backup storage in critical zones.

Infrastructure can be classified into mainly the following categories: (1) physical infrastructure such as buildings, (2) power supply infrastructure, (3) transport infrastructure and (4) communications infrastructure. Physical infrastructure was partially or completely destroyed in all the flood-affected zones. Power supply infrastructure was hampered for two to three weeks after the disaster. Transport infrastructure collapsed as roads and embankments were broken in many places. Communications infrastructure was not physically destroyed, but was rendered useless due to loss of power. Hence it can be surmised that, infrastructure-wise, Odisha was not prepared to handle the flooding. In spite of the high frequency of floods and cyclones, Odisha's infrastructure remains very weak.

The early warning protocol in Odisha works as follows: the Water Department (for floods) or the Meteorological Department (for cyclones) contacts the State authorities. The State authorities then contact the Collectors of the affected districts. The Collectors inform their District Control Officers, who in turn contact local police stations. Local police personnel then visit individual villages and broadcast the news using loudspeakers. As can be seen, vital information has to percolate down many levels of hierarchy before it is actually delivered to the intended recipient. Though the information flow did happen smoothly in the case of the latest floods, this protocol is quite susceptible to delays in message delivery.

Cellular communication remains the most popular technology, both for early warning and in the aftermath of disasters. When it failed, as in the case of these latest floods, government authorities fell back on the VHF sets installed in every district control room and police station. In spite of having VHF sets, people still had difficulty coordinating their communication in the immediate aftermath of the floods. No other technology, such as SMS blasting, is in use in Odisha.

(III) Learnings for a communication solution designer

Assume that physical infrastructure such as buildings would be partially or completely destroyed. Hence, there would be no control over the placement of communication equipment. The equipment could be placed anywhere depending upon the availability of space and the network should self-organize accordingly.

Assume that for the first few days power supply will be cut off. Hence the communication solution should not depend on mains power for its electrical requirements.

Assume that existing communication infrastructure will be rendered partially or completely useless, either directly because of physical destruction or indirectly because of power supply failure. Hence the communication solution should be designed to leverage traditional communication infrastructure wherever it survives and to fall back on alternate communication mechanisms where it fails.

Assume that transportation infrastructure such as roads and embankments will be destroyed in places, rendering vehicle navigation extremely difficult and, in several situations, impossible. Hence, communication equipment should be light and compact, and should fit easily in a suitcase or backpack that disaster relief volunteers can carry onto the field on foot.

Conclusions and take-aways

Existing communication technologies such as cellular networks, VHF, ham radio and community radio rely heavily on infrastructure. Of these, only cellular networks have a ubiquitous presence; nevertheless, their coverage remains patchy in rural areas, and they often fail due to power outages in disasters such as the Odisha floods. The other alternatives are extremely costly and have operational issues; due to their lack of flexibility, their use remains fairly limited in practice. Overall, many technology gaps still exist, and bridging them with current solutions costs a lot of resources.

Coordination of field efforts remains a huge problem due to the lack of a communication paradigm that provides effective network visualization capabilities. During disaster relief, coordination is required across many levels: (1) between employees of one NGO in the field, (2) between employees of different NGOs in the field and (3) between headquarters and field volunteers. The network should facilitate data communication to achieve these goals.

LifeNet has the capability to bridge gaps in existing communication during disasters because (1) it is a data solution, (2) it is flexible and self-organizing, (3) it uses no infrastructure and (4) it is inexpensive.

Porting LifeNet on Android (CyanogenMod)

hrushim, 2011-05-28
http://thelifenetwork.org/blog/index.php/2011/05/28/porting-lifenet-on-android-cyanogenmod

The procedure for porting any Linux-based software that involves a lot of native code (both user and kernel level) onto the Android platform is not clearly documented anywhere. When we started porting LifeNet onto the Android platform, we naturally ran into several problems and roadblocks. One needs to glean useful information from all over the web, and things become more difficult if one does not have a clear picture of what information is useful in the first place. This article documents our porting effort end-to-end. Though I have tried to go into most of the porting details here, I won't claim that I have covered everything under the sun. Some steps are either too obvious or not relevant enough to warrant a mention. However, I have tried my best to point the reader to appropriate webpages wherever necessary. I believe this article will be useful to a reader interested in contributing to the development of LifeNet, especially on the Android platform. It will also serve as a useful reference for readers interested in porting their own software to Android.

Our porting effort can be broadly divided in four phases as follows:

Phase 1 - Preparation

Phase 2 - Cross-compilation of the user-level native code

Phase 3 - Cross-compilation of the kernel-level native code

Phase 4 - Packaging as an Android app

Phase 1 - Preparation

This phase involved downloading and installing the software necessary for the port. Alternatively, this phase could be called the 'Trial and Error' phase, simply because that is what happened. We were quite well-versed with Linux, but the Android platform was new to all of us. This phase started with the intent of finding answers to some questions: Which phone to buy? How to gain super-user privileges on the phone? Where and what are the tools for cross-compiling native code? Which kernel to use for cross-compilation? And so on. Finally, we agreed upon the following prerequisites for the porting effort:

Rooted Phone

LifeNet needs super-user privileges to execute. We bought a Developer Phone from Google, thinking it would have all the necessary tools for development and hacking, but our expectations were short-lived. The developer phone was supposed to come with all the features and tools for development, but what it really had was only support for developing and testing Java applications. Leave aside systems hacking, we were not even able to execute 'su' in the phone's terminal. Fortunately, rooted custom ROMs came to our rescue. We downloaded a rooted CyanogenMod ROM from this website and then flashed the phone's firmware with the new CyanogenMod ROM using directions from this website. Now we had a phone truly ready for experimentation!

Correct Kernel Source

The LifeNet code is a suite of user-level applications and one kernel module (KM). To cross-compile the KM for the phone's kernel, we first had to get that kernel. From the phone, we figured out that the kernel version was '2.6.34.5-cyanogenmod'. We downloaded the appropriate kernel source from here (CyanogenMod's repository on GitHub). We then cross-compiled this kernel to generate the kernel symbols. Only after that were we able to cross-compile the LifeNet KM for Android. More details follow in the sections below.
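For reference, the running kernel version can be read directly off the phone over adb (assuming the SDK's adb is installed and USB debugging is enabled); this is how one finds a version string like '2.6.34.5-cyanogenmod':

```
adb shell cat /proc/version
```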

Android SDK

The Android SDK is a collection of tools and APIs for application development on Android. Of these, the Android debugger (adb) proved most valuable, as it is needed for accessing the phone's terminal (shell) and transferring files to and from the phone. Clear instructions for downloading and installing the SDK are given here.

Android NDK

The Android NDK is a collection of tools and samples for cross-compilation of native applications. The toolchains necessary for cross-compilation are present in the NDK. The NDK also has a script called 'ndk-build', which we eventually ended up using to invoke the cross-compilation of all our user-level native code, and a number of user-level sample projects that helped us write the Makefiles for our own user-level native code. Other than the toolchain, the NDK did not provide any support or samples for cross-compiling kernel-level code.

Eclipse with ADT plugin

In the final phase of porting, we needed to rewrite our existing Java applications, such as the messaging application, because the UI and control flow of Android applications differ from standard Java. We also needed to write additional code for loading and unloading the entire LifeNet application. For this, we used Eclipse with the ADT plugin. Instructions for installing the ADT plugin for Eclipse can be found here.

GCC 4.3.* or lower

Our host machine ran Ubuntu 10.04 (32-bit) with gcc-4.4.3. We tried hard, but we could not get the kernel to compile with gcc-4.4.3. Compile-time errors kept popping up continuously as we went on resolving them by making minor modifications to the offending kernel files. After a point we realized that there were so many errors that something major must be going wrong somewhere. After searching extensively on the web, we got a hint on a forum (unfortunately, I am not able to recall/google its name/location) that gcc-4.4 had much stricter compile-time checks than its previous versions. We immediately realized that we were getting a large number of errors because of these stricter checks. When we downgraded to gcc-4.3.4, the kernel cross-compiled magically, without a single error! The point of this story is to emphasize the use of gcc-4.3.* or lower.

Phase 4 was a standard process, followed by all Android application developers and well documented everywhere, so I will not write much about it. The majority of our time was spent in Phase 2 and Phase 3.

Phase 2 - Cross compilation of native user code

From prior experience, we knew that our overall approach should be to start small: begin with small tasks that have fewer dependencies and, as we develop a deeper understanding of the platform and related software environment, move on to more complex ones. Hence, we decided to cross-compile the user-level native code before moving on to the kernel code. There were open questions, such as: how much of the original code would we have to change? Was it possible that some core networking and socket functions used in our code were not implemented on the Android platform? It was impossible to find these answers beforehand; they would reveal themselves only as we proceeded.

We first decided to cross-compile and test a simple helloworld program written in C on our Android phone. The NDK had some samples, but we decided to write our own sample program that did just one thing: print the string "Hello World from LifeNet!". The code is shown below.

/* helloworld.c - Simple user program that prints a string */
#include <stdio.h>

int main()
{
    printf("\nHello World from LifeNet!");
    return 0;
}

Then came the next obvious question: how to cross-compile helloworld.c? The Android NDK came to our rescue. It contained the exact toolchain we needed; we used the 'arm-eabi-4.4.0' toolchain from the NDK. After reading some documentation, we found that the 'ndk-build' script, present in the root directory of the NDK, should be used to invoke the cross-compiler. However, this script works only when the source code directory is arranged in a particular fashion and has a special Makefile called Android.mk. From the samples in the NDK, we found that the structure of our helloworld module should be as follows:

Helloworld

-> jni

-> jni -> Android.mk

-> jni -> helloworld.c

-> libs

-> obj

The project root directory was Helloworld. It had three directories, namely jni, libs and obj. The source file (helloworld.c) and the Makefile (Android.mk) were kept in the jni directory. We inferred the structure of our Makefile by looking at the Makefiles of the samples provided in the NDK. It looked like:

# Copyright (C) 2009 The Android Open Source Project
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
LOCAL_PATH := $(call my-dir)

include $(CLEAR_VARS)

LOCAL_MODULE    := helloworld
LOCAL_SRC_FILES := helloworld.c

include $(BUILD_EXECUTABLE)

For cross-compilation, we just had to invoke the ndk-build script from the project's root directory. An executable ready to run on the ARM platform was generated in the libs directory. Using the Android debugger (adb), we pushed the executable onto the phone and verified that it executed as expected.
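Concretely, the build-and-deploy loop looked something like the following; the on-device path /data/local/tmp is our choice for illustration, not something mandated by the setup, and both ndk-build and adb are assumed to be on the PATH:

```
cd Helloworld
ndk-build                 # cross-compiles jni/helloworld.c via jni/Android.mk
adb push libs/armeabi/helloworld /data/local/tmp/helloworld
adb shell chmod 755 /data/local/tmp/helloworld
adb shell /data/local/tmp/helloworld
```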

At this stage we were fairly confident about the cross-compilation process for the LifeNet user modules, but questions such as how much of the original code we would have to change were still unanswered. The native user-level code of LifeNet mainly consisted of two modules, Sniff and Inject. I will now describe the cross-compilation of Sniff; the cross-compilation of Inject was very similar and does not require further explanation.

Similar to the helloworld directory we organized the Sniff directory as follows:

Sniff

-> jni

-> jni -> Android.mk

-> jni -> sniff.c

-> jni -> display_functions.c

-> jni -> manifold_routing.c

-> jni -> network_functions.c

-> jni -> socket_functions.c

-> jni -> stats_log_functions.c

-> jni -> supporting_functions.c

-> jni -> sniffer.h

-> libs

-> obj

The Makefile Android.mk was as follows:

# (Apache License, Version 2.0 header, identical to the one in the
# helloworld Android.mk shown earlier)
LOCAL_PATH := $(call my-dir)
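Following the helloworld pattern, with the source files listed in the directory tree above, the remainder of the Makefile reads along these lines (the module name 'sniff' is our assumption):

```
include $(CLEAR_VARS)

LOCAL_MODULE    := sniff
LOCAL_SRC_FILES := sniff.c \
                   display_functions.c \
                   manifold_routing.c \
                   network_functions.c \
                   socket_functions.c \
                   stats_log_functions.c \
                   supporting_functions.c

include $(BUILD_EXECUTABLE)
```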

Then we tried cross-compiling it just as we had the helloworld program, but obtained some compile-time errors. We quickly figured out that these errors were due to incorrect include files. For example, our original code included the header file "ethernet.h", which does not exist on the Android platform; we had to include "ethertypes.h" instead. After these minor modifications, Sniff cross-compiled cleanly with ndk-build.

Phase 3 - Cross compilation of native kernel code

This was the hardest step of all. Sadly, there is no clear documentation or explanation of how this is done, so I will try to cover what we did in detail here. The major steps in cross-compiling a Linux kernel module for the Android platform are as follows:

a. Getting hold of the correct kernel source

b. Configuring the kernel

c. Cross compiling the kernel

d. Cross compiling a hello world kernel module using the newly compiled kernel

e. Verifying that the cross compiled kernel module can be indeed loaded into the phone's kernel

f. Cross compiling the lifenet kernel module

I will elaborate on these steps now:

a. Getting the correct kernel source

This has been explained in detail in the previous section.

b. Configuring the kernel

This could be done by two approaches:

Approach 1 - Grab the kernel configuration from the phone.

Approach 2 - Create a new kernel configuration

We went with approach 1 for two simple reasons: (1) less effort and (2) less chance of misconfiguration. Approach 2 would simply have taken more time and was not worth doing when the existing phone kernel configuration could easily be obtained.

We connected the phone to our host machine through the USB debugging cable and grabbed the phone's kernel configuration as follows:
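The commands were roughly the following; this relies on the kernel exporting its configuration at /proc/config.gz (the CONFIG_IKCONFIG_PROC option), which CyanogenMod kernels generally enable:

```
adb pull /proc/config.gz .
zcat config.gz > .config
```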

The '.config' file contained the phone's configuration. We copied it into the root directory of the kernel source, where we then ran the following commands to complete the configuration.
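The configuration was completed with kbuild's oldconfig target, and step (c) is then the kernel build itself; the arm-eabi- toolchain prefix (from the NDK) and the job count below are our assumptions:

```
# b. finalize the phone's .config against this kernel tree
make ARCH=arm CROSS_COMPILE=arm-eabi- oldconfig
```

c. Cross compiling the kernel

Building the configured kernel generates the kernel symbols needed later to compile external modules against this tree:

```
make ARCH=arm CROSS_COMPILE=arm-eabi- -j4
```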

d. Cross compiling a hello world kernel module using the newly compiled kernel

Once we had successfully cross-compiled the desired kernel, the next step was to use it to cross-compile a simplistic hello world kernel module and verify that it could indeed be loaded onto the phone. These steps completed the cycle and were necessary to uncover any other major roadblocks before cross-compiling the actual LifeNet kernel module (thankfully, there were none). We used the hello world kernel module from here:
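The module we used was of the standard hello world form; a representative version (the file name, function names and messages here are illustrative, not the exact code we used) looks like:

```c
/* hello.c - minimal kernel module used to verify cross-compilation */
#include <linux/init.h>
#include <linux/kernel.h>
#include <linux/module.h>

static int __init hello_init(void)
{
    printk(KERN_INFO "Hello World from LifeNet KM!\n");
    return 0;
}

static void __exit hello_exit(void)
{
    printk(KERN_INFO "Goodbye from LifeNet KM!\n");
}

module_init(hello_init);
module_exit(hello_exit);
MODULE_LICENSE("GPL");
```

It is cross-compiled with the usual external-module invocation, make -C <kernel-tree> M=$(PWD) modules, pointing at the kernel tree built earlier.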

After the helloworld module was successfully cross-compiled, it was necessary to ensure that it could be loaded into the phone's kernel. We verified this by pushing helloworld.ko onto the /sdcard partition of the phone and loading it into the kernel with the 'insmod /sdcard/helloworld.ko' command, using the Android debugger (adb). Reaching this stage ensured that we now had a properly cross-compiled kernel, which could be used to cross-compile any other kernel module.

f. Cross-compiling the LifeNet kernel module

Once we reached this stage, we were confident about the entire cross-compilation process, and cross-compiling the LifeNet kernel module was straightforward. We had to make a few changes to the Makefile; some compile-time flags were hard-coded. The Makefile looked like the following:
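A Makefile of the kind described, with the cross-compilation flags hard-coded, would look roughly like this; the kernel tree path, the toolchain prefix and the module object name 'lifenet.o' are illustrative assumptions on our part:

```
# Cross-compile the LifeNet kernel module as an external module.
# KDIR points at the CyanogenMod kernel tree cross-compiled earlier;
# the arm-eabi- prefix comes from the NDK toolchain.
obj-m := lifenet.o

KDIR  := $(HOME)/cyanogenmod-kernel
ARCH  := arm
CROSS_COMPILE := arm-eabi-

all:
	$(MAKE) -C $(KDIR) M=$(shell pwd) ARCH=$(ARCH) CROSS_COMPILE=$(CROSS_COMPILE) modules

clean:
	$(MAKE) -C $(KDIR) M=$(shell pwd) ARCH=$(ARCH) CROSS_COMPILE=$(CROSS_COMPILE) clean
```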

At this stage, we knew we had cleared all the hurdles; the remaining porting effort was straightforward. We had to do the following: (1) further modify our user-level native code to make sure that all persistent storage was directed to the sdcard and not to the operating system partitions of the phone; (2) rewrite our Java apps, such as the messaging application, because the UI and control flow of Android are completely different from traditional Java; and (3) lastly, combine the user-level native apps, kernel-level native code and Java apps into a single Android app.