Thursday, 4 December 2014

We have a small garden at home and adding sensors to it was merely a matter of time, especially now that winter is coming and it is time to plant our pepper and tomato plants. Below are the two iterations made to my wireless gardening sensor (still in beta stage).

Prototype A: Intel Galileo on board

For the first prototype I used the Intel Galileo board I won at Senzations; the objective was to test the sensors, familiarize myself with the obtained values, and test publishing to Ubidots. For the test I used my wife's pepper plant.

Light, temperature and humidity are variables easy to understand and correlate, but as there was little information about the soil moisture sensor itself, I measured it both in free air and submerged in a glass of water, and found the range was between 0-700 units. The next step was to water the pepper pot and note the soil moisture value when the plant is fully watered, then watch the chart go down until the next watering session (when the leaves are "sad", as my wife says), so we can see at which values we have to trigger an alert.

Prototype B: Spark Core

With the sensors and the Ubidots platform figured out, the next step was to make the whole thing run on battery as a stand-alone device. For this I used the Spark Core I won at the IoTogether hackathon, along with a battery charger board I "borrowed" from work and a 3.7V 800mAh Li-ion battery wired to the charging circuit.

To avoid having to solder a wire to the USB 5VDC pin I added some jumper logic to enable charging the battery when connected over micro-USB; otherwise the Spark Core is powered from the battery alone. I also have a power input to throw in a solar panel, charging the battery during the day and discharging it over the night, but I still have to figure out how to adapt it to the enclosure.

I added two Phidget-like connectors to be able to attach Phidget sensors or analog ones following the same pin-out (VDC/GND/Signal), and one Ziglet-like connector for any I2C-based sensor, since ultimately I want to use digital sensors to keep the power consumption as low as possible. I also wired a GPIO pin to the connector so the sensors can raise interrupts.

The male pin-header exposes unused GPIOs for later use. One wandering idea is to add an MP3 board with an amplifier and a small speaker, as allegedly this helps plants grow, or maybe play back recordings of my wife talking to the plants (which one was it?). It would also be kinda cool to play nature sounds when presence is detected... this will likely be an improvement to make if the power consumption can be kept low.

One caveat: I was one of the unlucky Spark owners who had a board with faulty DNS resolution, so I had to include an external DNS client to resolve the Ubidots IP address, then add the host property to my header and initialize the server IP address as shown below:

Then, to take advantage of the Spark's low-power mode and save as much battery as possible, I use SLEEP_DEEP_MODE to put the Spark to sleep and wake it after 5 minutes, restarting the firmware with no memory retention, which is fine in my case as I only want to take single readings and upstream them. The code runs as follows:

The AWAKE_BEFORE_SLEEP delay makes sure the Spark Core stays awake for 20 seconds, which gives me enough time to reprogram the Spark over the Web IDE from my PC without having to connect the Spark to the host over USB. One of the things on my to-do list is to measure the current consumption of the device.

The whole thing fits into a standard enclosure. Still pending are adapting the sensors to the enclosure, making a small window to be able to see the LED, and fixing the solar panel in place. I have convinced my daughters to paint the enclosure with a festive theme, so surely I will post this anytime soon.

So that's it; I'm hoping to have time over the holidays to improve Prototype B, make some measurements and work on the solar panel. One of the things I will surely test is the cheap ESP8266 WiFi board, but with my Photon already ordered in pre-sale for next year, I think it will be worth the wait, in time for Prototype C, maybe even a release.

Following the end-of-year tradition of updating production boards, I found an ISEE IGEP v2 board running a Linaro distribution on an Oneiric-based release, which reached end of life in May 2013. One option would be upgrading to a new LTS distro, but as time was limited and the current owner has a strict if-it-works-don't-touch policy, I chose instead to at least update its sources:

W: Failed to fetch http://security.ubuntu.com/ubuntu/dists/oneiric-security/main/source/Sources 404 Not Found

Just edit the /etc/apt/sources.list file and replace with the following:
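A working replacement looks like the following (a sketch; the exact component list is an assumption, but EOL Ubuntu releases do move to the old-releases archive):

```shell
# EOL releases are archived at old-releases.ubuntu.com (run as root)
cat > /etc/apt/sources.list <<'EOF'
deb http://old-releases.ubuntu.com/ubuntu/ oneiric main restricted universe multiverse
deb http://old-releases.ubuntu.com/ubuntu/ oneiric-updates main restricted universe multiverse
deb http://old-releases.ubuntu.com/ubuntu/ oneiric-security main restricted universe multiverse
EOF
apt-get update
```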

Wednesday, 3 December 2014

Plenty has been said about the Shellshock vulnerability and its solutions, most of them consisting of upgrading the bash package on LTS distros, but lately, as I dusted off my ALIX board running the Voyage 0.9.0 distribution, I found this was not an option: even after upgrading and downloading the bash package from the dist pool, there were missing requirements to upgrade/install bash from the package manager. This was my current bash version:

I found a fix at ShellShocker and it was as easy as running the snippet below (although I would not recommend executing remote scripts; it is not good practice), but if you are curious about what it does, or you want to run the steps yourself, the sources are also listed below.

curl https://shellshocker.net/fixbash | sh

After running the script, bash has been patched and the Shellshock test now omits the "busted" line.
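For a quick standalone re-check, the classic CVE-2014-6271 one-liner works too; a patched bash prints only the echoed text:

```shell
# an unpatched bash also prints "vulnerable" before "ok"
env x='() { :;}; echo vulnerable' bash -c 'echo ok'
```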

Deleted inodes are a common problem when working with SD cards, especially noticeable in ALIX-based boards running Voyage or similar. Remove the SD card, connect it as an external drive to your host (I'm connecting to an Ubuntu-based VM) and do the following:

Identify your disk using the information above, then check and repair the file system using e2fsck; when prompted you can accept the suggested fix by pressing "y". This should be the output if no errors are found.
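The commands I used look roughly like this (the device node is an example; double-check with fdisk before touching anything):

```shell
# list attached disks to identify the SD card (e.g. /dev/sdb1)
sudo fdisk -l
# force a full check; answer "y" to the suggested fixes when prompted,
# or add -y to accept them all automatically
sudo e2fsck -f /dev/sdb1
```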

Tuesday, 2 December 2014

I luckily found out about 6lbr some months ago (I have an unfinished post about it, loving foren6 also), but I needed to stop the service from being launched at boot while I did some testing. As I didn't find any configuration tweak to do so, I disabled the service at all /etc/rcX.d run levels using the well-known update-rc.d script.
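For reference, the equivalent commands (assuming the init script is installed as /etc/init.d/6lbr; match the name on your system) would be:

```shell
# keep 6lbr installed but prevent it from starting at boot
sudo service 6lbr stop
sudo update-rc.d 6lbr disable
# on older setups without "disable", removing the rc links also works:
# sudo update-rc.d -f 6lbr remove
```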

Monday, 1 December 2014

Short story: LG SmartShare DLNA media sharing service was slowing down Windows boot abnormally, and the built-in settings do not provide the polite option to disable launching the application at boot. After checking the usual suspects (msconfig and Startup folder), I found where the SmartShare task was being scheduled.

Sunday, 30 November 2014

IoTogether was a competition event for 7-8 co-located teams in Trento and Barcelona. Remote teams were composed of participants from both sides, who were asked to interact via the Future Internet videoconferencing tool provided by the SPECIFI project.

The idea was as freaky as it could get; it is a lot funnier to start explaining our team's concept from the prototyping stage up to the final application...

At the Barcelona location we implemented a force sensor with a 3-axis accelerometer integrated into a wool glove, measuring impact force, movement and acceleration, displayed over a LED bar for our viewing pleasure, and hooked up a GSR (Galvanic Skin Response) sensor to measure the "arousal" level of its wearer. To interface the sensors we used an Arduino bundled with a WiFi module, sending the sensed data over Glue.Things to the Trento location.

At the Trento side there was a dildo-shaped lamp (yes, a dildo), changing colors according to the data received from the glove by selectively turning on RGB LEDs, using force, acceleration, arousal and muscular response data... well, you know, leave it to the Italians to sex-it-up a hackathon :)

One of the main challenges of the event was actually setting up everything on each side while talking to the other group; unfortunately the co-location resources went bad and we ended up talking over Skype and emulating both inputs and outputs in isolation, but in the end we were glad it worked out.

The official name of the project was "The LoveTotem"; to be honest, "the horny lamp" would be a better name. But regardless of the name of the actual use case, what was interesting about the project, and the event, was putting a bunch of people together with the right tools and attitude to create something, whatever, as long as it works. You cannot save the world with a killer app in 4 hours, but building "anything" is the first step to get more stuff rolling... who knows? Everything can be reused and interpreted in many different ways: with a different angle this would also make a good physical rehabilitation tool, or a social game to draw introverts into social engagement through physical interaction with others. Nevertheless, I loved the lamp challenge... it was so out of my field that it was cool to try to design a matching application to feed love to the Totem.

This was the official description of the project:

"LoveTotem": Suppose you are too far from your sweetheart and you want to know how really he/she feels, or share emotions without talk! Your partner can wear the LoveSensors: when you'll turn on your LoveTotem, you'll see it take strong/hot color if your partner is excited, or soft blue light if is relaxed. Love Totem uses an open hardware platform (Arduno Yun) and human sensors like (Muscle sensor,Temperature/Humidity, GSR) in order to track the state of a person and transmits these data through dedicated API to a cloud platform (GlueThings). On the other side, another Love Totem will retrieve these data, representing them with a mix of some colored leds. Leds are placed in a lamp, which we have designed and made using a laser cutter and a 3d printer.

In the end our team was one of the two winning teams of the event, with a caveat: the team line-up posted on the SPECIFI site is wrong, as Barcelona's team members are swapped with another team's (Aitor is actually a colleague from work, but in a different group). The correct line-up is the following: Andrés Hernández Casaus, Hector Esteller, Pablo Carbajal and yours truly.

We got plenty of goodies, including Spark Cores for each of the members, sensors (I got a soil moisture sensor, the GSR and a couple of buzzers), along with a Shield-Shield.

Here is the full list of the developed projects:

"SeeTy": a urban garden community that merges gardens around the world.

"EmotionalBag": a bag monitoring and displaying its user's emotional status.

"SmartGlove": a glove that senses the heart beat and the level of stress.

"BluePresence": a smartwatch managing smartlights with power consumption profiling.

"CarFinder": car tracking on the web with a GPS sensor.

If you ever stumble upon a hackathon organized by the same people behind IoTogether, be sure to attend. Besides the good folks assisting the event and the electronic goodies, the food was sincerely amazing: no cold pizzas and warm beers, but a fully catered event with food coming and going at the pace of cold beer and hot coffee, just what a maker needs.

Saturday, 29 November 2014

For development I tend to keep more than one virtual machine sandbox, normally Unix-based running on a Windows host (because of reasons), so from time to time I have to clean up the garbage and keep the user-disk quota as low as possible to avoid eating up the VM drive.

The original author gives a pretty good explanation of the command syntax (basically list the kernels and strip the output down to usable name strings to uninstall them), so I'm just going to copy here the actual command:
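A widely circulated variant of that one-liner, which lists installed kernel packages, filters out the running one and purges the rest (review the list before confirming), is:

```shell
# purge all installed kernels except the one currently running
dpkg -l 'linux-*' \
  | sed '/^ii/!d;/'"$(uname -r | sed "s/\(.*\)-\([^0-9]\+\)/\1/")"'/d;s/^[^ ]* [^ ]* \([^ ]*\).*/\1/;/[0-9]/!d' \
  | xargs sudo apt-get -y purge
```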

Short story: I was tired of having to Skype or mail files to the secretary for her to print them on my behalf, and working in a highly tech-oriented company, it was embarrassing, so the Raspberry Pi came to the rescue once again...

And that should be all. Next we need to add the printer connected to the Raspberry Pi over the USB port; the process should be straightforward. The CUPS driver provides a web server accessible at the RPi's IP address and the port assigned in the cupsd.conf file.
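For completeness, a minimal sketch of the CUPS setup on Raspbian (the user name and LAN-access choices are assumptions; adjust to your setup):

```shell
sudo apt-get install -y cups
# let the "pi" user administer printers from the web interface
sudo usermod -a -G lpadmin pi
# allow access from other hosts on the LAN (port 631 by default)
sudo cupsctl --remote-admin --remote-any --share-printers
sudo service cups restart
```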

I recently was lucky enough to attend the technical track of the Senzations 2014 Summer School hosted in Biograd na Moru, Croatia. It was an incredible experience traveling to Croatia; without taking merit away from the event itself, one of the things I enjoyed the most was the city experience, and of course the boat trip to the Kornati National Park.

The lecturers were great, gave plenty of insights on Wireless Sensor Networks, M2M and IoT, and shared their current work in the field. Most of the slides are available at the Program website, but I thought about sharing some of the presentation links below:

As Intel was supporting the event, there were plenty of Intel Galileo development kits to prototype our very own IoT-driven applications, which was the core of the event: divide into teams and create an IoT application from prototype to business plan. Here's mine in the prototype phase, with plenty of Grove sensors from SeeedStudio attached.

Our team, the DreamTeam, scored big time and was one of the winners of the 4-day hackathon with our project, City Karma, which had its own dorky video as well! The main idea was to target the lack of social awareness in cities, and City Karma was born:

The application was implemented using a Python script running on the Galileo board, monitoring 3 types of events: a loud scream for help, an emergency button and an assistance button, then posting a Twitter message indicating the location, type and date of the event, with a randomly generated Karma Code. A person following the #CityKarma hashtag or the CityKarma Twitter account could then see the new event and reply to the Twitter message to inform the person in distress that help is on its way.

A mobile application would also flash this alert to the screen, by monitoring the City Karma feed and using the user's location to see if the user is nearby.

Then the helping hand would get Karma Points, plenty useful to show off and maybe get a free espresso or a discount at affiliated partners, maybe a nice tax reduction? What would it take for you to go out of your way and help a stranger? Could you ignore a person near you asking for help? Let's hope the people frequenting this blog are natural Karma sponges; if not, remember this:

When you carry out acts of kindness you get a wonderful feeling inside. It is as though something inside your body responds and says, yes, this is how I ought to feel. - Harold Kushner

The application was powered by MQTT over WiFi/GPRS, using a local Node-RED server to receive the help message, parse it and post it to Twitter, and also to track the Twitter feed for responses, posting an update on the MQTT topic to notify the person in distress, by means of a LED notification, that their help request has been answered.
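As a rough sketch of the flow, a help event could be pushed to the local broker with mosquitto_pub (the topic name, broker address and JSON fields here are illustrative, not the project's actual schema):

```shell
# publish a hypothetical CityKarma help event to the local MQTT broker
mosquitto_pub -h 192.168.1.10 -t citykarma/events \
  -m '{"type":"assist","lat":41.3851,"lon":2.1734,"karma":"KC-4242"}'
```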

If you want to take a peep at the code it is uploaded in my Github, keep in mind this is not production-ready code and was done only as a proof-of-concept.

To wrap this up, I'm really looking forward to next year's event; I hope I can attend as either a participant or a lecturer. It was an incredible gathering of talent and knowledge, and a fun crowd to hang out with. A big thanks to Srdjan Krco (DunavNET) for organizing the event, Alex Gluhak (Intel Labs) for rolling out the hackathon and providing the equipment and tech support, and Charalampos Doukas/Jan Pernecky for the memories.

When I first started trying to set up Uniflash to work with the CC2538DK and Contiki, I stumbled upon this warning:

I first tried Uniflash v3.1.0.00026 with no luck, so I downgraded to version v2.2.00016. You might want to skip the next section, as it mostly describes the pain and futility of my first attempts with the latest Uniflash version, kept mostly as a severed-head-on-a-pike type of warning for others encountering the same errors, and maybe to lure a kind soul who has fought this monster and prevailed, willing to share the solution.

At the end the solution was to find the right combinations of magic ingredients:

This is not my preferred way to flash the CC2538DK, as with the bootloader backdoor unlocked it is now possible to program the devices over UART using the built-in bootloader BSL script, but there may be cases in which you accidentally flash an image with the backdoor disabled and need Uniflash to enable it again.

Some months ago I was working out of the office at a client's location and had to move my development environment there, but due to a tight and restrictive IT policy, me and some colleagues were unable to set up our current Git repository and share our work (the proposed zip-share-meld workflow was not tempting at all...). The easiest solution would have been to host the repository on my laptop, but then again, setting up our Git repo on a Raspberry Pi is more fun, especially when you have a video projector available at the office and a RetroPie running with 2 extra game controllers.

Wednesday, 23 July 2014

I recently purchased a WiFi dongle as the new addition for my Raspberry Pi, a cheap and amazing piece of work from eBay: a 150M USB WiFi Wireless Adapter with a 2dBi detachable antenna and the well-known Ralink RT5370 chip, for only 3€ (at purchase time). What else could you ask for?

OK, it took a while to arrive to Spain as it was sent from China, but the product was worth the wait!

Not only did it work flawlessly with the Raspberry Pi (mine is mounted on the wall next to my desk; sometimes I put it behind my monitor as the enclosure is VESA-compatible, see previous post), but it also worked out-of-the-box with my LG 47LA640S Smart TV, saving me from having to buy an "official" overpriced WiFi USB dongle (30€, 6 times the value!) or adding yet another wireless router to connect my TV over Ethernet.

To enable the Raspberry Pi to connect to an AP with a static IP address, just add this to your /etc/network/interfaces file:
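The stanza I use looks like the following (SSID, passphrase and addresses are placeholders for your own network):

```shell
# append a static-IP WiFi stanza to /etc/network/interfaces (run as root)
cat >> /etc/network/interfaces <<'EOF'
auto wlan0
allow-hotplug wlan0
iface wlan0 inet static
    address 192.168.1.50
    netmask 255.255.255.0
    gateway 192.168.1.1
    wpa-ssid "MyNetwork"
    wpa-psk "MyPassphrase"
EOF
```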

The Barcelona Activa folks at the Parc Tecnològic Barcelona Nord have opened a 3D printing space, available to the lucky people working in its facilities, featuring the amazingly cheap and cool BCN3D+ printer and the well-known MakerBot.

After a quick lesson and browsing through Thingiverse, I quickly found that the Raspberry Pi was a hot item with plenty of designs available to further enhance the RPi experience, so I took this thing (a VESA-mount enclosure for the Raspberry Pi) and did a quick test ride.

There are plenty of tutorials and freely available software on 3D printing, so I'm going to skip that and leave a 3D printing walkthrough for a future post.

I ran the .STL file through the GCode Analyzer to see how my case was to be printed and how much time it would take (about 2 hours, just the case without the top lid), then copied the GCODE onto an SD card, put it in the BCN3D+, calibrated (grumble, grumble), and the result is shown below.

The Raspberry Pi fits quite nicely in the enclosure; I only had to remove a loose thread or two to connect the HDMI connector. I have not printed the top cover, as I intend to connect many things to the device and I'm not sure yet how much space to leave for the cabling; maybe I'll use the cover as a base mount for a sensor, etc.

I'm planning on sticking the RPi behind my monitor to avoid seeing all the cables on top of my desk, and it also hangs OK on a wall... the good thing is that the Raspberry fits snugly in the enclosure, but with little effort it can be pulled off, allowing a quick swap.

Maybe it was caused by powering off my VM sandbox while syncing, or some other weirdness. Normally this has a very low impact, as you can always do a git clone, but as I had some changes done locally on my working branch that I didn't want to lose/re-do, this saved my day.
The original answer with full comments can be found at the link above; the brief version is:

cp -a .git .git-old
git fsck --full
# Remove empty files by using "rm", continue until none is left and the "missing blob" starts showing
git reflog
# It will show "fatal: bad object HEAD"
tail -n 2 .git/logs/refs/heads/master
# Identify parent of last commit (the one HEAD is pointing to), easily recognizable as it will show up twice
git show commit_parent
git update-ref HEAD commit_parent
git fsck --full
# There are some blobs left from outdated index, nuke and carry on
rm .git/index
git reset
# There should be only references to "dangling blobs", these are not errors, continue
git status
git add .
git commit -m "Recovering from lost objects"

Tuesday, 17 June 2014

The Raspberry Pi has an on-board audio jack, but the audio is generated by a PWM output with little filtering; sound quality, volume, and recording capabilities can be improved by using a USB sound card (these are pretty cheap too).

I bought a 4€ USB sound card from Amazon, but pretty much any other can be used; it has a stereo output and a mic input, which is perfect for voice-controlled projects (like the one I'm currently working on, remember Jarvis?). The audio card features a USB 2.0 type A connector and is powered from the USB 5VDC supply. I have not measured the current consumption (yet).

Preparing the installation

I'm running a Raspbian-based RetroPie from a USB stick; the following should not change anything in your configured settings, and the installation can be done on any Raspbian-based image.

sudo apt-get update
sudo apt-get upgrade

After 5-10 minutes you should have your OS upgraded and ready to go.

Recognizing the USB audio card

By default the ALSA driver does not allow the USB audio card to be sound device #0 (the default), so to change this we need to edit /etc/modprobe.d/alsa-base.conf (as sudo) and change the following line (normally commented-out):

#options snd-usb-audio index=-2

Remove the comment and change the index to 0:

options snd-usb-audio index=0

After rebooting or restarting ALSA, the audio card should be listed as the default sound device. As pointed out by Adafruit, there are different supported chipsets; mine was C-Media-based (CM108).

Testing the setup

I have hooked the USB audio card with a TPA2005D1 audio amplifier (roughly 6€) from Sparkfun and a 0.5 Watt magnetic speaker. I have to work on the sound quality to reduce some noise, change the audio amplifier's resistors to increase the gain, maybe throw in a potentiometer to allow changing the volume, and use speakers with a higher wattage.

To test the installation we can use the built-in functions provided by the OS.

pi@raspberrypi ~ $ speaker-test
speaker-test 1.0.25
Playback device is default
Stream parameters are 48000Hz, S16_LE, 1 channels
Using 16 octaves of pink noise
Playback open error: -16,Device or resource busy
I had to close the running EmulationStation scripts; then I was able to hear the sound:
pi@raspberrypi ~ $ speaker-test -c2 -D hw:0,0
speaker-test 1.0.25
Playback device is hw:0,0
Stream parameters are 48000Hz, S16_LE, 2 channels
Using 16 octaves of pink noise
Rate set to 48000Hz (requested 48000Hz)
Buffer size range from 96 to 262144
Period size range from 48 to 131072
Using max buffer size 262144
Periods = 4
was set period_size = 65536
was set buffer_size = 262144
0 - Front Left
1 - Front Right
Time per period = 5.564981
0 - Front Left
1 - Front Right
Time per period = 5.559085
0 - Front Left
1 - Front Right

And to test the recording I just connected a detachable mic from a headset.
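A quick loopback test with the ALSA utilities (device index and duration are just examples):

```shell
# record 5 seconds from the USB mic at CD quality, then play it back
arecord -D hw:0,0 -f cd -d 5 test.wav
aplay test.wav
```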

This should go smoothly but... there are small things to tweak, for example:

error: StorageVolumes.h: No such file or directory

If you include the STM25P component (external flash memory support), it fails to locate the StorageVolumes.h file, as this file is generated at build time from the definitions in your platform's target file (at tos/support/make):

One quick workaround is to build the application first and then copy the generated StorageVolumes.h file from the resulting build/platform location to your application path, then run the make docs command again.
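With hypothetical paths for a telosb build, the workaround looks like:

```shell
# build once so the header is generated under build/<platform>/
make telosb
# copy it next to the sources, then re-run the docs target
cp build/telosb/StorageVolumes.h .
make telosb docs
```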

Sunday, 25 May 2014

I wasn't able to post anything last Sunday due to my daughter's birthday, so this week's stumbling-upon arrives a bit late. Anyway, today I found a video interesting to show to non-coders, and especially to parents and youngsters: this new world is more about bits than atoms, and coding is not reserved to Computer Science majors but open to anyone willing to learn... sadly neither my school nor my college taught me how to code, so I was a late bloomer and an autodidact (now I code for a living); perhaps the video below can give you more reasons to start early.

What Most Schools Don't Teach (code.org)

Some of the biggest names in tech and Hollywood have joined forces in a new video for Code.org, a non-profit focused on computer programming education, to encourage students to take coding classes.

Monday, 12 May 2014

As mentioned in an earlier post, I recently got a demo kit from SIGFOX, and I finally had some time to start playing with it.

I had a head start, having been told that the starter kit was based on the Telecom Design TD1204, which provides a serial AT-based modem to allow communication over a serial link; this made testing easier, first with software tools like PuTTY and Hercules, and then moving to a Zolertia Z1 mote to drive the TD1204 over its serial port (at the end of the post!).

Note: you need a device registered in the Telecom Design cloud or the SIGFOX backend for some of the following steps, but nevertheless it is interesting to watch and to take a peep inside.

SIGFOX Back-end

I started by browsing the web-based SIGFOX backend, by clicking on the Device tab (as shown below) we can see the devices associated to our account, my starter kit device appears listed there.

I work in Barcelona, so I was interested in checking whether my workplace is under the SIGFOX coverage area; clicking the Location tab on the left shows an approximate coverage map. (By the way, I tried pushing a message from home in Cerdanyola del Vallès and it didn't reach the network, but in Barcelona proper the coverage is OK.) At first glance it would make more sense to me to represent the light-blue area as a circle rather than a square, but as I'm not sure how this is modelled, it is fine as it is for the moment.

Then the moment of truth, let's see what happens when I long-press-and-hold the button... once or twice to be sure, the message board is updated showing the content of the test message... I don't understand what the data means (180c62, 190c62), but I have a good idea about the rest of the fields...

Delay: time (in seconds) elapsed between the message being triggered and its reception by the network station.

TAP: station receiving the message; as these stations are identified, you can get the estimated location of your device (a very rough estimate).

RSSI (Received Signal Strength Indication): as shown, we have values ranging from -124 to -128dBm; as already discussed in my previous post, the sensitivity is expected to be quite low, as wireless range is favored by the low data rate and bandwidth. In the TD1204 EB page the sensitivity is stated to be close to -126dBm, and as we are using a 5dBi external antenna we can expect a boost (on one occasion I got an RSSI value of -132dBm, so this approximation seems to be OK).

Signal (dB)

Freq (MHz): channel used in the transmission; as shown above, the channel is changed for every retransmission attempt.

Rep: number of retransmission attempts needed for the packet to actually arrive at the network.

Callbacks: event triggered upon receiving the packet; below is an example of the callback generated by the user-button event (an email is sent each time to a given email address):

[OK]
- TAP 0146 -
1 second

200 -
{device} (name@mail.com)

The maximum transmission power of the radio transceiver is 14dBm (for the TD1204 DK), but it would be interesting to check if SIGFOX allows devices to operate on the ETSI G3 sub-band (869.4-869.65 MHz, 500mW), one thing pending confirmation (in my previous post I assumed so).

Hands-on: connecting SIGFOX to a Z1 mote

So OK, enough of the user button and the backend (we will return to them later when setting up callbacks and such); now let's move to the meaty part of the testing: sending a custom string message. After checking the TD1204 reference manual, I found the required serial settings to connect the starter kit to the PC:

Speed 9600 bps.

8 data bits.

1 stop bit.

No parity, no hardware/software flow control.
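On a Linux host the same settings can be applied with stty before talking to the port (/dev/ttyUSB0 is a placeholder for your actual device node):

```shell
# 9600 baud, 8 data bits, 1 stop bit, no parity, no flow control
stty -F /dev/ttyUSB0 9600 cs8 -cstopb -parenb -crtscts -ixon -ixoff
# commands are carriage-return terminated
printf 'AT\r' > /dev/ttyUSB0
```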

I first tested the serial communication using Hercules and the result was OK, as expected (remember to add a carriage return at the end of the string; more information about the expected command/response format is found in the TD1204 reference manual, section 2.4).

After verifying the serial communication then I moved to wiring the starter kit to the Z1 mote.

At first it seemed a good idea to cut off the male USB connector of a USB-to-mini-USB cable, strip the wires and solder them to the Z1 mote, wiring D+/D- to UART1's RX/TX (P3.7/P3.6) and powering the starter kit by soldering GND/VCC to the Z1's USBGND/USB+5V (which requires powering the Z1 through its micro-USB port)... but this was a rather bad idea: when connecting the TD1204 to the PC I noticed it uses an FTDI chip, which is normally a USB slave, thus requiring a USB master to communicate. The Z1 mote has a CP210x serial-to-USB converter too, but my original intention was to use raw serial communication over a free UART of the Z1 mote, so its USB port (wired to UART0) could be kept as a programming/debugging port, avoiding the JTAG port.

So the next logical step (of course, without a doubt) was to void the warranty and take a peek inside the demo kit... I love voiding warranties :-)

Finding the FTDI chip (FT232RL package) was easy. As suspected, there's a battery charger/modem-enable circuit, a 1000mAh LiPo battery and the user button on one side; on the other we can see the TD1202 radio transceiver and AT modem (sending an AT&V command is also a good way to find out about board specifics).

Next I needed to locate the serial RX/TX lines going to the TD1202 transceiver to bypass the FTDI; following the FTDI pin-out and the PCB traces, I noticed there were two 0-ohm resistors standing between the FTDI and the modem, a nice gesture as it only required applying a little heat and lifting the resistors instead of cutting the PCB.

The location of the RX (red), TX (white) and GND (black) pads are shown above, as well as the location of the resistors. We cross-wired the Z1 and TD1202 RX/TX lines, and the DGND pin of the Z1 to GND.

I could also have directly powered the TD1202 using the Z1's 3.3V power rail, but honestly I was more interested in testing the communication between the Z1 and the board than in worrying about how to power the device, so I kept using the mini-USB port, powering the TD1202 from the 5V delivered by the Z1's USB connection as shown below. A task for later is to replace the LiPo battery with one of larger capacity and hook it up to a solar panel, but that's material for another post.

I used a logic analyser to verify that the communication between the Z1 and the TD1202 was OK, and programmed the Z1 mote using TinyOS with a simple serial test sending an AT\n command every second, hoping to receive the AT command echoed back and an OK from the AT modem; the result is shown below.

Here's the snippet of code used for the test. The TMP102 temperature sensor callback event is shown on purpose, it will be used later to forward temperature readings from the mote.

Maybe someday I'll write a driver in Contiki/TinyOS to allow a more flexible, API-like way to communicate with the TD120x, or just port my sub-1GHz mote to support SIGFOX natively, but for preliminary testing sending raw strings is acceptable.

With the serial communication verified, I can now drop the logic analyser and just print the responses from the TD120x to the console by modifying the code, adding a first print after boot to check device information such as the version.

Z1 with a SIGFOX interface

After checking the TD120x documentation the command to send data is quite straightforward:

AT$SS=[HEX1][HEX2]... [HEX12]

As mentioned before, up to 12 bytes can be sent in any transaction, more than enough for sending a 16-bit temperature reading from the Z1's built-in TMP102 temperature sensor.

The send command expects an even number of hexadecimal digits, meaning that if you send 0x123 the modem will return an error, as it expects something like 0x0123. All data should be sent encoded as hexadecimal, and leading zeroes might be needed depending on your data format.
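Zero-padding to an even number of digits is easy with printf; for example, a raw 16-bit reading of 456 must go out as "01c8", never the odd-length "1c8":

```shell
# encode a 16-bit raw reading as a zero-padded hex payload for AT$SS
reading=456
payload=$(printf '%04x' "$reading")
echo "AT\$SS=$payload"   # -> AT$SS=01c8
```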

After wiring my temperature sensor callbacks to be forwarded to SIGFOX, the following messages were received, showing the mote was close to 28.5ºC at the moment of the test. Most of my messages are being sent close to the 868MHz frequency; maybe SIGFOX operates solely on the G1 sub-band?

Wrapping up...

If you'd like to outgrow the demo, you could add more motes with radio communication and use the TD120x-enabled mote as a gateway with a dual wireless interface (sub-1GHz and 2.4GHz), forwarding messages to SIGFOX while using other network topologies and protocols locally, not limited by SIGFOX's packet size/throughput: only pushing important data and events such as alarms, periodic readings and others.

So this is the end of the post. I still need to test the callbacks and customise the TD120x using the SDK, but this post has grown too much already, so I'll leave more for later.