Archive

Most Kodi users are now running Kodi 17.x “Krypton”, initially released in February 2017, with Kodi 17.6 being the latest point release. At the time of the Krypton release, the developers had also started working on Kodi 18 “Leia”, which should now be in alpha, and the stable release may only be a few months away, although Kodi developers do not provide ETAs.

What they did provide however – via Martijn Kaijser at FOSDEM 2018 – is a progress report for Kodi 18 “Leia”, as well as some insights into Kodi 19 whose development has just started.


Kodi 18 has gone through a lot of cleanup, with the code upgraded to the C++11 standard, duplicate code and obsolete libraries removed, unmaintained features dropped, and so on. They also moved non-core features such as audio encoders and decoders, PVR, picture decoding, etc… to external plugins. This work resulted in 299,476 deleted lines of code, and 387,205 added lines of code in Kodi v18 alpha.

Some of the key developments and new features you can expect in Kodi 18 include:

Xbox One support with Microsoft’s help

Improvements to the core VideoPlayer with easier to maintain, more portable and efficient code, support for DRM protected streams (e.g. Widevine), and potentially future support for PiP, headless mode, and transcoder mode.

As mentioned in the introduction, work on Kodi 19 “Mxxxxx” has also started, and one of the changes is dropping support for Python 2 add-ons, so every add-on will have to move to Python 3. Watch the video for the full picture.
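For add-on authors wondering what the move implies, here are a few of the standard Python 2 to 3 behavior changes that typically require code updates (a generic illustration, not Kodi-specific code):

```python
# Behaviour differences that commonly break Python 2 code on Python 3:
print(7 / 2)    # true division in Python 3: 3.5 (Python 2 returned 3)
print(7 // 2)   # floor division works the same in both: 3
s = "café"      # str is unicode by default in Python 3
print(len(s))   # 4 characters, not the 5 bytes of its UTF-8 encoding
```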

So I decided to go with yet another firmware, and this time, I played with MicroPython on ESP32, and will report my experience with basic commands, controlling GPIOs, and WiFi in this getting started post.

I’ll be using Ubuntu 16.04 for the instructions, which should be pretty similar for other Linux distributions, especially the ones based on Debian, and if you’re using Windows 10, you should be able to follow the same instructions after installing Windows Subsystem for Linux with Ubuntu on your computer.

As a side note, version 2.1 of esptool does not know about ESP32-PICO-D4, but it can still detect an ESP32 device, and the update went through normally.

Hello World Sample / Boot Log with MicroPython

We can test the firmware by connecting to the board using minicom, screen, putty, or whatever software you feel most comfortable with. I went with minicom, and set up a connection to the /dev/ttyUSB0 device at 115200 bps. I immediately tested the print function, and performed a hard reset to check out the boot log:

Shell

minicom --baudrate 115200 --device /dev/ttyUSB0

Welcome to minicom 2.7

OPTIONS: I18n
Compiled on Feb 7 2016, 13:37:27.
Port /dev/ttyUSB0, 14:13:04

Press CTRL-A Z for help on special keys

>>> print("ESP32 PICO Core says Hello!!!")
ESP32 PICO Core says Hello!!!
>>> import machine
>>> machine.reset()

The reset command will first generate some error messages, before rebooting the board:

LED Blink Sample with MicroPython

The easiest way to test GPIOs is to connect an LED, since the board does not have any user LED, only the power LED. So I connected an LED to pin 21 via a transistor to ensure enough current passes through it.

Controlling the LED in the command line interface is easy. Import the machine library, set the pin to output, and change the pin level as needed:

Python

>>> import machine
>>> pin21 = machine.Pin(21, machine.Pin.OUT)
>>> pin21.value(1)
>>> pin21.value(0)

Success! But what about doing a proper blink sample? The MicroPython developers’ official PyBoard shows up as a USB mass storage drive on your computer, where you can copy Python files like boot.py and main.py, but in the case of ESP32 PICO Core, it appears the only option is to use the serial console for programming, as we can’t simply copy files to the board from the host computer.

I installed the missing module, but the error remained, so instead I installed the tool for Python 3:

Shell

sudo pip install adafruit-ampy --upgrade
ampy --version
ampy, version 1.0.2

I then created blink.py on my computer to blink the LED every 500 ms:

Python

import utime
import machine

pin21 = machine.Pin(21, machine.Pin.OUT)

while True:
    pin21.value(1)
    utime.sleep_ms(500)
    pin21.value(0)
    utime.sleep_ms(500)

Before uploading the file to the board, you can try to run it as follows:

Shell

ampy --port /dev/ttyUSB0 run blink.py

If you have plenty of errors here, that’s probably because your code is incorrect. Since I’m not very familiar with Python, it happened to me a couple of times, until I got the code right, and the LED was blinking as expected.
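One way to cut down on those trial-and-error round-trips (my own habit, not from the article): since MicroPython’s grammar closely follows CPython’s, you can compile the script on the host first to catch syntax errors before uploading:

```python
# Syntax-check a MicroPython script with the host's CPython before uploading.
# compile() only parses the code, so board-only modules like 'machine' are
# no problem. The source below mirrors the blink.py created earlier.
source = """\
import utime
import machine

pin21 = machine.Pin(21, machine.Pin.OUT)
while True:
    pin21.value(1)
    utime.sleep_ms(500)
    pin21.value(0)
    utime.sleep_ms(500)
"""

try:
    compile(source, "blink.py", "exec")
    print("blink.py: syntax OK")
except SyntaxError as e:
    print("syntax error on line %s: %s" % (e.lineno, e.msg))
```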

Now that we’ve made sure the code works, we can copy our sample to the board…

Shell

ampy --port /dev/ttyUSB0 put blink.py

… reconnect to the serial console, and verify the file is there:

Python

>>> import os
>>> os.listdir()
['boot.py', 'blink.py']

To run the program type the following:

Python

>>> import blink

The LED should blink again. You can interrupt the program with Ctrl+C, and if you want to soft reset the board, press Ctrl+D.

In order to automatically start the blink program at each boot, rename blink.py to main.py, delete blink.py, and copy main.py instead:

Shell

mv blink.py main.py
ampy --port /dev/ttyUSB0 rm blink.py
ampy --port /dev/ttyUSB0 put main.py

Power cycle the board, and the LED should start blinking almost immediately.

ESP32 WiFi with MicroPython (Station and AP modes)

We’ve got GPIOs working, but one of the most important features of ESP32 is obviously WiFi. I’ll start by configuring the board in station mode. First import the network library, set the board to station mode, and scan access points:

Python

import network
sta_if = network.WLAN(network.STA_IF); sta_if.active(True)
sta_if.scan()

The latter should return a list of access points with ssid, bssid, channel, RSSI, authmode, and hidden status as explained here.
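Each entry returned by scan() is a tuple in that field order. Here is a short sketch of how you might pretty-print the results (plain CPython with made-up scan data, since the actual values depend on your environment):

```python
import binascii

# Made-up example of what sta_if.scan() returns on the board:
# (ssid, bssid, channel, RSSI, authmode, hidden), per the MicroPython docs.
scan_results = [
    (b"CNX-TRANSLATION", b"\xaa\xbb\xcc\xdd\xee\xff", 6, -55, 3, False),
    (b"neighbour-ap", b"\x11\x22\x33\x44\x55\x66", 11, -80, 4, True),
]

for ssid, bssid, channel, rssi, authmode, hidden in scan_results:
    print("%-20s ch=%-2d %4d dBm bssid=%s%s" % (
        ssid.decode(), channel, rssi,
        binascii.hexlify(bssid).decode(),
        " (hidden)" if hidden else ""))
```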

It works as expected, but we wrote the HTML code inside the Python file, and you need to handle socket programming by yourself. To further simplify the task, MicroPython web servers such as MicroWebSrv and Picoweb are available.
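Under the hood, that “do it yourself” approach looks something like the sketch below (plain CPython for illustration, with names like serve_once being mine rather than the article’s; MicroPython’s socket module on the board behaves similarly):

```python
import socket
import threading
import urllib.request

# Hand-rolled socket handling with the HTML embedded in the Python file --
# the approach that libraries like picoweb and MicroWebSrv replace.
RESPONSE = (b"HTTP/1.0 200 OK\r\nContent-Type: text/html\r\n\r\n"
            b"<p>Hello World!!!</p>")

def serve_once(server):
    conn, _ = server.accept()   # wait for one client
    conn.recv(1024)             # read (and ignore) the HTTP request
    conn.sendall(RESPONSE)      # send the hard-coded page
    conn.close()

server = socket.socket()
server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
server.bind(("127.0.0.1", 0))   # port 0: let the OS pick a free port
server.listen(1)
port = server.getsockname()[1]

t = threading.Thread(target=serve_once, args=(server,))
t.start()
body = urllib.request.urlopen("http://127.0.0.1:%d" % port).read()
t.join()
server.close()
print(body.decode())
```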

MicroWebSrv (Not working yet for me)

I tried to install MicroWebSrv first, but never managed to make it work. I’ll still reproduce the steps I followed, in case somebody finds out what I did wrong. I got the code, and copied the files from the Linux terminal:

Shell

git clone https://github.com/jczic/MicroWebSrv
cd MicroWebSrv
ampy --port /dev/ttyUSB0 put microWebSocket.py
ampy --port /dev/ttyUSB0 put microWebTemplate.py
ampy --port /dev/ttyUSB0 put microWebSrv.py
ampy --port /dev/ttyUSB0 mkdir flash
ampy --port /dev/ttyUSB0 put www /flash/www

We can check the files are where they are supposed to be:

Shell

ampy --port /dev/ttyUSB0 ls
boot.py
microWebSocket.py
microWebTemplate.py
microWebSrv.py
flash
ampy --port /dev/ttyUSB0 ls flash/www
wstest.html
style.css
test.pyhtml
pdf.png
favicon.ico
test.pdf

Go into the terminal (aka REPL console) to start a basic example, after setting up a connection:

Python

from microWebSrv import MicroWebSrv
mws = MicroWebSrv()  # TCP port 80 and files in /flash/www
mws.Start()          # Starts server in a new thread

I could connect to the server, but I would always get a 404 error.

PicoWeb

So instead I switched to picoweb, adapting the instructions here and there. It’s very easy to install. First make sure you have a working Internet connection on your board (i.e. station mode configured), and install the web server with upip:

Now let’s go back to the host computer to create an html document, for example index.html:

XHTML

<!DOCTYPE HTML>
<html lang="en">
<head>
<meta charset="utf-8">
<title>Picoweb on ESP32 PICO Core</title>
</head>
<body>
<p>Hello World!!!</p>
<p><b>Hello World in Bold!!!</b></p>
</body>
</html>

as well as a picowebtest.py sample file that will serve the HTML page from the board, returning it to the client.

Python

import picoweb

app = picoweb.WebApp(__name__)

@app.route("/")
def index(req, resp):
    yield from picoweb.start_response(resp, content_type="text/html")
    htmlFile = open('index.html', 'r')
    for line in htmlFile:
        yield from resp.awrite(line)

app.run(debug=True, host="192.168.0.108")

You’ll need to replace “192.168.0.108” with the IP address of your board.

Let’s copy both files to the board…

Shell

ampy --port /dev/ttyUSB0 put index.html
ampy --port /dev/ttyUSB0 put picowebtest.py

… go back to the serial console, connect in station mode, and run the sample:

Python

import network
sta_if = network.WLAN(network.STA_IF); sta_if.active(True)
sta_if.connect("CNX-TRANSLATION", "password")
import picowebtest
I (100468) modsocket: Initializing
* Running on http://192.168.0.108:8081/

Type or copy/paste the URL in the last line into a web browser, and you should get the output below.

ESP32 Bluetooth with MicroPython

There’s no Bluetooth support in the official MicroPython documentation yet, because it’s work in progress, but for the most adventurous, MrSurly released an alpha version a few days ago. The Bluetooth API is also in flux, but the basic code to enable Bluetooth should look like:

Python

import network
bluetooth = network.Bluetooth()

I’ll update that section once Bluetooth makes it to the stable release, and/or when I’m sure the API is frozen.

Other ESP32 (Micro)Python Resources

I’ve just covered a few things that can be done with MicroPython on ESP32, and besides the official documentation, you can also check the various MicroPython ESP32 tutorials on the techtutorialsx blog. Loboris also made another MicroPython ESP32 firmware that supports pSRAM, as MicroPython may use a lot of RAM. Zerynth is another option for Python on ESP32, which works with an IDE/GUI available for Windows, Linux and Mac OS X. [Update: Yet other options are Pumbaa, a port of MicroPython running on top of Simba, and Pycom’s version of MicroPython]

Hardware based on 96Boards specifications may not sell in the same numbers as Raspberry Pi or Orange Pi boards, but it is heavily used by Linaro members and other developers working on bleeding edge software. More and more companies are designing boards compliant with the standard, and several new mezzanine expansion boards, such as Secure96, were showcased at Linaro Connect SFO 2017, and have yet to show up on the 96Boards Mezzanine page.

Another 96Boards mezzanine expansion board in development is Dragonwally, designed for stereoscopic computer vision, currently used with the DragonBoard 410c board, and targeting applications such as object recognition, people counting, access control, or driver identification and safety.

DragonWally DW0 board specifications:

MIPI DSI interface with high speed connector

2x 5MP cameras

1x USB port

96Boards CE compliant

The two Brazilian developers working on the project interfaced it with a DragonBoard 410c running Linaro Debian, using OpenCV and Python for computer vision development. To demonstrate the capability of the board, they added a touchscreen display for a demo leveraging the Amazon Rekognition API for face recognition and camera distance estimation.

This summer I discovered the Hologram global cellular IoT SIM card, and since they provided free developer samples with 2MB of monthly data included, I decided to get one to try it out. I received it a few weeks later, and to my surprise it worked, despite my country of residence having some strict requirements with regards to SIM card registration. The SIM card uses roaming, but with low fixed worldwide pricing, and does not come with a phone number by default, so maybe that’s why I did not have to register.

The company is now back with Nova, an open source hardware cellular modem certified by OSHWA (ID #US000077). It’s basically a 2G/3G USB dongle that’s controlled by the Hologram Python SDK, specifically suited to Debian systems like Raspberry Pi 3 or BeagleBone Black. Hackster.io is also involved in the launch with a worldwide contest offering 200 free kits comprised of the Nova 3G USB dongle and a Raspberry Pi Zero W board for the best project ideas leveraging cellular IoT.

The dongle can be controlled using the Hologram client tool, or the Hologram Python SDK requiring the ppp and Python 2.7 packages, and will allow you to send SMS, set up a data connection, and more. Any SIM card should work, as it’s not tied to Hologram’s SIM card. While the company claims OSHWA certification, the number US000077 is not present (empty line) in the OSHWA certification list yet, and so far, they’ve only released the PDF schematics. However, the Python SDK is fully open source and released under an MIT license on GitHub.

But as mentioned in the introduction, if you have a great project idea, you could also get the kit for free, and possibly another “grand prize” (Apple Watch Series 3) once the project is completed. The contest is open worldwide (except to US sanctioned countries) with the following timeline:

Submit your proposal by October 27, 2017

Best project ideas will be selected, and their kits sent out within around 14 days

Build and submit your project to Hackster.io by January 5, 2018

8 Grand Prize winners will be announced on January 8, 2018 for four categories: gateway, asset tracking, remote controlling, and remote monitoring.

NanoPi NEO 2 is a tiny 64-bit ARM development board powered by an Allwinner H5 processor. FriendlyELEC sent me a couple of NEO 2 samples together with their BakeBit Starter Kit, with a NanoHat and various modules connecting via GPIOs, analog input or I2C. I’ve already tested both Armbian with Linux 4.11 and Ubuntu Core Qt with Linux 3.10, and ran a few benchmarks on NanoPi NEO 2. You would normally prefer the Armbian image with mainline Linux since it provides better performance, but at the time I was told GPIO support was not there.

Configuring NanoPi NEO 2 board with BakeBit library

So this week-end, when it came time to test GPIO support with the BakeBit Starter Kit, I decided to follow this advice, especially since the nanopi-neo2-ubuntu-core-qte-sd4g-20170329.img.zip image is still the recommended one in the Wiki. So I went with that image.

I’ll use Python examples from the Bakebit library, but if you prefer something similar to WiringPi, you may consider using the WiringNP library directly instead of Bakebit. Since NanoHat Hub comes with headers for digital I/O (including 2 PWM), analog input, I2C and UART interfaces, I’ll make sure I try samples for all the interfaces I have hardware for. FriendlyELEC did not include a module with a UART interface, so I’ll skip that one.

I followed instructions in BakeBit wiki from a terminal which you can access from the serial console or SSH. First, we need to retrieve the source code:

Now we can try to build the kernel for NanoPi NEO 2 (and other Allwinner H5 boards).

Shell

cd lichee/fa_tools/
./build.sh -b nanopi-neo2 -p linux -t kernel

and it failed with more errors, possibly related to the CROSS_COMPILE flag. There must be a better solution… FriendlyELEC guys might not work on Saturday afternoon, so while I did contact them, I decided to try one of their more recent images with Linux 4.11 available here.

Let’s pick nanopi-neo2_ubuntu-core-xenial_4.11.0_20170518.img.zip since it has a similar name, and is much newer (released 3 days ago). I repeated the installation procedure above, and …

Shell

sudo python bakebit_sound_sensor.py
sensor_value = 0
sensor_value = 57
sensor_value = 59

Success! Albeit after 4 to 5 hours of work… Let’s connect some hardware to find out whether it actually works, and not just runs.

Analog Input and Digital Output – Sound Sensor Demo

The simplest demo would be to use the LED module, but let’s do something more fun with the Sound Sensor demo I found in the BakeBit Starter Kit printed user’s manual, which will allow us to use both digital output, with the LED module connected to the D5 header, and analog input, with the Sound Sensor module connected to the A0 header. Just remember the long LED pin is the positive one.

You can run the code as follows:

Shell

cd ~/BakeBit/Software/Python
sudo python bakebit_sound_sensor.py

I changed the source a bit including the detection threshold, and timing to make it more responsive:

Python

import time
import bakebit

# Connect the BakeBit Sound Sensor to analog port A0
# SIG,NC,VCC,GND
sound_sensor = 0
# Connect the BakeBit LED to digital port D5
# SIG,NC,VCC,GND
led = 5

bakebit.pinMode(sound_sensor, "INPUT")
bakebit.pinMode(led, "OUTPUT")

# The threshold to turn the led on 300.00 * 5 / 1024 = 1.46v
threshold_value = 300

while True:
    try:
        # Read the sound level
        sensor_value = bakebit.analogRead(sound_sensor)
        # If loud, illuminate LED, otherwise dim
        if sensor_value > threshold_value:
            bakebit.digitalWrite(led, 1)
            time.sleep(1)
        else:
            bakebit.digitalWrite(led, 0)
        print("sensor_value = %d" % sensor_value)
        time.sleep(.01)
    except IOError:
        print("Error")

The LED will turn on each time the sound level (actually the analog voltage) rises above 1.46V.
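The 1.46V figure comes straight from the threshold comment in the code: raw ADC counts scaled by the reference voltage. A quick sanity check in plain Python (assuming, as the sample does, a 10-bit ADC and a 5V reference):

```python
def adc_to_voltage(raw, vref=5.0, resolution=1024):
    """Convert a raw ADC count to volts (10-bit ADC, 5 V reference assumed)."""
    return raw * vref / resolution

# The demo's threshold of 300 counts:
print(round(adc_to_voltage(300), 2))   # -> 1.46
```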

PWM and Analog Input – Servo and Rotary Angle Sensor Demo

We can test PWM output using the Servo module connected to the D5 header, and control it using the rotary angle sensor module connected to the A0 analog input header.


The sample for the demo runs fine, and use of the potentiometer is detected:

Shell

sudo python bakebit_prj_Servo_And_RotaryAngleSensor.py
sensor_value = 73 voltage = 0.36 degrees = 12
sensor_value = 107 voltage = 0.52 degrees = 18
sensor_value = 282 voltage = 1.38 degrees = 49
sensor_value = 491 voltage = 2.40 degrees = 86
sensor_value = 656 voltage = 3.21 degrees = 115
sensor_value = 672 voltage = 3.28 degrees = 118
sensor_value = 622 voltage = 3.04 degrees = 109
sensor_value = 371 voltage = 1.81 degrees = 65

However, the servo is not moving at all. Raspberry Pi relies on raspi-config to enable things like I2C and other I/Os, and I noticed npi-config in the Wiki for NEO 2. So I ran it, and sure enough PWM was disabled.

So I enabled it, and answered Yes when I was asked to reboot. The only problem is that it would not boot anymore, with the system blocked at:

Shell

reading sun50i-h5-nanopi-neo2.dtb
19229 bytes read in 30 ms (625 KiB/s)
## Flattened Device Tree blob at 48000000
Booting using the fdt blob at 0x48000000
Loading Ramdisk to 49b00000, end 4a000000 ... OK
Loading Device Tree to 0000000049af8000, end 0000000049affb1c ... OK
Starting kernel ...

Maybe something went wrong during the process, so I re-flashed the Ubuntu image, reinstalled BakeBit, and re-enabled PWM0. But before rebooting, I checked the boot directory, and noticed boot.cmd, boot.scr, and the device tree file (sun50i-h5-nanopi-neo2.dtb) had been modified. The DTB looks fine, as I could decode it, and find the pwm section:

Let’s reboot the board. Exact same problem with the boot stuck at “Starting kernel…”. So there’s something wrong with the way npi-config modifies one or more of the files. With hindsight, I should have made a backup of those three files before enabling PWM the second time… I’ll give up on PWM for now, and ask FriendlyELEC to look into it.

I2C and Analog Input – OLED UI controlled with Joystick

For the final test, I’ll use the I2C OLED display module connected to one of the I2C headers, together with the analog joystick module connected to the A0 header.


Let’s run the sample for the demo:

Shell

python bakebit_prj_UIControl_via_Joystick.py
('x =', 538, ' y =', 541, ' opIndex=', 0)
('x =', 538, ' y =', 541, ' opIndex=', 0)
('x =', 521, ' y =', 1023, ' opIndex=', 1)
left
('x =', 0, ' y =', 769, ' opIndex=', 1)
left
('x =', 518, ' y =', 199, ' opIndex=', 0)
('x =', 518, ' y =', 1023, ' opIndex=', 1)

It works, but there’s a bit of a lag, and the sample may have to be improved to better detect various states. I’ll show what I mean in the video below.

The bad parts are that the documentation is not up-to-date, enabling PWM will crash the image, and while the Python samples do demonstrate the I/O capabilities, they should probably be improved to be more responsive. The good part is that we’re getting there, the hardware kit is a really nice one, and I think the documentation and software should become much better in June, as FriendlyELEC has shown itself to be responsive to community issues.

What? Python sucks? You can use C language with GPIOs too

If Python is not your favorite language, FriendlyELEC also provided some C language samples in the C directory:

Shell

pi@NanoPi-NEO2:~/BakeBit/Software/C$ ls
bakebit_analog_read.c   bakebit_digital_read.c   README.md
bakebit_analog_write.c  bakebit_digital_write.c
bakebit.c               bakebit.h

As we’ve seen above, Bakebit library appears to rely on WiringNP, and you’d normally be able to list the GPIOs as follows:

Shell

sudo gpio readall
piBoardRev: Unable to determine board revision from /proc/cpuinfo
-> Is not H3 based board
-> You may want to check:
-> http://www.lemaker.org/

The utility is not too happy about seeing an Allwinner H5 board. But maybe the library on the board is not up-to-date, so I built it from source:

Excellent! It’s not quite a work-out-of-box experience, but NanoPi NEO 2 can be used with (most) GPIOs.

My adventures with NanoPi NEO 2 board are not quite done, as I still have to play with NanoHat PCM5102A audio add-on board, which I may end up combining with a USB microphone to play with Google Assistant SDK, and I’m expecting NanoPi NAS Kit v1.2 shortly. I’ll also update this post once PWM is working.

The Eclipse foundation has recently run its IoT Developer Survey, answered by 713 developers, asking about IoT programming languages, cloud platforms, IoT operating systems, messaging protocols (MQTT, HTTP), IoT hardware architectures and more. The results have now been published. So let’s have a look at some of the slides, especially with regards to programming languages and operating systems, bearing in mind that IoT is a general term that may apply to sensors, gateways and the cloud, so the survey correctly separated languages for different segments of the IoT ecosystem.


C and C++ are still the preferred languages for constrained devices, and developers are normally using more than one language as the total is well over 100%.


IoT gateways are more powerful hardware with more resources (memory/storage), so it’s no surprise that higher level languages like Java and Python join C and C++, with Java being the most used language at 40.8% of respondents.


When it comes to the cloud with virtually unlimited resources, and no need to interface with hardware in most cases, higher level languages like Java, JavaScript, Node.js, and Python take the lead.


When it comes to operating systems in constrained IoT devices, Linux takes the lead with 44.1%, in front of bare metal (27.6%) and FreeRTOS (15.0%). Windows is also there in fourth place, probably with a mix of Windows IoT Core, Windows Embedded, and WinCE.


Linux is the king of IoT gateways, with 66.9% of respondents using it, far ahead of Windows in second place with 20.5%. There is no chart for the cloud, probably because users just don’t run their own cloud servers, but rely on providers instead. They did ask specifically about the Linux distributions used for IoT projects, and the results are a bit surprising, with Raspbian taking the lead at 45.5%, and Ubuntu Core following closely at 44.4%.


Maybe Raspbian has been used during the prototyping phase or for evaluation, as most developers (84%) have been using cheap development boards like Arduino, BeagleBone or Raspberry Pi. 20% also claim to have deployed such boards in IoT solutions.


That’s only a few slides of the survey results, and you’ll find more details about Intel/ARM hardware share, messaging & industrial protocols, cloud solutions, wireless connectivity, and more in the slides below.

You’ve probably already seen one or more object recognition demos, where a system equipped with a camera detects the type of object using deep learning algorithms, either locally or in the cloud. It’s for example used in autonomous cars to detect pedestrians, pets, other cars and so on. Kochi Nakamura and his team have developed software based on the GoogleNet deep neural network with a 1000-class image classification model, running on Raspberry Pi Zero and Raspberry Pi 3, and leveraging the VideoCore IV GPU found in the Broadcom BCM283x processor in order to detect objects faster than with the CPU, more exactly about 3 times faster than using the four Cortex A53 cores in RPi 3.

They just connected a battery, a display, and the official Raspberry Pi camera to the Raspberry Pi boards to be able to recognize various objects and animals.

FriendlyElec (previously FriendlyARM) launched the NanoPi NEO and then the NanoPi NEO Air board as, respectively, Ethernet and WiFi/Bluetooth connected boards for IoT applications. But so far, there was no ecosystem around the boards: you had to use your own sensor modules, and write your own software to control them. This has now changed with the launch of the BakeBit Starter Kit with twelve sensor modules, a NanoHat Hub add-on board designed for NanoPi boards, as well as the BakeBit Library to control the hardware.

NanoPi NEO with NanoHat and Two Modules

The NanoHat Hub plugs into the two NanoPi NEO headers and provides 12 headers with 3x I2C interfaces, 3x analog interfaces, 2x UART interfaces, and 4x digital interfaces, among which D3 and D5 support PWM, compatible with SeeedStudio Grove modules. You then have a choice of 12 modules to connect to the NanoHat Hub:

OLED Module

Ultrasonic Module

Green LED Module

Red LED Module

LED Bar Module

Rotary Angle Sensor Module

Joystick Module

Sound Sensor Module

Button Module

Light Sensor Module

Buzzer Module

Servo Module

BakeBit Starter Kit

But now that you have your hardware setup with multiple modules, you still need to program the thing, and that’s where the BakeBit library, based on GrovePi, comes into play, as it allows you to program the modules easily in Python. More details can be found in the Wiki for BakeBit NanoHat and modules.

The BakeBit Starter Kit is now sold for $29.99 (promotion), but if you already have Grove modules, you could also simply purchase the NanoHat Hub for $12.99. Bear in mind that Chinese New Year is around the corner, so any order placed after January 24th will be processed after the holidays, around February 6th. [Update: The company has also released a $9.99 NanoHat PCM5102A audio board for NanoPi boards]