Category: Hub Software

I am using a Pi 3 Model B with the Voice HAT attached, assembled as per the instructions in The MagPi magazine. Internet access is required if you want to use the Google service or another cloud offering for Speech-To-Text, and network access is obviously required to make REST calls to our home-hub.

The published AIY design requires the big green button to be pushed (or a hand clap) to activate listening mode. Most users want a wake-word, as with Alexa, OK Google, Hey Siri, etc. I was attracted by the Snowboy Hotword Detection Engine, and have recorded a hotword for my system. I still use the green LED as a listening indicator.

The listening mode is a contentious subject. On the one hand, we do not want voice assistants streaming all our conversations to the cloud; on the other, with a dedicated voice appliance like the hub-vox, repeatedly issuing wake commands becomes tedious. Snowboy, being a local detector, gives us control over this balancing act.
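As a sketch of how the Snowboy Python bindings are typically wired up — the model path, sensitivity value and indicator class are illustrative, not the hub-vox code itself:

```python
# Sketch of a hotword listener with a listening-state indicator that a
# green LED could mirror. Assumes the Snowboy Python bindings are
# installed and a personal model (hotword.pmdl) has been recorded.

class ListeningIndicator:
    """Tracks listening state so the green LED can mirror it."""

    def __init__(self):
        self.listening = False

    def on_hotword(self):
        # Invoked by the detector each time the hotword is heard
        self.listening = True

    def on_done(self):
        # Invoked once the utterance has been captured and processed
        self.listening = False


def run_detector(model_path="hotword.pmdl"):
    """Block on the microphone, firing the callback on each hotword."""
    import snowboydecoder  # ships with the Snowboy distribution

    indicator = ListeningIndicator()
    detector = snowboydecoder.HotwordDetector(model_path, sensitivity=0.5)
    detector.start(detected_callback=indicator.on_hotword, sleep_time=0.03)
```

In the real appliance the two callbacks are where you would switch the LED on and off via the Voice HAT's GPIO.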

Be warned that there is a lot of software to install for the hub-vox, but a script has been written which will complete the task on a plain vanilla Raspbian Stretch Lite operating system. If you prefer to install each component separately, you can copy and paste commands from the script to the command line.

Fetch the script…

wget http://www.warrensoft.co.uk/home-hub/linuxscripts/vox/setup.sh

edit the script to customise the settings…

nano setup.sh

make the script executable…

sudo chmod +x setup.sh

and then run it…

sudo ./setup.sh

Here is a summary of the main components installed:

Samba – for managing our code

Vox code – from the project repository

Voice Hat drivers – for microphones and speaker

Snowboy – hotword detection

PyAudio

Python Speech Recognition

Google API Python client – STT

FLAC, Pico2wave, SoX – TTS

In the next post we will assemble a test list of utterances and pipe that through the Phrase Processor.

In a previous post we developed a simple API facility for home-hub slaves to enable remote control. The voice interface will use this API for controlling actuators, but it needs a much richer interface to provide a full interactive voice service.

The following instructions detail how to modify your home-hub to add a full REST API. This will allow us to read data from any table or view in the hub’s database.

Our REST commands will be of the form:

http://home-hub/rest/zones

Normally, the Apache web server would throw a page-not-found error, but with some additional configuration we can direct all such requests to a specific service page.
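With Apache's mod_rewrite, for instance, the redirect might be sketched as follows — the rest.php target name here is illustrative, not the project's actual service page:

```
RewriteEngine On
# Send anything under /rest/ to a single service page,
# passing the remainder of the path as a query parameter
RewriteRule ^rest/(.*)$ /rest.php?path=$1 [L,QSA]
```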

The same warning about security applies here. The REST API has no authentication built in. It provides an access-most-areas pass to inspect (but not change) all the hub’s database contents, so if your hub is internet-facing you are strongly advised to add your own security measures.

When you have all your sensors, actuators and impulses configured, have defined all your rules and have collected several months’ worth of statistical data, the last thing you want is for a crash or gremlin to corrupt your database. The software can be rebuilt with relative ease, but restoring the configuration might not be so easy.

What we require is a nightly database backup, to a separate location that will be safe should our Pi come to any harm. I am using a shared directory on my NAS drive, but a small USB drive would be just as good.

If you want to set this up, add the following commands to your crontab scheduler using sudo crontab -e:

/mnt/piback is the mounted directory for my installation, which you may need to tailor for your own setup. There are many tutorials on the web which explain how to add an external drive to the Pi. The command will produce a rolling backup which will wrap around every fortnight:
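As a sketch of such a crontab entry — assuming a MySQL hub database named hubdata and the /mnt/piback mount; your database name, credentials and schedule will differ:

```
# Run at 02:00 every night; the file name cycles 0-13, so a fortnight
# of backups is kept before the oldest is overwritten
0 2 * * * mysqldump hubdata > /mnt/piback/hubdata_$(expr $(date +\%j) \% 14).sql
```

Note that the % characters must be escaped as \% inside a crontab, as unescaped percent signs are treated as line breaks.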

If disaster should strike you can restore to a known good point.

In a future series of posts I will be investigating voice control for the hub, using integration with the Google AIY Voice Kit.

The Amazon Dash is a self-contained, WiFi-enabled remote control button, available to Amazon Prime members, that we can interface with our home hub as an Impulse button. The advantage is that we don’t require physical wiring or additional electronics – we just sniff the network looking for the signature MAC address of the button. The beauty of this adaptation is that we can implement hybrid Impulses that respond either to fixed buttons connected to GPIO pins or to Dash buttons free-floating on the network.

The process to configure the button, without actually committing yourself to the repeated purchase, is well documented elsewhere. The key piece of information we require is the button’s MAC address. I was able to glean this from my router log, or you can try the Python test program sniff_test.py below.
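A minimal sniffer along the lines of sniff_test.py might look like this — a sketch assuming a Linux Pi and root privileges (raw sockets need them); the parsing helper is split out only for clarity:

```python
# Print the source MAC of every Ethernet frame seen on the network, so
# pressing the Dash button should surface a new, unfamiliar address.
import socket

ETH_P_ALL = 0x0003  # capture every protocol


def source_mac(frame: bytes) -> str:
    """Extract the source MAC (bytes 6-11 of the Ethernet header)."""
    return ":".join("%02x" % b for b in frame[6:12])


def sniff():
    s = socket.socket(socket.AF_PACKET, socket.SOCK_RAW,
                      socket.htons(ETH_P_ALL))
    seen = set()
    while True:
        frame, _ = s.recvfrom(65535)
        mac = source_mac(frame)
        if mac not in seen:  # report each address only once
            seen.add(mac)
            print(mac)
```

Call sniff() as root, then press the button and watch for the newcomer.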

and you should see a list of MAC addresses from devices on your network. Wait a while to hoover up non-Dash packets, then press the button and look for a different MAC address. You may see double entries for the Dash, but don’t worry, as this will be handled by our existing software debounce routine.

Once we have discovered the MAC address we need to add it to the existing Impulse configuration.

The sniffing is performed by a separate thread in the hub main program. Download the new main_sched.py file:

Now it should be safe to connect your sensor module to the Pi. If you do this before completing the configuration you may have connection issues, as the Pi will interpret your module’s output as a login attempt. Note that the Pi TXD (BCM 14) connects to the module’s Rx line and the Pi RXD (BCM 15) connects to the module’s Tx line. The remaining connections are just 3V3 power and ground. I recommend you shut down and power off before making these connections.

Add a new sensor using the website Organisation page, and set the sensor function to:

bme280./dev/serial0.4800.0

This function is constructed from 4 parts:

helper . serial-port . baudrate . channel

The serial port is /dev/serial0 which is an alias to the ttyAMA0 port we are connected to.

The baud rate is fixed in the PICAXE program at 4800, which works well in my setup.
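The four-part function can be unpacked with a simple split, as in this sketch (the function name is illustrative, not the hub's actual parser); splitting on '.' works because /dev/serial0 itself contains no dots:

```python
# Decode a sensor function of the form helper.serial-port.baudrate.channel
def parse_sensor_function(func: str):
    """Split the dotted sensor function into typed fields."""
    helper, port, baud, channel = func.split(".")
    return helper, port, int(baud), int(channel)
```

So parse_sensor_function("bme280./dev/serial0.4800.0") yields ("bme280", "/dev/serial0", 4800, 0).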

Add a new sensor via the Organisation option of the website, and populate the SensorFunction as discussed previously. When displayed in Current Values the reading will be time-stamped with the applicable time:

The meteorology facilities give you a wide range of possibilities for implementing intelligent control algorithms within your hub. Next we will look at an alternative hardware sensor.

Once registered you will be allocated an Application Key, which is required for all queries. In addition to your application key, you will need to know the location id for your nearest monitoring site. A list of locations can be obtained by running the following query in a browser:
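At the time of writing, the Met Office documents the site-list query in this general form (substitute your own key); treat this as a guide rather than gospel, as the exact URL may have changed:

```
http://datapoint.metoffice.gov.uk/public/data/val/wxfcs/all/xml/sitelist?key=<ApplicationKey>
```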

It is a long list, so you may have to search it to find the nearest location, or use your latitude/longitude coordinates. Once you have your location id, you need to populate two new hub user settings: DataPointKey and DataPointLocation. You can add these in the Organisation page of the website, or use the following commands updated with your own <values>:

If you inspect this XML response you will see the list of available parameters. What we require is the parameter name, e.g. ‘G’ is Wind Gust, ‘Pp’ is Precipitation Probability, etc. This is a good time to add any new Measurands to our hub organisation.

We should now have all the information we require to populate the SensorFunction field of a new sensor:

e.g. meteorology.240.S

This translates to a look-ahead timespan of 240 minutes (4 hours) and the Wind Speed parameter. The hub code will find the nearest forecast to our requested time and read the data. The reading will be time-stamped with the future time.
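The translation can be sketched in a few lines of Python — illustrative names, not the hub's actual code; DataPoint's 3-hourly feed gives forecast steps in minutes:

```python
# Decode e.g. 'meteorology.240.S' and find the closest forecast step
def parse_meteorology(func):
    """Split the sensor function into look-ahead minutes and parameter."""
    _, minutes, param = func.split(".")
    return int(minutes), param


def nearest_step(available_minutes, look_ahead):
    """Return the available forecast step closest to the look-ahead."""
    return min(available_minutes, key=lambda m: abs(m - look_ahead))
```

With 3-hourly steps [0, 180, 360, ...], a 240-minute look-ahead would resolve to the 180-minute forecast, the nearest available.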

If you live in the UK then you have access to the excellent Met Office DataPoint service. To quote their website:

DataPoint is a service to access freely available Met Office data feeds in a format that is suitable for application developers. It is aimed at anyone looking to re-use Met Office data within their own innovative applications, for example professionals, the scientific community, student and amateur developers.

The hub meteorology sensor type allows us to use Met Office readings as if they were connected sensors. This saves us implementing our own physical sensor, which might not be practical, and also gives us access to forecast data without requiring our own super-computer! For example, if we wanted to know the predicted local temperature, to turn on heating in advance, we can use this facility.

In my opinion this is what elevates the Raspberry Pi to super stardom – the symbiotic combination of global data and local control.

In order to use the service you need to register, but it is free. Once registered you will be allocated an Application Key, which is required for all queries. In addition to your application key, you need to know the location id for your nearest monitoring site and the name code of the parameter you want to measure. Here is a checklist of the required tasks:

Register for a DataPoint Account

Collect your Application Key

Use Key in browser query to obtain a list of sites

Find your nearest site in list – note location id

Populate hub user settings for application key and location id

Use Key in browser query to obtain a list of parameters for that site

Look up the parameter name code for measurement you require

Calculate the look-ahead timespan in minutes e.g. 0, 180, 360, etc.

Assemble your sensor function from the look-ahead and parameter name

Install prerequisites for python

Install the meteorology sensor helper from project repository

Restart Controller

Configure new sensor with sensor function

In the next post I will provide sample browser queries required to set things up.

You will have seen references to the slave flag in the main scheduler program. This is a boolean value passed in when the controller is first started up. We have set it to false for all operations to date, as we have been building our master hub, but if we set this flag to true then we would launch the hub in slave mode.

A Slave hub is a cut-down appliance that is focused on reading sensors and driving actuators. It still has all of the software installation of a master hub, but some routines are not used. I will be publishing an installation script and SD image file in the resources section soon, so you don’t have to manually construct another hub. The master hub will talk to the slave over the network, giving you the extended reach for your sensors and actuators. The Pi Zero W is the ideal candidate for implementing a slave hub.

Once you have your second pi up and running, all that is required to make it a slave is the following:

These helpers need to be uncommented in their respective __init__.py files.

We also need the python requests module, via pip:

sudo apt-get install python-pip

sudo pip install requests
sudo pip install --upgrade requests-cache

Restart the controller and add a new sensor, this time with the Sensor Function of the form:

remote.x.y

where x represents the least significant octet of the slave’s IP address, and y represents the remote sensor number. For example, if we had a master hub on 170.30.90.40, and a slave on 170.30.90.41 with a sensor S3, then the sensor function in the master would be:

remote.41.3

Once configured the remote sensor is identical to a local sensor, and can be sampled, alerted, monitored, etc.

An identical approach applies to remote actuators, so the actuator function remote.41.4 would control actuator 4 on the slave hub.
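One plausible way the master resolves a remote.x.y function into a call on the slave, sketched with the standard library — the /api/sensor/ path is a guess at the hub's API layout, not taken from the project:

```python
# Translate a remote.x.y sensor function into a REST call on the slave
from urllib.request import urlopen


def slave_url(master_ip, function):
    """Build the slave API URL for a remote.x.y function."""
    _, host_part, number = function.split(".")
    prefix = master_ip.rsplit(".", 1)[0]  # e.g. '170.30.90'
    return "http://%s.%s/api/sensor/%s" % (prefix, host_part, number)


def read_remote(master_ip, function):
    """Fetch the remote reading over HTTP."""
    with urlopen(slave_url(master_ip, function)) as resp:
        return resp.read()
```

So with the master on 170.30.90.40, the function remote.41.3 resolves to a request against 170.30.90.41 for sensor 3.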

Just a note about security. Remote access to the API page is not secured, unlike the normal website functions. Rather than publish details on a public blog, it is left to the reader to implement whatever mechanism they feel is appropriate. One possible setup is to make just the master hub website accessible over the internet, on a different port, but not the slave hub(s). Consult your router documentation for details.

Next we will look at fetching our readings from slightly further afield.

Now we are collecting data on a regular basis it would be a shame not to record some simple statistics, maximum and minimum values, for each of our measurements. This is what the controller statistics module does.

Then uncomment lines 22, 195 and 206 of main_sched.py to enable the statistics features. Restart the controller.

A summary of the highs and lows appears, not surprisingly, on the Statistics page. In addition, a daily summary of changes can be delivered by email.
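In essence the statistics module maintains running extremes for each measurement; a minimal sketch (illustrative names, not the actual module):

```python
# Track running high/low values and flag changes for the daily summary
class HighLowTracker:
    def __init__(self):
        self.maximum = None
        self.minimum = None

    def update(self, value):
        """Record a reading; return True if a new high or low was set."""
        changed = False
        if self.maximum is None or value > self.maximum:
            self.maximum = value
            changed = True
        if self.minimum is None or value < self.minimum:
            self.minimum = value
            changed = True
        return changed
```

Readings that fall inside the existing range change nothing, so only genuine new highs and lows make it into the email report.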

There are a couple of User Settings we need to take care of to complete the configuration. First, the Admin Recipient: this is the email address of the occupant who will receive the daily summary report. Second, Summary Enabled needs to be set to true.

With the configuration completed, a report will be sent detailing any new highs and lows for the day to the admin user.