Category Archives: Thoughts

People call me many things, some polite, some not, some to my face and, I am sure, some behind my back. One thing I have been accused of being, and that I am most certainly not, is an electronics hobbyist.

Certainly I use electronics, and my extremely limited electronics knowledge, in many of my projects, but I am not interested in the electronics for their own sake. In fact, I cannot think of much of less interest to me. Whilst I can understand the point of building electronics to test and experiment so you can increase your knowledge, this is simply not me. I learn what I need to know to complete a project.

This has come about because one client is utilising Arduinos in an educational setting, and on numerous occasions I have been asked to help out due to my knowledge surrounding them. This is fine whilst they are doing basic functions, you know, hello world kind of things, but so far, for example, I have had no need to learn how to control or manage servos, so when that comes up I am of little use.

What I DO build with electronics are things that I cannot purchase off the shelf. Yes, I know it’s lazy, but like so many others I am time poor. I only build things that I have to build to achieve an outcome I have decided I need; often this is with the goal of some kind of automation, or of reporting on certain states, to save time on otherwise wasted tasks.

One example of this is the Particle Photon and electronic scales I am working on for the measurement and ultimate reporting of the weight of a container of hydrochloric acid that is attached to the pool. The automated systems we have in place around the pool control pH, chlorine-to-salt conversion (through the use of ORP) and temperature on the solar controller. What they do not do, however, is report on the weight of the remaining hydrochloric acid, meaning constant checking of this one component. This project simply uses a Particle Photon (or perhaps ultimately an ESP8266) to read the weight from a scale and report it back, then generate a push message or e-mail when the weight drops to a certain percentage (or percentages) of the original (minus the approximate weight of the container, obviously). This reduces my need to check the system.
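As a rough sketch of what that reporting logic looks like (the tare weight, full weight and thresholds here are made-up examples, and I am using Python rather than Photon firmware for readability):

```python
# Sketch of the alert logic the Photon (or ESP8266) would run.
# The container and full weights below are illustrative assumptions,
# not measurements from my actual setup.

TARE_G = 250.0       # approx. weight of the empty container, grams
FULL_NET_G = 5000.0  # net weight of acid when the container is full

def remaining_percent(gross_g: float) -> float:
    """Percentage of acid remaining, ignoring the container's own weight."""
    net = max(gross_g - TARE_G, 0.0)
    return 100.0 * net / FULL_NET_G

def should_alert(gross_g: float, thresholds=(25.0, 10.0), already_sent=()):
    """Return the threshold percentages crossed but not yet notified."""
    pct = remaining_percent(gross_g)
    return [t for t in thresholds if pct <= t and t not in already_sent]
```

Tracking which thresholds have already triggered stops the system nagging you with a push message on every single reading once the weight is low.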

What this does not make me is an electronics hobbyist; it makes me a maker, or perhaps an assembler, cobbling bits of off-the-shelf hardware and code together to make a task work. A true electronics hobbyist would design the circuits, test them and go for far greater efficiency than I am trying to achieve. The Photon is most definitely overkill for the task at hand in this case, and perhaps an ESP8266 is as well. I do not know, and I do not care; I am after a working “product” at the end that can achieve my rather simple goals.

The Raspberry Pi and other small single board computers have really taken off in the past few years, especially with the burgeoning wave of development, both commercial and, mainly, hobbyist, in the Internet of Things (IoT) arena.

Now, the Raspberry Pi (I am focusing on the RPi here because it kicked off the whole shebang in a big way; small SBCs existed before then but they were not as widely available or used) was never intended to be an IoT board. It was originally intended to be used to teach programming to children. The success of this original project (with over 5 million, yes that is 5,000,000, sold) has not only spawned a myriad of projects but a whole bunch of clones and similar devices looking to capitalise on the success of the project.

With the hobbyist community getting hold of these devices and putting them into various projects, one has to question the cost of these devices. For those who do not know, the boards cost US$25 or US$35 depending on the revision; however, you also need to add an SD card (either standard or micro, depending on revision), a power supply, a case (enclosure) and, if needed, a USB wireless dongle, and you are looking at getting towards US$100. Not as cheap as it sounds, and that’s in a basic headless configuration.

The other side to this is the environmental cost. With all these devices (remember there are 5 million RPis alone) floating around that will at some point in their lives end up being thrown out, mostly into landfill, the picture is not pretty, with all those electronics leaching chemicals and other materials over time. What causes this? Upgrades to newer models, migrations to other platforms, or even loss of interest; the result is the same.

Now don’t get me wrong, I am not saying these systems are all wasted, or all an issue. Many interesting projects and products are developed from them, not to mention the education that people get from developing on and for these systems. What I am saying is that their use should be more specialised: either where the processing power is actually required, or to aggregate the data (through a technology such as MQTT), cache it and forward it to a more powerful management system (home server, anyone?).
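The aggregate-cache-forward idea is simple enough to sketch. A real deployment would use an MQTT broker (Mosquitto, say) and a client library such as paho-mqtt; this is just an in-memory Python illustration of the pattern, with topic names I have made up:

```python
import json
import time

# Sketch of the aggregate-cache-forward pattern: keep only the latest
# reading per topic, and serialise the whole cache for the upstream
# management server when asked.

class Aggregator:
    def __init__(self):
        self.cache = {}  # topic -> (timestamp, latest payload)

    def on_message(self, topic: str, payload: str):
        """Cache the most recent reading for each topic."""
        self.cache[topic] = (time.time(), payload)

    def forward(self) -> str:
        """Serialise the cached state for the management system."""
        return json.dumps({t: p for t, (_, p) in self.cache.items()})

agg = Aggregator()
agg.on_message("home/pool/ph", "7.4")
agg.on_message("home/pool/orp", "650")
agg.on_message("home/pool/ph", "7.2")  # newer reading replaces the cached one
```

The point of caching the latest value per topic is that the upstream server gets one compact snapshot, rather than every raw reading from every node.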

Further to this, the idea here merges nicely with my move to containers (Docker) and my continuing work with Virtual Machines. If we take the services the RPi runs for each function and put them into a container, that container can then sync, either through MQTT or directly through the application’s services, to a microcontroller which carries out the functions.

Why is this more efficient? Because the microcontroller only needs to be dumb: it needs to either read the data on the interface and report it to the server, or turn an interface on or off (or perhaps “write” a PWM value) to perform a function. This microcontroller does not need to be replaced or changed when changing or upgrading the server, and can even be re-tasked to do something else without reprogramming the controller, only changing the functions and code on the mother controller node.

Much more efficient and effective. It does, however, have the downside of an extra failure point, so some simple smarts on the microcontroller would be a good idea to allow it to function without the mother controller in the event of a failure. The MQTT controls are agnostic, so we can work with that, at least for monitoring.
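To make the “dumb node plus failsafe” idea concrete, here is a Python sketch of what such a node could do. The command format and the timeout are my own assumptions for illustration, not an existing protocol:

```python
import time

# Sketch of a "dumb" node: act on simple commands from the mother
# controller, and fall back to a safe state (all outputs off) if the
# controller goes quiet for too long.

FAILSAFE_AFTER_S = 60.0  # assumed timeout before failing safe

class Node:
    def __init__(self):
        self.pins = {}  # pin number -> value (bool, or 0-255 for PWM)
        self.last_heard = time.time()

    def handle(self, cmd: str):
        """Commands look like 'set 13 on', 'set 13 off' or 'pwm 9 128'."""
        op, pin, value = cmd.split()
        self.last_heard = time.time()
        if op == "set":
            self.pins[int(pin)] = (value == "on")
        elif op == "pwm":
            self.pins[int(pin)] = int(value)

    def tick(self, now=None):
        """If the mother controller has gone quiet, turn everything off."""
        now = time.time() if now is None else now
        if now - self.last_heard > FAILSAFE_AFTER_S:
            self.pins = {pin: False for pin in self.pins}

n = Node()
n.handle("set 13 on")
n.handle("pwm 9 128")
```

Because the node only interprets these generic commands, re-tasking it really is just a matter of changing what the mother controller sends.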

Servers have been in the home for just as long as they have been in businesses, but for the most part they have been confined to home labs and to the homes of systems admins and the more serious hobbyists.

However, with more and more devices entering the modern “connected” home, it is time to once again ask: is it time for the server to move into the home? Some companies have been making inroads and pushing their products into the home market segment, most notably Microsoft and their partners with the “Windows Home Server” systems.

Further to this, modern Network Attached Storage (NAS) devices are becoming more and more powerful, leading not only to their manufacturers publishing their own software for the devices, but to thriving communities growing up around them and implementing their own software on them; Synology and the SynoCommunity, for example.

These devices are still, however, limited to running specially packaged software, and in many cases are missing features from other systems. I know this is often by design, as one manufacturer does not want their “killer app” on a competitor’s system.

Specifically, what I am thinking of with the above statement is some of the features of the Windows Home Server and Essentials Server from Microsoft, as many homes are “Microsoft” shops. Yet many homes also have one or more Apple devices (here I am thinking specifically of iPads/iPhones), and given the limited bandwidth and data transfer available to most people, an Apple Caching Server would be of benefit.

Now sure, you could run these on multiple servers, or even on existing hardware that you have around the house, but then you have multiple devices running and chewing up power, which in this day and age of ever increasing electricity bills and the purported environmental costs of power is less than ideal.

These issues could be at least partly alleviated by the use of enterprise-level technologies such as virtualisation and containerisation; however, these are well beyond the management skills of the average home user to implement and manage. Not to mention that some companies (I am looking at you here, Apple) do not allow their software to run on “generic” hardware, at least within the terms of the licensing agreement, nor do they offer a way to do this legally by purchasing a licence.

Virtualisation also allows extra “machines” to run such as Sophos UTM for security and management on the network.

Home servers are also going to become more and more important, acting as a bridge or conduit for Internet of Things products to gain access to the internet. Now sure, the products could talk directly back to the servers, and in many cases this will be fine if they can respond locally and, where required, cache their own data in the case of a loss of connection to the main servers, whether through the servers themselves or the internet connection in general being down.

However what I expect to develop over a longer period is more of a hybrid approach, with a server in the home acting as a local system providing local access to functions and data caching, whilst syncing and reporting to an internet based system for out of house control. I suggest this as many people do not have the ability to manage an externally accessible server, so it is more secure to use a professionally hosted one that then talks to the local one over a secure connection.

But more on that in another article, as we are talking about the home server here. So why did I bring it up? Containerisation; many of these devices will want to run their own “server” software or similar, and the easiest way to manage this is going to be through containerisation of the services on a platform such as Docker. This is especially true now that Docker commands and the like are coming to Windows Server systems, which will provide a basically agnostic method and language to set up and maintain the services.

This also brings with it questions about moving house, and the on-boarding of devices from one tenant or owner of the property to another. Does the server become a piece of house equipment, staying with the property when you move out? Do you create an “image” for the new occupier to run on their device to configure it to manage all the local devices? Do you run two servers, a personal one that moves with you and a smaller one that runs all the “smarts” of the house, which then links to your server and presents the devices to your equipment? What about switching gear, especially if your devices use PoE(+) for power? So many questions, but these are for another day.

For all this to work, however, we need not only to work all these issues out, but also to recognise that, for regular users, the user interface to these systems, and the user experience, is going to be a major deciding factor. That, and we need a bunch of standards so that users can change the UI/controller and still have all the devices work as one would expect.

So far, for the most part, the current systems have done an admirable job of this, but they are still a little too “techie” for the average user, and will need to improve.

There is a lot of potential for the home server in the coming years, and I believe it is becoming more and more necessary to have one, but there is still a lot of work to do before they become ubiquitous devices.

Irrigation (sprinkler) systems have come a long way since their inception, and even further since the advent of modern electronics; with the modern Internet and the beginnings of the Internet of Things (IoT) revolution they are getting smarter and are able to do more. One example: where a “modern” controller can tell if it is raining, or has rained in the past period, through the use of a rain gauge, IoT devices such as the OpenSprinkler can now use forecast weather from the internet to make a decision about watering. Linking this with things such as moisture sensor data can make these systems even smarter. There is, however, one thing that seems to be missing: the “smart” solenoid.

I am not a gardener by choice, per se, but more by necessity. Wanting to take more control of the food my family and I eat requires growing our own, which, whilst easy in some respects, does chew up a lot of time.

Solenoids themselves are quite simple devices. They use a magnetic coil to retract a metal (normally iron) core against a spring (which opposes the coil, so the solenoid goes back to “rest” when the electrical current is no longer applied) to open or close a gate. If the gate is closed, water does not pass; open the gate, and the water flows through. Nice and simple.

What is not so simple, however, is the current requirement to run an entire cable pair to each one. Yes, there are theoretical ways of doing n+1 wires (n being the number of solenoids), but in general it’s one solenoid, one cable (pair).

Now, with cheaper, smarter, more capable electronics, what is to stop us moving the “smarts” that for so long have been integrated into the controller onto the solenoid itself? You could then program it over the cloud; an RTC would allow it to turn on/off on a schedule; a hard link to a moisture sensor could allow it to turn on if the soil gets too dry; and cloud computing, or a local weather station, could stop it watering if it has rained or is predicted to rain within the next allocated period, say 6 hours.
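The decision logic on such a solenoid would be tiny. As a Python sketch (the moisture threshold is an assumption of mine; the 6-hour rain window is the example from above, and the forecast and moisture inputs would come from a weather service and sensor respectively):

```python
# Sketch of the local watering decision a "smart" solenoid could make.

DRY_THRESHOLD = 30.0  # soil moisture %, below which we water (assumed)
RAIN_WINDOW_H = 6     # skip watering if rain is expected this soon

def should_water(scheduled, moisture_pct, rained_recently, rain_forecast_h):
    """rain_forecast_h is hours until predicted rain, or None if none."""
    if rained_recently:
        return False
    if rain_forecast_h is not None and rain_forecast_h <= RAIN_WINDOW_H:
        return False
    # Water on schedule, or opportunistically if the soil is too dry.
    return scheduled or moisture_pct < DRY_THRESHOLD
```

Note the ordering: rain (actual or forecast) vetoes everything, which is exactly the behaviour you want from a controller that is smarter than a bare timer.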

That gives you more smarts than most old control boards are capable of, and almost as much as modern ones.

But what if, now we have this connected to the cloud, we could group them, in one or more groups, to control when things are watered? Got tomatoes that need watering twice a day but are at opposite ends of the garden? Not a problem, just create a group of two or more solenoids to control, put in the times and off you go. What about 3 areas? Just add another solenoid; 4 areas, and so on and so forth.

But we are talking the cloud here… it’s all seeing, all knowing. You could in theory control based not only upon groups but upon plant types: if you could TELL the system that you were growing tomatoes, you could tell it how much water you want to give them, and when. If you wanted to, you could even attach a flow meter to measure the amount of water delivered, rather than base it on the arbitrary value of time, where the pressure and therefore the amount of water could vary; with a flow meter you KNOW how much has been delivered.
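Volume-based watering is easy to sketch too. Cheap hall-effect flow meters emit a pulse train as water passes; the pulses-per-litre figure varies per meter, so the 450 below is only an example, and a real firmware would count pulses in an interrupt handler rather than be handed a total:

```python
# Sketch of watering by measured volume instead of by time.

PULSES_PER_LITRE = 450  # example calibration; check your meter's datasheet

def litres_delivered(pulse_count: int) -> float:
    """Convert a raw pulse count from the flow meter into litres."""
    return pulse_count / PULSES_PER_LITRE

def target_reached(pulse_count: int, target_litres: float) -> bool:
    """Close the solenoid once the measured volume hits the target."""
    return litres_delivered(pulse_count) >= target_litres
```

The win over a timer is exactly as described above: pressure fluctuations change the flow rate, but the pulse count tracks the actual volume delivered.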

What I am thinking of is a bit like your LIFX bulbs, but for solenoids. What about data? Well, that is easy: you can do it through standard 802.11 wireless, or how about XBee back to a controller station, or even a three-wire cable to tap into, using an addressing system such as I2C. In the end it does not matter so much how it works technically, so long as I can walk up, plug in the power (or power & data), connect to the water piping, program it how I want and boom, it works.

With recent and ongoing research and statistics showing that activation locks on phones and tablets have in many cases reduced the theft of those devices, is it time to consider this same technology for other devices?

I would argue that the technology is ready and suitable for use in the wild, and whilst there are certainly technologies out there that do similar things, it is time it became installed on other devices. I mean, think about it: just about every modern device has a computer in it of some sort, everything from the brand new smart TV to the washing machine. With the increasing use of the Internet of Things (IoT) these devices are also becoming connected to the wider world; in fact, many of the “smart” gadgets such as TVs, Blu-ray players and PVRs already have internet accessibility. It is on these devices that I would advocate we start installing activation lock technology, as not only are these the main target of thieves, but their inherent desire to be connected to the internet would allow you to put a device into a lost/stolen mode so that, when it next connects, it locks out. Obviously you would want this to display a message and some basic information, so it can be returned to the owner.
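The check-in flow is simple in principle. Here is a Python sketch where the clearing house lookup is simulated with a dictionary; in reality it would be an authenticated API call, and the serial numbers and message are obviously made up:

```python
# Sketch of the lost/stolen check-in an internet-connected appliance
# could run each time it comes online.

STOLEN_REGISTRY = {
    "TV-SN-0001": "If found, please contact owner@example.com",
}

def check_in(serial: str) -> dict:
    """Ask the registry whether this device is flagged; lock if so."""
    message = STOLEN_REGISTRY.get(serial)
    if message is not None:
        return {"locked": True, "display": message}
    return {"locked": False, "display": None}
```

Flagging the device server-side means the lock takes effect as soon as the stolen device next reaches the internet, which, for these gadgets, is the first thing they try to do.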

Further to this, having an independent clearing house for this, so all your devices can use it under a single log-in, would be ideal, as you can then lock one or more devices with one control interaction rather than having to go to each manufacturer’s or partner group’s page to block devices. As time goes on this could then, of course, be rolled out to more and more devices, thereby making it harder and harder for thieves to steal anything with electronics and the activation tech in it.

Further to this, things such as device encryption could then be added using this tech, again similar to what is already available on phones and tablets. This would allow devices that store data (almost all of them these days, in some form or another) to be securely erased (wipe the encryption key, wipe access to the data) to prevent identity theft and other malicious use of private data (you do have an ENCRYPTED backup, though, right?).
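The “wipe the key, wipe the data” trick is worth illustrating. The toy XOR-of-a-hash keystream below is NOT real encryption (a real device would use AES and a hardware key store); it exists only to show that once the key is gone, the ciphertext is useless:

```python
import hashlib

# Toy illustration of crypto-erase: data is only readable with the key,
# so destroying the key alone renders everything unrecoverable.
# Illustration only; do not use this as actual encryption.

def keystream(key: bytes, length: int) -> bytes:
    """Derive a keystream of the required length from the key."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:length]

def xor_crypt(data: bytes, key: bytes) -> bytes:
    """XOR with the keystream; applying it twice round-trips the data."""
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

secret = xor_crypt(b"private data", b"device-key")
# Deleting b"device-key" is the "secure erase": `secret` alone is garbage.
```

This is why a crypto-erase is near instant compared with overwriting a whole disk: only the (tiny) key needs destroying.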

This definitely will not happen overnight, if at all, due to the competing methodologies, standards and the companies’ unwillingness to work together to make a standard. I can dream though, can’t I?