Bogaert blogs (http://bogaert.com/de/blog)
BlueHome - connecting the house with Watson IoT (http://bogaert.com/de/blog/bluehome)
<div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><table border="0" cellpadding="10" cellspacing="0" style="width:100%"><tbody><tr><td>
Ever thought of connecting your home to a cognitive system? I gave it a try! Here is a brief description of how I made it work.
</td>
</tr><tr><td>
Within the house we have different protocols: EIB for the house appliances, MiLight to control the colour of the lighting, a specific datalogger for the solar panels, Sonos for audio, and some IP cameras. My focus for this experiment was to connect first with the EIB bus. The diagram below outlines how the connection is realised.
</td>
</tr></tbody></table><!--break--><table border="0" cellpadding="10" cellspacing="0" style="width:100%"><tbody><tr><td>
On the home side: The EIB bus can be reached through an EIB/IP router. To build the connection I used a small ARM based linux box. On that box I downloaded, compiled, and installed the eibnetmux server (<a href="http://eibnetmux.sourceforge.net/">http://eibnetmux.sourceforge.net/</a>) and the MQTT client (<a href="http://www.eclipse.org/paho/">http://www.eclipse.org/paho/</a>). The next step was to connect both. Therefore, I wrote a small program in C that does the following:<br /><br />
- Read a configuration file containing<br />
o MQTT credentials<br />
o EIB credentials<br />
o a list of devices (EIB addresses with the corresponding device names and parameters to be used)<br />
- Set up a connection to the eibnetmux server<br />
- Set up a connection to the MQTT broker<br />
- Subscribe to the MQTT service of Bluemix<br />
- Listen to the EIB bus and, for each event, publish a message over MQTT<br />
- For every message received through MQTT, analyse it and create an event on the EIB bus<br />
The small Linux box acts as a gateway device, so the connection with the Watson service requires a one-time setup for the gateway itself. Thereafter, devices are added automatically by the Internet of Things platform at their first published event.
</td>
</tr><tr><td><center><br /><img alt="BlueHome diagram" src="/images/blog/bluehome1.jpg" style="height:80%; width:80%" /></center>
</td>
</tr><tr><td>
On the backend side I used Bluemix, where I created a Node.js application and attached the Watson Internet of Things (IoT) Platform. In the IoT platform I created the gateway device and used its credentials on the house side. Below is a screenshot of the devices populated in the IoT platform:
</td>
</tr><tr><td><center><br /><img alt="BlueHome diagram" src="/images/blog/bluehome2.jpg" style="height:80%; width:80%" /><br /></center>
</td>
</tr><tr><td>
Once the connection worked I could see the data coming into the NoSQL database. I used the Watson IoT boards functionality to get a glimpse of the data coming in live. Below, the standard usage overview board shows the number of devices and the data usage of the system. On a second board I added the live temperatures coming from the outdoor temperature measurement and from the hot water system measurement. I also added two status indicators for the lights on the right-hand side.
</td>
</tr><tr><td><center><br /><img alt="BlueHome diagram" src="/images/blog/bluehome3.jpg" style="height:80%; width:80%" /><br /></center>
</td>
</tr><tr><td><center><br /><img alt="BlueHome diagram" src="/images/blog/bluehome4.jpg" style="height:80%; width:80%" /><br /></center>
</td>
</tr><tr><td>
The nice thing about using MQTT is the possibility to return messages without exposing the gateway to the internet. As the MQTT client at the gateway subscribes to the MQTT server, it receives all messages published by the platform. I experimented with the service by using Node.js to publish an MQTT message that is received by the gateway and contains an instruction to write to the EIB bus. This way I can, for example, control the lights remotely without any opening in the DMZ at home.<br />
Below is a screenshot:
</td>
</tr><tr><td><center><br /><img alt="BlueHome diagram" src="/images/blog/bluehome5.jpg" style="height:80%; width:80%" /><br /></center>
</td>
</tr><tr><td>
So, what's next? Maybe connect the solar plant and start doing some trend analysis on the data?
</td>
</tr></tbody></table><p> </p>
</div></div></div>
Mon, 08 Aug 2016 13:58:59 +0000, bogaert, http://bogaert.com/de/blog/bluehome#comments

Cloud revised and from Cloud to Singularity (http://bogaert.com/de/blog/cloud2singularity)
<div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><table border="0" cellpadding="10" cellspacing="0" style="width:100%"><tbody><tr><td class="rtecenter" colspan="2"><u><em>As published in the liber amicorum of Prof. dr. H. Jaap van den Herik, Tilburg University, 29 Jan 2016</em></u></td>
</tr><tr><td colspan="2">
<p>The path of continuous innovative ICT research by Jaap van den Herik and my path of long-held ambition to start a Ph.D. crossed in Rotterdam, back in 2008. Since that initial discussion I have had the honour of working on my Ph.D. under the supervision of Jaap van den Herik for a period of almost four years. The research subject of cloud computing was new at that point in time. Now, four years later, the landscape has already changed. The pace of innovation and change in information technology is high; therefore I would like to amend our initial findings with the following two conclusions: (1) Cloud revised and (2) Cloud to Singularity.</p>
</td>
</tr></tbody></table><!--break--><table border="0" cellpadding="10" cellspacing="0" style="width:100%"><tbody><tr><td colspan="2">
<p><strong>1. Cloud revised</strong></p>
<p>Cloud computing is interpreted in many ways. Bogaert (2011) built a structure based on two dimensions: (1) the dimension of collocation and (2) the dimension of virtualisation. Today, we should extend these two dimensions as follows:</p>
<p>The first dimension, collocation, needs rethinking. In the past the extreme level ‘internal’ equalled ‘on-premise private’ and the extreme level ‘external’ equalled ‘off-premise public’. Today we have to loosen these couplings. We can therefore split the dimension of collocation into two new dimensions: (1) the dimension of collocation, with the extreme levels (a) on-premise vs. (b) off-premise, and (2) the new dimension of individuality, with the extreme levels (a) private vs. (b) public. Combining the extreme levels of the dimension of collocation with those of the dimension of individuality in a two-dimensional graph results in four quadrants. We show the idea in Figure 1.</p>
<p class="rtecenter"><img alt="Two dimensions" src="/images/blog/blog_singularity_1.png" style="height:312px; width:320px" /></p>
<p class="rtecenter">Figure 1. Dimension of collocation and dimension of individuality</p>
<ol><li>On-premise Private cloud: Cloud environment on the premises of the organisation, for private usage only.</li>
<li>Off-premise Private cloud: Cloud environment off-premise, solely accessible by the organisation and for private usage only. An example is a privately hosted cloud for an organisation at a cloud service provider.</li>
<li>On-premise Public cloud: Cloud environment on the premises of the organisation, shared with other organisations. An example is found in the new API economy, where organisations share information or services with others via publicly accessible APIs; an on-premise public cloud approach can be used to cope with the dynamics of the potential workload.</li>
<li>Off-premise Public cloud: Cloud environment off-premise, shared by many organisations.</li>
</ol><p>For the second dimension of virtualisation we observe two major trends in the market. (1) The level of virtualisation is extended downwards within the infrastructure using new technologies such as software-defined infrastructure, software-defined storage, and software-defined networks. This results in the ability to create and configure infrastructure through software; consequently, new offerings called bare metal servers are available as a cloud service in the market. (2) The level of virtualisation is deepened at the level of Platform as a Service. Whereas in the past Platform as a Service was provided as a single service, platforms now become platforms of aggregated services, aggregating services from different vendors.</p>
<p>We have now defined three dimensions: (1) the dimension of virtualisation, (2) the dimension of collocation, and (3) the dimension of individuality. These three dimensions are used as axes in a three-dimensional graph. We show the idea in Figure 2.</p>
<p class="rtecenter"><img alt="Three dimensions" src="/images/blog/blog_singularity_2.png" style="height:401px; width:350px" /></p>
<p class="rtecenter">Figure 2. Dimension of collocation, dimension of individuality, and dimension of virtualisation</p>
<p>We depict the extreme levels for each dimension, resulting in eight octants. When we examine the industry along these three dimensions, we are able to define eight operational scenarios in cloud computing according to the eight octants.</p>
<ol><li>On-premise Private Infrastructure services</li>
<li>Off-premise Private Infrastructure services</li>
<li>On-premise Public Infrastructure services</li>
<li>Off-premise Public Infrastructure services</li>
<li>On-premise Private Business services</li>
<li>Off-premise Private Business services</li>
<li>On-premise Public Business services</li>
<li>Off-premise Public Business services</li>
</ol></td>
</tr><tr><td colspan="2">
<p><strong>2. Cloud to Singularity</strong></p>
<p>Cloud might seem to be the future, as the world is moving rapidly to these technologies. However, it is only an intermediate step to the next wave of technologies. Let us take a closer look at what could lie beyond today’s promising technology of cloud.</p>
<p>Cloud Computing is a delivery model of technology where the use of information technology is provided over the internet. This enables users to access technology-enabled services from the internet (‘in the cloud’) without knowledge of, expertise about, or control over the technology that supports them.</p>
<p>We defined three waves of computing and introduced the third wave as <em>cloud computing</em> (Bogaert, 2011). Cloud computing is characterised by two developments: (1) consolidation of technology and (2) sharing of resources among different users/applications on the same platform. This results in less infrastructure and software, and consequently in lower operational costs. The industrialised delivery model of cloud turns technology into a commodity.</p>
<p>If we look at the graph used before, you might ask yourself whether cloud computing is going to be the ultimate and final delivery model of information technology. Or, what is the next shift in technology that is going to trigger a new wave, with its associated superior increase in performance versus cost? We extended the previous graph and show the idea in Figure 3.</p>
<p class="rtecenter"><img alt="Waves" src="/images/blog/blog_singularity_3.jpg" style="height:320px; width:597px" /></p>
<p class="rtecenter">Figure 3. The fourth wave of computing</p>
<p>Mobile technologies leverage the omnipresence of the internet together with access to information and services in the cloud. The growth of wireless access to the internet, combined with the growth of available information and services, is boosting the development of mobile technologies. Here too: what’s next?</p>
<p>While cloud and mobile are still growing, the key questions already are: what is the future after cloud and mobile have become mature? Which innovations will drive the next wave of information technology? Being home for a year because of a shattered leg gave me enough time to think about these kinds of ‘philosophical’ questions. I came up with four drivers with an impact on the way we will cope with information technology in the near future. These four drivers are (1) consolidation, (2) data and service unification, (3) integration between the human being and technology, and (4) cognitive capabilities of computer systems. Let me explain these four drivers.</p>
<ol><li>The first driver is consolidation. Driven by the industrialised, automated delivery of cloud services, consolidation started accelerating in 2015. Global cloud service providers such as Amazon, Microsoft, and IBM deploy new cloud data centres on a regular basis and acquire small cloud solution and service providers. Moreover, the growth rates announced by the global cloud service providers point to this market consolidation: Amazon, for example, announced 84% year-on-year growth mid-2015. This will have an impact on small and local cloud service providers. A similar trend of consolidation happened before in other industries, for example in automotive and food; information technology is moving in the same direction. Ultimately we will end up with a market of a few global cloud service providers and a small number of niche cloud service providers. Back in 1943, when a computer was a complex machine housed in a large building, Thomas J. Watson Sr. reportedly made the statement: "I think there is a world market for maybe five computers". Maybe, one day, he might be right.</li>
<li>The second driver is data and service unification. Looking at the current evolution of cloud computing from a technology perspective, we see maximum sharing of compute infrastructure, storage, network, to some extent middleware services, and the associated human services. From a data and service perspective, however, it is a whole other story. Data gets scattered and replicated many times: though we optimised the use of storage with cloud, we keep burdening storage with scattered and replicated data. Let me illustrate this with a simple example. Your personal data (for example your name, address, and contact details) is stored at each service provider, and it is a continuous challenge to keep all these copies up to date. Ideally it would be stored only once and kept updated, with all other services referring to this single source of data through a unique reference. Of course, this requires complex access control and security. Standardisation is key to getting there: it requires a single representation of information in format and, more importantly (and more challengingly), in meaning. Your personal data is only a small piece of the data stored in the cloud, which hints at the level of optimisation that could be achieved. The same is true for services, as the same logic is implemented over and over again by each application. Let me illustrate this with another example. Financial applications calculate the applicable VAT, and this algorithm is implemented by each application provider; every change in legislation requires changes to all these applications. A single service could provide this calculation for all applications: implemented once and kept up to date with reality. Standardisation of data and associated services can simplify application development and maintenance. Further standardisation, combined with current technologies such as service brokers, will underpin this evolution. Once this is achieved, the network becomes the core of the future computer!</li>
<li>The third driver is the integration between the human being and technology. Over half a century, the communication between human beings and computers evolved from punch cards over text terminals to a wide variety of graphical devices. Additionally, over the past decade, developments enabled interaction through audio using text-to-speech and speech-to-text technologies. The entire interaction remains a slow process, limited by the speed at which a human can type (or speak) and read (or hear). Developments in bionic technology show novel alternatives for interacting with technology. Bionic hands, eyes, and ears demonstrate the ability to establish communication between technology and the human nervous system. Though there is still a long way to go, this looks like a promising evolution. In the future we can expect a more natural way to communicate, ultimately enhancing our ability to think by interacting with technology that supports our thinking.</li>
<li>The fourth driver is the cognitive capabilities of computer systems. Traditional computer systems use structured data as input and have programmed algorithms to process the data. The designed algorithms work in a binary way; in terms of analysis, this means the algorithm processes the structured data to seek what it was programmed to look for. Recent developments in cognitive systems go beyond this. A massive amount of information is processed using natural language processing, looking for relationships in the data and correlations between pieces of information, and using prediction algorithms to seek the best possible answers. These systems are not pre-programmed according to the data set. Instead, they are trained in a knowledge area using a massive amount of high-quality data relevant to that area. The advantage of cognitive systems is their ability to disclose new correlations: they explore all possible correlations instead of using a pre-programmed algorithm that processes according to known and prescribed relationships. They also use a prediction algorithm to find a multitude of potential answers and attach a probability to each individual potential answer. Correlating information is natural for our brain. We collect information through different channels and experiences; our brain uses all this information, and correlating it gives us one or more outcomes, without a pre-programmed decision tree or an 'if-then-else' sequence. The ability to learn and to improve our thinking remains unique, but cognitive systems extend these abilities. Their ability to process a massive amount of information goes beyond our own capabilities, and therefore they can assist in our human thinking.</li>
</ol><p>The next shift in technology is coming, and the above four drivers (among others) will likely shape it, bringing us to the next wave of technology with an impact on the technology landscape and a better cost-versus-performance ratio.</p>
<p>This might also bring us closer to the technology singularity. Kurzweil predicted the singularity to occur around 2045 and Vinge predicted it for some time before 2030, so our next shift in technology will most probably bring us very close to this point.</p>
</td>
</tr><tr><td colspan="2">
<p>Still lots of innovative ICT research is required, but let’s think positive and look forward to the exciting times ahead!</p>
<p>Dear Jaap, it was a tremendous honour to work under your supervision. Working with you was an incredible once-in-a-lifetime learning experience. I will always remember your continuous drive for perfection, your curiosity for detail, and your attention to consistency. Thank you very much! I wish you all the best for the next episode in your personal life, and as all good things come in threes: (1) I wish that every day your life may be filled with love, (2) I wish you continuous good health, and (3) I wish you everyday happiness!</p>
</td>
</tr></tbody></table><p> </p>
</div></div></div>
Tue, 01 Mar 2016 13:22:14 +0000, bogaert, http://bogaert.com/de/blog/cloud2singularity#comments