AEC Scene – Technology in AEC
https://aecscene.com
Smart Home Part 8 – Air conditioning
https://aecscene.com/2017/12/my-smart-home-adventure-part-8/
Sat, 23 Dec 2017

It’s summertime here in Australia, and things are starting to warm up. Today is one of the warmer days so far, in the low 30s (°C), but temperatures are expected to climb as we move towards the end of the month. So it’s about time to connect the home’s main air conditioner into the smart home system.

There are two expectations to meet with this integration:

The ability to turn it on and off while away from the house, mainly as a way to start cooling while on the way home.

Energy cost savings. Should an exterior door be opened, or no-one be home, the air conditioner should automatically turn off.

A hopeful third idea is integration with energy monitoring, but that will be the topic of a future article.

Turn air conditioner on/off

As with most of my other smart device integrations, this one is done through Home Assistant and Google Home. But in this particular use case, I also leverage Google Home’s ability, and requirement, to access the internet. That is because I would like to issue commands while off the home network, with Google Home acting as that bridge.

Using the same Broadlink IR blaster seen in part 7, a switch script turns the air conditioner on and off.
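The configuration itself isn't reproduced here; a minimal sketch of a Home Assistant Broadlink switch might look like this, assuming the host, MAC address, and IR packets are placeholders you would learn from your own remote:

```yaml
# Hypothetical Broadlink switch config. Host, mac and the base64 IR
# packets are placeholders captured from the air conditioner's remote.
switch:
  - platform: broadlink
    host: 192.168.1.50
    mac: 'AA:BB:CC:DD:EE:FF'
    switches:
      air_conditioner:
        friendly_name: 'Air Conditioner'
        command_on: 'JgBGAJK...'    # learned 'cool mode on' packet
        command_off: 'JgBGAJL...'   # learned 'power off' packet
```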

Now while either at home or away, commands are issued through Google Assistant on my mobile phone. This method works the same for other devices and appliances.

OK Google, turn on the air conditioner.

Due to the climate in my region I very rarely use any air conditioners for heating, so for the moment it will only be used for cooling. Adding a heating command is as simple as adding a second switch script.

At a later stage, there is the option to add location-based automation. This automation could also build off work done with understanding the house temperature and comfort levels in part 6.

It would look like this: when entering a specified zone around the house, say 2 km, check that no-one else is home and that the maximum internal temperature has been exceeded, then turn on the air conditioner. As you can see, many different data points start coming together to form a more complex automation.
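Sketched as a Home Assistant automation, it might look like the following. The zone, the occupant group, and the temperature threshold are all illustrative assumptions:

```yaml
automation:
  - alias: 'Pre-cool when approaching home'
    trigger:
      - platform: zone
        entity_id: device_tracker.chad
        zone: zone.near_home        # hypothetical ~2 km zone around the house
        event: enter
    condition:
      - condition: state            # no-one else already home
        entity_id: group.other_occupants
        state: 'not_home'
      - condition: numeric_state    # internal maximum exceeded
        entity_id: sensor.temperature_living
        above: 27                   # illustrative threshold
    action:
      - service: switch.turn_on
        entity_id: switch.air_conditioner
```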

Turn off after opening door

Obviously, we don’t want to waste energy unnecessarily. Should an external door to the room/s be opened for an extended period, then the air conditioning should be turned off. The below automation does just that.

The external door to my media room, when opened, activates the automation. But we don’t want the air conditioner turned off immediately in case the person is passing through the door and closing it behind them. So the ‘turn off’ action is delayed for 30 seconds. After this period, another check is performed on the door to see if it is still open. If it is, then turn the air conditioner off.
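The automation itself didn't survive the export; a sketch of the logic described above (the door sensor entity name is an assumption) could look like:

```yaml
automation:
  - alias: 'AC off when media room door left open'
    trigger:
      - platform: state
        entity_id: binary_sensor.media_room_door   # hypothetical sensor name
        to: 'on'                                   # door opened
    action:
      - delay: '00:00:30'           # grace period for someone passing through
      - condition: state            # still open after 30 seconds?
        entity_id: binary_sensor.media_room_door
        state: 'on'
      - service: switch.turn_off
        entity_id: switch.air_conditioner
```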

Not shown in this automation: I’ve been thinking about whether to announce through the Google Home that the air conditioning is turning off due to the open door, so the occupant knows why it happened. I’m not sure whether this would be too much information.

As in part 7, when there is no-one at home, certain non-essential appliances are automatically turned off. That existing automation has now been updated to include the air conditioner.

Smart Home Part 7 – Media devices
https://aecscene.com/2017/12/my-smart-home-adventure-part-7/
Sun, 10 Dec 2017

Let me start by saying that my wife and I are huge media consumers. I have an older model Synology NAS (DS214-Play) packed with media and running Plex, and we subscribe to Netflix. We love nothing better after a long week of work than to spend time over the weekend kicking back watching something. And when I’m the only one home, there will typically be a podcast playing in the background throughout the house. So it makes sense that media should play a part in the smart home.

In my opinion, Google’s Cast streaming protocol is hands-down the best you can use right now, especially for the smart home. The Chromecast line of devices provides the highest level of compatibility, from Android to Apple, Windows, and Linux, from mobile through to desktop PC, with support for casting both video and audio. Not to mention its integration with a plethora of devices.

Google Chromecast 2

Better again is that Chromecast/Google Cast is a piece of cake to integrate with Home Assistant, and comes with native support for Google Assistant/Home. That is why Google Cast capable devices litter my home. Media, delivered to any Google Cast capable room, at any time, and from any device. [Sales pitch off]

My Home Assistant Media page

Without getting complicated, Google Cast devices work reasonably well on their own using Google Home/Assistant. Here are some examples of native functionality.

Both my Sony Android TV and all Chromecasts plugged into non-smart, but HDMI-CEC supported TVs, can turn the TV on and off.

OK Google, turn off the media TV.

The Sony Android TV’s volume can be changed directly, while the Chromecasts support software volume.

OK Google, turn the living TV volume up.

Notice in that last example I specified the living room TV. That is because my living room TV’s soundbar also supports Google Cast audio, and can be controlled separately.

OK Google, play Black Sabbath in the living room.

With the addition of a Google Home in the living room, audio only requests play automatically on the soundbar speakers, not the TV. When playing video, such as from YouTube, it automatically plays through the TV.

And finally how about something from Netflix? The subtlety of this request is that the Google Home knows which room it is in, and therefore plays on that room’s default device.

OK Google, play Star Trek Discovery.

Despite the above conveniences, it’s not all smooth sailing. Some integration is lacking between Google Home and Android TV such as not being able to change TV channels, and no Netflix compatibility. Also, media can’t be accessed from Plex media server using voice. The word on the street is that more Android TV / Google Home functionality is on its way, while there is still no word regarding Plex.

But additional benefits (conveniences) can be added through Home Assistant, not to mention making up for some of the current deficiencies outlined above. Let’s take a look at some of these use cases.

Android TV – Show me the news

I arrive home at regular times, coincidentally when numerous nightly news programs are on. Which program I typically watch depends on where within this 1.5-hour window I arrive. Given this is an everyday routine, it makes for a perfect home automation task.

This automation is triggered the moment the home network detects the presence of my mobile phone. That happens the moment I pull the car into the garage.

trigger:
  - entity_id: device_tracker.chad
    platform: state
    to: home

Then a few checks are made, starting with the day and time. The automation also checks whether the living room TV is already on. If it is, then assume that someone may already be watching, and do nothing.
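The condition block isn't shown in the post; continuing the trigger above, the checks might be sketched like this, where the times and the channel-selection action are illustrative assumptions:

```yaml
condition:
  - condition: time                 # only on weeknights, in the news window
    after: '17:30:00'
    before: '19:00:00'
    weekday: [mon, tue, wed, thu, fri]
  - condition: state                # don't interrupt someone already watching
    entity_id: media_player.living_tv
    state: 'off'
action:
  - service: media_player.turn_on
    entity_id: media_player.living_tv
  - service: media_player.select_source
    data:
      entity_id: media_player.living_tv
      source: 'TEN HD'
```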

Android TV – Change channels

I previously mentioned that Android TV doesn’t work that well with Google Home. Until this is improved, we can work around these limitations using integrations with Home Assistant. We first saw this in a previous automation where I could change the TV channel based on the time I was home.

In this use case, I want to issue a verbal channel change command through Google Home. That proves particularly useful when the TV remote goes missing, or when your hands are full, such as when cooking. Even better is that this functionality is achieved using a Home Assistant scene with just four lines of code (per channel).

- name: Channel 10
  entities:
    media_player.living_tv:
      source: TEN HD

What’s great about using scenes is that they are easily mapped through to Google Home. But Home Assistant’s integration with Google Home is far from perfect, with the commands always coming in the format of turning something on or off. The above example would be ‘Hey Google, turn on channel 10‘.

Broadlink RM Mini3

Older TV – Change channels

Older TVs are more challenging and require falling back on traditional technology, like infrared remotes. That is where devices like IR blasters take the place of the remote control. I’m using the Broadlink RM Mini3, which already has Home Assistant support.

My integration is broken down into two parts: firstly, storing the IR data for each TV remote control button press; and secondly, sending a combination of individual button presses to the TV.

Here is a script for a TV remote control button press, in this case for 0.
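The embedded script didn't survive the export. As a sketch only: early Broadlink support in Home Assistant exposed a send-packet service named after the blaster's IP address, so such a script might look like the following, where the service name and IR packet are placeholders:

```yaml
script:
  media_channel_0:
    sequence:
      # Service name is derived from the blaster's IP; the packet is the
      # base64 IR code learned from the remote's '0' button.
      - service: switch.broadlink_send_packet_192_168_1_50
        data:
          packet:
            - 'JgBGAJK...'
```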

Much like a scene, scripts are also easily integrated with Google Home. In this case, to differentiate a channel change request between the living room TV and the media room, the media room TV command includes the room name, ‘OK Google, turn on media channel 10‘.

For turning on ‘channel 10’, a script sends multiple commands via the IR blaster to the TV: media_channel_0, media_channel_1, media_channel_0.
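As a sketch, chaining the individual button scripts with a short delay between digits might look like this (entity names follow the naming above; the delays are an assumption to give the TV time to register each press):

```yaml
script:
  media_channel_10:
    sequence:
      # Digit order as described in the text above.
      - service: script.turn_on
        entity_id: script.media_channel_0
      - delay: '00:00:01'
      - service: script.turn_on
        entity_id: script.media_channel_1
      - delay: '00:00:01'
      - service: script.turn_on
        entity_id: script.media_channel_0
```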

Xiaomi smart device sale at Gearbest
https://aecscene.com/2017/11/xiaomi-smart-device-sale-gearbest/
Sat, 11 Nov 2017

I’ve recently been outlining my journey down the smart home path, currently with a strong focus on sensors for ESD monitoring. The online retailer Gearbest, where I purchased my devices, is having a short sale on these products. If this is something you have been thinking about, now is a great time to start. All sales […]

Online retail could impact residential architectural design
https://aecscene.com/2017/11/online-retail-impact-residential-architectural-design/
Tue, 07 Nov 2017

The term last mile delivery might not sound exciting, but it’s the new battle for your dollar in online retail. That final leg to your doorstep, considered a costly step, holds many opportunities for retailers. From finding savings in operating costs, ensuring on-time or same day delivery, or providing a personal touch upon final receipt. Retailers can think of many reasons why this could be the key to winning your online business, and therefore need to secure its control.

However, it could also have some small knock-on effects on the future of residential architecture. That’s because the delivery of a package to your door is one thing; ensuring safe and secure receipt is another. With delivery theft on the rise, the challenge facing retailers is guaranteeing your goods are delivered safely and remain that way until you are home.

In last year’s McKinsey report, Parcel delivery: The future of last mile, 30% of respondents said they are willing to pay extra for a faster, more reliable delivery service. This year, a surge in the availability of connected security cameras and smart devices suggests homeowners are also seeking better home security. When these two factors are combined, we discover an unusual interface between arriving deliveries and secured short-term storage at home. Retailers began exploring these options many years back, with some now starting to come to fruition.

Retail Delivery

The last mile journey begins with actually getting the package from a local warehouse to your front door. While we wait for our two-legged android servants to arrive, drones have long been thought to be the most obvious solution. Alternatively, terrestrial robots known as droids are gaining equal recognition as a different method of delivery. Regardless, receival zones at the place of residence need to be considered, either on the roof or at ground level. These zones would require a means to secure the delivery from theft, and to keep it away from the elements.

Deliveries by drone or droid are likely to be more suitable for non-perishable and non-refrigerated goods, but not so much for those living in apartments. In this case, different technologies mixed with a change in social norms may produce other new and unusual options.

Despite the widespread adoption of the sharing economy, McKinsey suggests it won’t have a strong presence in the delivery chain. But the elimination of all human interaction is equally unlikely, with hand delivery a preferred choice (for now) by the retailer. A new scenario could be that a real person still carries your package up to your door, then… opens your door, places the package inside, then leaves, closing the door behind them. All the while no-one else is at home.

Sounds a little creepy right?

Security

Yet this is precisely what online retail giant Amazon has recently announced with their Key product. You fit the home’s front door with a smart lock, and a smart camera inside looking towards the door. This combination provides unique front door access to the delivery person, and the camera some visual security. Trust obviously plays a large part in this relationship, but it seems we have officially entered a new era of home connectivity. More on this later.

But it may not stop there.

Now consider the delivery of other consumables, particularly perishables. As we become overwhelmed with work, online grocery shopping and other pre-prepared / meal kit services grow in popularity. Just as though you went shopping yourself, these items ideally should be refrigerated as soon as possible. So, could you see yourself allowing a grocery delivery person further access into your home, such as up to your refrigerator? These are the questions retailers are no doubt asking themselves right now. How much is the home occupant willing to endure in the name of convenience?

Add to this mix smart appliances and smart rooms. Refrigerators and pantries that know which staple food items need reordering, or the bathroom requesting toilet paper and toothpaste as stocks get low. To a point, some appliances and devices can already do this. The rooms those appliances exist in may too have built-in intelligence through computer vision to track items and occupants. Residential design will move beyond the physical bricks and mortar, to the point where software/AI engineering becomes equally important.

A home that does the grocery shopping for you

Connecting all the dots, what we end up with is a home smart enough to interface directly with the retail supply chain. Throw in a splash of blockchain technology, and you now have a full chain of custody where the home not only orders the goods but also takes the final receipt.

Consider when the fridge detects you are low on milk. The fridge places an order with the supplier, who then processes and despatches it. A delivery person brings it to your front door, where the home authenticates their identity first through the front door’s smart lock, and again through computer vision. Their identity is confirmed and access granted. They are tracked by internal security cameras as they place the milk into the fridge and then leave, closing the door behind them. Should the front door be left unlocked, the home can lock it too. The cycle continues.

The home of the future is only set to become smarter. It will learn how its occupants interact with it and with others. It will make recommendations, and as we’ve seen above even act on your behalf. These are considered design factors with untested social impacts. More questions will need asking, and issues overcome. But eventually, what appears to be creepy to us now will likely evolve, be tested, and ultimately one day become commonplace.

Smart Home Part 6 – Internal comfort levels
https://aecscene.com/2017/11/my-smart-home-adventure-part-6/
Thu, 02 Nov 2017

When starting down the smart home path, I thought tracking building performance would be one of the more fascinating aspects. After all, a better understanding of the home’s weaknesses can lead to better insights for future improvements. And while it may take a year to gather a full dataset, there really shouldn’t be any reason why that data couldn’t be put to work right now. So I set about testing some ideas, and the following is where I ended up.

Given the present state of my home, I’m not too keen to make any radical changes to its physical structure. So instead, I looked towards existing features like doors and windows. As manually operable items in the home, doors and windows help with regulating internal comfort levels. But on their own, they are not smart; they have no way to comprehend when to be opened or closed. My goal: to change that by leveraging building performance data.

In this case, my existing home automation hub performs the heavy lifting. Based on some simple temperature rules it advises the occupants which doors and windows to open and close, and most importantly, when. By optimising this process, the system can notify the occupants of the smallest environmental changes that we as humans are unable to detect. My prediction is that the temperature fluctuations throughout the home can be smoothed out. Hopefully, the house will remain warmer for longer in winter and cool down quicker in summer.

Sensors

At the heart of capturing performance data are sensors. For this system, they come in two forms: physical and template. Physical sensors are the actual devices collecting the data. Template sensors are additional virtual entities (sensors) within the system that convert the physical data into new data.

In this first step, we can see the physical sensor (sensor.temperature_158d00012d9624) does not have a friendly name. So we take that sensor’s temperature attribute and use it to create a new templated sensor (temperature_living). This sensor name becomes more easily recognisable for use elsewhere in the system.
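Such a template sensor might be sketched like this. The physical sensor ID comes from the text above; the friendly name and unit are my own illustrative choices:

```yaml
sensor:
  - platform: template
    sensors:
      temperature_living:
        friendly_name: 'Living Room Temperature'
        unit_of_measurement: '°C'
        # Re-expose the physical Xiaomi sensor under a readable name
        value_template: "{{ states('sensor.temperature_158d00012d9624') }}"
```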

While not critical to the operations, I also wanted to track whether the temperature was decreasing or increasing. Home Assistant provides a useful Trend template sensor for this. Firstly it needs two template sensors, one for decreasing and a second for increasing.

This trend data then passes through another template sensor to provide customised human-readable values of decreasing, increasing and no change. This template also supports customisation of the sensor icon.
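A sketch of the pair of trend sensors plus the human-readable template might look like this (entity names are assumptions):

```yaml
binary_sensor:
  - platform: trend
    sensors:
      temperature_living_rising:
        entity_id: sensor.temperature_living
      temperature_living_falling:
        entity_id: sensor.temperature_living
        invert: Yes                  # 'on' when the temperature is falling

sensor:
  - platform: template
    sensors:
      temperature_living_trend:
        friendly_name: 'Living Trend'
        value_template: >-
          {% if is_state('binary_sensor.temperature_living_rising', 'on') %}
            increasing
          {% elif is_state('binary_sensor.temperature_living_falling', 'on') %}
            decreasing
          {% else %}
            no change
          {% endif %}
```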

The states of the door and window openings in each room also need to be tracked. Refer to this previous part for some more details on using openings.

For an opening template sensor we capture the physical sensor’s state value. It might look a little confusing because the resulting values are False or True, but these are presented as closed or open thanks to the device_class: opening line.
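As a sketch (the Xiaomi sensor ID here is hypothetical):

```yaml
binary_sensor:
  - platform: template
    sensors:
      opening_living_door:
        friendly_name: 'Living Room Door'
        device_class: opening        # presents True/False as open/closed
        value_template: "{{ is_state('binary_sensor.door_window_sensor_158d0001aabbcc', 'on') }}"
```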

Comfort level

For Home Assistant to understand when to trigger a comfort level event, it needs some bounds put on the operating range. In the following, the comfort level Set Point is defined based on the time of the year. A lower temperature in winter, and higher in summer. Then the Dead Band High and Low are calculated relative to the Set Point.
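Sketched as template sensors, this could look like the following. The set points and the dead band width are illustrative, and months 6–8 stand in for the southern-hemisphere winter:

```yaml
sensor:
  - platform: template
    sensors:
      comfort_set_point:
        unit_of_measurement: '°C'
        # Lower target in winter (Jun-Aug), higher in summer
        value_template: "{{ 21 if now().month in [6, 7, 8] else 25 }}"
      comfort_dead_band_high:
        unit_of_measurement: '°C'
        value_template: "{{ states('sensor.comfort_set_point') | float + 1.5 }}"
      comfort_dead_band_low:
        unit_of_measurement: '°C'
        value_template: "{{ states('sensor.comfort_set_point') | float - 1.5 }}"
```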

The last template sensor defines whether a zone in the house should be opened or closed. This sensor considers numerous factors in the calculation; room temperatures, room openings, external weather, and the defined comfort levels.

The general rules for each state are, if:

Open

Check if all openings are closed,

and the centre zone temperature is greater than the Dead Band High,

or the centre zone temperature is less than the Set Point and the outside temperature is greater than the Dead Band High.

Close

Check if all openings are open,

and the centre zone temperature is less than the Dead Band High and the outside temperature is less than the Dead Band Low.
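Combined, the rules above might be sketched as a single template sensor. All entity names are assumptions, and a single door stands in for "all openings" in the zone:

```yaml
sensor:
  - platform: template
    sensors:
      zone_living_advice:
        friendly_name: 'Living Zone'
        value_template: >-
          {% set temp = states('sensor.temperature_living') | float %}
          {% set outside = states('sensor.temperature_outside') | float %}
          {% set sp = states('sensor.comfort_set_point') | float %}
          {% set high = states('sensor.comfort_dead_band_high') | float %}
          {% set low = states('sensor.comfort_dead_band_low') | float %}
          {% if is_state('binary_sensor.opening_living_door', 'off')
             and (temp > high or (temp < sp and outside > high)) %}
            open
          {% elif is_state('binary_sensor.opening_living_door', 'on')
             and temp < high and outside < low %}
            close
          {% else %}
            no change
          {% endif %}
```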

Hands-on with the new HP Z VR Backpack PC
https://aecscene.com/2017/10/hands-on-new-hp-z-vr-backpack-pc/
Mon, 23 Oct 2017

Last week I was fortunate to spend some extended time with the then-unreleased HP Z VR Backpack PC. In case you aren’t aware, going wireless with VR is a highly sought-after feature. And as anyone who has experienced tethered VR will tell you, the slightest tug on a headset cable is enough to remind you of your physical surroundings. Suspending disbelief within a completely digital world is the ultimate goal for VR, with a wireless experience being another step towards that.

Unlike the HP Omen X Backpack PC, previously released to the consumer market, the new Z VR is aimed squarely at the enterprise. Professional users are in mind: those who need VR both in the office and out. It looks like it could be the perfect hardware for taking to a customer’s home or office, or into a design office’s showroom. There are some obvious target audiences, such as architecture and real estate, along with fringe use cases like virtual site inductions or design collaboration sessions.

The key specs of the PC:

Intel® Core™ i7-7820HQ processor

Up to 32GB DDR4 RAM

Up to 1TB storage

NVIDIA® Quadro® P5200 16GB Graphics

So what are my impressions of the unit?

The design


The first thing you notice when you hold the PC and backpack in your hands is how heavy it feels. At around 4.5kg (including the backpack) it’s heavier than a regular laptop. This can be initially concerning, but then you remember that it’s a wearable device. As soon as it is strapped to your back and you fit the headset, the weight seems insignificant. The freedom you now have to move around your physical space, unobstructed, overwhelms the senses. And the technology fades into the background as an extension of your body.

To the left is a photo of my colleague, where you can see the size of the unit. It’s not much bigger than a regular laptop. Being quite thin, the PC sits close to the user’s back to lower the centre of gravity, but not too close. The side vents expel heat from the processors, so having the PC sit slightly off the user’s back provides better air circulation and comfort for the user.

HP Z VR Backpack PC with HTC Vive

The backpack’s harness is sturdy, made from durable fabric much like a camping backpack, with what looks to be a plastic support for mounting the PC. To mount it, the PC simply slides down over a few brackets and clips into place.

And this leads to an important aspect of the Z VR Backpack PC: it is also a fully functioning computer. Unclipping it from the harness and docking it into the HP Z VR Dock provides a desktop PC. If you’re the kind of professional who spends 50% of your time in the office designing, and the other 50% with customers, then this could be an all-in-one solution. Note though, that when outside the office you will need a small (portable) monitor or TV of some kind to plug the PC into. I’m not suggesting it is perfect.

At the base of the PC are ports for docking and charging, and for plugging in the two hot-swappable batteries. Even though the PC has an inbuilt battery, it will only last around 10-15 minutes before hibernating. This is more than enough time to un-dock, mount to the harness, and connect to the batteries. While on external battery power I didn’t time usage, but it seemed to get around 45-60 mins. This should be more than enough time for a presentation to a customer. If not, then with more hot-swappable batteries usage time can be extended.

As for the top of the PC, this is where everything plugs in. There is an HMD power port, 2 USB 3.0 ports, 1 HDMI 2.0, 1 mini DisplayPort 1.3, 1 Thunderbolt (USB Type-C connector), and 1 headphone/microphone combo jack.

Bottom ports / Top ports

To get a VR scene up and running, I first had to plug the PC into a monitor using either the dock or the mini DisplayPort, allowing the VR software session to be started from the desktop. The software I used was Enscape running on Revit 2018, and Autodesk Live. Once the VR session has started, the PC is disconnected from the monitor and switches over to the VR headset for use.

The PC is reported to also have Miracast support via Windows 10. Miracast is a wireless display protocol capable of transmitting HD video to an external display, giving non-participants the ability to mirror what the VR user is viewing. Some TVs have native Miracast support; for those that don’t, Miracast dongles can be plugged in. Unfortunately, I did not have a Miracast capable device at hand to test this out.

Performance

As for the performance, there’s not much to say. Given the higher-end specs of the PC, you would expect it to perform well for VR, and from what I was testing it for, it did. I would expect that as the hardware starts reaching customers, then performance benchmarking websites will do more in-depth reviews. So I will leave that to the hardware experts.

What I can also comment on is performance when connected to an HP Z Display Z34c, a 34-inch ultra-wide curved display. While not running in VR, the PC had no problem pushing the pixels in a full-screen Enscape or Live session. The experience was buttery smooth and looked great in the ultra-wide format.

Overall this VR-ready backpack does what it’s designed to do. It provides the power required for an optimal software experience, along with portability. But it doesn’t come cheap; it will set you back a little more than a regular VR gaming laptop. Although, since it doubles as a desktop workstation, there is an opportunity to offset the cost by only needing a single PC.

Article Disclaimer: I work for an employer who provides VR solutions and services. Comments are my own, not my employer’s.

Smart Home Part 5 – Control lights using opening and motion sensors
https://aecscene.com/2017/10/my-smart-home-adventure-part-5/
Thu, 19 Oct 2017

One of the great things about taking on a project such as home automation is experimenting with the different combinations of devices. Take lighting as an example. Traditionally, you flick a switch to turn lights on and off. In some cases like external floodlights, the addition of motion sensors improves the control.

Now, take a moment to step outside the box. Start thinking about what other ways you could use to control the activation of a light. And when it activates, could that light convey some form of meaning rather than act purely as a source of illumination? This is what I set out to find, and for my home I came up with two practical use cases.

Driveway lighting

Xiaomi door sensor mounted on a custom 3D printed bracket

My house is not the ideal candidate for commonly available smart lighting due to every internal light being an MR16 fitting. It must be difficult for manufacturers to get the smarts small enough to fit into a tiny downlight.

But with three external lights using E27 fittings (two at the driveway), it meant I could at least use some smart lighting. After my success with the Xiaomi sensors, I opted for a similar lighting brand, Yeelight, a much cheaper alternative to the Philips Hue. The benefits of Yeelights are that they connect directly to Wi-Fi, and are Google Assistant compatible out of the box.

So how are these lights being controlled? If you were thinking through a motion detector, then you’d be partially right. But it’s not the motion of a person, rather the motion of the garage door. That is because as I go from inside to outside through the garage door at night, I want the lights to come on. And as I drive into the driveway and remotely open the garage door from the car, I also want some extra lighting.

A custom-mounted door sensor communicates with the lights to activate them. After 10 minutes the lights are automatically turned off.
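A sketch of that automation, with entity names as my own placeholders:

```yaml
automation:
  - alias: 'Driveway lights on garage door open'
    trigger:
      - platform: state
        entity_id: binary_sensor.garage_door    # the custom-mounted sensor
        to: 'on'
    condition:
      - condition: sun                          # only after dark
        after: sunset
    action:
      - service: light.turn_on
        entity_id: light.driveway_left, light.driveway_right
      - delay: '00:10:00'                       # keep lit for 10 minutes
      - service: light.turn_off
        entity_id: light.driveway_left, light.driveway_right
```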

At a later stage when a driveway facing security camera is installed, I’ll use the camera’s computer vision to sense driveway movement and activate the lights instead.

In this video you will see the lights activate as the door opens, then through Google Assistant they can be manually turned off. Note when the lights are turned off, the Home Assistant server recognises this and updates its interface. The two have a symbiotic relationship.

Hallway status light

Now that the garage door has a sensor, as do other openings around the house, those states can be used to check what is open. Currently, my Home Assistant has a page which shows me those details. The problem with this method is having to pick up a device and open that page, which is somewhat inconvenient.

Tracking door and windows through Home Assistant

I found an easier way has been to use the Xiaomi Gateway’s built-in light as a colour-coded signal, triggered by a motion sensor every time I walk by. A red colour indicates the garage door is open, orange for when the garage door is closed but when other doors are open, and green for when the house is completely closed.
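The colour-coded signal might be sketched like this, where the motion sensor, gateway light, and openings group names are all assumptions:

```yaml
automation:
  - alias: 'Hallway status light on motion'
    trigger:
      - platform: state
        entity_id: binary_sensor.motion_hallway
        to: 'on'
    action:
      - service: light.turn_on
        data_template:
          entity_id: light.gateway_light
          # red = garage open, orange = other openings open, green = all closed
          color_name: >-
            {%- if is_state('binary_sensor.garage_door', 'on') -%}
              red
            {%- elif is_state('group.all_openings', 'on') -%}
              orange
            {%- else -%}
              green
            {%- endif -%}
```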

Smart Home Part 4 – Sensors & new hardware
Sun, 15 Oct 2017 20:42:36 +0000

It’s been a while since my last smart home update. Six months to be exact, which has provided an opportunity to test and refine the base automation system. During this time, the system has expanded with the addition of more sensor devices, allowing the gathering of even more environmental data, such as temperature and door/window states. Besides the new devices, the performance of the underlying hub has also received substantial upgrades.

In this post, we’ll take a look at the devices and hardware setup. Following posts will cover more details on how the devices are being used.

Sensors

Chinese manufacturer Xiaomi is well known for their mobile phones, but they also make other consumer electronics, including smart devices. Their products are considered extremely affordable and of high build quality, making them excellent value for money. On average a device costs only AU$10–12, compared to other brands which can cost up to four times that amount.

Smart device technology like this has come a long way over the past few years. Low-powered wireless devices can now easily run off button-sized batteries. Xiaomi claims that their devices can run for up to 2 years, and given that I have been running some devices for four months with minimal drop in charge, I’m inclined to agree. The included batteries weren’t at full charge, but have only lost about 15–20% during that time.

These specific devices do need an accompanying Xiaomi Gateway to operate. The gateway acts as a central connection point, and from here it connects through to the main home automation hub. One other reason for choosing Xiaomi devices is the integration with Home Assistant. Connecting it to Home Assistant allows for the Xiaomi platform to communicate with other device platforms. Cross-platform device communication is after all how the home becomes smarter.

Xiaomi products can’t be purchased directly in many locations outside of China, although importing them from certain online stores is pretty straightforward.

Hub hardware

My Home Assistant (HA) hub started out running on a Raspberry Pi, a low-powered computer. While the Pi is still a suitable hardware platform, it does make developing the HA configuration slower. This is noticeable when rebooting after config changes, and through the responsiveness of the user interface.

I am also capturing sensor data through extra software, InfluxDB and Grafana. These were previously running on an old laptop until I was confident they were reliably recording the data I was after.
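Wiring HA into InfluxDB is a small configuration addition; Grafana then charts from the same database. Roughly along these lines, where the host address and database name are placeholders for my actual setup:

```yaml
# configuration.yaml sketch: forward state changes to InfluxDB
influxdb:
  host: 192.168.1.50          # machine running InfluxDB
  database: home_assistant
  include:
    domains:
      - sensor                # temperature, humidity, etc.
      - binary_sensor         # door/window open-closed states
```

Limiting the `include` list to the sensor domains keeps the database from filling up with state changes I'll never chart.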

Intel NUC running CentOS and Docker

So rather than running two computers, everything was merged onto a single dedicated home automation server. Having had good experiences with Intel NUCs before, this seemed a logical way to go. NUCs have a small form factor, are reasonably powerful, and yet remain relatively energy efficient.

The installed operating system is a headless version of CentOS Linux, with Webmin chosen as a browser-based admin interface. Wanting the server to be as flexible as possible, Docker was chosen to manage the applications.
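As an illustration of that flexibility, the whole stack can be described in a single Compose file, something along these lines. The image tags and host paths here are indicative only, not my exact setup:

```yaml
# docker-compose.yml sketch: home automation stack on the NUC
version: '2'
services:
  homeassistant:
    image: homeassistant/home-assistant
    network_mode: host                  # simplifies device/network discovery
    volumes:
      - /opt/homeassistant:/config
    restart: always
  influxdb:
    image: influxdb
    ports:
      - "8086:8086"
    volumes:
      - /opt/influxdb:/var/lib/influxdb # keep data outside the container
    restart: always
  grafana:
    image: grafana/grafana
    ports:
      - "3000:3000"
    restart: always
```

With `restart: always` on each service, the stack comes back up on its own after a power cut or reboot, which matters for a box that runs headless in a cupboard.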

Both CentOS and Docker were completely foreign to me and came with a moderate learning curve. Having a dockerized server is something which had interested me for a while so I decided to take the plunge. The extra effort has paid off, and the server has been running smoothly since.

Autodesk Revit 2018.2 Update
Wed, 11 Oct 2017 23:30:50 +0000

Autodesk has released a range of new feature videos for Revit 2018.2. For a full list of the improvements and resolved issues, check out the release notes. But of course, before you can use these features you will need to install the update. This will bring Autodesk Revit 2018 up to build 20170927_1515.

Update Download

ReCap Pro for Leica BLK360 tutorials
Wed, 11 Oct 2017 09:33:06 +0000

Last year at Autodesk University, Leica and Autodesk announced the BLK360. To briefly recap, the BLK360 is a slightly new take on traditional laser scanning hardware. It’s not designed to be high-end, but its price point gives companies access to the technology who otherwise couldn’t afford it. Certainly, its small, portable size is one of its core strengths, along with its ease of use through the mobile app.

Since then, there has been a steady and staged release of the hardware across the globe. For myself here in Australia, it was only a few weeks ago that C.R. Kennedy hosted a few events around the country to show off the product.

Fast forward to yesterday: to help explain a few things about reality capture through laser scanning, Autodesk has released a series of short tutorial videos featuring the BLK360. If laser scanning is something you’ve been curious about, then take a moment to watch the videos. You’ll learn some basic best practices for the scanning process, and see how easy the BLK360 is to operate in conjunction with the mobile app and desktop software.

And if you want to expand a little more on your Autodesk ReCap desktop skills, then why not also check out some of the tutorials on the learning pages.