Wview is weather station software used to collect, store, process, and publish weather data from a hardware-based weather station. Wview is open source and Unix-based, so there are versions for Linux, BSD, Mac OS X, and more.

The NetCodger has been running wview on an old Linux PC for about three years, collecting data from a Davis Vantage Vue weather station. Wview has proven to be very powerful, reliable, and very customizable, though customization does take a lot of work.

In the course of testing the newly released Raspberry Pi 2, the NetCodger sought to move his present wview installation over to a Raspberry Pi 2 to see if the Pi was up to the task, not to mention to free up some office space and reduce the electric bill. As it turns out, the Raspberry Pi 2 can run wview with ease. CPU and memory use were minimal on the Raspberry Pi 2 while running wview.

Wview, like so many other software packages, assumes that it will be run with an Apache 2 webserver. The NetCodger is confident that the Raspberry Pi 2 could manage wview and Apache 2, but he feels that the memory use and overall “weight” of Apache 2 are unwarranted on a resource-constrained device like the Raspberry Pi, so he opted to replace Apache 2 with the Lighttpd webserver.

Unrelated to the webserver, there were performance issues on the Raspberry Pi 2 with a feature of wview that regenerates Hi/Low data records at startup if the Hi/Low database is missing or corrupted. Hi/Low regeneration can take a long time on any system, and the wview documentation warns of this, but on the Pi it was taking more than 30 seconds per week’s worth of records. This meant that regenerating a three-year data set of over 200,000 records took well over an hour on the Raspberry Pi 2. That’s a very long startup. The need for such regeneration shouldn’t arise often, if ever, but it has been known to happen, and this lengthy process is a cause for codgerly concern.

In an effort to reduce disk writes on the Raspberry Pi’s micro SD flash card, and in hopes of speeding up wview’s Hi/Low regeneration, the NetCodger implemented a RAM disk and set up wview to store the Hi/Low SQLite3 database (wview-hilow.sdb) and some other frequently changed files there. He also wrote an init script to save the data to persistent storage on shutdown and load it back on startup. This significantly reduces disk writes to the micro SD card, but regeneration performance was not improved, which perplexes the NetCodger.
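For reference, a minimal sketch of such a RAM disk setup. The mount point, tmpfs size, and wview archive path below are assumptions for illustration; check the paths on your own installation before copying anything.

```shell
# Create the mount point and add a small tmpfs entry to /etc/fstab.
# Mount point, size, and wview paths are illustrative placeholders.
sudo mkdir -p /var/ramdisk
echo "tmpfs /var/ramdisk tmpfs nodev,nosuid,size=32M 0 0" | sudo tee -a /etc/fstab
sudo mount /var/ramdisk

# Relocate the Hi/Low database to the RAM disk, leaving a symlink behind
# so wview still finds it at the original path.
sudo service wview stop
sudo mv /var/lib/wview/archive/wview-hilow.sdb /var/ramdisk/
sudo ln -s /var/ramdisk/wview-hilow.sdb /var/lib/wview/archive/wview-hilow.sdb
sudo service wview start
```

A save/restore init script, as described above, is still needed, since everything on the tmpfs disappears at power-off.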

Overall, wview is a fine choice for weather station software, and it runs very well on the Raspberry Pi 2. If you’d like to try it for yourself, the NetCodger has provided the following Bash script to quickly update a new Raspbian OS install, install wview, and create the RAM disk and startup environment for wview. There’s also a second, optional script that can be used to import data from an existing wview installation, should you choose to retire your existing system. To use it, save it in a file on your Raspberry Pi, make the file executable with sudo chmod +x installWview.sh, and then run it with sudo ./installWview.sh

The Raspberry Pi series of credit-card-sized computer boards comes with a full-sized HDMI output interface. It’s pretty great, and with the built-in GPU hardware acceleration it works very well, even when playing 1080p video content.

But there are endless cases where you might choose to run the Raspberry Pi as a “headless” server. This means running without any kind of monitor attached to the Pi and, for that matter, usually no keyboard or mouse either.

Once Raspbian is installed, running your Pi headless is no problem at all. Any interaction that you need to perform can be done via an SSH session from another computer with a monitor, keyboard, and maybe even a mouse. That’s great! Right up to the point that you decide you need to re-install Raspbian or want to install one of the other OS options. At that point, you need to connect an HDMI display, a keyboard, and so on to your Pi. Alternatively, you may prefer to pull the micro SD card and re-image it from another computer. But there may be an easier way.

The Raspberry Pi developers have thoughtfully provided an option that allows you to install or reinstall your OS using their NOOBS installer utility. Unfortunately, they don’t seem to document it much. But if NOOBS was used to install your OS, then you can easily install or reinstall an OS remotely on a headless Pi.

To accomplish this you’ll need to make a simple edit of a text file in the NOOBS partition and then connect to the Pi with a VNC viewer application. The first step will be to gain access to your NOOBS partition from the Pi command line in order to edit the necessary file. This might be old hat to Linux veterans, but the NetCodger will go through the motions anyway, beginning right after you’ve logged into the Pi via SSH.

You need to create a directory to serve as a mount point for the NOOBS partition.

pi@raspberrypi ~ $ mkdir n00bs

Next we mount the partition.

pi@raspberrypi ~ $ sudo mount /dev/mmcblk0p1 n00bs/

Then we edit the file.

pi@raspberrypi ~ $ sudo nano n00bs/recovery.cmdline

Inside the recovery.cmdline file, we append keywords to the existing command line so that it looks like this.

quiet vt.cur_default=1 elevator=deadline vncinstall forcetrigger

Notice the vncinstall keyword. This starts a VNC server when the NOOBS utility is booted. The forcetrigger keyword forces the Pi to boot into the NOOBS utility on the next boot, rather than back into Raspbian. This is similar to holding the Shift key on a keyboard attached to the Pi while it initially boots.
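If you’d rather skip the editor, the same append can be done non-interactively. This sketch assumes recovery.cmdline is a single line and that the partition is still mounted at n00bs/ as above:

```shell
# Append the two keywords to the end of the one-line kernel command line.
# Assumes the NOOBS partition is mounted at n00bs/ as shown earlier.
sudo sed -i 's/$/ vncinstall forcetrigger/' n00bs/recovery.cmdline
```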

At this point you’re done. Save the recovery.cmdline file and reboot with the sudo reboot command. Your Pi will then boot up into the NOOBS utility and have a VNC server running. Use your preferred VNC viewer and connect to your Pi’s IP address for a nice graphical OS installer/reinstaller.

After NOOBS has finished and you click the final OK button, it will boot your Pi into the newly installed OS (Raspbian). This ends your VNC session. You must now connect to your Pi again via SSH, this time to the newly installed OS, and you should probably run the Pi’s initial configuration utility, sudo raspi-config, to set your internationalization settings and timezone.

But, you’re not done yet. Your NOOBS configuration has not changed back and if you restart your Raspberry Pi now, you will be booted back into the NOOBS utility to begin the install process all over again. If this happens to you, simply reconnect via your VNC viewer and click the Exit icon. But to prevent this form happening again, you must remove the changes that you previously made, from the recovery.cmdline file.

Nagios 3 is a popular choice for network host and service monitoring. This open source project can be relatively simple to set up and have working quickly. Compared to more in-depth network monitoring systems, both commercial and open source, Nagios may seem somewhat crude and limited at first. But the beauty of Nagios 3 is the ease of adding service checks or monitors and expanding the system yourself. This gives Nagios 3 a low barrier to entry while still allowing it to grow into very large and complex systems. Nagios 3 is quite powerful, and though it is routinely panned for not being “scalable”, it actually scales well and is used to monitor everything from small networks with just a handful of systems to very large networks with hundreds of nodes and thousands of services.

While more powerful than previous versions, the Raspberry Pi 2 is a low powered computer and I don’t expect it to have the same capabilities of a PC with an Intel i5 or i7 processor with lots of RAM. That’s why, when looking to install Nagios 3 on the Raspberry Pi 2, I kept a wary eye on dependencies that might impact the Raspberry Pi 2’s performance.

The biggest consumer of computing resources in the Nagios 3 stack is the de facto use of the Apache 2 web server. The Raspberry Pi 2 is indeed capable of running Apache 2 very well, especially for a low number of concurrent connections. But Apache 2 is just not needed for serving something like Nagios 3. For this reason, the NetCodger chose to eschew Apache 2 and use the much leaner Lighttpd webserver. Lighttpd consumes a tiny fraction of the computer’s resources compared to Apache 2 and is a great choice for this use case.

There are many how-tos about getting Nagios 3 running as well as several that are Raspberry Pi specific. Since this installation was to be an experimentation and testing project, the NetCodger anticipated numerous reinstalls and wrote the following Bash script to automate the installation process.

The following script takes a fresh installation of Raspbian, updates it and the Raspberry Pi firmware, and installs and configures Nagios and Lighttpd. Since Raspbian is a Debian system, the default mail system used for Nagios notifications is Exim 4. The NetCodger prefers to work with Postfix, so this script excludes Exim and installs Postfix instead. Upon completion of the script, one simply configures Postfix as needed and can then begin the standard Nagios setup and configuration of checks. Alternatively, if you are migrating from another system, you can easily copy over the check configurations of an existing Nagios 3 installation.
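A rough sketch of that sequence, with package names assumed from the Raspbian (Debian 7) repositories; verify them against your release. The ordering trick is that installing Postfix first satisfies the mail-transport-agent dependency, so apt does not pull in Exim 4 when Nagios is installed:

```shell
#!/bin/bash
# Sketch only: package names are assumptions from Raspbian/Debian 7.
sudo apt-get update && sudo apt-get -y upgrade
sudo rpi-update                       # update the Raspberry Pi firmware
sudo apt-get -y install postfix       # install the preferred MTA first...
sudo apt-get -y install --no-install-recommends nagios3   # ...so Exim 4
                                      # (and Apache 2) are not pulled in
sudo apt-get -y install lighttpd php5-cgi
sudo lighty-enable-mod fastcgi-php    # the Nagios web UI needs CGI support
sudo lighty-enable-mod cgi
sudo service lighttpd force-reload
```

Pointing Lighttpd at the Nagios CGIs and web files still takes a short site configuration, which is omitted here.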

Late to the party as always, the NetCodger has been meaning to try out a Raspberry Pi for going on three years. It’s very small and consumes little power, but despite the endless raving about how fabulous the Raspberry Pi is, it doesn’t have much processing power, which limits its usefulness to the NetCodger. That is, until now. The Raspberry Pi Foundation recently released the Raspberry Pi 2.

The Raspberry Pi 2 ups the ante quite a bit. It is presented as being up to 600% faster than the previous Raspberry Pi, thanks to a quad-core 900MHz ARMv7 processor and 1GB of RAM. At $70, after adding the requisite micro SD card, case, micro USB power supply brick, and shipping, I could no longer make an excuse for not giving the Raspberry Pi a try.

I’m all for testing things and playing with new stuff, just because. But I’m more interested in finding out whether the Pi 2 is genuinely useful for more than hobbyist tinkering and supposedly teaching children about the “exciting” world of computer programming. In fairness, the general purpose input/output (GPIO) pin options provided by the Pi are a compelling factor. But my limited needs for such control have so far been better served by the Arduino Nano, so I have no immediate need for this from the Pi, and it has never been a driving force for me.

I do have a few real-world applications that, if the Pi 2 is up to the task, would let me reduce the power and physical space presently consumed by existing machines running network monitoring software (Nagios 3), a personal weather station, and a custom-built webcam streamer/recording script. I don’t really expect the Pi 2 to manage the latter, but if it can actually manage stitching JPEGs together into H.264 videos in a timely fashion, then the NetCodger will be really impressed.

Now that I’ve received my Raspberry Pi 2, I’ve played with installing and reinstalling the Raspbian (Debian 7 “Wheezy”) OS. I’ve quickly become comfortable with this tiny PC, and I’ll start testing for my applications next. It really feels like any Linux PC at this point; using Debian was a very good choice.

I’m still of the mind that there are significantly better hardware platforms than the Raspberry Pi 2 for about the same price, or just a little more. Devices such as cheap smartphones and tablets running Android have similar processing power to the Raspberry Pi 2 but also offer built-in battery power, a touch screen, camera(s), WiFi, Bluetooth, and more, all in a form factor that is even smaller than the Pi. However, I’m not aware of anyone who has figured out how to easily run Debian or any other Linux distribution natively on them (rather than as slow and limited chrooted apps) the way you can with the Raspberry Pi. The Raspberry Pi is an open platform by design, while Android devices are still very much closed hardware.


Looking at the Net Codger WordPress blog stats, I see that there is a statistic for clicked links. This represents links on your blog that readers have clicked. But, some of the Codger’s click stats don’t make sense.

A link in a web page is essentially an address (URL) of another page. When the reader clicks the link, his browser (the client) makes an HTTP request to the server at the specified address. The connection is direct, from the client to the server. The server records the request in its log files, and that log record is counted as a hit. The client also provides the server with the URL of the referring page, which the server records in the log as well.

When you click on this link to Google, your browser directly retrieves the Google home page. Google’s servers log the request as well as the fact that it was this Net Codger page that referred you. This is a click.

Since Google’s server receives and responds to the request, it is expected that the Google servers would have a record of the click. But since your client goes directly to the Google server, the transaction should not interact with WordPress in any way. So, in theory, WordPress should have no knowledge of this click or any other request/click to a non-WordPress site. Yet WordPress is somehow recording clicks to third party URLs. How is this possible?

Some sites accomplish this kind of click tracking by wrapping the final destination URL in a wrapper URL that sends the request first to their site, which records the click and then redirects the client to the third party site. Google does just this so that it can record clicks from its search results.

http://www.google.com/url?q=https://netcodger.wordpress.com/&sa=...

Above is a link to the Net Codger’s blog from a Google search page. We can see that the URL in fact points to Google, whose servers then redirect your browser to your favorite blog.
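Under such wrapping, the real destination is just a query parameter, and it can be pulled back out with standard shell tools. The URL below is a made-up example in the same shape as Google’s:

```shell
# Extract the q= parameter (the real destination) from a wrapper URL.
# The example URL is illustrative, not a real search-result link.
url='http://www.google.com/url?q=https://netcodger.wordpress.com/&sa=X'
dest=$(echo "$url" | sed 's/.*[?&]q=\([^&]*\).*/\1/')
echo "$dest"    # → https://netcodger.wordpress.com/
```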

But I don’t see evidence of WordPress doing this. Certainly they are not doing such URL wrapping/redirection consistently. So how is WordPress recording clicks of links to third party sites? The Codger is perplexed. There is skulduggery afoot!

First, let me point out that these days a lot of malware hijacks your DNS settings. Some malware places entries in your C:\Windows\System32\Drivers\Etc\hosts file; make sure that there is nothing in there besides the loopback address 127.0.0.1. Other malware changes the DNS servers that your system uses. Since most networks assign IP addresses and DNS servers dynamically using DHCP, if you see manually entered DNS server settings that you didn’t create, you’re likely infected. In either case, clean off the malware and reset the DNS settings to their defaults, typically Obtain DNS server address automatically.

But if your system is clean, and it is indeed the ISP that is feeding you undesired search pages for mistyped URLs, or your system is being slowed by delays in DNS responses, you have a few options. One is to run your own DNS server on your local system or local area network (LAN). If you’re up to that challenge, you’re probably not reading this, so I won’t get into it here. But if, like most people, you have your PC directly connected to your ISP’s “modem” or through a broadband router, it is simple to configure an alternate DNS server.

Which alternate DNS server should you use? That question depends on personal preference, location, ISP and who knows what else. There are several publicly available DNS servers that you could use.

Google
8.8.8.8
8.8.4.4

openDNS
208.67.222.222
208.67.220.220

Verizon
4.2.2.1
4.2.2.2
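Before committing to any of these, it’s worth checking that a candidate server answers quickly from your own connection. The nslookup tool (present on Windows, Linux, and OS X) makes this a one-liner; the hostname queried is just an example:

```shell
# Query Google's public resolver directly for a test name. A fast,
# correct answer means the server is reachable from your connection.
nslookup netcodger.wordpress.com 8.8.8.8
```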

Directly Connected Hosts
On Windows 7 or Vista, go to Control Panel and click Network and Internet → Network and Sharing Center → Change adapter settings. Then right-click on Local Area Connection and choose Properties. Select the Networking tab and then select Internet Protocol Version 4 (TCP/IPv4). Once that is highlighted, click Properties.
Now, click Use the following DNS server addresses and enter a pair of the addresses from above in the Preferred and Alternate DNS server fields. I like to use one address from each of two different services at the same time.

Broadband Router
The instructions for broadband routers will vary depending on which brand/model you are using. One of the best selling brands is Cisco/Linksys so, I’ll demonstrate that one here.

Login to your router’s administration page. With Cisco/Linksys, this is done by pointing your web browser to http://192.168.1.1 and using the user ID admin and the password admin.
Under the DHCP server settings, enter the DNS server IP addresses that you wish to use and click Save. Now, close your browser.

On your PC, open a command prompt and type ipconfig /renew to immediately pull the new configuration from the router. And that’s it: you’ll now use the chosen servers to resolve your DNS queries, rather than those provided by your ISP.


While putting a Western Digital My Book Live through its paces, I needed to backup a Linux system to the My Book Live which functions as a NAS. Because it took me more than just a few minutes, I thought I’d share my backup script, and the reasoning behind it, so that others can get going more quickly.

Using the flexible My Book Live, there are a lot of ways one could back up a Linux system. You could use tar and save the backups to a network share, right out of the box. But most Linux admins prefer Rsync for backups these days. Again, because of the My Book Live’s flexibility, you could install the rsync daemon on the My Book Live and have it receive your backups, or you could install Rsnapshot on it and have it act as a backup server, pulling the backups off of your Linux systems.

When one updates the “firmware” (WD’s Debian install) on the My Book Live, it wipes out any user-installed apps and most, though not all, configurations, returning it to a factory-vanilla NAS. The Net Codger fully intended to mess with his My Book Live and would almost certainly need to restore it to factory defaults, so for this backup scenario he did not want to modify the My Book Live. This eliminated the rsync daemon, Rsnapshot, etc. However, since the SSH configuration is maintained across firmware updates, rsync via SSH was a perfectly viable option. So the following instructions show how to push rsync backups to the My Book Live via SSH. The script also rotates the backups and uses file system hard links to maintain numerous full backups that take only seconds to run each day while consuming minimal disk space.

The first step is to enable SSH on the My Book Live. Western Digital even provides a GUI screen that allows you to enable this service, but you have to enter the URL yourself. First, log in to the web interface at http://mybooklive
After you’ve been authenticated, enter this case-sensitive URL http://mybooklive/UI/ssh and tick the Enable SSH check box. You can now log in via SSH and change the root user’s password with the passwd root command. Great job, WD.

A few simple commands and you’re done. From now on, simply typing ssh root@MyBookLive securely logs you in with no password. This is important when you want to use SSH in bash scripts, which is exactly how I’ve chosen to do the backups.
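The few commands in question are the standard SSH public-key setup; a sketch, assuming an OpenSSH client and the default MyBookLive hostname:

```shell
# Generate a key pair if you don't already have one (accept the defaults;
# an empty passphrase lets unattended cron jobs use the key).
ssh-keygen -t rsa -b 2048

# Copy the public key to the NAS; enter root's password one last time.
ssh-copy-id root@MyBookLive

# Verify: this should run the remote command without prompting.
ssh root@MyBookLive hostname
```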

The following script keeps a 30-day rotating backup of my excessively large (135GB) home directory on the My Book Live. But thanks to rsync’s use of file system hard links, 30 full backups occupy less than 300GB, and nightly backups take only seconds to complete.

#!/bin/bash
#
# A backup script based on Rsync that pushes backups onto a NAS.
#
# Directories are rotated n times and rsync is called to
# place a new backup in 0.days_ago/
# Net Codger https://netcodger.wordpress.com 4/17/2012

# Adjust these to your setup; the destination path is a placeholder.
SrcDir=/home/NetCodger          # directory to back up
NAS=mybooklive                  # hostname or IP of the My Book Live
DestDir=/DataVolume/shares/backup   # destination path on the NAS
Days=30                         # number of daily backups to keep

# Rotate the existing backups on the NAS: discard the oldest, then
# shift each n.days_ago directory up by one.
ssh root@$NAS "cd $DestDir && rm -rf $Days.days_ago
  for i in \$(seq $(($Days - 1)) -1 0); do
    [ -d \$i.days_ago ] && mv \$i.days_ago \$((i + 1)).days_ago
  done
  mkdir -p 0.days_ago"

# Run the Rsync command. Nice is used to prevent Rsync from hogging the CPU.
# --link-dest creates hard links so that each backup run appears as a full
# backup even though it only copies files changed since 1.days_ago
nice rsync -av \
  --delete \
  --link-dest=../1.days_ago \
  $SrcDir root@$NAS:$DestDir/0.days_ago/

Simple and secure, storing 30 backups in roughly 1/15th of the space. This is a good backup script. Save it wherever you like; I like /home/NetCodger/backup.sh. Don’t forget to make it executable:
chmod +x /home/NetCodger/backup.sh

The only thing left is to use cron to make the script run each night. Type crontab -e and add a line like the following.
0 1 * * * /home/NetCodger/backup.sh > /tmp/backup.log 2>&1

This runs the backup script every morning at 1:00am and redirects the output to a log file in the /tmp directory. I like to do this just in case there is some problem with the backup script that I’d like to troubleshoot.

Now, contrary to the all too common bragging about Rsync, Rsync is actually quite slow, and Rsync over SSH is slower still. So the initial run of this script could take a very long time, depending on how much data you are pushing; the Net Codger’s initial 135GB backup took a ludicrous 5 hours! But all subsequent backups consist of only the files changed since the previous backup, which is rarely more than a few gigabytes, so subsequent Rsync-over-SSH backups take from a few minutes to as little as a few seconds. Last night’s run completed in 47 seconds.

That’s it, nightly Rsync backups over SSH to the WD My Book Live, or any SSH enabled storage that you choose.