Category: Home Data Center

I suspect some SEO will drive the wrong audience to this post with the above title. NAS Hot-Swap Love is all about being able to change the drives in my Synology Network Attached Storage (NAS) servers without powering them down. The act of doing this while the power is on and the units are still running is called a “Hot-Swap”.

As I mentioned in a previous article, I run my home network with two Synology NAS units for backup and media storage. These units are great. Between the two of them, I have 30TB of disk in RAID-5 configuration. They would be even more valuable if I had anything more impressive than old racing videos to store on them. You can only take so many pictures of your cats – no matter how good looking they are.

Right before I relocated to Virginia in July 2019, my first Synology DS1515+ died. Synology replaced it at no charge under warranty. I removed the drives, numbered them, and shipped them to VA, and had Synology send the new system to VA as well. When I put it all back together, it actually worked! Which was great news, until the first drive died.

I shut it down, ordered a new replacement drive, and waited for it to arrive. Upon arrival, I replaced the drive, rebuilt the RAID set, and yes, all my useless files were still available. This again went smoothly for a month until the second drive died. I knew what that beeping sound was this time. However, I ordered two new drives – as something told me these drives would start failing one after the other…

Last month, the third drive failed. I became emboldened! I took the drive out while the unit was still on, put the new drive in the housing, and reinserted it. It allowed me to rebuild the RAID set and I never was offline for one second. This was NAS Hot-Swap Love at its very finest. I did not play opera music while performing the operation – but I could hear it faintly in my head.

Alas, I realized I had exhausted my supply of 3TB drives – and had never stocked any 6TB drives for my new DS1517+ unit. As the picture above shows, I now have a supply of two 3TB and two 6TB drives waiting for the next drive failure. I have never felt so prepared for NAStiness.

Since JFreeChart comes as a Java JAR file, I decided to go ahead and buy the documentation for about $55 USD. The first thing I noticed was that the PDF was for the 1.0.19 version and not the current 1.5 version. This was worrisome. However, David’s blog pointed out that the documentation purchase also gave you a complete sample set of charts in a JAR file, and instructions to run it.

Between the documentation and the sample set of 300 graphs, it was fairly easy to make quick progress. The documentation gave me enough clues to make a Java Servlet that created a PNG with a summary chart. There is a sample app that lets you click through the 300 sample graphs and charts to see various features: chart type, legend manipulation, axes control, titles, subtitles, etc. Combining these two elements, I was able to pull together four graphs of three different chart types fairly quickly. My first chart follows.

Disadvantages of Graphs and Charts with JFreeChart

As previously mentioned, JFreeChart is not totally free if you want the documentation and the sample charts. But to me, they are totally worth $55 USD. Also as mentioned, the documentation is not current for version 1.5 – but it was good enough in the sections I needed to help me create Java Servlets that produced a PNG image. The sample charts are well worth the money and helped me craft custom charts such as the bar chart below.

It would have been nicer if the documentation were up to date. It would also make me feel a bit better if David had updated the project after 2017. But it works great and is mostly free. David also has newer SVG graphs, JS graphs, and a whole new charting library. But this one worked well enough for marrspoints.com.

Conclusion of Graphs and Charts with JFreeChart

Hopefully this short article has helped you see that you can build some free, Java-based graphs and charts with JFreeChart.

I received this unit in December 2019 with Ubuntu 18.04 LTS installed. The Gnome 3.28 release running on the system did not support display “Fractional Scaling”, which made some applications much too small (or, in 200% mode, much too big) to read and use well. Some research showed that a newer version of Gnome supports Fractional Scaling with 125%, 150%, and 175% options available. The Ubuntu 20.04 LTS release includes Gnome 3.36, which does support Fractional Scaling.

Back ports of the Hades Canyon NUC

Right now Ubuntu 20.04 does not offer direct upgrades from the 18.04 LTS release; those arrive with the 20.04.1 release. I downloaded the ISO, ran it under virtualization, and can confirm the Fractional Scaling option exists!
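In case it helps anyone: on 20.04 Fractional Scaling shows up under Settings → Displays, and it can also be toggled from a terminal via GNOME’s experimental-features key. This is a sketch assuming stock GNOME 3.36 on Ubuntu 20.04 (the X11 key comes from Ubuntu’s mutter patch); pick the line that matches your session type.

```shell
# Enable GNOME fractional scaling (adds the 125%/150%/175% options).
# On a Wayland session:
gsettings set org.gnome.mutter experimental-features "['scale-monitor-framebuffer']"
# On an X11 session (Ubuntu-specific key):
gsettings set org.gnome.mutter experimental-features "['x11-randr-fractional-scaling']"
```

Log out and back in afterward for the new scaling options to appear in Settings.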

I will soon be moving my development environments over and moving from the Ubuntu Unity desktop to Gnome in this high-powered mini-PC with 64GB of RAM, a quad core i7 processor, and amazing video power.

I have moved around the United States many times while pursuing my career in business and technology. My relocation plans have taken me to Chicago, three different homes in the Bay Area of Northern California, Seattle, Northern Virginia, back to Seattle, over to Minneapolis, and now splitting my time between Blacksburg and Arlington, Virginia.

My trips used to include packing hundreds of pounds of books, audio CDs, computer CDs, etc.

My time at Digital River allowed me to see how they built an early part of their business, which leveraged high bandwidth speeds to obviate the need for software CDs. We all just download our software now and would not think of lugging boxes of software CDs.

Back-up CDs are also a thing of the past. I have my two Synology NAS devices, which slowly back themselves up to the AWS cloud.

Mr. Steve Jobs made audio CDs a thing of the past with iTunes. Most kids have no idea what a record player or even a cassette tape is. It felt good throwing away all those boxes of audio CDs.

I was browsing my Audible library this morning. Yep, all of my books are in there. Even the books I read in print are in my Kindle app.

Being a caveman

So what is wrong with curl? Nothing. But Postman (at getpostman.com) is simply one of the best tools I have used while developing code that consumes APIs. This is another case where I was using caveman tech (curl) to do a job so elegantly managed by a service that makes a desktop app that runs on Linux, macOS, and Windows (and syncs across them).

Even a stealth API…

I am now working on a stealth start-up idea with an even more stealth cohort of mine in the financial space. The data company we have tentatively selected (and their API documentation) pointed me to Postman. It is awesome. I have deeply tested the financial access, accounts, instruments, etc. This was accomplished on my own accounts in only a couple hours of work and research. Postman is scriptable, has variable replacement, etc. Oh, and the best part: a single developer license is FREE. My favorite price.

To think Sam Morris at Digital River talked about Postman dozens of times. It never occurred to me to go look at it. That cost me a lot of wasted time. Especially since I know Sam is “the man”. Thank you Sam – the second time I heard of it, I knew to go get a copy and learn it quickly.

Getting my latest NUC

I am pretty psyched to get my latest Intel NUC. The NUC7i7DNKE has an 8th generation, quad-core Intel® Core™ i7 vPro™ processor (4.2 GHz Turbo) with 32GB of DDR4 2400 MHz RAM and a 1TB SSD. Not to mention built-in 4K UHD video with HDMI ports and USB 3.0.

My home data center NUC cluster

I will use this as my main development machine. It is crazy that I tend to run out of RAM on my 16GB machines running Ubuntu.

This will be my 9th NUC. Maybe I am a little too in love with these things. They make great clusters for home research and development on distributed technologies such as Cassandra and Hadoop. I have three nodes running Cassandra and Hadoop today – and am looking to add a 4th node when I free up my current development machine NUC.

Quiet, Low Power, great for clustering!

They are whisper quiet and use very low power. There are 5 in a stack sitting on my desk next to me as I write this, and they make less noise than a single standard PC. In fact, they seem to make no noise at all.

I also run Windows 10 on one as a home theater type of PC connected to a Samsung UHD TV via HDMI. These NUCs are awesome. I gave my old i3 core media NUC to my younger brother as a gift.

Here is an old picture of my early stack of NUCs. They are each 4″ x 4″.

New Blog along with some old content

As a past media executive at companies such as CNET Networks, Microsoft’s MSN, AOL and the early social network Classmates.com, I have operated a blog here and there over the years. Mostly to test out SEO ideas and cross link my sites, etc.

Started on LiveJournal in 2004

One of my unfortunate SEO decisions was using LiveJournal.com for my tech postings. In 2004, as CTO of CNET Networks, I was fortunate enough to meet Brad Fitzpatrick, who invented LiveJournal (as well as memcached). Since we made a (failed) bid to buy the site, I decided I should use it and get to know it a bit. I used it to blog about some of my non-proprietary experiences with technology and software from time to time.

My last post there was almost two years ago to the day. I was musing at the intersection of my auto racing hobby and my technology hobby. The lack of automation around my auto racing league’s points standings finally brought these two passions together, all enabled by Open Source, Intel NUC computers (home data center), and Amazon’s AWS hosting, resulting in the creation of the marrspoints.com race points tracking web application.

LiveJournal did not seem to get the SEO juice

Compared to modern blogging sites such as WordPress (which this blog is built on), LiveJournal never got the great SEO features that it deserved. Therefore today, I am moving my LiveJournal information over to a new home here at cahall-labs.com. All of the posts have been successfully moved here as of this post.

Open Source and my Home Data Center

I have a few tech topics that are of interest to me. They include:

My home data center evolution

The Open Source operating systems and application software I use at home

Cassandra and Hadoop

The marrspoints.com site was simple to build, but the back-end tools to ingest all of the race data were a lot more work. I occasionally look at ways to change the data ingestion or analytics, so I play with tools such as Cassandra and Hadoop on my NUC cluster in my home data center. In general, I will try NOT to blog about racing in this blog. That will move to a blog at either cahallracing.com or cahall.com.

Thank you LiveJournal – hello WordPress

So thank you to LiveJournal for the tools and time. It was a good 14 year run. There is also an old, outdated racing blog on WordPress; it will likely be moving to a new home in the next month or two. It will be good to get back to using the tool Matt Mullenweg built (WordPress). I had the opportunity to work with Matt at CNET when he spent a year there on his way to becoming famous. Clearly I wish I had made a blog tool. Some day I may even blog about Gavin Hall and Alex Rudloff, who built Blogsmith, which powers TMZ.com and most of the AOL blogs. I guess I met most of the people that built blogs… Very, very smart and talented people.

Linux desktop variations

After playing with Debian and Ubuntu, I wanted to see what the latest in KDE looked like. I have mostly been a Gnome user and had read some interesting tidbits on KDE 4.3 in Linux Journal. I did not want to “pollute” my Ubuntu installation by downloading all of the KDE parts onto it. So I decided to add a Kubuntu partition to my Ubuntu box, as well as test Kubuntu on my 64-bit Windows machines using Wubi.

I was surprised to see that the installers for Ubuntu and Kubuntu are not really from the same code base. The installation on my 32-bit Ubuntu box went off without a hitch. I had a spare drive on it and used that for the new partition. I needed to manually change the partitions with the partition manager so it would leave the old Ubuntu 9.04 and 9.10 versions where they were. Even this was simple and straightforward.

Wubi letdown

I guess my biggest surprise was that Wubi does not install Kubuntu/Ubuntu to run “on top of Windows” as I thought it would. I had thought there was an additional VXLD layer or something that let Linux run as a guest OS on top of Windows XP. This would have been really cool – sort of like Cygwin on steroids. This may sound ridiculous, but a colleague from long ago, Bill Thompson, wrote such a VXLD for Windows back in the mid ’90s that allowed x86 versions of Unix to run on top of Windows.

I searched around the web, Facebook, and LinkedIn to see if I could find Bill. With much digging I found him on LinkedIn. His start-up was called “nQue”. He was also a file system guru who wrote a lot of CD-ROM file system drivers, etc. after the start-up went south.

Needless to say, I think a lot more people might try Ubuntu if that feature could be added to the Wubi concept: running it right on their Windows desktop as an application environment without requiring a reboot. I know Wubi does not alter the Windows partitions, so it is still a fairly painless way to try Ubuntu without risking much. Users can always uninstall it as they would any Windows application if they are not happy with it. I just prefer to rarely reboot my systems if I can avoid it.

Home Data Center Saga continues…

The “home data center” is getting a bit crazy to maintain. It is a good thing I have so much free time on my hands (not). I did finish a couple of the projects on my list last weekend: upgrading one of my P4 3.0GHz “home brew” machines from Fedora Core 3 to Ubuntu, and putting Debian server on one of the “new” used Dell 2850 servers I bought from work. I am now the proud systems administrator of both of these machines – with plenty of fun along the way.

Upgrading from Fedora to Ubuntu

I started the Fedora Core 3 to Ubuntu conversion by making sure all of the applications I had written in Java, PHP, Perl, and sh, as well as the databases in MySQL, had been successfully ported to another CentOS machine and regression tested. That took longer than expected (of course). I had already used BitTorrent (thanks Bram Cohen) to download Ubuntu 9.04. I like their numbering scheme, as even I can figure out how new/old the rev is. I then installed it on my “home brew” Intel motherboard based system. It worked like a charm and I was checking out its slick UI and features within minutes. So far, so good.

Next I decided to see how the graphics worked and whether I could get it into 1920×1280 mode with my 24″ monitor. That was a tad trickier – but I was pleased to see that it went out on the Internet and figured out where to get the latest NVidia driver that supported the video card I had bought years ago. That was slick, and the graphics were awesome. In high-res mode it even adds some transparency to windows and gives them “momentum distortion” as you move them. Not sure how useful that is – but it looks pretty cool.

VNC for graphical UI across machines

I like to sit in my home office and use VNC to access the 7 Linux boxes running in my basement and other rooms (versus running between them to try things). I know that “real systems administrators do not use VNC”, as told to me by one of our real systems administrators at AOL (and CNET years ago). I am not embarrassed to say I am not a real systems administrator! I like the graphical UI access to all of these machines. It makes working on them so much easier with 4 or 5 windows open at a time.

So here is where the rub is. I enabled VNC, ran back to my office, and tried it. No luck. I made sure SSH worked and I could get to the box – that was all set and good to go. I checked that the machine was listening on port 5901 – that was good too. A little snooping in the VNC log file let me know it could not execute the /etc/X11/xinit/xinitrc script. I thought that was odd, but I enabled execute permissions on the file and everything worked.
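The failure mode is easy to reproduce on any script that is missing its execute bit. Here is a stand-in demonstration using a throwaway temp file (in my case the real file was /etc/X11/xinit/xinitrc and the fix was a sudo chmod +x on it):

```shell
# Recreate the xinitrc problem on a throwaway stand-in script.
script=$(mktemp)
printf '#!/bin/sh\necho started\n' > "$script"

chmod -x "$script"                                # the state the file was in
"$script" 2>/dev/null || echo "cannot execute"    # what the VNC server ran into

chmod +x "$script"                                # the one-line fix
"$script"                                         # now prints "started"
rm -f "$script"
```

The VNC log only hinted at this; checking `ls -l` on the script it complained about is the quick way to spot a missing execute bit.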

Upgrading versions of Ubuntu to 9.10

As I performed a routine update of the OS and files, it let me know that Ubuntu 9.10 was now out (as it was past October – month 10 of year 2009). I had downloaded 9.04 a month earlier when I began thinking about the project. A 9.10 upgrade sounded great – so I decided to “go for it”. Bad decision. After the upgrade the video would not work in graphics mode and I could only bring the system up in text mode. Not a big deal for a “real systems administrator”, but definitely not what I was looking for – especially on a desktop machine where I wanted to check out the cool graphics in Ubuntu.

Video driver hell

Since the machine had no real work on it and I did not feel it was worth my time to figure it all out in 80×24 text mode while troubleshooting the X Window System, I simply put 9.04 back on the machine and got it working as it was before the upgrade. This would be my fallback in a worst case scenario. I then used BitTorrent to get 9.10 on a DVD. Ubuntu allows you to install multiple OS versions by partitioning the drive, so I shared the drive between 9.04 and 9.10 and performed the installation. 9.10 came up and worked from scratch – but the video upgrade would not. When I tried to get it to go out and upgrade the video driver as it had in 9.04, it kept telling me that there were no drivers and that the upgrade was not possible. This did not let me use the 1920×1280 graphics mode of the card or monitor.

After playing with the software update tool, I was able to find some NVidia drivers that were available and downloaded those. Once I did, the system finally let me upgrade to the enhanced video mode and use 1920×1280. I am not sure why 9.10 was not able to find these drivers automatically as 9.04 had been, but clearly this was why the upgrade failed when I tried to go from 9.04 to 9.10 “in place”. The VNC issue with xinitrc still existed, and I again corrected that. Project complete!

And on to the Debian upgrades

The Debian 5.0.3 server install for my Dell 2850 proved to be less frustrating – but not without hiccups. I had downloaded the first 3 DVDs for Debian and proceeded to the basement to start the install. That is when I noticed that this 2850 came with a CD-ROM drive and not a DVD-ROM drive! I had already put CentOS on the other Dell 2850 months ago – so I “assumed” that both machines had DVD-ROM drives. Bad assumption…

The nice thing about Debian is that it allows a fairly small “net install” CD to be burned, which then downloads the rest of what it needs as it goes along. So this is the route I chose for the Debian server. From there the install was fairly straightforward. The graphics are nowhere near as nice as Ubuntu’s – but this is a server install, and I don’t have a fancy video card in the 2850 anyway. The VNC issue with xinitrc also exists in this version of Debian – which is no surprise, as Ubuntu is a downstream distribution of Debian. Another project complete, and now I have systems to compare different OS features and issues and keep up with some of the pilot projects we are doing at work to streamline software distribution, etc.

Great work by Brad Allison in creating SUMO, and by the data center and SA teams for pushing its usage. This tool allows AOL to identify underutilized servers and either decommission them – or bundle them up onto virtualized hosts.

It is great to work with dedicated people that are not only smart, but care about their environment at the same time.