Random thoughts

I grew up on science fiction. I loved reading stories of The Future, where space travel was commonplace, where all energy was generated cleanly, and where we worked side-by-side with machine intelligences to accomplish tasks.

Of course, that wasn’t what most of the books were about; space travel, clean energy, and A.I. were just part of the background. And many of the books I read had issues– most of them failed to properly predict the role that women play in society (or often failed to credit them at all, aside from being a love interest for the hero), quite a few were somewhat racist, and all of them glossed over problems of everyday life that couldn’t simply be resolved with technology– but in the background areas of space travel, A.I., and clean energy, they were quite prescient.

“But Ryan, none of those have happened yet,” I hear you saying. And you’re right, none of these have yet reached their full potential. But in a world where SpaceX, Blue Origin, Rocket Lab, Virgin Galactic, and so many others are all working to bring affordable spaceflight to the masses (and getting pretty darn close), I think it’s safe to say that space travel will soon1 become far more commonplace than it is today. We won’t be living in Star Trek, but costs will come down and won’t be the barrier they are now.

And in a world where renewable energy continues to make great strides, I think it’s safe for me to predict that we are well on the way to moving the bulk of our electrical generation away from fossil fuels– there are still problems to solve with electrical transmission and energy storage, but we keep moving closer to a solution. As more companies compete for a slice of the renewable energy pie, prices will go down, efficiency will go up, and an increasing percentage of electrical generation will come from renewable resources. Energy will increasingly be stored in batteries of one kind or another, whether chemical (such as Tesla’s Powerpacks), mechanical (such as pumped hydro or compressed air), or thermal (such as molten salt paired with a concentrated solar tower). Those batteries will allow more and more of the baseload supply of energy to come from renewables, reducing dependence on fossil fuels and paving the way for them to be slowly phased out.

Lastly, we already work incredibly closely with machine intelligences, though that intelligence isn’t quite what was predicted (at least, not yet). Work is increasingly done on computers with incredibly complicated error checking algorithms, designed to reduce mistakes that cost money and lives. Our data is sorted and analyzed by powerful computers that can correlate information thousands of times faster than we can, letting us build more complicated and accurate models to help us learn even more about the world we live in. And most impressively (for me), we can now literally talk to our computers and ask them to do things for us. I know that the first two examples are far more impactful to society, but I still can’t quite believe that I can just talk to my Google Home, have it correctly interpret what I’m asking for, and then give me an answer that makes sense and fulfills my need. It still feels like magic.

The science fiction writers I read in my youth may have written about a world that was full of more straightforward adventure than the one we live in today. But what our world lacks in swashbuckling heroes with laser swords, it more than makes up for with the amount of sheer technical wonder that we have at our fingertips each day. Fifty years ago, for me to write and publish this article would have required a bulky typewriter, a mimeograph machine to create copies of the typed article, a bunch of paper, and a stapler to staple my article up on random light poles for people to read. Today, I wrote most of this post on my Chromebook connected to the hotspot on my phone, while waiting for my car’s inspection to be finished. To publish it, I hit the big blue “publish” button at the top of the page, and trusted it would be sent out to your screen when you wanted to read it. The amount of time and effort saved by modern technology for this article alone could probably cover my grocery bill for a week. And if that’s not evidence of us living in the future, I don’t know what is.

I host a few websites for myself and family on DigitalOcean. Up until recently, I’ve always just spun up a new droplet for each site, so they were all fully independent from each other; this was the easiest and most convenient way to get a new site up and running without jeopardizing uptime on other sites if I made a mistake in configuration, and it was drop-dead easy to map a domain to a static IP. It had some security benefits, too– if one site was compromised, it wouldn’t affect the rest.

But it was also maintenance-intensive. I needed to log in to multiple servers to run updates; adding a plugin had to be redone over and over on each server; and it was starting to get expensive. So I decided to consolidate my multiple sites on one server, using a fancy feature of WordPress called… “Multisite”. Imaginative name, I know.
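For anyone curious, flipping a normal WordPress install over to Multisite starts with a single constant in wp-config.php; this is the standard WordPress mechanism, and the rest of the setup lives in the dashboard under Tools → Network Setup:

```php
/* In wp-config.php, above the line that reads
   "That's all, stop editing! Happy blogging." */
define( 'WP_ALLOW_MULTISITE', true );
```

Once that’s in place, WordPress walks you through the network setup and hands you a few more constants and .htaccess rules to paste in.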

The initial configuration went well, with no real hiccups (other than my accidentally rm’ing most of Apache’s configuration files– but a quick droplet rebuild took care of that1). The trouble started when I had moved over the sites I was consolidating, and switched the domains to point at my new Multisite server. I spent two hours trying to figure out why one of the domains refused to point at the new server, only to discover (drumroll, please)… it was DNS. I use Pi-Hole on my home network to block malicious sites, but it also provides a DNS caching service which usually works great. In this case, however, it was pointing me back at the old server over and over, until the TTL finally expired2. A quick flush of the DNS cache, and I was able to see that the domain was correctly configured. Fifteen minutes later, I had SSL up and my plugins configured.
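If you ever want to watch a cutover like this happen, a few lines of Python will show you when your resolver (Pi-Hole or otherwise) finally lets go of the cached record. A minimal sketch, with the domain and IP as placeholders:

```python
import socket
import time

DOMAIN = "example.com"    # placeholder for the migrated site
NEW_IP = "203.0.113.10"   # placeholder for the new droplet's IP

def current_ip(domain: str) -> str:
    """Ask the system resolver (Pi-Hole, in my case) for the A record."""
    return socket.gethostbyname(domain)

def wait_for_cutover(domain: str, new_ip: str, interval: int = 60) -> None:
    """Poll until the cached record expires and the new IP shows up."""
    while current_ip(domain) != new_ip:
        print(f"still seeing {current_ip(domain)}; waiting for the TTL to expire")
        time.sleep(interval)
    print(f"{domain} now resolves to {new_ip}")
```

In my case, running `pihole restartdns` flushed the cache immediately rather than waiting out the TTL.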

So what’s the lesson in all this? Even when you think it’s not DNS… it’s DNS.

Yes, I could have restored the configuration without too much difficulty, but I was early enough in the build that it was faster to just start over.

I did set the TTL to a very low number when I started this process, but the lower value didn’t take effect until the original, longer TTL had expired.

I’ve finally moved to a VPS on DigitalOcean, from my previous (free) shared hosting. I did this for a couple of reasons: first, while my hosting was free for a year with my domain name, that year was almost up. To renew my hosting for the second+ year, I would have needed to pay $38.88/year; while that’s a decent price, I looked at my options and decided that moving to DigitalOcean wouldn’t cost much more (around $30 more across the year, since I use the weekly backups option), would give me much more control over my server (now I get SSH access!), and would centralize all of my VPS instances in the same place (I’ve used DigitalOcean for several years to host various projects).

Of course, as with so many things, this migration wasn’t sparked by a simple glance at the calendar. While I’ve intended to move my host for the last month or two, the timing was decided by my messing up a WordPress upgrade on the old site at the beginning of December. I used the automatic updater, ignored the warnings about making sure everything was backed up first1, and told it to apply the new version. When WordPress exited maintenance mode, I was locked out of the administration dashboard. The public part of the website was still up and running, but the backend was locked off. Since I was entering finals week at my university, I decided to just let it be until I had some time to come back and fix it. Worst-case, I had backups I could restore from, and I’d been meaning to migrate my site anyway.

Of course, things didn’t work out that way. When I finally had some time on Christmas Eve, I discovered that a complete backup hadn’t been made in months.

Turns out, if you don't verify that your backups are working properly, they might not be a viable restore medium.

Yes, I committed the cardinal sin of not verifying the state of my backups. Apparently I’d screwed something up in their configuration, and since I’d never tried to restore from them, I didn’t notice until I needed them. At that point, I decided that if the backups weren’t working, there was no point in trying to recover on a host I was going to abandon within a month anyway, and I spun up a WordPress droplet on DigitalOcean to hold the rebuilt site.
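The ironic part is that the check that would have saved me is tiny. A sketch of the sanity check I should have had running, assuming backups land somewhere as tar archives (the threshold and paths are made up):

```python
import os
import tarfile
import time

MAX_AGE_DAYS = 7  # how stale a backup can get before we should worry

def backup_ok(path: str, max_age_days: int = MAX_AGE_DAYS) -> bool:
    """Return True if the backup exists, is a readable tar archive,
    and was written recently enough to be useful."""
    if not os.path.exists(path):
        return False
    age_days = (time.time() - os.path.getmtime(path)) / 86400
    return age_days <= max_age_days and tarfile.is_tarfile(path)
```

Of course, the only real test of a backup is actually restoring from it, but even a check this simple would have caught months of missing archives.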

But I've been meaning to move my hosting over to a VPS on @digitalocean for a while now, and this is a perfect opportunity to do that.

I still had copies of all the content that was on the site, so I’d be able to restore everything without much trouble– just some copying, pasting, and time. But before I did all of that, I thought “what if I’m overlooking something really simple with the old site?” I did a little searching, and it turns out that W3 Total Cache, which I used to create static pages for my site and decrease load times, can cause problems with WordPress upgrades. I disabled it via FTP2, reloaded the site, and I was able to access the admin area again. Turns out the simple steps you should take before completely rebuilding everything are actually worth it.
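For anyone in the same spot: disabling a plugin without dashboard access just means renaming its directory, because WordPress deactivates any plugin whose files it can’t find. A sketch of doing that over FTP, with the host details and paths as placeholders:

```python
from ftplib import FTP

PLUGINS_DIR = "/public_html/wp-content/plugins"  # hypothetical path on the host

def renamed(plugins_dir: str, plugin: str) -> tuple[str, str]:
    """Old and new paths; WordPress silently deactivates a plugin
    whose directory disappears."""
    src = f"{plugins_dir}/{plugin}"
    return src, src + ".disabled"

def disable_plugin(host: str, user: str, password: str, plugin: str) -> None:
    src, dst = renamed(PLUGINS_DIR, plugin)
    with FTP(host) as ftp:
        ftp.login(user, password)
        ftp.rename(src, dst)  # e.g. w3-total-cache -> w3-total-cache.disabled
```

Renaming the folder back re-enables the plugin’s files, though WordPress leaves it deactivated until you turn it on again.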

And of course, while I move from one host to another, I solved my original problem. C'est la vie.

Since I had already spun up and started configuring my new site, I decided to press onwards. Being able to access WP Clone on the original site made the job considerably easier, letting me move everything from my old site to the new one in just a few minutes. I redirected the nameservers to DigitalOcean and ran a few last checks before calling the bulk of my work done.

The next day, when I was tidying up some loose ends and preparing to get SSL set up, I realized that my email no longer worked– it had been hosted on the same server as my old website, which meant I needed to find a new solution.

And we're mostly back online. Email is still acting up, but we're close to being done.

While I have been meaning to set up my own email server sometime soon, I wasn’t confident in my ability to get it up and running quickly, and email is one of those vital services I depend on working 100% of the time. In years past, I would simply have used Google Apps3 to host my email, but that is no longer the free option it once was. Luckily, I found a solution thanks to Ian Macalinao at Simply Ian, which is to use Mailgun as a free email server. Mailgun is designed to send out massive email blasts for major companies, but it also offers a free tier for people and companies sending fewer than 10,000 emails per month. I send a fraction of that number, so this was perfect for me (and their mass email prices seem quite reasonable, so I might even use them for that if the need ever arises). Ian handily provided a set of instructions for setting up the proper routing, and, while some of the menu options have changed, I was able to get my new email up and running within a few minutes.
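For the curious, the routing itself is just DNS. The records look roughly like this (the MX and SPF values are Mailgun’s published ones; the DKIM record is generated per-domain in their dashboard, so it’s elided here):

```
; Let Mailgun receive mail for the domain
example.com.    MX   10  mxa.mailgun.org.
example.com.    MX   10  mxb.mailgun.org.

; Authorize Mailgun to send on the domain's behalf (SPF)
example.com.    TXT  "v=spf1 include:mailgun.org ~all"

; DKIM: a TXT record copied from the Mailgun dashboard
```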

Well, I have email up and running, but ssl still isn't working. Going to give the letsencrypt tool another shot before trying something else

So I’d managed to get both the site and my email up and running, but SSL still wouldn’t cooperate. For those that don’t know, SSL stands for Secure Sockets Layer (these days, technically, its successor TLS), and it’s what powers the little green padlock that you see in your address bar when you visit your bank, or PayPal, or this website. I wrote an explanation of how it works a while back, and I suggest checking that out if you want to learn more.
One of the benefits of hosting my website on a VPS is that I don’t need to use the major third-party SSL providers to get certificates saying my server is who it says it is; I can use the free and open Let’s Encrypt certificate authority instead. Unfortunately, I just couldn’t get the certificate to work correctly; the automated tool was unable to connect to my server and verify it, which meant that the auto-renewal process wouldn’t complete. I could have generated an offline certificate and used that, but the certificates only last ninety days and I wasn’t looking forward to repeating the setup process every three months.4

I tried creating new virtual host files for Apache, my web server, but that just created more problems. Eventually, I figured out that I had misconfigured something somewhere along the line. Rather than try to work out which of the dozens of edits was the culprit, I gave up and reverted to a snapshot I had made before starting down the rabbit hole.5 Once I had rolled back to before my virtual host meddling, I was able to successfully run the Let’s Encrypt tool, generate my certificate, and secure my site.
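In the end, all the Let’s Encrypt tooling really needs on the Apache side is a plain virtual host whose ServerName matches the domain being validated. A minimal sketch, with the domain and document root as placeholders:

```apache
<VirtualHost *:80>
    # ServerName must match the domain on the certificate request.
    # The HTTP-01 challenge is served from /.well-known/acme-challenge/
    # under this host, so port 80 has to be reachable from the Internet.
    ServerName example.com
    ServerAlias www.example.com
    DocumentRoot /var/www/html
</VirtualHost>
```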

…that was my own fault. Turns out I didn't actually need to rewrite all of Apache's virtual hosts, and I probably shouldn't have tried.

It’s a straightforward process, but I know I would forget about it at some point, the certificate would expire, and the site would have issues. If I can automate that away, I would much rather do so. ↵

Snapshots are essentially DigitalOcean’s version of creating disk images of your server. I absolutely love snapshots; they’ve saved my bacon more than once, and I try to always take one before I embark on any major system changes.

Ars Technica did a nice job of creating an impartial write-up on why Hillary Clinton used an external email server, and how it was actually used. It sounds to me like there’s an institutional history of using private email to conduct business, largely due to obstructive or incompetent IT services (in fairness to the State Department IT team, there are likely a number of complicated policies and legal requirements that they’re trying to work around, which is difficult). Still, that’s not an excuse to use a home server to manage official communication– if you must use your own email address, at least use something like Google Apps or Microsoft Exchange Online, where you have teams of people professionally managing the email environment.

It’s also interesting to see that the NSA basically shot down any possibility of her getting a secured mobile device; I would have thought that providing the Secretary of State– the person who comes fourth in the presidential line of succession– with secure communications at all times would be a priority for them.

Today I reconfigured a server I maintain for the Office of Residential Life and Housing. It broke yesterday because of a database issue, but I’ve taken this as an opportunity to rebuild and improve it, adding an email server along the way. I have it mostly up and running now, but it’s been a long, slow process that took far longer than I expected (as a sidenote, this would have been far easier if the backups I had were up to date. Always check your backups!)

Building an email server is more difficult than I expected. I half expected to just run sudo apt-get install postfix and have an email server up and running; sure, it would need some configuration, but I’d be able to start sending and receiving mail almost immediately. And yes, that might be true if I installed something like Mail-in-a-Box or iRedMail, but I decided that was too easy, jumped into the deep end, and immediately started configuring a mail server using Postfix, Dovecot, MySQL, and SpamAssassin (and would have been instantly lost if it hadn’t been for this awesome guide). So I spent twelve hours copying and adapting configuration to my purposes, rewriting databases, adding users, and restarting whenever I messed up.
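To give a sense of what those hours looked like: in that setup, Postfix doesn’t store users itself, it just asks MySQL. Each lookup gets its own little map file, along these lines (the names and credentials follow the guide’s scheme and are placeholders, not anything to reuse):

```
# /etc/postfix/mysql-virtual-mailbox-domains.cf
user = mailuser
password = mailuserpass
hosts = 127.0.0.1
dbname = mailserver
query = SELECT 1 FROM virtual_domains WHERE name='%s'
```

main.cf then points at it with virtual_mailbox_domains = mysql:/etc/postfix/mysql-virtual-mailbox-domains.cf, and sibling files handle mailboxes and aliases the same way.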

It was absolutely awesome.

There’s something about taking the blank screen of a terminal, typing commands across it, and making something work. When you reload the page and it actually works the way you want it to, there is an immense feeling of satisfaction and accomplishment. You took something that was blank and empty, and turned it into something useful. There’s no feeling quite like it in the world.

That said, I’m totally using one of the ready-to-deploy email servers next time. Making something work from scratch is fantastic when you have the time, but sometimes you just need whatever you’re working on to be up and running.

Decades ago, dreamers imagined a world where the resources and tools for learning anything could be used by anyone. Where colleges, universities, and even ordinary people could share information and ideas, collaborate and learn. With the invention of the World Wide Web in 1989, many people thought that that world was close at hand, but it’s taken twenty-four years and countless hours of work to bring it to fruition.

I soon discovered that the quality of the lectures was far above my expectations. Where I had been expecting poorly-lit, low-quality video, there was professionally-recorded video with accompanying resources, images, text excerpts, problem sets, discussion boards– basically an entire college course. edX published new lectures weekly, with weekly or bi-weekly graded tests. Scoring higher than 60% across the entire class would earn you a certificate of accomplishment; those below 60% by the end of the course could choose to audit it instead.

For the past few months, I have wandered through biology with a fantastic guide. Professor Lander is clearly a man who loves his work, and his boundless enthusiasm has made even the dullest parts of biology seem bright and interesting. The discussion boards have been teeming with helpful classmates, brimming with insights. The problem sets and tests have challenged without being impossibly difficult.

And now, it’s over.

Today was the final day. I watched the farewell message from Professor Lander, completed the survey for the course, and completed the final exam. I’ll miss the weekly lectures, I’ll miss learning something new about biology every week, and I’ll miss interacting with my classmates.

So, to Professor Lander and the entire team at MIT and edX, if you happen to read this, thank you. Thank you for bringing the joy of learning not only to me, but to the 40,000 other people who signed up as well. Today, anyone with an Internet connection and a desire to learn can go online and join a high-quality class in almost anything that they’re interested in, and the edX team, along with their partners, are in large part responsible for this distribution of knowledge.

Several decades ago, science fiction spoke of a future where knowledge could be shared and traded among everyone. Writers dreamed of a world where anyone could learn anything– a future society where information could be truly free.