Thoughts of a Software Engineer: Today, Yesterday & Tomorrow

About Me

I am a Software Engineer with over 12 years of experience. Programming is not only a skill but my passion. Visit my online portfolio for highlights of some recent projects I have worked on. In my spare time I enjoy hiking, skiing, and water sports: pretty much anything outdoors.

I’m a little less trusting with feature branches and instead often push them to the central remote repository. I fear that days of work could be lost if it lives solely in my local repository until it is ready to be merged back into the develop branch: my repository could become corrupted, my hard drive could die, or my computer could be stolen. I don’t like to leave my hard work to Murphy’s law.

Now one could use a backup strategy to mitigate this issue and I do highly recommend doing this. I have used CrashPlan for years and it has saved me a few times, making CrashPlan worth every penny. However, I also like to know there is a central Git repository holding my code (again a trust issue). Plus some companies will not allow their code to be placed on any server outside of their network.

Since my entire team and I have chosen to place our “feature-*” branches on our central remote repository, a small issue has crept in: over time we end up with a lot of old and forgotten branches, both on the remote repository and on our local machines. Fortunately, Git makes removing old branches easy.

Delete Local & Remote Branches

We can remove the remote branch by using the command:

git push origin --delete <branchName>

Then we can remove the local branch with the command:

git branch -d <branchName>

Great, problem solved: I can run those two commands each time I’m done working on a feature. But wait, I don’t want to remember two commands, and I want to run only one. So, I put together a couple of bash functions that make this easier. Just drop these functions into your ~/.bashrc file, or into the appropriate dotfiles directory if you happen to be using my Dot File Manager.
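A minimal sketch of such a function, combining the two commands above (the function body here is illustrative, not a verbatim listing; it always deletes both sides, while a fuller version could take a flag to delete just one):

```shell
# Delete a branch locally and on the origin remote in one step.
git-delete-branch() {
    if [ -z "$1" ]; then
        echo "Usage: git-delete-branch <branchName>" >&2
        return 1
    fi
    # Remove the local branch (-d refuses if it isn't merged yet).
    git branch -d "$1"
    # Remove the matching branch from the central remote.
    git push origin --delete "$1"
}
```
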

After adding these functions, don’t forget to use the source command to load them into your shell.

Now a branch can be easily deleted from both the local and remote repository or just one of them by executing the command:

git-delete-branch <branchName>

Delete Merged Branches

Since we have multiple developers all pushing their branches to the central repository, we often find that over time we accumulate a long list of orphaned branches (branches that no one is working on anymore). An easy way to clean up most of these branches is with the following command, which shows the branches that have already been merged into your current branch.

git branch --merged

Then each branch can be manually deleted with the command:

git push origin --delete <branchName>

Again, I like things simple, so I put together the following function to streamline the process:
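A sketch of that cleanup function, assuming the long-lived branches are named master, main, or develop (adjust the skip list to taste; the exact function body is illustrative):

```shell
# Delete every branch already merged into the current branch,
# both locally and on the origin remote.
git-delete-merged-branches() {
    local branch
    for branch in $(git branch --merged | sed 's/^[*+ ]*//'); do
        case "$branch" in
            master|main|develop) continue ;;  # keep long-lived branches
        esac
        git branch -d "$branch"
        git push origin --delete "$branch"
    done
}
```

Running git-delete-merged-branches after finishing a feature then prunes everything that has already landed.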

Another Tool

Another possible solution to the above stated problem is to use the git-flow tool. At first glance it seems to be a handy tool to automate many of the tasks in the GitFlow process.

However, in my short experience with it, I found it doing things I didn’t expect. I then checked the GitHub repository and noticed an unhealthy number of open issues (175) and pull requests (78), with the last commit dating from September 2012. This project is officially dead in my book.

More recently, I have found another fork of this project that is being kept up to date. It’s called git-flow (AVH Edition). This one might have more potential.

However, I think the GitFlow approach is easy enough to carry out with the standard Git tool that I have not yet taken the time to test out this newer git-flow (AVH Edition) tool.

Have you used the newer git-flow tool? What are your experiences with it? Do you have any other git commands or tools that have helped with your day-to-day Git operations?

I’ve had my blog running on port 80 for years and have finally decided it is time to deprecate HTTP and move everything to a secure SSL connection. This decision was a lot easier to make now that Let’s Encrypt is providing free SSL certificates and has been out of beta since April. I also appreciate that the entire installation can be done via the command line and that the certificate can be renewed automatically a month before it expires. Wahoo, no more pesky calendar reminders telling me to hurry up, buy a new certificate before it expires, and manually install it.

With that said, my blog currently runs Apache on CentOS 6, with vhost files placed in a non-standard directory by DirectAdmin. This means I will have to manually add the certificate information to the vhost file for each host instead of letting Let’s Encrypt do all the work for me. That’s OK though, as I will only have to do this once.

Also, CentOS 6 throws a little curve ball: it doesn’t have Python 2.7 set up by default, and yum depends on Python 2.6. So, care must be taken to get both versions set up side by side on the machine.

Here are the steps needed to set up Let’s Encrypt. First, we need to set up the IUS repository with the following commands:
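Something along these lines sets up the repository (the exact release-RPM URLs below are assumptions and may have moved since this was written; EPEL is installed first because IUS depends on it):

```shell
sudo rpm -Uvh https://dl.fedoraproject.org/pub/epel/epel-release-latest-6.noarch.rpm
sudo rpm -Uvh https://repo.ius.io/ius-release-el6.rpm
sudo yum install python27
```

Installing the python27 package this way leaves the system Python 2.6 untouched, so yum keeps working.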

Next, we will install pip, as that gives us an easy way to install Let’s Encrypt and update it in the future.

sudo easy_install-2.7 pip

Now we will install Let’s Encrypt (which also goes by the name certbot):

sudo pip2.7 install letsencrypt letsencrypt-apache

If you happen to have a very vanilla Apache setup and are running Debian, then the following command to generate and install the certificate may magically set up everything for you. This was not the case for me, so I didn’t use this step.

sudo certbot --apache -d brett.batie.com

Or, you could try to be more specific about where your Apache config files are located, as I did in the following command. However, at the time of this writing this does not work if your vhost file has more than one vhost in it. So, I didn’t use this step either.
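A sketch of that more specific command (the Apache plugin flag names here are assumptions that vary by certbot version, and the vhost path is the DirectAdmin one from my setup):

```shell
sudo certbot --apache \
    --apache-server-root /etc/httpd \
    --apache-vhost-root /usr/local/directadmin/data/users/admin \
    -d brett.batie.com
```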

IMPORTANT NOTES:
- Congratulations! Your certificate and chain have been saved at
/etc/letsencrypt/live/brett.batie.com/fullchain.pem. Your cert will
expire on 2016-09-01. To obtain a new or tweaked version of this
certificate in the future, simply run certbot again. To
non-interactively renew *all* of your certificates, run "certbot
renew"
- If you like Certbot, please consider supporting our work by:
Donating to ISRG / Let's Encrypt: https://letsencrypt.org/donate
Donating to EFF: https://eff.org/donate-le

Sweet, I have a certificate! Now I just have to tell Apache to use it. My vhost file is located at /usr/local/directadmin/data/users/admin/httpd.conf, and I need to add the following four lines to the appropriate vhost in that file.
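The four lines are the standard mod_ssl directives pointing at the live/ directory Let’s Encrypt maintains; this sketch assumes Apache 2.2 (on 2.4.8 and later you would instead point SSLCertificateFile at fullchain.pem and drop the chain directive):

```apache
SSLEngine on
SSLCertificateFile /etc/letsencrypt/live/brett.batie.com/cert.pem
SSLCertificateKeyFile /etc/letsencrypt/live/brett.batie.com/privkey.pem
SSLCertificateChainFile /etc/letsencrypt/live/brett.batie.com/chain.pem
```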

Now do a graceful restart of Apache and we should have the new SSL certificate up and running.

sudo service httpd graceful

You can load the site in your browser to see if it is using the new SSL certificate as well as test it at SSL Labs to see if you received a passing grade.

Since I decided to completely remove HTTP (port 80) from my site, I also added the following redirect, which can be placed in the appropriate port-80 vhost (preferred) or in a .htaccess file (less preferred).

Redirect / https://brett.batie.com/

Now, we just need to automate the certificate renewal by adding the following to a cronjob:

@monthly /usr/bin/certbot renew

As you can see, there were a few steps involved in this process, but overall I found it easier than generating certificate requests, verifying domain ownership, receiving an email with the certificate, hunting down the entire certificate chain, and then manually putting all the files where they needed to go. The fact that Let’s Encrypt is free and simplifies the process makes it a no-brainer, and it should help make the Internet a little more secure.

Today I finished putting together a dot file manager. This is a tool that helps manage all of the settings and configurations for multiple computers. I’m currently storing configs and settings for vim, sublime, bash, aliases, custom functions, bin files, thunar, diffuse, kupfer and the like. This tool allows me to build out a new computer with all of my settings very quickly and also helps me keep the configs on multiple computers/servers in sync. No more painful copying to USB, emailing, or using scp/rsync to manually pull my settings and configs. YEAH!

The installation of this tool couldn’t be easier with a simple one line command:
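The one-liner takes the usual curl-pipe-to-bash shape; the URL below is a placeholder, not the real location of the install script:

```shell
bash <(curl -fsSL https://example.com/dotm/install.sh)  # placeholder URL
```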

That command will download all settings/configs into a dotfiles directory and then ask if symbolic links should be created. The dot file manager tool (called dotm) can be used by anyone to store their own custom configs (without being forced to use mine). I hope others find it useful; I know I have.

Handles symlinks to files in subdirectories of the dot file directory, matching the directory structure in the user’s home directory. So ~/dotfiles/somedir/.somefile will have a link created at ~/somedir/.somefile.

Symlinks to directories in the dot files directory: any directory name that ends in .lnk will have a corresponding symlink pointing to it from the home directory. So ~/dotfiles/somedir.lnk/ will have a symlink at ~/somedir.

Creates backups of files that already exist in the user’s home directory, placing the backup in a user-defined directory (~/dotfiles/backup by default) and appending a timestamp to the filename.

Automatically creates symlinks for all files in the dot files directory (~/dotfiles by default) and its subdirectories, except for special directories.
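The core symlinking rule described above can be sketched in a few lines of bash (an illustration of the behavior, not dotm’s actual source; the backup and .lnk handling are omitted here):

```shell
# Link every file under the dotfiles directory from the matching
# path in the home directory, preserving sub-directory structure.
link_dotfiles() {
    local dotdir="${1:-$HOME/dotfiles}" home="${2:-$HOME}" f rel
    find "$dotdir" -type f | while read -r f; do
        rel="${f#"$dotdir"/}"               # path relative to the dotfiles dir
        mkdir -p "$home/$(dirname "$rel")"  # mirror the directory structure
        ln -sfn "$f" "$home/$rel"           # create/replace the symlink
    done
}
```
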

I recently had a task where I needed to export a specific table that was in a few hundred different databases. However, mysqldump does not have a way to specify that a specific table should be dumped out of every database. See the supported formats below:

mysqldump [options] db_name [tbl_name …]
mysqldump [options] --databases db_name […]
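One way around the limitation is a shell loop over every database; this sketch assumes the target table is named users (hypothetical) and that credentials come from ~/.my.cnf:

```shell
for db in $(mysql -N -B -e 'SHOW DATABASES'); do
    # Dump the single table; drop the file again if the table doesn't exist.
    mysqldump "$db" users > "${db}_users.sql" 2>/dev/null \
        || rm -f "${db}_users.sql"
done
```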

I recently had a task where I needed to quickly start up 50 spot instances that all required an Elastic IP (EIP) address. I initially worked out the steps in the web console and determined I needed to accomplish the following: request 50 spot instances based on an existing AMI, allocate 50 new EIPs, associate […]

I often join IRC channels where other developers hang out. I’ve found this to be very beneficial in keeping up to speed with changing technologies. I generally stick to two servers Freenode and OFTC. Freenode is by far my favorite as it seems to be the standard server for other developers to join and create […]

On occasion I need to run some software tests where Symantec gets in the way. So I put together a simple batch file that will stop and start Symantec. Just add the following commands to a symantec.bat file. Then you can run the commands symantec start or symantec stop. if "%1" == "stop" ( echo […]

For those unfamiliar with Mercurial, it is an awesome Source Control Management (SCM) tool. One of my favorite features of Mercurial is that the repositories are distributed which allows each machine to have a full copy of the project’s history. Being distributed has many advantages such as faster committing, branching, tagging, merging, etc. since it […]

I recently had a project to set up an Instant Messenger robot for Windows Live Messenger. An IM robot can have many purposes, such as: keeping track of when contacts are online/offline and when they were last seen, broadcasting a message to all contacts, automatically answering common questions, and notifying contacts about new events. A newer site […]

I use the application UltraMon to help manage my multiple monitor setup. Overall this application is awesome as it makes moving applications between monitors a breeze and supports a separate task bar on each monitor, among other things. However, I have had this issue for a while where UltraMon will not move applications between monitors […]