07 November 2015

It has been a while since I have had the free time or motivation to play around with computers at home. I have been running Mac OS X computers for about four years now. The first one I purchased was a mid-2011 MacBook Pro in 2012, and soon after that I decided to give up my dual-booting PC/Ubuntu desktop at home for a 2010 Mac Mini Server. Both of these computers had nearly identical specs, which is what I always aim for: 2.66 GHz Intel Core 2 Duo processors, with 4 GB of RAM in the mini and 8 GB in the laptop (upgraded). Both have served me very well over the years. So well, in fact, that I lost interest in playing with any type of Linux for serious home use.

When I started working at the FMRIB Centre, they very graciously purchased me a fancy new 2014 MacBook Pro with Retina Display, with a dual-core i7 and 16 GB of RAM. This has been my only computer for the past year, and while it works extremely well, I felt that it was only prudent to have an actual "non-work" computer for use at home, in case I don't feel like "lugging" a whole 1.6 kilogrammes (3 1/2 lbs) home with me after work. Or in case it were ever to break or get stolen, as it does get hauled around with me wherever I go, via bike, bus, and walking all over Oxford, London, and back home to the US.

My requirements were minimal: (1) Something cheap, preferably used off of Amazon or eBay, and therefore not a Mac, (2) something lightweight and easy to use in bed or on the road, and (3) something about which I wouldn't feel too bad if it were never really used. This was the perfect opportunity to experiment with a Linux PC again!

I decided after a lot of research to go with the 2014 Dell Chromebook 11. I found one on eBay with a scratched-up outer shell and a slightly damaged power connector (NB: the power plug on these laptops is not soldered onto the motherboard, but "floats" within the case, so even though the plastic holder was damaged, the likelihood that it was non-functional was minimal). In general I do think that Dell is crap, but this particular machine received some good reviews. Specifically, these are made with the education market in mind, and thus feature some nice things like a more durable screen hinge and a "waterproof" keyboard. 4 GB of RAM, a dual-core Haswell Celeron @ 1.4 GHz, and a nice keyboard layout with durable keys. Not too bad. The laptop arrived and, fortunately, the damage was minimal and just as described by the seller.

I decided within 10 minutes of using ChromeOS that this thing could do more. Again, I researched a huge number of GNU/Linux distributions to try out. I am, like most of my truly Geek friends, fed up with the bloat and poor UI of Ubuntu. I thought about some alternative Ubuntu distributions like Xubuntu or Lubuntu, and even considered good old Debian. This is mostly because I am so familiar with Debian's APT package manager, and in general very happy with the availability of good software repositories for those systems. Especially because the NeuroDebian repo has almost all of the neuroimaging software tools I use on a daily basis. I can't stand yum, so RedHat/CentOS/Fedora/Scientific were out.

However, I really wanted something lightweight, and wanted this to be more of a project than a plug-n-play system. I started looking into some of the more "ground up" systems. Gentoo looked interesting, but the thought of compiling every single package from source on what is basically a glorified netbook seemed a bit intimidating. I looked into other options like LXLE or Bodhi, but these minimalistic approaches didn't seem to have the maturity I was looking for. The one distro that came up time and time again and looked genuinely interesting was Arch Linux. I had heard about Arch for a long time, but always lumped it into the same category as Gentoo: small user base, obscure, too Geeky for my tastes, and (I assumed) not enough support for non-free software (e.g. Adobe Flash) or drivers (e.g. Intel).

However, Arch kept coming back again and again in searches. Especially in the context of a distro that plays well with Chromebooks. In fact, the Arch wiki had dedicated installation instructions for Chromebooks in general, and also detailed driver and config tips for the Dell Chromebook 11. I made up my mind -- Arch Linux would be my first attempt to re-install the Chromebook.

16 July 2015

Get an e-mail today from the health service, with less than a week's notice, that I've been scheduled for a vaccine appointment next Wed at 9:15am. And if that time doesn't work for me, then I need to call/e-mail ASAP, as that is the last date the clinic is open before the summer break.

Have a doctor's appointment at 10:00a that same day, which requires a week's notice (too late!) to cancel, or I'll have to pay the £45 fee out-of-pocket, and which is a 20+ minute bike ride away.

Rescheduling with the health service is a pain, and usually results in me cycling all the way to campus and losing two-thirds of a day of work time. Decide the best option is to just hire a car for the day so I can make all of my appointments. (I don't own a car.)

Go to Enterprise website. Pick out the car and day, decide the price is okay (£18), but realise I’m not logged in so I would have to fill out all of my details again. Logging in results in me getting kicked back to the front page so I have to re-enter everything again.

Try to pay with my new credit card. Not registered for Verified by Visa(TM), so when I get to the payment page it makes me register. Have to dig up and fill in all my bank details, then think up a new password that I will be able to remember (because it doesn't ask for the whole password each time, but instead random characters from it, a password manager won't work here). When I'm done with that, it kicks me back to Enterprise without actually authorising the transaction, so I have to try again.

10 June 2011

Thanks to our friends at Wall Software, LLC for getting us up and running (and continued hosting!). I promise to write something substantial here in the near future... I've been working on a few new ideas and want to get the chance to share my thoughts more regularly.

29 November 2010

Popular Mechanics did an interesting article/experiment on how companies handle shipping packages. Specifically, they used a set of sensors to record things like temperature and acceleration, to see how well the packages were handled (or mis-handled) during their voyage.

I think this is a really cool article because I once had an idea to do something similar for a start-up company. The idea was for a customer to receive a small tracking device which he would then place inside his package before shipping. This device would then be returned to the company to check whether the package was handled correctly. The target market is items that are fragile or require very special handling, such as scientific instruments. Parameters I thought we could track include acceleration (for drops/large bumps), water/humidity, temperature, magnetic field (for scientific instruments...?), and GPS location. Most of these sensors are already available in many handheld smartphones nowadays. It would also be interesting to couple this with some type of shipping insurance, since the data from a recorder would provide excellent, concrete evidence of mis-handling.

That being said, while I find this to be an interesting article, to some extent the real question is "who cares?" We all know that these shipping companies throw our stuff around. But we also know that our orders are placed in lots of packaging to prevent damage. I love buying stuff online... books, computers, electronics. Lots of expensive and seemingly fragile stuff, yet I would say 99.9% of everything I have ever ordered has arrived in perfect, working condition. Maybe we should just turn a blind eye to how things are handled mid-transit if, in the end, the company can make good on its promise to deliver them undamaged. Sometimes, isn't it better just to not know what's going on?

22 November 2010

I'm doing something that most numerical computing people would consider crazy -- I'm computing the analytical inverse of a square non-singular (although ill-conditioned) matrix. I'm doing this as a solution to a differential equation, which I need to be able to evaluate very quickly, very many times (100,000s of times), so naturally I am coding up the problem in C.

It's amazing how quickly the closed-form solution becomes complex as the order of the matrix increases. For example, the quite trivial case of the inverse of a 2x2 matrix can be written in just 5 lines:

// inv(A), valid when the determinant is non-zero
scalar = 1.0 / (A[0][0]*A[1][1] - A[1][0]*A[0][1]);
invA[0][0] =  scalar * A[1][1];
invA[1][1] =  scalar * A[0][0];
invA[1][0] = -scalar * A[1][0];
invA[0][1] = -scalar * A[0][1];

Ok, that was trivial! The code for a 3x3 inverse is longer, but still quite manageable.

Now consider the inverse of a 6x6... let me just say it's messy. Very messy. Messy enough that it's nearly impossible to work it out by hand on a whiteboard and successfully type it up into C. It was easier to find the solution using MATLAB's symbolic math toolbox, then reformat the result from MATLAB into C.

The next thing to ask -- why use an analytical solution at all? If you asked about this on a forum, most people would direct you to some other algorithm, such as QR or LU decomposition, as implemented in LAPACK or some other free "off-the-shelf" numerical software package. I have my reasons against this:

(1) It's true that most of the algorithms in LAPACK are highly optimized, but they are tuned for working with very large matrices (with dimensions in the 100s or even 1,000s). This kind of math is "trivial" on small matrices (2x2, 3x3) and very easy to do by hand, so the focus of most numerical computing work has been on really large systems.

(2) Using LAPACK adds to the complexity of the code. It requires that the correct libraries be installed. It also means that the code can't easily be compiled on another platform, for example as a CUDA or other GPU-type program. Again, it seems like overkill to make the whole program much harder to port onto different systems just to solve one little 6x6 matrix inverse.

I have also investigated third-party C implementations, such as the Jordan exchange or LU decomposition. As I said before, however, this is a fairly ill-conditioned matrix in some cases, and these numerical methods tend to introduce a lot of error into the result, which could affect the performance of the overall algorithm. It's also true that most of these implementations require many conditional statements (e.g. for choosing columns for pivoting), and are therefore inefficient when implemented on a GPU.

In conclusion, there are a lot of great resources out there for solving large systems of equations; there are also fairly trivial solutions to simple systems. What I'm struggling with is where my problem lies -- is it trivial enough to warrant a homemade analytical solution, or does it warrant the use of some external software library, such as LAPACK (and all the associated headache of getting that to work)?

27 October 2010

Been doing some work on my computer, as usual. I have really wanted an SSD (solid-state drive) for a while now, as I've found that my user experience with Windows 7 is gradually slowing down (or maybe my expectations are just gradually increasing). A recent set of blog posts (Jeff Atwood & Linus Torvalds) suggests that an SSD may be the most cost-effective performance boost right now, and judging by the amount of clicking and grinding my hard drives go through whenever I try to launch a program or interact with my computer, I think this may be a real issue which I have not previously addressed.

However, having recently spent a considerable amount of money on guitar equipment and a Halloween costume, I'm not really in a position to blow another $200+ on a hard drive that is only 80 GB large! As an alternative, I decided that it was time to take a step back and look at the current resources I have to work with. My computer, in terms of hard disk arrangement, was a complete MESS. I have 4 hard drives, ~2.5 TB worth of storage space, but chunked up into many medium-size partitions. The main reasons I've gotten into such a mess when it comes to disks are:

(1) I tend not to really have an "upgrade roadmap" for disks, as I typically do for other components such as CPU/Mobo/RAM, and video card (for these, I have an idea of when they'll become obsolete, when the newer technology hits the price point I'm willing to pay, and what components I can buy knowing they'll last past several iterations of computer upgrades). And I never buy brand new disks when building a new system, nor factor them into the overall cost of a computer.

(2) I tend to buy disks as they come on special sales at SlickDeals (or if I have a BustBuy er.. BestBuy gift card to blow), so they all have mismatched manufacturer/capacity/performance/cache size.

(3) I tend to choose which disks to keep in my system based on their size, not performance. With a limited number of SATA ports (4, minus 1 for the CD-ROM), I just keep the 3 biggest disks around.

(4) Most of my important data has been historically spread across many different disks, partitions, and file system formats, and I always have too much to fit on one single disk. So when I want to install a new OS, I typically find a disk that has enough free space to throw on a partition and go from there. The result has been that the drive I choose for an install is more based on where data already is than the optimal configuration.

Recently I solved problem #4 by buying a relatively high-performance (WD 'Black' w/ 32 MB cache) 1 TB drive and creating a "dump" of all my most precious data. Another way to go, of course, is to get a NAS or similar external drive enclosure. For the purposes of home computing, I think this is a really bad idea for a number of reasons:

(1) If you use an external hard drive, your performance suffers greatly! Most of these things are still USB or FireWire, and even the faster FireWire spec is still nowhere near as good as a SATA drive. Yes, you can get ones that have eSATA, but there is a significant price penalty. Also, I have found some incompatibilities, such as certain drives only working at SATA-I with certain enclosures.

(2) External enclosures are usually louder than internal ones. You already have a perfectly good "enclosure" sitting at your desk: your computer case! Why add more junk? Yes, they're more portable, but in today's age of network & Internet file sharing, it's doubtful you will want to physically lug around several terabytes of your data (not to mention the reliability and security issues!).
(2b) If you buy a pre-configured enclosure, the exact specs of the drives inside are often unclear. It's also often unclear whether you can take it apart, put in a bigger hard drive, or replace a failed hard drive.
(2c) If you buy a separate enclosure system and hard drives, it can be difficult to determine whether the thermal transfer of that enclosure will be sufficient to cool your disks and not shorten their life span. It also adds yet another price overhead that can be completely avoided by sticking the drive in your computer.

(3) Using a NAS at home doesn't make sense: sure, it's great that you have all of your data easily network-accessible to all computers/devices in your home (and in some cases, on the Internet as well), but then every machine has slow (network-speed) access. If you have the disks inside your computer, it's almost trivially easy to set up a Samba share in Windows that achieves the same level of data sharing AND have it be very FAST on at least one computer!

Coming off my rant on external drives & NAS, I decided to think about something I never did before: Performance. So I ran several HDD benchmarks, and here were the results:

(1) I had my OS installed, for "historical reasons" (i.e. this was the free disk I had laying around at the time) on a 0.5 TB drive which was actually my SLOWEST internal hard drive. I had backups running to a 0.75 TB drive which was significantly faster.

(2) As expected, my data-drive ('Home') was on the fastest drive, a 1TB Western Digital. However, because this was in an eSATA external enclosure, it was only running at SATA-I instead of SATA-II, so that was a significant area for improvement. It also produced a strange resonance with my desk so it made my system significantly louder.

(3) A good fraction (>25%) of my space was in "dead filesystems", i.e. Ext3 and HFS+ partitions from Linux & OS X installs I had abandoned a while ago, whose data I no longer cared about.

So, in conclusion, I was able to streamline my system down to 2 disks and 4 partitions, and greatly improve performance. My OS disk has three partitions: two 80 GB partitions (one for Windows 7, one for Linux) and one 500 GB partition for a Home backup. My Home disk continues to be 1 TB with 1 partition, but now that it's an internal drive it operates at higher performance. For those hackers out there who ignore their file system and disk plans, take a second look before resorting to more expensive technologies.

Next Post: How I was blown away by the quality of both Win7 and Linux after my recent re-configuration.

20 October 2010

I keep getting Apple Updater alerts every so often, because I unfortunately have iTunes installed on most of my computers (mostly to make it easy to use my iPod, not because I particularly like having a 500 MB MP3 player). But I always see, under the 'optional updates' section: Safari 5. And so I ask myself, does anyone actually use Safari for Windows? Is it actually considered a real browser, or just some experiment gone horribly wrong? Maybe the only people who actually have it installed are those who are duped into doing so when updating their iTunes...