Like many people of my generation, I started programming at an early age. At 10, I was writing 6502 assembly language. My beloved BBC Micro lacked built-in flood fill, circle and ellipse drawing, and other graphics commands, and I spent many happy hours working on these. I used interrupts and vectors to place a clock in the corner of the screen, of limited use since the Beeb lacked a battery-backed RTC, but, multitasking baby!! Exciting stuff. I dabbled in making games, I did a little 3D and simple physics modelling, I wrote my own version of LOGO†.

I sometimes come across people with similar backgrounds, who count their programming experience as starting from those days‡. Naturally I feel an affinity here, but real experience has taught me that hobbyist experience doesn’t carry over into programming in a commercial setting. The problem is that a normal working programmer does an awful lot of things other than write code. Hobby projects don’t teach you to write documentation, or specs for new features, or clear bug reports that anyone can pick up to reproduce the issue. They don’t teach you how to work in a team, how to branch and merge, how to do code reviews, how to mentor others. They don’t teach you how to manage priorities or dependencies, or how to balance conflicting demands. They don’t teach you how to talk to users or sponsors or vendors, or how to make intelligent use of technical debt. They don’t teach you how to dive into unfamiliar code, under a tight time constraint, in a business domain you don’t fully understand, and make your change without breaking anything else (or what to do when you do). Dealing with distributed, heterogeneous systems can make things get very complicated, very quickly, in ways you can’t anticipate unless you’ve done it a few times before. There are a million other things… Someone who counts their hobby years as professional experience isn’t merely exaggerating; they’re demonstrating that they don’t understand what experience is. People in their early 20s think their “10 years” of experience puts them on a level with someone in their 30s who has 10 years of commercial experience plus their teenage and other personal hobbyist projects. Please stop this!

† Then about 14 I discovered other interests and didn’t do much computing again ’til college at 18. FORTRAN baby!!
‡ Or equivalent for their generation.

When you have been around the industry for a little while, you will see that nearly all applications, especially user-facing ones, are made of only three kinds of things: Forms, Reports and Workflows. A Form is simply an interface, a screen or a page, for the user to input some data. It is made of text fields, checkboxes, drop-down menus, and so on. A good Form offers prompting and validation, for example checking that a text field that is supposed to hold a number actually does, and that the number is within a sensible range. It can be dynamic, with fields updating depending on one another, and all sorts of other fancy things; when the user is done, it saves the results to a database.

A Report is a screen or page that accesses a database, often one populated at least in part by Form input, then filters, sorts and aggregates the records and presents them in a useful way, for example as a graph or a table. Often Reports have a few Form-like elements too, so the user can refine or drill down into the data. A Workflow simply links Forms and Reports together. Behind both you might have Services (APIs). So four things, enough to make (nearly!) any application you can think of.
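To make that taxonomy concrete, here is a minimal sketch in Python. Every name in it is invented for illustration, and a real system would have screens and a database rather than functions and a list, but the shape is the same:

```python
# Illustrative sketch of the four building blocks; all names are hypothetical.

# The "database": a simple in-memory list of records.
orders = []

def order_service(record):
    """Service: the API sitting behind Forms and Reports; here it just persists."""
    orders.append(record)

def order_form(raw):
    """Form: validate the user's input, then hand it to a Service."""
    qty = int(raw["qty"])                # must actually be a number...
    if not 1 <= qty <= 100:              # ...and within a sensible range
        raise ValueError("qty out of range")
    order_service({"item": raw["item"], "qty": qty})

def orders_report(min_qty=0):
    """Report: filter, sort and aggregate the records for display."""
    rows = [r for r in orders if r["qty"] >= min_qty]
    return sorted(rows, key=lambda r: r["qty"], reverse=True)

def workflow(raw):
    """Workflow: link a Form to a Report."""
    order_form(raw)
    return orders_report()
```

Swap the list for a real database and the functions for pages, and you have the skeleton of most line-of-business applications.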

Let’s consider a website like Amazon. You go to it, and it runs the default Report. You refine that Report down to a selection of things you’re interested in, select one, and fill in a Form to buy it; the database is updated, one item transferred atomically from their inventory to your basket, by calling a Service. Then in the warehouse someone runs a Report that tells them what to pack and whom to ship it to, and fills in a Form to update the status. There’s nothing in this, fundamentally, that couldn’t have been done on a 1971 IBM 3270; it just might not look as pretty to modern eyes, but all the functionality and interactivity would be there. If you’re buying a book, do you really need a photo of the cover to decide?
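That atomic inventory-to-basket transfer could be sketched like this, using SQLite purely for illustration (the schema, table and item names are all invented for this example):

```python
import sqlite3

# Hypothetical schema, invented for this sketch.
db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE inventory (item TEXT PRIMARY KEY, stock INTEGER);
    CREATE TABLE basket    (user TEXT, item TEXT);
    INSERT INTO inventory VALUES ('elite-bbc-disc', 1);
""")

def buy(user, item):
    """Service: move one item from inventory to basket, atomically."""
    with db:  # wraps both statements in a single transaction
        cur = db.execute(
            "UPDATE inventory SET stock = stock - 1 "
            "WHERE item = ? AND stock > 0", (item,))
        if cur.rowcount == 0:            # nothing decremented: out of stock
            raise LookupError("out of stock")
        db.execute("INSERT INTO basket VALUES (?, ?)", (user, item))
```

The `with db:` block commits on success and rolls back on an exception, so the decrement and the insert either both happen or neither does.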

Once you start thinking at this level, suddenly it doesn’t matter that the lifetime of the currently fashionable web page generation language is only a few years, or that the latest Javascript framework will only be around for a few months. Let junior programmers (whom I’ll define as those who define themselves primarily in terms of their language rather than their domain) worry about that, running just to keep up as their peers jump on the latest bandwagon. Concentrate on the meat of the application, and use tried and true languages and platforms to do it. This isn’t an anti-technology rant, by the way, just a piece of advice for those ready for the next level.

Anyone can learn to code, we are constantly told these days, but is it true? Yes… in the same sense that it is true that anyone can learn to play a musical instrument. Many people do, and many of them get a great deal of personal satisfaction from it, play in bands, write songs, meet new people, and lots of other good things.

But if a man selling guitars and guitar lessons tells you that playing the guitar is a guaranteed route to a well paid, secure career, then you would be wise to take that with a pinch† of salt. Or worse, if he tries to convince you that your kids should just learn to play the guitar and join a Silicon Roundabout startup band, instead of studying traditional subjects.

Similarly anyone can learn to cook, but the skills required to cater a dinner party for some friends don’t scale up to catering a formal reception for hundreds, or running a restaurant. Not even Jamie Oliver is stupid enough to try to convince anyone his books and TV shows will teach you that. If schools taught people to cook meals from basic ingredients then as a nation we would be healthier, wealthier and happier, but they can’t even manage that… Why does anyone think they can teach “coding”?

So, how was 2014 for me, in tech at least? Pretty good. It has been quite an interesting experience adapting to an all-proprietary tech stack (database, language, IDE, job scheduler) but I feel I am finally getting a grip on how all the moving parts hang together and how the whole is greater than the sum of its parts. So, professionally, I’m pretty happy, and I find out early next year how happy they are with me :-)

The systemd farce has prompted me, after nearly 20 years, to look at alternatives to Debian for my personal Unix(-like) systems. Candidates for a replacement so far are DragonFly, Minix3 and Debian/HURD, tho’ I have to wonder how viable the latter is – the Debian project simply doesn’t have the manpower to maintain two parallel distros once systemd has fully infected Linux. (Windows 95 called, it wants its registry back.) I have these running in VMs to kick the tyres right now. But more and more of my scarce time for personal projects is taken up with BBC Micro and Atari ST stuff now anyway…

Speaking of the Beeb, Elite: Dangerous is available now. I have to admit I was skeptical that it would ever ship, and I’ll hold my hand up and say I was wrong about that – good work, Braben. However, I won’t be playing. The lack of a true offline single-player mode is the deal-breaker for me: you can play solo, but you are still forced to participate in the shared universe, which runs in real time. In a game I want the universe to stop when I’m not there and resume where I left off, whether that’s an hour later or a month. And consider this: I bought my copy of the original Elite, in 1984, with pocket money I’d saved. Acornsoft ceased to exist as a separate entity in 1986 as the bottom dropped out of the home computer market, but I still play it in 2014, on original hardware. Should something happen to the servers owned and operated by Frontier, what happens to online-only Elite?

Nothing to report on the OCaml front. I’m still into it, but just haven’t had time to do much on OCI*ML. As I said previously, I had gotten it to the point at which it was useful for the work I was doing at the time. Features such as handling BLOBs are still outstanding; maybe I will have a chance to add that (and find a way to do it without my company claiming the IP). I’d like to maintain my connection to the OCaml community, because you never know… And I need to get properly up to speed on C++14, again to keep options open (RIP Dr Dobb’s). Will I ever return to the world of Oracle? I won’t rule it out, but what I am doing now is a radically different way of thinking about databases and the applications that run on them; a few years of this can only strengthen my database and development career long term.

My prediction for wider tech in 2015 is that this is the make-or-break year: either something is done about impossible-to-secure HTTP, SMTP and TCP/IP networks, or the public massively disengages from online services in general, sticking only to those websites with a) a good track record of security and b) an even better track record of making things right financially and logistically when their customers are impacted by a breach.

Long time since I have updated here; a lot has been happening. For a start, part 4 of my series on Oracle 12c new features is unlikely to be written, since in October I started a new job which is non-Oracle. I’m now working for an organization with specialized enough needs to have written its own in-house database, the raw building blocks of which are Linux, C++ and Python, integrating soft-realtime data access, replication with flexible topology, sophisticated batch scheduling and a whole range of other features, globally distributed and operating 24/7. My new role is Application Development Lead.

It is a bit strange; for the last 15-odd years Oracle has been my bread-and-butter, from versions 7 → 11g in Prod and 12c in Dev, as a DBA and a developer, but it is good once in a while to step outside one’s comfort zone. Here’s wishing for an exciting and prosperous 2014 for all!

In the course of my work with OCaml I have traditionally resisted using anything other than “pure” OCaml, and the facilities of the underlying OS. So rather than OMake or OASIS I just used plain, old-fashioned Makefiles. For package management, I relied on APT on Debian and MacPorts on OSX. And I avoided both Batteries and Core. Not so much out of a fear of “backing the wrong horse”, but just to make whatever I did as portable and easy to adopt as possible. And also, in the early days, I didn’t really know enough to choose anyway, and I wanted to work with the raw language rather than a high-level framework – sort of like how you can learn to program MFC without ever really learning C++.

But now Real World OCaml (which I have on pre-order) is in final draft, and I spent some of yesterday getting my Debian and OSX environments set up for it†. One quirk I quickly found is that both have pkgconfig as a prereq, which, for whatever reason, neither system had already, and that’s not mentioned on the page – maybe everyone else has it by default. I have a bunch of OCaml stuff in-flight at the moment – the OCI*ML test suite and new features, some playing with Project Euler (solved 1516 problems at time of writing) and now working my way through this book (trying not to skip straight to the FFI, which is a keen interest of mine, obviously!). That’s on top of playing with Oracle 12c, and I have barely started properly playing with C++11’s new features yet!

† Links to the draft of the book will stop working at some point I expect…

The only thing I can see is that they’re near the Unix epoch, but why would that cause them to be exactly 1 hour out…? The latest version of the code is up on Github. The underlying C code is in oci_types.c.

Anyway, at least this illustrates the value of soak-testing with randomly generated data – I had never experienced this issue “in the wild”, nor has it been reported.

Update: Fixed! It was a double application of localtime. I never noticed it because the company I was at when I wrote this had a policy of all machines, everywhere in the world, being in GMT all year round! The epoch thing was a red herring. I suppose the moral of the story is: make sure your random data is really random…
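The bug pattern is easy to reproduce in a few lines of Python (the original code is C, in oci_types.c, but the mistake is language-independent): convert an epoch timestamp to local time once, correctly, then accidentally read the already-local fields back as if they were UTC, so the zone offset gets applied a second time. On a machine pinned to a zero-offset zone the skew vanishes, which is presumably how it stayed hidden. The timestamp and zone below are arbitrary, chosen just to make the skew visible:

```python
import calendar
import os
import time

os.environ["TZ"] = "Europe/Paris"   # any non-UTC zone makes the bug visible
time.tzset()                        # re-read TZ (Unix only)

t = 1_000_000                       # an instant in January 1970, as UTC seconds
local = time.localtime(t)           # first, correct conversion to local time
t2 = calendar.timegm(local)         # BUG: local fields read back as if UTC
skew = t2 - t                       # the zone offset, applied a second time
print(skew)                         # 3600: exactly 1 hour out in a UTC+1 zone
```

Set `TZ` to `UTC` instead and the skew is 0, so soak tests on an all-GMT estate would never catch it.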