Almost forty years ago, my father and I journeyed to Texas A&M University for me to look the school over. During the tour we met the Assistant Dean of Engineering, a courtly gentleman named C.H. Ransdell (you old-time Southerners will recognise the use of initials). A product of a 1930s engineering school, he was helpful in persuading me to come to Texas A&M, a decision that proved controversial back home.

I came back that summer for freshman orientation, and the same courtly Dean Ransdell helped lay out my first semester’s coursework. Among the courses I took was computer programming, which in those days meant FORTRAN. Even he could see where the world was going (so could Jack K. Williams, the university’s president).

That skill has stuck with me; FORTRAN 77 (as it became) is still my “native” programming language, although I’ve coded in BASIC and PHP since then. (A sample of this is here). My FORTRAN teacher went on to a long career in computer science; his speciality is cryptography, as relevant now as it was then. The underlying things that make computers really work haven’t changed as much as you might think.

When I started my PhD at the SimCentre, the SimCentre was very concerned about my ability to code. I dispatched those concerns up front, although there have been other challenges. Coding is still an essential skill for those who actually make computers work, whatever the purpose, in spite of the advances in object-oriented programming and the like.

Their concern about a student as old as me was unfounded, but my teaching tells me that most of my “traditional” students (and I’m teaching civil engineering) really could use the skill. The convenience of computers has dulled their desire to “get under the hood” and even program a spreadsheet (and you can do quite a lot with a spreadsheet).

Coding forces an individual to do two things that most people hate to do.

It first forces you to use logic in a rigorous fashion. A weak logical structure will kill a program as quickly as just about anything. You have to construct the algorithm yourself (or at least understand what the code you’re adopting is doing).
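A minimal sketch of how a small lapse in logic corrupts a result (the example and function names are hypothetical, not from anything described above): an off-by-one error in the loop bound of a trapezoidal-rule integrator quietly drops the last interval.

```python
def trapezoid_buggy(values, h):
    """Trapezoidal rule, but the loop stops one interval short."""
    total = 0.0
    for i in range(len(values) - 2):        # bug: should be len(values) - 1
        total += 0.5 * h * (values[i] + values[i + 1])
    return total

def trapezoid(values, h):
    """Correct trapezoidal rule over every interval."""
    total = 0.0
    for i in range(len(values) - 1):
        total += 0.5 * h * (values[i] + values[i + 1])
    return total

ys = [float(x) for x in range(11)]   # f(x) = x sampled on [0, 10], h = 1
print(trapezoid_buggy(ys, 1.0))      # 40.5 -- silently missing the last panel
print(trapezoid(ys, 1.0))            # 50.0 -- exact for a straight line
```

The buggy version still runs, produces a plausible number, and raises no error; only someone who has reasoned through the algorithm will catch it.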

Second, it forces you to consider all the sources of error that might come up in an algorithm. Those sources include poor implementation of the logic, improper coding of the mathematics, and the errors that result from digital computation.
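The last of these, error from digital computation itself, is easy to demonstrate. A minimal sketch (illustrative only): binary floating point cannot represent 0.1 exactly, and small addends can vanish entirely against a large running sum.

```python
import math

# 0.1 has no exact binary representation, so "obvious" arithmetic drifts.
print(0.1 + 0.2 == 0.3)                 # False

# Tiny addends are lost against a large accumulator.
small = [1e-16] * 1_000_000
naive = 1.0
for x in small:
    naive += x                          # each 1e-16 rounds away against 1.0

accurate = math.fsum([1.0] + small)     # compensated summation keeps them
print(naive)                            # 1.0 -- the million addends vanished
print(accurate)                         # roughly 1.0000000001
```

None of this is a bug in the machine; it is the arithmetic the machine actually performs, and a programmer learns to plan for it.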

By the time you’re done with all this, you look at computers differently. Instead of being a passive recipient of results, you have an idea of what’s behind them, and you’re more sceptical of those results.

During my first job at Texas Instruments, I did some fairly elaborate coding in the design of this. My boss looked at it and expressed his concern that, with results as “effortless” as these, he wasn’t sure how the “old fogies” (who just got the results out) would deal with it. My response was that this was a bigger problem for the next generation coming up, which wouldn’t have to do the coding. That’s where we’ve been ever since, and where we are now. We live in a society where too many people uncritically accept too many results that come out of a computer.

For those of us who code, this reality is scary. It should be for you too. We need to teach those who plan to use a computer how to code, for their sakes as well as ours.