First we create a requirements document, which includes all the inputs and outputs of the system, the expected processing and storage to be done, performance expectations, and perhaps power consumption and cost limitations. All the text and diagrams in this document are created by hand with pen and pencil. The girls in the new Word Processing department will turn that into a nice printed document.

Having captured the requirements, we can move on to the high-level system design documents. Here we partition the job into hardware and software components, specify how they communicate and interact, and so on. All by hand again, and back to the girls in the WP department.

Having got a high-level software design, we can create all the low-level, detailed design documents for the software components. These will include flowcharts, pseudocode, whatever it takes to describe the design. Back to the girls in the WP department.

From the low level specifications we can start coding. We don't have a compiler so it's written in assembler, with a pencil on paper.

We don't have an assembler either so we have to assemble that source into hexadecimal, again by hand with a pencil and paper.

With hex in hand it's off to the typing girls to type that onto punch tape.

Finally we have a punch tape with our code on it. We can load that into the EPROM burner and place the programmed EPROMs into our computer board, which the hardware guys have been designing whilst we did all of the above.

Surprisingly the thing mostly works. No doubt due to all the reviews and cross checking that went on between every step outlined above.

Or at least that is how it went in Marconi Radar R&D back in the early 1980s.

Ah. A lot simpler nowadays. I just type my code into the IDE, click on "Run" or "Compile", fix all of the syntax errors that the IDE finds, then compile again, this time testing the functionality. I make adjustments, finish the rest of the program, and syntax-check again. The result, voila: a functional PBASIC or Spin program.

The greatest thing about logic is that, once you have followed the thought train, it is then your own.

Except when those IDEs become gigantic, slow, horribly complex monsters like IntelliJ, Eclipse, etc. The Propeller Tool is an excellent example of a simple IDE that is fast to load and use on a whim, does not take all day to figure out how to set up a project and drive it, and does not consume most of the resources of your PC!

Still, it's not much easier than simply editing code in a regular editor, like vim, and compiling it from the command line.

The problem still comes when you have a bigger project, with many developers working on it over a long period of time. Then you had better find a way to decompose it into components and modules that can become tasks for developers to work on in parallel. Not to mention making the parts testable on their own. Then you are into design specifications, test specifications, documents, diagrams and all the rest.

@msrobots -
[programming] All of what you say is true. What git adds for me, though, is help with *getting back* to a previous programming session: a project (work or private) that I haven't touched for days, weeks, months, sometimes years, or even decades now. (I started with git in 2007, and the first thing I did was to convert my old private CVS repos to git, straightening them up in the process. Some of those CVS repos had in turn been converted from older SCCS repos. But CVS didn't allow me to straighten out the commit history. Git did.)

I just look through the git commit history and that will spool the project back into my mind in a way that just looking over the source couldn't possibly do. I see the modules getting into place and on top of each other, one by one, and that's the key.

White-out is banned in regulated industries, and under Good Documentation Practices (GDPs) mistakes need to be lined out, initialed, and dated.
You need to clearly show what was changed, by whom, and when.

Typically the companies I have worked for that spend so much on following such procedures are very fussy about trade secrets, intellectual property, non-disclosure agreements etc. Then there is the Official Secrets Act to pay attention to if there is a government contract involved.

I don't want to have to talk to their lawyers.

However, if you type some terms into Google, like "software design specification", "software requirement specification" and so on, you will find a lot of discussion of such things, and examples.

I am not at all sure what git has to do with being a better programmer.

Almost nothing.

There is no way any tool is going to make a dullard into a Ninja programmer.

But, possibly, maybe, a tool like git can help us dullards make better programs.

How so?

a) With a tool like git taking care of all your software versions, you are liberated to hack at things: make branches, try experiments, rearrange everything if need be, never having to worry about losing track of "what was that last good working version I zipped up somewhere?"
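As a quick sketch of that freedom (the branch and commit names here are invented for illustration):

```shell
# Hypothetical sketch: hack on an experiment branch; the last good
# working version stays safe on master.
git checkout -b experiment        # branch off the current version
# ...rearrange everything, commit as often as you like...
git commit -am "Try a different structure"
# Liked it?   git checkout master && git merge experiment
# Hated it?   git checkout master && git branch -D experiment
```

Either way, no zip files, and nothing is ever lost.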

b) If you are brave enough to put your code into a public repo like GitHub then, if you have any pride, you will take the trouble to make sure it is the best you know how to do. You won't want to expose crap to the public. Which is not just about source code but also build instructions, test instructions, and usage instructions.

c) Having got the code out in public view, some other dullard may try it out and come back to you with bug reports, bug fixes, and pull requests. Now you have a better program without doing much work at all!

'git bisect'. A wonderful tool for when you one day find that your nice program has an error that you know wasn't there six months ago, and you don't know when that crept in.

Use git bisect to very quickly find exactly which commit caused the regression. Tell it about a version that you know was good, then let it run: it will repeatedly check out intermediate versions (bisecting, so a thousand commits can be searched in about ten rounds). You build and test, tell git 'good' or 'bad', and it tries another, until it has pinpointed the commit where things changed.
You can use it to find other changes (not just errors) too, of course.

With that in mind, always make commits that a) build (no compilation errors), and b) run.
That makes bisect easy to use. So I always commit new functions _first_, and only then commit the code that uses them. Good practice even without bisect, but knowing about bisect makes your commits even cleaner.
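Sketched out (the file names and commit messages here are made up), that ordering looks like:

```shell
# Commit 1: the new helper on its own. This commit still builds and
# runs, because nothing calls the helper yet.
git add src/ringbuf.c src/ringbuf.h
git commit -m "Add ring-buffer helper"

# Commit 2: the code that uses it. Also builds and runs.
git add src/main.c
git commit -m "Use ring buffer in the serial driver"
```

Every commit in the history is then a version you can build and test, which is exactly what bisect needs.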

I'm often checking in code that does not build, does not run, or that I know has lots of broken features. I try to avoid doing so, but there we are.

Why do I do this terrible thing?

Because I often use git to collaborate with myself. For example: I might have a day's work on a PC at home and the next day be in the office. So I commit and push the code from home; when I get to the office I pull it and continue. Quite likely that day's work done at home is in a mess, especially if I'm overhauling the whole structure of a program. But I don't want to lose that work, and I want it available elsewhere.
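In practice it's nothing more than this (remote and branch names assumed; the commit message is invented):

```shell
# Evening, on the home PC:
git add -A
git commit -m "WIP: restructuring, half done, does not build yet"
git push origin master            # off to the shared repo it goes

# Next morning, on the office PC:
git pull origin master            # and carry on where I left off
```

The shared repo (GitHub, a work server, whatever) is just a meeting point; the mess travels with you either way.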

Same would happen if I were collaborating with someone else.

Of course it would be better to do such carnage in a new branch, but, well, I get lazy...

Being a better Communicator can really help you be a better programmer.

You are communicating with your past self, others, and their past selves. People come and go, age out, and age in too. That's a real big one a lot of people don't think about.

Revision-controlled, formal communication is extremely high value. It has many uses beyond programming. One example might be the engineering documents associated with a product. Think of something very simple, like, say, a marker pen.

An object like that could have literally hundreds of documents associated with it. And that's true for each of the various states and variations that it goes through.

Just a quick sample:

There are the various sketches; the industrial design for the pen; the CAD models that detail the geometric structure and topology of the pen; chemistry documents and formulations of inks; tooling to manufacture the parts; fixtures to do assembly; manufacturing process documents; and retail packaging in its different regions, languages, forms, and collections.

And then I haven't even gotten into compliance and legal business-to-business documentation.

So there's a hundred different documents easy right there, and I'm being ultra-conservative.

Now map that back to software.

It's a very similar thing, often associated with physical objects, and the software and all of its documentation often is part of the product these days.

That is the kind of thing that git does for you.

Learning to communicate in a formal, revision-controlled way, understanding best practices for the flow of information, and other basic things, has professional benefits beyond just keeping your program code organized. It's well worth doing.

I spent many many years in the PLM, engineering, product design, CAD space.

I had no idea of the complexity, but I do now, and it's on par with, and can often exceed, that of software.

Products and the interactions between the people who make them, are increasingly complex today. And it's getting worse!

I seldom, if ever, do an outline ahead of time. I just dive right in, but I'm always willing to accept the premise that my first coding attempt will be a total bust and that I'll have to start over. But I learn more from that bust than I ever could from any kind of pre-organization, and that lesson makes refactoring so much easier.

This process happened, BTW, on the Scribbler GUI. I was halfway through when I realized, "Oh cripes! This is not the way to do it!" So I started over from scratch, and the new refactoring made everything so much clearer and more organized. That was probably the most complicated Perl project I've ever undertaken, and the lessons I learned there were invaluable.

-Phil

Perfection is achieved not when there is nothing more to add, but when there is nothing left to take away. -Antoine de Saint-Exupery


I must comment that that is one of the finest pieces of code I've ever studied.

Whit+

"We keep moving forward, opening new doors, and doing new things, because we're curious and curiosity keeps leading us down new paths." - Walt Disney