
About this blog

This blog contains articles about the practicalities of application software development. How should source code control be used? When should secure engineering be employed? How can teams maintain software for their successors to own?


Where do software developers learn to be professionals?

I had the pleasure of visiting my alma mater last week to interview students for summer internship positions at IBM. I always enjoy going to the campus to meet with faculty and students and hear about what they're studying, what research they're working on, and what's on their minds.

One thing always strikes me as interesting, though, when I talk to students about their software development experience. They are always excited to tell me about the projects they have worked on and the cool technical problems they had to overcome to build a solution. Some of this software runs on Linux systems, some of it is web application server-based, some of it is database programming, and still other projects are various game implementations that the students have created. The interesting thing here is what they don't talk about. I very rarely hear about how the students worked as a team to collaborate on the project. Yes, they do work as a team. But more often than not, working as a team turns into a group of them all sitting in front of the same LCD screen and sort of "team coding" the solution.

This is just one aspect of what I will call "professional software development." And it is something that I'm sad to see is not getting as much attention as I would like in our education system. Most curricula that I have seen relegate topics such as "software design", "team structure", "planning", "source configuration management", "requirements management", "work item management", and "test management" to courses such as "Software Engineering", which are typically junior- or senior-level electives. The outcome of this approach is that students fresh into the workforce have developed software development habits that are best described as "solitary". The concept of multiple people working simultaneously on a shared source base is experienced only if the students contribute to some open source project, and students only rarely consider that as part of their class work. The benefits of a coordinated build and automated testing are never considered. And managing a group project using a source configuration management system that can coordinate multiple people's work and assist with resolving conflicting changes to source code is completely foreign.
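To make one of those ideas concrete: "automated testing" simply means that small checks like the sketch below run as part of every team build, so a breaking change from any contributor is caught quickly. This is only an illustration using JUnit; the Greeter class and its behavior are invented for the example.

    import org.junit.Test;
    import static org.junit.Assert.assertEquals;

    // A hypothetical class under test, invented for this illustration.
    class Greeter {
        String greet(String name) {
            return "Hello, " + name + "!";
        }
    }

    // A minimal automated test. In a coordinated team build, tests like
    // this run on every check-in, so a conflicting or breaking change is
    // caught within minutes instead of being discovered by a teammate.
    public class GreeterTest {
        @Test
        public void greetsByName() {
            assertEquals("Hello, Alice!", new Greeter().greet("Alice"));
        }
    }

Even a handful of tests like this, wired into a shared class-project build, would give students a taste of the feedback loop that professional teams rely on.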

When these students enter the workforce, they wind up spending a large amount of their time learning all of these working practices on the job. And in doing so, they have to unlearn what the industry would regard as "bad habits" before they can appreciate the benefits of working alongside many others who are all contributing to the same source code base.

I tend to think that we could do better here by instituting a development process for our university students that encourages good, strong "professional software development" practices from the outset of their college experience. In doing so, we would prepare these students much better for the style of programming found in most medium to large software development environments. It would also encourage "good coding habits" before those "bad habits" take root, resulting in software development teams that seek out good source configuration management, requirements tracking, defect and feature tracking, and automated build tools rather than shun them.

What have other people seen in their interactions with new hires and college students they have worked with? I'm interested in hearing your experiences.