Sometimes when I start a small project (like an Android app), I don't know which approach will work out in the end, so I just pick one approach and give it a try. But if I have never used this approach before (for a sort of application I've never programmed before), it is like stepping into unknown terrain. I don't know which libraries to use (maybe I have to try out several), and there are so many unknowns (like: how to get raw audio data on Android).

So then my development process goes like this:

Write a piece of code to see if the approach has a chance. (The more uncertain the approach is, the uglier the code gets)

If it works, refactor a lot until it is beautiful
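For example, a first spike to answer the "can I even get raw audio?" question might be just a few ugly, throwaway lines. This is a hypothetical sketch using the desktop `javax.sound.sampled` API (on Android the equivalent would be `AudioRecord`); the class name is made up:

```java
import javax.sound.sampled.AudioFormat;
import javax.sound.sampled.AudioSystem;
import javax.sound.sampled.DataLine;
import javax.sound.sampled.TargetDataLine;

public class AudioSpike {
    public static void main(String[] args) {
        // Throwaway spike: just find out whether a raw PCM capture line exists at all.
        // 44.1 kHz, 16-bit, mono, signed, little-endian.
        AudioFormat format = new AudioFormat(44100f, 16, 1, true, false);
        DataLine.Info info = new DataLine.Info(TargetDataLine.class, format);
        // isLineSupported() answers the feasibility question without opening hardware.
        System.out.println("Raw 16-bit mono capture supported: "
                + AudioSystem.isLineSupported(info));
    }
}
```

If the answer is yes, the spike has done its job and can be deleted or refactored; if no, I've lost minutes, not days.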

I think it would be a waste of time to plan my software design in detail at this point; it would be like planning a trip without a map.

Is this part of agile development?
How do you deal with unknown terrain in software development?

4 Answers

That has nothing to do with agile, but people sort of assume it does, because that's what they think Agile is: headless-chicken development in a hippy commune.

What you are doing is assessing technologies, because you don't currently have enough experience with them to make a judgement call. This is good, and it never ends, because new libraries, frameworks, languages, and platforms appear almost daily.

How you deal with the unknown is actually a really good question, and it comes down to researching the alternatives, assessing them, and then selecting one.

The skills that tend to become associated with Agile and that help here involve making code as easy and safe to refactor as possible. TDD is a good example. It encourages you to consider your development in terms of outcomes: "this code should produce this result", which focuses the mind and reduces the amount of code that doesn't contribute towards the goal.
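To make that outcome-first mindset concrete, here is a hypothetical TDD-style check for a small piece of exploratory audio code; the helper name `peak` and its behavior are made up for illustration:

```java
public class AmplitudeTest {
    // Hypothetical helper under test: peak amplitude of a raw PCM sample buffer.
    static int peak(short[] samples) {
        int max = 0;
        for (short s : samples) {
            max = Math.max(max, Math.abs(s));
        }
        return max;
    }

    public static void main(String[] args) {
        // "This code should produce this result": state the outcome before polishing.
        assert peak(new short[] {0, -3, 7, -2}) == 7 : "peak should be 7";
        assert peak(new short[0]) == 0 : "empty buffer has zero peak";
        System.out.println("All outcome checks passed.");
    }
}
```

(Run with `java -ea` so the assertions are enabled.) Writing the expected result first keeps the spike honest about whether the approach actually works.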

If you write code following the SOLID principles (single responsibility, open/closed, Liskov substitution, interface segregation, dependency inversion), you will be in a good position later to replace a library if you made the wrong choice or, as so often happens, you outgrow your choice.
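The dependency-inversion part is what makes that later library swap cheap: the application codes against its own interface, and only one small adapter knows about the concrete library. A minimal sketch, with made-up names:

```java
// The application defines the interface it needs; no library types leak through it.
interface AudioSource {
    short[] readChunk();
}

// One small adapter per candidate library; swapping libraries means
// swapping adapters, not rewriting the application.
class FakeAudioSource implements AudioSource {
    public short[] readChunk() {
        return new short[] {1, 2, 3}; // stand-in for a real library call
    }
}

public class SolidSketch {
    // Everything else depends only on the abstraction.
    static int samplesIn(AudioSource source) {
        return source.readChunk().length;
    }

    public static void main(String[] args) {
        System.out.println(samplesIn(new FakeAudioSource())); // prints 3
    }
}
```

When a better library turns up, only the adapter class changes.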

It's good that you ask this sort of question. There are too many developers who, for various reasons, will not risk appearing "ignorant" by taking time to select the correct technology. Make mistakes early in the project, not late. Experimentation is key, not a waste, so I think you're going about it the right way.

Is this part of agile development? How do you deal with unknown terrain in software development?

What you have described is not Agile. Agile development is more about promoting adaptive planning, evolutionary development, and delivery with a time-boxed iterative approach. Agile does encourage rapid and flexible response to change, so refactoring your code as development progresses does have pieces of the Agile methodology in it.

Dealing with the unknown pieces of the project starts with gathering the known requirements and a high-level design. Once you have most components on hand, you can search for the right solution. That said, building a small proof of concept before full-blown development is the approach that our team follows.

There is a set of software development principles called SOLID. In my experience, applying them to issues/problems is always a step forward in improving your project's code base.

It depends on the project. If you are working alone on a small project, it might make perfect sense to perform your tech research and investigation as part of the development, and although this is not part of Agile, an Agile methodology could of course be used to add some control to it. However, this makes the process very hard to predict or time-box. It might be fine, even faster, if you are working alone on a small project that is wholly yours: let your requirements unfold as you learn them. Use good principles along the way, be consistent, and you shouldn't need to refactor so much.

At work we use Kanban, Scrum, and more traditional waterfall approaches, depending on the project. I find that complex developments with well-defined up-front requirements are not best suited to agile, though many will disagree.

Before we start work even on an agile project (all but the most simple, that is), we create some documentation. We have a mock-up (if UI-focused), a set of requirements, and a functional spec.

Development will be asked to create the technical spec from the functional spec, and during this process we will specify the technology and perform any up-front research that we need to. This process seems very important to me: it gives us the opportunity to spot gaps in the requirements and functional specs, and it puts the big technology decisions up front, in the hands of the people with the experience and system knowledge to make them.

The significant thing, though, is that the functional spec could be a list of bullet points, and the tech spec will usually be a model with some bullet points and technology steers, maybe just 3 or 4 pages in some cases.

Even when running an agile project we create documentation:

- All documentation has a cost.
- Developing against moving and ill-defined high-level requirements also has a cost.
- The correct balance between the two depends on your project, the culture, and the people.
- We document just in time, because documents go out of date.
- We document barely enough / just enough.
- We do not maintain or update these documents, and we don't put much effort into them. They are small. We expect to throw them away.
- We iron out the big unknowns, such as technology decisions, hazy requirements, and architecture, up front.
- We know what we are developing before we start.
- We trust the developers to make informed decisions around the documentation and to discuss any issues.
- We value communication over documentation, so we expect everyone involved to communicate often.
- We document systems (as an overview) after development, not during, not before.

You see there is a small waterfall in our agile process.

If you work alone, create an up-front model (a diagram!), play with and choose the tech, and then, once you have this picture of the high-level requirements, run ahead and develop in an agile, iterative way. Consider good principles and consistency as you go and you will need to refactor less; instead, refactor as you go.

But in general, if there is a real cost involved (i.e. it is not a hobby), know what you are developing before you write code, but don't waste too much time writing documentation that will quickly become redundant, because you will change your mind, and should change your mind, during development as you become better informed. Your project could change course hugely, but start from a good, well-defined foundation.

This is roughly how I start new projects, and it has worked out pretty well, assuming we're talking about smallish projects. For example, I would not go this route if I were writing an ORM or something of that magnitude. Occasionally I'll resort to more formal research when I'm really in over my head. But for the most part I just start writing code and see what happens. You have to be prepared to throw a lot of code away when you do this, however.