Were I writing anything for nuclear, I wouldn't be using Java in the first place, but something with built-in DbC.

Also... hunting bugs: I rarely seem to have to hunt bugs these days. Usually something goes wrong and I have a little trail of breadcrumbs left by a stacktrace to follow right back to what caused it. Failing that, I pop in a couple of System.out.printlns (yes, really) or on occasion resort to the debugger in Eclipse. Bugs rarely take long to find or fix. I do have the luxury of having almost no-one else working on my code at the moment. Occasionally Chaz breaks it, but then again, he can't really program.

I use JUnit for two kinds of things (not necessarily "strict" unit tests):

1) creating testbeds for something I have to experiment with.
2) keeping others from breaking key functionality.

since I have to build testbeds with mock services for a lot of stuff anyway - because the real environments are unavailable, clunky, or test roundtrips are too time-consuming - doing them in JUnit and throwing in some asserts at the end does not really hurt.
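something like this, say - RateService and all the numbers are made up for illustration, and in real JUnit the body of main would become a @Test method:

```java
// A minimal sketch of the testbed-with-mock-services idea: a made-up
// RateService stands in for a real backend that is slow or unavailable,
// returning canned answers so the logic can be exercised instantly.
// Shown as plain Java so the sketch is self-contained.
public class TariffTestbed {
    interface RateService {                      // the real one calls a remote system
        double rateFor(String customerClass);
    }

    static double premium(RateService rates, String cls, double base) {
        return base * rates.rateFor(cls);
    }

    public static void main(String[] args) {
        RateService mock = cls -> "gold".equals(cls) ? 0.5 : 1.0;  // canned data
        double p = premium(mock, "gold", 100.0);
        // "throwing in some asserts at the end"
        if (p != 50.0) throw new AssertionError("expected 50.0, got " + p);
        System.out.println("testbed ok");
    }
}
```

the mock is just a lambda here; the point is only that the experiment runs in milliseconds instead of against the clunky real environment.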

and keeping others from breaking key functionality once they take over is more or less mandatory. if you don't do it with tests, you need a huge amount of knowledge transfer and coaching to ramp up your successor. this is a real time sink, and if you fail you will accumulate a lot of legacy projects you have to take care of, which is very undesirable - at least in my current job profile.

having said that, doing strict unit testing and 90+% code coverage is just a waste of time. you have to concentrate on the critical stuff and on quickly simulating an environment to save you time.

That tended to degenerate into arguments about what exactly was critical as I recall.

It still doesn't change the fact that maintaining a unit test alongside the code it was meant to test basically doubles the development time, and from what I've seen so far the biggest use of unit tests appears to be "protecting my code from some other developer breaking it", which is beginning to sound patronising towards colleagues. On the very rare occasion someone accidentally broke your code in the days before unit testing got trendy, wasn't the issue usually resolved quite rapidly thereafter? Every time I add up the numbers the same answer comes out: you get and maintain more working code, in less time, without unit testing, in most types of projects.

I write business software that sees an internal audience of less than a couple dozen people at most, down to zero (i.e. automated processes). It's all in a range of languages, including lots of shell scripts, perl, python, java, and now scala. Other than the occasional flash of clever insight here and there, it's a grindingly dull, soul-sucking endeavor, so gaming is that escape fantasy of mine that I'm probably never going to seriously dive into. I also used to do a lot of hacking on MUDs (MOO mostly), which is about as close as I got to doing real game work, so something RPG-related is still what I have the itch to do.

Honestly, I've never used Agda, I only hear Haskellers talking about it. I love Haskell because it reminds me of how smart I am -- that is, not smart enough to learn all of Haskell, at least not without cracking open my theory books. It's stretching exercises for the brain. Even then I'm not sure what actual useful programs I would write in it.

Well, lucky for me (har har), I'm going through a course right now covering the various testing techniques. The point the professor has expounded, with regard to unit testing, is that the tests really shouldn't be written by the people coding the chunks. They should be written by people who understand testing, people who haven't touched the code in other ways, and should be used as a method by which the 'correctness' of a chunk of code can be gauged. Hopefully the interfaces/contracts that each method needs to fulfill are well known and the testers can write the tests in parallel; otherwise it will take a lot of extra time to get things done.

If it's written correctly, it should provide various benefits, namely: did the change I just made cause this set of tests to fail? And it should hopefully provide a bit more information than a stack trace does, especially when it comes to side effects. If it's robust enough, it'll help you pinpoint the error (the mistake in the code) instead of just finding the failure (the place where the mistake caused an issue we can actually see).
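A toy illustration of that error-versus-failure distinction, using a made-up two-stage computation: per-stage checks name the broken stage, while the end-to-end check only says that something broke somewhere.

```java
// Pinpointing the error rather than the failure: each stage of a
// (made-up) two-stage computation is exercised separately, so a
// failing check names the broken stage instead of the end result.
public class PinpointDemo {
    static int parsePercent(String s) {          // stage 1
        return Integer.parseInt(s.replace("%", "").trim());
    }

    static double asFraction(int percent) {      // stage 2
        return percent / 100.0;
    }

    public static void main(String[] args) {
        // end-to-end check: tells you only that *something* is wrong
        if (asFraction(parsePercent(" 75% ")) != 0.75) throw new AssertionError("pipeline broken");
        // per-stage checks: a failure here points at the exact stage
        if (parsePercent(" 75% ") != 75) throw new AssertionError("parsePercent broken");
        if (asFraction(75) != 0.75) throw new AssertionError("asFraction broken");
        System.out.println("all stages ok");
    }
}
```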

And then, the professor spends the next four lectures or so informing us that all testing strategies are sub-optimal, but that they're better than the alternative. And especially that you have to pick your battles in regards to what you test. If it's something that you can't unit test reliably, don't do unit testing on it!

An ideal world where you've got one person doing the specs, another doing the code, another doing the tests, and so on, where everyone sticks to their responsibilities without being hampered by the barriers between each other ... it's like that frictionless vacuum physicists need for their models to come out perfectly. Except it's probably easier to create a frictionless vacuum.

and from what I've seen so far the biggest use of unit tests appears to be "protecting my code from some other developer breaking it" which is beginning to sound patronising towards colleagues.

I am talking about this:
- you or your team write a large non-trivial software system for a department of a company
- you get assigned otherwise
- someone else or another team takes over

I have seen enough side effects from seemingly trivial changes that it's really a good idea to have the most complex or key functionality automatically tested. The new team has no idea how things work under the hood, so I consider giving them some way to be reassured that the system will still work after their changes to be just polite, not patronising them...

Every time I add up the numbers the same answer comes out: you get and maintain more working code, in less time, without unit testing, in most types of projects.

It depends on the project and how you use the tools you were given - that's it. Automated (unit) testing is a tool - it's up to you to make the best of it.

As I said before, in any case you have to create some environment that allows a quick roundtrip for trying stuff out; doing it as a JUnit test is a no-brainer - you'll set up something anyway.

And as for testing critical stuff: that's not only about the cost of writing and maintaining a software system, but also about preventing collateral damage if the system fails. But this may greatly differ depending on the type of software system, the environment it is used in, and the audience using it. I do most stuff for insurance companies, banks and business-to-business marketplaces, so circumstances might be completely different elsewhere.

The point the professor has expounded, with regard to unit testing, is that the tests really shouldn't be written by the people coding the chunks. They should be written by people who understand testing, people who haven't touched the code in other ways, and should be used as a method by which the 'correctness' of a chunk of code can be gauged. Hopefully the interfaces/contracts that each method needs to fulfill are well known and the testers can write the tests in parallel; otherwise it will take a lot of extra time to get things done.

While having your programs tested by unbiased people is mandatory, letting them code unit tests is largely inefficient. Tests should be written while coding functionality. Thus, ideally, the benefit is not only a bunch of tests as a side product but also faster development. Furthermore, black-box testing against given interfaces is likely to be incomplete for anything non-trivial.

I used to think that someone else should write the unit tests but then I realised that unless it's another programmer who is already working very closely on the same problem they're not going to have any idea how to write a proper test for the code in question.


Cas

NASA apparently split programming teams into two halves - one half does the development and the other half writes tests to make sure the code works (although these sound more like functional tests than unit tests). The cunning bit is that every few months the two teams swap places.

It sounds like it would work really well for software that needs to be very robust, although I'm not sure most regular studios would be willing to pay for it.

Firstly I should say I'm writing a lot of Lua code at work recently, and while it's a nice language it's an absolute pain to refactor reliably without introducing more bugs. After every change you're never really sure you haven't made some silly typo or mixed your object types up, and it feels like walking on eggshells the entire time. Unit tests would make developing with it much, much nicer. There'd be extra code to maintain, but it'd easily save time over tracking down the latest stupid bug introduced.

Currently we seem to be doing the opposite - it's hard to refactor so we don't. This obviously means the code gets harder to actually get stuff done in.

So I think in certain situations unit tests can be a huge win. For something like a Rails webapp where you can unit test the bottom 80% and ignore the ui side it'd be great.

But for statically typed languages where most of the interesting stuff is in the ui (ie. java games) I'm still undecided.

Wrt NASA - there, development time precisely doubles. Probably not that many actual bugs caught, but then, when a bug causes a billion dollars of space shuttle to fly into the Atlantic, you'd want to be quite careful.

Hopefully they learned something from that whole O-ring thing. Patriot missile misses its target by a bazillion miles? Programmer didn't understand that 1/10 of a second isn't exactly representable in floats. Euro rocket goes boom. Programmer didn't understand truncation to a 16-bit integer. Design. Data structures. Algorithms. Numerical analysis. (Oh, and listen to people with experience in the field in question.)
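That 1/10-of-a-second failure is easy to reproduce in Java. The Patriot reportedly used 24-bit fixed-point arithmetic, so the exact drift differs from what floats give you, but the root cause - 0.1 has no finite binary representation - is the same:

```java
// 0.1 cannot be represented exactly in binary, so repeatedly adding it
// accumulates drift. The Patriot's clock ticked every 0.1 s; here the
// same idea with Java floats (the magnitude of the drift differs from
// the real 24-bit fixed-point system, the cause does not).
public class TenthOfASecond {
    public static void main(String[] args) {
        float clock = 0f;
        int ticks = 100 * 60 * 60 * 10;          // 100 hours of 0.1 s ticks
        for (int i = 0; i < ticks; i++) {
            clock += 0.1f;
        }
        double drift = clock - ticks / 10.0;     // seconds off after 100 hours
        System.out.println("clock reads " + clock + ", drift = " + drift + " s");
        // even a single sum shows the problem in doubles:
        System.out.println(0.1 + 0.2 == 0.3);    // prints false
    }
}
```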

I write business software that sees an internal audience of less than a couple dozen people at most, down to zero (i.e. automated processes). It's all in a range of languages, including lots of shell scripts, perl, python, java, and now scala. Other than the occasional flash of clever insight here and there, it's a grindingly dull, soul-sucking endeavor, so gaming is that escape fantasy of mine that I'm probably never going to seriously dive into. I also used to do a lot of hacking on MUDs (MOO mostly), which is about as close as I got to doing real game work, so something RPG-related is still what I have the itch to do.

Scala can be used to solve mundane problems in mind-bendingly elegant ways.

Lots of programmers are stuck in jobs with really old code and technology for business reasons. I feel for them. Scala is bleeding edge, which is what most programmers want.

It sounds like you are weary of programming. I question whether games would reinvigorate an interest in programming, because at the end, games are still programming. Why not consider a field with more of a purpose that you believe in?

(totally off the unit testing subject, but on a subject that's more interesting to me)

For me, unit tests can be considered insurance. You pay for it (the time it costs), but you will probably never or only rarely need it (the bugs it may detect and save you from). And like insurance, in some use cases you can avoid it, and in others you simply can't.

For example, when I evaluate a library such as a new collections implementation or a distributed cache, I start by checking whether it has good javadoc and good unit tests covering the API. By unit tests I mean tests over a "short" perimeter of classes, even if they're functional tests.

- while coding some new non-trivial functionality, before plugging it into the rest of the app structure, so that when the plugging fails (it will), at least I know the new subsystem works and it's the plumbing that's wrong

- to cut down development time of that new functionality, coz' it's very fast to launch the tests. I don't have to launch the whole app and put it into some specific state just to make some tiny tweaks.

But after that, I hardly need the tests anymore. They even become a pain when doing heavy refactoring of public interfaces, coz' the tests need to be fixed too.

I personally think unit testing is a waste of time. I was fortunate enough to work on a large project which had a vast amount of unit testing in it. I noticed after a year working there that we spent more time on the unit tests than on the code itself, and in all the time I worked there the unit tests only found 3 bugs ahead of production - after roughly doubling development time (a conservative guess; I'd put it at triple).

IMHO unit testing is papering over the cracks in design and specification. Wherever unit testing is being used today would be better served by, for example, design by contract.
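For the record, DbC can be approximated even in plain Java - a rough sketch with made-up Account/withdraw names, preconditions thrown as exceptions and the postcondition as an assert (Eiffel builds all this into the language; run with -ea to enable the assert):

```java
// Design-by-contract style in plain Java: the contract is checked at
// the method boundary itself, instead of in a separate test class.
public class Account {
    private long balanceCents;

    public Account(long openingCents) {
        if (openingCents < 0) throw new IllegalArgumentException("negative opening balance");
        this.balanceCents = openingCents;
    }

    public void withdraw(long amountCents) {
        // preconditions: the caller's side of the contract
        if (amountCents <= 0) throw new IllegalArgumentException("amount must be positive");
        if (amountCents > balanceCents) throw new IllegalStateException("insufficient funds");
        long before = balanceCents;
        balanceCents -= amountCents;
        // postcondition: the method's side of the contract
        assert balanceCents == before - amountCents;
    }

    public long balanceCents() { return balanceCents; }
}
```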

Flame on!

Cas

I don't see how the unit tests even managed to find 3 bugs. The point of unit tests is to _prevent_ bugs. If there was a bug in the codebase, then the original unit test _failed_ to prevent it, and it was only found by dumb luck when someone wrote a new unit test for some unrelated functionality.

In my work, we follow test-driven development. My code is all to do with parsers, code generators, metamodels and validators... I would be unable to function without comprehensive test cases verifying intended functionality and catching regressions.

In my work, we follow test-driven development. My code is all to do with parsers, code generators, metamodels and validators... I would be unable to function without comprehensive test cases verifying intended functionality and catching regressions.

The more business logic you have, the more you'll benefit from having automated tests. Not at the time of writing (although I use them to good effect to dev-test my stuff), but at the time that changes need to be applied. That's how I would summarize it anyway.

Unit tests are another way of defining requirements and an automated way of validating them. Just like requirements they fall into the same problem - human engineers define them.

As to not being able to function without them: of course you can. You just have to drive your development from your requirements manually and validate regularly by hand. It's not as easy, and it relies on you being conscientious and capable/qualified to do your job - something that's severely lacking in the industry today. The danger with unit tests is that you end up relying on one person's interpretation of the requirements (or maybe more than one person, at one point in the project lifecycle). Revalidating your understanding of the requirements regularly used to be implicit; now, with unit tests, it doesn't happen so much.

Unit tests are certainly not a magic bullet, but they are one tool in the process. On a large system that has had many people work on it, leave, and new people come in it is a very valuable one. For small systems or small teams it probably doesn't make as much difference.

I personally think unit testing is a waste of time. I was fortunate enough to work on a large project which had a vast amount of unit testing in it. I noticed after a year working there that we spent more time on the unit tests than on the code itself, and in all the time I worked there the unit tests only found 3 bugs ahead of production - after roughly doubling development time (a conservative guess; I'd put it at triple).

IMHO unit testing is papering over the cracks in design and specification. Wherever unit testing is being used today would be better served by, for example, design by contract.

Flame on!

Depends on the project. For Kryo's serialization, without unit tests I would certainly break things left and right, without realizing it. I'm sure other projects are similarly sensitive to changes.

I used to think that someone else should write the unit tests but then I realised that unless it's another programmer who is already working very closely on the same problem they're not going to have any idea how to write a proper test for the code in question.

Cas

Writing a proper test case is a problem in and of itself. When first writing the tests against a class, a programmer tends to write test cases that will obviously pass, because they only handle the cases they had in mind when implementing the class. This still helps prevent future maintenance from breaking the class's established behavior, but hardly proves the class is bug-free (or meets the specifications under a design-by-contract model). Splitting up the team (NASA-style) makes logical sense, but doesn't sound practical for most projects.
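To illustrate: the function below is fine, but only the boundary checks would have caught an off-by-one - and those are exactly the cases the implementer tends not to write first (clamp and its values are made up for illustration).

```java
// The "obviously passes" trap: the first check exercises only the
// mid-range case the author had in mind; the later checks probe the
// edges, which is where an off-by-one would actually hide.
public class ClampDemo {
    static int clamp(int v, int lo, int hi) {
        return Math.max(lo, Math.min(hi, v));
    }

    public static void main(String[] args) {
        // the test the implementer tends to write first
        if (clamp(5, 0, 10) != 5) throw new AssertionError();
        // the tests that actually probe the edges
        if (clamp(0, 0, 10) != 0) throw new AssertionError();
        if (clamp(10, 0, 10) != 10) throw new AssertionError();
        if (clamp(-1, 0, 10) != 0) throw new AssertionError();
        if (clamp(11, 0, 10) != 10) throw new AssertionError();
        System.out.println("all clamp checks passed");
    }
}
```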

Personally, I'm in Cas's camp: I'm the only one working on my code right now. Stack traces tell me where the code breaks. Inserting System.err.println's helps me debug the problem. I don't even use an IDE debugger (but sometimes use VisualVM to inspect state at runtime). I also agree with Roquen's comment on (not) treating computer scientists as code monkeys.

I've never been on a design-by-contract project. In such an environment, are unit tests entirely redundant/unnecessary, or can unit tests still be constructed per the contract (by Team A), then run against the code being developed (by Team B)?

Although I am a firm believer in unit tests, I do find they are hard to implement for human-oriented code. I've been on projects that tried to unit test Swing user interfaces, and that was both painful and unproductive. Testing the underlying data model with unit tests was no problem, of course - but testing that widget X was present and clickable on screen Y was nearly impossible. Likewise, I have 96% coverage of my metamodel code (it's dropped from 99%), but for my PDF-generating code I have zero test cases. I have yet to write a test for my OpenGL code - how can you write an automated test that pixel x has been fogged correctly from all perspectives, for example? Correct me if I'm wrong, but unit testing the user interface is pretty much impossible.

But for the backend, unit tests are invaluable and indispensable.

By the way, I don't agree with the "I don't need unit tests because I just follow the stacktrace" camp: most of my bugs (and I certainly make enough) are in logic or behavior. Unit tests let me check the logic and also fire the exceptions in my code - otherwise you can go to production with your exception paths not covered by manual testing.

By the way, I don't agree with the "I don't need unit tests because I just follow the stacktrace" camp...

Clarification of my stance above: I don't claim that I don't need unit tests, just that I don't use them. I'm sure my code would be more robust if bolstered with unit tests, but I'm not sure of the cost:benefit ratio. Had I a team with sufficient resources, I'd probably be on the unit test side of the fence (but probably only for such direct cases, as suggested by ags1).

I think that a lot of the current push for TDD comes from the dynamic languages / Web development camp where tests largely compensate for the fact that these languages don't have types, i.e.: as Cas said, it's a crutch for shit languages.

java-gaming.org is not responsible for the content posted by its members, including references to external websites and other references that may or may not have a relation to our primarily gaming and game production oriented community. Inquiries and complaints can be sent via email to the info-account of the company managing the website of java-gaming.org.