Tuesday, June 29, 2010

I've always felt that the test automation literature is a little ... odd.

On the one hand, you have the fluffy high-level tutorial, written so that it applies to every test tool. By trying to apply to every tool, the tutorial isn't specific enough to provide step-by-step instructions.

Then you buy something and try to use it. UGH. That's no fun. So maybe you hire some consultants to build the framework and basic test suite. That's great, but it costs you $8,000 a week -- and $320,000 and six months later, when the consultants leave, do you have the expertise to maintain the suite?

All too often, two years later, you've got some really expensive shelfware.

So I'm interested in tutorials about test tools. The thing is, once you get specific, the market is much smaller -- so it's rare to find a mass-market book about testing with a specific test tool.

But that's the other thing. With some test tools today, you don't need one. All the readers need is an article big enough to get them started -- or a short series to keep them learning.

As it happens, the folks at SearchSoftwareQuality invited me to do a small series on Selenium, and have published the first two parts -- a two-part mini-series on Selenium RC. Read all about it -- part one and part two, online, for free.

The next article in the series will address Selenium IDE, the development environment for Selenium. I hope you find them enjoyable and helpful.

Monday, June 21, 2010

A little over ten years ago I interviewed for a developer job with Steve Hoek, at a tiny division of McGraw-Hill. Steve asked me: if I saw myself collecting an award in five years, what would it be?

I replied that I'd like it to be some sort of industry award, but those sorts of things don't really exist for do-ers. I hope I never forget how Steve replied -- he agreed with me. Those kinds of things do not exist for do-ers.

Maybe they should.

Oh, yes, there are lots of problems. There are conflicts of interest. The inner ring-ers tend to get them disproportionately, without making significant or meaningful contributions to testing. Plenty of people do good work, dedicated work, for decades without doing any PR, and are virtually unknown. But I do think that a good faith effort, on balance, would be better than none at all. The Agile movement has the Gordon Pask Award; can Software Testing do something similar?

I think we can.

So I was very pleased earlier in the month when the folks at Software Test & Quality Assurance Magazine announced the Luminary Award. Please allow me to quote the qualifications from the nomination page:

The software test and quality community will decide who receives the award, and the expectation is the community will choose a nominee with some or all of these qualities:

• A person that has inspired individuals to be better at their profession
• A person that has dedicated their career for the betterment of software testing and quality
• A person that has shown exceptional leadership in promoting the advancement of our industry
• A person that has shown exceptional ability in promoting and educating concepts for the advancement of the industry
• A person that has published and promoted the industry that has resulted in greater recognition and respect for the industry
• A person deserving of an award based on merit not particular to personality (although nominees may be very popular we want this to be about career achievements)

When I read those requirements, a few words jump out at me. Some appear on the page, others are just how I summarize the tone: lifetime, published, community, testing, and quality.

Why Cem Kaner? Well, he's been around a long time. In the early 1980s he earned a PhD in cognitive psychology, and he has been advancing the cause of software as a human-driven activity ever since. After doing some programming and testing in that PhD program, Dr. Kaner moved to Silicon Valley, where he worked as a tester, programmer, test manager, and documentation manager -- and, get this, he took a night job in a retail software store while he held those jobs so he could get closer to the customer.

In 1988 Dr. Kaner wrote the first edition of "Testing Computer Software", a landmark book in the field, then brought in two co-authors to help revise it into the second edition in 1993. After Silly Valley, Dr. Kaner earned a law degree, practiced law for a short time, and went on to write Bad Software, one of very few books about the legal implications of defective software. In 2000 he moved to Florida, where he started teaching software engineering at the Florida Institute of Technology, the only school in the nation, to my knowledge, to offer an earned minor in software testing. Some of Dr. Kaner's former students are heavily involved in the test community today.

Then there is testingeducation.org, the source for free and open testing training materials. There is the Association for Software Testing (AST) that Dr. Kaner helped champion and organize. There is the Black-Box Software Testing (BBST) course, which Dr. Kaner helped create and which the AST runs for free for its members. There's Lessons Learned in Software Testing, the most recent book on which Dr. Kaner served as lead author. Not to mention his publication list, which would likely use up more than all of the ink currently in my printer. And I haven't even mentioned the LAWST-style peer conferences that Dr. Kaner led, inspired, and served as an initial host for.

That's the short list; I've tried to hit the highlights. But notice something about those publications: He was almost always the lead author. They weren't done alone; they were done as a part of a community. And he doesn't do the classic professor trick of sticking his name on top and letting the "little people" do the work: Dr. Kaner is a do-er through and through.

Now look at the dates -- this is a gentleman who views a career as a marathon. The first testing job on his resume started in June of 1983, and he hasn't stopped since. I already mentioned his publication list.

I don't think it takes a genius to connect the dots back to those initial requirements, nor to see how Dr. Kaner is uniquely qualified for the Luminary Award.

Yes, I said uniquely.

When I think of a 'luminary award', sure, there are other names that come to mind, but when I think of those bullet points, only one name passes the test. But, just as an exercise, let's talk about some of the folks who might deserve an award, just not this one:

Dr. Jerry Weinberg: Author of over forty books on computing, consulting, and quality, Jerry has also run an enviable marathon of a career, going back to being the test lead on the Apollo spacecraft in the 1960s! Jerry built his own special community and inspired a generation of consultants. James Bach went so far as to refer to him once as the "Prince of Testers." But that other requirement is to focus your career on quality and testing, and I'm afraid I can't say that was Jerry's focus. Consulting, certainly, and delivery, yes, and maybe quality, but testing is not the main thing of the main thing of the main thing for Jerry the way it is for Cem.

Dr. Paul Jorgensen: My old professor at Grand Valley, Paul Jorgensen spent twenty years in industry testing telephone switches before getting his PhD and teaching software engineering for twenty more. Paul is also the author of Software Testing: A Craftsman's Approach, now in its third edition. Massively published, long career, testing and quality, yes, and I have great admiration for Paul. What I can't say is that he's built a community around him the way Cem has. Every one of Cem's projects is collaborative; every one seems to get new people involved. Paul deserves an award for individual effort, certainly, and has done a great deal to foster community in his decades of service. I just don't think he's a perfect fit for this particular community award.

Buccaneer Scholar James Bach: Probably the closest. James also helped create AST, he's keynoted at every single international test conference, he co-authored LLST, and he's invested a fair amount of time helping new testers get into the craft. Like Cem, his publication list is a mile long, and he's done a significant amount of training of testers, as well as doing the work himself. Yet he started later, and he was a contributor on one book, not a lead author of three. I've no doubt James is worthy of a luminary award; I'm just not sure it should be the very first one. I expect that in the years to come, he will become more and more worthy.

There are a few other names you could mention. Michael Bolton, for example, probably falls into the same category as James, and so do my friends Mike Kelly, Ben Simo, and a handful of others. There are also several people like Glenford Myers, Richard Bender, or Boris Beizer who did very promising work but didn't stick around to wrestle with how their ideas were implemented in practice, to see if they worked. One thing Cem and James have done is come back to revise those ideas in light of feedback from the field -- both in terms of what works, and how to explain the concept so there is a high transference rate.

Likewise, there are some agile-testing names that may be eligible for the award in a few years, but right now, I'm looking at the intersection of lifetime, published, community, testing and quality. And I'm looking at Cem Kaner.

Now I am open to other people interpreting those requirements differently, and I am open to debate. You can suggest anyone you please, and nominate and later vote for anyone you please. But I hope that, after some reflection, you will join me in nominating Dr. Cem Kaner for the STP Luminary Award within the next week. After nominations, STP will open the community to vote on the top three or four selections.

The nomination form is up on the web right now; please join me in nominating someone you respect, value, and appreciate.

Wednesday, June 16, 2010

Part of the process of dusting off Creative Chaos and reusing it again is moving my RSS feed. The new RSS feed is http://feeds.feedburner.com/CreativeChaos. In order to index the RSS feed, I need to tell Technorati, a blog indexer, that I own the blog and that the new feed is correct. To do that, they give me a random 'code' and I put it in a blog post, indicating that I truly have control of the site.

Here's the code: 8Q94285VRTT4

Thank you for your cooperation as I remodel my blog; I apologize for the mess! :-)

Tuesday, June 15, 2010

... So we're driving out to my daughter's softball game in Hopkins, Michigan, and I choose to go straight instead of turning left at the shortcut. We've got plenty of time, and if I go straight I can turn past Sacred Heart Mission, which is a nicer drive.

My wife points out that this is not the fastest route. This is not the first time I've chosen the slower route; I commonly turn out of the driveway to go downtown instead of taking M-89. Why is that?

I Pause.

Have you ever had one of those moments where you realize there is no right answer? Where the other person asks "why" or "couldn't you just...?" but they actually mean "There is no good reason" or "You should"?

Again, I Pause.

Carefully choosing each word, I say something like "well, I'm reluctant to answer. A proper answer would take a long time. But the short version is this: I suspect that I do not value efficiency as much as you do."

That did not go over well.

So let me share with you the longer version, that went over better.

Efficient ... or Effective?

We tend to use these terms interchangeably, but when you look them up in the dictionary, they mean subtly different things. Efficiency is a measure of utilization; we might say a gas furnace is highly efficient if most of the heat goes out the heating ducts and into the house -- as opposed to out the exhaust, or heating the basement. A 100% efficient furnace would let no heat escape. When it comes to technology workers, we typically mean a team is 'efficient' if everyone is working, all the time. To do this with clerical jobs, we can create an always-filling inbox, but with knowledge workers we typically need to have them work on multiple projects at the same time.

Effectiveness, however, is different. Effectiveness is how good we are at accomplishing the task at hand. Consider the typical fire department. Couched in these terms, it is not efficient - 95% of the time a paid staff is sitting around, training or cleaning the equipment. Yet when we do have those 5% of emergencies, we need someone to respond quickly.

In that case, the fire department has a conflict between efficiency and effectiveness. And, for the sake of public safety, they'll choose effectiveness every time.

Likewise, all over the world, there are fighter pilots and entire airborne brigades standing at the ready, collecting salary. 99% of the time, they'll be sitting around playing cards. But you want them standing at the ready, don't you? I sure do.

In some jobs designed a certain way -- perhaps on an assembly line -- efficiency and effectiveness can be interchangeable. Not so in software.

This isn't my idea. In his book Slack, Tom DeMarco explains how administrative assistants need a quick response time, and thus need built-in slack. Eli Goldratt, creator of the theory of constraints, talks in his opus The Goal about having factory employees waiting for the instant the machine is done, so they can load it quickly.

But what's that got to do with driving to a softball game?

Well, it has to do with optimization. Optimization is another word we like to use a lot, and it ties back to efficiency or effectiveness. Optimization doesn't just mean doing a better job at the work. It means pushing the work toward the best possible way of doing it -- the optimal way.

The thing is, I find that 'optimal' is generally only optimal for a given way of thinking.

When we do a building project, and we get to the point that we are trying to save a penny on each brick, someone will say, "hey, with ten thousand bricks, you're starting to talk about real money."

I yawn. Or, more accurately, when we get to an optimization problem, I start looking for a different way of thinking. For example, if we are near the best we can possibly get for brick, let's look into concrete or some other building supply.

Optimization, in my experience, is often a lot of work to squeeze out a little more reward.

I'm not excited about it.

In a similar way, if I'm water-sealing my deck, and I'm moderately fast at it, I don't look for a faster system of using a paint brush. Instead, I'll look into using a paint roller. I don't want to optimize paint-rolling, either -- I discovered spray-on water seal.

It's the same thing in software. Oh, sure, it's good to write a batch script to save us some typing. It's good to have an automated setup of our test environment instead of manually configuring. Don't get me wrong. But there's something that happens -- some tipping point, where we are spending a lot of mental energy to eke out an improvement from 95% to 97.5%. Me, I'll just leave it at 95% and look for something else to improve.

In some cases I just don't find any more improvement. Until hovercraft, low-flying aircraft, or the teleporter become popular, the fastest way to get to Hopkins from Allegan is going to be taking A-37.

It's going to take at least 55 seconds longer to get there if you take 20th Avenue.

Force me to choose between efficiency and effectiveness and I'll choose effective every time. Yet even when the two diverge, sometimes the difference between approaches is round-off error. And, when that happens, there are other things to consider, like aesthetics.

20th Avenue is a drive I haven't taken a thousand times, and it has its own special charms, including Fat Blossom Farm and Sacred Heart Hall.

I really do believe that print journalism needs support right now, and subscribing is win-win. If I were a better behavioral psychologist, and my goal was to get you to subscribe, I think I wouldn't provide the link. At the same time, I believe that information wants to be free, and 'walled gardens' of information are kinda silly.

So there you have it. The article is out there, it's good stuff, I hope you enjoy it.

Sunday, June 06, 2010

Do you remember when Monster.com was actually a good source of job leads? Why, when it started, only tech-savvy people, tech-savvy hiring managers, and a handful of recruiters knew about it.

Then something happened. Happy, bright people told their friends, who might not have been quite as bright. They got big. They ran a TV ad during the Super Bowl. The recruiters got wind that Monster.com was a place to find talent, and started putting out 'skills'-based job ads, so they could build a Rolodex, just in case a job popped up later.

Popularity brought the attention of other companies interested in meeting people looking to improve their career prospects -- colleges, online diploma mills, resume-writing services. These other companies started to offer Monster "business development" deals (money) for access to the Monster candidate list.

Being good guys, the Monster management refused, instead allowing you, the customer, to opt in. Over time, they started to force you to at least re-consider every so often. Then they made the 'yes I'm interested' buttons bigger and bigger and the 'no thanks but let me see the job' buttons smaller and smaller.

Eventually, all the cool kids found Craigslist. The problem was, happy, bright people who were pleased with Craigslist told their friends ...

About that same time Joel Spolsky's website, JoelOnSoftware, was hitting the top of its popularity. Joel had created a regular "gravity well" for developers, and created jobs.joelonsoftware.com. It's a super-easy, super-simple site with only a few hundred listings. When it launched -- and to some extent, even now -- you don't really need to search. Just scroll down looking for interesting gigs.

Sadly, jobs.joelonsoftware.com doesn't really have a test/QA focus, so the search goes on. Two more sites I find interesting are jobs.freelanceswitch.com, another site that did the gravity-well thing, this time for freelancers, and jobs.37signals.com.

37Signals is the company that built Basecamp, the office productivity tool, and has expanded to create a whole series of web-based collaboration tools. It's not really testing focused; if anything, they focus on graphic design -- but one thing they do have is really good writing.

Speaking of writing, the CEO of 37Signals has a monthly column in Inc. Magazine; I just got the June issue Saturday. This month's column is called "Never read another resume." It's for hiring managers, and suggests that resumes are inflated, misleading, or just plain tough to sort out. Jason suggests hiring people who write a custom cover letter or have a portfolio of work, over poring through resumes. (Or, maybe, people who find you at the cool website before it jumps the shark.)

When I started writing this, I intended to link to the article, but I'm afraid it's not available online yet. They may delay it for a few weeks (to give a perk to us paid subscribers), or they might never offer it online.

I subscribe to a half-dozen magazines, but there are only two I devour every time; Inc is one of them. You can pick up a subscription to Inc for ten bucks. For that matter, before you buy that next airline ticket, sign up for a rewards program with that airline. You'll likely earn enough points on a single flight to get a free subscription to Inc.

I have no financial relationship to Inc.; I am simply a fan.

Speaking of being a fan, I'll be at CAST 2010 in August. It'll be at the Prince Conference Center at Calvin College, but last week the hotel sold out of rooms. I recommend the Residence Inn (by Marriott) or Country Inn & Suites (by Marriott), both on East Beltline. I have to drive up East Beltline to get to the conference center, so if you're a longtime reader (ideally one I've met in person) and book a room (preferably at the Residence Inn), I might be able to swing by to pick you up.

We may also organize a few 'rebel alliance' after-conference events for CAST.

Stick around, more CAST details to come.

UPDATE: The Prince Conference Center reservation system has issues. (Shocking!) The word on the street is that they still have 5 rooms left for the conference. You can call 1-866-526-7200 to talk to a person.

Friday, June 04, 2010

It's been fun posting at "Testing at the Edge of Chaos", over at Software Test Performance Magazine, and also fun contributing to a monthly column with Chris McMahon.

At the same time, the magazine is changing. I'm happy about the changes, and I hope you will be too.

I talk a fair amount about Web 2.0 and user-created content; heck, in my day job, I test software for a social media company. Yet when you think about a magazine, a website, and a conference, the model is something out of 1953:

Second: The whole SoftwareTestPro website has been revamped, including the creation of 'crews.' Crews are a professional membership feature, something like a special interest group. The folks at STP are also throwing some support behind a local chapter program, so you can create local users' groups. A professional membership costs $100 a year, but as a test, they've opened up my "Ask the Tester" crew to anyone with the free basic membership -- at least for a limited time.

Third: The monthly column I share with Chris McMahon is changing, from a sort of encyclopedia of testing into "Ask the Tester" -- where we invite experts from the community to be interviewed. Chris wants to move on to writing features and other content, so instead of a one-man interview crew, we'll ask the community to come up with questions -- that way, our experts answer your questions. Our next interview will be with Michael Bolton; but to write the article I'm afraid I need, well ... the questions.

So please, leave a comment with a question for Michael about testing, along with your name, city, and state/province or country.

It's an exciting time for STP, and I'm pleased to be able to dual-blog now.

Thank you for bearing with our construction in progress. I hope you have questions, especially for Michael. Please, ask away! :-)