Thursday, October 27, 2005

This post is in response to a few typical interview questions posted in various user groups. I typically don't jump in and answer them - but in some cases I just cannot remain silent - some questions and answers make me speak up, as if they beg for answers. Here is one such occasion and here is how I responded....

1. What is the difference between a Bug and a Defect: There are many definitions floating around; there is no simple, universally accepted definition for these terms. In an informal setting, both defect and bug mean the same thing: some unwanted, unexpected behavior that bugs somebody who matters. This definition of a bug does not change depending upon the phase of the SDLC. A bug is a bug is a bug. The same holds for defect. I quote Cem Kaner, James Bach and Michael Bolton in this connection. Believe me, they say the same thing - few would dare question their knowledge of the testing field. As per Michael Bolton - “I say that you may define "defect" in any way that you like, as long as the person that you're speaking with or writing to understands your definition."

Michael Bolton in a Google group post --

A bug is something that threatens the value of the product, or, if you like, a bug is something that bugs someone who matters. Both of these definitions come from James Bach. Your definition may differ. "We" depends on the context of the project. On a typical project, someone (the project manager) has the authority to determine whether something (a bug, failure, fault, defect, and symptom) is serious enough to merit attention. In my context, an intermittent problem is a bug if the project manager says it's a bug. James also wrote an article on intermittence in his blog; try http://blackbox.cs.fit.edu/blog/james

Following is an excerpt from Cem Kaner’s blog - note that, according to him, using the word "defect" in a more formal context carries legal implications: if there is a defect in software, an end user can sue the producer of the software. He recommends the word "bug" as the more informal term.

Quote: Cem Kaner --

I have two objections to the use of the word defect.

(a) First, in use, the word "defect" is ambiguous. For example, as a matter of law, a product is dangerously defective if it behaves in a way that would be unexpected by a reasonable user and that behavior results in injury. This is a failure-level definition of "defect." Rather than trying to impose precision on a term that is going to remain ambiguous despite IEEE's best efforts, our technical language should allow for the ambiguity.

(b) Second, the use of the word "defect" has legal implications. While some people advocate that we should use the word "defect" to refer to "bugs", a bug-tracking database that contains frequent assertions of the form "X is a defect" may severely and unnecessarily damage the defendant software developer/publisher in court. In a suit based on an allegation that a product is defective (such as a breach of warranty suit, or a personal injury suit), the plaintiff must prove that the product is defective. If a problem with the program is labeled "defect" in the bug tracking system, that label is likely to convince the jury that the bug is a defect, even if a more thorough legal analysis would not result in classification of that particular problem as "defect" in the meaning of the legal system.

We should be cautious in the use of the word "defect", recognize that this word will be interpreted in multiple ways by technical and non-technical people, and recognize that a company's use of the word in its engineering documents might unreasonably expose that company to legal liability.

Unquote Cem Kaner.

2. Bug Severity v/s Priority - who assigns them: When developers have plenty of time and the bug arrival rate lags behind the fixing rate, nobody really cares about "Priority", and to some extent "Severity". Both are filtering mechanisms for selecting a few bugs from the whole lot so that the important ones get fixed first. Severity is a way of grading bugs from the "bug impact" point of view, so that the tester can influence the fix - say, "This is more serious and needs to be fixed first". After all, as very few people realize, the real value of a tester lies in getting a bug fixed, not simply in logging it. Severity is assigned by the tester and can be modified by the test lead if there is a real need. Thereafter it is in the developer's court. Developers use a rating called "Priority" to pick the top bugs to fix. So priority is set by the dev lead in consultation with the Program/Project Manager; sometimes even the client gets involved. This cannot happen without buy-in from the test lead. What I am describing is the IDEAL situation, and in a mature test organization this is what happens. I have seen this happen (and been a party to it) in companies like Microsoft. At Microsoft (and in many other organizations) they use a process (a ceremony, in the Agile world) called “Bug Triage”, where the dev lead, test lead and PM sit across the table with the list of bugs and deliberate on severity and priority. More often than not, the discussion is oriented more towards “Priority” than “Severity”. The Bug Triage meeting is the formal platform for changing the “Severity” and “Priority” levels.
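The interplay described above - severity rates impact, priority decides fix order - can be sketched in a few lines of Python. The field names and the 1-to-3 level scales here are illustrative assumptions, not any actual bug tracker's schema:

```python
# Hypothetical bug records: severity is the tester's impact rating,
# priority is set at triage by the dev lead/PM. 1 = most urgent/serious.
bugs = [
    {"id": 101, "title": "Crash on save", "severity": 1, "priority": 2},
    {"id": 102, "title": "Typo in dialog", "severity": 3, "priority": 3},
    {"id": 103, "title": "Demo logo glitch", "severity": 3, "priority": 1},
    {"id": 104, "title": "Data loss on sync", "severity": 1, "priority": 1},
]

# The triage outcome: developers pick work in priority order; severity
# only breaks ties. Note bug 103 (low severity, high priority) jumps
# ahead of bug 101 (high severity, lower priority).
fix_order = sorted(bugs, key=lambda b: (b["priority"], b["severity"]))
print([b["id"] for b in fix_order])  # → [104, 103, 101, 102]
```

This is exactly why triage meetings spend more time on priority than severity: priority, not severity, is the sort key that drives the work queue.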

In most of the test groups I have seen, the Severity rating unfortunately becomes a measure of a developer's or tester's performance - as in “This tester has logged 10 Severity 1 bugs” or “The module developed by you had the maximum number of Severity 1 bugs”. But then, that is another big topic of debate.

Last but not least - all those who wanted to know about bugs but did not know whom to ask should read "Bug Advocacy” by Cem Kaner - the bible on bug management. You will never have any doubts about bugs for the rest of your life.

Wednesday, October 19, 2005

Testability, to me, is a feature of the product that makes it testable (a near-synonym for “usable” here) in multiple contexts and ways. A mature testing process would always push for building testability right into the product at the design stage, when developers are more willing and accommodating in accepting “requests” from testers.

Typically, only bugs make developers listen to testers. Another important benefit of testability is that it makes test automation easier and more efficient. Testers doing automation need not spend hours and hours creating custom code to verify a test that could have been implemented as a test hook - say, a log file, a registry entry, a database record or a status bar text.

Ask for a test hook.

Just to give an example - I was looking at an automation idea where the test was to kick off a batch file, wait for it to complete, and then, based on the result of the batch file, the automation script would proceed. The problem the team was facing was how to make the script wait for the batch file to finish. One option given by a tool vendor was to use a wait command with some hard-coded wait period.

This is where the idea of a testability feature struck me. I simply asked the developer to include a feature/task of creating an empty text file or a registry entry to mark the end of the batch processing. The developer happily included that feature, which made our “wait” task easier and more efficient than putting in a dumb wait(100) kind of command.
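The marker-file approach can be sketched as a small polling loop. This is a Python illustration of the idea (the original automation would have been in the tool's own scripting language); the marker path, timeout and polling interval are assumptions:

```python
import os
import time

def wait_for_marker(marker_path, timeout=60.0, poll_interval=0.5):
    """Poll for the marker file that the batch job creates on completion.

    Returns True if the marker appeared within `timeout` seconds,
    False otherwise. Unlike a fixed wait(100), the script resumes
    as soon as the job is actually done.
    """
    deadline = time.time() + timeout
    while time.time() < deadline:
        if os.path.exists(marker_path):
            return True
        time.sleep(poll_interval)
    return False
```

The timeout also gives the test a clean failure mode: if the batch job hangs, the script reports a failure instead of waiting forever.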

Another example of a test hook is a command line interface or API to product features that are typically driven through the GUI. This eases automation script maintenance when the GUI is unstable or changing.

One more example. We were planning to automate a feature where the user would fill out a form with lots of data, which would eventually be submitted to the web server in the form of an XML file. We wanted to simulate a load on the web server by submitting a large number of such forms in a given span of time. When we asked the developer for help, he said that such a feature was not available. So we asked the development team for an API that would take the path to an XML file as a parameter and submit the contents of the file to the web server, simulating a form submission from the client end. This made the whole automation effort easy. Later on, this testability feature was extended to include many interesting test hooks that make the job of testing easy.
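A load driver built on such a hook can stay very small. Below is a Python sketch; `submit_fn` stands in for the hypothetical developer-provided API that reads an XML file and posts its contents to the server, and the concurrency level is an assumption:

```python
import threading

def simulate_form_load(submit_fn, xml_paths, concurrency=10):
    """Drive a submit-an-XML-file test hook under load.

    Worker threads drain a shared list of XML file paths, calling
    submit_fn(path) for each, so many simulated form submissions
    overlap in a short span of time.
    """
    lock = threading.Lock()
    pending = list(xml_paths)

    def worker():
        while True:
            with lock:
                if not pending:
                    return
                path = pending.pop()
            submit_fn(path)  # one simulated client-side form submission

    threads = [threading.Thread(target=worker) for _ in range(concurrency)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
```

Because the hook bypasses the GUI, the same driver can replay hundreds of captured XML files without a single browser or form in the loop.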

Michael Hunter talks about testability on his blog - it makes for good reading.

Monday, October 17, 2005

Here are some points about resume writing that I collected from various sources. I am not sure about the last point... people do include (me too) their personal (contact) information in the resume so that they are "contactable".

I have been observing that people write resumes that are typically 8-10 pages. As a hiring manager, I mostly see the first 2-3 pages; if the profile has not attracted me by then, I will not read any further. My rule of thumb is to have a page for every 2 years of experience. I have also observed that people write about their current and past employers at length - often about half a page for each employer. This is wasteful and is going to make your resume less readable and less attractive. Sell yourself in the resume, not your current or past employers.

The final point is to treat your resume as an ad that you put on TV to market yourself in the job market. Decide what should go into such an ad. Do you think people sit and watch ads that are an hour long and lack focus?

Read on...

Resume Writing: Seven errors common to an average resume

Too wordy. A résumé should be one page in length (one side only), or two pages at the most. A résumé is primarily an introduction - in the same way an advertisement is primarily an introduction - and should be under conscious control every inch of the way. Basic outline: Position Desired; Summary of Qualifications; Education; Skills; and, Employment.

Contains salary requirement. This is a big mistake. If you list a salary requirement it may well appear, to someone who has yet to appreciate your real value, to be too high or too low, and you may never get the chance to explain or elaborate. The thing to do is first make a favorable impression, and evoke some corporate response. There will always be time later to negotiate your salary - after the company decides it likes you and wants you and you're in some kind of bargaining position. It may be that their offer will not require negotiation.

"Me-oriented". Excessive use of the words "me" or "I", and prominent use of phrases such as "I seek," "my objective," etc., are to be avoided. Employers want to know what you can do for them. You must lead off with and elaborate on your benefit to the employer; play up to what you think are the employer's objectives.

Assumes too much reader comprehension. This takes the form of listing and explaining numerous accomplishments, courses taken, etc., not necessarily related to your position objective.

Contains unnecessary and confusing information. (Different from being too wordy). You must be specific. Everything in your résumé should support and point to a single skill/expertise. In advertising, the simplest ad is best. No ad, no matter how high-powered, can sell several concepts at once. Neither can a résumé.

Stiff, formal language. Don't be flip, but make it readable. Aim for your audience and the people you want to impress. In short, communicate.

Includes personal information. Do not include any personal information beyond your name, home address, and home phone - that's it.

Friday, October 14, 2005

Here are a few points from a post I made to the QTP Yahoo group, in response to a query about FAQs/interview questions on QTP. My main motivation for posting this here is that I see most new entrants in this field do not know where to invest time and effort to learn the field of testing. Often they end up reading some FAQ or list of typical interview questions posted on some site and think that they have arrived. Software testing today suffers from a lack of education and awareness about "what it takes to be a software tester" and "how to successfully carry out and add value to the overall process of software development".

Read on .....

Here is my advice to all aspiring QTP or test engineers and professionals. These are lessons I learnt personally, and they are useful for any software professional who is serious about testing.

1. Do not look for shortcuts to learning and gaining knowledge. Have a long-term plan to get good mileage out of this profession. FAQs etc. are good to read only for the top line. To succeed in the interview you will have to win it from inside your heart: invest honestly in studying and expect the fruits. Banking on FAQs, interview questions etc. may get you the job but will not keep you there.

2. Most important for a tester is to understand: what makes a good tester? How is he/she different from a developer? What value does a tester bring to the table? How do you find talent in testing and nurture it? How is testing different from QA or any flavor of process (CMM, Six Sigma), etc.?

3. Invest in sharpening your problem-solving and "thinking out of the box" abilities. Read good material on testing. Participate in conferences, discuss with test professionals in other companies, participate in activities at SPIN, etc. Solve puzzles (jigsaws or Shakuntala Devi's puzzle books). Never stop learning.

4. Sharpen your technology skills. Know about "how the web works", DNS, networking, protocols, XML, web services, cryptography, databases, data warehousing, UNIX commands, the fundamentals of J2EE, .NET, system administration - the list is endless. Today testers are expected to know the basics. I take a lot of interviews for various positions, and most people do not have these basics. It is difficult to survive in this world of testing banking only on "automation tool" knowledge.

5. Improve your communication skills - take an English class. Improve your vocabulary. Read, read and read. Most of the people I have seen ignore this important skill; they cannot write a paragraph on their own without spelling and grammatical mistakes. Make it a habit to learn a new word a day.

6. Read and write blogs (Google the word "blog" if you don’t know what one is already). Here are a few blogs that I suggest for every test professional.