Tag: testing

To make good use of my time while “waiting for the stars to align”, which means waiting for some managers to remove impediments, I started writing down some real-life notes about how to set things up for a miserably “failing” test automation strategy.

Let’s start with the communication details. The first thing you should completely avoid is talking to product owners or anyone holding a job title that contains words like “Manager”, “Director”, “Chief” or “God”. If you are so unlucky that they come to you and ask what you are doing, under no circumstances explain what you are going to do or how your craftsmanship could benefit them. A good enough reply would be something along the lines of “Oh, well… blah blah blah” (insert a technically obscure, verbose sentence here, so complicated that even you don’t understand it). And hold your face straight, without any doubt or smiles, until they fade away thinking that they are such big shots and you are such a big guru in your subject.

The next step, after you have avoided explaining any detail about what you are going to do, is to avoid any collaboration with developers and manual testers. They are in another league of competence and will never understand what you are trying to do. Do not waste time explaining your needs to them or how you could help the team achieve a better outcome for their craft.

The developers are busy creating what is popularly known as technical debt. Nowadays everyone likes debt; debt is what holds the world together. If you do not have debt, you are not human, so coders are dedicated full time to that goal. Do not disturb them.

As for the manual testing team, well, they are almost testing a completely different application. Their testing has nothing to do with automation, and it is again a complete waste of your time to try to align with them on which tests should and should not be automated. Do not even think about disturbing their peaceful, boring, procedural manual testing either.

The last thing you must not do is spend time trying to convince people that test automation could help everyone on earth. That is a taboo subject, and you risk your job, potentially your life, if you even think about it. If you keep trying to do the right thing, your teammates will see some sort of automation symbol in your eyes and you will be branded as a strange geek.

I will stop here for now because I am getting so excited that I could smash the laptop against the window and…

Keep safe. Keep playing the game. At the end of the day, most people are having an immense, hilarious amount of unquantifiable fun!

This post is open for collaboration. I will add any good contributions on this subject. I am sure that after all this time working on our beloved testing projects we have learnt something, haven’t we?

“Those who cannot learn from history are doomed to repeat it.” George Santayana

For those who want to avoid all these failures, here is a professional resource to back up our miserable experiences. In chapter 9, “RoI Robbers in Test Automation”, Greg Paskal has a superb list of points that have to be considered when building up a testing strategy.

Update by Francesco Calvino

To extend the not-to-do list, make sure that your core application has no attributes like:

“readability” (the ability to understand what the app does just by reading the code)

“testability” (elements properly and clearly identified on the page, log files that explain the error that has just occurred, and an application that is not coupled to its test data)

“stability” (the test environment should be stable, not going up and down like a yo-yo)

Also avoid decoupling, where tests do not depend on the data that feeds them; and, why not, you should not have instrumented builds either (an instrumented build is one where your application is enriched with third-party tools that enable automated tests to run). This last one is actually not a bad rule: an instrumented build is the surest sign that the application under test is NOT exactly the same as the application you will ultimately ship to your customers, and thus all your testing campaigns against it will be utterly useless.
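As a counter-example to the “avoid decoupling” advice above, here is a minimal sketch in plain Python (hypothetical function and data, no framework assumed) of a data-driven test where the checking logic is written once and the data lives in one place:

```python
# Hypothetical function under test: normalizes a European-style
# price string such as "19,99 €" into a float.
def normalize_price(raw: str) -> float:
    return round(float(raw.replace("€", "").replace(",", ".").strip()), 2)

# The test data is kept separate from the test logic; in a real
# suite it could live in a file or fixture and change freely
# without touching the checking code below.
PRICE_CASES = [
    ("19,99 €", 19.99),
    (" 5,00€", 5.0),
    ("100,10 €", 100.1),
]

def failing_price_cases(cases):
    """Return the cases that do not normalize as expected."""
    return [(raw, exp) for raw, exp in cases if normalize_price(raw) != exp]
```

A suite built this way survives data changes without code changes, which is exactly the decoupling the satire tells you to skip.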

For now I think that is enough, Marc…

Many thanks, Francesco. I hope it helps; please leave a comment if you would like to add something.

Do not automate stories at the 4th and 5th levels; do spikes and understand first by trying and testing. (Create failing test cases.)

Convert 4th- and 5th-level stories into 3rd-, 2nd- and 1st-level stories.

Write “spiky code” before production code at the 4th and 5th levels. At lower levels we write production code directly.

For stories at the 1st level, test manually just once. Do not write BDD for obvious stories; only create BDD test cases for complicated and maybe some complex stories. Do not plan obvious stories, just do them.

New things continuously coming in is part of the game. Reduce problems to fit within the team’s boundaries of influence. Break dependencies outside of the team.

Legacy projects sit at the 4th and 5th levels of ignorance.

“Teams can spike, learn from the spike, then take their learning into more stable production code later (Dan North calls this “Spike and Stabilize”). Risk gets addressed earlier in a project, rather than later. Fantastic!” Liz Keogh
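The spike-and-stabilize idea above can be sketched in plain Python (hypothetical example, no framework assumed): a throwaway spike teaches you how the problem actually behaves, and the learning is then pinned down in stable code.

```python
import csv
import io

# Spike-and-stabilize sketch (hypothetical example). First, a
# throwaway spike written only to learn how the problem behaves.
def spike_split_csv_line(line):
    # Quick and dirty: just split on commas and see what happens.
    return line.split(",")

# The spike reveals that naive splitting breaks on quoted fields
# such as 'a,"b,c"'. With that learning in hand, the stabilized
# version delegates to the csv module instead.
def split_csv_line(line: str) -> list[str]:
    return next(csv.reader(io.StringIO(line)))
```

Running the spike on `'a,"b,c"'` produces three broken pieces, while the stabilized version returns the two intended fields; the point is that the failing experiment came first and paid for the learning.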

That was all for this session. I hope it helps, please leave a comment if you would like to add something.

Awareness of software QA rose to a new level in the last decade. However, there has been some natural resistance, as usual for any new evolutionary movement. This resistance is the new trend of merging development and quality assurance into a kind of “DEVQA” role. The movement promotes: “We have just developers! NO QA roles required!”

“The most fundamental problem in software development is complexity. There is only one basic way of dealing with complexity: divide and conquer”
Bjarne Stroustrup

In the current highly complex world, the quote from Bjarne Stroustrup certainly fits this debate. To merge developer and QA skills into one role is to go against the ‘divide and conquer’ principle. Yes, it is possible, of course; it just depends on what level of quality is required. When developing simple applications, software architecture and testing are not really critical, and DEVQA roles could be enough. However, it is important to remember that software solutions tend to grow, rather fast, and if the team hits the jackpot it will be very expensive to refactor in time.

In Spanish there is a saying, “Juntos pero no revueltos”, which means “together but not mixed”. Becoming a QA specialist requires years of real work experience, just as becoming a proficient developer does. The ideal goal should be not to mix the two subjects but to reduce the distance between them.

However, nowadays, with the intention of being “Agile”, people get confused and start crossing the line. The original trend was to reduce the distance between DEV and QA, but some teams cross the limits too far and mix everything, anyhow and for any reason. “Because we are agile!” Yeah, happy Agile party! Everything gets mixed into a “DEVQA” orgy.

Therefore, in this new chaotic “agile” environment, the questions raised to organise workflows are: Who is more important? Who has the main responsibility? Who shouts the loudest? These are the wrong questions. Instead, team members should start asking: “Who can contribute more to a given task?” “Who has more expertise and can thus be the reference leader for that task?” “And who is the reviewer?” In a healthy environment we should not even need a team leader to define these roles. Good professionals know when to lead and when to follow.

At the end of Joe Colantonio’s podcast with Gerald Weinberg, Gerald explains in more detail that in the good old days the QA team was formed by selecting the best developers in each area, so the collaboration between Devs and QA was optimal, precisely because there was no huge separation of concerns and there was huge respect for each other. Yes, they were a team of only developers; the key point here is that the QA team was focused on building tools to improve quality rather than on shipping new features.

There is another great source for digging into more detail about why promoting DEVQAs is a bad practice. Joel Montvelisky has a great post called “Why can’t developers be good testers?”. Even if the title is not perfect (it should be something like “Why developers cannot focus on testing”), the post has a good summary of key reasons:

1. Developers have “parental feelings” towards their code.
2. Developers usually focus only on the “positive paths”.
3. Developers work on the principle of simplifying complex scenarios.
4. Developers are unable to catch the small things in the big picture.
5. Developers lack the end-to-end and real-user perspective.
6. Developers have less experience with common bugs and application pitfalls.

While all the points are core principles of a good QA strategy, I would highlight the fifth one, the “real-user perspective”. This is a core concept of testing; if it is missing, the whole QA strategy becomes a hoax. The QA expert Neeraj Tripathi emphasizes this point clearly in Joe Colantonio’s podcast “Principles of Effective Software Quality Management with Neeraj Tripathi”. As Neeraj states, “a QA specialist needs to keep customer perspective in mind”. Customer experience is the first-class driver for a tester. One way to achieve this could be to do more BDD. Moreover, the main strategy is to have QA people in operational meetings and in direct contact with customers on a daily basis.
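To make the “more BDD” suggestion concrete, here is a minimal given/when/then sketch in plain Python (hypothetical shopping-cart domain, no BDD framework assumed), phrased from the customer’s perspective rather than the code’s:

```python
class Cart:
    """Hypothetical shopping cart used only for this illustration."""
    def __init__(self):
        self.items = []

    def add(self, name, price):
        self.items.append((name, price))

    def total(self):
        return sum(price for _, price in self.items)

def test_customer_sees_correct_total():
    # Given a customer with two items in the cart
    cart = Cart()
    cart.add("book", 10.0)
    cart.add("pen", 2.5)
    # When they view the total
    total = cart.total()
    # Then it matches what they expect to pay
    assert total == 12.5
```

Even without a framework, naming the test after a customer outcome and structuring it as given/when/then keeps the user’s perspective at the centre of the check.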

In summary, a QA expert should keep a customer mindset with full knowledge of the technical capabilities, while a Dev should have a more focused technical mindset. By correctly leveraging both mindsets, the project will be able to find the best strategies to succeed.

After all, we should be able to agree that having both mindsets in the same brain opens the dangerous highway towards the new world of testing hoax strategies.

Many thanks for reading, please leave a comment if you have a testing hoax workaround.