Raji Bhamidipati and I co-facilitated a day-long workshop at ExpoQA Madrid in June. We shared techniques testers can use with their teams to build shared understanding at the product, release, feature, iteration and story levels. Participants worked in teams to try out the various techniques and contribute their own experiences with ways to enable shared understanding among the delivery team and customer team.

This was a new workshop for me and Raji. We based it on a tutorial that my co-author Janet Gregory did at a recent conference. It includes some valuable agile business analysis techniques and ideas from Ellen Gottesdiener and Mary Gorman in their book Discover to Deliver. We are grateful for all this terrific material we were allowed to share. Our slide deck is available, but this workshop was about the participants doing and practicing more than about Raji and me talking. I got a lot of new ideas to try myself!

Adding value as testers

We encouraged a mindset shift from bug detection to bug prevention. We shared some essentials for stories and requirements.

Table groups practicing techniques to explore requirements

The INVEST criteria from Bill Wake apply to feature sets and slices of features as well as to individual stories. The 7 Product Dimensions from Ellen Gottesdiener and Mary Gorman have been a huge help to me when discussing proposed features with business experts. We can consider these dimensions together with the Agile Testing Quadrants to help explore different aspects of a feature at the appropriate time.

A case study from an imagined tour bus company provided participants with features and requirements to explore. We used a tester’s mindset, the 7 Product Dimensions, INVEST and the agile testing quadrants to come up with questions about functional requirements and “non-functional” quality attributes. Each group worked on a different dimension and shared its questions on a wall chart of all 7 Product Dimensions.

Techniques to explore requirements

One table group’s story map exercise

As we moved down the levels of precision from release down to stories, participants tried out several different ways to explore requirements with customers: user story mapping and personas (see Jeff Patton’s excellent book), context diagrams, process map / flow diagrams, state diagrams, scenarios, business policies, example mapping (see Matt Wynne’s post), and various approaches to guiding development with business-facing tests – acceptance test-driven development, behavior-driven development, and specification by example. Participants practiced writing these acceptance tests together. Combining a story, examples, rules, acceptance tests and, most importantly, conversations helps us all get on the same page about each requirement.
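To make this concrete, here is a minimal sketch of how concrete examples from an example-mapping conversation can become business-facing acceptance tests. The booking rule, the `book_seats` function, and all values below are invented for illustration; they are not from the workshop’s actual case study.

```python
# Hypothetical business rule for an imagined tour bus company:
# "A booking is confirmed only if enough seats remain and payment clears."
# Each assertion below encodes one concrete example a team might agree on
# with a business expert during an example-mapping session.

def book_seats(seats_available, seats_requested, payment_ok):
    """Toy implementation of the invented booking rule."""
    if seats_requested <= 0:
        return "rejected"           # nonsensical request
    if seats_requested > seats_available:
        return "waitlisted"         # not enough seats left
    return "confirmed" if payment_ok else "payment-failed"

# Examples agreed in conversation, now executable acceptance tests:
assert book_seats(seats_available=10, seats_requested=2, payment_ok=True) == "confirmed"
assert book_seats(seats_available=1, seats_requested=4, payment_ok=True) == "waitlisted"
assert book_seats(seats_available=10, seats_requested=2, payment_ok=False) == "payment-failed"
assert book_seats(seats_available=10, seats_requested=0, payment_ok=True) == "rejected"
```

The point is not the code itself but that each line captures one shared, unambiguous example, so the conversation, the rule, and the test stay in sync.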

Obstacles and experiments to overcome them

Throughout the day, participants used the “speed car – abyss” retrospective / futurespective activity to identify what’s holding their team back, what’s helping them move forward, what dragons may lurk in their future, and how to overcome those. A lot of common themes emerged, as well as unique issues and new ideas.

A retro chart from one of the table groups

Parachute

It wasn’t surprising to see that missing, incorrect, misunderstood and changing requirements drag down many teams. Teams struggle because of poor communication with customers and end users, unavailable product owners (POs), poorly-defined priorities, and dependencies on other teams or pods, whether within the company or external. Many teams lack the time, resources and skills to do useful test automation, and are slowed down by a reliance on manual regression testing. Programmers and testers on newer agile teams often aren’t used to working together and have communication issues; for example, programmers may dismiss bugs reported by testers. Some teams are even missing basics such as CI, or aren’t allowed to self-organize their own workflow. Slow feedback loops, lack of testers, too much work, and inflexible deadlines were also common themes.

I was more surprised that some teams are held back by confusion over who should test what, and oversimplifying features and stories. I also hadn’t expected that testers and teams often don’t know how to communicate problems to managers. Teams need to be creative for these types of challenges.

Engine

Participants had good ideas early in the workshop to put some gas in their teams’ engines. Many want to try the 7 Product Dimensions. Visualization techniques such as story mapping and process flow diagrams looked like good options to many. We had discussed Three (or four or more) Amigos or Power of Three meetings along with example mapping, which some participants are already using, and others are keen to start. Models such as the test automation pyramid and agile testing quadrants help power some teams’ engines. Pairing is also seen as a good way to overcome drag. Quite a few participants are already doing exploratory testing, and more want to try, including techniques such as investigating competing products.

Some ideas I was reminded of by participants included using the MoSCoW (Must/Should/Could/Won’t) prioritization method. Several participants cited smaller, self-organizing teams, pods or squads as a key to being able to effectively explore requirements and build the right thing. Team-building activities were mentioned as a way to help with that. Someone mentioned a firefighter role, someone who comes in and helps in sticky situations, which intrigued me. Another great idea is to add people to help coordinate activities among teams and manage dependencies for delivering software that meets requirements.

Abyss

Many participants fear the same pitfalls they’re already experiencing, such as changing requirements, unclear requirements and priorities, and stories added during the iteration. They’re worried that they’ll build the wrong thing, or fail to deliver on time. Broken or non-existent test environments lurk in the abyss. So do miscommunications with the PO and customers, misunderstanding of business rules, and a lack of documentation.

One insight is that teams get stuck in old habits and can’t get out of their comfort zone. They don’t experiment with new ways to discuss examples and business rules with customers, or spend time learning the business domain so they can better understand customer needs. Some teams may get tripped up by working on more than one big feature at the same time. Others get stalled during planning, or don’t get test data in time, and then can’t deliver on schedule. Often they’re faced with unachievable deadlines, or are at the mercy of bad business decisions and micro-management.

Technical debt is a common pitfall. This also limits a team’s ability to deliver on time. Lack of testing skills and insufficient knowledge transfer can take a team down and result in incorrect implementation of features. Some participants worried that their team would lose sight of the business goals and priorities as they get bogged down with problems and technical debt. Some just don’t have enough people, especially testers, or enough time to get their work done. This also means they’re not being allowed to self-organize. Interestingly, some people feared spending too much time on testing.

Automation was again much discussed. Some teams have no test automation, others focus so much on automation they fall short on other testing activities. They worry about missing edge cases or backward compatibility issues.

Assuming that the business experts will be available for conversations and questions can lead to trouble. Another interesting insight was the possibility of a team confusing agile with ad-hoc, and going off in the weeds.

Bridge

Another group’s retro chart

My favorite suggestion from a participant to help build a bridge over the abyss is “learn mind reading”. I help with customer support on my team, and mind reading would really help there too! Here are many other great ideas to successfully explore requirements and build shared understanding with business experts.

Many participants plan to apply the INVEST criteria to features at planning and grooming meetings. Using visual techniques such as flow diagrams for complicated user stories is a good way to avoid misunderstandings and make sure edge cases are covered. Simplifying stories and making them independent is also part of our participants’ bridge. Structured, visual discussion frameworks such as story mapping and example mapping help clarify details before testing and coding start. Get examples of desired and undesired feature behavior and business rules. Use personas to help elicit requirements. Slice big stories smaller.

Participants feel it’s important to work in small increments and take “baby steps”. Some plan to experiment with BDD and SBE; they thought starting on a small scale or doing a spike would help get buy-in to give it a try. One group suggested combining these approaches with quality attributes to help make sure they think of everything. Others plan to experiment with Kanban and other processes. Finding ways to measure progress is important to see whether various techniques are helping the customer and delivery teams understand what to build and avoid unnecessary rework. Simply agreeing on a definition of done helps.

Another important area for the bridge is educating management. This helps teams get the time and resources they need, along with the ability to self-organize and manage their own workloads. It also may help ensure that stakeholders, designers and other key players are available for conversations and collaboration, and give teams room to experiment with new practices such as pairing. More support from management also helps with finding ways to solve dependency issues among teams and improve cross-team communication.

Some participants want more test automation, including at the unit level, to help build their bridge. This is one way to help keep bugs out of production. With more time available, they can learn the necessary skills.

What will you try?

I agree with the group that suggested passion is an important way to bridge safely over the abyss! We need to be dedicated, disciplined and excited about collaborating across the delivery team and with the customer team. Conversations, examples, tests and requirements at all levels of precision combine to help us delight our customers and end users with great features. As testers, we can contribute so much to help our team deliver value frequently, predictably and sustainably.

What will you try to help your customers and your delivery team collaborate to specify and deliver valuable features?