Last Saturday, I attended my first monthly meet-up hosted by the Testing Camp in Timisoara. This session placed special emphasis on Go.CD and Docker, while aiming to introduce the topic to those of us still unfamiliar with such tools and frameworks. Since I’ve mainly focused on Manual Testing so far, I was eager to get a taster of what Automation is all about, especially in the context of Continuous Delivery. So I packed my laptop and joined the fellow Meeples who had registered for the session, delivered by Alina Ionescu and facilitated by Camil Bradea, Iulian Benea and Ecaterina Ganenco.

To assist us in this process, Alina had informed us in advance about the tools we needed to install on our computers. During the actual session, she also provided a step-by-step guide for us to rely on while creating our testing environment. Once we had received the instructions, we paired up and started working on our assignment. It was quite challenging to navigate our way through the Terminal by typing and executing text-based commands, but it was all the more satisfying when the steps started rendering results. Whenever we got side-tracked by an error message or some other constraint, we exchanged views with other teams or received support from the facilitators, who mingled and tackled as many questions as possible. By the end of the meet-up, I was thrilled to have set up my own GitHub account and to have performed my first Commit and Push. Creating the necessary stages and jobs was equally rewarding.
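For anyone who wants to retrace that first Commit and Push at home, here is a minimal sketch of the Git steps, driven from Python in a throwaway directory so it is safe to run; the file name, commit message and placeholder identity are my own illustrative choices, not the exact ones from the session:

```python
import pathlib
import subprocess
import tempfile

# Practise in a throwaway directory so nothing outside it is touched.
repo = pathlib.Path(tempfile.mkdtemp())

def git(*args):
    """Run a git command inside the sandbox repo and fail loudly on errors."""
    return subprocess.run(["git", "-C", str(repo), *args],
                          check=True, capture_output=True, text=True)

git("init")
git("config", "user.email", "you@example.com")  # placeholder identity
git("config", "user.name", "Workshop Attendee")

# Stage and commit a first file -- the "first Commit" from the session.
(repo / "notes.md").write_text("My notes from the Testing Camp meet-up\n")
git("add", "notes.md")
git("commit", "-m", "First commit")

# The "Push" would follow once a GitHub remote exists, e.g.:
#   git remote add origin https://github.com/<user>/<repo>.git
#   git push -u origin master
log = git("log", "--oneline").stdout
print(log.strip())
```

The stages and jobs we then created in Go.CD essentially wrap command sequences like this into repeatable pipeline steps.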

Most of the participants managed to run an automated test on their machine within the allotted time frame. Even those who had only passed some of the stages simply resolved to try again at home, since we could all revisit the steps in the guide that Alina had provided at the beginning of the session.

To sum this whole experience up, I’ll just leave you with my “Lessons Learned”:

This year, I registered for the Autumn Camp for the first time, and it proved to be a memorable and engaging learning experience. Instead of attending presentations, I had the opportunity to get acquainted with content owners who favoured the workshop format. Hands-on practice was the motto, and each session mapped the journey towards a measurable learning goal. The content owners employed scaffolding techniques and strove to accommodate any emerging learning needs. My only regret is not having been able to select more than three workshops. On the other hand, having eight parallel tracks on a daily basis, followed by innovative Test Labs, provided the 76 participants with diversity and the opportunity to pursue their topics of interest at leisure.

Without further ado, I’m just going to walk you through my experience at this year’s edition of the Autumn Camp, powered by TdT (22–25 September 2016). Since I’m a passionate trainer & coach myself, you’ll also get some insight into that perspective, in addition to my interests in the field of testing. Enjoy! 🙂

To be or not to be Agile?

The first workshop I attended covered an introduction to Agile and was delivered by Camelia Codarcea, the co-founder of AgileHub in Brasov. A seasoned Scrum Master with hands-on experience in Romania and abroad, Camelia skilfully balanced theory and practice within the allotted time slots. The morning session revisited the Agile Manifesto while engaging the participants in a lively discussion about the 12 principles. We could each draw on our experience with Agile and its various frameworks, while also considering shortcomings in implementation. Another topic that prompted us to contribute revolved around splitting epic features and delivering functional software at the end of each sprint. The challenges and benefits of achieving Simplicity (“the art of maximizing the amount of work not done”) were also on the agenda.

This exchange anticipated the focus of the afternoon session, which featured group work for a practical display of Scrum. Once divided into teams of four, we each received our scenario and handled the assignments while measuring our progress against the Scrum Board. Having phrased the User Stories in our Backlog, we practised estimating and prioritizing by means of Planning Poker. Allotting as little time as possible to Planning & Grooming leaves more time for the actual tasks; it also reduces the frustration that comes with the inherent variables of Continuous Delivery. Our hypothetical ride wasn’t smooth either, since the sprint we simulated was interspersed with changes in the customer’s needs, which we had to tackle by redefining priorities. This was meant to give us a deeper understanding of the framework and of the fact that changes are not mere whims that trigger a reshuffling of User Stories. The differences between Agile and Waterfall were thus easily traceable. We then went through the Daily, Review and Retrospective, with each team designating a speaker for the debriefing, though other team members could always pitch in and add relevant information. I really enjoyed working with my fellow team members Roxana, Paula and Loredana. Our different professional backgrounds enabled us to handle change at a faster pace and to take over tasks accordingly. Although we were on a tight schedule (both in terms of the simulated sprint and the actual workshop), the whole endeavour was truly pleasant. Diversity for the win!
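Planning Poker itself boils down to a simple loop: everyone votes with a Fibonacci-style card, and widely spread votes trigger a discussion and a re-vote. A minimal sketch of the consensus check, where the card deck and the spread threshold are just illustrative choices:

```python
# A typical Planning Poker deck (truncated Fibonacci-style sequence).
FIB = [1, 2, 3, 5, 8, 13]

def consensus(votes, spread=1):
    """True when all votes fall within `spread` adjacent cards of each other."""
    idx = [FIB.index(v) for v in votes]
    return max(idx) - min(idx) <= spread

round1 = [3, 5, 13]   # an outlier -> the team discusses and re-votes
round2 = [5, 5, 8]    # close enough -> take the estimate and move on
print(consensus(round1), consensus(round2))  # -> False True
```

The point of the card deck is that the gaps between values grow with the estimate, mirroring how uncertainty grows with story size.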

Explore all Avenues!

The second workshop I had registered for featured two passionate content owners, “veterans” of the Testing Camp: Oana Casapu (Cluj) and Claudiu Draghia (Bucharest). Their session placed special emphasis on debunking the misconception that exploratory testing lacks an underlying structure. The activities Oana and Claudiu had in store for us followed a well-structured progression towards achieving this goal.

The first step dealt with semantics in a technical context. Once we had established the difference between an “approach” and a “technique”, we were prompted to embark on an individual assignment, bearing in mind what Cem Kaner put forward: “Exploration and script-following reflect broad visions about the best way to organize and do testing, not specific tactics for designing individual tests. Therefore, we call them approaches rather than techniques.” The ensuing debriefing session rendered interesting results. It also showed that, throughout the individual assignment, the participants had favoured techniques they were already confident with or had an intrinsic affinity for: Domain Testing, Risk Testing, Usability Testing, Load and Stress Testing, etc.

That is why the subsequent activity aimed at encouraging us to dig deeper into our toolbox of techniques and rethink our strategy in context. Coming up with mnemonics to sort through the variety of techniques would prove equally effective. To aid us in drawing up our mission, Oana and Claudiu introduced us to Session-Based Test Management (SBTM) and its metrics. Paradoxical though it may sound, this instrument is highly useful in structuring exploratory testing: it accounts for coverage, quantifies effort and enables backtracking per session. The SBTM format, in addition to a bug reporting tool, can provide more accurate insight into a tester’s activity. The group work we delivered with SBTM drew us out of our comfort zones and had us tackling not only the challenges of an unfamiliar application, but also those of a different mindset. I, for one, intend to experiment with the SBTM template. And since we tested various functionalities in Impress to reach our conclusions, I’m thinking about carrying on with this activity in my spare time. Because testing is fun!
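To give you an idea of the kind of metrics SBTM yields, here is a minimal sketch of the classic TBS breakdown, i.e. the share of a session spent on Testing, Bug investigation and Setup; the field names and the sample charter below are my own illustrative choices, not an official template:

```python
from dataclasses import dataclass

@dataclass
class Session:
    """One SBTM session sheet, with minutes logged per activity."""
    charter: str
    test_min: int   # T: test design and execution
    bug_min: int    # B: bug investigation and reporting
    setup_min: int  # S: session setup

    def tbs(self):
        # The TBS breakdown: percentage of the session spent on each activity.
        total = self.test_min + self.bug_min + self.setup_min
        return {label: round(100 * minutes / total)
                for label, minutes in (("T", self.test_min),
                                       ("B", self.bug_min),
                                       ("S", self.setup_min))}

session = Session("Explore slide transitions in Impress", 60, 20, 10)
print(session.tbs())  # -> {'T': 67, 'B': 22, 'S': 11}
```

Tracked across sessions, numbers like these make it easy to spot, say, a feature area where setup keeps eating into actual testing time.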

Let’s Put this to Good Use!

Since I’m keen on all things frontend-related, I chose my third workshop accordingly. This particular afternoon session looked into the ins and outs of Usability. Andreea Popescu (Cluj), a specialist on the matter, gave us an enthusiastic and well-researched view of the concept, while engaging the participants in an interactive and meaningful learning experience.

Andreea defined SMART objectives from the very beginning and enabled us to track them throughout the workshop by correlating each of them with a practical activity. Group work was the core method. First of all, the four groups brainstormed “Usability” in the broad sense, only to conclude that no single definition is fully comprehensive. This taster paved the way for the second exercise, during which we researched the cultural aspects of the countries we had drawn earlier in preparation for this stage. Alex, Levi and I thoroughly enjoyed researching the specifics of the Spanish UI, colour scheme, browsing and spending habits, etc., while efficiently dividing the workload, rating and documenting our findings.

Now that we had the overview, we could put it to good use during the third exercise. Andreea had prepared another batch of information for us to extract. This time, each group drew a website. The assignment revolved around testing the respective website and identifying various usability features and issues. To guide us through this process, we had print-outs with usability questions and tools. Apart from the “Spain-ready” features (in the words of Alex :-)), we came across aspects that needed to be adapted for the Spanish go-live, in light of what we had researched. They ranged from the layout and colour scheme to various technical and linguistic inconsistencies. The subsequent debriefing session created a lively context for intercultural exchange among the groups. Truly inspiring!

My “Lessons Learned”

Since I’m very fond of drawing up lists, colour-coding and mapping any learning outcome, I couldn’t help but summarize this engaging experience. This is what I’m taking with me:

You ARE Agile. You don’t “do” Agile.

Scrum is a framework. Not a methodology.

“Approach” and “technique” are not interchangeable terms in testing. However, you can apply most techniques in an exploratory and/or scripted manner.

The definition of Usability is not set in stone. While its main focus lies in identifying patterns or innovating by combining them, it is the cultural aspect that actually shapes the approach to development and testing alike. Solid arguments rely on thorough research. You can sometimes employ a comparative strategy to make improvement suggestions. Various tools enable you to conduct usability audits.

Last but not least, this may well have been an “Autumn” Camp, but TdT is in season all year round. If you wish to stay up-to-date with technical topics, learn something new, give a hand in facilitating a workshop or simply experience a sense of belonging, just join the monthly meet-ups and become part of a passionate community! See you around! 🙂

“I’ve noticed that a lot of testers struggle with risk analysis, and yet risk analysis is really the core purpose of testing. So, for this presentation, I want to engage you in a risk analysis exercise.” – James Bach