The Testing Show: GDPR with Daniel Leigh

July 18, 2018

Like many software users, it’s a good bet that you received a lot of pop-up messages or emails telling you that you had to log in to various sites, read a number of new statements, and provide consent to companies to let you use their services. If that’s how it seemed, you are not alone; there was indeed a lot of that, and it was for a very specific purpose. After May of 2018, individuals and companies doing business with or using data belonging to anyone residing in the European Union had to, by law, make it clear that users had rights concerning how their data was being used and that they had choices as to what they could do about it. These changes come from the General Data Protection Regulation, or GDPR, which is now the law regarding data protection and privacy for all individuals within the European Union (EU) and the European Economic Area (EEA).

On today’s show, we welcome Daniel Leigh who helps us understand GDPR a little better, what the ramifications of these new laws are, what they mean to companies doing business in or dealing with data relating to anyone residing in the EU, what they mean for software development and data protection and privacy, and how software testers can help make a difference in this new reality of consent and privacy.

Additionally, in our news segment, what would it have cost various companies like Yahoo, eBay, and Equifax, based on how they responded to their data breaches in the past, were they subject to the new GDPR rules?

Transcript:

MICHAEL LARSEN: Hello and welcome to the Testing Show, Episode… 58.

[Begin Intro Music]

This show is sponsored by QualiTest. QualiTest Software Testing and Business Assurance solutions offer an alternative by leveraging deep technology, business and industry-specific understanding to deliver solutions that align with clients’ business context. Clients also comment that QualiTest’s solutions have increased their trust in the software they release. Please visit QualiTestGroup.com to test beyond the obvious!

PETER VARHOL: Hi, folks. Happy to be here, and happy to contribute to this. Thank you, Mike.

MICHAEL LARSEN: We’d also like to welcome Perze Ababa.

PERZE ABABA: Hi, everyone. Thank you for having me.

MICHAEL LARSEN: We’d like to welcome our special guest to the show (and I want to make sure I’m pronouncing this right), Daniel Leigh?

DANIEL LEIGH: That is correct. Yep. Thank you for inviting me.

MICHAEL LARSEN: All right. Fantastic. We are going to go ahead and get started today. So, Matt, let’s have you do your thing.

MATTHEW HEUSSER: Hey, thanks. So, today, we’re talking about GDPR, the European regulation that enhances privacy and security, required security, for companies that collect private information. Now, Daniel is the expert—right?—but private information could be as simple as your e-mail address. Right?

DANIEL LEIGH: Quite correct. The textbook answer for that question is: any single piece of information, or multiple pieces of information, that can identify an individual. You know, like, for myself, Daniel Leigh, that piece of information on its own, rather unfortunately for everybody else, wouldn’t be personal data, because there could be hundreds of thousands of Daniel Leighs (heaven forbid). But, when you start to add e-mail addresses and other bits of information, that’s when it starts to identify an individual. So, that’s when it turns into personal information.

MATTHEW HEUSSER: So, if you’ve been on a website in the past 2 months, you’ve probably seen an increasing number of these pop-ups that say, “Hey, we collect your data. Go here to learn about it, or just click this X to make this pop-up go away,” and that is driven by GDPR. Every company needs to make its policies for collecting data clear. Well, we will get into that some more. Basically, being good web citizens is now required by law if you collect information on Europeans. If you have a website, you probably have Europeans going to your website.

PETER VARHOL: Matt, you made a good point there. It’s not just for European companies. It’s for any company that has registered European visitors, customers, whatever it might be, which is much broader, which involves many of the e-commerce and social media companies in the United States and in other parts of the world, for example.

MATTHEW HEUSSER: Yeah. I was just with a QualiTest customer. I can’t tell you much about them, but they are a supplier to a major manufacturer. They’re entirely in the United States, and the manufacturer is entirely in the United States. They kind of try to keep their suppliers in the United States, but they have customers in Europe. So, they had to do GDPR compliance. Before we dive into the real meat of it, I wanted to talk about this article in Forbes. It’s kind of neat. It looks at all of the data breaches over the past few years, before GDPR was in effect, and calculates the dollar cost of the fines those companies would have incurred, typically in the tens-to-hundreds-of-millions. Those times we got a letter saying, “Hey, we might’ve just given away your e-mail address and password. We don’t know for sure. Can you, uh, change that?” I’m a free-market person, but the number of data breaches over the past few years just really bothers me. The idea that government is now teaching companies to be good web citizens? I like it. Michael? Dan? Do you have any thoughts on the article?

MICHAEL LARSEN: Well, I was going through and looking through some of the options here, and one of the things worth talking about is to consider not just the fact that, “Oh, you know, there’s a data breach.” Or, “Oh, we have to fix it.” They actually took 3 of the biggest data breaches recently. They looked at Yahoo, eBay, and Equifax (for those who remember the Equifax breach) and considered what the actual fines would have been if those breaches were to have happened today, with the way that they responded to those breaches. 3 billion user accounts were breached via Yahoo in 2013 and 2014, considered the largest data breach in history. They didn’t disclose the breach within 72 hours. In fact, it took them until October of 2017 to fully acknowledge the multiple breaches that occurred. They would’ve faced, basically, anywhere from $80 million to $160 million in fines had this happened today. For eBay, £10 or £20 million would have been their fines, and then there’s the Equifax breach (look at this). They had 143 million consumers compromised, and it doesn’t say the exact fine rate that they would’ve been under. But, with $3.1 billion in revenue for 2016, it would’ve been [LAUGHTER] a very significant number.

DANIEL LEIGH: Yeah. I think one of the big factors in each of the 3 areas in which they fell down is that notification period. So, obviously, there are big issues in, you know, identifying your personal data, tracking that, and making it clear as to what that data is used for. Obviously, it’s having the technology as well, to be able to detect potential breaches or ones that have occurred. Some of the lighter ones, like the eBay one, for example: I think it was discovered early May, but they did not notify until later in the month. I think that was one of the shorter periods; but, again, it was still outside of the 72-hour requirement. So, I think that’s going to be (certainly for these bigger companies) one of the big aspects to get their heads around and actually be able to implement and sort out so they can do that in the future.

MATTHEW HEUSSER: Oh, that’s interesting. Let me see if I get that right. The company is going to have to develop a mechanism to get the information out to all the affected customers in a timely manner from when the breach is discovered, and that’s going to be a process.

DANIEL LEIGH: Correct.

MATTHEW HEUSSER: Possibly e-mailing or paper mailing hundreds-of-millions. I mean, if it was Facebook, it would be billions of users.

DANIEL LEIGH: Yeah. I think one of the bigger aspects is not necessarily the notification, or the logistics of notifying all the people. It’s actually having the technology to detect the breach. I think that’s where, certainly, like you mentioned with the Yahoo incidents, there’s quite a large gap from the actual breaches to when they were actually identified. I think that’s going to be one of the big areas where a lot of companies are going to have some issues and certainly have to spend a little bit of money to be able to resolve.

MATTHEW HEUSSER: Is that a process that would need to get tested?

DANIEL LEIGH: Definitely. Yeah.

MATTHEW HEUSSER: So, what does testing look like? Well, let’s explore GDPR for just another minute. Tell me if you disagree. This is a summary of a document that is hundreds of pages long, so I’m going to get it wrong. But, I think of it like “create, read, update, delete” with private information. I, as a private citizen, need to be able to know what Yahoo is keeping on me. I need to have “create, read, update, delete” rights on all of my account information, and I think the new piece is “delete.” I need to be able to go in and say, “Hey, Etsy. I know I ordered some books 6 months ago, but I want you to delete my information and not have it anymore.” They need to be able to do that. Then, if they are hacked 6 months later, there’s no way the hacker could get my information, because it’s not on there anymore. I think that’s new; and, of course, there’s publishing your policies and notifying people in case of a breach. I think those are the big pieces of GDPR. Did I get that right, and did I miss anything?
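Matt’s “create, read, update, delete” framing can be sketched in a few lines of Python. This is purely illustrative; the class and method names here (PersonalDataStore, erase, and so on) are hypothetical, not from any real GDPR tooling.

```python
# Hypothetical sketch of the data-subject rights Matt describes.
# An in-memory dict stands in for a real user-data backend.

class PersonalDataStore:
    def __init__(self):
        self._records = {}  # user_id -> dict of personal data

    def create(self, user_id, data):
        self._records[user_id] = dict(data)

    def read(self, user_id):
        # Right of access: the user can see what is held on them.
        return dict(self._records.get(user_id, {}))

    def update(self, user_id, changes):
        # Right to rectification: the user can correct their data.
        self._records.setdefault(user_id, {}).update(changes)

    def erase(self, user_id):
        # Right to erasure: after this, a later breach cannot expose the data.
        return self._records.pop(user_id, None) is not None


store = PersonalDataStore()
store.create("u1", {"email": "alice@example.com"})
store.update("u1", {"country": "DE"})
assert store.read("u1")["country"] == "DE"
assert store.erase("u1") is True
assert store.read("u1") == {}   # really gone, as Matt puts it
```

The point of the `erase` path is exactly Matt’s Etsy example: once the record is deleted, a later hack of this store has nothing to leak for that user.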

PERZE ABABA: I think we have to discuss the notion of personal data as well and what that actually means to the extent of what we’re doing. Specific biographical information, you know, appearance and looks and behavior, is actually pretty tricky to deal with from a removal and tagging perspective. There’s a lot of stuff there that we could really dig into.

MATTHEW HEUSSER: We should get into, “What is personal information and what isn’t?” But Dan, did I get it right?

DANIEL LEIGH: Yeah. The end-user now has a lot more rights over their information. So, quite rightly, as you mentioned, they have:

Obviously, the right to be informed, which has always been there in the data protection.

The right to be able to request and see what data a particular company or entity has on them.

They have the right to erasure. They have the right to have the information deleted.

To restrict processing as well. So, they can actually limit the uses of that data.

Also, about data portability as well, to be able to have that data and be able to transport that to another system.

So, yeah, there are a number of rights that the individual now has, which is new for GDPR.

MICHAEL LARSEN: Having just gone through this process myself, back in May, when we had to get our site up-to-date and ready for GDPR, one of the things that I noticed, at least from our perspective, is that we enabled 2 different workflows, and we could determine which one we were going to put in. On one side was a literal consent-or-decline option. So, if you’re set for consent or decline, it says, “Hey, here’s all the information. Here’s what we gather. You have to give us consent to actually work with this. If you do, you can use the site. If you decline, we log you out, and you can no longer access the site.” Then, there’s the question of, “Well, hey, if I decline and I don’t want to use this, then how can I get my data?” We give you directions on how to do that. The other workflow we had was one that just had acknowledge. That just said, “Yep. Okay. I acknowledge that’s the case.” That’s the “I click here so that this pop-up goes away.” But, in both cases, what I found interesting was, “Okay, what do we do with your data, and if you want to get your data or remove it, how do we do that?” It raised more questions in my mind, as in, “So, what are we doing with all of this, actually?” [LAUGHTER]. And, “How are we effectively making sure that this is happening?” So, I know that this is going to be a longer conversation. But, again, that opens up to me those 2 options for workflows: consent versus acknowledge. Is there a fundamental difference, really, and is that common? Are you seeing that there is a literal, “You have to consent to do this”? Or is it just a matter of, “As long as you acknowledge that we’re doing this, we’re in the clear”? And, does it vary?
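The two workflows Michael contrasts can be sketched as a small dispatcher: “consent” mode gates access and offers a data-export path on decline, while “acknowledge” mode only dismisses a banner. All names, modes, and return values here are made up for illustration.

```python
# Toy sketch of the consent-vs-acknowledge workflows described above.

def handle_gdpr_prompt(mode, user_choice):
    """Return what the site should do next.

    mode: "consent" (declining blocks access) or "acknowledge" (banner only).
    """
    if mode == "consent":
        if user_choice == "consent":
            return "grant_access"
        # Declining logs the user out, but the site still explains
        # how to export or remove your data.
        return "log_out_and_offer_export"
    elif mode == "acknowledge":
        # Any click dismisses the banner; access is never blocked.
        return "dismiss_banner"
    raise ValueError(f"unknown mode: {mode}")


assert handle_gdpr_prompt("consent", "consent") == "grant_access"
assert handle_gdpr_prompt("consent", "decline") == "log_out_and_offer_export"
assert handle_gdpr_prompt("acknowledge", "ok") == "dismiss_banner"
```

The fundamental difference Michael asks about is visible in the branches: only the consent workflow ever refuses service.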

DANIEL LEIGH: Yes, it does vary. It might be worth just mentioning that, obviously, there are 6 particular lawful bases for the processing of personal data:

Consent, which is obviously the one that you’ve mentioned. In layman’s terms, yes, it’s literally: you fill out a form. It’s got to be made clear what the data you’re putting in that form is going to be used for, and then you give consent, whether it’s by ticking a box or by other means, that you consent to that data being used for that reason.

There are other lawful bases as well, so just to quickly touch on them:

You’ve got contractual. That can apply very much to employee information, the right to hold employee data for the purpose of employment. Obviously, the lawful basis there is the contract of employment.

There’s legal obligation, which is another one. It is as it describes.

There’s also public task and vital interests. For vital interests, if data needs to be used to save a person’s life, that’s one example the ICO regularly gives. Then, that data can be used for that.

Another one is legitimate interest, which is probably one we will touch upon if we talk about marketing. If, for example, a particular company has an existing client that has signed up for one service, but the company also does other bits of business in other areas, the company could consider that the client has a legitimate interest in being notified about those particular products and services. They can be notified under the lawful basis of legitimate interest.

So, yes, you mentioned “consent,” but there are other lawful bases for being able to store and collect personal data.
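For illustration only, the six lawful bases Daniel lists could be modeled as an enumeration that each stored record must reference. The names below are a paraphrase of the conversation, not legal terminology pulled from any specific library.

```python
# Hypothetical model: a record of personal data carries the lawful
# basis under which it is processed; no basis means no processing.
from enum import Enum

class LawfulBasis(Enum):
    CONSENT = "consent"
    CONTRACT = "contract"
    LEGAL_OBLIGATION = "legal obligation"
    VITAL_INTERESTS = "vital interests"
    PUBLIC_TASK = "public task"
    LEGITIMATE_INTEREST = "legitimate interest"

def may_process(basis):
    # Processing requires some recorded lawful basis.
    return isinstance(basis, LawfulBasis)

assert len(LawfulBasis) == 6
assert may_process(LawfulBasis.LEGITIMATE_INTEREST)
assert not may_process(None)   # no recorded basis: don't use the data
```

This mirrors Daniel’s later point about marketing lists: a contact with no recorded basis simply cannot be used.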

PETER VARHOL: Daniel, I’m interested in what a company has to do to be able to comply with this. Is there a true certification, or is there a standard set of best practices that don’t provide the company with any protection if, in fact, there is a breach? How does it work from the standpoint of the company? What are they required to do within their systems, and is there really a certification?

DANIEL LEIGH: There is no certification. So, there is no certificate to put on the wall, so to speak. We look at things like—

MATTHEW HEUSSER: I’ll sell you one. [LAUGHTER].

DANIEL LEIGH: When you look at things like ISO, for example, you can go through the process to meet the requirements of an ISO 27001, for example, for information security.

PETER VARHOL: Right.

DANIEL LEIGH: You can meet the standards of that. You can get externally audited. If successful, you get a certificate to put on the wall. Obviously, I’m simplifying that a lot. If that was done properly, the business practice and the business process would be a lot safer and a lot more streamlined; the certificate is just the visible result. With GDPR, it comes down to protecting the individual. The individual has rights. Data processors and data controllers also have responsibilities for what they do with that data and how they handle it. It’s those rights and responsibilities that form the GDPR documents, which obviously need to be followed and put in place from the business perspective.

PETER VARHOL: Okay. So, let’s look at a company’s systems. They may be operating in the Cloud or something like that. Are they required to configure their databases in certain ways? One of the things that I’m familiar with, with GDPR, is transferring data. The way you transfer data changes under GDPR too. Between databases and between systems or between datacenters and things like that. Is that true?

DANIEL LEIGH: It does, in terms of data location. Obviously, there are now restrictions on transferring data outside of the EU. If we remember the old US Safe Harbor arrangement, now replaced by Privacy Shield, there are a number of frameworks which will satisfy the GDPR requirements for transfer of data. So, yes, there are data transfer restrictions, and there are lots of things around that. But also, in terms of configuration, a user now has the right to access their data and to port that data elsewhere, so there would need to be some additional configuration within the database to be able to meet those requirements as well. Yes, there would be some changes required, for database configuration, for example, to meet the requirements.

PERZE ABABA: Quick question on that one: Are there very specific requirements on what we can do to identify personal data so that we can mask that and use that within a testing activity, for example, or is that just completely not allowed at this point?

DANIEL LEIGH: Again, I think what we’ve got to remember is, “What is personal data?” Personal data is a piece of data or a collection of data that can identify an individual. For example, a single name on its own would not be classed as personal data, but when you start to add other pieces of data to that, for example an e-mail address or an IP address, it then starts to be personal data. Where we would start to anonymize data, or have a collection of data for testing purposes, you may have names and addresses, potentially mixed up. Would that identify an EU individual? If the answer is “yes,” then that is classed as personal data. If the answer is “no,” then it wouldn’t be classed as personal data.

MATTHEW HEUSSER: Let’s say you’re an insurance company in the United States. You have databases that have claims. Claims link back to member IDs. Member IDs are unique and do not include the Social Security Number, but I think that’s still personal data, because the member ID is on their card and you could use it to try to get services or something, and then that table has name, birthdate, and other personal stuff in it. The member table has that information in it, and you want to do testing in a test environment with real, live-ish data. What you used to do in the bad-old-days, the dark ages, is that the test environment would just be periodically refreshed from production, and now, this is not really GDPR as much as it is HIPAA, we say, “If there’s no good reason, testers don’t need that information.” Like, maybe fourth-level support does, to debug a problem, but testers don’t need that information. So, we should give testers access to the test environment, but we want tests to be realistic. So, we had to go through a cleansing process, and there are a few tools to do this. HP sells a tool that isn’t terrible. It’s expensive. I have no business relationship with HP. The tool copies your data, keeps the same table format, and it anonymizes it and randomizes it. So, you’ve got data that’s realistic that you can run operations on, but it’s not live production data. It’s not member-identifying anymore. Is GDPR going to require us to follow a process like that?
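A minimal sketch of the cleansing step Matt describes (not HP’s actual tool): keep the table shape, pseudonymize the member ID deterministically so cross-table joins still line up, and replace the identifying fields. The field names here are assumptions for illustration.

```python
# Toy production-to-test cleansing: same columns out as in,
# but no member-identifying values survive.
import hashlib
import random

FAKE_NAMES = ["Pat Doe", "Sam Roe", "Alex Poe"]

def anonymize_member(row, salt="test-refresh"):
    out = dict(row)
    # Deterministic pseudonym: the same real ID always maps to the
    # same fake ID, so claims still join to members in test data.
    digest = hashlib.sha256((salt + row["member_id"]).encode()).hexdigest()
    out["member_id"] = "M" + digest[:8]
    out["name"] = random.choice(FAKE_NAMES)
    out["birthdate"] = "1970-01-01"   # flatten to a fixed date
    return out


row = {"member_id": "12345", "name": "Alice Real", "birthdate": "1984-03-09"}
clean = anonymize_member(row)
assert clean["member_id"] != "12345"
assert clean["name"] != "Alice Real"
assert set(clean) == set(row)   # same table shape, cleansed values
```

The salted-hash trick is one common way to keep referential integrity across tables while breaking the link back to a real person; whether that alone is enough for a given dataset is exactly the “would this still identify an EU individual?” question Daniel raises.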

DANIEL LEIGH: Not strictly speaking, no. But, obviously, the process that you’ve just mentioned has different stages to it. At the highest level, the data controllers have the personal information prior to any cleansing process, and obviously there are a lot of requirements there to be met. But, if there was a particular business area or company that just receives the cleansed or anonymized data, then obviously that wouldn’t be classed as personal data, and the restrictions wouldn’t be applicable to it. But, obviously, at that source, the GDPR requirements are still there.

PETER VARHOL: To follow up on that, I think that one of the things we’re looking at is outcomes here, and that is that a user has control over their data, that data is responsibly protected from breach, and, if there is a breach, that the user is notified. Dan, I will follow up with this question: Do government entities also have to follow GDPR? I think that how a company does that is entirely up to them, but I think that GDPR is really looking more at outcomes from the standpoint of, “Are reasonable protections taken, does the user have control over their data, and is the user notified?” Am I correct in that regard?

DANIEL LEIGH: Yes. Going back to the original question: government and public-services organizations are certainly not exempt from GDPR compliance, definitely not. Speaking frankly, thinking about being a data protection officer within one of the larger government sectors, certainly some of the ones in the UK, with the requirements, the scope, and the technology considerations, I certainly don’t envy the people that carry out that particular role.

PETER VARHOL: But, they’re focused more on outcomes as opposed to what the nuts and bolts of those particular processes are. Because everybody has different processes, different nuts and bolts. Correct?

DANIEL LEIGH: Yeah. Correct. Yeah.

MATTHEW HEUSSER: Let me play ignorant executive for a minute. “You want to test GDPR, which means you need to go to the test website, you need to see the little pop-up. You click, ‘Tell me more.’ You need to read the policies. They need to be consistent with our policies. You need to click, ‘I want to control my data.’ You’re going to be able to click, ‘Delete the data,’ and it needs to be really gone, or, ‘Update the data,’ if you change it. I agree, and then we put it in production, and then you do the same thing in production to make sure production matches test. We’re done. This is a 25-minute testing job. Why are we talking? I’ve got deals to close.” How do you respond?

PETER VARHOL: It varies depending upon the data, but for example, if you’re handling credit cards, you have certain requirements as defined by the Payment Card Industry Security Standards Council. One of them is to regularly monitor and test networks to ensure that they are both functional and secure. So, I see an ongoing monitoring responsibility here by testing, and I see an ongoing testing responsibility to make sure that nothing has changed within that network, that infrastructure, that’s going to compromise that data. Does that sound right, Daniel?

DANIEL LEIGH: Yeah. Definitely. Because, like you say, you can have your processes, your nuts and bolts, and they can be different nuts and bolts, like you mentioned. But, like you said, that’s just one part. Obviously, to have that in place, it’s got to be initially tested, but it also needs the ongoing testing, like you mentioned, to ensure that what’s been put in place in terms of people, processes, and technology is regularly tested and audited, however a particular company may want to do that, to ensure that it’s actually doing its job. That feeds as well into being able to meet that tight notification deadline.

PETER VARHOL: So, Matt, in answer to your question, I think it is somewhat more complex than making sure that a user can access and, if necessary, delete their data. An executive has to do a little bit more than say, “I can look at or delete my data.”

MATTHEW HEUSSER: Yeah. I would say something like, “That covers us for today, but the problem with the Yahoo breaches and all those breaches is that they were secure at one time, until they weren’t.” So, we’ve got to do ongoing checking. I think testing the process to release the information in the event of a breach is going to be a significant piece. I mean, do you want to physically send letters in the mail to all 3 billion members? What’s that going to cost you? Just the stamps and the postage is going to cost you like $100 million.

MICHAEL LARSEN: At least.

MATTHEW HEUSSER: [LAUGHTER]. So, what you need to do is figure out who was actually impacted, and you need to figure out if the breach was bad enough that you need to send a letter out or whether you can send an e-mail out, and you need to figure that out within 72 hours, which is 24 business hours. So, we’re going to have to have a pretty tight process, and testers are pretty good (well, some of us) at helping to make a process tighter. I mean, that feedback loop is a huge portion of lean software testing.
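The 72-hour clock Matt is describing is easy to make concrete. A hypothetical helper, assuming the clock starts when the breach is discovered:

```python
# Sketch of the breach-notification deadline: from discovery,
# the regulator must be notified within 72 hours.
from datetime import datetime, timedelta

NOTIFY_WINDOW = timedelta(hours=72)

def notification_deadline(discovered_at):
    return discovered_at + NOTIFY_WINDOW

def met_deadline(discovered_at, notified_at):
    return notified_at <= notification_deadline(discovered_at)


discovered = datetime(2018, 5, 3, 9, 0)
# Notifying two days later is inside the window...
assert met_deadline(discovered, datetime(2018, 5, 5, 9, 0))
# ...but the eBay-style weeks-later notification would not have been.
assert not met_deadline(discovered, datetime(2018, 5, 21, 9, 0))
```

Testing the process, as Matt suggests, would mean exercising the whole detect-triage-notify pipeline against this clock, not just the arithmetic.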

PETER VARHOL: Matt, I think you’ve raised a very good point, in that you’re not only testing the systems, you’re testing the process here with GDPR.

MATTHEW HEUSSER: Yeah, which is something I think we should get better at as a community.

MICHAEL LARSEN: So, let me switch it up here a little bit. I want to approach this from a little bit of a different angle. So, let’s say, for example, I have an outbound marketing campaign to run and I want to say, “Hey. We’ve got these events coming up or we’ve got these options coming up.” I have a mailing list. Under GDPR, what changes now? What are the things that I have to consider? What does GDPR now require of me to be aware of and to do, specific to (say) that campaign?

DANIEL LEIGH: We talked about consent before. Certainly, in terms of marketing, we’ll look at the contact database, for example, that you have, that you normally send your campaign information to. Under GDPR, we need to be able to show that, for each contact in there, we have consent. If we don’t have that consent, we shouldn’t be using that data. You’ve kind of got the chicken and the egg: if you use the example you mentioned, sending them an e-mail and then asking, “Can I send you this information?”, you’ve already asked the question. You’ve already breached the particular rights of the user there by not having a basis to contact them on. A lot of companies, you know, will have the website, the web forms, with particular wording on them. Essentially, it needs to be clear and concise exactly what they’re signing up for. We’ve all been there in the past, haven’t we? We sign up for one particular thing on a particular website, and then we get inundated with all sorts of other things that we didn’t even know we were signing up for. Obviously, GDPR is designed (not just for that) to reduce that and to protect the rights over an individual’s data. Certainly, one of the changes is being able to evidence the lawful basis a particular company has for being able to send out that marketing information. If a particular company had a database of 5,000 contacts but they don’t have consent from any of them, from previous sign-ups or any other basis, unfortunately that data can’t be used. So, it can have quite a big impact, certainly, on marketing campaigns.
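Daniel’s point about the contact database reduces to a filter: no recorded lawful basis, no e-mail. A toy sketch, with made-up contact fields:

```python
# Only contacts with some recorded lawful basis may receive the campaign.

contacts = [
    {"email": "a@example.com", "basis": "consent"},
    {"email": "b@example.com", "basis": None},                 # unusable
    {"email": "c@example.com", "basis": "legitimate interest"},
]

def mailable(contact_list):
    return [c for c in contact_list if c["basis"] is not None]


sendable = mailable(contacts)
assert [c["email"] for c in sendable] == ["a@example.com", "c@example.com"]
```

Daniel’s 5,000-contact example is this filter returning an empty list: a database full of addresses with no recorded basis yields zero sendable contacts.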

PETER VARHOL: Daniel, let me ask you this: Within companies that have data on EU citizens, who in these companies is taking responsibility for GDPR? Is it a new role? Is it being folded into one or more other roles? What’s your experience there?

DANIEL LEIGH: Well, when you look at GDPR compliance, you see the data protection officer role, but that doesn’t apply to every organization. There is a list along the lines of, “If your company does X, Y, Z, you may need a data protection officer.” In some companies, you may have, say, a financial director who has looked after information security; they may take on the role. For others, there would need to be a specific role, the data protection officer. So, it could be a new person, a new role brought into a company, or it could be additional responsibilities for an existing person.

PETER VARHOL: Matt, to answer your question, I think we’ll find out what happens when the first major breach occurs.

MICHAEL LARSEN: [LAUGHTER].

MATTHEW HEUSSER: Yeah, absolutely. I mean, do you guys remember? Was it Dodd-Frank or was it Sarbanes–Oxley? I mean, my CEO was running around crazy like, “Oh, no. I’m legally responsible for these financial statements,” and like nobody ever got arrested.

MICHAEL LARSEN: That was SOX. I recall a lot of that as well. I think, for my final word on this: I appreciate the fact that this is now an option, that we basically have control, at least in a manner of speaking, over what we share; and, if we decide that we don’t want to participate with a company, they are legally bound to turn off the spigot and let us export our data. I think that’s really cool, and I appreciate that. On the other hand, there is a part of me that is still mildly skeptical about all of this and wondering if it is really going to amount to very much; but I guess, as Peter said, “Let’s see when the next data breach happens,” especially if it happens with EU data, and what results from it. That will be the real indication.

PETER VARHOL: Daniel, do you have any thoughts on that?

DANIEL LEIGH: GDPR, for the individual, is great. Because, you know, we’ve all had the phone calls. We’ve all had the e-mails. We’ve all had the texts. All the different methods of contact that we just don’t want, that we’re not interested in, and we probably can’t even remember when we signed up for, if we even did at all. Like you said, to have the increased rights over our data is great. From a business perspective, yes, for me personally, I think it’s good. Any directive, any process, any technology that helps with information security, that can keep our data secure, from an employee perspective and for the data that our company processes, I think is a big positive. But, yeah, I’m very curious to see, like you guys, where the next big data breach is going to come from, and what actually comes from that. So, yeah. It’s certainly something to keep a close eye on.

PETER VARHOL: Okay. Thank you.

MICHAEL LARSEN: All right. Well, we are now winding down the show. This is our classic shameless self-promotion section. Daniel, since you’re the newest member to our podcast, this is your chance to basically tell us a little bit more about: Where can we find you? Do you have a presence online? Any way people can ask you questions? Are you appearing anyplace? This is where you can toot your own horn.

DANIEL LEIGH: I’d love to. But I think the answer to all those questions is, “No, not really.”

ALL: [LAUGHTER].

DANIEL LEIGH: From the sound of my voice, you can probably tell, I sit in the background. I travel to lots of company sites. I make sure that, you know, we have the people, the technology, and the processes in place around information security and quality management to ensure that, from a business perspective, from a process perspective, from a customer satisfaction perspective, and also obviously from an information security perspective, we’re working to the best of our abilities. I certainly want to get more out there, to have different conversations with various people about what people go through and to actually talk about that. So, probably the only thing I can say is: you’ll probably be hearing more from me soon.

PETER VARHOL: [LAUGHTER].

MICHAEL LARSEN: All right. Fantastic. Peter, how about you, man?

PETER VARHOL: I’m happy to be invited to participate in these Testing Show Podcasts. I think they’re wonderful; and, if you want to get in touch with me: peter@petervarhol.com. Thank you.

MICHAEL LARSEN: Perze, what are you up to?

PERZE ABABA: Hey. Well, work, work, work. The good-ole W-O-R-K. But, outside of that, I will just probably take this time to go over stuff that I’ve learned from today’s conversation. One thing that really stands out for me is, you know, the notion of accountability, whether we have good or bad or horrible or ignored practices in how we deal with data. Because of this law, we are now going to be accountable for the things that we do. So, I guess, let’s just be mindful of how we handle these things, and make sure we have the ability to let everybody know and give people the chance to opt out.

MICHAEL LARSEN: I like it. Matt, how about you?

MATTHEW HEUSSER: You guys know all about me and what I’m working on. I’m going to be at CAST (Conference of the Association for Software Testing) in August. I just got back from Agile Testing Days United States in Boston last month, and mostly I’m enjoying the July holiday in the United States and catching up with friends and family, but I’m sure I’ll have more to talk about soon.

MICHAEL LARSEN: All right. Sounds good. On my end, I’m kind of trying to focus on specifically 2 talks that I’m going to be delivering at Pacific NW Software Quality Conference. One of which is on, “Future-Proofing Your Software and Taking Accessibility and Inclusive Design Principles on a Broader Scale.” Also, with my friend, Bill Opsal, we are putting together a workshop on basically a choose-your-own-adventure game on when you have to build a testing framework, and it’s the idea that we’re going to look at all the other things that often don’t go into the discussion when you’re having to decide to put a testing framework in place. We’re having fun putting it together, and it’s still in the works. So, with that, I guess that’s it for shameless self-promotion. Matt, back to you.

MATTHEW HEUSSER: Well, and that’s the show. Thanks everybody for coming, and we’ll be in touch soon.

MICHAEL LARSEN: All right.

PETER VARHOL: Thank you, Matt and Mike.

MICHAEL LARSEN: Thanks for coming out, everybody.

DANIEL LEIGH: Cheers. Thank you very much.

—

MICHAEL LARSEN: That concludes this episode of The Testing Show. We also want to encourage you, our listeners, to give us a rating and a review on Apple Podcasts. Those ratings and reviews help raise the visibility of the show and let more people find us.

Also, we want to invite you to come join us on The Testing Show Slack Channel as a way to communicate about the show, talk to us about what you like and what you’d like to hear, and also to help us shape future shows. Please email us at TheTestingShow(at)QualitestGroup(dot)com and we will send you an invite to join the group.

The Testing Show is produced and edited by Michael Larsen, Moderated by Matt Heusser, with frequent contributions from Perze Ababa, Jessica Ingrassellino and Justin Rohrman as well as our many featured guests who bring the topics and expertise to make the show happen.

Additionally, if you have questions you’d like to see addressed on The Testing Show, or if you would like to BE a guest on the podcast, please email us at TheTestingShow(at)qualitestgroup(dot)com.