The feature we’re most excited about with TDS 4.0 is support for naming conventions. Most of us work on teams that contribute to the same code base. It is important that the unified product of that team’s effort be presented consistently no matter who contributed the code. Choosing to name variables or functions a certain way can make your framework more cohesive and easier for everyone to use.

By default, Test Design Studio comes pre-configured for the most basic naming conventions around character casing for language elements. These defaults are based on generally accepted industry norms for VBScript and include:

Variables and parameters start with a lower-case letter and capitalize each new word.

Functions, subs, and properties start with an upper-case letter and capitalize each new word.

Class names start with an upper-case letter and capitalize each new word.

Constants use all upper-case letters with underscores between words.
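Taken together, the defaults describe code like the following sketch (all names are purely illustrative):

```vbscript
Const MAX_RETRY_COUNT = 3            ' constants: all upper-case with underscores

Dim loginAttempts                    ' variables: start lower-case, capitalize new words

Class LoginHelper                    ' class names: start upper-case
    Public Function BuildUserName(firstName, lastName)  ' functions: start upper-case;
        BuildUserName = firstName & "." & lastName      ' parameters: start lower-case
    End Function
End Class
```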

The following illustrates a Sub whose name begins with a lower-case letter instead of an upper-case letter:
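A minimal sketch of such a violation (the routine name is illustrative):

```vbscript
' Flagged by the naming rules: Subs should begin with an upper-case letter
Sub clickLoginButton()
    ' ... body omitted ...
End Sub

' The convention-compliant form
Sub ClickLoginButton()
    ' ... body omitted ...
End Sub
```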

The violations for naming rules are displayed in the Error List along with any syntax errors and code analysis feedback. Each violation is also underlined in the editor with green “squiggles” to draw attention to the oversight.

These default rules are a great start for naming conventions, but individual policies at your organization are likely far more complex. Since every organization is different, we designed this feature from the beginning to be user-driven. All naming conventions are based on a series of rules in an XML-formatted file. We’ve provided a powerful set of criteria to help you define your individual conventions. We’ve even provided a working sample of a much more complicated rules file that you can use as a template for your own rules (look at ‘CodeAnalysis\rules.sample.typePrefix.xml’ under the TDS installation directory).

Not only can you define different rules based on item type (e.g. Sub, Function, Variable), you can also define rules based on the content. Do you name integer variables one way and boolean variables another way? No problem! Different convention for public vs. private items? We have that, too!
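For example, a team using type prefixes and private-member prefixes might expect names like these (the prefix scheme here is purely illustrative; your actual conventions come from your own rules file):

```vbscript
Dim intRowCount      ' integer variables prefixed with "int"
Dim blnIsLoggedIn    ' boolean variables prefixed with "bln"

Class Account
    Private m_strOwner        ' private fields prefixed with "m_"
    Public strDisplayName     ' public fields use the plain type prefix
End Class
```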

You can change the location of the naming rules XML file in the same spot where you turn individual code analysis rules on/off by selecting “Tools –> Options” from the main menu.

While we’ve tried to prepare a solid foundation for the rules engine, we know our customers will be the truest test of when the feature is complete. We fully support the functionality, but are releasing it under a “Beta” tag for now. We’re confident in the core functionality for the default rules we have provided, but we need to hear from more customers about how they want to implement rules.

If you are unable to implement your naming conventions using our present rules engine, we want to hear from you! Please contact us with examples of the rules you want to implement. If we can’t make the current rules engine work, we’ll see what we can do to add the support you need.

We hope you enjoy this new feature, and look forward to hearing your feedback.

With the recent release of Unified Functional Testing 14, we are happy to report that Test Design Studio continues to be fully compatible with the latest version. In fact, it doesn’t appear that the core GUI Test functionality has changed at all… and hasn’t for years! That should speak volumes about how much this manufacturer cares about those who still write code for UFT, but that’s why we have Test Design Studio anyway!

It is with great pleasure that we announce the pending arrival of Test Design Studio 3, and the following is a glimpse of what is to come:

A New Focus

Those who have been using Test Design Studio for a while know that the idea of what would be TDS 3 has shifted over time. Our initial goal was to take TDS into the future with an entirely rebuilt user-interface and re-architected back-end services. That undertaking proved to be bigger and more complicated than expected, and resulted in one delay after another. As time wore on and we analyzed each part of the system, we realized what we already had was pretty great. The passion and commitment to build a first-class editing experience for QuickTest Professional code authors was prevalent throughout, and could not be ignored. Instead of making our loyal fans wait any longer for the new release, we decided to refocus our efforts on bringing the already great application forward into the expectations of today.

We started with the user interface. While purely aesthetic, the appearance of an application is something that impacts every user during every moment they are using the software. The moment the TDS splash screen goes away, you are presented with a clean and modern UI. Gradient effects of yesteryear have been replaced with a flat and clean appearance that results in fewer visual distractions and improved focus on your work.

What came next was a tremendous effort that, had we not told you, you may never have known. We refactored our code. A lot of code! Test Design Studio was first built in the age of .NET Framework 1.1, and the .NET Framework has seen significant improvements since. Design choices that had to be made based on the limitations of the early .NET Framework were no longer necessary. There was functionality written for TDS v1.0 that had exceeded its relevancy or had eventually gone in a different direction through the iterations of our software. We looked at everything with fresh eyes and adjusted the design as needed. Even many of the rewritten subsystems for the new design were incorporated into the existing design where possible. What was the result? The code is now cleaner and more structured. This translates to easier debugging, reduced maintenance, eradication of many existing bugs, and increased confidence to make changes without introducing regressions.

A New Integration

Perhaps the largest user-facing difference in this release over prior releases is a change to how we integrate with HP Quality Center or Application Lifecycle Management (ALM). Many would be surprised to know that TDS 2.5 still fully works with Mercury TestDirector 8.0! It is easy to forget that TestDirector did not support the single authentication point we have today. A user switching projects would have to exit their current project, select a new project, and login again with potentially different credentials. The server- and project-centric integration design within TDS was based on that original functionality from TestDirector. As TestDirector changed names to Quality Center and beyond, we continued to support the previous versions while tweaking our integrations to support their evolved systems. One such change was the authentication model shift which occurred in Quality Center 9.0. While moving TDS forward, those decisions from the past and desire for backwards compatibility still influenced our integration.

Since use of those legacy systems has sunset, we have reached a point where no supported version of Quality Center or ALM requires that older design. Today, we are happy to announce that TDS 3 has refocused and streamlined your integration with Quality Center and ALM into a single connection dialog.

You no longer have to step through the Connection Wizard to define a server and configure your account. Instead, you are presented with a simple authentication window similar to how other tools integrate with ALM. Changing ALM servers is now as easy as just changing your connection. File references which are stored by TDS will still note the server from which they originated, but will always use the current server connection. What hasn’t changed is the ability to interact with multiple projects after connecting to the server. While ALM integrations like that found in Unified Functional Testing require you to build a connection to an individual project, TDS will still independently manage individual connections so that you can simultaneously work with content from multiple projects.

Even with our changes, we still maintain a high degree of backward compatibility and are happy to continue support for Quality Center 10 and all versions of Application Lifecycle Management.

Continuing with the theme of refocusing our efforts, this release of TDS will remove some functionality related to ALM that was not core to this product. Test Design Studio has always been about coding tests. It’s right there in the name. Yet over time, the software became bloated with features related to ALM administration. Most of this functionality required Site Administrator access in ALM which was managed through a completely separate API. Other features, like interacting with ALM workflow scripts, required changing parameters on the server that could impact the security of your system. To put TDS on a new track of growth and evolution, we had to leave those administrative features behind and focus our integration on that which was necessary for strong test design.

A New Commitment

Our new commitment starts with an admission. Our choice to execute a complete rebuild of Test Design Studio was wrong. While our hearts were in the right place and we had the right goal in mind, we simply charted the wrong path. We still believe in that future for TDS, but will use a more iterative approach to get there. A new plan has already been put in place to make that happen, and it starts with more frequent application updates. In this new day of continuous improvement, we are letting go of the idea of a “major release”. Instead of holding back new features in an effort to group them into a major release, we will release those features as they are ready. Our goal is to provide a new release of TDS every 4-6 months. The first release of a new year will receive a major version number increase, while subsequent releases that year will receive a minor version number increase. TDS 3.0 will be released very soon and followed later this year by TDS 3.1 or even TDS 3.2. Next year will bring TDS 4.0, 4.1, and so on. Customers will receive these new releases at no additional charge as part of their ongoing maintenance agreement. As needed, maintenance releases will be provided between release cycles to address issues that might arise in the software.

All these years of working on the rebuild of TDS are not in vain. We still have that code and are very pleased with the results of what we have built so far. Much of that work has already made its way into TDS 3.0, and more will be integrated over time. Each new release of TDS will bring us one step closer to the future we always imagined. We are committed to this product and look forward to the journey ahead.

A New Invitation

We want to thank all of our many devoted users of TDS who have shared their ideas and experiences with us over the years. We build this software for you, not for ourselves. Your voice has a direct impact on the software we develop and put into your hands, and we want to hear your ideas. Don’t leave us guessing about how you want to use the software. Your individual feedback will be read directly by every person working on this software and not simply aggregated into metrics or reports. You have a real opportunity to directly influence a piece of software you use every day and further improve your own productivity. You can post your ideas on our UserVoice site, e-mail us, or even send us a tweet.

And now, we would like to invite you to try out the Technology Preview of TDS 3.0. The pre-release is currently invite-only, so please e-mail us if you are interested in evaluating the software. It has been too long since a major release of Test Design Studio, and we are anxious to step into this new era. You can install TDS 3.0 on the same machine as TDS 2.5, so you can evaluate the pre-release while continuing to use the last official release.

The foundation has been set to continue to iterate on TDS and carry it into the future. The new journey begins now with the technology preview, and we hope you will join us.

One of the most beneficial features of Test Design Studio is its ability to statically analyze your code and warn you of potential pitfalls. We want to highlight two of the new rules that are coming to the next release of Test Design Studio.

The error handling capabilities in VBScript are pretty limited, but still necessary. Using the statement ‘On Error Resume Next’ within your code is often unavoidable, and this statement is easy to abuse. We have added two rules to assist with the use of this statement:

The “Use error handling with caution” rule is very simple. Any use of ‘On Error Resume Next’ will be flagged as a warning by code analysis and you will need to manually suppress it. Ideally, your suppression comment will also indicate why you used the statement. This rule will help raise awareness to the dangers of not using the statement properly.

The “Close error handlers in routines” rule attempts to make sure that if you turn on error handling in a routine (using “On Error Resume Next”), you also turn off error handling before exiting the routine (using “On Error Goto 0”). It is not uncommon for a code author to turn on error handling within their routine to deal with a specific usage scenario, and then forget to turn it off before leaving the routine. Any code that calls this routine will inadvertently have error handling turned on without the caller’s knowledge. This can lead to unexpected code execution.
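The pattern the rule enforces looks like this sketch (the routine and its body are illustrative):

```vbscript
Function ReadSetting(settingName)
    On Error Resume Next     ' flagged by "Use error handling with caution"
    ReadSetting = Eval(settingName)
    If Err.Number <> 0 Then
        ReadSetting = Empty  ' fall back to a safe default on failure
        Err.Clear
    End If
    On Error Goto 0          ' required by "Close error handlers in routines"
End Function
```

Without that final “On Error Goto 0”, every statement executed after the call would silently swallow its errors.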

Already a Success Story

I was recently working with a client to help determine why a certain test was running in an infinite loop after a replay error. Not only was the script not failing gracefully, it was delaying the entire test suite by not freeing the lab machine for other tests to execute. I loaded the test and its relevant libraries into Test Design Studio with these new rules enabled, and Test Design Studio quickly located the needle within a haystack of over 100,000 lines of code.

A custom user function registered on a test object had set “On Error Resume Next” without turning it off. Later in the test, a “While” loop was encountering an error in its condition statement. Instead of the loop failing, “Resume Next” meant execution continued with the next line… the first line within the body of the loop! The errors continued through the loop until the “Wend” token instructed execution to return to the top of the loop and reevaluate the condition.
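A minimal sketch of that failure mode (names and steps are illustrative, not the client’s actual code):

```vbscript
On Error Resume Next      ' set in a registered function and never turned off

Dim page                  ' never assigned, so the condition below always errors
While page.ElementCount > 0   ' error in the condition...
    ' ..."Resume Next" drops execution here instead of exiting the loop...
    Reporter.ReportEvent micDone, "Step", "Processing element"
Wend                      ' ...and Wend sends it back to the top, forever
```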

This process repeated until remote execution of the test was forced to terminate, but not until after more than 30,000 steps had been sent back to Quality Center/ALM for the last run. Since QC/ALM likes to show you the steps of the last run when you select a test instance in Test Lab, it also locked up QC/ALM while it attempted to process all those steps and eventually failed due to an out of memory condition. The steps had to be manually deleted from the database to restore functionality.

All this work because someone forgot to restore error handling functionality! With these new rules, hopefully that will never happen again!

A new book titled “QuickTest Professional (2nd Edition)” is now available from author Vinnakota Ravi Sankar. I was honored to be asked by Ravi to write the foreword for this book. Instead of writing a separate review for the book, I feel the foreword I wrote sums it up. It is included below as a review.

You can purchase the book on Amazon. Before you ask, I did not receive and will never receive any compensation from the sale of this book. I hope many of you find it useful.

From the Foreword

When I first learned QuickTest Professional (QTP), my only resource was the documentation provided with the software. This primarily consisted of a tutorial and reference guide. Having been a WinRunner user for many years (the functional testing tool replaced by QTP), I was able to draw on that experience to translate my knowledge to the up-and-coming product. Even with that background, I was not certain how Actions would fit into the function-based frameworks I used in WinRunner. VBScript, the scripting language behind QTP, also opened up a world of new opportunities for programming my automated tests. There were no “best practices” because the tool was too new, and certainly no reference books available. I had many questions, and access to few answers.

Fortunately, the landscape of knowledge around QTP has dramatically changed, and this book is a prime example of the fantastic resources new users have available. Help files and user guides typically offer a narrow view of the product features with basic information and maybe a brief example. This book will take you further by introducing those topics from the viewpoint of an experienced user. Each topic is accompanied by rich discussion of the feature, and many detailed examples are provided to further drive home the message.

Part I of this book focuses on QTP and the many features that fuel this powerful tool. New users of QTP will especially appreciate the first half as it takes you through a tour of the functionality that will guide your automation efforts, but even experienced users may learn something new. The second half caters to more experienced users by diving deeper into the concepts of the tool and illustrating those topics through extensive discussion of real usage scenarios with detailed samples and case studies. Presentations of best practices and automation frameworks will help you lean on the experience of the author to get your automation project started quickly and efficiently.

Part II of the book shifts its focus to VBScript. QTP users quickly realize that you need to learn VBScript in order to effectively use QTP. The fundamentals of this scripting language are an essential skill, and this book demonstrates not only the features of the language, but how you can use it specifically to solve many of the challenges we face in test automation.

You have the opportunity to read an unbiased account of QTP translated from years of experience. Today’s QTP user does not have to be left with unanswered questions. While the book primarily focuses on version 9.2 of QTP, it also highlights version-specific features and spends several chapters discussing the evolution of the tool through version 11. In the end, readers will have a single reference that covers many topics from beginner to expert.

Most users of Test Design Studio are already familiar with the benefits you gain from incorporating VBScript Class objects into your code. Using classes helps organize your code and improve comprehension.

In the next release of Test Design Studio 2, we’re making it even easier to add classes to your projects with a new ‘Class Definition Library’ template. Item Templates are used to generate new content for your projects. This powerful feature allows you to generate files with pre-defined content and is highly customizable to meet your individual needs (read more in the on-line documentation).

In the screenshot above, we have selected the ‘Class Definition Library’ template, and will name our file ‘Person.qfl’. The class we are creating in this example will represent a ‘Person’ object, so we are naming the file to correspond to the class we are creating. This name means more than just the name of the file; the template will use this name to generate the contents of the file as well. Below is the generated template:

In the screenshot above, we have highlighted all the areas where the name of the file, “Person”, was used in the dynamic generation of the class code. By naming the file the same as the desired class, we are able to quickly generate a lot of the code that you would normally have to manually type.

Lines 12 – 20 provide the skeleton to insert any code that should be executed when the class is created or destroyed.

Lines 22 – 34 include “Region” markers for outlining support of Properties and Methods that you define with your class. Note that the comments remind you to use the code snippet features of Test Design Studio to quickly generate the members of your class (read more about code snippets if you aren’t familiar with this time-saving feature).

Lines 39 – 46 include a public function to instantiate and return a new instance of your class and the corresponding documentation for the method. Some versions of QuickTest Professional have a limitation where you cannot use the ‘New’ keyword to create instances of classes that are defined in external libraries, so this trick helps resolve that issue. Note how the name of the class is used multiple times in this code block to produce the necessary function… great time-saver!
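Based on the description above, the generated file looks roughly like this sketch (region comments, documentation headers, and exact layout in the real template will differ):

```vbscript
Class Person

    Private Sub Class_Initialize()
        ' Code to execute when the class is created
    End Sub

    Private Sub Class_Terminate()
        ' Code to execute when the class is destroyed
    End Sub

    ' Region: Properties (use code snippets to quickly generate members)

    ' Region: Methods

End Class

''' Creates and returns a new instance of the Person class.
Public Function NewPerson()
    Set NewPerson = New Person
End Function
```

The NewPerson factory function is what works around the limitation on using ‘New’ with classes defined in external libraries: callers write Set p = NewPerson() instead of Set p = New Person.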

Now that you have a new class, you can easily make this class available to all the tests in your project by using the Build Process to programmatically update the library references for your tests.

While we’re happy to include this built-in template with the next release of Test Design Studio, there’s nothing to prevent you from creating your own templates right now.

Most areas of computer programming have the privilege of access to many reference books. QuickTest automation, which I have always stated is more of a programming activity than testing activity, is not one of those areas. This is why a new book on QuickTest programming is always a gem! The latest work by Anshoo Arora and Tarun Lalwani, titled “QTP Descriptive Programming Unplugged” does not disappoint.

This is not the first book for Tarun. Those who are still learning the ins and outs of QuickTest should check out his first book titled “QuickTest Professional Unplugged”. I also reviewed his second book “And I thought I knew QTP!” which utilizes a narrative technique to introduce technical concepts. This latest book, with the help of Anshoo Arora, is a return to the typical style of a technical reference manual.

I have been working with GUI test automation for 13 years, and there is one key aspect of interacting with a GUI that has never changed: you must be able to recognize the UI objects! Anyone who has ever been given a new application to test has had the realization that comes when you fire up your automation tool only to see that few, if any, of the screen objects are recognized by the tool. This is the “knife to the heart” of any automation effort because object recognition is so vital to successful automation projects. We face enough challenges with GUI automation that object recognition should not be one of them.

When you find that QuickTest is able to recognize your objects, be thankful! The journey typically does not end there, and that is where this book is a valuable resource. Even when working with supported technologies, getting QuickTest to properly and consistently recognize your objects is a must.

This book is a valuable resource because it takes many years of experience in QuickTest object recognition and presents it in a clear, well-organized fashion. You will learn the differences between Local and Shared object repositories, and which strategy to use. Those who prefer to avoid object repositories will receive a healthy discussion on descriptive programming, the art of defining your objects at the time you use them instead of within a repository. Even advanced topics such as using the Document Object Model (DOM) for web applications or XPATH to identify elements are covered in detail.

The authors do more than just introduce topics. Topics are what you can expect from the QuickTest-provided documentation. This book takes those topics and discusses the risks and benefits of each. This was evident when I read about the topic of “Smart Identification” (I’ve always believed it’s one of the biggest misnomers since Little John in the Robin Hood stories), how it works, and why disabling it is the first thing you should do. For example, you can quickly learn about the concept of descriptive programming, but this book explains why you might want to use it, suggestions for improving success on large projects, and the potential pitfalls like escaping regular expression characters.

While primarily a discussion on object recognition techniques, the book does go off-topic on a few extras such as how you could write test code in a .NET language.

Those who are new to QuickTest should still start out with “QuickTest Professional Unplugged”, but they should quickly follow it up with this latest title. Even automators with several years and projects behind them have something to gain through the reference.

The most notable additions to this release are related to the build process. For those who may not be familiar with this feature, the build process allows Test Design Studio to perform routine tasks associated with the collection of files in your project.

A Little Background on Build

Perhaps the most important of the build tasks is the ability to update your QuickTest tests. QuickTest does not have a centralized approach to test configurations unless you use Business Process Testing. Each test is a self-contained entity requiring its own configuration. This is great if you write tests in isolation, but most users typically create a suite of many tests for a given application. Shared resources such as function libraries and object repositories are used by most (if not all) of those tests. When you add a new library or object repository to your shared resource pool, you must update each test to add this new resource. That process is time-consuming and easy to overlook.

The Test Design Studio build process changes that! Test Design Studio already has an excellent project-centric approach to test management, and keeping your tests updated is as easy as running the build process. Once initiated, Test Design Studio looks at all the tests in your project as well as all the shared resources. Each test is then opened in QuickTest through their API, object repositories are associated with each action, and function libraries are associated with each test. Test Design Studio even makes sure you have the proper add-ins set.
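Conceptually, the per-test work resembles this sketch against the QuickTest Automation Object Model (the object and method names follow the documented automation API, but treat the details as an approximation of what TDS automates for you, with illustrative paths):

```vbscript
Dim qtApp
Set qtApp = CreateObject("QuickTest.Application")
qtApp.Launch
qtApp.Open "C:\Tests\MyTest", False      ' open the test for editing

' Associate a shared function library with the test, if not already present
With qtApp.Test.Settings.Resources.Libraries
    If .Find("C:\Framework\Common.qfl") = -1 Then
        .Add "C:\Framework\Common.qfl"
    End If
End With

qtApp.Test.Save
qtApp.Quit
```

Multiply that by every test, every library, every object repository, and every action, and the value of a single ‘Build’ command becomes obvious.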

When I develop automated tests, I place most of my functionality in reusable function libraries. I keep libraries small and each one focuses on specific aspects of the application. This keeps my code organized and easy to manage, but can easily create 10 to 20 function libraries. Managing that many individual files would be cumbersome without Test Design Studio. When I need a new test, my actions are simple:

Add the test to my project using the ‘Add New Item’ command and selecting my QuickTest file template

Right-click the file in Solution Explorer and select ‘Build’

That’s it! My new test is immediately configured with all the function libraries and object repositories that are defined for my project. When I need to add a new function library to my projects, the process is equally simple:

Add the library to my project using the ‘Add New Item’ command and selecting library file template

Right-click the project in Solution Explorer and select ‘Build’

Just like that, Test Design Studio will process all of my tests to make sure they reference the new library.

What Changed in New Release

The build process is key to increasing your productivity and working in conjunction with QuickTest. One of our newer customers recently posted a series of suggestions for improving the build process, and we were pleased to make those enhancements (you know who you are, and thank you!).

Hint: If you have an idea on how to improve Test Design Studio, it’s a great idea to share it!

Test Left Open after Build

If you initiate the build process for a single test, Test Design Studio will now leave that test open in QuickTest after the build is complete. This is a great time-saver when actively writing and debugging a test since you can essentially use the ‘Build’ command to open the test in QuickTest (something always available off the context menu) but also ensure that all resources are properly associated to the test.

Version Control Improved

We made many adjustments to how version controlled tests are managed. A test must be checked out in order to modify it, so the build process will check out any test that was not already checked out. If no changes were necessary from the build, that check-out will be cancelled. When the test is modified, it will be checked back in only if the test was not already checked out. Those tests that were already checked out before initiating the build will remain checked out afterwards.

Keeping a test checked out proved to be a complex task due to an issue with the QuickTest API. If you save a version-controlled test and try to close it without checking it in, QuickTest displays a prompt asking if you want to check it in. It even does this if you are using the API, and that dialog was blocking the build process while waiting for user input. Even worse, the dialog displayed by QuickTest was often not visible on the desktop giving the illusion that Test Design Studio had stopped responding. To keep the build process operating smoothly, we actually had to use GUI automation to detect and dismiss that dialog when displayed.

Build Selected Command Updated

The ‘Build Selected’ command was originally designed to build the selected item in the Solution Explorer tool window. For those that do not have the Solution Explorer track the currently selected document, this caused some confusion. If you had one test selected in Solution Explorer but were actively editing a different test in an editor, the ‘Build Selected’ command would build the test in Solution Explorer instead of your current document. That has now changed. The command will only process the selection from Solution Explorer if that tool window is selected when activating the command. Otherwise, it will attempt to process the actively edited document.

Summary

For those not familiar with the build process, we hope you integrate it into your project management process. For those already using it, we hope you enjoy these enhancements. Thank you again to the customer who brought these suggestions to us, and keep that feedback coming! We all benefit from the ideas of others.

How safe do you feel knowing your code passes a syntax check?

We have talked at length about how having correct syntax does not mean your code is correct. Sure, we start with getting the syntax correct, but we must also follow the rules of the language.

The ability to create functions and return values is one of the greatest tools available to a programmer! VBScript, unfortunately, does not use a simple statement such as “return” to signify the return value (as is common in “C”-style languages). Instead, it makes you repeat the name of the function in a syntax similar to assigning a value to a variable. This works great, of course… provided you type the name correctly!
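In VBScript the return value is set by assigning to the function’s own name:

```vbscript
Function Add(a, b)
    Add = a + b        ' assign to the function name to return a value
End Function

MsgBox Add(2, 3)       ' displays 5
```

Mistype “Add” in that assignment and the function silently returns nothing, which is exactly the class of bug described below.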

While working on support for the Web Services add-in for the next release of Test Design Studio, we opened the “Web_Services.txt” file that ships with QuickTest to ensure that the user-defined functions were being properly registered to the “WebService” object. That is when we found the issue below:

Do you see what’s wrong? Test Design Studio makes it pretty obvious that the author of this file mistyped the name of the function when trying to set the return value. What should have been “GetHeaders” was typed as “GetHeader”. Test Design Studio saw two potential problems here:

“GetHeader” (since it does not match the name of the function) is the same as a reference to a variable. This file has Option Explicit turned on, and a variable named “GetHeader” does not exist. Test Design Studio will not let you use an undeclared variable when Option Explicit is enabled.

Functions, by definition, are meant to return values. This function declaration does not (due to the mistyped name). Test Design Studio warns you that you might have forgotten to return a value.
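A minimal reconstruction of the pattern (the function body is illustrative; only the two spellings are taken from the file):

```vbscript
Function GetHeaders()
    Dim headers
    ' ... build the headers object ...
    Set GetHeader = headers   ' typo: should be "Set GetHeaders = headers"
End Function
```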

This error means that anyone attempting to use the “GetHeaders” function attached to the “WebService” object is either going to get a run-time error or, even worse, never get the object they were trying to retrieve.

Sadly, that’s not the only error Test Design Studio caught. Elsewhere in the same file was this code:

Again, Test Design Studio makes it obvious that there is a problem here! The code author misspelled the parameter name in the function declaration, which made the reference to the parameter (using the proper spelling) inaccurate. Test Design Studio gave the following reports:

The code attempted to use an undeclared variable “XMLSchemaFile” (Option Explicit turned on). Since the name did not match the parameter, it was expecting the name to be defined elsewhere.

A warning was given that the user declared the parameter “XMLScheamFile” and never actually used it.

This function, as defined, would never properly validate the XML Schema file.
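Again, a minimal reconstruction of the pattern (the body and the helper call are illustrative; only the two spellings are taken from the file):

```vbscript
' Parameter declared as "XMLScheamFile" but referenced as "XMLSchemaFile"
Function ValidateSchema(XMLScheamFile)
    ' Under Option Explicit this reference fails: "XMLSchemaFile" was never declared
    ValidateSchema = DoValidation(XMLSchemaFile)
End Function
```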

Had the author of this code been using Test Design Studio and the powerful code analysis features, neither of these mistakes would have been distributed to thousands of HP customers! Keep in mind that the syntax of this file was 100% accurate even though the file had multiple errors.

So I ask again… how safe do you feel that syntax checking is enough to verify your code?

To learn more about the Code Analysis features of Test Design Studio, check out our page on Code Quality. You may also want to download our Data Sheet.

The two major flaws in this code were caused by 3 simple characters. What might you have missed in your code?

Note: This file was analyzed from the general release of QuickTest v11.0. I have not verified if the issues have been corrected in any patched release. Want to check your file? The default location is “C:\Program Files\HP\QuickTest Professional\data\BPT_Resources”. You may also find the file automatically added to your Quality Center/ALM projects in the Resources module.

We are happy to announce that the next release of Test Design Studio will add support for Delphi test objects! Support for QuickTest test objects like those in the Delphi add-in are just one of the features that give Test Design Studio an advantage over standard VBScript editors like Notepad++. Many VBScript-based editors can provide support for standard VBScript language elements, but only Test Design Studio adds support specifically for objects that are unique to QuickTest!