
Virtualization may be the answer to software testing troubles.

Software is not only physically and geographically interconnected; it must also handle cases that cross an organizational boundary, such as meshing public and private cloud software. This is particularly true where data access is involved: Data from social media and the public cloud must now feed the same analytics as traditional data warehousing and business intelligence apps.

Sandbox testing

Isolating the software under test from the runtime environment used to be difficult when testing took place on the same machine. When you tested on the same hardware that run-the-business apps use, you vastly increased the risk of downtime in the production infrastructure.

Now, however, virtualization technology allows both hardware and software isolation. While the tester plays in a sandbox that looks like the runtime environment -- and in fact is a piece of the runtime environment -- the run-the-business apps safely proceed as if the sandbox wasn't there.

With distributed databases, you have the app, the database(s) and a portion of the database's data store all running on each piece of hardware; the actual architecture under test is more complicated. However, the approach can be the same: Create a sandbox in each machine containing a test copy of app, database and data store, and then run the tests as if the sandbox controlled the whole machine.

The sandbox technique for software testing is about as close to mimicking your runtime environment as you can get. The sandbox uses only a portion of each machine's resources, but the app at actual runtime often uses only a similar portion.
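One minimal way to approximate this kind of resource partitioning on a single Linux machine is to cap the memory and CPU available to the test process, so the sandboxed copy can never starve the run-the-business apps. This is a simplified sketch using OS resource limits, not a full sandbox; the cap values are illustrative assumptions:

```python
import resource
import subprocess
import sys

# Illustrative cap: roughly 512 MB of address space for the sandboxed
# test process, so a runaway test cannot consume the whole machine.
MEMORY_CAP_BYTES = 512 * 1024 * 1024

def limit_resources():
    # Runs in the child just before the test process starts: restrict
    # memory and CPU time so the sandbox stays within its share.
    resource.setrlimit(resource.RLIMIT_AS, (MEMORY_CAP_BYTES, MEMORY_CAP_BYTES))
    resource.setrlimit(resource.RLIMIT_CPU, (60, 60))  # 60 CPU-seconds

def run_sandboxed(test_command):
    # Launch the test copy of the app under the resource caps above.
    return subprocess.run(
        test_command,
        preexec_fn=limit_resources,  # POSIX-only hook
        capture_output=True,
        text=True,
    )

if __name__ == "__main__":
    result = run_sandboxed([sys.executable, "-c", "print('test passed')"])
    print(result.stdout.strip())
```

A production sandbox would layer full virtualization or container isolation on top of this, but the principle is the same: the test sees what looks like a whole machine while consuming only a bounded slice of one.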

Containers and app performance

The VM has many virtues, but it runs its own guest OS, which may differ from the host's OS and is not optimized for the underlying hardware. Those factors impose significant performance overhead -- overhead that makes it more difficult to exactly mimic runtime performance during tests.

A container acts like a VM, but it allows only the same OS as the host kernel, which it accesses at a lower level. Thus, at the price of ruling out, say, running Linux on a z/OS kernel, the container allows app performance that closely approximates that of a Linux app running on Linux without virtualization. This software testing technique gives a better simulation of runtime performance.

Most distributed databases run on the same OS as the app. Redesigning the app to use containers and then testing it on them in sandboxes brings runtime performance benefits as well as a more accurate reflection of the runtime environment.
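As a simplified illustration of combining containers with the sandbox idea, the snippet below builds a container launch command with explicit resource caps mirroring the sandbox's share of the machine. It assumes Docker as the container runtime; the image name and limit values are hypothetical:

```python
import shlex

def container_test_command(image, cpus, memory, env=None):
    """Build a `docker run` invocation that caps the test container's
    resources so it behaves like a sandboxed share of the machine."""
    cmd = [
        "docker", "run", "--rm",
        "--cpus", str(cpus),   # fraction of the machine's CPUs
        "--memory", memory,    # hard memory cap for the container
    ]
    for key, value in (env or {}).items():
        cmd += ["--env", f"{key}={value}"]
    cmd.append(image)
    return cmd

if __name__ == "__main__":
    # Hypothetical test image for one shard of the distributed database.
    cmd = container_test_command(
        "dbtest/shard:latest", cpus=2, memory="4g",
        env={"TEST_MODE": "sandbox"})
    print(shlex.join(cmd))
```

Running one such container per machine, each holding a test copy of app, database and data store, reproduces the sandbox layout described above with near-native performance.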

Data virtualization

Data virtualization is a special case, but the logic is the same: Testing should consider cases where some databases or data copies are down -- for example, mirrored storage when one of the two mirrors is unavailable.

Data virtualization software makes this simple: By redefining the database to the data virtualization software, the tester can switch off some database copies to mimic a hardware failure and ask the app to perform its analytics on the remaining copies. In data virtualization, the app doesn't know or care where the data physically resides.
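A toy sketch of the idea (the class and method names here are invented for illustration, not taken from any real data virtualization product): a query router hides which physical copy answers a query, and the tester can switch a copy off to simulate a hardware failure.

```python
class DataVirtualizationLayer:
    """Routes queries to whichever physical database copies are online;
    the app never knows which copy actually answers."""

    def __init__(self, copies):
        # copies: mapping of copy name -> dict of row-key -> row
        self.copies = copies
        self.online = set(copies)

    def fail(self, name):
        # Tester switches off a copy to mimic a hardware failure.
        self.online.discard(name)

    def restore(self, name):
        self.online.add(name)

    def query(self, key):
        # The app's view: ask for data; physical routing is invisible.
        for name in self.copies:
            if name in self.online and key in self.copies[name]:
                return self.copies[name][key]
        raise LookupError(f"no online copy holds {key!r}")

if __name__ == "__main__":
    rows = {"order:1": {"total": 42}}
    layer = DataVirtualizationLayer(
        {"mirror_a": dict(rows), "mirror_b": dict(rows)})
    layer.fail("mirror_a")         # simulate one mirror going down
    print(layer.query("order:1"))  # still answered by mirror_b
```

The test then becomes a matter of calling the failure switch for each scenario and asserting that the app's analytics still complete against the surviving copies.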

Putting it all into distributed database testing

One way to put this software test architecture together is to write a service that issues test instructions remotely, abstractly and with the user filling in the details of software location and architecture. An agent for each physical location then simulates streams of input from app users, employing containers within sandboxes for the actual software to be tested. Where appropriate, data virtualization software simulates breakdowns of specific physical IT infrastructure.
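The architecture above might be sketched as follows (the coordinator, agent, and instruction names are hypothetical, invented for illustration): a central service broadcasts abstract test instructions, and each location's agent fills in the local details and simulates its stream of user input.

```python
import json

class TestAgent:
    """Agent at one physical location: turns abstract instructions into
    local actions against that location's sandboxed containers."""

    def __init__(self, location, shard):
        self.location = location
        self.shard = shard  # local piece of the distributed database

    def execute(self, instruction):
        if instruction["action"] == "simulate_users":
            # Stand-in for replaying a stream of user requests
            # against the local sandbox.
            return {"location": self.location,
                    "shard": self.shard,
                    "requests_sent": instruction["count"]}
        raise ValueError(f"unknown action {instruction['action']!r}")

class TestCoordinator:
    """Central service: issues the same abstract instruction to every
    agent; agents supply the software location and architecture details."""

    def __init__(self, agents):
        self.agents = agents

    def broadcast(self, instruction):
        return [agent.execute(instruction) for agent in self.agents]

if __name__ == "__main__":
    agents = [TestAgent("us-east", "shard-0"), TestAgent("eu-west", "shard-1")]
    results = TestCoordinator(agents).broadcast(
        {"action": "simulate_users", "count": 1000})
    print(json.dumps(results, indent=2))
```

In a real deployment the broadcast would travel over the network and the agents would drive containers in sandboxes, with the data virtualization layer injecting the simulated infrastructure failures.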

About the author: Wayne Kernochan is president of Infostructure Associates, an affiliate of Valley View Ventures, which identifies ways for businesses to leverage information for innovation and competitive advantage. He has been an IT industry analyst for 25 years and has focused on key information-related technologies and ways to measure their effectiveness. Email him at wkernochan@aol.com.

Comments

A big roadblock for us is having good data in our testing environments. Often, we lack the volume of data as well as the different values of data that we would see in our production environment. That can make a big difference when it comes to testing the types of data intensive applications and services that my team works on.

From the author for @abuell: Yes, the sandbox is set up on the actual production machines. This provides a better stress test, as other apps running on the production systems are given a chance to overload the system as a whole. The point of the sandbox is that it is designed not to risk those production apps.

This is an interesting approach. Generally speaking, we tend to run our stress tests on the real hardware and apps, or in a separate environment made to match the production environment. Having a sandbox on the production server itself is not something we have done, but it's intriguing.