If you have developed or maintained business applications, you have probably encountered the following typical scenario several times:

A well-structured application with at least two or three layers: UI, Business Layer, and Data Access Layer (DAL).

A Relational Database Management System (RDBMS) behind the scenes doing all the hard work of persistence and data integrity.

When many people with specific responsibilities (analysts, developers, DBAs, etc.) are involved in a system, sometimes with contradictory requirements, you will find that databases are not frozen.

Normally, databases evolve (new tables, new relations, changes to field names and data types) throughout the development process, much more than we could initially expect. Yes! Databases are alive!

On the one hand, that's a good thing: software is built to be changed. On the other hand, however, the problem arises when we try to keep our application robust in a rapidly changing environment.

This common scenario is error-prone, and the errors derive from the de-synchronization of applications and their underlying databases. The tool and method presented in this article show a way to automatically check the syntactic correctness of the SQL code in your Data Access Layer whenever you need it.

In this way, you will have an automated means of detecting DB-application inconsistencies in your SQL code before releasing a new version of your software. It is useful to have a second chance to catch DAL bugs before they compromise the robustness of your software.

Now you have probably realized where the name 'Armadillo' comes from: consider this technique a defensive programming weapon for your daily programming toolbox.

The examples provided assume a MS SQL Server database. However, the technique does not depend on the RDBMS; it can be applied to Oracle or MySQL with minor changes (in fact, the Armadillo framework is prepared for such extensibility: only one interface must be implemented to support a new DB technology).

The example assumes the following DB schema (scripts provided for MS SQL Server):

You know: the classic, minimalist, educational DB for gathering Customers, Invoices, and Invoice Lines.
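As a reference, a minimal T-SQL script for such a schema might look like the following. This is only a sketch: the actual scripts ship with the article's download, and the table and column names here are assumptions.

```sql
-- Hypothetical minimal schema: Customers, Invoices, Invoice Lines.
CREATE TABLE Customer (
    IdCustomer INT IDENTITY(1,1) PRIMARY KEY,
    Name       NVARCHAR(50) NOT NULL,
    Surname    NVARCHAR(50) NOT NULL
);

CREATE TABLE Invoice (
    IdInvoice  INT IDENTITY(1,1) PRIMARY KEY,
    IdCustomer INT NOT NULL REFERENCES Customer(IdCustomer),
    IssueDate  DATETIME NOT NULL
);

CREATE TABLE InvoiceLine (
    IdInvoiceLine INT IDENTITY(1,1) PRIMARY KEY,
    IdInvoice     INT NOT NULL REFERENCES Invoice(IdInvoice),
    Description   NVARCHAR(100) NOT NULL,
    Amount        MONEY NOT NULL
);
```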

Before starting with testing, let's first have a look at a typical DAL; for example, the Customer code in the DAL assembly. This code encapsulates the Data Access Layer and contains all the SQL of the application; it manages connections and database APIs.

The second example (the Read() method) retrieves a customer from the database using a SQL SELECT statement.
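A sketch of what such a Read() method could look like. The attribute and member names follow the article's conventions, but the exact signatures and column names are assumptions; the authoritative code is in the article's download.

```csharp
using System.Data.SqlClient;

public class Customer
{
    // Hypothetical DAL method labeled for Armadillo.
    // SqlQuery marks a method that runs a SELECT directly against the DB.
    [SqlQuery]
    public static Customer Read(int idCustomer)
    {
        using (SqlConnection cn = new SqlConnection(ConnectionString))
        using (SqlCommand cmd = cn.CreateCommand())
        {
            cmd.CommandText =
                "SELECT Name, Surname FROM Customer WHERE IdCustomer = @id";
            cmd.Parameters.AddWithValue("@id", idCustomer);
            cn.Open();
            using (SqlDataReader rd = cmd.ExecuteReader())
            {
                if (!rd.Read()) return null;   // no such customer
                Customer c = new Customer();
                c.Name = rd.GetString(0);
                c.Surname = rd.GetString(1);
                return c;
            }
        }
    }

    public string Name;
    public string Surname;

    // Hard-coded only for the sake of the sketch (see the KISS remark below).
    static string ConnectionString
    {
        get { return "server=.;database=MyBiz;integrated security=true"; }
    }
}
```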

The remaining methods in the Customer class, Update(), Delete(), GetAll(), and GetBySurname() (not shown here in the article), do what is expected of them: they update a customer, delete a customer, retrieve all customers, and retrieve customers by surname, respectively.

Finally, in some cases, the DAL code can contain code similar to this:
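For instance, a minimal sketch of a method that only builds and returns SQL text instead of executing it (a candidate for the SQLTextExecute label discussed below; the name is hypothetical):

```csharp
// Hypothetical sketch: this method never touches the database;
// it only returns a SQL statement as text for later execution.
[SqlTextExecute]
public static string BuildDeleteStatement(int idCustomer)
{
    return "DELETE FROM Customer WHERE IdCustomer = " + idCustomer;
}
```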

The main strategy is to label our DAL code with special attributes provided by the Armadillo framework. Later on, the testing engine finds such DAL methods by reflection and tests them accordingly during the unit testing phase.

SQLExecute is an attribute for labeling methods that execute non-query SQL code. Use this attribute whenever you find a piece of SQL that changes the state of your database, i.e., INSERT, UPDATE, DELETE, etc. (anything but a SELECT).

Very similar to the previous one, SQLTextQuery is used to label methods that return SQL statements in text form, where such statements are pure queries: they cannot alter the database when executed (they have no side effects).
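These labels are plain custom attributes; a minimal sketch of how two of them could be declared (the real declarations live in the Armadillo source, so treat this as illustrative only):

```csharp
using System;

// Hypothetical marker attributes, applied to DAL methods only.
[AttributeUsage(AttributeTargets.Method)]
public class SqlExecuteAttribute : Attribute { }    // executes non-query SQL

[AttributeUsage(AttributeTargets.Method)]
public class SqlTextQueryAttribute : Attribute { }  // returns query SQL as text
```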

The testing assembly is very straightforward and easy to build. It consists of a single class with three methods, Init(), Down(), and TestDB(), labeled with NUnit attributes to initialize, tear down, and execute the test, respectively.

The Armadillo test engine can be executed providing:

a DB implementation (MS SQL Server in this case)

a DB connection, and

the assembly to be tested (i.e., DalMyBiz; note that this is done indirectly, by selecting the assembly that declares the Customer class).
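Putting the three ingredients together, the test fixture could be sketched like this. Names such as TestEngine, TestSql, and SqlServerDbInfo follow the article, but the exact signatures and the connection string are assumptions:

```csharp
using System.Reflection;
using NUnit.Framework;

[TestFixture]
public class DalTests
{
    private IDbInfo _db;

    [SetUp]
    public void Init()
    {
        // MS SQL Server implementation; connection string source is assumed.
        _db = new SqlServerDbInfo("server=.;database=MyBiz;integrated security=true");
    }

    [TearDown]
    public void Down()
    {
        // Release the DB connection here.
    }

    [Test]
    public void TestDB()
    {
        // The DAL assembly is selected indirectly, through a class it declares.
        Assembly dal = typeof(Customer).Assembly;
        new TestEngine().TestSql(_db, dal);
    }
}
```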

The core of the test engine is located in the TestSql() method of the TestEngine class. This method receives a reference to the DAL assembly to be tested. Using reflection, the following steps are performed:

Locate all public classes labeled with the DALClass attribute.

For each class found, search for public and static methods labeled with one of our special attributes: SqlQuery, SqlExecute, SqlTextQuery, or SqlTextExecute.

Depending on the label, each method is tested by a dedicated routine: CheckSqlQuery, CheckSqlExecute, CheckSqlTextQuery, or CheckSqlTextExecute, respectively. Each of them receives a reference (System.Reflection.MethodInfo) to the method to be tested.
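The scan-and-dispatch loop can be sketched as follows. This is a simplification of the real TestSql; the attribute and helper names mirror the article, but the exact code is in the download:

```csharp
using System.Reflection;

public void TestSql(Assembly dalAssembly)
{
    foreach (Type t in dalAssembly.GetTypes())
    {
        // 1. Only public classes labeled as DAL classes.
        if (!t.IsPublic || !t.IsDefined(typeof(DALClassAttribute), false))
            continue;

        // 2. Only public, static methods carrying one of our labels.
        foreach (MethodInfo mi in t.GetMethods(BindingFlags.Public | BindingFlags.Static))
        {
            // 3. Dispatch to the matching check routine.
            if (mi.IsDefined(typeof(SqlQueryAttribute), false))
                CheckSqlQuery(mi);
            else if (mi.IsDefined(typeof(SqlExecuteAttribute), false))
                CheckSqlExecute(mi);
            else if (mi.IsDefined(typeof(SqlTextQueryAttribute), false))
                CheckSqlTextQuery(mi);
            else if (mi.IsDefined(typeof(SqlTextExecuteAttribute), false))
                CheckSqlTextExecute(mi);
        }
    }
}
```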

SQL query methods are executed against the database. We assume queries have no side effects; i.e., make sure nothing in your DB changes state as a consequence of a SELECT. The implementation therefore calls the DAL method directly using reflection. If it fails, we get a nice exception; if not, everything is OK.

As explained before, DAL methods labeled with the SQLTextQuery or SQLTextExecute attributes are supposed to return a string containing a SQL statement rather than execute anything. Therefore, the test performed is the following:

Call the DAL method and retrieve the SQL statement in a variable of type string.

If dealing with a SQLTextExecute, set the database to a safe mode: parse-only, to prevent execution.

Create a DB command and send the statement to the database.

Restore the DB mode (if needed).
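The four steps above can be sketched as follows. The routine and helper names follow the article (CheckSqlTextExecute, CreateDefaultParams), but the exact signatures are assumptions:

```csharp
using System.Data;
using System.Reflection;

void CheckSqlTextExecute(MethodInfo mi)
{
    // 1. Call the DAL method by reflection to obtain the SQL text.
    string sql = (string)mi.Invoke(null, CreateDefaultParams(mi));

    // 2. Prevent execution: parse/compile only (SET NOEXEC ON for SQL Server).
    _db.DisableExecution();
    try
    {
        // 3. Send the statement; a syntax error surfaces as an exception.
        using (IDbCommand cmd = _db.CreateCommand())
        {
            cmd.CommandText = sql;
            cmd.ExecuteNonQuery();
        }
    }
    finally
    {
        // 4. Restore the normal execution mode.
        _db.EnableExecution();
    }
}
```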

Again, exceptions are the hint that a SQL statement is corrupted.

Yes, you have probably noticed it already: DAL methods can have parameters, and we must provide valid values if we want to invoke them by reflection.

To overcome this, Armadillo creates a default value for each parameter based on the parameter's type (see the method CreateDefaultParams(MethodInfo mi)). This should cover the majority of cases.
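A minimal sketch of such a default-value factory, using the string and DateTime defaults quoted later in the article (the rest of the type handling is an assumption):

```csharp
using System;
using System.Reflection;

static object[] CreateDefaultParams(MethodInfo mi)
{
    ParameterInfo[] pars = mi.GetParameters();
    object[] values = new object[pars.Length];
    for (int i = 0; i < pars.Length; i++)
    {
        Type t = pars[i].ParameterType;
        if (t == typeof(string))
            values[i] = "Hi!";                              // default for strings
        else if (t == typeof(DateTime))
            values[i] = new DateTime(2000, 1, 2, 3, 4, 5, 6); // 2000.01.02 03:04:05.006
        else if (t.IsValueType)
            values[i] = Activator.CreateInstance(t);        // 0, false, etc.
        else
            values[i] = null;                               // reference types
    }
    return values;
}
```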

Anyway, if there are cases where you want to control the parameters used during a test, you have an alternative: Armadillo provides another attribute, ParamValue, for specifying the exact values for a test.

In this first example, the test engine does not know what value to pass to idCustomer when calling this method, so it will invent one using the parameter's type information. In this case, CreateDefaultParams(MethodInfo mi) will return "Hi!" for string parameters and "2000.01.02 03:04:05.006" for the DateTime one. Good enough, isn't it? :-)

However, if you need to change this behavior and force a value for a parameter, use the ParamValue attribute. See the following example:
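For instance, the override could look like the following sketch. Whether ParamValue attaches to the parameter or to the method is defined by the framework itself; here it is sketched as a parameter attribute, which is an assumption:

```csharp
// Hypothetical usage: force idCustomer = 1 during testing,
// e.g., because the SQL needs a key that actually exists.
[SqlQuery]
public static Customer Read([ParamValue(1)] int idCustomer)
{
    // ... normal DAL code; during the test, idCustomer will be 1.
    return null;
}
```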

Access to the database for testing is encapsulated through the IDbInfo interface. The functionality needed for testing SQL is the following:

a connection to the DB,

a CommandFactory (returns a System.Data.IDbCommand),

a method to set the database in parse-only, non-execution mode,

a method to resume the execution of SQL statements, and

an exception to be captured to check for SQL syntax errors.
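The list above maps naturally onto an interface along these lines. This is a sketch; the real IDbInfo in the download is authoritative, and the member names here are assumptions:

```csharp
using System;
using System.Data;

public interface IDbInfo
{
    IDbConnection Connection { get; }   // the DB connection
    IDbCommand CreateCommand();         // command factory (System.Data.IDbCommand)
    void DisableExecution();            // parse-only, non-execution mode
    void EnableExecution();             // resume execution of SQL statements
    Type SqlErrorExceptionType { get; } // exception signaling a SQL syntax error
}
```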

A default implementation for SQL Server (SqlServerDbInfo) is also provided in the Armadillo framework. This implementation takes advantage of the MS SQL command SET NOEXEC ON/OFF to disable or re-enable the execution of SQL code without disabling its parsing and compilation.
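For SQL Server, the two mode-switching methods reduce to issuing the corresponding SET command on the open connection; a sketch (method names assumed):

```csharp
using System.Data;

public void DisableExecution()
{
    // Statements sent afterwards are still compiled (so syntax is
    // checked) but not executed.
    using (IDbCommand cmd = CreateCommand())
    {
        cmd.CommandText = "SET NOEXEC ON";
        cmd.ExecuteNonQuery();
    }
}

public void EnableExecution()
{
    // Restore normal execution of SQL statements.
    using (IDbCommand cmd = CreateCommand())
    {
        cmd.CommandText = "SET NOEXEC OFF";
        cmd.ExecuteNonQuery();
    }
}
```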

Armadillo can be extended to measure the time spent executing SQL code. This can be helpful to detect potential bottlenecks for SQL or database optimizations.
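Such a timing extension could wrap each check routine with a Stopwatch, for instance (threshold and logging are assumptions):

```csharp
using System;
using System.Diagnostics;

// Hypothetical extension: time a tested DAL method and flag slow ones.
Stopwatch sw = Stopwatch.StartNew();
CheckSqlQuery(mi);                 // any of the Check* routines
sw.Stop();
if (sw.ElapsedMilliseconds > 500)  // assumed threshold
    Console.WriteLine("SLOW: {0}.{1} took {2} ms",
        mi.DeclaringType.Name, mi.Name, sw.ElapsedMilliseconds);
```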

It can also be easily extended to support additional RDBMSs such as Oracle or MySQL. So, if you need to adapt it to your needs, please do not hesitate to do so: write a new class implementing IDbInfo for your own needs.

The new Visual Studio 2005 incorporates a namespace for unit testing (Microsoft.VisualStudio.TestTools.UnitTesting). And, oh my dear! The attributes in this framework look very, very similar to the ones in NUnit. Therefore, porting Armadillo from NUnit to Microsoft's unit testing solution should be easy to achieve.

Dealing with SQL errors can be a nightmare, especially if the code is under-documented and the database changes very often.

The provided method and tool allow you to test the DAL SQL code before releasing a new version. The objective is to armor your code against SQL syntax errors derived from DB changes.

If the unit tests are executed in this way, the SQL code will be tested against the DB (consider it a form of SQL precompilation assurance). Therefore, this can spare your customers ugly surprises at runtime.

I developed and applied this technique because I really needed to ensure the correctness of a running system with more than four thousand SQL statements. If you cannot automate this, imagine testing them manually, one by one, in each release. So, I will be happy if this is helpful to somebody else.

where DefaultSearchItem is a static property, which returns an object with the default values.

It won't compile. The error is:

"An attribute argument must be a constant expression, typeof expression or array creation expression of an attribute parameter type"

I tried changing it to a static readonly variable, but still got the same error. At that point I realised that this testing framework is just too basic, and gave up.

However, I still think the core idea (running the code against SQL Server in parse-only mode) is a good one. Pity it's too basic to use. Maybe I'll have a doodle with it and try to get it working with objects as parameters.

Regarding (1): of course, the best practice is not to hard-code the connection string behind a static property; it was done this way just to keep it simple (KISS) and show how the technique works. It's quite simple to retrieve the connection string from a standard configuration file or any other source. You have the full source code of Armadillo and the working sample, so feel free to adapt it to your own needs.

Regarding (2): for sure, the parameter-passing mechanism can be improved a lot (e.g., adding more attributes to provide the required values to given methods at runtime). But be careful when passing objects to attributes: it is the C# compiler, not the testing framework, that imposes this constraint.

I have been looking for a DAL framework for a long time, but I have not found a suitable open-source one yet. I studied CSLA.NET for several months, but it is heavyweight. Could you suggest a lightweight one? Thanks.

Also, in VS2005 I installed ReSharper; when I run the NUnit tests for the sample, the profiling option becomes available. After installing JetBrains dotTrace, the profiler covers the item mentioned in the article's To Do List.

Depending on your starting point, I would split the way into three main paths. The starting points would be:

A. You have a preexisting database.

B. You have business objects and you need to make them persistent.

C. You have a conceptual model or an abstract object-oriented model (let's say in XML form, for example).

Therefore, depending on the starting point, I have explored:

For A (DB): it's possible to build a basic DAL (CRUD functionality) and basic Business Objects. There are some nice template-based code generator tools like CodeSmith (http://codesmithtools.com/) that are capable of inspecting running DBs (for example, SQL Server) and creating such DAL code for you. For example, try the .NetTiers templates (http://www.nettiers.com/) for CodeSmith and SQL Server databases. CodeSmith also contains templates for CSLA.NET and is able to generate DAL code for other databases.

For C (conceptual model), the way I prefer when possible: select your target architecture (it depends a lot on the requirements you want to accomplish), and generate your DB scripts, DAL, Business Objects, and UI (when possible) from your model. I really advocate Model Driven Development and Forward Engineering (when possible).

But you know, this is a hot topic, and I would like to hear more opinions about all of this. So, please share yours.

In your article, under "9. To Do List", you wrote: "Armadillo can be extended to measure the time spent during the execution of SQL code. This can be helpful to detect potential bottlenecks for SQL or database optimizations." I installed JetBrains dotTrace to find the bottlenecks, and when I tried Armadillo's demo in VS2005, I found a solution for your To Do List.

Yes, that could be a good idea, in theory. Conceptually, it makes perfect sense to split into one test per DAL class, or even better: a TestFixture per DALClass and a Test per method to be tested.

However, in practice, NUnit needs to be able to call each test method without any parameters and without any additional info.

Armadillo is a generic engine that does not presume anything about the classes to be tested. Consequently, it launches a single test for all the methods in the DAL assembly (it cannot know the number and names of the DAL classes in advance).

To overcome this:

The log is structured to make it easy to search for specific errors when a test fails (search for "-FAIL"; methods are grouped by DAL class).

If something fails, execution continues until the last method, in order to show all the errors in the DAL code.

If you need or want a Test per DALClass, I propose a workaround: create your own specific test method per DALClass. In this way, you will achieve what you want. However, you will lose the auto-enrollment feature: any newly added DAL method will not be tested until you write a new specific test method for it.