Saturday, 7 March 2009

Database structure optimization log

We have just started a database optimization exercise. This is a log of the first steps that we took in doing this. Here is a query that lists the number of seeks, scans, lookups and updates that are made on the indexes of tables starting with "MOD_":
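(The listing below is a reconstruction, not the exact query we ran; it assumes the sys.dm_db_index_usage_stats DMV available from SQL Server 2005 onwards, and the column list is an assumption.)

```sql
-- Index usage counters for tables starting with "MOD_"
-- ([_] matches a literal underscore; _ alone is a LIKE wildcard)
SELECT o.name AS TableName,
       i.name AS IndexName,
       s.user_seeks, s.user_scans, s.user_lookups, s.user_updates
FROM sys.dm_db_index_usage_stats s
JOIN sys.objects o ON o.object_id = s.object_id
JOIN sys.indexes i ON i.object_id = s.object_id
                  AND i.index_id  = s.index_id
WHERE s.database_id = DB_ID()
  AND o.name LIKE 'MOD[_]%'
ORDER BY s.user_updates DESC
```

An index with high user_updates but zero seeks, scans and lookups is being maintained on every write without ever helping a read, which is what pointed us at the delete-statement timing below.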

There are a lot of updates to these indexes, but the indexes are never used for reads. The next step was to time the delete statement several times with and without the index: it took 3,062 ms with the index and 718 ms without.
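The timing itself can be done with SET STATISTICS TIME; the table, column and index names below are hypothetical, since the post does not show the actual delete statement:

```sql
SET STATISTICS TIME ON;

-- Hypothetical delete; run several times, then drop the unused
-- index and repeat to compare the elapsed times reported.
DELETE FROM MOD_Log WHERE CreatedDate < '20080101';

SET STATISTICS TIME OFF;
-- e.g. DROP INDEX IX_MOD_Log_CreatedDate ON MOD_Log; then re-run
```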

The next part of the procedure was to set up a test server where we could generate a controlled load. We monitored the server using SQL Profiler and then looked through the logs.

Here is a summary of the findings:

- At the start there were a lot of logins
- There were several places where the same select statement was being executed
- There were a lot of updates related to creating a cumulative frequency
- Connections were not closed

We were able to fix the login and connection problems by modifying the base class used in our data access layer to:

    Private IsInTransaction As Boolean = False

    Public Function OpenConnection() As OleDbConnection
        REM Establish connection to database
        If objConn Is Nothing Then
            objConn = New OleDbConnection
        End If
        If objConn.State <> ConnectionState.Open Then
            objConn.ConnectionString = ConnStr
            objConn.Open()
        End If
        Return objConn
    End Function

    Public Sub CloseConnection(ByRef objConnection As OleDbConnection)
        REM Close connection to database, unless a transaction is still open
        If (Not IsInTransaction) Then
            objConnection.Close()
        End If
    End Sub

    Public Function BeginTransaction() As Boolean Implements IDataBase.BeginTransaction
        REM Start a transaction
        IsInTransaction = True
        Return ExecuteSQL("Begin Transaction ")
    End Function

    Public Function EndTransaction() As Boolean Implements IDataBase.EndTransaction
        REM Commit the transaction
        IsInTransaction = False
        Return ExecuteSQL("commit ")
    End Function

The problem with the cumulative frequency calculation was fixed with a stored procedure.
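The idea is to replace the many row-by-row updates Profiler showed with a single set-based statement inside the procedure. The schema below is hypothetical, and the correlated subquery is just one way to compute a running total on SQL Server 2008 (windowed SUM ... OVER (ORDER BY) only arrived in SQL Server 2012):

```sql
-- Hypothetical table: MOD_Frequency(SortKey, Frequency, CumulativeFreq)
-- One statement replaces a loop of per-row UPDATEs.
UPDATE f
SET CumulativeFreq = (SELECT SUM(f2.Frequency)
                      FROM MOD_Frequency f2
                      WHERE f2.SortKey <= f.SortKey)
FROM MOD_Frequency f
```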

The next step was to look at index fragmentation. The value for avg_fragmentation_in_percent should be as close to zero as possible for maximum performance, although values from 0 percent through 10 percent may be acceptable. All methods of reducing fragmentation, such as rebuilding, reorganizing, or re-creating the index, can be used to reduce these values (http://msdn.microsoft.com/en-us/library/ms188917.aspx).
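A sketch of the check, using the sys.dm_db_index_physical_stats function described on that MSDN page (the 10/30 percent thresholds follow Microsoft's guidance there; the exact query we used is an assumption):

```sql
-- Indexes in the current database with more than 10% fragmentation
SELECT OBJECT_NAME(ps.object_id) AS TableName,
       i.name AS IndexName,
       ps.avg_fragmentation_in_percent
FROM sys.dm_db_index_physical_stats(DB_ID(), NULL, NULL, NULL, 'LIMITED') ps
JOIN sys.indexes i ON i.object_id = ps.object_id
                  AND i.index_id  = ps.index_id
WHERE ps.avg_fragmentation_in_percent > 10
ORDER BY ps.avg_fragmentation_in_percent DESC

-- Roughly: 10-30%  -> ALTER INDEX ... REORGANIZE
--          over 30% -> ALTER INDEX ... REBUILD
```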

As we investigated the queries by looking at the actual execution plan in SQL Server 2008, there is an output that suggests any missing indexes that could be used to optimize the select statement, e.g. Missing Index (Impact 98.0075): CREATE INDEX ......
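The same suggestions can be pulled in bulk from the missing-index DMVs rather than one plan at a time; this is a sketch using the standard sys.dm_db_missing_index_* views (available since SQL Server 2005), not the exact query from our log:

```sql
-- Missing-index suggestions for the whole server, highest impact first
SELECT d.statement AS TableName,
       d.equality_columns,
       d.inequality_columns,
       d.included_columns,
       s.avg_user_impact,
       s.user_seeks
FROM sys.dm_db_missing_index_details d
JOIN sys.dm_db_missing_index_groups g
     ON g.index_handle = d.index_handle
JOIN sys.dm_db_missing_index_group_stats s
     ON s.group_handle = g.index_group_handle
ORDER BY s.avg_user_impact DESC
```

These are only suggestions; as the unused "MOD_" indexes above showed, every index added has a write cost, so each candidate should be tested the same way.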