
However, there are a few cases where I was able to reclaim even more space by executing DBCC CLEANTABLE. With hundreds of databases in my environment, it's not possible to know what users do in each one, and it's entirely plausible that changes will involve dropping fixed-length columns. I've typically found these opportunities by comparing row counts to page counts in some object space utilization scripts I've written. I'd like to take this a step further by attempting to automate the detection of these scenarios.

What I'd like to know is if anyone out there is actively monitoring for these sorts of opportunities and if so, what do you look for specifically?

My thought was to write something that gathers the maximum and minimum row size, the number of rows in the table, the number of pages allocated, and the number of pages used, then does some basic math to log results that fall well outside what would be "expected".
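As a rough illustration of that math, a sketch like the following could compare actual page allocation against what the observed record sizes predict. This is not a finished monitor: the 8060-byte usable-page figure is a simplification, and the 30% threshold is an assumed placeholder you would tune per environment.

```sql
-- Sketch: flag tables whose in-row allocation is far larger than
-- row count * average record size would predict.
SELECT
    OBJECT_SCHEMA_NAME(ips.object_id) AS schema_name,
    OBJECT_NAME(ips.object_id)        AS table_name,
    ips.record_count,
    ips.min_record_size_in_bytes,
    ips.max_record_size_in_bytes,
    ips.page_count                    AS pages_allocated,
    -- pages the rows "should" need at the observed average record size
    CEILING(ips.record_count * ips.avg_record_size_in_bytes / 8060.0)
                                      AS pages_expected
FROM sys.dm_db_index_physical_stats(DB_ID(), NULL, NULL, NULL, 'SAMPLED') AS ips
WHERE ips.index_level = 0
  AND ips.alloc_unit_type_desc = 'IN_ROW_DATA'
  AND ips.record_count > 0
  -- assumed threshold: 30%+ more pages allocated than expected
  AND ips.page_count >
      1.3 * CEILING(ips.record_count * ips.avg_record_size_in_bytes / 8060.0);
```

Note that `sys.dm_db_index_physical_stats` must be run in `SAMPLED` or `DETAILED` mode for the record-size columns to be populated, and it takes locks, so you would schedule this off-hours across hundreds of databases.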

1 Answer

The solution I'd consider for this problem is a weekly job that runs *sp_spaceused* for every table in a database and saves the output to a table. If a table's size differs from the previous capture by more than, say, 10%, I would run DBCC CLEANTABLE.
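A minimal sketch of that job might look like the following. The history table name, its column names, and the 10% threshold are all assumptions; note that *sp_spaceused* returns sizes as strings like `'1024 KB'`, so they are parsed before comparing.

```sql
-- Assumed history table for the weekly captures.
CREATE TABLE dbo.SpaceUsedHistory (
    capture_date date        NOT NULL DEFAULT (CONVERT(date, GETDATE())),
    table_name   sysname     NOT NULL,
    row_count    varchar(20) NULL,
    reserved     varchar(20) NULL,   -- e.g. '1024 KB'
    data         varchar(20) NULL,
    index_size   varchar(20) NULL,
    unused       varchar(20) NULL
);
GO
-- Capture sp_spaceused output for every user table.
DECLARE @t nvarchar(520);
DECLARE c CURSOR LOCAL FAST_FORWARD FOR
    SELECT QUOTENAME(SCHEMA_NAME(schema_id)) + N'.' + QUOTENAME(name)
    FROM sys.tables;
OPEN c;
FETCH NEXT FROM c INTO @t;
WHILE @@FETCH_STATUS = 0
BEGIN
    -- sp_spaceused @objname returns: name, rows, reserved, data, index_size, unused
    INSERT INTO dbo.SpaceUsedHistory
        (table_name, row_count, reserved, data, index_size, unused)
    EXEC sys.sp_spaceused @objname = @t;
    FETCH NEXT FROM c INTO @t;
END
CLOSE c; DEALLOCATE c;
GO
-- Flag tables whose reserved space changed by more than the assumed 10%
-- between the two most recent weekly captures.
WITH h AS (
    SELECT table_name, capture_date,
           CAST(REPLACE(reserved, ' KB', '') AS bigint) AS reserved_kb
    FROM dbo.SpaceUsedHistory
)
SELECT cur.table_name, prev.reserved_kb AS prev_kb, cur.reserved_kb AS cur_kb
FROM h AS cur
JOIN h AS prev
  ON prev.table_name   = cur.table_name
 AND prev.capture_date = DATEADD(DAY, -7, cur.capture_date)
WHERE ABS(cur.reserved_kb - prev.reserved_kb) > 0.10 * prev.reserved_kb;
```

The flagged tables would then be candidates for `DBCC CLEANTABLE`, which you'd likely run with a batch size to limit transaction log impact.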