We use SQL Server 2005 Express on our production floor. I develop the production software that reads from and writes to the database. I'm new to SQL, and I created my table following advice I found on the internet.

In total my table has 121 columns, and 12 client applications connect to it. Our production process has two steps, and my problem is in the first one. First I check whether the product's serial number is already in the table (serial numbers are stored in the MifSeriNo column). If it is not, I UPDATE the MifSeriNo column (which has a unique, non-clustered index) with the product's serial number. This works perfectly for the first 2-3 hours (the UPDATE query takes less than 1 second). After that, the same UPDATE query takes nearly 5 or 6 seconds.
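To make the check-then-update step concrete, here is a minimal sketch of the logic. It uses Python's sqlite3 in-memory database as a runnable stand-in for SQL Server; the column names (ID, MifSeriNo) follow the post, but the table name, the pre-created empty rows, and the `claim_serial` helper are assumptions for illustration:

```python
import sqlite3

# In-memory SQLite stands in for SQL Server here; the table layout is a
# guess based on the post (an ID key plus a unique MifSeriNo column).
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE Production (ID INTEGER PRIMARY KEY, MifSeriNo TEXT)")
# Unique index on the serial-number column, mirroring the post's
# unique non-clustered index.
cur.execute("CREATE UNIQUE INDEX IX_MifSeriNo ON Production(MifSeriNo)")
# Pre-created rows waiting to receive a serial number.
cur.executemany("INSERT INTO Production (ID) VALUES (?)",
                [(i,) for i in range(1, 4)])

def claim_serial(serial):
    """Assign `serial` to the first free row unless it is already present."""
    cur.execute("SELECT 1 FROM Production WHERE MifSeriNo = ?", (serial,))
    if cur.fetchone():
        return False  # serial already recorded, skip the UPDATE
    cur.execute(
        "UPDATE Production SET MifSeriNo = ? "
        "WHERE ID = (SELECT MIN(ID) FROM Production WHERE MifSeriNo IS NULL)",
        (serial,),
    )
    conn.commit()
    return cur.rowcount == 1

print(claim_serial("SN-0001"))  # first claim succeeds
print(claim_serial("SN-0001"))  # duplicate is rejected
```

With the unique index in place, the duplicate check is an index seek rather than a table scan, which is why the first 2-3 hours are fast; the question is why that stops being true later.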

When it slows down, I stop all the client applications, restart the SQL Server service, and sometimes even restart the SQL Server machine. Then I run "Rebuild All" from SQL Server Management Studio. Sometimes these steps bring SQL Server back to normal, and sometimes I have to repeat them 2-3 times.

This morning I had the same problem. I performed the corrective steps described above, and it worked normally again. Ten minutes later I checked the fragmentation of the ID and MifSeriNo indexes: ID was 0 and MifSeriNo was 51 (which I think is expected).

I would appreciate your advice. Should we perhaps buy a full edition of SQL Server instead of the Express Edition?

Two things stood out for me in your post. First, your table has 121 columns. It's very rare that I've seen tables that wide end up being the right approach. Can it be broken down and further normalized? (That's a long-term fix.) Second, you're running this on 2005 Express, which is limited to 1 CPU, 1 GB of memory for the buffer pool, and a 4 GB maximum database size. How big is the table in terms of MB? You can use sp_spaceused to find this. Are those 12 clients connected concurrently (meaning all at once), or are there 12 possible connections at various times of day? It really sounds like you've reached the limitations of your server, your database edition, or your design, or some combination of all three. If you can post more details, we can help you refine your solution further.
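The size check matters because hitting the 4 GB Express cap is one plausible cause of the slowdown. On SQL Server the real call is `EXEC sp_spaceused 'YourTable';` — the snippet below is only a runnable illustration of the idea, computing a database's size from page statistics using sqlite3 as a stand-in (the table name and row counts are invented for the example):

```python
import sqlite3

# On SQL Server you would run:  EXEC sp_spaceused 'Production';
# As a runnable stand-in, compute database size from SQLite's
# page_count and page_size pragmas.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE Production (ID INTEGER PRIMARY KEY, MifSeriNo TEXT)")
conn.executemany("INSERT INTO Production (ID) VALUES (?)",
                 [(i,) for i in range(1000)])

page_count = conn.execute("PRAGMA page_count").fetchone()[0]
page_size = conn.execute("PRAGMA page_size").fetchone()[0]
size_mb = page_count * page_size / (1024 * 1024)
print(f"database size: {size_mb:.2f} MB")

# SQL Server 2005 Express caps a database at 4 GB (4096 MB),
# so the remaining headroom would be:
EXPRESS_LIMIT_MB = 4096
print(f"headroom: {EXPRESS_LIMIT_MB - size_mb:.2f} MB")
```

If the reported size is anywhere near the cap, that alone would explain erratic write performance and is a strong argument for upgrading editions or trimming the table.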