Most efficient way to import data into a MS SQL database

Contest Summary

We are looking for the best way to read data from a database or file (the source), compare this data against the data in our database, update changed details, and insert new records.
Our system is for ID card production. Our card production software is often installed in multiple locations, all using the data in our MS SQL database to look up and print the details for each person, and also updating this database when a card is produced.

Our database is MS SQL; the source data may come from a database or from a CSV file, depending on the client.
We have created several different import routines and have experienced the issues below.
Usually the source data will be close to 30k records.
The source data usually does not contain a last-edited date or similar, so we cannot tell which records in the source data are new or changed.

The problems we are trying to overcome:
The overall time the import/compare process takes from start to finish is too long (close to an hour in some situations).
The import process makes the database unreliable when it is being accessed by other systems during the import. The database is in continual use by the ID production system, so the import process cannot lock records, restrict access, or slow the database server.

We are not looking for someone to write an application, as we will do that ourselves; we ARE looking for someone to design the best way to perform this process.
I have attached additional specifications

The winner of this competition will be the person who designs the best way to perform this task.

General Clarification Board

Notes
The author has considerable experience with application systems, Java, .NET, XML processing, JSON and SQL Server (including SQL Server 2008 R2).
The proposal is based on experience with a commercial software system for filing messages, which provided an XML transaction system as defined here.

Part 3 - Transaction Management
An administrator can query the transaction file to review transactions which have failed, so they can be resubmitted as new transactions.
An administrator can periodically purge transactions which have been processed before a certain date.
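The two administration operations above — reviewing failed transactions and purging old processed ones — can be sketched as follows. This is a minimal in-memory stand-in for the SQL Server transaction table; the field names (`status`, `processedOn`) and status values are illustrative assumptions, not a fixed schema.

```java
import java.time.LocalDate;
import java.util.*;
import java.util.stream.Collectors;

// Sketch of the Part 3 administration queries over the transaction table,
// using an in-memory list in place of SQL Server.
public class TransactionAdmin {

    static class Txn {
        final int id;
        final String status;          // assumed states: "PROCESSED" / "FAILED"
        final LocalDate processedOn;  // illustrative column name
        Txn(int id, String status, LocalDate processedOn) {
            this.id = id; this.status = status; this.processedOn = processedOn;
        }
    }

    public static void main(String[] args) {
        List<Txn> txns = new ArrayList<>(List.of(
            new Txn(1, "PROCESSED", LocalDate.of(2013, 1, 10)),
            new Txn(2, "FAILED",    LocalDate.of(2013, 1, 11)),
            new Txn(3, "PROCESSED", LocalDate.of(2013, 2, 1))));

        // Review failed transactions so they can be resubmitted as new ones.
        List<Integer> failed = txns.stream()
            .filter(t -> t.status.equals("FAILED"))
            .map(t -> t.id)
            .collect(Collectors.toList());

        // Purge transactions processed before a chosen cut-off date.
        LocalDate cutoff = LocalDate.of(2013, 1, 31);
        txns.removeIf(t -> t.status.equals("PROCESSED") && t.processedOn.isBefore(cutoff));

        System.out.println("failed=" + failed + " remaining=" + txns.size());
    }
}
```

In the real design both operations would be plain queries/deletes against the Transaction table; the in-memory list only shows the filtering logic.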
Part 4 - Other
It is recommended that foreign key constraints be added to the database to ensure integrity.
It is recommended that the use of indexes be reviewed, as these might impact the performance of the underlying updates. It is also possible that indexes may need to be rebuilt during processing to prevent problems with fragmentation and so forth.

Part 2 - Processing transaction
Transactions are processed either as a continuous process or as a process started manually.
The transaction processing can be implemented as a .NET program (e.g. C#) or as a T-SQL procedure. A .NET application program is recommended, as it allows finer control over record locking, parsing, processing and error handling.
The transaction process reads unprocessed transaction records (in FIFO fashion). It then parses the JSON document to create an intermediate record structure. If the transaction is an update, the system uses the record identifier to retrieve the record, or creates a new record if it does not exist. It then walks through all of the non-null intermediate record fields and applies the updates to the corresponding database columns.
If a transaction does not succeed due to a parse failure, an incorrect identifier or a foreign key violation, the transaction is aborted and a message is written to the database.
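The core of the processing step — fetch-or-create the record, then apply only the non-null fields — can be sketched as below. This assumes the JSON has already been parsed into a field map (the proposal leaves the parser library open), and uses in-memory maps in place of the SQL Server tables; the record identifiers and column names are hypothetical.

```java
import java.util.*;

// Sketch of the Part 2 processing loop. A real implementation would read
// from the Transaction table and update the database; here both are maps.
public class TransactionProcessor {

    static class Txn {
        final String recordId;                 // identifier of the target record
        final Map<String, String> fields;      // only the fields to be updated
        Txn(String recordId, Map<String, String> fields) {
            this.recordId = recordId; this.fields = fields;
        }
    }

    // The "database": recordId -> column values (stands in for the person table).
    static Map<String, Map<String, String>> db = new HashMap<>();

    // Apply one update: retrieve the record (or create it if missing), then
    // walk the non-null intermediate fields and apply them to the columns.
    static void apply(Txn t) {
        Map<String, String> row = db.computeIfAbsent(t.recordId, k -> new HashMap<>());
        for (Map.Entry<String, String> e : t.fields.entrySet()) {
            if (e.getValue() != null) {        // undefined fields are left untouched
                row.put(e.getKey(), e.getValue());
            }
        }
    }

    public static void main(String[] args) {
        // FIFO queue of unprocessed transactions.
        Deque<Txn> unprocessed = new ArrayDeque<>();
        unprocessed.add(new Txn("P100", Map.of("name", "Alice", "dept", "HR")));

        Map<String, String> partial = new HashMap<>();
        partial.put("dept", "Finance");        // only dept changes; name untouched
        unprocessed.add(new Txn("P100", partial));

        while (!unprocessed.isEmpty()) {
            apply(unprocessed.pollFirst());
        }
        System.out.println(db.get("P100").get("name") + "/" + db.get("P100").get("dept"));
    }
}
```

The second transaction defines only `dept`, so the earlier `name` value survives — this is the "only fields to be updated need to be defined" property the proposal relies on.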

The crux of the proposal is to treat this as an application problem and properly design a transaction system to process the updates and insertions to the patron system. A key part of the proposal is a well-defined transaction specification using a hierarchical document format such as JSON. Only fields which are to be updated need to be defined in the transaction.
Part 1 - Upload of transaction records
Records from remote data sources are converted into the transaction format. These records are imported into a transaction table, Transaction, within the SQL Server database. The transactions can be imported as a batch or via a continuous process.
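The upload step can be sketched as follows: source rows (a CSV source is used here, one of the two source types in the brief) are converted into transaction records and appended to a queue standing in for the Transaction table. The column names are assumptions for illustration; in the real design each transaction would be serialised as a JSON document.

```java
import java.util.*;

// Sketch of the Part 1 upload step: CSV rows -> transaction records.
public class TransactionUpload {

    public static void main(String[] args) {
        // Sample source rows; a real import would stream these from a file.
        String[] csv = {
            "P100,Alice,HR",
            "P101,Bob,Finance"
        };
        String[] columns = {"recordId", "name", "dept"};  // hypothetical schema

        // Each transaction carries the identifier plus the fields to set.
        List<Map<String, String>> transactionTable = new ArrayList<>();
        for (String line : csv) {
            String[] values = line.split(",");
            Map<String, String> txn = new LinkedHashMap<>();
            for (int i = 0; i < columns.length; i++) {
                txn.put(columns[i], values[i]);
            }
            transactionTable.add(txn);                     // batch insert
        }
        System.out.println(transactionTable.size() + " transactions queued; first="
            + transactionTable.get(0).get("recordId"));
    }
}
```

Because the conversion only builds transaction records and never touches the live tables, the upload can run as a batch or continuously without locking the production database.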

Sir, our team has already worked on this kind of project, where we used it to transfer trading records. I have read the attached document and understood all of your requirements. Looking forward to your reply.

Hello sir,
I have read your requirements. I think I can design a new way to resolve your issue, such as:
1. Create replication for the remote database, so data will be synced immediately on every update/insert/delete.
2. Create an Agent job to push data from the remote database into your database.
3. If you don't have any permissions on the remote database and only have a data file such as a CSV, I can create a SQL Server Agent job to compare them.

I have over 3 years of experience working with SQL databases, so I think I have a good way to resolve your issue.