I have some SQL Server DTS packages that import data from a FoxPro database. They were working fine until recently. The script that imports data from one of the FoxPro tables bombs out about 470,000 records into the import. I'm just pulling the data into a table with nullable varchar fields, so I suspect it's a weird/corrupt data problem.

What tools would you use to track down a problem like this?

FYI, this is the error I'm getting:

Data for source column 1 ('field1') is not available. Your provider may require that all Blob columns be rightmost in the source result set.

My company uses FoxPro to store a great deal of data... In my experience, data corruption is usually very obvious, with the table failing to open in the first place. Do you have a copy of FoxPro to open the table with?

Just a thought; I don't know for sure whether it's a corruption problem. I just started downloading FoxPro from my MSDN Subscription, so I'll see if I can open the table. SSRS opens the table; it just chokes before you can run through all of the records. I'm still trying to figure out which record it's having a problem with.
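One low-tech way to narrow down the offending record is to walk the raw .dbf file yourself. A minimal sketch in Python, assuming the standard dBASE/FoxPro header layout (record count at bytes 4-7, header size at 8-9, record size at 10-11, and a deletion-flag byte of ' ' or '*' at the start of each record); the function name is mine, not part of any library:

```python
import io
import struct

def scan_dbf(stream):
    """Scan a FoxPro/dBASE .dbf stream and report records whose deletion
    flag byte is not b' ' (active) or b'*' (deleted), or that are truncated.
    Either condition is a common symptom of a corrupt table."""
    header = stream.read(32)
    # Little-endian: uint32 record count, uint16 header length, uint16 record length
    rec_count, header_len, rec_len = struct.unpack('<IHH', header[4:12])
    stream.seek(header_len)  # records begin immediately after the header
    suspects = []
    for i in range(rec_count):
        rec = stream.read(rec_len)
        if len(rec) < rec_len:
            suspects.append((i, 'truncated'))
            break
        if rec[0:1] not in (b' ', b'*'):
            suspects.append((i, 'bad deletion flag'))
    return suspects
```

If the scan flags a record index in the neighborhood of 470,000, that is a strong hint the DTS failure and the file damage line up.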

@Lance:

If you can get to the Visual FoxPro command window, type:

SET TABLEVALIDATE 11
USE "YourTable" EXCLUSIVE && If the table is damaged, VFP should display an error here
PACK && Removes records marked for deletion and rebuilds the table
PACK MEMO && If you have memo fields

After doing that, the structure of the table should be valid. If you then want to look for fields with invalid data, you can try:

SELECT * FROM YourTable WHERE EMPTY(YourField) && All records with YourField empty
SELECT * FROM YourTable WHERE LEN(YourMemoField) > 200 && All records with a long memo field; there could be corrupted data

etc.

At 470,000 records you might want to check whether you are approaching the 2-gigabyte limit on FoxPro table size. As I understand it, the records can still be there, but they become inaccessible past the 2 GB point.
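The 2 GB ceiling is easy to sanity-check with arithmetic. A back-of-the-envelope sketch, assuming the documented 2 GB (2^31 - 1 byte) cap on a single .dbf file; the header and record lengths below are hypothetical, so read the real ones from bytes 8-9 and 10-11 of your table's header:

```python
DBF_MAX_BYTES = 2**31 - 1  # 2 GB file-size cap on a .dbf

header_len = 296   # hypothetical: 32-byte header plus some field descriptors
record_len = 4200  # hypothetical wide record with many character fields

# Largest record count that still fits under the cap
max_records = (DBF_MAX_BYTES - header_len) // record_len
print(max_records)
```

With a record that wide, the cap lands in the ~500,000-record range, which is uncomfortably close to where the import dies; a narrow record would put the limit in the millions, and the 2 GB theory would be ruled out.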