
Originally I set it up like this:
1. Query the data
2. Copy it to Excel
3. Do a million edits in Excel (the list above and more)
4. Remove duplicates
5. Save as CSV
6. Add several other columns of data (usually with the same words in them, like "Yes" to indicate an active product)
7. Separate category paths and split them into multiple columns
8. Import into the MySQL db for the ecommerce site
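Much of the list above can be collapsed by loading the raw export straight into MySQL first and doing the edits there. A minimal sketch, assuming a hypothetical staging table and column names (yours will differ):

```sql
-- Staging table for the raw export (table and column names are illustrative)
CREATE TABLE staging_products (
    sku      VARCHAR(64),
    name     VARCHAR(255),
    category VARCHAR(255),
    price    DECIMAL(10,2)
);

-- Pull the CSV in directly, skipping the header row
LOAD DATA LOCAL INFILE 'products.csv'
INTO TABLE staging_products
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
IGNORE 1 LINES;
```

Once the data is in a table, the dedupe/cleanup steps become repeatable queries instead of one-off Excel edits.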

Believe me, there's even more. I really think, though, that there has to be an easier way. I have to go through this same process every three months and it's tedious. I'd like to set something up to ease the pain a bit.

One additional challenge I face is this: if one db has 40,000 records in it and the one that is to replace it has 41,000, there are 1,000 new records. No big deal, right? But some of the original 40,000 are no longer available, so basically I have to dump the whole original db.

If a file name or sku is off by one character, then the records from the old db (page names, reviews assigned to skus and such) won't line up and voilà, a giant mess.
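One way to make that mismatch visible before it becomes a mess is to join the old and new tables on sku and list exactly what appeared and disappeared. A hedged sketch, assuming both versions live in tables with a `sku` column (names here are made up):

```sql
-- Records in the new import that weren't in the old db
SELECT n.sku
FROM new_products n
LEFT JOIN old_products o ON o.sku = n.sku
WHERE o.sku IS NULL;

-- Records from the old db that are no longer available
SELECT o.sku
FROM old_products o
LEFT JOIN new_products n ON n.sku = o.sku
WHERE n.sku IS NULL;
```

A sku that's off by one character will show up in both lists (once as "new", once as "gone"), which makes those near-misses easy to spot.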

So for this next Access db cleanup I'd like to set up a method that is not forgettable, or maybe is programmed so that I can't possibly mess it up.

Does anyone have any ideas at all as to how to tackle this?

Note: Currently (if I could find my original setup docs, that is) it takes me about 3 hours to go through 40,000 records, clean them up, and get them ready for a CSV import to MySQL. Not that big a deal, but it's mind-numbing, slow, tedious, and lots of other words that indicate it's not fun in any way.

Bear in mind you could do the same data manipulation / editing in MySQL.
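For example, the kind of edits normally done by hand in Excel can often be expressed as update queries. A rough sketch with made-up cleanup rules, assuming a staging table with `sku`, `name`, `active` and `category_path` columns:

```sql
-- Trim stray whitespace that would make skus mismatch
UPDATE staging_products
SET sku = TRIM(sku),
    name = TRIM(name);

-- Mark every imported row as an active product
UPDATE staging_products
SET active = 'Yes';

-- Split a category path like 'Home > Garden > Tools' into separate columns
UPDATE staging_products
SET category_top = TRIM(SUBSTRING_INDEX(category_path, '>', 1)),
    category_sub = TRIM(SUBSTRING_INDEX(SUBSTRING_INDEX(category_path, '>', 2), '>', -1));
```

Saved as a script, these run the same way every quarter, which takes the "forgettable" part out of the process.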

Personally, I'd ditch the intermediate step in Excel. Spreadsheets are great for data analysis and what-if scenarios, but they're pretty crap at data storage, pretty crap at data integrity, and it's easy to mess things up in a spreadsheet and not realise for a while that things are messed up.

Personally, I'd be tempted to stuff the data immediately into MySQL, but with a flag to indicate that the record isn't released for public consumption just yet.
I'd do your editing (well, as Sindho points out, update queries) on the MySQL data, then remove the flag to indicate the record is now ready for public consumption.
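That flag approach might look something like this (the table and column names are just for illustration):

```sql
-- Add a visibility flag; freshly imported rows default to hidden
ALTER TABLE products
ADD COLUMN is_live TINYINT(1) NOT NULL DEFAULT 0;

-- ...do the cleanup with update queries while is_live = 0...

-- Release the cleaned-up batch to the public site in one step
UPDATE products
SET is_live = 1
WHERE is_live = 0;
```

The site's queries would then filter on `WHERE is_live = 1`, so half-cleaned records can never leak onto the storefront.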

If that isn't appropriate, then I'd keep the data in Access until I was happy with it, then pump it across to MySQL.

I have a legacy website which was set up before the advent of ADSL, strictly dial-up lines, so it's a split db: the data entry and validation is done in Access, then transferred onto the live MySQL site. It's on my to-do list to get rid of the Access stage and stay purely in MySQL. With modern broadband connections there's no need for it, and it cuts out another application that needs maintaining.