I have about a dozen email PST files containing roughly 250,000 emails in aggregate, about 20% of them with attachments.

We are rolling out 360 Works.

Downloading that volume of email from our remote Exchange server seems unworkably slow.

A couple of questions:

Is there a PST file converter that people like to use, so I can break each message out into fields, plus a separate but related attachment field?

Will 250,000 emails, growing to 1 million over time, be feasible speed-wise within FileMaker Pro? If not, I'm considering MySQL as a remote data source, with PHP in a web viewer hitting that source and interacting with it.
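Whichever backend ends up holding the data, the usual shape is a parent email table and a child attachment table keyed on the email's primary key. A sketch using Python's stdlib sqlite3 purely as a stand-in for MySQL (table and column names are illustrative):

```python
import sqlite3

conn = sqlite3.connect(":memory:")  # stand-in for the remote MySQL database
conn.executescript("""
CREATE TABLE emails (
    id        INTEGER PRIMARY KEY,
    sender    TEXT,
    recipient TEXT,
    sent_at   TEXT,
    subject   TEXT,
    body      TEXT
);
CREATE TABLE attachments (
    id        INTEGER PRIMARY KEY,
    email_id  INTEGER NOT NULL REFERENCES emails(id),
    filename  TEXT,
    content   BLOB
);
-- index the foreign key so per-email attachment lookups stay fast at volume
CREATE INDEX idx_attachments_email ON attachments(email_id);
""")

# FileMaker (via ESS) or PHP would read the pair with a join like this:
rows = conn.execute("""
    SELECT e.subject, a.filename
    FROM emails e LEFT JOIN attachments a ON a.email_id = e.id
""").fetchall()
```

The index on the foreign key is the part that matters most at the million-row scale you're describing.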

Why not open the PST files with a local copy of Outlook and automate the email extraction from there?
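On Windows that extraction can be scripted against Outlook's COM object model. A sketch, assuming Outlook and the pywin32 package are installed; the pure mapping function is separated out so the row shape can be adjusted (and tested) without Outlook present:

```python
def item_to_row(item):
    """Map one Outlook MailItem-like object to an importable dict.
    Pure function, so it works on any object with these attributes."""
    return {
        "sender": getattr(item, "SenderEmailAddress", ""),
        "subject": getattr(item, "Subject", ""),
        "received": str(getattr(item, "ReceivedTime", "")),
        "body": getattr(item, "Body", ""),
    }

def extract_pst(pst_path, out_dir):
    # Requires Windows, Outlook, and pywin32 (pip install pywin32)
    import win32com.client
    outlook = win32com.client.Dispatch("Outlook.Application")
    ns = outlook.GetNamespace("MAPI")
    ns.AddStore(pst_path)         # mount the PST as a store
    root = ns.Folders.GetLast()   # the store we just added
    for folder in root.Folders:
        for item in folder.Items:  # note: may include non-mail items
            row = item_to_row(item)
            # ... write row to CSV / insert into the database ...
            for att in item.Attachments:
                att.SaveAsFile(f"{out_dir}\\{att.FileName}")
    ns.RemoveStore(root)          # unmount when done
```

This keeps the slow Exchange download out of the loop entirely, since the PSTs are read locally.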

Speed: impossible to say without looking at the architecture of the solution and what the users (or the system) do with the data. Having several million records in a table is not an issue in and of itself. If that table has 200 fields, however, and most of them are calculations (stored or unstored), and you have layouts on it with many portals and summary fields, then you are going to feel a lot of pain.

So the design is the key, not the amount of data. Obviously, having a lot of data will expose design flaws more quickly.