We are dealing with very large data: more than 2 million records per file. Users upload these files to the server, and the data is then used by researchers to run analyses.

The initial design was to load every user's data into the same table D, tagged with the user's session id and a file_name. Table D was then used downstream in joins and SELECTs.
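To make the shared-table approach concrete, here is a minimal sketch. The table and column names are assumptions, and sqlite3 is used only so the example is self-contained; the actual server may be MySQL or similar.

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# One shared table D: every user's upload lands here,
# distinguished only by session_id and file_name.
conn.execute("""
    CREATE TABLE d (
        session_id TEXT NOT NULL,
        file_name  TEXT NOT NULL,
        payload    TEXT
    )
""")
conn.execute("INSERT INTO d VALUES ('sess-1', 'a.csv', 'row data')")
conn.execute("INSERT INTO d VALUES ('sess-2', 'b.csv', 'row data')")

# Every downstream join/SELECT must filter by session_id,
# so an index on it is essential as the table grows.
conn.execute("CREATE INDEX idx_d_session ON d (session_id)")
rows = conn.execute(
    "SELECT file_name FROM d WHERE session_id = ?", ("sess-1",)
).fetchall()
print(rows)  # [('a.csv',)]
```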

The problem: when we tested with more than 3 concurrent users, performance was very poor despite indexes on the joined and searched columns. The data in this table grows very fast. There is also a table-level locking problem, which forces SELECT statements to wait.

So we are now considering creating a separate table per user, with the session id appended to the table name, and using those tables in the joins and in computing what the user needs. Each table would be dropped when the user's session ends, by a separate daemon program executing a stored procedure.
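A rough sketch of the per-session-table idea, with assumed names and sqlite3 standing in for the real database, so the shape of the create/drop lifecycle is clear:

```python
import sqlite3

def create_session_table(conn, session_id):
    # Table name carries the session id, e.g. d_sess_1.
    # NOTE: table names cannot be bound as parameters, so session_id
    # must be validated/sanitized before interpolation in real code.
    table = f"d_{session_id}"
    conn.execute(f"CREATE TABLE {table} (file_name TEXT, payload TEXT)")
    return table

def drop_session_table(conn, session_id):
    # What the cleanup daemon / stored procedure would run at session end.
    conn.execute(f"DROP TABLE IF EXISTS d_{session_id}")

conn = sqlite3.connect(":memory:")
t = create_session_table(conn, "sess_1")
conn.execute(f"INSERT INTO {t} VALUES ('a.csv', 'row data')")
count = conn.execute(f"SELECT COUNT(*) FROM {t}").fetchone()[0]
print(count)  # 1
drop_session_table(conn, "sess_1")
```

One design note: because the table name varies per session, every downstream query has to be built dynamically, which complicates prepared statements and query plan caching.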

I have not seen much discussion of this design in any forum, and we don't have DBA expertise. Is there any flaw in the design we are considering? What pitfalls are we failing to foresee?