I have a geographic file (a shapefile) that consists of around 40 polygons. I also have an Excel document with several tab pages, each containing a few hundred records.

The process of joining the tables, however, is extremely slow, and the time required appears to grow exponentially with the number of joins I create.

In 8 of the Excel tab pages I have data that refers to the polygons in the shapefile. So I try to create 8 joins between the geographic data source and the Excel tab pages, like so:

The problem is that after the third join the whole process gets so slow that Tableau is non-responsive for a long time. Only after the process has finished am I allowed to change the join type from an inner join to a left join (which appears to be a LOT faster).

An example of the data:

The geographic file contains polygons, each of which has an ID in the field owmident. Here's one of them, displayed on top of OpenStreetMap.

The Excel file, e.g. the tab ESF1.Score contains water quality values for each of the polygons:

Question: is there a smarter way to import this data? For starters, I do not need any interaction between the data on the various Excel sheets. So is it really necessary for Tableau to create some sort of huge internal table that contains millions of unique combinations of my records?

Instead of this:

I would much rather have something like this:

Is this even possible in Tableau? Or are there better ways to prepare my data for Tableau?
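One way to sidestep the multi-join entirely is to flatten the workbook outside Tableau before importing it. A minimal sketch with pandas, assuming every sheet carries the same owmident key column; the sheet names and values below are made up, and in practice the dict would come from pd.read_excel("data.xlsx", sheet_name=None):

```python
import pandas as pd
from functools import reduce

# Hypothetical stand-ins for two of the eight Excel sheets.
sheets = {
    "ESF1.Score": pd.DataFrame({"owmident": ["NL01", "NL02"], "score": [0.8, 0.6]}),
    "ESF2.Score": pd.DataFrame({"owmident": ["NL01", "NL02"], "score": [0.5, 0.9]}),
}

# Prefix each sheet's value columns with the sheet name so they stay
# distinct, then left-join everything onto the polygon key in one pass.
renamed = [
    df.rename(columns={c: f"{name}_{c}" for c in df.columns if c != "owmident"})
    for name, df in sheets.items()
]
flat = reduce(lambda a, b: a.merge(b, on="owmident", how="left"), renamed)
```

The result is one row per owmident with one column per sheet value, which Tableau can then join to the shapefile with a single join instead of eight.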

Thanks Deepak. I'm trying a UNION right now. It seems I have too many similar column names across my Excel sheets for that, though, so I'll have to rename them first. Tableau Prep looks great, but for now I should be able to organize the data myself, since there isn't that much of it.

It would be good if you had the same column names across all your datasets; then you could easily union those 8 datasets. A union works by stacking the rows on top of one another, and you get 2 additional fields, called Sheet and Table Name, which you can use to tell the records apart.
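For readers who want to reproduce that stacking outside Tableau, the same union can be sketched in pandas; the sheet names and values below are hypothetical, and the concat keys stand in for Tableau's generated Sheet field:

```python
import pandas as pd

# Two hypothetical sheets with identical column names, as a union requires.
esf1 = pd.DataFrame({"owmident": ["NL01", "NL02"], "score": [0.8, 0.6]})
esf2 = pd.DataFrame({"owmident": ["NL01", "NL02"], "score": [0.5, 0.9]})

# A union stacks rows; the key of each frame becomes an extra "Sheet"
# column, much like the field Tableau adds automatically.
union = pd.concat(
    {"ESF1.Score": esf1, "ESF2.Score": esf2}, names=["Sheet"]
).reset_index(level="Sheet")
```

The long (one row per polygon per sheet) shape this produces is often easier for Tableau to filter than a wide multi-join result.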