Oracle – for when it was like that when you got there


Oracle External Tables or What I did on my Holidays

This week’s missive is coming to you from the netbook. Deb and I have pushed the boat out this year and we’re currently in Canada for our holiday.
This has nothing at all to do with Oracle External Tables, but does explain the flavour of the examples that follow.

The Anti-Kyte on holiday...the costume didn't catch on

What are they?

External tables are a mechanism by which you can load flat file data into your database, or by which you can export data from your database.

Hold on, what about SQL*Loader for data uploads? And what exactly is wrong with Data Pump for exports? Come to that, why not just use UTL_FILE for reading/writing files?
Well, these are all pertinent questions.
In fact, External Tables make use of the SQL*Loader API for loading data and the Data Pump API for exporting data.
They are essentially another means of achieving what you can do with all of these tools. Which one you choose will no doubt depend on what you want to accomplish.
The big difference is that you can use External Tables to query data not stored in the database, whereas the other tools listed here are more concerned with getting data into the database first.
As I want to knock out a blog post about External Tables, well, you can guess which one I’ll be using here.
Our starting point is a table that contains all of the places I’m going to visit, together with the dates I’ll be there:
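The original listing hasn’t survived in this copy of the post, so here’s a plausible stand-in: a csv file holding the itinerary data that the external table will read. The file name, columns and contents are my assumptions, not the post’s originals.

```
destination,arrival_date
Toronto,01-AUG-2012
Niagara Falls,04-AUG-2012
Ottawa,07-AUG-2012
Montreal,10-AUG-2012
```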

Setting up a directory

The first step is to tell the database where to look for the data we’re going to load into our external table. This involves defining a directory object to point to the os directory where the file is going to reside.
This is exactly the same step that you need to take when reading/writing files using UTL_FILE so, in best Blue Peter tradition, here’s one I made earlier:
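The directory listing itself is missing from this copy, so the statement below is a sketch of the sort of thing that would have been here. The directory name, path and grantee are all illustrative assumptions.

```sql
-- Illustrative only - directory name, OS path and user are assumptions.
-- Requires the CREATE ANY DIRECTORY privilege (or run as a DBA).
CREATE OR REPLACE DIRECTORY my_files AS '/u01/app/oracle/my_files';

-- Whoever owns the external table needs access to the directory object...
GRANT READ, WRITE ON DIRECTORY my_files TO mike;
```

Note that the directory object merely points at the OS directory; the database doesn’t check the path exists until you try to use it.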

If you get an error when querying the external table despite all of this being in place, then it’s more than likely a permissions issue on the OS. Check that the Oracle software owner (usually oracle) has read and write permissions on the directory (at the OS level rather than in the database) where the csv file is located.
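With the directory sorted, we can create the external table itself. Again, the original DDL is missing here, so the following is a sketch of a typical ORACLE_LOADER external table; the table name matches the `itinerary_ext` mentioned later in the comments, but the columns, file name and format options are assumptions.

```sql
-- A sketch of an external table over the csv. Column names, the csv
-- file name and the date mask are assumptions, not the original listing.
CREATE TABLE itinerary_ext (
    destination  VARCHAR2(50),
    arrival_date DATE
)
ORGANIZATION EXTERNAL (
    TYPE ORACLE_LOADER
    DEFAULT DIRECTORY my_files
    ACCESS PARAMETERS (
        RECORDS DELIMITED BY NEWLINE
        SKIP 1                          -- skip the header row
        FIELDS TERMINATED BY ','
        ( destination  CHAR(50),
          arrival_date CHAR(11) DATE_FORMAT DATE MASK "DD-MON-YYYY"
        )
    )
    LOCATION ('itinerary.csv')
)
REJECT LIMIT UNLIMITED;
```

No data is loaded at creation time; the file is read afresh each time you query the table.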

Assuming all is working as expected, we can use this table in any valid query:
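For example, something along these lines (the predicate is made up for illustration):

```sql
-- Query the flat file as if it were an ordinary table.
SELECT destination, arrival_date
FROM   itinerary_ext
WHERE  arrival_date >= SYSDATE
ORDER  BY arrival_date;
```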

OK, so there are some limitations: you can’t create indexes on an external table, for example, and direct DML isn’t going to work either.
If we want to play with the data inside the database, we’re going to have to load it into a permanent segment…or are we ?
Say you have some external data that you want to manipulate in some way, then dump the output back onto the OS without the overhead of a permanent segment such as a holding table. Well, this is possible using only external tables…provided you want to dump the results into another Oracle database. This is where the ORACLE_DATAPUMP access driver type comes in.

External Table (Part 2) – Dump

Instead of creating a permanent table, we’re going to create a Global Temporary Table:
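Once again the listing is missing, so here’s a sketch of the shape of it: a GTT to hold the transformed data, populated from the external table, and then an ORACLE_DATAPUMP external table created from it via CTAS. The `exp_itinerary` name comes from the comments below; the GTT name, columns and dump file name are assumptions.

```sql
-- Sketch only - the GTT definition and any transformation are assumptions.
CREATE GLOBAL TEMPORARY TABLE gtt_itinerary (
    destination  VARCHAR2(50),
    arrival_date DATE
)
ON COMMIT PRESERVE ROWS;

-- Populate the GTT from the external table (transform here as required)...
INSERT INTO gtt_itinerary
    SELECT destination, arrival_date
    FROM   itinerary_ext;

-- ...then write the results to a dump file on the OS via the
-- ORACLE_DATAPUMP access driver.
CREATE TABLE exp_itinerary
ORGANIZATION EXTERNAL (
    TYPE ORACLE_DATAPUMP
    DEFAULT DIRECTORY my_files
    LOCATION ('exp_itinerary.dmp')
)
AS
    SELECT destination, arrival_date
    FROM   gtt_itinerary;
```

The CTAS both creates the external table and writes the dump file in one go; you can’t write to an ORACLE_DATAPUMP external table after the event.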

So now we have a dump file; it should be simple enough to import into the target database, shouldn’t it? Not using Data Pump import, apparently, which gives you an ORA-31619 error complaining that the dump file wasn’t created by a Data Pump export. Honestly, there’s no pleasing some database utilities!

Sigh, I suppose we’ll just have to create another external table on the target database. This time, however, we specify the access driver type as ORACLE_DATAPUMP…
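The target-side listing is also missing from this copy; a sketch of what it would look like follows. The table and directory names here are illustrative (the directory on the target must, of course, point at wherever you’ve copied the dump file).

```sql
-- On the target database: an external table over the dump file.
-- Table name and directory are assumptions; the column list must
-- match what was written into the dump file.
CREATE TABLE imp_itinerary (
    destination  VARCHAR2(50),
    arrival_date DATE
)
ORGANIZATION EXTERNAL (
    TYPE ORACLE_DATAPUMP
    DEFAULT DIRECTORY my_files
    LOCATION ('exp_itinerary.dmp')
);

-- ...and the dumped data is queryable straight away.
SELECT * FROM imp_itinerary;
```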

2 thoughts on “Oracle External Tables or What I did on my Holidays”

Thanks for that, I hadn’t considered datapump-based external tables before.

When you do the transform and output the results via datapump, why do you use a GTT, instead of basing the table directly on the query? If you skipped the GTT the data would no longer need to be duplicated in the PGA, the query results could go straight out to disk.

You’re absolutely right, the GTT is in fact completely unnecessary and you could reference the itinerary_ext table directly in the creation statement for exp_itinerary. Looking back, the table names I used could’ve been less confusing as well.
Of course, the GTT does make the code a little more readable ( well, I think so, anyway). There’s also my own personal dislike of using too much dynamic SQL, especially DDL statements which are always so much fun to debug.
When all’s said and done however, I just have to hold my hands up and say, erm, I missed that one. Well spotted and thanks for pointing it out.