chester c young wrote:
> 1. like the idea because right now I'm having to track which prepared statement (sets) are in which session. using xcache to track this, but little bit dangerous. could query the database first but the whole idea is to take a load off the db.
>
Thanks for the support. The whole idea is to have the DBMS support as much
SQL-related functionality as possible. This way you wouldn't have to
wonder whether the client has already prepared the statement or not: if
it's stored in the public schema, then it MUST be there.
> 2. perhaps "global" could mean simply that the definition is global - if called for session and not exist in session, then session prepares it first from the global def. there would need to be a back reference in case the global def was changed or dropped.
>
Yes, this seems to be a good idea. Something like this would be easier
to implement than having the whole process run in some shared memory
space. The implementation could be like:
1. App calls SQL like "EXEC <statement_name>"
2. PG engine looks first in local prepared statements, as it does now
3. If it doesn't find it there, it looks in the public space
4. If it's found there, copy both the SQL and the execution plan stored
in the global space to the local process
5. Execute the statement as if it had been prepared in the local process.
Simplified greatly, the new implementation would "only" add steps 3 and 4
to the current implementation of PREPARED statements.
> 3. don't understand your statement about how prepared statements can be used as subqueries - you mean something like
>
> select .. from tab1 t1 where t1.col1 in( exec prep1(..) )
>
> or exactly what?
>
Well, it's best explained with an example. Note that such a construction
would require extra coding from the PG developers.
##1 Let's pretend we have a prepared statement:
PREPARE PS_Sector (InSector INTEGER) AS
SELECT SectorID,SectorName,SectorType
FROM CompanySectors
WHERE SectorID = InSector;
##2 Now we can build another statement over the first one, like this:
PREPARE PS_SectorExpences (InDateFrom TIMESTAMP, InDateTo TIMESTAMP,
InSector INTEGER) AS
SELECT S.SectorID,S.SectorName,S.SectorType,E.ExpType,E.Ammount
FROM PS_Sector AS S INNER JOIN CompanyExpences AS E
ON S.SectorID=E.SectorID
WHERE E.Date>=InDateFrom AND E.Date<=InDateTo;
Let me explain. Statement ##2 uses PS_Sector in a direct join with the
expenses table. By automatically passing the parameter InSector "by name"
to the underlying prepared statement, it first returns the results from
it, then joins them to the table CompanyExpences on the field SectorID.
This may look complicated to implement, but it's just a simplified
version of implementing SUB-QUERIES. The same effect is gained if you
simply replace the PS_Sector reference in ##2 with the actual SQL
statement it holds:
PREPARE PS_SectorExpences (InDateFrom TIMESTAMP, InDateTo TIMESTAMP,
InSector INTEGER) AS
SELECT S.SectorID,S.SectorName,S.SectorType,E.ExpType,E.Ammount
FROM (SELECT SectorID,SectorName,SectorType
FROM CompanySectors
WHERE SectorID = InSector) AS S INNER JOIN CompanyExpences AS E
ON S.SectorID=E.SectorID
WHERE E.Date>=InDateFrom AND E.Date<=InDateTo;
Only this way the subquery can come with the execution plan from the ##1
prepared statement.
Where could this be used?
Since there is an obvious performance cost, you must have a good reason
for using such an approach. Mostly, it's reporting!
A good report relies on the data it gets. Sometimes the statements needed
for reporting purposes include very complex data selection. It usually
takes a long time until "everything is covered" by your SELECT query. At
that point you can implement the core logic in a "base prepared
statement" and then use prepared statements over it to get various
aggregations, or different views of the data.
But the best thing is: WHEN YOU ADD SOMETHING TO THE BASIC LOGIC OF YOUR
SYSTEM, YOU ONLY CHANGE THE BASE PREPARED STATEMENT. If you wrote it
cleverly, then all statements built upon it WILL IMMEDIATELY SEE THE NEW
IMPLEMENTATION. This is a very powerful and efficient way of introducing
improvements into the system without having to worry that the majority of
your reports will show false data until you find time to rework them.
NOTE: You don't want to manipulate huge database systems this way, but
it's very usable for 99% of small and medium business apps (databases up
to a few gigabytes). It greatly depends on your internal database
representation, though.
Best regards,
Milan Oparnica