Many infrastructure issues must be addressed in order to avoid surprises
with SQL optimization. Shops that do not create this infrastructure are
plagued by constantly changing SQL execution plans and poor database
performance.

It is very rare for the fundamental nature of a schema to change. Large
tables remain large, and index columns rarely change their distribution,
cardinality, or skew. The DBA should consider periodically re-analyzing
the total schema statistics only if the database meets one of the
following criteria:

§CPU-intensive databases:
Many scientific systems load a small set of experimental data, analyze
the data, produce reports, and then truncate and reload a new set of
experiments. There are also Oracle databases with very large data
buffer caches, which reduce physical I/O at the expense of higher CPU
consumption. For these types of systems, it may be necessary to
re-analyze the schema each time the database is reloaded.

§Highly volatile
databases: In these rare cases, the size of tables and the
characteristics of index column data change radically. For example,
Laboratory Information Management Systems (LIMS) load, analyze, and purge experimental data
so frequently that it is very difficult to always have optimal CBO
statistics. If a database has a table that has 100 rows one week and
10,000 rows the next week, the DBA may want to consider using Oracle10g
dynamic sampling or a periodic re-analysis of statistics, as sketched
just after this list.
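As a minimal sketch of these two approaches, the commands below show a
schema-wide re-analysis with dbms_stats and a dynamic_sampling hint on a
single query. The schema name SCOTT, the table volatile_tab, and the
sampling level 4 are hypothetical placeholders, not values taken from
this text.

   -- Re-analyze the whole schema after a truncate-and-reload cycle.
   -- (Hypothetical schema name; the sampling and histogram options are
   -- common choices, not prescriptions from this text.)
   begin
      dbms_stats.gather_schema_stats(
         ownname          => 'SCOTT',
         estimate_percent => dbms_stats.auto_sample_size,
         method_opt       => 'for all columns size auto',
         cascade          => true);
   end;
   /

   -- For a highly volatile table, ask the CBO to sample the table at
   -- parse time rather than rely on stale statistics.
   -- (volatile_tab, its alias v, and level 4 are hypothetical.)
   select /*+ dynamic_sampling(v 4) */
          v.*
     from volatile_tab v
    where v.status = 'ACTIVE';

The first call would typically be wrapped in a scheduled job that runs
immediately after each reload, while the hint is a per-query alternative
when only a handful of statements touch the volatile table.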

The following section will show how Oracle SQL optimization can be
adjusted to evaluate I/O costs versus CPU costs.