Re: Calculating archivedlog file size per day

For each log switch there is one record in the gv$loghist view. So if
I have 5 records in this view for a given day, then 5 log switches
happened:
Count(*) = 5
Each log file size = 100MB
So total redo generated = 5 * 100 = 500MB

So I have to MULTIPLY.
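The arithmetic above can be expressed as a single query. This is only a
sketch of the estimate being discussed, assuming a fixed 100MB online redo
log size, and it will overstate actual redo because partially filled logs
(manual switches, shutdowns) are counted as full:

```sql
-- Rough per-day redo estimate: log switches per day times the
-- fixed 100MB online redo log size. Upper bound only.
select trunc(first_time) day,
       count(*) * 100    est_size_in_mb
  from gv$loghist
 group by trunc(first_time)
 order by 1;
```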

Yes. I am just trying to calculate an approximate size (ignoring manual
log switches, db shutdowns, etc.) to get an estimate.

Correct me if I am wrong.

Thanks again.
-Sami

On Sun, 13 Mar 2005 19:41:54 -0700, Tim Gorman <tim_at_evdbt.com> wrote:
> > I want to calculate archivedlog file size per day to estimate backup
> > storage area for the database which is currently operating in
> > NOARCHIVELOG MODE. My redo log filesize is 100MB and below is the
> > query.
> >
> > select trunc(FIRST_TIME),count(*)*100 size_in_MB
> > from gv$loghist
> > group by trunc(FIRST_TIME);
> >
> You should divide by 100, not multiply by 100, to have your answer displayed
> in Mbytes.
>
> This query might give a rough worst-case estimate, but the amount of redo
> actually written to each redo logfile sequence is not stored in this view.
> See below...
>
> > If the database is in ARCHIVELOG mode, then I can use the below query
> > to calculate the same (of course I can check file system)
> >
> > select trunc(COMPLETION_TIME),count(*)*100 size_in_MB
> > from gv$archived_log
> > group by trunc(COMPLETION_TIME);
> >
> > Just wanted to verify.
> >
> It is not valid to assume that every archived redo logfile will be the size
> defined for the online redo logfiles. Shortened files are created all the
> time (i.e. manual logfile switch, shutdown, etc). A more useful query would
> be:
>
> select trunc(completion_time), sum(blocks*block_size)/1048576 mb
> from gv$archived_log
> group by trunc(completion_time);
>
> Hope this helps...
>
> --
> http://www.freelists.org/webpage/oracle-l