The simplest way to gather statistics in these environments is to use a script or job scheduling tool to regularly run the GATHER_SCHEMA_STATS and GATHER_DATABASE_STATS procedures. The collection frequency should balance the need for accurate optimizer statistics against the processing overhead incurred by the statistics collection itself.
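For example, a scheduled call might look like the sketch below. The job name and schema name (NIGHTLY_STATS, HR) are placeholders, not from the source; DBMS_SCHEDULER.CREATE_JOB and DBMS_STATS.GATHER_SCHEMA_STATS are standard Oracle procedures.

```sql
-- Sketch: schedule a nightly statistics gather for one schema.
-- 'NIGHTLY_STATS' and 'HR' are example names -- substitute your own.
BEGIN
  DBMS_SCHEDULER.CREATE_JOB(
    job_name        => 'NIGHTLY_STATS',
    job_type        => 'PLSQL_BLOCK',
    job_action      => 'BEGIN DBMS_STATS.GATHER_SCHEMA_STATS(ownname => ''HR''); END;',
    repeat_interval => 'FREQ=DAILY; BYHOUR=2',  -- run daily at 02:00
    enabled         => TRUE);
END;
/
```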

If I run GATHER_DATABASE_STATS, do I still need to run GATHER_SCHEMA_STATS?
I guess if we run GATHER_SCHEMA_STATS we can run it in different sessions for different schemas in parallel? That might make it faster than just running GATHER_DATABASE_STATS.
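Yes, one way to sketch that idea: launch a separate session per schema and gather each schema's statistics concurrently. The schema names below are examples only; the DEGREE parameter (a real GATHER_SCHEMA_STATS argument) additionally parallelizes work within each call.

```sql
-- Session 1:
EXEC DBMS_STATS.GATHER_SCHEMA_STATS(ownname => 'SALES', degree => 4);

-- Session 2, started at the same time from another connection:
EXEC DBMS_STATS.GATHER_SCHEMA_STATS(ownname => 'HR', degree => 4);
```

Whether this beats a single GATHER_DATABASE_STATS call depends on available CPU and I/O capacity; the per-schema approach also lets you schedule large schemas separately.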

If data is NOT changing, then there is no need to collect new statistics.
If data changes only a little, there is still no need to collect new statistics.
The rule of thumb is that "a little" means less than 10% of the rows.
Here "data changes" means INSERTs or DELETEs, which affect row counts.

Since the 10g release, a default Oracle database comes with a DBMS_SCHEDULER job that automatically collects new statistics.
You can & should decide whether this is adequate for your database. If not, you need to implement the desired changes to the statistics collection schedule.
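To see whether that automatic job is present and enabled, you can query the scheduler views; the exact mechanism is version-dependent (in 10g it is the GATHER_STATS_JOB scheduler job, while 11g and later use the "auto optimizer stats collection" autotask). A sketch:

```sql
-- 10g: the automatic job is a regular scheduler job.
SELECT job_name, enabled
FROM   dba_scheduler_jobs
WHERE  job_name = 'GATHER_STATS_JOB';

-- 11g and later: it is an automated maintenance task.
SELECT client_name, status
FROM   dba_autotask_client
WHERE  client_name = 'auto optimizer stats collection';

-- 11g+ syntax to disable it, if you prefer your own schedule.
BEGIN
  DBMS_AUTO_TASK_ADMIN.DISABLE(
    client_name => 'auto optimizer stats collection',
    operation   => NULL,
    window_name => NULL);
END;
/
```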