Here is the situation:
I have new data arriving hourly in files that
needs to be (1) loaded into the database, and
(2) once loaded, queried to construct more
summarized data (stored in another table).

Currently, I use PHP to read the files and insert
the rows into the database. However, this ties up
a database connection and depends on Apache
running continuously.

I am looking at other possible languages that will
(1) connect to the database directly (bypassing Apache), and
(2) have some SQL query capability (the summarized
data depends on a GROUP BY clause).

I may be wrong here, but I do believe you can set up cron jobs for this (if it's *nix). You'd basically tell cron to run a task on a specified time schedule, and give it a script to follow. I hope I didn't give you wrong info, but I know that's how I'd handle automated PHP stuff.
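For what it's worth, the cron entry itself is a one-liner; the script name and log path below are just hypothetical placeholders (you'd add this with `crontab -e`):

```shell
# Run the (hypothetical) loader script at the top of every hour,
# appending its output and errors to a log file.
0 * * * * /usr/local/bin/load_hourly.sh >> /var/log/load_hourly.log 2>&1
```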

Thanks for the reply. I apologize for not describing my problem more clearly. Please let me explain.

We are currently unable to connect to an outside database via ODBC. We need to get data out of that machine and into our new database. The best we can do right now is export the hourly data off the old DB, FTP it over as a file, and then load it into our new database (I know... it's not efficient, but it'll have to do for now).

Right now I'm loading those files into the database with a PHP script (file -> php/apache -> mysql). The problem is, this takes resources from Apache and connection resources from MySQL. I'm looking for any suggestion on what language I could use to bypass php/apache (file -> mysql).

What makes this more difficult is that, besides loading the file data into the database via the mysqlimport utility, we need to run queries on the new data and generate other sets of data from it.
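Just to illustrate the load step you mentioned: mysqlimport can do it with no PHP involved at all. The database, credentials, and file name below are made-up placeholders:

```shell
# Load /tmp/hourly_data.txt into the `hourly_data` table of db `mydb`.
# mysqlimport derives the table name from the file name.
# --local reads the file from the client machine.
mysqlimport --local --fields-terminated-by='\t' \
    --user=loader --password=secret \
    mydb /tmp/hourly_data.txt
```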

One of the reasons I'm summarizing the data with PHP is that PHP can query the database, get the data out, summarize it, and insert or update another table in the database. This process runs every hour.
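Note that this hourly summarize step doesn't strictly need PHP either; the mysql command-line client can run the whole thing as one statement. The table and column names here are invented for illustration, assuming `hourly_summary` has a unique key on `log_hour`:

```shell
# Rebuild the hourly summary entirely inside MySQL:
# an INSERT ... SELECT with GROUP BY, updated in place on reruns.
mysql --user=loader --password=secret mydb <<'SQL'
INSERT INTO hourly_summary (log_hour, total)
    SELECT DATE_FORMAT(created_at, '%Y-%m-%d %H:00:00'), COUNT(*)
    FROM hourly_data
    GROUP BY 1
ON DUPLICATE KEY UPDATE total = VALUES(total);
SQL
```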

However, if for any reason I need to take Apache down, not only can't the users see the summarized data, we can't even load the new data into the database at all.

Is there any shell-level language out there that has database query capability?

Einziger is right here. You could use cron on *nix, or AT on M$ (schedule service) to schedule your dumps, queries or whatever you want.
Create a cron/at job that runs a script every hour. In that script, first connect to MySQL and run a query (or a dump of specific tables, a whole DB, or whatever you need) to get the information into a file. You could use something like SELECT ... INTO OUTFILE if you want summarized data in the target file. Then the script FTPs this file to the target machine. On the target machine, create a corresponding cron/at job that reads this file into the database. No manual work is needed....
In this case you're depending only on MySQL and cron/at being up, and you won't put any extra load on Apache/PHP.
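To sketch the source-side script I described above (hosts, credentials, table names, and paths are all hypothetical, and the SELECT ... INTO OUTFILE writes the file on the MySQL server host):

```shell
#!/bin/sh
# Hourly export-and-ship script, run from cron on the old DB host.

OUTFILE=/tmp/hourly_$(date +%Y%m%d%H).txt

# 1. Dump the last hour's rows to a tab-delimited file.
mysql --user=exporter --password=secret olddb <<SQL
SELECT * INTO OUTFILE '$OUTFILE'
FROM source_table
WHERE created_at >= NOW() - INTERVAL 1 HOUR;
SQL

# 2. FTP the file to the new database host.
ftp -n newdb.example.com <<EOF
user loader secret
put $OUTFILE
bye
EOF
```

The matching cron job on the target machine would then pull the file in with mysqlimport or LOAD DATA INFILE, as discussed earlier in the thread.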