By and by, g1smd or someone like him is going to come along and say that it all depends on which version of Joomla's htaccess you're using and how much you've modified it. (This applies to any CMS that relies heavily on rewrites.) Things that worked fine when you had 100 people a day reading a total of 300 articles may no longer work so well when you've got 10,000 people an hour searching through half a million files.

Let's not get carried away here. I've had this discussion with Jim Morgan and I would say this is typically a minor effect unless you have a very large number of resources on the page and your site is otherwise I/O-constrained. It is not typically the issue that will slow your site to a crawl.

In my experience with WordPress and Drupal (and Joomla should be no different), what will really kill you is:

1. Very expensive queries
2. Fairly expensive queries that are repeated over and over
3. Expensive function calls that get repeated over and over
4. Front-end load - heavy JavaScript in particular
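Points 2 and 3 share a common mitigation: do the expensive work once and reuse the result. A minimal sketch in Python (the function name and data here are made up for illustration; in PHP you'd do the same thing with a static array or an object cache):

```python
import functools
import time

# Hypothetical stand-in for an expensive per-row lookup that a CMS
# template might repeat once per article on a listing page.
@functools.lru_cache(maxsize=None)
def author_display_name(author_id: int) -> str:
    time.sleep(0.01)  # simulate a slow query or function call
    return f"user-{author_id}"

# 500 article rows, but only 5 distinct authors: the expensive work
# runs 5 times instead of 500.
names = [author_display_name(i % 5) for i in range(500)]
print(author_display_name.cache_info().misses)  # 5
```

The point isn't the caching library - it's that the cost of a repeated call is (calls x unit cost), and collapsing the call count is usually far cheaper than speeding up the call itself.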

Using the default, inefficient rewrites is typically a fairly small hit and a late-stage optimization. That said, a rewrite rule that starts with .* can create havoc in some situations.
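For example, the stock CMS-style htaccess funnels every request through a catch-all pattern, and it's the RewriteCond checks that keep it from swallowing real files. A typical shape (not copied from any particular Joomla htaccess):

```apache
RewriteEngine On
# Without these two conditions, the .* rule below would also send
# requests for images, CSS, and JS through the PHP front controller.
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^(.*)$ index.php?q=$1 [L,QSA]
```

Drop or misorder those conditions and suddenly every static asset on a 50-image gallery page is booting PHP - that's the havoc scenario.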

Generally, how much adding more "pages" to a site affects performance depends less on the number of pages than on how many tables need to be queried and how complex the joins are.

If you're querying a few tables - a URL lookup, a content lookup, and a user/permissions lookup - this will be super fast. Those queries will be on indexed columns, and in the case of the user and content lookups they will typically match on an integer. Lightning fast even with many thousands of users and pages.
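To make that concrete, here's a sketch of that kind of per-request lookup (schema invented for illustration, and sqlite3 standing in for MySQL so the example is self-contained). With an index on the URL column and an integer primary key on the content table, neither lookup degrades much as the tables grow:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE url_alias (path TEXT, content_id INTEGER)")
con.execute("CREATE INDEX idx_path ON url_alias (path)")
con.execute("CREATE TABLE content (id INTEGER PRIMARY KEY, title TEXT)")
con.executemany("INSERT INTO url_alias VALUES (?, ?)",
                ((f"/article-{i}", i) for i in range(100_000)))
con.executemany("INSERT INTO content VALUES (?, ?)",
                ((i, f"Title {i}") for i in range(100_000)))

# The URL lookup should hit the index (the plan mentions idx_path,
# not a full table scan), and the content lookup matches an integer key.
plan = con.execute("EXPLAIN QUERY PLAN SELECT content_id FROM url_alias "
                   "WHERE path = '/article-54321'").fetchall()
cid = con.execute("SELECT content_id FROM url_alias WHERE path = ?",
                  ("/article-54321",)).fetchone()[0]
title = con.execute("SELECT title FROM content WHERE id = ?",
                    (cid,)).fetchone()[0]
```

Run EXPLAIN (or SQLite's EXPLAIN QUERY PLAN, as above) on your CMS's real queries and you can see immediately which ones stay on indexes and which ones scan.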

If you add one plugin to your site, though, that adds another table of data and dramatically complicates the query, performance will take a significant hit.

And in terms of function calls - this can be a real eye-opener if you do some profiling with Xdebug or similar. I had a site where I was using PHP calls to getimagesize() to inject proper height and width attributes into the HTML img tags. The pages were loading slowly (these were gallery pages that might have 50 images), and it turned out that 80% of the processing time was being spent just getting image sizes.

So in that case, grabbing the size data at data entry and storing it in the database with the file path (which I was storing anyway, so just adding two columns to an existing table), rather than generating it on the fly at output, gave a dramatic increase in server efficiency on the PHP end for only a tiny cost on the MySQL end.

Okay, none of that really addresses your question. The main point I'm trying to make is that you can't take a number of pages and say "At this size database, things will slow down." The "other variables" you mention tend to dramatically outweigh just the size of the database.

I'm not sure if you can do this with Joomla, but with Drupal you can use the Devel module to generate fake content for your site, then load test it with zillions of pages under concurrent load and see when it fails. I would imagine you could write a simple script that would do the same for Joomla - autopopulate your DB. Keep adding 10,000 pages and 1,000 users until things grind to a halt.
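The autopopulate script is simple enough to sketch. This version uses sqlite3 and an invented two-table schema purely so the example is self-contained - a real Joomla script would insert into the MySQL content and users tables instead:

```python
import random
import sqlite3
import string

def add_fake_content(con, n_pages=10_000, n_users=1_000):
    """Insert a batch of fake pages and users. Call repeatedly
    between load-test rounds until response times fall over."""
    words = ["".join(random.choices(string.ascii_lowercase, k=8))
             for _ in range(200)]
    con.executemany(
        "INSERT INTO content (title, body) VALUES (?, ?)",
        ((f"Page {i}", " ".join(random.choices(words, k=300)))
         for i in range(n_pages)))
    con.executemany(
        "INSERT INTO users (name) VALUES (?)",
        ((f"user{i}",) for i in range(n_users)))
    con.commit()

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE content (id INTEGER PRIMARY KEY, title TEXT, body TEXT)")
con.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
add_fake_content(con)  # one round: 10,000 pages, 1,000 users
print(con.execute("SELECT COUNT(*) FROM content").fetchone()[0])  # 10000
```

The useful part is the loop discipline: add a fixed batch, rerun the same load test, record the numbers, repeat - that gives you a curve instead of a single anecdote about when the site died.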