Description

When using the file-based cache with a large number of cached pages (in my case over 100,000), the system becomes inefficient: django.core.cache.backends.filebased has a function called _get_num_entries which actually walks the whole cache directory structure counting files.

Maybe setting max_entries to 0 in the settings file could mean an unlimited number of cached files; the _get_num_entries function could then be as follows:

I'm going to wontfix, on the grounds that the filesystem cache is intended as an easy way to test caching, not as a serious caching strategy. The default cache size and the cull strategy implemented by the file cache should make that obvious.

If you need a cache capable of holding 100000 items, I strongly recommend you look at memcache. If you insist on using the filesystem as a cache, it isn't hard to subclass and extend the existing cache.

I have had Django sites of tens of thousands of pages running for over 2 years using the above patches, so your statement about filesystem caching not being a serious strategy is irrelevant. Also, filesystem caching is not comparable to memcached; they solve two completely different problems.

I had the same problem but a couple more requirements. For future readers, I want to mention the DiskCache (http://www.grantjenks.com/docs/diskcache/) project. DiskCache is an Apache2-licensed disk and file backed cache library, written in pure Python, and compatible with Django. There are no dependencies outside the standard library (no managing other processes) and it's efficient enough to handle cache evictions during the request/response cycle (no cron job necessary).
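For reference, the DiskCache documentation shows it plugging into Django's CACHES setting via its `diskcache.DjangoCache` backend; a configuration along these lines (the LOCATION path and size limit are placeholder values):

```python
# settings.py fragment: use DiskCache as the Django cache backend.
CACHES = {
    'default': {
        'BACKEND': 'diskcache.DjangoCache',
        'LOCATION': '/path/to/cache/directory',
        'TIMEOUT': 300,  # default entry timeout, in seconds
        # Backend-specific options; size-based culling replaces
        # the max_entries counting that motivated this ticket.
        'OPTIONS': {
            'size_limit': 2 ** 30,  # 1 GB
        },
    },
}
```

Because eviction is based on total size rather than an entry count, there is no equivalent of the `_get_num_entries` directory walk on each write.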