<p><b>Abstract</b>—This paper presents a technique for minimizing the chip-area cost of implementing an on-chip cache memory for microprocessors. The main idea of the technique is <it>Caching Address Tags</it>, or <it>CAT cache</it> for short. The <it>CAT</it> cache exploits the locality that exists among the addresses of memory references. By keeping only a limited number of distinct tags of cached data, rather than one tag per cache line, the <it>CAT</it> cache can reduce the cost of implementing tag memory by an order of magnitude without a noticeable performance difference from ordinary caches. <it>CAT</it> therefore represents another level of caching for cache memories. Simulation experiments were carried out to evaluate the performance of the <it>CAT</it> cache in comparison with existing caches. Performance results for SPEC92 programs show that the <it>CAT</it> cache, with only a few tag entries, performs as well as ordinary caches, while the chip-area saving is significant. This area saving will increase as the address space of a processor grows. By allocating the saved chip area to larger cache capacity or more powerful functional units, <it>CAT</it> is expected to have a great impact on overall system performance.</p>