Basics of Java Caching System (JCS)

Today, caching is considered an important aspect of Java application design. Its main purpose is to improve performance. Java Caching System (JCS) is a distributed caching framework written in Java. It supports both memory and disk caches, and it is fully configurable through a properties file. JCS provides mechanisms for storing data in the cache, retrieving data from the cache, and removing data from the cache.
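A minimal usage sketch of those store/retrieve/remove operations, assuming the Apache Commons JCS 2.x library is on the classpath along with a cache.ccf configuration file (the region name "default" and the key/value below are only illustrative):

```java
import org.apache.commons.jcs.JCS;
import org.apache.commons.jcs.access.CacheAccess;

public class JcsBasics {
    public static void main(String[] args) {
        // Obtain a cache region; its settings come from cache.ccf on the classpath
        CacheAccess<String, String> cache = JCS.getInstance("default");

        cache.put("city", "Paris");       // store data in the cache
        String city = cache.get("city");  // retrieve data from the cache
        System.out.println(city);

        cache.remove("city");             // remove data from the cache
    }
}
```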

In most applications, data is retrieved from a database, which is a time-consuming and expensive operation. Modern applications are data intensive, and if an application accesses the database for every request, its performance degrades. For this reason, many web applications adopt caching and similar design techniques to reduce response time and scale up.

What is JCS?

JCS is a composite, distributed caching system that works on JDK 1.4 and higher. The following are some of its main features:

Thread Pool Control

Memory Management

Key Based Data Retrieval

Extensible Framework

Distributed Composite Cache

Element Event Handling

JCS – The Composite Cache

Each cache region can be backed by up to four types of cache, and any combination of them can be active in JCS. The four types are:

1. Memory Cache (LRU)

This is the fastest and most widely used basic caching mechanism. It uses a Least Recently Used (LRU) algorithm to manage the objects stored in memory, and it is built on JCS's own LRU map implementation, which the project reports to be faster than general-purpose Map implementations.
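To illustrate the LRU eviction policy itself (not JCS's own optimized map), here is a minimal sketch using the JDK's LinkedHashMap in access order; the class name and capacity are illustrative:

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Minimal LRU cache sketch: LinkedHashMap with accessOrder=true keeps
// entries ordered from least to most recently accessed, and
// removeEldestEntry evicts the least recently used one past capacity.
public class LruCacheSketch<K, V> extends LinkedHashMap<K, V> {
    private final int maxEntries;

    public LruCacheSketch(int maxEntries) {
        super(16, 0.75f, true); // accessOrder = true -> LRU ordering
        this.maxEntries = maxEntries;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        return size() > maxEntries; // evict the least recently used entry
    }

    public static void main(String[] args) {
        LruCacheSketch<String, Integer> cache = new LruCacheSketch<>(2);
        cache.put("a", 1);
        cache.put("b", 2);
        cache.get("a");    // "a" becomes the most recently used entry
        cache.put("c", 3); // evicts "b", the least recently used entry
        System.out.println(cache.keySet()); // prints [a, c]
    }
}
```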

2. Disk Cache

When the memory cache is full, the disk cache is used to store data. Performance is much better when the disk cache is indexed. The disk cache is fully configurable, and a thread pool is used to manage the queue worker threads, which speeds up processing.
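As an illustration, a cache.ccf fragment along these lines configures an LRU memory cache that overflows to an indexed disk cache (the object limit and disk path are placeholders; class names follow the Apache Commons JCS 2.x packages):

```properties
# Default region: keep at most 1000 objects in memory (LRU),
# overflow to the indexed disk cache auxiliary named "DC"
jcs.default=DC
jcs.default.cacheattributes=org.apache.commons.jcs.engine.CompositeCacheAttributes
jcs.default.cacheattributes.MaxObjects=1000
jcs.default.cacheattributes.MemoryCacheName=org.apache.commons.jcs.engine.memory.lru.LRUMemoryCache

# Indexed disk cache auxiliary
jcs.auxiliary.DC=org.apache.commons.jcs.auxiliary.disk.indexed.IndexedDiskCacheFactory
jcs.auxiliary.DC.attributes=org.apache.commons.jcs.auxiliary.disk.indexed.IndexedDiskCacheAttributes
jcs.auxiliary.DC.attributes.DiskPath=/tmp/jcs/disk-cache
```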

3. Remote Region

Instead of each node maintaining a connection to every other node, a remote cache server acts as a single connection point for all of them. This cache region provides caching over the Remote Method Invocation (RMI) API.
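A hedged sketch of a cache.ccf fragment that registers a remote cache auxiliary pointing at such a server (the host name and port are placeholders; class names follow the Apache Commons JCS 2.x packages):

```properties
# Remote cache auxiliary "RC": clients connect to a central
# remote cache server over RMI
jcs.auxiliary.RC=org.apache.commons.jcs.auxiliary.remote.RemoteCacheFactory
jcs.auxiliary.RC.attributes=org.apache.commons.jcs.auxiliary.remote.RemoteCacheAttributes
jcs.auxiliary.RC.attributes.RemoteHosts=cacheserver.example.com:1101
```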

4. Lateral Region

The lateral region provides an easy way to distribute cache data across multiple servers. Each cache data server must have an open port to listen on, and socket connections must be created between the servers.
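A hedged sketch of a cache.ccf fragment for a lateral TCP auxiliary, where each node listens on its own port and pushes updates to its peers (host names and ports are placeholders; class names follow the Apache Commons JCS 2.x packages):

```properties
# Lateral TCP cache auxiliary "LTCP": this node listens on
# TcpListenerPort and replicates updates to the peers in TcpServers
jcs.auxiliary.LTCP=org.apache.commons.jcs.auxiliary.lateral.socket.tcp.LateralTCPCacheFactory
jcs.auxiliary.LTCP.attributes=org.apache.commons.jcs.auxiliary.lateral.socket.tcp.TCPLateralCacheAttributes
jcs.auxiliary.LTCP.attributes.TcpServers=peer1.example.com:1110,peer2.example.com:1110
jcs.auxiliary.LTCP.attributes.TcpListenerPort=1110
```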