Easy solutions for Caching in the JVM

Bounded LRU Cache

A common use case: a cache data structure that:

  • has a bounded maximum size and thus memory consumption (which in turn implies a cache eviction policy like LRU)
  • is safe and performant for concurrent use

AFAICT, there’s nothing in the JDK or the concurrency JSRs that meets both of these requirements. I googled and found the open source ConcurrentLinkedHashMap library, which does.
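To make the gap concrete, here is a minimal sketch of a bounded LRU cache built from JDK types only, using `LinkedHashMap` in access-order mode with `removeEldestEntry`. Wrapping it in `Collections.synchronizedMap` makes it safe for concurrent use, but every operation takes a single lock, which is exactly the scalability problem a concurrent implementation like ConcurrentLinkedHashMap solves.

```java
import java.util.Collections;
import java.util.LinkedHashMap;
import java.util.Map;

// Minimal bounded LRU cache using only the JDK. Thread-safe via one
// coarse lock, so it serializes all access -- fine for illustration,
// but not "performant for concurrent use".
public class BoundedLruCache {
    public static <K, V> Map<K, V> create(final int maxSize) {
        // accessOrder=true: iteration order is least-recently-used first
        Map<K, V> lru = new LinkedHashMap<K, V>(16, 0.75f, true) {
            @Override
            protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
                // Evict the least-recently-used entry once the bound is exceeded
                return size() > maxSize;
            }
        };
        return Collections.synchronizedMap(lru);
    }

    public static void main(String[] args) {
        Map<String, Integer> cache = create(2);
        cache.put("a", 1);
        cache.put("b", 2);
        cache.get("a");    // touch "a", so "b" becomes the eldest entry
        cache.put("c", 3); // exceeds the bound: "b" is evicted
        System.out.println(cache.keySet()); // [a, c]
    }
}
```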

[An incorrect comment about the project missing tests has been removed]

Unbounded Garbage Collectable Cache

If, instead, you want a cache that grows with use but can be reclaimed when memory is short, then a ConcurrentReferenceHashMap configured with SoftReferences is a good solution.
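If that library isn’t available, the same idea can be sketched with plain JDK types: store `SoftReference` values in a `ConcurrentHashMap`, so the garbage collector may clear entries under memory pressure. This is a simplified sketch; cleared entries are only pruned lazily on lookup, whereas a production cache would also drain a `ReferenceQueue` to remove stale map entries eagerly.

```java
import java.lang.ref.SoftReference;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Sketch of a GC-reclaimable cache: values are softly reachable, so
// the collector may clear them when memory runs short. Entries whose
// values have been collected are pruned lazily on get().
public class SoftValueCache<K, V> {
    private final Map<K, SoftReference<V>> map = new ConcurrentHashMap<>();

    public void put(K key, V value) {
        map.put(key, new SoftReference<>(value));
    }

    public V get(K key) {
        SoftReference<V> ref = map.get(key);
        if (ref == null) {
            return null;
        }
        V value = ref.get();
        if (value == null) {
            map.remove(key, ref); // value was collected; drop the stale entry
        }
        return value;
    }
}
```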

2 Comments

  1. ben manes said,

    June 22, 2010 at 7:40 pm

    Ben expressed his concern about a lack of tests over email. There are in fact ample tests and they are located under unittest/ in the release branch:

    http://code.google.com/p/concurrentlinkedhashmap/source/browse/tags/release-1.0-LRU/

    Thanks for the independent testing and contacting me with your concerns. All feedback is appreciated!

    Best,
    Ben

  2. Ben Manes said,

    September 28, 2010 at 3:19 pm

    FYI,

    We’ve completed integration of ConcurrentLinkedHashMap’s algorithm into MapMaker. This will be available in Guava r08.

    This allows the combination of #maximumSize() to bound the cache in the common case, *plus* soft references to automatically evict if under memory pressure. The combination allows graceful degradation rather than OOME, while not suffering the penalty of soft references in the common case.

