Fading Coder

One Final Commit for the Last Sprint

Building a Thread-Safe and Scalable Computation Cache with Futures

Tech · May 14

Memoization is a common optimization technique: remember the result of an expensive operation so that subsequent requests for the same input can be served instantly. In a concurrent environment, however, naive caching can lead to race conditions, duplicated work, or even deadlock. The solution is to combine a concurrent map with the Future abstraction, so that only one thread performs the computation while the others wait for its outcome.

  1. Define a generic computation contract

public interface Function<A, R> {
    R apply(A argument) throws InterruptedException;
}

  2. Implement a lock-free memoizer

import java.util.concurrent.*;
import java.util.Map;

public final class AsyncCache<A, R> implements Function<A, R> {

    private final Map<A, Future<R>> store = new ConcurrentHashMap<>();
    private final Function<A, R> delegate;

    public AsyncCache(Function<A, R> delegate) {
        this.delegate = delegate;
    }

    @Override
    public R apply(A key) throws InterruptedException {
        while (true) {
            Future<R> slot = store.get(key);
            if (slot == null) {                       // cache miss
                Callable<R> task = () -> delegate.apply(key);
                FutureTask<R> promise = new FutureTask<>(task);

                slot = store.putIfAbsent(key, promise);
                if (slot == null) {                 // we won the race
                    slot = promise;
                    promise.run();                  // trigger computation
                }
            }
            try {
                return slot.get();                  // await result
            } catch (CancellationException e) {
                store.remove(key, slot);            // retry if cancelled
            } catch (ExecutionException e) {
                throw launder(e.getCause());
            }
        }
    }

    private static RuntimeException launder(Throwable t) {
        if (t instanceof RuntimeException) return (RuntimeException) t;
        if (t instanceof Error) throw (Error) t;
        throw new IllegalStateException("Unexpected exception", t);
    }
}
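On Java 8+, the put-if-absent race above can also be resolved with ConcurrentHashMap.computeIfAbsent, which installs the mapping atomically. The sketch below (a hypothetical variant, not part of the original design) reuses the Function interface from step 1; note that the task is deliberately run *after* computeIfAbsent returns, so the computation never executes while the map's internal lock is held:

```java
import java.util.Map;
import java.util.concurrent.*;

public final class AsyncCacheV2<A, R> implements Function<A, R> {

    private final Map<A, Future<R>> store = new ConcurrentHashMap<>();
    private final Function<A, R> delegate;

    public AsyncCacheV2(Function<A, R> delegate) {
        this.delegate = delegate;
    }

    @Override
    public R apply(A key) throws InterruptedException {
        while (true) {
            FutureTask<R> promise = new FutureTask<>(() -> delegate.apply(key));
            // Atomically install our task unless another thread already did.
            Future<R> slot = store.computeIfAbsent(key, k -> promise);
            if (slot == promise) {
                promise.run();                  // we won: run outside the map's lock
            }
            try {
                return slot.get();              // await result
            } catch (CancellationException e) {
                store.remove(key, slot);        // retry if cancelled
            } catch (ExecutionException e) {
                Throwable t = e.getCause();
                if (t instanceof RuntimeException) throw (RuntimeException) t;
                if (t instanceof Error) throw (Error) t;
                throw new IllegalStateException("Unexpected exception", t);
            }
        }
    }
}
```

The trade-off versus the original loop is a throwaway FutureTask allocation on every cache hit, in exchange for eliminating the explicit null-check-then-putIfAbsent dance.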

  3. Demonstrate the cache in action

public class Demo {
    public static void main(String[] args) throws InterruptedException {
        Function<String, Integer> parser = s -> {
            Thread.sleep(1000);                // simulate latency; the interface
                                               // lets us propagate InterruptedException
            return Integer.parseInt(s) + 1;
        };

        AsyncCache<String, Integer> cache = new AsyncCache<>(parser);

        System.out.println(cache.apply("23")); // 24 after ~1 s
        System.out.println(cache.apply("23")); // 24 instantly (cached)
    }
}

The AsyncCache guarantees that:

  • Only one thread performs the expensive computation for a given key.
  • Concurrent readers block efficiently via Future.get() instead of busy-waiting.
  • Stale or failed entries can be evicted transparently.
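The first guarantee is easy to verify empirically. The following sketch (assuming the AsyncCache and Function types above are on the classpath) fires eight threads at the same key and counts how many times the underlying computation actually runs:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.*;
import java.util.concurrent.atomic.AtomicInteger;

public class ConcurrencyDemo {
    public static void main(String[] args) throws Exception {
        AtomicInteger calls = new AtomicInteger();

        Function<String, Integer> slowParse = s -> {
            calls.incrementAndGet();            // count real computations
            Thread.sleep(200);                  // simulate latency
            return Integer.parseInt(s) + 1;
        };

        AsyncCache<String, Integer> cache = new AsyncCache<>(slowParse);

        ExecutorService pool = Executors.newFixedThreadPool(8);
        List<Future<Integer>> results = new ArrayList<>();
        for (int i = 0; i < 8; i++) {
            results.add(pool.submit(() -> cache.apply("23")));
        }
        for (Future<Integer> f : results) {
            System.out.println(f.get());        // every thread sees 24
        }
        pool.shutdown();

        System.out.println("computations: " + calls.get()); // 1, not 8
    }
}
```

All eight submissions resolve to the same Future, so the 200 ms computation runs exactly once while the other seven threads simply block in Future.get().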
