Building a Thread-Safe and Scalable Computation Cache with Futures
Memoization is a common optimization technique: remember the result of an expensive operation so that subsequent requests for the same input can be served instantly. In a concurrent environment, however, naive caching can lead to race conditions, duplicated work, or even deadlock. The solution is to combine a concurrent map with the Future abstraction so that only one thread performs the computation while the others wait for its outcome.
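To see why the naive approach fails, consider a minimal sketch (the NaiveCache name and structure are illustrative, not part of the design that follows): the check and the put are separate steps, so two threads that miss at the same time both run the computation.

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Hypothetical naive cache: the map itself is thread-safe, but the
// check-then-act sequence in get() is not atomic, so two threads that
// miss simultaneously both execute the expensive computation.
public class NaiveCache {
    private final Map<String, Integer> store = new ConcurrentHashMap<>();

    public Integer get(String key) {
        Integer cached = store.get(key);   // step 1: check
        if (cached == null) {
            cached = compute(key);         // step 2: compute (may run twice!)
            store.put(key, cached);        // step 3: publish
        }
        return cached;
    }

    private Integer compute(String key) {
        return Integer.parseInt(key) + 1;  // stands in for an expensive call
    }
}
```

The window between step 1 and step 3 is exactly what the Future-based design below closes.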
- Define a generic computation contract
public interface Function<A, R> {
    R apply(A argument) throws InterruptedException;
}
- Implement a lock-free memoizer
import java.util.Map;
import java.util.concurrent.*;

public final class AsyncCache<A, R> implements Function<A, R> {
    private final Map<A, Future<R>> store = new ConcurrentHashMap<>();
    private final Function<A, R> delegate;

    public AsyncCache(Function<A, R> delegate) {
        this.delegate = delegate;
    }

    @Override
    public R apply(A key) throws InterruptedException {
        while (true) {
            Future<R> slot = store.get(key);
            if (slot == null) { // cache miss
                Callable<R> task = () -> delegate.apply(key);
                FutureTask<R> promise = new FutureTask<>(task);
                slot = store.putIfAbsent(key, promise);
                if (slot == null) { // we won the race
                    slot = promise;
                    promise.run(); // compute on the calling thread
                }
            }
            try {
                return slot.get(); // await result
            } catch (CancellationException e) {
                store.remove(key, slot); // evict and retry if cancelled
            } catch (ExecutionException e) {
                store.remove(key, slot); // evict so failures are not cached
                throw launder(e.getCause());
            }
        }
    }

    private static RuntimeException launder(Throwable t) {
        if (t instanceof RuntimeException) return (RuntimeException) t;
        if (t instanceof Error) throw (Error) t;
        throw new IllegalStateException("Unexpected exception", t);
    }
}
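On Java 8 and later, the miss path above can be compressed with ConcurrentHashMap.computeIfAbsent, which installs the entry atomically. A hedged sketch (the CompactCache and Compute names are illustrative, not part of the class above):

```java
import java.util.Map;
import java.util.concurrent.*;

// Sketch (Java 8+): computeIfAbsent installs the FutureTask atomically,
// so the manual putIfAbsent retry dance on a miss disappears.
public final class CompactCache<A, R> {

    @FunctionalInterface
    public interface Compute<A, R> {           // local stand-in for the
        R apply(A argument) throws Exception;  // article's Function contract
    }

    private final Map<A, Future<R>> store = new ConcurrentHashMap<>();
    private final Compute<A, R> delegate;

    public CompactCache(Compute<A, R> delegate) {
        this.delegate = delegate;
    }

    public R apply(A key) throws Exception {
        // A fresh task is created even on a hit (cheap, never started);
        // computeIfAbsent installs it only if the key is absent.
        FutureTask<R> fresh = new FutureTask<>(() -> delegate.apply(key));
        Future<R> slot = store.computeIfAbsent(key, k -> fresh);
        if (slot == fresh) {
            fresh.run(); // only the installer starts the computation
        }
        return slot.get();
    }
}
```

Running the task after computeIfAbsent returns, rather than inside the mapping function, avoids holding the map's internal bin lock for the duration of the computation.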
- Demonstrate the cache in action
public class Demo {
    public static void main(String[] args) throws InterruptedException {
        // Function.apply declares InterruptedException, so the lambda may
        // let Thread.sleep's interrupt propagate instead of swallowing it.
        Function<String, Integer> parser = s -> {
            Thread.sleep(1000); // simulate latency
            return Integer.parseInt(s) + 1;
        };
        AsyncCache<String, Integer> cache = new AsyncCache<>(parser);
        System.out.println(cache.apply("23")); // 24 after ~1 s
        System.out.println(cache.apply("23")); // 24 instantly (cached)
    }
}
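The single-computation guarantee can also be checked empirically. The harness below (illustrative; it inlines the FutureTask-in-map idiom so it is self-contained) races several threads on one key and counts how often the computation body actually runs:

```java
import java.util.concurrent.*;
import java.util.concurrent.atomic.AtomicInteger;

// Minimal harness: four threads race on one key; the FutureTask-in-map
// pattern guarantees the expensive computation body runs exactly once.
public class RaceCheck {
    static final ConcurrentHashMap<String, FutureTask<Integer>> store = new ConcurrentHashMap<>();
    static final AtomicInteger calls = new AtomicInteger();

    static int lookup(String key) throws Exception {
        FutureTask<Integer> task = new FutureTask<>(() -> {
            calls.incrementAndGet();      // count real computations
            Thread.sleep(200);            // simulate latency
            return Integer.parseInt(key) + 1;
        });
        FutureTask<Integer> prior = store.putIfAbsent(key, task);
        if (prior == null) { prior = task; task.run(); } // we won the race
        return prior.get();               // all threads await the same future
    }

    public static void main(String[] args) throws Exception {
        Thread[] workers = new Thread[4];
        for (int i = 0; i < workers.length; i++) {
            workers[i] = new Thread(() -> {
                try {
                    System.out.println(lookup("23")); // every thread prints 24
                } catch (Exception e) {
                    throw new RuntimeException(e);
                }
            });
            workers[i].start();
        }
        for (Thread w : workers) w.join();
        System.out.println("computations: " + calls.get()); // exactly 1
    }
}
```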
The AsyncCache guarantees that:
- Only one thread performs the expensive computation for a given key.
- Concurrent readers block efficiently via Future.get() instead of busy-waiting.
- Stale or failed entries can be evicted transparently.