Improve CacheLoader interface #53
To catch up from the private email, where we worked out a few of the details before this ticket: to our surprise, this can be implemented elegantly without breaking or invasive changes. It could be released in a minor version as a natural extension of the existing code. A quick stub of the interfaces would look something like:
@FunctionalInterface
interface AsyncCacheLoader<K, V> {
CompletableFuture<V> asyncLoad(K key, Executor executor);
default CompletableFuture<Map<K, V>> asyncLoadAll(Iterable<K> keys, Executor executor) {
throw new UnsupportedOperationException();
}
default CompletableFuture<V> asyncReload(K key, V oldValue, Executor executor) { ... }
}
@FunctionalInterface
public interface CacheLoader<K, V> extends AsyncCacheLoader<K, V> {
V load(K key) throws Exception;
default Map<K, V> loadAll(Iterable<K> keys) throws Exception { ... }
default CompletableFuture<V> asyncLoad(K key, Executor executor) {
// define
}
default CompletableFuture<Map<K, V>> asyncLoadAll(Iterable<K> keys, Executor executor) {
// redefine
}
default V reload(K key, V oldValue) throws Exception { ... }
default CompletableFuture<V> asyncReload(K key, V oldValue, Executor executor) {
// redefine
}
}
// both work!
Caffeine.newBuilder().buildAsync(key -> value);
Caffeine.newBuilder().buildAsync((key, executor) -> future);
A quick pass to see how this might work is in the async-loader branch. That needs a lot more effort, but it feels correct so far.
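The `// define` placeholder above hints at how the synchronous CacheLoader can satisfy the async contract. Here is a minimal, self-contained sketch of that bridge (not Caffeine's actual implementation; the error-wrapping choice is an assumption): the default asyncLoad runs the synchronous load on the supplied executor.

```java
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.CompletionException;
import java.util.concurrent.Executor;

@FunctionalInterface
interface AsyncCacheLoader<K, V> {
  CompletableFuture<V> asyncLoad(K key, Executor executor);
}

@FunctionalInterface
interface CacheLoader<K, V> extends AsyncCacheLoader<K, V> {
  V load(K key) throws Exception;

  // Assumed default: run the synchronous load on the given executor,
  // wrapping any checked exception so the future completes exceptionally.
  @Override
  default CompletableFuture<V> asyncLoad(K key, Executor executor) {
    return CompletableFuture.supplyAsync(() -> {
      try {
        return load(key);
      } catch (Exception e) {
        throw new CompletionException(e);
      }
    }, executor);
  }
}

class BridgeDemo {
  public static void main(String[] args) {
    // The synchronous lambda alone is enough; asyncLoad comes for free.
    CacheLoader<String, Integer> loader = String::length;
    // A same-thread executor (Runnable::run) keeps the demo deterministic.
    int value = loader.asyncLoad("caffeine", Runnable::run).join();
    System.out.println(value); // prints 8
  }
}
```

Because CacheLoader keeps exactly one abstract method, it remains usable as a lambda even though it now extends AsyncCacheLoader.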
A little more work and the existing tests pass, making this mostly done. The parameterized test generator needs to be updated to create cache variants using an AsyncCacheLoader.
@ondbar I think this is ready to merge. Would cutting a release (2.2.0) for this feature be helpful for you?
Creating an AsyncLoadingCache previously required passing a CacheLoader to Caffeine.buildAsync(), optionally as a lambda. This is convenient when the computation is written in a synchronous fashion and the cache provides the asynchronous wrapping. That API was clunky when the computation is defined asynchronously, such as when bubbling up a future from another library. In that case the asyncLoad default method had to be overridden in addition to load (which might never be called). It offered all the power needed, but in a cumbersome fashion. AsyncCacheLoader resolves this API flaw by exposing the asynchronous computations that a loader can make (asyncLoad, asyncLoadAll, and asyncReload). It can be used as a lambda for buildAsync, e.g. (key, executor) -> future. Backwards compatibility is retained by having CacheLoader extend AsyncCacheLoader. The asyncLoad and asyncLoadAll methods were already defined as defaults, so only asyncReload was introduced. This approach provides symmetry, low integration effort, and minimal impact on the API.
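Since asyncReload is the only newly introduced method, its default can likewise delegate to the synchronous reload. A hedged, self-contained sketch of that idea (Caffeine's real defaults may differ in details such as exception handling):

```java
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.CompletionException;
import java.util.concurrent.Executor;

@FunctionalInterface
interface CacheLoader<K, V> {
  V load(K key) throws Exception;

  // Assumed default: a reload simply loads again, ignoring the old value.
  default V reload(K key, V oldValue) throws Exception {
    return load(key);
  }

  // The one new method: run the synchronous reload on the given executor.
  default CompletableFuture<V> asyncReload(K key, V oldValue, Executor executor) {
    return CompletableFuture.supplyAsync(() -> {
      try {
        return reload(key, oldValue);
      } catch (Exception e) {
        throw new CompletionException(e);
      }
    }, executor);
  }
}

class ReloadDemo {
  public static void main(String[] args) {
    CacheLoader<String, Integer> loader = String::length;
    // Same-thread executor keeps the demo deterministic.
    int refreshed = loader.asyncReload("graph", 0, Runnable::run).join();
    System.out.println(refreshed); // prints 5
  }
}
```

A synchronous loader thus inherits asynchronous refresh behavior without overriding anything, which is what keeps the change non-breaking.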
Released 2.2.0 with this API improvement.
Consider improving the CacheLoader interface for the case where the underlying code/service has an async API.
Currently the async builder naturally expects a lambda (returning a value):
AsyncLoadingCache<Key, Graph> cache = Caffeine.newBuilder()
.maximumSize(10_000)
.expireAfterWrite(10, TimeUnit.MINUTES)
.buildAsync(key -> createExpensiveGraph(key));
The buildAsync method then computes the value on the provided or default executor. However, there is also the case where the underlying value is already available as a CompletableFuture from some underlying service.
Currently we can use future values by implementing a new class, but this could be done in a more user-friendly way.
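To illustrate the mismatch without depending on Caffeine itself: suppose a hypothetical GraphService (an assumed name, not part of any real library) already returns futures. A value-returning loader forces a blocking join, while a (key, executor) -> future shape can forward the service's future untouched:

```java
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.Executor;
import java.util.function.BiFunction;

// Hypothetical stand-in for an underlying service with an async API.
class GraphService {
  CompletableFuture<String> fetchGraph(String key) {
    return CompletableFuture.completedFuture("graph:" + key);
  }
}

class AsyncShapeDemo {
  public static void main(String[] args) {
    GraphService service = new GraphService();

    // With a value-returning loader, the caller must block on the
    // future just so the cache can wrap the result in a future again:
    String blocked = service.fetchGraph("a").join();

    // With a (key, executor) -> future loader, the service's future is
    // handed over as-is, with no blocking:
    BiFunction<String, Executor, CompletableFuture<String>> asyncLoader =
        (key, executor) -> service.fetchGraph(key);

    System.out.println(blocked);                                      // prints graph:a
    System.out.println(asyncLoader.apply("a", Runnable::run).join()); // prints graph:a
  }
}
```

Both calls produce the same value; the difference is that the second shape never blocks a thread on an already-asynchronous computation.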