
Improve CacheLoader interface #53

Closed
ondbar opened this issue Feb 16, 2016 · 4 comments


@ondbar

ondbar commented Feb 16, 2016

Consider improving the CacheLoader interface for the case where the underlying code/service has an async API.
Currently the async builder naturally expects a lambda (returning a value):

AsyncLoadingCache<Key, Graph> cache = Caffeine.newBuilder()
.maximumSize(10_000)
.expireAfterWrite(10, TimeUnit.MINUTES)
.buildAsync(key -> createExpensiveGraph(key));

The buildAsync method then computes the value on the provided or default executor. However, there is also the case where the value is already available as a CompletableFuture from some underlying service.

Currently we can use future values by implementing a new class, but it could be done in a more user-friendly way.
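For illustration, here is a minimal sketch of that workaround using stand-in types (the simplified CacheLoader interface and the fetchGraphAsync service call are illustrative, not Caffeine's actual API):

```java
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.CompletionException;
import java.util.concurrent.Executor;

// Stand-in for the relevant slice of the CacheLoader contract.
interface CacheLoader<K, V> {
  V load(K key) throws Exception;

  // Default async behavior wraps the synchronous load on the executor.
  default CompletableFuture<V> asyncLoad(K key, Executor executor) {
    return CompletableFuture.supplyAsync(() -> {
      try {
        return load(key);
      } catch (Exception e) {
        throw new CompletionException(e);
      }
    }, executor);
  }
}

// The workaround: a full class that overrides asyncLoad to pass the
// service's future through, leaving load as a stub that only a reload
// would ever reach.
class GraphLoader implements CacheLoader<String, String> {
  @Override
  public CompletableFuture<String> asyncLoad(String key, Executor executor) {
    return fetchGraphAsync(key); // the service already returns a future
  }

  @Override
  public String load(String key) {
    throw new UnsupportedOperationException("only reachable via reload");
  }

  // Illustrative asynchronous service call.
  static CompletableFuture<String> fetchGraphAsync(String key) {
    return CompletableFuture.completedFuture("graph:" + key);
  }
}
```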

@ben-manes
Owner

To catch up from the private email exchange, where we worked out a few of the details before this ticket:

Currently, returning a CompletableFuture directly is unpleasant because it requires implementing asyncLoad (and perhaps asyncLoadAll) on the CacheLoader. That still requires load to be implemented, even though it might do nothing if never called (it is only used on reload). So while workable, the ideal would be an AsyncCacheLoader functional interface so that a (key, executor) -> future lambda could be passed to the builder.

To our surprise, we think this can be implemented elegantly without breaking or invasive changes. It could be released in a minor version as a natural extension to the existing code. The idea is to have CacheLoader extend AsyncCacheLoader and define / redefine the default methods of the asynchronous interface. Then buildAsync could accept either loader, the internal code would be easy to manage, and the flow would be understandable.

A quick stub of the interfaces would look something like,

@FunctionalInterface
interface AsyncCacheLoader<K, V> {
  CompletableFuture<V> asyncLoad(K key, Executor executor);

  default CompletableFuture<Map<K, V>> asyncLoadAll(Iterable<K> keys, Executor executor) {
    throw new UnsupportedOperationException();
  }

  default CompletableFuture<V> asyncReload(K key, V oldValue, Executor executor) { ... }
}

@FunctionalInterface
public interface CacheLoader<K, V> extends AsyncCacheLoader<K, V> {
  V load(K key) throws Exception;

  default Map<K, V> loadAll(Iterable<K> keys) throws Exception { ... }

  default CompletableFuture<V> asyncLoad(K key, Executor executor) {
    // define
  }

  default CompletableFuture<Map<K, V>> asyncLoadAll(Iterable<K> keys, Executor executor) {
    // redefine
  }

  default V reload(K key, V oldValue) throws Exception { ... }

  default CompletableFuture<V> asyncReload(K key, V oldValue, Executor executor) {
    // redefine
  }
}

// both work!
Caffeine.newBuilder().buildAsync(key -> value);
Caffeine.newBuilder().buildAsync((key, executor) -> future);
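As a self-contained illustration of why both shapes would work (using toy stand-ins and a toy buildAsync overload pair, not Caffeine's real classes), overload resolution sorts itself out because a one-argument lambda can only target CacheLoader, while a two-argument lambda can only target AsyncCacheLoader:

```java
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.CompletionException;
import java.util.concurrent.Executor;

// Stand-ins for the proposed interfaces (illustrative, not Caffeine's source).
@FunctionalInterface
interface AsyncCacheLoader<K, V> {
  CompletableFuture<V> asyncLoad(K key, Executor executor);
}

@FunctionalInterface
interface CacheLoader<K, V> extends AsyncCacheLoader<K, V> {
  V load(K key) throws Exception;

  // The synchronous loader gains async behavior by wrapping load.
  @Override
  default CompletableFuture<V> asyncLoad(K key, Executor executor) {
    return CompletableFuture.supplyAsync(() -> {
      try {
        return load(key);
      } catch (Exception e) {
        throw new CompletionException(e);
      }
    }, executor);
  }
}

public class BuildAsyncDemo {
  // Toy overloads standing in for Caffeine.buildAsync
  static <K, V> AsyncCacheLoader<K, V> buildAsync(CacheLoader<K, V> loader) {
    return loader;
  }

  static <K, V> AsyncCacheLoader<K, V> buildAsync(AsyncCacheLoader<K, V> loader) {
    return loader;
  }

  public static void main(String[] args) {
    Executor direct = Runnable::run;

    // synchronous lambda: arity one, resolves to the CacheLoader overload
    AsyncCacheLoader<String, Integer> fromSync =
        buildAsync((String key) -> key.length());
    // asynchronous lambda: arity two, resolves to the AsyncCacheLoader overload
    AsyncCacheLoader<String, Integer> fromAsync =
        buildAsync((String key, Executor e) -> CompletableFuture.completedFuture(key.length()));

    System.out.println(fromSync.asyncLoad("hello", direct).join());  // 5
    System.out.println(fromAsync.asyncLoad("hello", direct).join()); // 5
  }
}
```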

This introduces asyncReload to support refresh, which is asynchronous but wrapped in an executor task by the cache's code. That method will be called by AsyncLoadingCache, while LoadingCache is free to call reload directly. That is preferred so that the reload is atomic within a compute block.
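One plausible shape for that asyncReload default, sketched here with stand-in interfaces (an assumption about the implementation, not the committed code): delegate to the synchronous reload inside a task submitted to the supplied executor.

```java
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.CompletionException;
import java.util.concurrent.Executor;

// Stand-in sketch: CacheLoader might default asyncReload by running the
// synchronous reload on the supplied executor.
interface CacheLoader<K, V> {
  V load(K key) throws Exception;

  default V reload(K key, V oldValue) throws Exception {
    return load(key); // default refresh recomputes from scratch
  }

  default CompletableFuture<V> asyncReload(K key, V oldValue, Executor executor) {
    return CompletableFuture.supplyAsync(() -> {
      try {
        return reload(key, oldValue);
      } catch (Exception e) {
        throw new CompletionException(e);
      }
    }, executor);
  }
}

public class ReloadDemo {
  public static void main(String[] args) {
    CacheLoader<String, Integer> loader = String::length;
    // refresh runs on the given executor; a direct executor for the demo
    System.out.println(loader.asyncReload("caffeine", 0, Runnable::run).join()); // 8
  }
}
```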

A quick pass at how this might work is in the async-loader branch. It needs a lot more effort, but it feels correct so far.

@ben-manes
Owner

A little more work and the existing tests pass, making this mostly done.

The parameterized test generator needs to be updated to create cache variants using an AsyncLoadingCache, and a few new tests need to be added. That will help coverage and provide assurance, though I'm fairly confident the flows work correctly because CacheLoader extends the async version. So if the tests pass now, the more limited API should just work. Then only JavaDoc updates will be necessary before this is mergeable.

@ben-manes
Owner

@ondbar I think this is ready to merge. Would cutting a release (2.2.0) for this feature be helpful for you?

ben-manes added a commit that referenced this issue Feb 18, 2016
Creating an AsyncLoadingCache previously required passing a CacheLoader
to Caffeine.buildAsync(), optionally as a lambda. This is convenient
when the computations are written in a synchronous fashion and the
cache provides the asynchronous wrapping.

That API was clunky when the computation is defined asynchronously,
such as when bubbling up a future from another library. In that case
the asyncLoad default method had to be overridden in addition to load
(which may never be called). It offered all the power needed, but in
a cumbersome fashion.

AsyncCacheLoader resolves this API flaw by providing the asynchronous
computations that a loader can make (asyncLoad, asyncLoadAll, and asyncReload).
It can be used as a lambda for buildAsync, e.g.
(key, executor) -> future.

Backwards compatibility is retained by having the CacheLoader extend
AsyncCacheLoader. The asyncLoad and asyncLoadAll were already defined
as defaults, so only asyncReload was introduced. This approach is
nice by providing symmetry, low integration effort, and minimal impact
to the API.
@ben-manes
Owner

Released 2.2.0 with this API improvement.
