IDisposable and Scoped values
All cache classes in BitFaster.Caching own the lifetime of cached values and will automatically dispose values when they are evicted.
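For example, a value that is removed from the cache is disposed by the cache itself. The following minimal sketch assumes the MyDisposable class defined later on this page:
var cache = new ConcurrentLru<int, MyDisposable>(10);
var value = new MyDisposable();

cache.AddOrUpdate(0, value);
cache.TryRemove(0);

// the cache disposed the value when it was removed
Console.WriteLine(value.isDisposed); // True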
This makes it convenient to dispose objects on removal, but without additional defensive code the caller is exposed to races that can result in unexpected failures. Consider this code, which will throw InvalidOperationException:
public async Task Race()
{
    using var source = new CancellationTokenSource();
    var cache = new ConcurrentLru<int, MyDisposable>(10);

    // 1. background: add and remove the same value
    var t = Task.Run(() =>
    {
        while (!source.Token.IsCancellationRequested)
        {
            cache.AddOrUpdate(0, new MyDisposable());
            cache.TryRemove(0);
        }
    });

    try
    {
        // 2. foreground: read the value and use it assuming it is not disposed
        while (true)
        {
            if (cache.TryGet(0, out var d))
            {
                d.Use();
            }
        }
    }
    finally
    {
        source.Cancel();
        await t;
    }
}

public class MyDisposable : IDisposable
{
    public bool isDisposed = false;

    public void Use() { if (isDisposed) throw new InvalidOperationException(); }

    public void Dispose() { isDisposed = true; }
}
This code fails because of a race:

- Foreground thread: the cache returns the value, storing a reference in d.
- Background thread: the value is removed from the cache and disposed.
- Foreground thread: attempt to use d, which is now disposed.
To avoid races in which objects are used after they have been disposed by the cache, use IScopedCache, which wraps values in Scoped<T>. The call to ScopedGetOrAdd creates a Lifetime that guarantees the scoped object will not be disposed until the lifetime is disposed. The scoped cache is thread safe and guarantees correct disposal when multiple lifetimes are held concurrently.
var lru = new ConcurrentLruBuilder<int, SomeDisposable>()
    .WithCapacity(128)
    .AsScopedCache()
    .Build();

var valueFactory = new SomeDisposableValueFactory();

using (var lifetime = lru.ScopedGetOrAdd(1, valueFactory.Create))
{
    // lifetime.Value is guaranteed to be alive until the lifetime is disposed
}

class SomeDisposableValueFactory
{
    public Scoped<SomeDisposable> Create(int key)
    {
        return new Scoped<SomeDisposable>(new SomeDisposable(key));
    }
}
In the following sequence of operations, Threads A and B both look up an IDisposable object in the cache and hold a lifetime. Thread C then removes the object from the cache while it is in use. Each thread's lifetime instance ensures that the object stays alive until all threads have disposed their lifetimes.
sequenceDiagram
    autonumber
    participant Thread A
    participant Thread B
    participant Thread C
    participant Cache
    participant Scope
    participant Lifetime Cache
    participant Lifetime A
    participant Lifetime B
    Thread A->>Cache: A calls ScopedGetOrAdd
    Cache->>Scope: create scope
    activate Scope
    Scope-->>Object: create object
    activate Object
    Cache-->Lifetime Cache: cache holds lifetime
    activate Lifetime Cache
    Cache-->Lifetime A: creates A's lifetime
    activate Lifetime A
    Lifetime A-->>Thread A: A holds lifetime
    Thread B->>Cache: B calls ScopedGetOrAdd
    Cache-->Lifetime B: creates B's lifetime
    activate Lifetime B
    Lifetime B-->>Thread B: B holds lifetime
    Thread C->>Cache: C calls TryRemove
    Cache--xLifetime Cache: Cache removes the object and disposes lifetime
    deactivate Lifetime Cache
    Thread A->>Object: Thread A uses the object
    Lifetime A->>Thread A: A disposes lifetime
    Lifetime A--xScope: lifetime de-refs scope
    deactivate Lifetime A
    Thread B->>Object: Thread B uses the object
    Lifetime B->>Thread B: B disposes lifetime
    Lifetime B--xScope: B de-refs scope
    deactivate Lifetime B
    Scope--xObject: dispose object
    deactivate Object
    deactivate Scope
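The scenario in the diagram can also be sketched in code. This is an illustrative sketch only: it reuses the MyDisposable class from the first example, the ConcurrentLifetimes method name is hypothetical, and no attempt is made to force the exact interleaving shown above.
public async Task ConcurrentLifetimes()
{
    var cache = new ConcurrentLruBuilder<int, MyDisposable>()
        .WithCapacity(128)
        .AsScopedCache()
        .Build();

    // Threads A and B: take a lifetime and use the value while holding it
    Task UseValue() => Task.Run(() =>
    {
        using (var lifetime = cache.ScopedGetOrAdd(1, k => new Scoped<MyDisposable>(new MyDisposable())))
        {
            // Safe even if another thread removes key 1: the underlying scope
            // is not disposed until every outstanding lifetime is disposed.
            lifetime.Value.Use();
        }
    });

    // Thread C: remove the value while A and B may still hold lifetimes
    Task RemoveValue() => Task.Run(() => cache.TryRemove(1));

    await Task.WhenAll(UseValue(), UseValue(), RemoveValue());
}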
Pooling is a popular technique for reducing memory allocations. It can be implemented with IDisposable wrappers that return objects to the pool when the wrapper is disposed. Scoped caches are well suited to such pooling implementations: a lifetime guarantees that the cached object will not be disposed, and therefore not returned to the pool, while it is in use.
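As a rough sketch of that pattern, the cached wrapper below returns its underlying array to System.Buffers.ArrayPool<byte> when it is finally disposed by the cache; the PooledBuffer type is hypothetical and not part of BitFaster.Caching. Lifetimes keep the buffer checked out for as long as any caller is using it.
var cache = new ConcurrentLruBuilder<int, PooledBuffer>()
    .WithCapacity(128)
    .AsScopedCache()
    .Build();

using (var lifetime = cache.ScopedGetOrAdd(1, k => new Scoped<PooledBuffer>(new PooledBuffer(4096))))
{
    // The buffer cannot be returned to the pool while this lifetime is held.
    var buffer = lifetime.Value.Buffer;
}

public class PooledBuffer : IDisposable
{
    public byte[] Buffer { get; }

    public PooledBuffer(int size)
    {
        // rent from the shared pool instead of allocating
        Buffer = ArrayPool<byte>.Shared.Rent(size);
    }

    // Invoked via Scoped<PooledBuffer> only after the cache entry is removed
    // and every outstanding lifetime has been disposed.
    public void Dispose()
    {
        ArrayPool<byte>.Shared.Return(Buffer);
    }
}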