Caching of function bindings (input/output) for faster I/O #7310
Keeping it open until all the PRs are merged.
What problem would the feature you're requesting solve? Please describe.
When a function accesses a remote object through an input binding (e.g. a blob from storage) or produces an object through an output binding, the object should be cached so that the next request for it can be served from memory. This improves the latency and throughput of fetching the object compared to going to remote storage every time.
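As a rough illustration of the intended read path (not the actual host API; `BlobCache` and `fetch_remote` are hypothetical names), the binding would consult an in-memory cache before calling remote storage:

```python
# Hypothetical sketch: serve a blob from an in-memory cache when possible,
# falling back to remote storage on a miss and caching the result.
from typing import Callable, Dict, Optional


class BlobCache:
    def __init__(self) -> None:
        self._store: Dict[str, bytes] = {}  # blob URI -> content

    def get(self, uri: str) -> Optional[bytes]:
        return self._store.get(uri)

    def put(self, uri: str, content: bytes) -> None:
        self._store[uri] = content


def read_blob(uri: str, cache: BlobCache,
              fetch_remote: Callable[[str], bytes]) -> bytes:
    """Return the blob contents, avoiding a remote call on a cache hit."""
    cached = cache.get(uri)
    if cached is not None:
        return cached            # cache hit: no round trip to storage
    content = fetch_remote(uri)  # cache miss: fetch once from remote storage
    cache.put(uri, content)
    return content
```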
Describe the solution you'd like
Using the shared memory architecture implemented earlier to exchange data between the host and out-of-proc language workers (#6791), we can maintain a caching layer on top of it. Instead of freeing the shared memory resources after a single use, the caching layer can keep the data around and provide a mechanism for the storage extension to query it before going to remote storage to retrieve an object. This will reduce the time taken to retrieve objects, reduce traffic to upstream storage servers, and give the application better (and likely more predictable) performance.
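A minimal Python sketch of the idea, for illustration only (the actual host-side implementation lives in the C# PRs listed under "Additional context", and all names here are hypothetical): shared memory segments are kept alive and indexed by object key instead of being unlinked after a single transfer, so a later binding request can be satisfied without contacting remote storage.

```python
# Hypothetical sketch of a shared-memory-backed cache: segments persist across
# invocations and are only released when an entry is explicitly evicted.
from multiprocessing import shared_memory
from typing import Dict, Optional, Tuple


class SharedMemoryCache:
    def __init__(self) -> None:
        # object key (e.g. container/blob name) -> (segment handle, content length)
        self._index: Dict[str, Tuple[shared_memory.SharedMemory, int]] = {}

    def put(self, key: str, content: bytes) -> None:
        shm = shared_memory.SharedMemory(create=True, size=len(content))
        shm.buf[:len(content)] = content
        # Keep the handle so the segment stays alive until evicted,
        # rather than freeing it after a single host<->worker transfer.
        self._index[key] = (shm, len(content))

    def get(self, key: str) -> Optional[bytes]:
        entry = self._index.get(key)
        if entry is None:
            return None  # cache miss: caller falls back to remote storage
        shm, length = entry
        return bytes(shm.buf[:length])

    def evict(self, key: str) -> None:
        shm, _ = self._index.pop(key)
        shm.close()
        shm.unlink()  # release the shared memory once the object is dropped
```

In the real cross-process setting, another process would attach to a segment by its name via `shared_memory.SharedMemory(name=...)` rather than reading through the same handle; the sketch only shows the lifetime and lookup behavior the caching layer is meant to provide.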
Describe alternatives you've considered
n/a
Additional context
PRs for the caching layer:
#7280
Azure/azure-webjobs-sdk#2692
Azure/azure-sdk-for-net#20209
Azure/azure-functions-python-worker#844
#7757
PRs for shared memory communication between host and workers (completed):
#6791
#6836
Azure/azure-functions-python-worker#816