
Caching of function bindings (input/output) for faster I/O #7310

Closed
gohar94 opened this issue Apr 22, 2021 · 1 comment · Fixed by #7280

Comments

gohar94 (Contributor) commented Apr 22, 2021

What problem would the feature you're requesting solve? Please describe.

When a function accesses a remote object via bindings (e.g. a blob from storage) or produces an output object, we should cache the object so that it can be served from memory the next time it is requested, improving the latency and throughput of fetching objects from remote storage.

Describe the solution you'd like

Using the shared memory architecture implemented earlier to exchange data between the host and out-of-proc language workers (#6791), we can maintain a caching layer on top of it. Instead of freeing the shared memory resources after a single use, the caching layer can keep the data around and provide a mechanism for the storage extension to query it before going to remote storage to retrieve an object. This will reduce the time taken to retrieve objects, reduce traffic to upstream storage servers, and provide better (and perhaps more predictable) performance to the application.
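The query-the-cache-before-remote-storage flow described above can be sketched as a simple read-through cache. This is a minimal illustration only: the class and function names (`SharedMemoryCache`, `get_or_fetch`, `fetch_from_remote`) are hypothetical and do not reflect the host's or the storage extension's actual API.

```python
class SharedMemoryCache:
    """Illustrative read-through cache: keeps blob payloads in memory
    instead of freeing them after a single use."""

    def __init__(self):
        # Maps (container, blob_name) -> bytes held in the cache.
        self._store = {}

    def get_or_fetch(self, container, blob_name, fetch_from_remote):
        key = (container, blob_name)
        if key in self._store:
            # Cache hit: serve the object from memory.
            return self._store[key]
        # Cache miss: go to remote storage, then keep the data around.
        data = fetch_from_remote(container, blob_name)
        self._store[key] = data
        return data


# Usage: the second request for the same blob is served from the cache.
calls = []

def fake_remote_fetch(container, blob_name):
    calls.append((container, blob_name))
    return b"blob-bytes"

cache = SharedMemoryCache()
cache.get_or_fetch("inputs", "data.bin", fake_remote_fetch)
cache.get_or_fetch("inputs", "data.bin", fake_remote_fetch)
assert len(calls) == 1  # remote storage was hit only once
```

A real implementation would also need an eviction policy and invalidation when the remote object changes; those concerns are outside this sketch.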

Describe alternatives you've considered

n/a

Additional context

PRs for the caching layer:
#7280
Azure/azure-webjobs-sdk#2692
Azure/azure-sdk-for-net#20209
Azure/azure-functions-python-worker#844
#7757

PRs for shared memory communication between host and workers (completed):
#6791
#6836
Azure/azure-functions-python-worker#816

gohar94 (Contributor, Author) commented Sep 14, 2021

Keeping it open until all the PRs are merged.
