We're currently not sending any cache policy headers (e.g. Cache-Control) in query responses, and we probably should. For example, a query for an event with a particular ID should be highly cacheable: maybe not indefinitely (sites may choose to delete events), but certainly for a while. Maybe the cache lifetime should be configurable? I'm not sure whether there are other cases where we can determine that a query result won't change over time. Maybe not, but it's worth thinking about.
Similarly, if there are queries whose results are clearly not cacheable, we should state that in the cache policy headers as well.
Motivation
This would allow proxies to cache responses to frequent queries, decreasing the load on the Goer service and the underlying database.
Exemplification
If I have a service that frequently performs particular lookups, I wouldn't have to implement custom caching inside the application; I could just set up a standard caching proxy (Varnish, NGINX, ...) and route outbound requests through it.
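For illustration, a sketch of such a proxy setup with NGINX. The hostnames, ports, and cache zone name are made up; the key point is that once Goer sends Cache-Control headers, a stock proxy honors them with no application changes, since NGINX's proxy cache respects upstream Cache-Control and Expires headers by default.

```nginx
# Hypothetical reverse proxy in front of a Goer backend.
proxy_cache_path /var/cache/nginx keys_zone=goer_cache:10m;

server {
    listen 8080;

    location / {
        proxy_pass http://goer-backend:9000;
        proxy_cache goer_cache;
        # No extra directives needed: cacheability and lifetime are
        # driven by the Cache-Control headers the backend sends, and
        # responses marked no-store / no-cache are not cached.
    }
}
```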
Benefits
Paving the way for better caching improves latency for clients and decreases the load on Goer.
Possible Drawbacks
None.