The `:response_cache` plugin makes transparent use of HTTP conditional requests and caching strategies to improve performance and bandwidth usage (at the expense of some memory usage).
When the plugin is enabled, responses which advertise an `etag` or `last-modified` header are kept in a cached responses store. Subsequent requests to the same URI (abiding by the same `vary` header rules) are performed with conditional headers (`if-none-match`, `if-modified-since`…) derived from the cached response.

If the response indicates that the cached entry is still valid (status code `304`), the response returned to the user contains the body from the cached response; otherwise, the new response replaces the cached one in the store and will be used for subsequent requests.

A cached response is skipped by the algorithm above, and dropped from the cache, once it is no longer considered “fresh” (i.e. once it has expired as per its `cache-control` or `expires` header directives).
```ruby
client = HTTPX.plugin(:response_cache)

r1 = client.get("https://nghttp2.org/httpbin/cache")
r1.headers #=> {
# "content-type"=>"application/json",
# "content-length"=>"226",
# "last-modified"=>"Sat, 10 Jun 2023 16:14:35 GMT",
# "etag"=>"33891c807e4448c780f6fa192d486774", ....

r2 = client.get("https://nghttp2.org/httpbin/cache")

r1.status #=> 200
r2.status #=> 304
r1.body == r2.body #=> true

# to wipe out the response cache
client.clear_response_cache
```
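For reference, the sketch below performs the revalidation step manually, which is roughly what the plugin automates: the `etag` from a first response is sent back as an `if-none-match` header on the next request to the same URI (the httpbin endpoint above is reused; the exact headers the server emits may vary).

```ruby
require "httpx"

# First request: fetch the resource and remember its validator header.
response = HTTPX.get("https://nghttp2.org/httpbin/cache")
etag = response.headers["etag"]

# Revalidation: send the validator back as a conditional header.
# If the resource hasn't changed, the server answers 304 with no body,
# and the previously fetched body can be reused.
revalidation = HTTPX.get(
  "https://nghttp2.org/httpbin/cache",
  headers: { "if-none-match" => etag }
)

puts revalidation.status # 304 when the cached representation is still valid
```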
The store is a thread-safe in-memory hashmap. This means that it is only usable within the same process, i.e. its state won’t be shared in a multi-process environment. You can use it across multiple threads though.
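As a rough illustration of that in-process scope, the sketch below shares one plugin-enabled session across a few threads, assuming (as the paragraph above implies) that the cache store can be safely accessed from multiple threads; a separate process would start with an empty cache. The thread count and URL are arbitrary, and the printed statuses are indicative only.

```ruby
require "httpx"

client = HTTPX.plugin(:response_cache)

# Warm the cache once, then hit the same URI from several threads.
# All threads live in the same process, so they revalidate against the
# same in-memory store; a forked worker would not see these entries.
client.get("https://nghttp2.org/httpbin/cache")

threads = 4.times.map do
  Thread.new { client.get("https://nghttp2.org/httpbin/cache").status }
end

puts threads.map(&:value).inspect #=> e.g. [304, 304, 304, 304]
```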
This plugin is not an on-demand, client-driven cache, where the client decides how long a given response is kept. You may want to build your own plugin for that.
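As a starting point, here is a minimal application-level sketch of such a client-driven cache: a TTL-keyed hash wrapped around `get`. It is not an HTTPX plugin, and the `CachedClient` class and `ttl` parameter are made up for illustration; it only shows the kind of policy the `:response_cache` plugin intentionally does not implement.

```ruby
require "httpx"

# Illustrative wrapper (not part of HTTPX): caches successful GET responses
# for a caller-chosen number of seconds, regardless of validator headers.
class CachedClient
  Entry = Struct.new(:response, :expires_at)

  def initialize(client, ttl: 60)
    @client = client
    @ttl = ttl
    @store = {}
    @mutex = Mutex.new
  end

  def get(uri)
    @mutex.synchronize do
      entry = @store[uri]
      return entry.response if entry && Time.now < entry.expires_at
    end

    response = @client.get(uri)
    if response.status == 200
      @mutex.synchronize { @store[uri] = Entry.new(response, Time.now + @ttl) }
    end
    response
  end
end

client = CachedClient.new(HTTPX, ttl: 30)
client.get("https://nghttp2.org/httpbin/cache") # network call
client.get("https://nghttp2.org/httpbin/cache") # served from the TTL cache
```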
Next: Circuit Breaker