Cache `accepts` Instance in Request Methods? #5906
I conducted a performance comparison to evaluate the impact of caching the `accepts` instance.

Benchmark Results

Benchmark without caching:
req.accepts: 854.0032 ms
req.acceptsEncodings: 114.8911 ms
req.acceptsCharsets: 185.8458 ms
req.acceptsLanguages: 222.8439 ms
Benchmark with caching:
req.accepts: 758.2582 ms
req.acceptsEncodings: 55.7752 ms
req.acceptsCharsets: 68.0283 ms
req.acceptsLanguages: 93.0939 ms
Benchmark without caching:
req.accepts: 921.3947 ms
req.acceptsEncodings: 127.5885 ms
req.acceptsCharsets: 193.3242 ms
req.acceptsLanguages: 206.9786 ms
Benchmark with caching:
req.accepts: 808.1724 ms
req.acceptsEncodings: 73.8829 ms
req.acceptsCharsets: 77.4844 ms
req.acceptsLanguages: 134.1124 ms
Benchmark without caching:
req.accepts: 856.8224 ms
req.acceptsEncodings: 136.1057 ms
req.acceptsCharsets: 129.9873 ms
req.acceptsLanguages: 219.2094 ms
Benchmark with caching:
req.accepts: 829.7202 ms
req.acceptsEncodings: 99.3029 ms
req.acceptsCharsets: 91.8122 ms
req.acceptsLanguages: 113.6522 ms

Summary of Results

Across all three runs, caching improved every method. The relative gains were largest for `req.acceptsEncodings`, `req.acceptsCharsets`, and `req.acceptsLanguages` (roughly 30–60% faster), while `req.accepts` saw a more modest improvement (roughly 3–12%).
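The original benchmark script is not included in the thread, but the comparison it describes can be sketched with a stand-in parser. This is a minimal illustration, not the actual benchmark: `parseAccept` is a simplified substitute for the real `accepts` module's header parsing, and the timings it produces will differ from the numbers above.

```javascript
// Minimal sketch: time repeated calls that re-parse the Accept header
// on every invocation vs. calls that memoize the parsed result.
// `parseAccept` is a simplified stand-in for real Accept-header parsing.
function parseAccept(header) {
  return header.split(',').map((part) => part.trim().split(';')[0]);
}

function makeReq(cached) {
  const req = { headers: { accept: 'text/html, application/json;q=0.9' } };
  req.accepts = function () {
    if (cached) {
      // Parse once, then reuse the cached result for the rest of the request.
      if (!this._accepts) this._accepts = parseAccept(this.headers.accept);
      return this._accepts;
    }
    return parseAccept(this.headers.accept); // fresh parse on every call
  };
  return req;
}

function bench(cached, iterations) {
  const req = makeReq(cached);
  const start = process.hrtime.bigint();
  for (let i = 0; i < iterations; i++) req.accepts();
  return Number(process.hrtime.bigint() - start) / 1e6; // elapsed ms
}

console.log('without caching:', bench(false, 1e5).toFixed(4), 'ms');
console.log('with caching:   ', bench(true, 1e5).toFixed(4), 'ms');
```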
Trade-offs
Given these performance improvements, especially for the methods other than `req.accepts`, should we move forward with adding this caching optimization?
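For concreteness, the kind of memoization being proposed might look like the sketch below. This is a simplified illustration, not the actual Express implementation: `createAccepts` is a stand-in for the real `accepts` module, and `_acceptsInstance` is a hypothetical property name.

```javascript
// Stand-in factory: pretend this performs the expensive
// Accept-header parsing that the real `accepts` module does.
function createAccepts(req) {
  const types = (req.headers.accept || '').split(',').map((s) => s.trim());
  return { types: () => types };
}

// Proposed pattern: create the instance on first use, then reuse it
// for every subsequent call within the same request.
function getAccepts(req) {
  if (!req._acceptsInstance) {
    req._acceptsInstance = createAccepts(req);
  }
  return req._acceptsInstance;
}

const req = { headers: { accept: 'text/html, application/json' } };
const a1 = getAccepts(req);
const a2 = getAccepts(req);
console.log(a1 === a2); // true: the same instance is reused
```

Since the cache lives on the request object itself, it is naturally discarded when the request ends, so nothing outlives the request lifecycle.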
From my perspective, in a typical backend application these methods are usually called only once, early in the request lifecycle. While the proposed caching mechanism undoubtedly improves response time, could it lead to potential memory overhead? Since the cached `accepts` instance would persist for the entire duration of the request, even if it is not needed after the initial invocation, this may result in inefficient memory usage. Could we give developers the option to decide whether or not they want caching, since it does have a performance benefit?
Hey @ifty64bit 👋 Thank you for sharing your perspective; I appreciate the thoughtful insight. Full disclosure: I have fairly surface-level knowledge myself, so most of my points reflect my own understanding and may be wrong. I hope to learn more from you and others. You're absolutely right that in most backend applications these methods are typically called once, early in the request lifecycle.
The caching mechanism indeed provides a performance boost by preventing redundant computation when these methods are invoked multiple times in a request. While you correctly point out that they are typically called early in the request lifecycle, there are real-world scenarios where they might be invoked multiple times, for example when several middleware functions and the final route handler each perform their own content negotiation.
In these real-world scenarios, without caching, the `accepts` instance is re-created and the headers re-parsed on every call. However, as you mentioned, the cached instance would persist for the entire duration of the request even when it is no longer needed.
This is a valid concern, especially in environments with limited resources or highly optimized systems. While the cached object is generally lightweight, developers might prefer to avoid caching if they don't anticipate repeated use within the same request. One potential solution, as you suggested, would be to let developers opt in to caching.
I think this is a great idea. We could implement a toggle-able option for caching, ensuring that developers have control over whether to enable it. This way, caching can be used when necessary but avoided when developers want to minimize memory usage. Thanks again for raising this point 👍
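An opt-in design could be sketched along these lines. Everything here is hypothetical: `cacheAccepts` is not an existing Express option, and the getter factory is only an illustration of how a toggle could select between the memoized and the per-call behavior.

```javascript
// Hypothetical opt-in API: the `cacheAccepts` flag (illustrative name,
// not a real Express setting) decides whether results are memoized.
function makeAcceptsGetter({ cacheAccepts = false } = {}) {
  return function accepts(req) {
    // Stand-in for the real Accept-header parsing.
    const parse = () => (req.headers.accept || '').split(',').map((s) => s.trim());
    if (!cacheAccepts) return parse(); // default: current behavior, parse each call
    if (!req._acceptsCache) req._acceptsCache = parse();
    return req._acceptsCache;
  };
}

const req = { headers: { accept: 'text/html' } };
const cachedGetter = makeAcceptsGetter({ cacheAccepts: true });
const freshGetter = makeAcceptsGetter();
console.log(cachedGetter(req) === cachedGetter(req)); // true: result reused
console.log(freshGetter(req) === freshGetter(req));   // false: new result each call
```

Defaulting the flag to off would preserve today's behavior and keep the change fully backward compatible.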
The current implementation of the `req.accepts`, `req.acceptsEncodings`, `req.acceptsCharsets`, and `req.acceptsLanguages` methods in lib/request.js creates a new `accepts` instance for each call. Am I correct to think that this can lead to redundant object creation and parsing overhead, especially in scenarios where these methods are called multiple times within the same request lifecycle?
If caching the `accepts` instance seems like the right way to reduce overhead and speed up execution, I am willing to create a PR with benchmark comparisons for this. However, since I have very little idea of the overall picture, I need a go / no-go confirmation.