
[DO NOT MERGE] feat: switch to memcache-client over memcached #6119

Open · wants to merge 1 commit into base: main
Conversation

mzikherman
Contributor

@mzikherman mzikherman commented Oct 10, 2024

This just seems to work, at least locally, so we can iterate on the right options (cmdTimeout and such).

Marking as do-not-merge for now, since we want to make sure we're using the right configs/variables in a tunable way, monitor carefully, etc. - can discuss.

@mzikherman mzikherman self-assigned this Oct 10, 2024
@artsy-peril
Contributor

artsy-peril bot commented Oct 10, 2024

memcache-client

Author: Joel Chen

Description: NodeJS memcached client

Homepage: https://github.com/electrode-io/memcache#readme

Created: over 7 years ago
Last Updated: 12 months ago
License: Apache-2.0
Maintainers: 3
Releases: 28
Direct Dependencies: lodash.defaults, memcache-parser, optional-require and tslib
Keywords: memcache, memcached, nodejs, client, ascii and protocol
This README is too long to show.

New dependencies added: memcache-client.

Generated by 🚫 dangerJS against a6193d7

client.on(event, () => verbose(`[Cache] ${event}`))
console.log("hey")
const newClient = new MemcacheClient({
server: { server: MEMCACHED_URL, maxConnections: MEMCACHED_MAX_POOL },
Contributor Author

Not sure if maxConnections is this pool config; there doesn't seem to be any connection-pool config in the new lib.

Contributor

It does seem to be an internal maximum on the number of connections made by this client. (I was worried it might be a server option due to its location.)
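For discussion, a minimal sketch of the client construction under review, mirroring the diff above. The env-derived names match the diff, but the values and the cmdTimeout setting are placeholder assumptions (those are exactly the "right options" still to be tuned):

```typescript
import { MemcacheClient } from "memcache-client"

// Assumed env-derived config; names mirror the diff, values are placeholders.
const MEMCACHED_URL = process.env.MEMCACHED_URL || "localhost:11211"
const MEMCACHED_MAX_POOL = Number(process.env.MEMCACHED_MAX_POOL || 10)

const client = new MemcacheClient({
  // maxConnections appears to cap the client's internal connection count,
  // despite sitting inside the server object.
  server: { server: MEMCACHED_URL, maxConnections: MEMCACHED_MAX_POOL },
  // cmdTimeout (ms) is a guess; needs tuning/monitoring before merge.
  cmdTimeout: 5000,
})
```

This is a config sketch only; it does not attempt a round-trip against a live memcached.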

})
VerboseEvents.forEach((event) => {
client.on(event, () => verbose(`[Cache] ${event}`))
console.log("hey")
Contributor Author

was just making sure this was a singleton

Contributor

👋

@@ -108,15 +98,18 @@ function _get<T>(key) {
reject(err)
} else if (data) {
if (CACHE_COMPRESSION_DISABLED) {
-          resolve(JSON.parse(data.toString()))
+          resolve(JSON.parse(data.value.toString()))
Contributor Author

One API change: the returned data is an object with some metadata; value contains the actual cached object.

@@ -146,7 +139,7 @@ function _set(key, data, options: CacheOptions) {
return new Promise<void>((resolve, reject) => {
const payload = JSON.stringify(data)
verbose(`CACHE SET: ${cacheKey(key)}: ${payload}`)
-      client.set(cacheKey(key), payload, cacheTtl, (err) => {
+      client.set(cacheKey(key), payload, { lifetime: cacheTtl }, (err) => {
Contributor Author

Slight difference in the set signature.

@@ -74,6 +74,5 @@ export function createCacheTracer() {
return {
get: createCommand("get"),
set: createCommand("set"),
delete: createCommand("delete"),
Contributor Author

I didn't see any use case for delete in the codebase outside of tests, and you can see we renamed client.del to delete. That seems non-standard, as both clients implement del, and del is what the rate-limit middleware expects as well - https://github.com/express-rate-limit/rate-limit-memcached?tab=readme-ov-file#client

So perhaps the del part of the rate limiter, whatever that was, never quite worked? Then again, that middleware is disabled anyway.

resolved "https://registry.yarnpkg.com/memcache-parser/-/memcache-parser-1.0.1.tgz#66b8434e1bccd79a6da4bfe5fec41e5aa6f585b8"
integrity sha512-F5TjjHKUNMCWTVfiJD+7M9ecNbBdnK6xifYZ1WL8PoQ/dP9sB37VQV8hPtyuUqJJFj76qY70Ezj9saQ0gw2FJg==

memcached@^2.2.2:
Contributor Author

Looks like the rate-limit middleware still brings in memcached... maybe we should just clean that middleware up?

Contributor

I dug through Slack for some history and couldn't figure out the state of this, or whether it was ever enabled. Ideally, rate-limit-memcached should be removed (and re-added if/when needed). I also wonder if the presence of both libraries could interfere with Datadog's tracing.

Which reminds me: we must confirm that Datadog automatically instruments this alternate library.

Contributor

If we do remove rate-limit-memcached, we can also remove the mock increment implementation in cache.ts above.

@mzikherman mzikherman changed the title feat: switch to memcache-client over memcached [DO NOT MERGE] feat: switch to memcache-client over memcached Oct 10, 2024