
oom caused after use #234's statistics record code #290

Closed
itherunder opened this issue Jul 27, 2023 · 4 comments
Assignees
Labels
bug Something isn't working

Comments

@itherunder

itherunder commented Jul 27, 2023

#234

@itherunder itherunder changed the title oom caused after use #262's statistics record code oom caused after use #234's statistics record code Jul 27, 2023
@tatsuya6502
Member

Hi. Thank you for reporting the issue. Can you please provide more information to help us investigate further?

  • What is the moka version?
  • What is the cache implementation? sync::Cache, sync::SegmentedCache, or future::Cache?
  • Is your eviction listener code exactly the same as the one in Statistics for metrics #234?
  • What is the max_capacity of the cache?
  • If you make the max_capacity smaller, does the issue still happen?
  • What is the average size of the value stored in the cache? (Rough estimate is fine)
  • What is the write rate to the cache? (Number of writes per second. Rough number is fine)
    • Are you doing heavy writes? (e.g. millions of writes per second, as in benchmarking)
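For readers without the linked code: a statistics recorder of the kind discussed in #234 typically bumps per-cause atomic counters from the eviction listener. The sketch below is a hypothetical, std-only stand-in, not the actual #234 code; the `Cause` enum merely mirrors the idea of moka's `RemovalCause`, and all names here are illustrative.

```rust
use std::sync::atomic::{AtomicU64, Ordering};

// Hypothetical per-cause counters for eviction statistics. Real code
// would call `record` from moka's eviction listener; here we model
// only the counting so the sketch is self-contained.
#[derive(Default)]
struct EvictionStats {
    expired: AtomicU64,
    evicted_by_size: AtomicU64,
    explicit: AtomicU64,
}

#[derive(Clone, Copy)]
enum Cause {
    Expired,
    Size,
    Explicit,
}

impl EvictionStats {
    // Called once for every removed entry.
    fn record(&self, cause: Cause) {
        match cause {
            Cause::Expired => self.expired.fetch_add(1, Ordering::Relaxed),
            Cause::Size => self.evicted_by_size.fetch_add(1, Ordering::Relaxed),
            Cause::Explicit => self.explicit.fetch_add(1, Ordering::Relaxed),
        };
    }
}

fn main() {
    let stats = EvictionStats::default();
    stats.record(Cause::Size);
    stats.record(Cause::Size);
    stats.record(Cause::Expired);
    // Counters only grow by a fixed word each; the listener itself
    // should not retain the evicted keys or values.
    println!(
        "size-evictions: {}",
        stats.evicted_by_size.load(Ordering::Relaxed)
    );
}
```

The point of the fixed-size counters is that a correct listener adds O(1) memory per eviction cause, so any unbounded growth must come from elsewhere in the pipeline.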

@itherunder
Author

Hi. Thank you for reporting the issue. Can you please provide more information to help us investigate further?

  • What is the moka version?
  • What is the cache implementation? sync::Cache, sync::SegmentedCache, or future::Cache?
  • Is your eviction listener code exactly the same as the one in Statistics for metrics #234?
  • What is the max_capacity of the cache?
  • If you make the max_capacity smaller, does the issue still happen?
  • What is the average size of the value stored in the cache? (A rough estimate is fine)
  • What is the write rate to the cache? (Number of writes per second; a rough number is fine)
    • Are you doing heavy writes? (e.g. millions of writes per second, as in benchmarking)

  • moka version: 0.11.2
  • implementation: sync::SegmentedCache (I also tried sync::Cache; same OOM)
  • eviction listener: exactly the same as Statistics for metrics #234
  • max_capacity: the issue still happens; I tried setting max_capacity to just 128
  • average value size: about 20 B
  • write rate: about 1000 QPS
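A back-of-the-envelope check on the figures reported above (a sketch; the hour-long projection is an editorial illustration, not from the thread) shows why these answers point at a leak rather than at cache capacity:

```rust
fn main() {
    // Figures from the report above.
    let max_capacity: u64 = 128; // entries
    let avg_value_size: u64 = 20; // bytes per value (rough estimate)
    let writes_per_sec: u64 = 1_000; // reported QPS

    // At most ~2.5 KB of values can be resident in the cache at once.
    let steady_state_bytes = max_capacity * avg_value_size;
    assert_eq!(steady_state_bytes, 2_560);

    // But ~72 MB of values flow through the cache every hour; if evicted
    // entries are retained somewhere (e.g. by listener bookkeeping that is
    // never drained), memory grows without bound even though the cache
    // itself stays tiny.
    let bytes_per_hour = writes_per_sec * avg_value_size * 3_600;
    assert_eq!(bytes_per_hour, 72_000_000);

    println!("steady state: {steady_state_bytes} B, throughput: {bytes_per_hour} B/h");
}
```

With the resident payload bounded at a few kilobytes, an OOM under this workload strongly suggests that per-write or per-eviction state is accumulating somewhere, which matches the root cause identified below.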

@tatsuya6502 tatsuya6502 self-assigned this Aug 3, 2023
@tatsuya6502 tatsuya6502 added the bug Something isn't working label Aug 3, 2023
@tatsuya6502
Member

tatsuya6502 commented Aug 3, 2023

@itherunder — Thank you for the info. It was very helpful! I believe I found the root cause of the bug, and the following PR will fix it:

When you have a chance, please try it to see whether the OOM is fixed:

Cargo.toml

(deleted)

I will apply the fix to Moka v0.11.x, v0.10.x and v0.9.x, and hopefully we can release them this weekend.


EDIT

I just published v0.11.3 to crates.io.

Cargo.toml

[dependencies]
moka = "0.11.3"

@tatsuya6502
Member

I will apply the fix to Moka v0.11.x, v0.10.x and v0.9.x, and hopefully we can release them this weekend.

I just published v0.11.3, v0.10.4 and v0.9.9 to crates.io.

Please reopen this issue if the OOM still happens with v0.11.3.
