Memory leakage since 2.4.0 #460
Seeing the exact same issue in the project I'm working on, recently updated from …
@lizdeika and @nitaliano that's pretty serious, thanks for pinpointing this. If I'm not mistaken, there were really only two changes between 2.3.1 and 2.4: #405 and #447. I reread the diff and nothing jumps out at me. It would help to know: …
Question 3 could be partially answered by checking the options passed to …
(You may want to replace ….) Of course, a script we could run to reproduce the leak would be the best…
Also, if either of you is using …
It was CRuby 2.6.5 compiled against jemalloc 5.2.1.
Here are some extra details: we are also using sidekiq like @lizdeika. Hopefully next time I post back I'll have a repo to reproduce it 😄
To better pinpoint the problem, I created 2 branches of the gem. So, if it was possible, please check using:

```
gem 'json', git: 'https://github.com/marcandre/json.git', tag: 'freeze_231'
# and if that does not leak, then try with:
gem 'json', git: 'https://github.com/marcandre/json.git', tag: 'equiv_241'
```
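FWIW, a trivial harness along these lines could be run under each tag to see whether resident memory keeps climbing (a sketch of mine; the workload is a made-up assumption, not from this thread):

```
require 'json'

# Made-up workload: round-trip a payload many times and print resident
# set size after each batch, then compare the trend between the two tags.
payload = { "key" => "value" * 100 }

10.times do
  50_000.times { JSON.parse(JSON.generate(payload)) }
  puts `ps -o rss= -p #{$$}`  # RSS in KB (Linux/macOS)
end
```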
I know this is an old issue, but I'm interested in it nonetheless because I noticed something in this gem that could cause issues. Are the JSON keys you're parsing known ahead of time and bounded to a certain set of strings, or are they unbounded? If they're unbounded, this commit: 1982070cb84a38793277f6359387938d80e4d2c4 introduces a memory issue by keeping all json keys in the ruby VM's frozen string table (where they're never released, as far as I know). cc @byroot |
That's incorrect. "fstrings" (AKA interned strings) are garbage collectable. |
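For what it's worth, here's a quick sketch of my own (not from the thread) that is consistent with that: intern a pile of unique strings, keep no references, and the string count falls back after a GC:

```
require 'objspace'

before = ObjectSpace.count_objects[:T_STRING]

# Intern 100k unique strings into the fstring table, keeping no references.
100_000.times { |i| -"unbounded_key_#{i}" }

GC.start
after = ObjectSpace.count_objects[:T_STRING]

# If interned strings were pinned forever, `after` would sit roughly 100k
# above `before`; since fstrings are collectable, it drops back instead.
puts "T_STRING before: #{before}, after GC: #{after}"
```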
Oh okay right, I just took a look at …
This might be nothing too, but in the parser there's a call to ALLOC_N and it gets free'd with …
AFAIK, it's mostly that it skips letting the GC know that some memory was freed, so GC might trigger earlier than it should, but that's about it. Would be nice to fix it though.
Alright, not sure if it's the same one, but I did find a leak:

```
require 'json'

data = 10_000.times.to_a << BasicObject.new

20.times do
  100.times do
    begin
      data.to_json
    rescue NoMethodError
    end
  end
  puts `ps -o rss= -p #{$$}`
end
```
The various `to_json` methods must rescue exceptions to free the buffer.
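In Ruby terms the fix looks roughly like this (a sketch only; the StringIO stands in for the C extension's internal fbuffer, and the method name is hypothetical):

```
require 'json'
require 'stringio'

# Sketch: StringIO stands in for the C fbuffer. The point of the fix is
# that the buffer gets released on the error path too, not just on success.
def generate_with_buffer(obj)
  buffer = StringIO.new
  begin
    buffer << obj.to_json
    buffer.string
  ensure
    buffer.close  # runs whether #to_json succeeded or raised
  end
end

generate_with_buffer([1, 2, 3])                   # => "[1,2,3]"
generate_with_buffer(BasicObject.new) rescue nil  # buffer still released
```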
Fix: ruby#460 (ruby/json@d227d225ca)

The various `to_json` methods must rescue exceptions to free the buffer.

```
require 'json'

data = 10_000.times.to_a << BasicObject.new

20.times do
  100.times do
    begin
      data.to_json
    rescue NoMethodError
    end
  end
  puts `ps -o rss= -p #{$$}`
end
```

```
20128
24992
29920
34672
39600
44336
49136
53936
58816
63616
68416
73232
78032
82896
87696
92528
97408
102208
107008
111808
```
I checked: the leak I just fixed predates … I think the only action I could see would be to set up ruby_memcheck and see if it catches something.
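The wiring would be roughly this, following the ruby_memcheck README (the binary_name, the rake-compiler `:compile` task, and the test layout here are assumptions):

```
# Rakefile sketch; see the ruby_memcheck README for the canonical setup.
require "rake/testtask"
require "ruby_memcheck"

RubyMemcheck.config(binary_name: "generator")  # assumed name of the C extension

test_config = lambda do |t|
  t.libs << "test"
  t.test_files = FileList["test/**/*_test.rb"]
end

Rake::TestTask.new(test: :compile, &test_config)

namespace :test do
  # Same tests, run under Valgrind with Ruby's own noise filtered out.
  RubyMemcheck::TestTask.new(valgrind: :compile, &test_config)
end
```

`rake test:valgrind` would then run the suite under Valgrind and, as I understand it, filter out the interpreter's false positives.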
Hoping it might find the leak reported in ruby/json#460 (ruby/json@08635312e5)
@casperisfine Funny enough, I found a similar leak many years ago: …
We have a couple of applications where postmark uses this gem. After updating a bunch of gems we noticed a memory leak in sidekiq processes (we have a script that restarts sidekiq when memory is bloated). We had to revert all the recent gem updates and the problem was gone. Then we updated the gems back one by one to find out which one caused the leakage. The problem returned with the json gem 2.4.0 update, and reverting back to 2.3.1 was all good once again.