
Conversation

@SergeiPavlov (Collaborator)

They were unbounded ConcurrentDictionary instances. A typical process may have more than 1,000 StorageNodes, and these caches take a significant amount of RAM:

[image: memory usage screenshot]


@botinko left a comment


IMO it's not worth trading static memory for CPU and extra allocations in this case. If a StorageNode has been loaded into the cache, it will most likely be used again in the future.

@SergeiPavlov (Collaborator, Author)

SergeiPavlov commented Mar 2, 2025

> it will be used in future with high probability.

Caches themselves are fine. I'm against unbounded caches; that is a direct path to memory leaks.

Also, we should always weigh the CPU-vs-RAM trade-off. In our current situation, high RAM consumption is the problem, while CPU usage is low.
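The bounded-cache idea argued for here can be sketched roughly as follows. This is illustrative only: the `LruCache` name, the coarse lock, and the capacity parameter are assumptions, not the actual implementation from this PR. A `Dictionary` plus a `LinkedList` gives O(1) lookup and O(1) recency updates, trading a little CPU per access for a hard cap on memory.

```csharp
using System;
using System.Collections.Generic;

// Minimal bounded LRU cache sketch (hypothetical, not this PR's code).
// The linked list keeps entries in recency order; the dictionary maps
// keys to their list nodes so both operations stay O(1).
public sealed class LruCache<TKey, TValue>
{
    private readonly int _capacity;
    private readonly Dictionary<TKey, LinkedListNode<(TKey Key, TValue Value)>> _map = new();
    private readonly LinkedList<(TKey Key, TValue Value)> _order = new();
    private readonly object _gate = new();

    public LruCache(int capacity) => _capacity = capacity;

    public bool TryGet(TKey key, out TValue value)
    {
        lock (_gate)
        {
            if (_map.TryGetValue(key, out var node))
            {
                _order.Remove(node);   // move the hit to the most-recently-used slot
                _order.AddFirst(node);
                value = node.Value.Value;
                return true;
            }
            value = default!;
            return false;
        }
    }

    public void Set(TKey key, TValue value)
    {
        lock (_gate)
        {
            if (_map.TryGetValue(key, out var existing))
            {
                _order.Remove(existing);   // replace: drop the stale entry first
                _map.Remove(key);
            }
            else if (_map.Count >= _capacity)
            {
                var oldest = _order.Last!; // at capacity: evict the least-recently-used entry
                _order.RemoveLast();
                _map.Remove(oldest.Value.Key);
            }
            var node = _order.AddFirst((key, value));
            _map[key] = node;
        }
    }

    public int Count { get { lock (_gate) return _map.Count; } }
}
```

A single lock is the simplest correct choice for a sketch; a production version would likely use striping or a lock-free design to keep contention comparable to the original `ConcurrentDictionary`.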


@snaumenko-st left a comment


Wouldn't it be better to use an LFU implementation instead of LRU?
I'm also concerned about cleanup of the items: do any of them require it after removal? E.g., if there are IDisposable items in these caches, we should take care of disposal explicitly.
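The cleanup concern can be illustrated with a hedged sketch. Everything here is hypothetical (the `DisposingCache` name, its FIFO eviction policy chosen for brevity, and the `Probe` usage are assumptions, not this PR's code): the point is only that when a bounded cache evicts an entry whose value owns resources, the eviction path is the natural place to call `Dispose`.

```csharp
using System;
using System.Collections.Generic;

// Hypothetical cache that disposes values on eviction (illustrative only).
// Eviction is FIFO here purely to keep the sketch short; an LRU or LFU
// policy would need the same Dispose call at its eviction point.
public sealed class DisposingCache<TKey, TValue> where TValue : IDisposable
{
    private readonly int _capacity;
    private readonly Dictionary<TKey, TValue> _map = new();
    private readonly Queue<TKey> _order = new();

    public DisposingCache(int capacity) => _capacity = capacity;

    public void Set(TKey key, TValue value)
    {
        if (_map.TryGetValue(key, out var old))
        {
            old.Dispose();                 // replacing a key: release the old value
            _map[key] = value;
            return;
        }
        if (_map.Count >= _capacity)
        {
            var evictedKey = _order.Dequeue();
            _map[evictedKey].Dispose();    // explicit cleanup on eviction
            _map.Remove(evictedKey);
        }
        _order.Enqueue(key);
        _map[key] = value;
    }

    public bool TryGet(TKey key, out TValue value) => _map.TryGetValue(key, out value!);
}
```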

@SergeiPavlov (Collaborator, Author)

Replaced by #361

@SergeiPavlov (Collaborator, Author)

> And also I'm concerned about cleanup of the items. Aren't there any requiring it after removal? E.g. if there are IDisposable items in these caches, we should take care of this explicitly.

There are no unmanaged values in these caches, so no disposal is needed.



3 participants