java - Can I constrain a HashMap by the amount of memory it takes up?
I am implementing a simple cache using LinkedHashMap,
based on instructions I've seen. I use the following code:
```java
import java.util.LinkedHashMap;
import java.util.Map;

public class Cache<K, V> extends LinkedHashMap<K, V> {
    private final int capacity;

    public Cache(int capacity) {
        super(capacity + 1, 1.1f, true); // access-order, sized to avoid rehashing
        this.capacity = capacity;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        return size() > capacity;
    }
}
```
This works, but it only puts a fixed entry count on the map. I am running on a very small heap, and depending on the size of the cached objects and the capacity I choose, it can still run out of memory. The objects are arbitrary, so I cannot predict how big they will be. To trim the cache, I do not want to depend on SoftReferences,
because the way they are cleared is unreliable: it varies from VM to VM, and they can either be reclaimed too eagerly, or never be reclaimed until the heap is exhausted.
Is there a way to monitor the actual size of the map and constrain it based on that?
If soft/weak references are out of the question, then I see two non-trivial options:
1) Use Java Instrumentation to check the actual size of objects added to the map. The Instrumentation interface provides the shallow size of an object, and you will need additional code to traverse its references (and to avoid counting duplicates!) to compute the deep size of an object.
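A minimal sketch of option 1. The `SizeOfAgent` class name and `-javaagent` wiring are illustrative assumptions, not from the original; `Instrumentation.getObjectSize` is the real API, and it returns only the shallow size, so the walk below traverses the reference graph with an identity-based visited set to avoid double-counting shared objects. The shallow-size function is a parameter so the traversal can also be exercised without an agent:

```java
import java.lang.instrument.Instrumentation;
import java.lang.reflect.Array;
import java.lang.reflect.Field;
import java.lang.reflect.Modifier;
import java.util.Collections;
import java.util.IdentityHashMap;
import java.util.Set;
import java.util.function.ToLongFunction;

public class SizeOfAgent {
    private static volatile Instrumentation inst;

    // Launching with -javaagent:sizeof.jar hands us the Instrumentation instance.
    public static void premain(String args, Instrumentation instrumentation) {
        inst = instrumentation;
    }

    /** Deep size using the agent's Instrumentation. */
    public static long deepSizeOf(Object root) {
        return deepSizeOf(root, inst::getObjectSize);
    }

    /** Deep size with a pluggable shallow-size function (testable without an agent). */
    public static long deepSizeOf(Object root, ToLongFunction<Object> shallowSize) {
        // Identity-based visited set: counts shared objects once and
        // terminates on cyclic reference graphs.
        Set<Object> visited = Collections.newSetFromMap(new IdentityHashMap<>());
        return walk(root, shallowSize, visited);
    }

    private static long walk(Object obj, ToLongFunction<Object> shallowSize,
                             Set<Object> visited) {
        if (obj == null || !visited.add(obj)) return 0;
        long size = shallowSize.applyAsLong(obj);
        Class<?> cls = obj.getClass();
        if (cls.isArray()) {
            if (!cls.getComponentType().isPrimitive()) {
                for (int i = 0; i < Array.getLength(obj); i++) {
                    size += walk(Array.get(obj, i), shallowSize, visited);
                }
            }
            return size;
        }
        // Walk instance reference fields up the superclass chain.
        for (Class<?> c = cls; c != null; c = c.getSuperclass()) {
            for (Field f : c.getDeclaredFields()) {
                if (Modifier.isStatic(f.getModifiers()) || f.getType().isPrimitive()) {
                    continue;
                }
                try {
                    f.setAccessible(true);
                    size += walk(f.get(obj), shallowSize, visited);
                } catch (ReflectiveOperationException | RuntimeException e) {
                    // JDK-internal fields may be inaccessible under the module system: skip.
                }
            }
        }
        return size;
    }
}
```

A cache built on this would call `deepSizeOf` on each value at `put` time and evict entries until the running byte total drops back under its budget.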
2) Use JMX to track the heap size after each GC, and when it crosses some dangerous threshold, change the map's behavior. See the "Notifications" section of the MemoryMXBean documentation.
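A sketch of option 2, assuming a helper named `LowMemoryWatcher` (an illustrative name; the `MemoryPoolMXBean` threshold and `MemoryMXBean` notification APIs are real). Collection-usage thresholds fire only after a GC, so they measure live data rather than collectible garbage, which is exactly what the question needs:

```java
import java.lang.management.ManagementFactory;
import java.lang.management.MemoryNotificationInfo;
import java.lang.management.MemoryPoolMXBean;
import java.lang.management.MemoryType;
import javax.management.NotificationEmitter;

public class LowMemoryWatcher {

    /**
     * Runs onLowMemory whenever post-GC usage of a heap pool exceeds the given
     * fraction (0..1) of that pool's max. Returns false if no heap pool on this
     * JVM supports collection-usage thresholds.
     */
    public static boolean install(Runnable onLowMemory, double fraction) {
        boolean installed = false;
        for (MemoryPoolMXBean pool : ManagementFactory.getMemoryPoolMXBeans()) {
            if (pool.getType() != MemoryType.HEAP) continue;
            if (!pool.isCollectionUsageThresholdSupported()) continue;
            long max = pool.getUsage().getMax();
            if (max <= 0) continue; // max undefined for this pool
            // Checked only after a GC, so it reflects live data.
            pool.setCollectionUsageThreshold((long) (max * fraction));
            installed = true;
        }
        if (!installed) return false;
        // The platform MemoryMXBean emits the threshold-exceeded notifications.
        NotificationEmitter emitter =
                (NotificationEmitter) ManagementFactory.getMemoryMXBean();
        emitter.addNotificationListener((notification, handback) -> {
            if (MemoryNotificationInfo.MEMORY_COLLECTION_THRESHOLD_EXCEEDED
                    .equals(notification.getType())) {
                onLowMemory.run(); // e.g. cache.clear(), or evict half the entries
            }
        }, null, null);
        return true;
    }
}
```

A cache could register `LowMemoryWatcher.install(cache::clear, 0.8)` at startup, so that when live data after a GC exceeds 80% of a pool's capacity, the map is emptied before allocations start failing.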