Hello,
We are experiencing some performance problems when using Ferret and we
are trying to isolate the problem.
We have about 80 GB of indexes for one of our clients, and when a
search is performed on those indexes the application gets really slow
and eventually stops responding. We’ve been monitoring memory usage,
and it rises very rapidly as the indexes are being loaded.
Ferret’s documentation says the index reader is automatically closed
during garbage collection, but either this doesn’t work, or it takes
much longer to happen than would be ideal for us.
So we are running out of memory, and the mongrel instances become
unresponsive to the point that not even monit can restart them; we
have to kill the instances manually.
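If it helps, here is a simplified sketch of the kind of explicit
cleanup we are considering instead of relying on GC (the index path,
field name, and query below are just placeholders):

require 'ferret'

# Open the index, run the search, then close it explicitly so file
# handles and buffers are released right away instead of waiting for GC.
index = Ferret::Index::Index.new(:path => '/path/to/client/index')
begin
  index.search_each('contents:"some query"', :limit => 10) do |doc_id, score|
    puts "doc #{doc_id} scored #{score}"
  end
ensure
  index.close  # release resources immediately
end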
Does anyone know how Ferret manages its memory usage? Does it try to
load all the indexes needed for a search into RAM at once?
If that’s the case, what happens when the size of the indexes exceeds
the available RAM?
Has anyone run into this problem before?
Any help you can provide will be greatly appreciated.