Indexing runs out of memory

I’m using Ferret to index a whole bunch of stuff at once: thousands
of documents that produce an index which grows to about
1.25 GB. While the indexer is running, I watch the memory use of the
Ruby process grow steadily until it, too, reaches about 1.25 GB, at
which point the process crashes, printing:

[FATAL] failed to allocate memory

Does anyone else have experience with this mode of failure? Should I
avoid creating the index all at once, and instead add a few
documents, close the index, re-open it, and add a few more? Or is a
1.25 GB index simply too big to try to create on my machine?

TIA

Jeff,

On 2007-11-15, at 19:39, Jeff M. wrote:

[FATAL] failed to allocate memory

Yes, closing and reopening the IndexWriter might
help.
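
Something along these lines might work (an untested sketch; it
assumes your documents are plain hashes and the index lives in
'index_dir', so adjust field names, batch size and path to your
setup):

require 'rubygems'
require 'ferret'

BATCH_SIZE = 1000  # tune to taste

# Index the documents in batches, closing the IndexWriter after each
# batch so its in-memory buffers can be released before the next one.
def index_in_batches(docs, path = 'index_dir')
  docs.each_slice(BATCH_SIZE).each_with_index do |batch, i|
    writer = Ferret::Index::IndexWriter.new(
      :path   => path,
      :create => (i == 0)  # create on the first batch, append afterwards
    )
    batch.each { |doc| writer.add_document(doc) }
    writer.close
  end
end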

There have been reports on this list about Ferret indexes of 3 or
more gigs… so I don’t think this is a general problem.

Ben