I am having serious problems with an application handling large datasets
through ActiveRecord.
I need to handle about 10,000 records, and the memory usage of my
Mongrel process reaches up to 700 MB.
The memory is not freed when the controller terminates.
The controller just renders a big dataset to XML and outputs the data
to a Flex frontend. I've tried the RXML templates, the .to_xml()
method, and also an .rhtml template to generate the XML ( not very neat …
but much faster than the RXML… nearly 3 times )
I really don’t understand whether this is normal or I am missing something.
Reading through the mailing list I’m afraid there isn’t much to do…
since this is a problem with Rails ( or Ruby ) not releasing
memory back to the OS.
I can rewrite my code to paginate the dataset during data browsing,
but when a report needs to be printed I need to extract the whole
dataset…
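The pagination idea can still help for reports if the rows are pulled in fixed-size batches instead of all at once, so only one batch of ActiveRecord objects is alive at a time. A minimal sketch of that loop (`Record` is a hypothetical model; here a plain array stands in for the table so the loop runs standalone):

```ruby
# Walk a large result set in fixed-size batches instead of loading it whole.
def each_batch(records, batch_size = 500)
  offset = 0
  loop do
    # With ActiveRecord this would be something like:
    #   batch = Record.find(:all, :limit => batch_size, :offset => offset)
    batch = records[offset, batch_size] || []
    break if batch.empty?
    yield batch            # render/append this slice, then let it be GC'd
    offset += batch_size
  end
end

batch_sizes = []
each_batch((1..10_000).to_a, 2_500) { |b| batch_sizes << b.size }
# batch_sizes => [2500, 2500, 2500, 2500]
```

Each batch goes out of scope before the next is fetched, so peak memory is bounded by the batch size rather than the full 10,000 rows.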
Any suggestions? I’m currently running monit to kill oversized Mongrel
processes… but it’s not a solution.
Must I go back to PHP in this situation…??? I really hope not.
Thanks for any suggestion.
Massimo Santoli
http://blog.evanweaver.com/files/doc/fauna/bleak_house/files/README.html
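For reference, using BleakHouse (the memory profiler linked above) looked roughly like this at the time; the exact command names and dump path are from memory of the README and should be treated as a sketch, not gospel:

```shell
# Install the gem, then require it from config/environment.rb:
#   require 'bleak_house'
sudo gem install bleak_house

# Run the app under the patched interpreter that BleakHouse installs,
# exercise the leaky action, then analyze the resulting heap dump:
RAILS_ENV=production ruby-bleak-house script/server
bleak /tmp/bleak_house_production.dump
```

The analysis output ranks allocation sites by retained objects, which tells you whether memory growth is a genuine leak or just Ruby holding on to its heap.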
On 8/24/07, msantoli [email protected] wrote:
–
Cheers!
This isn’t necessarily a leak, and many scripting languages only
begrudgingly give internal memory back to the OS.
The idea is that you’ll get better performance by keeping the memory
allocated as it’s likely it’ll be needed again, i.e. for the next
request.
A leak, IMHO, would be where the memory in use goes up with each and
every request. And even then, it’s not necessarily a leak if it’s
the way the language is supposed to work.
–
– Tom M., Co-CEO
– Engine Y., Ruby on Rails Hosting
– Support, Scalability, Reliability
– (866) 518-YARD (9273) x201
On Aug 24, 2007, at 2:42 PM, msantoli wrote:
I would not try to instantiate 10,000 ActiveRecord objects. Hell, I
would try to avoid instantiating 10,000 of any object. If you are
just taking data from a database and poking fields into the
appropriate spots in an xml schema, I would try to get the http socket
output as close to the database as possible.
But as a first step, try skipping ActiveRecord and querying the db
directly (using the connection that you can get from ActiveRecord).
Take that result (which is an array of hashes, IIRC) and use that
in your template. But I don’t know how much that’s going to save you,
really. Ideally you would find some way to stream the database result
into the template and out the http socket in one step. That’s not easy
to do in Rails…
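A sketch of what skipping ActiveRecord instantiation looks like. In the controller the query would be something along the lines of `rows = ActiveRecord::Base.connection.select_all("SELECT id, name FROM products")`, which returns plain hashes (string keys) rather than model objects; here a literal array stands in for that result, and the table/column names are made up for illustration, so the XML-building part can be exercised on its own:

```ruby
require "cgi"  # stdlib, for CGI.escapeHTML

# Build an XML string straight from an array of hashes, with no
# ActiveRecord objects involved. Root and node names are arbitrary.
def rows_to_xml(rows, root = "products", node = "product")
  xml = "<#{root}>"
  rows.each do |row|
    xml << "<#{node}>"
    row.each do |col, val|
      xml << "<#{col}>#{CGI.escapeHTML(val.to_s)}</#{col}>"
    end
    xml << "</#{node}>"
  end
  xml << "</#{root}>"
end

rows = [{ "id" => "1", "name" => "Widget <A>" }]
puts rows_to_xml(rows)
# => <products><product><id>1</id><name>Widget &lt;A&gt;</name></product></products>
```

A bare hash is far cheaper than a full ActiveRecord object, so even without true streaming this cuts the per-row overhead considerably.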
On the other hand, whatcha whining about? I’m still keeping an eye on
a fairly small Java app that has a fairly high number of steady users,
and the Java process on that machine stays pretty steady at 1.2 GB of
RAM usage.
b