I have a Rails 3.0 application that has a problem with memory consumption.
The application runs in production mode on Apache with Passenger (mod_rails) on a Debian machine. It's a virtual machine with 6 GB of RAM and 2 CPUs (1 core each, 3 GHz). It's a mailing application, and a simple click on a link to read a mail triggers a series of operations on the server to render the page.
That single click uses about 50% of the CPU for 2 or 3 seconds (I can see it with the "top" command). The Ruby processes seem to allocate memory for these jobs, but when a job is finished the allocated memory does not appear to be freed (unlike the CPU usage, which drops back down).
The problem is that I have about 150 users on the system, and the PostgreSQL database is installed on the same machine. When several users perform operations at the same time, memory usage climbs to 100%, and PostgreSQL can no longer accept requests (at that point Postgres stops logging).
Is there any configuration I can apply (in Apache, Passenger, or Rails?) to keep memory usage from growing (why doesn't it decrease along with the CPU usage?) and to prevent the PostgreSQL crash?
There are Passenger configuration variables you can use to adjust how long processes stay alive to continue serving requests, how many requests they serve, the maximum number of processes, etc.
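For example, in the Apache virtual host (or any file Apache includes) you could try something like the sketch below. The directive names are standard Passenger-for-Apache options; the values are only illustrative assumptions and would need tuning for a 6 GB machine that also hosts PostgreSQL:

  # Cap the number of application processes Passenger will spawn
  PassengerMaxPoolSize 4

  # Shut down an application process after it has been idle this many seconds
  PassengerPoolIdleTime 300

  # Restart a process after it has served this many requests, so the
  # memory it has accumulated is returned to the OS
  PassengerMaxRequests 500

PassengerMaxRequests in particular is often used as a workaround when process memory only ever grows over time.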
I use Ruby 1.8.7. I will look into Ruby Enterprise Edition.