I’m running several smallish apps each with small audiences and few
users, all on the same server. We’re currently deploying on
Apache+FastCGI with few problems, except the fact that each app starts
a new dedicated Ruby process. The combined performance demand on the
server shouldn’t be very high, but since the amount of apps is in the
thirties and since each and every one uses its own Ruby process, this
results in quite a bit of overhead.
My question is if this can be avoided. Is it possible to run many apps
that reuse the same Ruby processes and thus result in less pressure on
the server? What are my options?
Best regards,
Tomas J., Sweden
This part is pretty hard. I just ran into the same situation and I still haven’t found an answer.
I was able to lessen the load, though.
In lighttpd you can set the number of FCGI processes per app. I set this to 1 for each app and was able to push the server (Athlon XP 2600, 1.5 GB RAM) to about 90 sites with decent traffic. I think you can do the same with Apache.
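For reference, here’s roughly what that looks like in lighttpd’s config; the socket and dispatch.fcgi paths below are just placeholders for one of the apps, and "max-procs" => 1 is the knob that caps the per-app FastCGI process count (with Apache’s mod_fastcgi, the rough equivalent is a FastCgiServer line with -processes 1):

  fastcgi.server = ( ".fcgi" => (
    "app1" => (
      # Placeholder paths; adjust to your own layout.
      "socket"    => "/tmp/app1.fcgi.socket",
      "bin-path"  => "/var/www/app1/public/dispatch.fcgi",
      # Cap this application at a single FastCGI backend process.
      "min-procs" => 1,
      "max-procs" => 1
    )
  ))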
The problem is that each app needs its own FCGI process.
With the current Capistrano type deployment, you have one app with many
users. This type of installation is much easier, because all of your
users go to one app, but in your case, you would need to separate them
and that’s when stuff gets a little crazy.
Right now I don’t think there is a way. You have to run at least one process per Rails app. This is true with FCGI and Mongrel. CGI runs and completes its process after Rails does its business (I believe).
The only hope for situations like this might be the JRuby project.
My question is if this can be avoided. Is it possible to run many apps
that reuse the same Ruby processes and thus result in less pressure on
the server? What are my options?
Peter, I think you misunderstood the question; I believe LiteSpeed too creates one process per app (key words: per app) and THEN creates more as necessary, as you describe. If you didn’t misunderstand, though, and aren’t mistaken, I’d love it if you could confirm that.
However, I don’t think it’s possible, because I suspect this is an issue with Rails itself rather than with the various dispatching solutions: since each process caches the model classes in memory, how could it possibly hold several apps at once? Severe changes to Rails itself would be necessary for a single Ruby process to serve multiple Rails apps at the same time. Again, I’m only almost certain of this; if somebody with definite knowledge of the subject can confirm or deny it, I’d really appreciate it.
Does anyone know if this issue is being investigated at all, or if anyone has addressed this concern before? As it stands now, Rails is an excellent choice if you only run one or a couple of big apps on the same server… but for many smallish apps, it’s barely usable, which IMHO is a damn shame.
On Dec 28, 2006, at 3:58 PM, subimage interactive wrote:
Right now I don’t think there is a way. You have to run at least one process per Rails app. This is true with FCGI and Mongrel. CGI runs and completes its process after Rails does its business (I believe).
The only hope for situations like this might be the JRuby project.
What does the development speed have to do with being able to run multiple apps per process, or not being able to? I fail to see how the two have any relation whatsoever.
Does anyone know if this issue is being investigated at all, or if anyone has addressed this concern before? As it stands now, Rails is an excellent choice if you only run one or a couple of big apps on the same server… but for many smallish apps, it’s barely usable, which IMHO is a damn shame.
It’s only a usability problem with respect to resources, and the only
way to measure that is against the entire pool of resources required
to create and maintain the application as well.
For instance, let’s say that Ruby is 16 times as expensive in terms
of memory as something else. What is the cost of that memory compared
to the cost advantage (if any!) of development speed and maintainability
of the application itself?
Memory is not expensive these days, and it’s getting less expensive
each and every day.
–
– Tom M., CTO
– Engine Y., Ruby on Rails Hosting
– Reliability, Ease of Use, Scalability
– (866) 518-YARD (9273)
So, in what way are they mutually exclusive? How would it negatively impact the development speed if Rails apps could share processes?
I’m not a newbie to Rails, I’ve been using it for over two years and
I’m quite familiar with the productivity gains. Nowhere have I claimed
otherwise. I’m just having a performance problem even though I have
small audiences and few users. If apps could share processes, it would
decrease costs significantly, since each server could be used with
less overhead.
What does the development speed have to do with being able to run multiple apps per process, or not being able to? I fail to see how the two have any relation whatsoever.
Both contribute to the total cost of ownership of the application.
Rails may use more memory than other web technologies, but memory is
cheap - if Rails saves development and maintenance effort, and gets you
to market faster, that’s an overall win.
So, in what way are they mutually exclusive? How would it negatively impact the development speed if Rails apps could share processes?
In principle they are orthogonal. In practice, they are constrained by the way Rails works at present.
Nobody has suggested that it would be a bad thing if a process could
host more than one Rails application - Dave T. has suggested it
would be good for Rails to have a ‘container’ that applications could be
deployed into, and Why’s sandbox appears to be a step in that direction.
I’m not a newbie to Rails, I’ve been using it for over two years and
I’m quite familiar with the productivity gains. Nowhere have I claimed
otherwise. I’m just having a performance problem even though I have
small audiences and few users. If apps could share processes, it would
decrease costs significantly, since each server could be used with
less overhead.
I picked up on the memory aspect, because it is frequently discussed and
because Tom gave memory use as an example of resource use - but you seem
to be saying that you have performance problems relating to the number
of processes running, before memory becomes a problem. Can you say more
about this?
I’m using mod_ruby for internal applications here and it’s working fine. Version 1.2.6 has a RailsDispatcher that addresses the problems that occurred with shared interpreters in previous versions.
I still haven’t made it available for our customers, but it might be worth giving it a try, since at least in theory it should do what you need.
You seem to be saying that you have performance problems relating to the number of processes running, before memory becomes a problem. Can you say more about this?
I’m the same guy. I’m asking questions related to how to improve the
per-server performance of Rails. Since I’m running several apps, the
server starts a new Ruby process for each app. This means I have
thirty or more Ruby processes running. However, just one or two
processes would suffice to handle the combined load. That means the
server uses 15 to 30 times more processes than necessary, simply
because of how Rails internals work. If Rails apps could share
processes, I would get 15-30 times more performance or more PER
SERVER. That’s one of them… “limitations”, to misquote Bush.
So I’m here to ask if anyone else has run into this particular problem
of running many small apps and having the performance sucked out of
the server NOT due to overwhelming amounts of visitors, but because
apps can’t share processes.
That’s all. I’m already aware of the productivity gains of using
Rails, have been for over two years.
Is mod_ruby even a viable alternative? I’ve never heard of anyone using it for RoR. Actually, I did think about it, but wrote it off based on what I just said. Was that in haste?
Yes, I decided to try it after reading Shugo’s blog.
The only pitfall I found so far is that everything in your application
is “packed” into the Apache::RailsDispatcher::CURRENT_MODULE module. I
had some code that depended on the class of a given object to
choose the action it would take, and I had to fix it by using
“self.class.to_s.demodulize” instead of just “self.class.to_s” to get
just “Foo” instead of “Apache::RailsDispatcher::CURRENT_MODULE::Foo”.
It’s no big issue after you realize what’s going on, I guess.
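For anyone who hits the same thing, here’s a minimal sketch of the difference; the class name Foo is just a placeholder, and inside a Rails app demodulize is already loaded for you via ActiveSupport:

  require 'rubygems'
  require 'active_support'   # provides String#demodulize outside of Rails

  # This is what a class name looks like once RailsDispatcher has wrapped
  # the application in its per-app module:
  qualified = "Apache::RailsDispatcher::CURRENT_MODULE::Foo"

  puts qualified.demodulize  # => "Foo", the plain class name again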
You seem to be saying that you have performance problems relating to the number of processes running, before memory becomes a problem. Can you say more about this?
I’m the same guy.
Tom = Tom M., the person whose answer caused you to ask why he
brought up the issue of speed of development.
I’m asking questions related to how to improve the
per-server performance of Rails. Since I’m running several apps, the
server starts a new Ruby process for each app. This means I have
thirty or more Ruby processes running. However, just one or two
processes would suffice to handle the combined load. That means the
server uses 15 to 30 times more processes than necessary, simply
because of how Rails internals work. If Rails apps could share
processes, I would get 15-30 times more performance or more PER
SERVER. That’s one of them… “limitations”, to misquote Bush.
You would get the same performance with much less memory use. Having 30 idle processes should have negligible impact on performance, unless their memory use is causing paging or swapping.
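To put a rough, made-up number on that caveat: assuming something like 50 MB of resident memory per idle Rails FCGI process (an illustrative figure, not a measurement), thirty single-process apps already fill most of a 1.5 GB box, while a couple of shared processes would barely register:

  # Back-of-the-envelope only; 50 MB per process is an assumed figure.
  mb_per_process = 50
  puts "30 apps, one process each: #{30 * mb_per_process} MB"  # => 1500 MB
  puts "2 shared processes:        #{2 * mb_per_process} MB"   # => 100 MB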
So I’m here to ask if anyone else has run into this particular problem
of running many small apps and having the performance sucked out of
the server NOT due to overwhelming amounts of visitors, but because
apps can’t share processes.
That’s all. I’m already aware of the productivity gains of using
Rails, have been for over two years.