I’m coming from a Ruby perspective here, and I’m looking to implement an HTTP-based web service which needs to use both Ruby code and a Java library.
The latter has a very large memory requirement (gigabytes) that precludes the usual Ruby pack-of-mongrels-in-separate-processes approach. Further, I have no need for the overhead of the Rails framework here. There’s no database, no templates, no HTML.
I’m looking for advice on a lightweight webserver framework I can put this service code into such that:
It can handle a large number of concurrent requests (naturally).
The Java library is loaded into shared memory such that multiple request handlers do not multiply the memory usage. (Additional points for JRuby here, since we’ve got a real threading model within a single process.)
Unfortunately, all the documentation I’m finding for JRuby webservers
has been specific to Rails.
I’m not averse to writing my own thread-pool management for handling concurrent requests, but I’d prefer not to have to deal with the socket handling and HTTP processing aspects.
So far, my tests with WEBrick indicate that it’s not capable of having more than one request processed at once. (I find this odd, since it has a MaxClients setting - there may be user error here?)
EventMachine sounds good on the surface for handling concurrent requests within a process, but it’s only suited to cases where we’re waiting on a socket (e.g., a database query), not waiting on CPU as would be the case here.
I don’t mind getting Rack compatibility, but Sinatra (using WEBrick) shows the same concurrency issue I was seeing with plain WEBrick.
Specifically, given a program like this:
require 'rubygems'
require 'sinatra'

get '/' do
  sleep(5)
  'hello world'
end
If I make two near-simultaneous requests, the second request takes 10
seconds to complete. I haven’t dug into the code, but this implies to me
that the server-code block (the work of generating a response) is not
being spun into its own thread, and is blocking the server’s execution
of other requests.
I can prepare a thread pool, and I can throw the actual ‘hello world’
operation into a thread from that pool, but that doesn’t help me if the
server is still blocked waiting for my operation-thread to complete so
it can return and get another request.
I’m hoping I’m missing something obvious here. Thin solves this by using EventMachine to manage other requests while waiting for your code to return a value. I had presumed there would be thread-based solutions to the same problem that I could conveniently use with JRuby.
Additionally, if you don’t need all the Rails overhead and are going Rack-based… look into Sinatra as your “webservice” sub-container. It’s a simple request handler that doesn’t come with lots of overhead (essentially… URI → method routing). From here you can manage how you access your library (and you can manage your “thread-safetyness”).
Jay
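For what it’s worth, the contract Sinatra ultimately satisfies is Rack’s: any object responding to call(env) and returning [status, headers, body] is a valid app, so the service layer stays light with or without Sinatra’s routing sugar. A bare sketch (HeavyLibService and /lookup are made-up names):

```ruby
# A Rack-compatible handler with no framework at all: Sinatra's
# URI -> method routing boils down to dispatching on env values
# like this. HeavyLibService and the paths are hypothetical.
class HeavyLibService
  def call(env)
    case env['PATH_INFO']
    when '/lookup'
      [200, { 'Content-Type' => 'text/plain' }, ['result']]
    else
      [404, { 'Content-Type' => 'text/plain' }, ['not found']]
    end
  end
end
```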
On Thu, May 27, 2010 at 12:31 PM, Bob McWhirter [email protected] wrote:
If you run via JRuby-Rack (or another rack-happy container, such as
TorqueBox), the servlet container will give you discrete threads, and
you’ll see good concurrency.
Do not try to just run mongrel/thin/etc right under JRuby.
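The rackup file such a container loads is the same either way; a minimal config.ru might look like this (a sketch, not TorqueBox or JRuby-Rack documentation; MyService and the file name are placeholders for whatever wraps the Java library):

```ruby
# config.ru -- loaded by the Rack-aware container. The servlet
# container dispatches each request on its own thread, so MyService
# must be thread-safe. MyService is a hypothetical handler class.
require File.expand_path(File.join(File.dirname(__FILE__), 'my_service'))

run MyService.new
```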
I’m not sure if it’ll help, but I’ve got a really lightweight Jetty 7 Rack adapter that I was planning on APLing and throwing on GitHub in the near future. Coupled with Sinatra, it might do what you need.
The project doesn’t have a name yet (um, any idea what I should call this?), but in some quick-and-dirty benchmarks on my MacBook it could push 8500 req/sec (Thin did 7000, rack-handler-jetty did about 5500), so it’s pretty quick. Internally, it uses Jetty’s NIO SelectChannelConnector, so concurrency is good; I hit it with everything ApacheBench could dish out, and it was fine.
That said, I expect better results on a Linux machine, because OS X doesn’t have epoll.
Basically, it’s node.js-like performance for JRuby, which I think is an acceptable start.
It doesn’t use JRuby-Rack, because I’ve built a tightly-integrated servlet bridge between Jetty and Rack. This does mean that you can’t use it in a normal Java web container (which you can with JRuby-Rack), but you do get better performance and support for suspendable servlets via Servlet 3.0 / Jetty Continuations.
Don’t forget you can also use Rails with a “Rack Metal” service. Metal inside of Rails lets you bypass all the routing and the performance sluggishness of ActiveSupport, which gives amazingly good performance. We use it all the time under Tomcat 5; works great.
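For anyone following along, a Rails 2.3-era Metal endpoint is just a Rack app dropped into app/metal/; it answers matching paths itself and returns 404 to cascade everything else back to the regular Rails dispatcher. A sketch (Ping and /ping are made-up names):

```ruby
# app/metal/ping.rb -- bypasses Rails routing entirely. Returning
# 404 tells the Metal cascade to fall through to the next app
# (ultimately the normal Rails stack).
class Ping
  def self.call(env)
    if env['PATH_INFO'] =~ %r{^/ping}
      [200, { 'Content-Type' => 'text/plain' }, ['pong']]
    else
      [404, { 'Content-Type' => 'text/html' }, ['Not Found']]
    end
  end
end
```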
I got pulled into something else after starting this thread last month,
and didn’t feel like I should respond more until I’d done the due
diligence. In the interests of completeness though, here’s an update.
Some early web searches on my part had indicated the pack-of-mongrels approach common in Ruby was also used in JRuby. Since this implied to me that mongrel wasn’t threading, I had skipped past it until one of my coworkers actually cracked open the JRuby mongrel gem and found the Thread.new.
So it looks like we’re just going to do that: start mongrel with a simple request handler, and let the Java threading do the work you’d normally want multiple Ruby processes for. This has the advantage that the deployment is very Ruby-ish (no WAR-creation step needed, for example), and we get to re-use some existing mongrel-server-setup code.
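The pattern that gem uses internally is just accept-then-Thread.new, which is easy to see in miniature with nothing but the stdlib (the canned response and names below are illustrative; real mongrel does full HTTP parsing):

```ruby
require 'socket'

# Toy accept loop in the same shape as JRuby mongrel's: the listener
# thread hands each connection to a fresh thread, so a slow handler
# doesn't block new connections. Response body is a placeholder.
server = TCPServer.new('127.0.0.1', 0)   # port 0: let the OS pick
PORT = server.addr[1]

LISTENER = Thread.new do
  loop do
    client = server.accept
    Thread.new(client) do |conn|
      conn.gets                          # consume the request line
      conn.write "HTTP/1.1 200 OK\r\nContent-Length: 5\r\n\r\nhello"
      conn.close
    end
  end
end
```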
-Bob
I’ve got a Jetty-based solution if you’re interested. It works just like mongrel or thin, so no WAR files, but with async support and node.js-level performance.