I’m looking to do something similar to the “secret URLs” in Rails
Recipes, however, with file downloads. I want to avoid providing direct
URLs to people if possible.
I have files on disk in public/, so it would be good if each user got a
“unique” URL to each file that they are permitted to see (dealt with by
a user_id/file_id table). Then they are only allowed to download the
file if they are logged in and the secret matches. Otherwise, anybody
could simply point to http://…/example.pdf and download the file,
whether logged in or not.
I could do this with redirect_to I believe, but as far as I know that
just sends a 3xx redirect to the browser which would point to the real
file.
first off, if you don’t want someone to have direct access to the files (i.e. http://www.yoursite.com/file1.xyz, where file1.xyz is stored in the public dir), then don’t store them in public. move them outside of that directory and store the file information (path, name, etc.) in the database.

second, for a simple example, you could store a hash of the filename/mtime/etc. in the database along with the file information. in the controller, make sure you have some sort of security in place that says the person has to be logged in and have access to the file, then you could have a url something like:
class MediaController < ApplicationController
  def download
    # make sure that the user is logged in and has permission to get at this file
    media = Media.find_by_md5_hash(params[:md5_hash])
    if media
      send_file media.filepath
    else
      # oops, no file!
      render :nothing => true, :status => 404
    end
  end
end
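As a sketch of the “hash of the filename/mtime” idea, here is one way you might compute the value stored in md5_hash when a file is registered — the salt and the method name are made up for illustration, not anything Rails provides:

```ruby
require 'digest/md5'

# Compute a per-file secret from the path and mtime, plus an app-specific
# salt so the hash can't be guessed from the filename alone.
def media_hash(filepath, mtime, salt = 'change-me')
  Digest::MD5.hexdigest("#{salt}:#{filepath}:#{mtime.to_i}")
end
```

You would store the result in the media row alongside the real filesystem path, and look it up exactly as the controller above does.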
If you use lighttpd, then I would use mod_secdownload, which does all you want and even more (the generated URLs are only valid for a certain time, which you can set).
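For what it’s worth, here is a sketch of generating a mod_secdownload-style URL from the app side. The secret, prefix, and timeout are placeholder values you would mirror from your lighttpd config, and the token scheme (md5 of secret + path + hex timestamp) is the one mod_secdownload describes:

```ruby
require 'digest/md5'

# lighttpd config this assumes (all values are examples):
#   secdownload.secret        = "s3cr3t"
#   secdownload.document-root = "/srv/protected/"
#   secdownload.uri-prefix    = "/dl/"
#   secdownload.timeout       = 600
def secure_download_url(rel_path, secret = 's3cr3t', prefix = '/dl/')
  timestamp = Time.now.to_i.to_s(16)  # lowercase hex, as lighttpd expects
  token = Digest::MD5.hexdigest(secret + rel_path + timestamp)
  "#{prefix}#{token}/#{timestamp}#{rel_path}"
end
```

lighttpd recomputes the md5 from its own copy of the secret and rejects the request once the timestamp is older than secdownload.timeout, so Rails never has to serve the bytes itself.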
Never write a line of code if you can avoid it.
Fixing a typo!

Only question is: why make a new route? The default route would work just as well, and the hash would end up in params[:id].

P.S. Do not use @params! That directly accesses the params hash, but I’ve heard core team members recommend the params method call, to insulate yourself from possible future changes to the internal representation.
and your route might be:

map.connect "/media/download/:md5_hash",
  :controller => "media", :action => "download"
–
– Tom M.
Keep in mind that using send_file like this with large files will tie up one fcgi listener for each download that is happening, for the length of the download. So if you have five users downloading one large file each, then you will have 5 fcgi processes tied up until they finish! So mod_secdownload is a good option to look at if you are using lighttpd. Otherwise you can easily run out of fcgi’s and lock up your whole app!
Does anybody know how large Lighttpd’s FCGI buffer is, and whether
or not it’s configurable?
–
– Tom M.
I’m fairly certain that send_file with lighttpd will just tie up the fcgi procs, but I would love to be proven wrong. I haven’t seen any options to configure fcgi buffers anywhere.
If you’re using a front-end proxy, this may not be the case. I seem to remember that Apache+mod_proxy allows you to specify the buffer size, and if the buffer was large enough, you could deliver large files into that buffer and free up the backend process quickly.
Does anybody know how large Lighttpd’s FCGI buffer is, and whether or
not it’s configurable?
I cannot prove this directly, but if Lighty properly implements the FCGI spec as recommended, then the FCGI response should be buffered by the HTTP server.
I’ve written an RSS feed parser/downloader using Ruby, ActiveRecord, and http-access2, without Rails. I want this code to always be running in a loop, downloading feeds on a schedule. I started by coding the back-end (downloading feeds and putting the titles, links, and descriptions into a database). Now I’m wondering, is there an easier way…

I suppose it is possible to invoke this code as needed when a user loads a page; however, it seems like it’d be easier if there was a Rails front-end that managed the feed urls, but not the actual feed data itself.
I guess my question is: can a pure Ruby process that handles this perpetual munging of RSS data be spawned from within Rails? Or is it better to just spawn this back-end code and run a database updater in an infinite loop?
Does anyone know how Odeo handles this?
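One low-tech alternative to a long-lived loop is to let cron drive the updater through script/runner, which loads the full Rails environment (so your models are available) and then runs a one-off expression. A sketch, where FeedUpdater.update_all_feeds and the paths are hypothetical names for your own code:

```shell
# crontab entry: poll feeds every 15 minutes and append output to a log
*/15 * * * * /path/to/app/script/runner -e production 'FeedUpdater.update_all_feeds' >> /path/to/app/log/feeds.log 2>&1
```

This sidesteps the “infinite loop inside Rails” question entirely: each run starts clean, and a crashed run doesn’t take the poller down for good.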
Thanks very much for this (the send_file example). The site will be very low usage, so it doesn’t matter too much if the process is tied up for a while. I will investigate the other modules suggested too, to see whether they’re easy enough to use for the site.
I may be missing something, but here is how I may be solving a similar problem for an upcoming project. Downloading files will be a large part of my application.

Have two web servers, mapped to different IPs.

The application server (IP: 192.168.2.1, myapp.mydomain.com) will use one of the yet-to-be-determined scalable Rails implementations.

The download server (IP: 192.168.2.2) will use IIS or Apache.

I’ll use one of the Apache plugins to “expire” certain downloads or, in the case of IIS, will write a simple scheduled task to crawl the site and delete old files.

For my purposes, I’m finding it easier to manage downloads on a separate non-Rails server.
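The “delete old files” task can be a few lines of Ruby run from cron or a scheduled task. A sketch, where the directory layout and the seven-day cutoff are arbitrary assumptions:

```ruby
require 'find'

# Delete every regular file under dir whose mtime is older than
# max_age seconds; run periodically against the download directory.
def expire_old_downloads(dir, max_age = 7 * 24 * 60 * 60, now = Time.now)
  Find.find(dir) do |path|
    next unless File.file?(path)
    File.delete(path) if now - File.mtime(path) > max_age
  end
end
```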
If you’re using Apache, mod_auth_token supports the same functionality and interface as mod_secdownload, e.g. allowing you to secure and expire downloads without having to pipe the file through your script.
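If it helps, the Apache side of mod_auth_token looks roughly like the following (from memory, so treat it as a sketch; the secret, location, and timeout are placeholders to adjust):

```apache
<Location /downloads/>
  AuthTokenSecret  "a-shared-secret"
  AuthTokenPrefix  /downloads/
  AuthTokenTimeout 3600
</Location>
```

Your app then generates URLs with the same md5(secret + path + hex timestamp) scheme as mod_secdownload, and Apache serves the file directly.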