Jorge Alvarez

attachment_fu s3 uploads with backgroundjob

Posted in Development, files, Internet, rails, uploads by jorgegorka on 17/07/2008

Thanks to Jon Guymon for his article bj-makes-attachment_fu-happy, which really helped me solve a problem I was having with Mongrel.

attachment_fu + S3 sometimes does very weird things that make Mongrel freeze. After a lot of googling I realized that the best way to avoid these problems was to do the uploads to S3 in a background process, separate from Mongrel.

I started with Jon Guymon's approach and it worked well, but I needed to create thumbnails and have the local files deleted once the upload to S3 finished.

Just changing the part of attachment_fu that actually uploads the file to S3 does the trick.

Open vendor/plugins/attachment_fu/lib/technoweenie/attachment_fu/backends/s3_backend.rb and change the method save_to_storage to background the uploads.

This is the original method:

def save_to_storage
  if save_attachment?
      full_filename,
      (temp_path ? : temp_data),
      :content_type => content_type,
      :access => attachment_options[:s3_access]
  @old_filename = nil

In my version I copy the file to a temporary directory, tmp/s3uploading (just to make sure the file does not disappear), and then add the upload task to the background queue:

def save_to_storage
  if save_attachment?
    my_temp_file = RAILS_ROOT + '/tmp/s3uploading/' + "#{rand}#{filename || 'attachment'}"

    if temp_path, "w+") do |tmp|
        FileUtils.cp temp_path, tmp.path
    else, "w+") do |tmp|
        tmp.write temp_data

    Bj.submit("./script/runner ./jobs/s3_uploader.rb " +
              full_filename + " " +
              my_temp_file + " " +
              bucket_name + " " +
              content_type + " " +
              attachment_options[:s3_access].to_s)
  @old_filename = nil

This way attachment_fu will spawn a task for every file it creates.

Now edit the file Jon Guymon created to handle the upload (jobs/s3_uploader.rb)

This is what my file looks like:

AWS::S3::Base.establish_connection!(
  :access_key_id     => ACCESS_KEY,
  :secret_access_key => SECRET_KEY)

  ARGV[0],[1]),
  :content_type => ARGV[3],
  :access => ARGV[4].to_sym)[1])


A simple upload to S3; after it finishes I delete the temporary file created in tmp/s3uploading/.

So far Mongrel is doing its job with no more hangs, and as a side effect users can upload their files faster.

There are a lot of pages about backgrounding tasks, and there is a good recipe in Rails Recipes 2.


6 comments


  1. Robert said, on 28/07/2008 at 7:54 PM

    This is a much better approach. No extra ‘updated’ column and more seamless.

    A typo:

    @old_filename = nil
    end <<<< remove this

    Trying to get this to work. For me it seems to be failing with exit_status=10752. How/where can I find out what this status means?

  2. Robert said, on 28/07/2008 at 8:05 PM

    Looks like my comment got formatted wrong. Two fixes: remove “require ‘upload_file'” from s3_uploader.rb. AND remove the typo (the last extra ‘end’ from save_to_storage in your example). All works for me now.

  3. Jorge said, on 02/09/2008 at 11:10 PM

    thank you robert, typo fixed

  4. mmo said, on 19/09/2008 at 2:20 AM

    Smart thinking on the work around. Thanks for the heads up.

  5. Morten said, on 02/10/2008 at 10:05 PM

    Great post! But how do I serve up the locally stored image (+ thumbnail) while the image is being uploaded to S3?

  6. Jorge said, on 21/10/2008 at 10:10 AM

    Hi Morten: With this approach you can’t. You must wait for the upload to complete before being able to serve the image(s).

Comments are closed.
