The Artima Developer Community

Ruby Buzz Forum
Reworking Net::SFTP to handle large file downloads

0 replies on 1 page.

Matt Parrish

Posts: 12
Nickname: mparrish
Registered: Jun, 2007

Matt Parrish is founder and lead developer of Pearware LLC, a Ruby on Rails web development company
Reworking Net::SFTP to handle large file downloads Posted: Jun 26, 2007 2:47 PM

This post originated from an RSS feed registered with Ruby Buzz by Matt Parrish.
Original Post: Reworking Net::SFTP to handle large file downloads
Feed Title: pearware blog - agile web development
Feed URL: http://blog.pearware.org/feed/atom.xml
Feed Description: Agile web development using Ruby on Rails (and sometimes Java)
Latest Ruby Buzz Posts
Latest Ruby Buzz Posts by Matt Parrish
Latest Posts From pearware blog - agile web development


I’m writing an application that downloads access logs from our production servers and runs the AWStats package against them to create the statistics web pages. The process is set up as a Rake task that uses the Net::SFTP library (written by Jamis Buck and also used by Capistrano), and there is a front-end Rails application to manage each of the applications whose logs are retrieved. Everything was working great until I tried to grab a 550MB file from one of our servers: Net::SFTP choked as it ran out of memory.

It turns out that the command:


sftp.get_file log_file, local_file
ends up loading the whole file into memory, which is fine for small files, but not for the 550MB log I was trying to download. Luckily it wasn’t too bad to refactor my class. Here’s the new code to achieve the same effect as the sftp.get_file command above.

        stat = sftp.stat( log_file )
        offset = 0
        file_length = stat.size
        length = 64 * 1024 * 1024   # read in 64MB chunks

        File.open(local_file, File::CREAT|File::TRUNC|File::RDWR, 0644) do |f|
          # Open the remote file once and read it chunk by chunk, so only
          # one chunk is ever held in memory (the original version reopened
          # the handle on every iteration, which works but is wasteful).
          sftp.open_handle(log_file) do |handle|
            while offset < file_length
              data = sftp.read(handle, :length => length, :offset => offset)
              f.write(data)
              offset += data.length
            end
          end
        end

This downloads the file in 64MB increments, holding at most one chunk in memory at any time.
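The chunk arithmetic driving that loop can be sketched as a standalone helper, independent of SFTP, which makes it easy to convince yourself that the offsets and lengths tile the file exactly. The method name each_chunk is my own for illustration; it is not part of Net::SFTP:

```ruby
# Yield [offset, length] pairs that cover a file of file_length bytes
# in chunk_size-byte pieces; the final chunk may be shorter.
def each_chunk(file_length, chunk_size)
  offset = 0
  while offset < file_length
    length = [chunk_size, file_length - offset].min
    yield offset, length
    offset += length
  end
end

# For example, a 150-byte file read in 64-byte chunks:
chunks = []
each_chunk(150, 64) { |off, len| chunks << [off, len] }
# chunks is [[0, 64], [64, 64], [128, 22]]
```

Substituting sftp.read(handle, :length => len, :offset => off) for the yield gives you the download loop above; the same pattern works for any chunked transfer.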

Read: Reworking Net::SFTP to handle large file downloads


Copyright © 1996-2019 Artima, Inc. All Rights Reserved. - Privacy Policy - Terms of Use