I have an application which uses an Oracle database to track over 1,000,000 image and video files that need to be synchronised between 5 servers. Currently I'm using ssh and scp via Perl's system() call, but this makes several connections for each file transferred, which is pretty ugly and not really quick (a lot of time is lost setting up each connection). The script has to be able to create and delete directory hierarchies, transfer and delete files, and handle permissions correctly. The applications that access the files use the same database to determine which files exist on a server, so it's also imperative that all error conditions are handled appropriately. rsync, rdist, rdiff and scp are all of no use in this scenario :-(
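For reference, the current approach boils down to something like the sketch below (host and path names are made up for illustration). The problem is structural: each file gets its own scp invocation, and therefore its own ssh connection setup.

```perl
#!/usr/bin/perl
# Sketch of the current per-file approach (hypothetical host/paths).
# Every transfer pays the full ssh connection setup cost again.
use strict;
use warnings;

# Build the scp command for one file; -p preserves times and modes,
# -B forbids password prompts (batch operation).
sub scp_command {
    my ($file, $host) = @_;
    return ('scp', '-p', '-B', $file, "$host:/data/media/$file");
}

for my $file (qw(clip0001.mpg img0001.jpg)) {
    my @cmd = scp_command($file, 'server2');
    print "@cmd\n";
    # In the real script this is: system(@cmd) == 0 or handle the error
}
```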
So I'm now trying Net::SFTP. First, to install it I ended up having to install 25 other modules and libraries (Math::*, Digest::*, Crypt::* ...). Finally, I'm able to at least use the psftp script included in the eg/ directory. It seems to work, but it can't transfer more than 120 KB/s. Manually using sftp for the same files over the same connection pumps 7 MB/s, nearly 60 times faster, which is why I thought Net::SFTP might be a good option.
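For the record, the single-connection usage I have in mind looks roughly like this, as I read the Net::SFTP docs (host, credentials and paths are placeholders; I haven't verified this against a real server):

```perl
#!/usr/bin/perl
# Untested sketch: one Net::SFTP session reused for many operations
# (placeholder host, credentials and paths).
use strict;
use warnings;
use Net::SFTP;
use Net::SFTP::Attributes;

my $sftp = Net::SFTP->new('server2', user => 'sync', password => 'secret');

# All of these should reuse the same ssh session:
$sftp->do_mkdir('/data/media/2004', Net::SFTP::Attributes->new);
$sftp->put('img0001.jpg', '/data/media/2004/img0001.jpg');
$sftp->do_remove('/data/media/old/img0001.jpg');

# Last SSH2 status code/message, for the error handling the database
# bookkeeping depends on.
my ($code, $msg) = $sftp->status;
```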
Is the psftp script a representative example of Net::SFTP performance? My hope was to replace my existing code with Net::SFTP and manage these files and directories via a single connection to each server. Sadly, it doesn't look like I'm going to gain much by doing so.
I'm wondering if anyone has tried to glue Perl to OpenSSH's sftp code. (Does OpenSSH even provide a library? All I can find is libssl.) Would it then be enough to have sftp installed, or would the sftp sources need to sit next to the Net::SFTP build directory for compiling?
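Short of a real library, the closest gluing I can see is driving OpenSSH's sftp in batch mode: its -b option reads commands from a file, so one connection carries many operations. A Perl driver would look something like this (host and paths invented):

```perl
#!/usr/bin/perl
# Drive OpenSSH's sftp in batch mode (-b): one connection, many commands.
# Host and remote paths are made up for illustration.
use strict;
use warnings;
use File::Temp qw(tempfile);

my ($fh, $batch) = tempfile(UNLINK => 1);
print $fh <<'CMDS';
mkdir /data/media/2004
put img0001.jpg /data/media/2004/img0001.jpg
chmod 644 /data/media/2004/img0001.jpg
rm /data/media/old/img0001.jpg
CMDS
close $fh;

# sftp aborts with a non-zero exit status when a batch command fails,
# so errors are detectable, though not per-operation like a real API.
my @cmd = ('sftp', '-b', $batch, 'sync@server2');
print "would run: @cmd\n";   # real script: system(@cmd) == 0 or handle error
```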
Does anyone have any ideas?
Thanks in advance,