20

The file is around 170 GB. I don't want to just upload it to an FTP server and download it on the other end: that isn't reliable, connections sometimes drop, and with a file this large that causes big trouble. Is there a better way to do this? Any suggestions? Thanks.

P.S.: The two computers are not on the same network.

Jens Erat
Ted Wong

6 Answers

27

Weirdly enough, BitTorrent might work pretty well here, assuming office policies allow it: it breaks the file up for you, checks whether each piece arrived correctly, and re-downloads any piece that didn't. You will probably want to run your own tracker, but many BitTorrent clients can do that anyway, and if possible use web seeds to speed things up even more (burnbit makes this easy).
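As a minimal sketch of that setup (not from the original answer; the tracker URL and file names are placeholders), Transmission's transmission-create tool can build a torrent that announces to your own tracker:

    # Create a torrent for the big file, announcing to your own tracker
    # (http://my-tracker.example:6969/announce is a placeholder URL)
    transmission-create -o bigfile.torrent \
        -t http://my-tracker.example:6969/announce \
        bigfile.bin

The receiving side then opens bigfile.torrent in any client; pieces that fail their hash check are simply fetched again.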

As an alternative, I'd also suggest doing the old-school pirate thing: split the file up, create a parity file, and transfer the pieces by any means you have, FTP or a web server. If you use a web server, downloads can be resumed with something like wget, and the parity archive lets you rebuild the file even if a few pieces arrive broken.
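A hedged sketch of that workflow with the standard split and par2cmdline tools (the 1 GB chunk size and 10% redundancy are arbitrary choices):

    # Sender: cut the file into 1 GB pieces and add 10% parity data
    split -b 1G bigfile.bin chunk_
    par2 create -r10 bigfile.par2 chunk_*

    # Receiver: repair any damaged pieces, then reassemble
    par2 repair bigfile.par2
    cat chunk_* > bigfile.bin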

Naturally, consider encrypting the files or file chunks if the data is of a sensitive nature.
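If you go that route, a minimal example with GnuPG's symmetric mode (the passphrase then has to be shared over some other channel):

    # Encrypt with a passphrase; produces bigfile.bin.gpg
    gpg --symmetric --cipher-algo AES256 bigfile.bin

    # Decrypt on the receiving end
    gpg --output bigfile.bin --decrypt bigfile.bin.gpg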

A more recent option may be BitTorrent Sync. It runs on everything but the toaster (unless your toaster runs Windows, Linux on x86, PPC, or ARM) and handles most of the grunt work for you. It uses the underlying BitTorrent protocol but is a lot simpler to use.

Journeyman Geek
20

Take a look at robocopy. It supports restarting interrupted copies, and in general it is a lot more stable than the other options.
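For instance, a sketch with placeholder paths (/Z enables restartable mode, /R and /W tune the retry count and wait time between retries):

    robocopy C:\outbox \\remote-host\share bigfile.bin /Z /R:100 /W:10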

4

WinZip can split an archive into lots of (relatively) little files; the feature was originally designed for spanning floppy disks, but you can choose chunks up to 1 GB now. A non-resumable FTP transfer (which is what the built-in Microsoft option gives you) is then fine, because only a failed chunk needs to be re-sent. Finally, WinZip reassembles the file at the other end.
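If you'd rather script it, the free 7-Zip command line can do the same kind of volume splitting (a sketch; -v1g sets 1 GB volumes):

    # Produces bigfile.7z.001, bigfile.7z.002, ...
    7z a -v1g bigfile.7z bigfile.bin

    # Extract on the other end, with all volumes in the same directory
    7z x bigfile.7z.001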

Mark Hurd
4

Robocopy and BitTorrent have already been suggested and sound like a good idea. Here are some other options that may work better in a restrictive network environment where you cannot, e.g., create the SMB connection that Robocopy seems to require:

FTP. I know you don't like it, but with a good server and client it should work well. Create an FTP server on either the source or the recipient, make sure it supports encrypted connections (to prevent transparent proxies etc. from interfering) and files larger than 4 GB, then upload/download the file using a good FTP client (making sure to use binary mode). FTP supports connection resuming, so if the connection drops, just resume. A current copy of wget should be fine.
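For example (hostname and credentials are placeholders; -c tells wget to continue a partial download after a drop):

    wget -c --ftp-user=alice --ftp-password=secret \
        ftp://files.example.com/bigfile.bin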

You can do the same with HTTP: set up an HTTP(S) server that supports large files, and download the file with a current copy of wget.
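As a quick sketch, if Python happens to be available on the sending machine, its built-in web server is enough for a one-off transfer (it serves plain HTTP, so only use it over a link you trust or put TLS in front):

    # On the sender, in the directory containing the file
    python -m http.server 8000

    # On the receiver; re-run with -c after any dropped connection
    wget -c http://sender.example.com:8000/bigfile.bin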

Otherwise, there are rsync binaries for Windows and numerous proprietary rsync-like programs that you could use. Especially if you expect the file to be updated later, with only small portions changing, it is worth looking in that direction.
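A hedged example of a resumable rsync push over SSH (host and paths are placeholders; -P keeps partial files and shows progress, and --append-verify continues an interrupted transfer where it left off):

    rsync -P --append-verify bigfile.bin user@receiver.example.com:/data/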

Remember that at 8 Mbit/s (1 MByte/s), it will take about two days to transfer the file (170 GB ÷ 1 MB/s ≈ 170,000 s ≈ 47 hours). Unless you have a really fast connection, shipping a physical hard drive with a copy of the file(s) may be faster.

Jan Schejbal
1

Some things that come to my mind are private P2P networks (uTorrent, DC++) or a tiny HTTP server combined with wget.

Midhat
0

I would suggest using a Windows tool based on rsync: http://en.wikipedia.org/wiki/Rsync