Is there an existing tool which can be used to download big files over a bad connection?

I have to regularly download a relatively small file: 300 MB, but the slow (80-120 KBytes/sec) TCP connection randomly breaks after 10-120 seconds. (We contacted their admins (working from India) multiple times, but they can't or don't want to do anything.) The problem might be with their reverse proxies / load balancers.

Up until now I used a modified version of pcurl. I had to add --speed-limit 2048 --speed-time 10, because the connection mostly just hangs for minutes when it fails. But recently even this script can't complete.

One problem is that it seems to ignore the -C - part, so it doesn't "continue" a segment after a retry; it truncates the corresponding temp file and starts from the beginning after each failure. (I think the --range and -C options cannot be used together.) The other problem is that the script downloads all segments at the same time: it cannot have 300 segments of which only 10 are being downloaded at a time.

I was thinking of writing a download tool in C# for this specific purpose, but if there's an existing tool, or if the curl command could work properly with different parameters, then I could spare some time.

UPDATE 1: Additional info: the parallel download functionality should not be removed, because they impose a bandwidth limit (80-120 KBytes/sec, mostly 80) per connection, so 10 connections can give a 10x speedup. I also have to finish the download within 1 hour, because the file is generated hourly.

---

I had the same problem in my previous job (except with 300 GB+ offsite database backups on a connection that was unstable from the office). Users had grave problems downloading files bigger than approx. Since they used standard Windows copy/paste of files over an RDP connection, no wonder.

One thing I found out was that our VPN settings were completely mismatched with the network setup (mainly the MTU length). The second thing is that Windows' file copier is NOT made for copying stuff over the internet.

My first solution was a simple FTP server; however, it didn't solve the problem of transmission time (often 3-4 hours on our connection). My second solution was to use Syncthing to send the files directly to an in-house NAS.
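The workaround the question is circling around, since --range and -C - don't combine, is to recompute the remaining byte range from the size of the partial temp file before each retry. A minimal sketch of that idea is below; the URL, segment size, and file names are hypothetical placeholders (this is not the original pcurl script), and the 10-way parallelism is left out for brevity (segments are fetched one after another here):

```shell
#!/bin/sh
# Sketch: segmented, resumable download with curl.
# Assumptions: the server supports Range requests and reports Content-Length.

# Emit "start-end" byte ranges covering TOTAL bytes in SEG-byte segments.
ranges() {
    total=$1 seg=$2 s=0
    while [ "$s" -lt "$total" ]; do
        e=$((s + seg - 1))
        [ "$e" -ge "$total" ] && e=$((total - 1))
        printf '%s-%s\n' "$s" "$e"
        s=$((e + 1))
    done
}

# Fetch one byte range into $3, resuming from its current size after each
# break. The remaining range is recomputed instead of relying on -C -.
fetch_range() {
    url=$1 range=$2 tmp=$3
    first=${range%-*} last=${range#*-}
    while :; do
        have=0
        [ -f "$tmp" ] && have=$(wc -c < "$tmp")
        [ $((first + have)) -gt "$last" ] && return 0   # segment complete
        # Abort transfers that stall below 2 KB/s for 10 s, then retry.
        curl --fail --silent --speed-limit 2048 --speed-time 10 \
             --range "$((first + have))-$last" "$url" >> "$tmp" && return 0
        sleep 2
    done
}

main() {    # usage: main URL OUTFILE
    url=$1 out=$2
    seg=$((10 * 1024 * 1024))   # 10 MiB per segment (arbitrary choice)
    size=$(curl --silent --head "$url" | tr -d '\r' |
           awk 'tolower($1) == "content-length:" { print $2 }')
    i=0
    for r in $(ranges "$size" "$seg"); do
        fetch_range "$url" "$r" "$out.part$(printf '%06d' "$i")"
        i=$((i + 1))
    done
    cat "$out".part?????? > "$out"   # zero-padded names keep glob order correct
}

# Example invocation (placeholder URL):
# main "https://example.com/big.file" "big.file"
```

Limiting the download to, say, 10 concurrent segments could then be layered on top by running fetch_range in background jobs and waiting whenever 10 are in flight, rather than launching all segments at once.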