FTP HELP!!!!

  • Keep it rolling....

  • Any fresh ideas?

  • You say it happens on large datasets. Are your small ones getting through when run through the automated way?

  • Yes, all small data sets go through; 30MB seems to be the breaking point. I have tried using both get and mget, automated and manual, to receive the files... there is no difference.

  • When doing a manual transfer of the small data sets (up to 30MB) on each node, is there any significant difference in speed/throughput?


    Cheers,
    - Mark

  • Speed and throughput are almost identical on both nodes.

  • To the top!

  • Anyone? Any idea will be appreciated...

  • Considering that, I would have to say they have you governed somewhere, either at the FTP server or at a router that is dropping packets. I don't know of any tools offhand, but I think you should be able to find one that sends packets to the FTP server, like ping, and see if you get a large number of dropped packets. Something should be logging errors somewhere on lost packets.
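    The suggestion above can be tried with a short script that runs the system ping and reads back the loss figure. This is a minimal sketch; the hostname `ftp.example.com` is a placeholder for your actual FTP server, and the regex matches the Unix-style ping summary line (Windows ping prints "% loss" and uses `-n` instead of `-c`).

    ```python
    # Rough packet-loss check against the FTP server, per the post above.
    # "ftp.example.com" is a placeholder; substitute your server's hostname.
    import re
    import subprocess

    def parse_loss(ping_output: str) -> float:
        """Extract the '% packet loss' figure from a Unix ping summary line."""
        m = re.search(r"(\d+(?:\.\d+)?)% packet loss", ping_output)
        return float(m.group(1)) if m else -1.0  # -1.0 = summary line not found

    def packet_loss(host: str, count: int = 20) -> float:
        """Ping `host` `count` times and return the reported loss percentage."""
        out = subprocess.run(
            ["ping", "-c", str(count), host],  # Windows: ["ping", "-n", str(count), host]
            capture_output=True, text=True,
        ).stdout
        return parse_loss(out)
    ```

    Anything well above a few percent loss on the path to the server would point at the router theory rather than the FTP server itself.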

  • Seems to me that the FTP server is closing the connection after n time (i.e. shutting down the connection as 'idle'). Probably the FTP server sees the connection as idling even though a transfer is in progress. I'm in no way an FTP guru, just guessing, but I think that when transferring files, the connection doesn't send any 'keep alive' messages to the FTP server, and perhaps this is why it gets shut down at ~30MB..?

    On a side note, I've had the exact same message once when doing manual FTPing - I could connect, but the connection was immediately shut down with the 'connection closed by remote host' message. The quick way around that was turning off the firewall, connecting, and raising the firewall again. After that everything went smoothly.

    /Kenneth
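    One way to test the idle-timeout theory above is to enable TCP keepalives on the FTP control connection, which otherwise sits silent while a long transfer runs on the data connection. A minimal sketch using Python's `ftplib`; the host, credentials, and filenames are placeholders, and whether this helps depends on the firewall honoring keepalive probes.

    ```python
    # Download a file while the OS sends TCP keepalive probes on the
    # otherwise-idle FTP control socket (a test of the theory above).
    # All connection details below are placeholders.
    import socket
    from ftplib import FTP

    def get_with_keepalive(host, user, password, remote_name, local_name):
        ftp = FTP(host)
        ftp.login(user, password)
        # Keep the control connection from looking "idle" during a long RETR.
        ftp.sock.setsockopt(socket.SOL_SOCKET, socket.SO_KEEPALIVE, 1)
        with open(local_name, "wb") as f:
            ftp.retrbinary(f"RETR {remote_name}", f.write)
        ftp.quit()
    ```

    If the transfer still dies at ~30MB with keepalives on, the cutoff is more likely a size quota or a governed link than an idle timer.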

  • No one has ever experienced this?

  • I haven't read the whole thread, but I'm just pitching in with my first idea. Since it seems that you can send up to 30MB, couldn't you make small chunks of, let's say, 25MB and send them one at a time... maybe even disconnect between each file until the process is over (if you still hit the wall after 30MB). And once you are done sending the last file, you could create a dir or send another file like finish_[today's date].txt to confirm a successful transfer.

  • I have tried disconnecting and reconnecting between files and found the same results. I don't think it is possible to have the file chopped up into smaller portions; we are receiving this file from another department / vendor. Could you elaborate on chunking the large files?

    Thank you :: MATT

  • I have never used file-splitting software for automation tasks, but I'm pretty sure WinRAR could do it. Here's a search from download.com that could get you started on other file splitters:

    http://www.download.com/3120-20_4-0.html?qt=file+splitter&tg=dl-2001&search.x=0&search.y=0&search=+Go%21+

    Basically you take your huge file and split it into X parts of the desired size. Then you transfer each part, and after everything has been transferred you use the same program to rejoin the small chunks into the original file.

    Or another option could be to write a small ActiveX script to use in DTS to split the file and FTP it (I'm assuming you are transferring text files). Then you could rejoin the files on the other server using a VBS script or another DTS package.

    However, all this assumes you can send as many small files as needed to the FTP server and that you won't be stopped again at 30MB.

    P.S. If that still fails, I would check whether the FTP server limits the account to 30MB of drive space for uploads... but I guess you already thought of that.
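    The split-and-rejoin idea above can be sketched in a few lines of Python rather than a dedicated splitter tool. The 25MB chunk size follows the earlier suggestion of staying under the ~30MB wall; file names and sizes here are illustrative.

    ```python
    # Split a big file into fixed-size parts and rejoin them later,
    # per the splitting approach described above. Sizes are illustrative.
    CHUNK = 25 * 1024 * 1024  # 25MB, to stay under the ~30MB breaking point

    def split_file(path, chunk=CHUNK):
        """Write path.part000, path.part001, ... and return the part names."""
        parts = []
        with open(path, "rb") as src:
            i = 0
            while True:
                data = src.read(chunk)
                if not data:
                    break
                part = f"{path}.part{i:03d}"
                with open(part, "wb") as dst:
                    dst.write(data)
                parts.append(part)
                i += 1
        return parts

    def join_files(parts, out_path):
        """Concatenate the parts (in name order) back into one file."""
        with open(out_path, "wb") as dst:
            for part in sorted(parts):
                with open(part, "rb") as src:
                    dst.write(src.read())
    ```

    Each part can then be sent with a normal get/mget loop, with the finish_[date].txt marker sent last, as suggested earlier, to confirm the set is complete.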
