December 15, 2004 at 7:13 am
Routes seem to be the same from both SQL boxes to the Unix box.
Thank you for all the help, guys... keep the ideas coming. I appreciate all your time.
December 16, 2004 at 7:49 am
Keep it rolling....
December 17, 2004 at 12:23 pm
Any fresh ideas?
December 17, 2004 at 12:37 pm
You say it happens on large data sets. Are your small ones getting through when run through the automated way?
December 17, 2004 at 1:47 pm
Yes, all small data sets go through; 30 MB seems to be the breaking point. I have tried using both get and mget, automated and manual, to receive the files... there is no difference.
December 17, 2004 at 5:59 pm
When doing a manual transfer of the small data sets (up to 30MB) on each node, is there any significant difference in speed/throughput?
Cheers,
- Mark
December 18, 2004 at 10:22 pm
Speed and throughput are almost identical on both nodes.
December 22, 2004 at 7:50 am
To the top!
December 31, 2004 at 11:40 am
Anyone? Any idea will be appreciated...
January 3, 2005 at 6:09 am
Anyone? Any idea will be appreciated...
January 3, 2005 at 7:25 pm
Considering that, I would have to say something has you governed somewhere, either at the FTP server or at a router that is dropping packets. I don't know of any tools offhand, but you should be able to find one that sends packets to the FTP server, like ping, and see if you get a large number of dropped packets. Something should be logging errors somewhere on lost packets.
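A rough sketch of that idea, using repeated TCP connects to the FTP control port instead of ICMP ping (the host name is a placeholder; a high failure count would point at packet loss or filtering between the boxes):

```python
import socket

def probe(host, port=21, attempts=20, timeout=2.0):
    """Open `attempts` TCP connections to host:port and count how many fail.
    Each successful connect completes a full TCP handshake, so repeated
    failures suggest drops or filtering on the path, not just the server."""
    failures = 0
    for _ in range(attempts):
        try:
            s = socket.create_connection((host, port), timeout=timeout)
            s.close()
        except OSError:
            failures += 1
    return attempts - failures, failures

# Example (hypothetical host):
# ok, failed = probe("unixbox.example.com")
# print("%d ok, %d failed" % (ok, failed))
```

This won't catch loss on the data connection itself, but it's a quick first check you can run from each SQL box.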
January 4, 2005 at 2:04 am
Seems to me that the FTP server is closing the connection after n time (i.e. shutting down the connection as 'idle'). Probably the FTP server sees the connection as idling even though a transfer is in progress. I'm in no way any guru on FTP, just guessing, but I think that when transferring files, the connection doesn't send any 'keep alive' messages to the FTP server, and perhaps this is why it gets shut down at ~30 MB?
On a side note, I've had the exact same message once when doing manual FTP'ing - I could connect, but the connection was immediately shut down with the 'connection closed by remote host' message. The quick way around that was turning off the firewall, connecting, and raising the firewall again. After that everything went smoothly.
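If you want to test the idle-timeout guess, here's a minimal sketch that enables TCP keep-alive on the FTP control socket (host and credentials are placeholders; this assumes Python's ftplib is available on the box):

```python
import socket
from ftplib import FTP

def open_ftp_with_keepalive(host, user, password):
    """Connect and turn on TCP keep-alive for the control connection, so a
    server or firewall idle-timer is less likely to drop it mid-transfer."""
    ftp = FTP(host)
    ftp.login(user, password)
    # The control socket sits idle during a long data transfer; keep-alive
    # probes give the middleboxes traffic to see.
    ftp.sock.setsockopt(socket.SOL_SOCKET, socket.SO_KEEPALIVE, 1)
    return ftp
```

If the transfers suddenly make it past 30 MB with this on, the idle-timeout theory is probably right.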
/Kenneth
January 6, 2005 at 6:57 am
No one has ever experienced this?
January 6, 2005 at 7:37 am
I haven't read the whole thread, but I'm just pitching in with my first idea. Since it seems that you can send up to 30 MB, couldn't you make small chunks of, let's say, 25 MB and send them one at a time... maybe even disconnect between each file until the process is over (if you still hit the wall after 30 MB). And once you are done sending the last file, you could create a dir or send another file like finish_[today's date].txt to confirm a successful transfer.
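A minimal sketch of the chunking, assuming Python is an option (the 25 MB size and the part-file naming are just illustrations):

```python
CHUNK_SIZE = 25 * 1024 * 1024  # stay under the ~30 MB ceiling

def split_file(path, chunk_size=CHUNK_SIZE):
    """Split `path` into numbered .partNNN files; return the part names."""
    parts = []
    with open(path, "rb") as src:
        index = 0
        while True:
            data = src.read(chunk_size)
            if not data:
                break
            part_name = "%s.part%03d" % (path, index)
            with open(part_name, "wb") as dst:
                dst.write(data)
            parts.append(part_name)
            index += 1
    return parts
```

Each part is then sent as an ordinary FTP put (disconnecting between parts if needed), followed by the finish_[date].txt marker; on the receiving side the parts go back together with `copy /b` on Windows or `cat` on Unix.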
January 6, 2005 at 7:49 am
I have tried disconnecting and reconnecting between files and found the same results. I don't think that it is possible to have the file chopped up into smaller portions. We are receiving this file from another department / vendor. Could you elaborate on chunking the large files?
Thank you :: MATT