March 8, 2022 at 12:09 pm
Hi All,
My table size is 80 GB, and one column in that table is XML. When I transfer the table from the server to my local machine using SSIS (OLE DB Source and OLE DB Destination), it takes almost 15 hours. How can I reduce that time and transfer the data faster using SSIS? Please see the column list below.
id name xmldata
Please help me
Regards
Pols
March 8, 2022 at 2:54 pm
Without knowing your setup (hardware side), I would say get a faster network connection. Transferring 80 GB in 15 hours is just over 5 GB per hour, which is roughly 0.09 GB per minute (about 91 MB per minute), or about 1.5 MB per second (roughly 12 megabits per second). These are all approximations; I rounded to make the numbers nicer.
I work remotely (work from home) and my internet speed is 10 megabits per second for downloads, so it would take me over 15 hours to pull in those 80 GB.
Now, that being said, it MIGHT not be your network - it could be your memory. SSIS operates as much in memory as it can. If the RAM on the server hosting SSIS is not enough, you may end up paging to disk, which is a slow operation. If I remember right (it's been a while since I did any analysis or investigation on this), SSIS pulls the entire table into memory before it writes it to the destination. So if the table is 80 GB, you NEED to have 80 GB of RAM free on the server hosting SSIS. If you are running SSIS on your desktop, your desktop will need 80 GB of RAM free to hold the data in memory before it can be copied to your database.
Now, to make things faster, what I would recommend is to create a linked server on your local system pointing to the remote system and then just do an INSERT, skipping the SSIS side of things entirely. That removes the overhead of needing the free memory to hold the data for the transfer, and it removes the overhead of creating and maintaining an SSIS package. Mind you, if the problem is the network, that won't help; it will still be slow. You can verify whether memory is the issue by watching it in Task Manager or perfmon to see if the machine maxes out on memory during the transfer.
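Something along these lines is what I mean by the linked server approach - just a rough sketch, and the linked server name REMOTESRV, the data source RemoteHostName, the login, and the database/table names (SourceDb, dbo.BigXmlTable, dbo.BigXmlTable_Local) are all placeholders you would swap for your own:

    -- Run on the local server: create a linked server pointing at the remote SQL Server.
    EXEC sp_addlinkedserver
        @server     = N'REMOTESRV',        -- placeholder name for the linked server
        @srvproduct = N'',
        @provider   = N'MSOLEDBSQL',       -- or SQLNCLI on older installs
        @datasrc    = N'RemoteHostName';   -- placeholder: remote host/instance name

    -- Map a login for the linked server (placeholder credentials).
    EXEC sp_addlinkedsrvlogin
        @rmtsrvname  = N'REMOTESRV',
        @useself     = N'FALSE',
        @locallogin  = NULL,
        @rmtuser     = N'remote_login',
        @rmtpassword = N'remote_password';

    -- Pull the data across in one statement using the four-part name.
    INSERT INTO dbo.BigXmlTable_Local (id, name, xmldata)
    SELECT id, name, xmldata
    FROM REMOTESRV.SourceDb.dbo.BigXmlTable;

For a table this size you would probably want to batch that INSERT (e.g. by ranges of id) rather than run it as one giant statement, but that is the general shape of it.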
My last piece of advice would be to remove all nonclustered indexes on the destination table; these will only slow down your inserts. Mind you, with only 3 columns, and the columns being what they are, I would expect you to have 0 nonclustered indexes and 1 clustered index on name and id.
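If you want to check what is actually on the destination table, something like this lists the nonclustered indexes and disables/rebuilds one around the load - the table and index names are placeholders:

    -- List nonclustered indexes on the destination table (placeholder name).
    SELECT i.name, i.type_desc, i.is_disabled
    FROM sys.indexes AS i
    WHERE i.object_id = OBJECT_ID(N'dbo.BigXmlTable_Local')
      AND i.type_desc = N'NONCLUSTERED';

    -- Disable before the load, rebuild afterwards (disabled indexes are not maintained during inserts).
    ALTER INDEX IX_BigXmlTable_Name ON dbo.BigXmlTable_Local DISABLE;   -- placeholder index name
    -- ... run the load here ...
    ALTER INDEX IX_BigXmlTable_Name ON dbo.BigXmlTable_Local REBUILD;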
Others may have better ideas, but that is what I'd try.
NOTE - when I say "server" above, I mean whichever computer is running your SSIS. It MIGHT be a server doing it or it could be your local machine. I do not know how you have things set up.
The TL;DR version - if it is your network/internet speeds, you may be stuck with it with no way to improve. If it is due to memory constraints (paging to disk), using a linked server instead of an SSIS package may help. If it is due to the inserts themselves, removing nonclustered indexes on the destination may help.
The above is all just my opinion on what you should do.
As with all advice you find on a random internet forum - you shouldn't blindly follow it. Always test on a test server to see if there are negative side effects before making changes to live!
I recommend you NEVER run "random code" you found online on any system you care about UNLESS you understand and can verify the code OR you don't care if the code trashes your system.
March 8, 2022 at 3:05 pm
Another possibility is pushing the contents of the table to a text file on the remote server, zipping it, copying it to the local server, then unzipping and importing.
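To flesh that out a little - a rough sketch only, where the paths, database and table names are placeholders, and the export/zip/copy/unzip steps happen outside T-SQL (e.g. bcp from a command prompt plus your preferred zip tool):

    -- Assumed prior steps, run outside T-SQL on the remote side:
    --   bcp SourceDb.dbo.BigXmlTable out D:\export\bigxmltable.dat -S RemoteHostName -T -n
    --   zip the .dat file, copy it to the local machine, unzip it.

    -- Import the unzipped file on the local server (paths/names are placeholders).
    BULK INSERT dbo.BigXmlTable_Local
    FROM N'D:\import\bigxmltable.dat'
    WITH (
        DATAFILETYPE = 'native',   -- matches the bcp -n (native format) export above
        TABLOCK                    -- table lock helps get faster, minimally logged inserts
    );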
As implied in the previous post, finding where the bottleneck is will help determine the best next steps.
The absence of evidence is not evidence of absence.
Martin Rees
You can lead a horse to water, but a pencil must be lead.
Stan Laurel
March 8, 2022 at 4:59 pm
Quick question, how many rows are in the table?
😎
Given the numbers, you would be much better off doing a backup onto a USB stick, walking it over to the other server (a modern sneakernet) and restoring from the stick. Also, I think you should rephrase the question as "How do I reduce the time to transfer the data?" because in this kind of situation, SSIS is normally not your friend. 😉
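The backup/restore itself is the easy part - roughly like this, where the database name, logical file names and paths are all placeholders:

    -- On the source server: back up to the USB drive (placeholder path).
    BACKUP DATABASE SourceDb
    TO DISK = N'E:\SourceDb.bak'
    WITH COMPRESSION, STATS = 10;

    -- On the local server: restore from the copied file (placeholder logical names and paths).
    RESTORE DATABASE SourceDb
    FROM DISK = N'D:\SourceDb.bak'
    WITH MOVE N'SourceDb'     TO N'D:\Data\SourceDb.mdf',
         MOVE N'SourceDb_log' TO N'D:\Data\SourceDb_log.ldf',
         STATS = 10;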
March 8, 2022 at 5:13 pm
The table has 4,000,000 records. A backup is not possible - the total database size is 1 TB.
March 8, 2022 at 5:16 pm
If the table is converted into a text file, it is almost 60 GB. We can't create a text file every day and restore it on another server - that is also a time-consuming process. Please suggest another way if possible, using only an OLE DB Source and OLE DB Destination in SSIS.
March 8, 2022 at 5:23 pm
Thank you for your suggestion. How do I create the linked server? Please help me with a sample script.
Regards
Pols
March 8, 2022 at 6:59 pm
My opinion - before making a linked server, find your bottleneck. If the bottleneck is the network, the linked server is not going to make things any faster. It will ONLY help if the bottleneck is the memory (ie RAM).
I'd look at your monitoring tools or, if you lack those, at perfmon/Task Manager to determine IF your memory is maxing out during the transfer.
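From the SQL Server side, the standard DMVs give a rough read on memory pressure while the transfer is running - a quick sketch:

    -- Physical memory on the box and how SQL Server rates the current state.
    SELECT total_physical_memory_kb / 1024     AS total_mb,
           available_physical_memory_kb / 1024 AS available_mb,
           system_memory_state_desc
    FROM sys.dm_os_sys_memory;

    -- Memory the SQL Server process itself is using, and whether it considers memory low.
    SELECT physical_memory_in_use_kb / 1024 AS sql_in_use_mb,
           memory_utilization_percentage,
           process_physical_memory_low
    FROM sys.dm_os_process_memory;

Bear in mind the SSIS runtime (dtexec/ISServerExec) runs outside sqlservr.exe, so for the SSIS process itself you would still watch Task Manager or perfmon.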
And even if it IS memory, how much blocking is acceptable? (SELECTs only request a shared lock, but that will still block anything that needs an exclusive lock on the table.) I imagine 15 hours of blocking is going to be problematic, but even if you have a 1 Gbps connection, that is only 125 MB/s, so transferring 1 GB would take roughly 8 seconds and 80 GB roughly 640 seconds (about 11 minutes), assuming you could sustain maximum speed for the entire transfer.
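If you want to see any blocking as it happens, the requests DMV shows who is waiting on whom:

    -- Sessions currently blocked, and which session is blocking them.
    SELECT r.session_id,
           r.blocking_session_id,
           r.wait_type,
           r.wait_time             AS wait_time_ms,
           r.command,
           DB_NAME(r.database_id)  AS database_name
    FROM sys.dm_exec_requests AS r
    WHERE r.blocking_session_id <> 0;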
Before implementing ANY solution, I would try to find the bottleneck and address that. If your car won't start, you don't randomly replace things until it starts; you investigate the problem and pick the most likely solution. If your SQL process is slow, find the problem and fix that first. If you can only pull the data at 10 Mbps and your network team can't help you, nothing we suggest is going to make the data transfer to your PC faster, other than Eirikur Eiriksson's suggestion of a USB stick.
Now, if the network is the problem AND you can't improve it to your machine, your IT team may be able to build you a new VM that you could use that would have a faster link to the database and you could transfer the data there instead of your local machine.
TL;DR - find the bottleneck before trying to solve the problem to prevent extra work!