Transferring millions of rows of table data across SQL Server 2005

  • I am trying to transfer table data from one SQL Server to another; both are SQL Server 2005. Please note, I don't have access to SSIS packages.

    Last week I migrated 4 million rows through DTS. It took me 7 hours.

    Now I have to migrate 35 million rows, and I cannot take more than 7 hours for this.

    Please share your inputs/ideas. Is there a simpler way to do it? Do you have scripts or something you could share with me? Any help would be greatly appreciated.

    Thanks a lot.

    M&M

  • Try using BCP for this. It is a lot faster than either DTS or SSIS for jobs like this.

    By the way, how is it that you have access to DTS on SQL 2005 but not to SSIS?

    -- RBarryYoung, (302)375-0451, blog: MovingSQL.com, Twitter: @RBarryYoung
    Proactive Performance Solutions, Inc.
    "Performance is our middle name."
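
    The BCP suggestion above can be sketched as a pair of command lines. This is a minimal sketch, assuming Windows authentication (-T); the server names, database, table, and file path are placeholders, not names from this thread:

    ```shell
    # Export in native format (-n) from the source server; -T uses a trusted
    # (Windows) connection. SRCSERVER, DESTSERVER, and MyDb.dbo.BigTable are
    # placeholder names.
    bcp MyDb.dbo.BigTable out C:\transfer\BigTable.dat -n -S SRCSERVER -T

    # Import on the destination. -b commits in 50,000-row batches so the
    # transaction log is not held for all 35 million rows at once, and the
    # TABLOCK hint (-h) lets the load take a table lock for speed.
    bcp MyDb.dbo.BigTable in C:\transfer\BigTable.dat -n -S DESTSERVER -T -b 50000 -h "TABLOCK"
    ```

    Native format (-n) avoids character conversion overhead and is the usual choice when both ends are SQL Server.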

  • Our access is restricted. And also, SQL Server 2000 is configured as well.

    M&M

  • mohammed moinudheen (9/15/2008)


    Our access is restricted. And also, SQL Server 2000 is configured as well.

    Also, if your access is restricted, why are you the one doing the data transfer? Have the DBA do it... once (s)he gets the drift of it, one of two things will happen: either it'll get done, or they'll give you access so you can do it.

    --Jeff Moden


    RBAR is pronounced "ree-bar" and is a "Modenism" for Row-By-Agonizing-Row.
    First step towards the paradigm shift of writing Set Based code:
    ________Stop thinking about what you want to do to a ROW... think, instead, of what you want to do to a COLUMN.

    Change is inevitable... Change for the better is not.


    Helpful Links:
    How to post code problems
    How to Post Performance Problems
    Create a Tally Function (fnTally)

  • Just for giggles - which server did you fire off the transfer from?

    Meaning - was it a pull or a push? (A pull is often MUCH faster)

    Did you drop the indexes?

    Have you considered doing the export to an external file and importing from there (sometimes faster than a direct transfer)?

    ----------------------------------------------------------------------------------
    Your lack of planning does not constitute an emergency on my part... unless you're my manager... or a director and above... or a really loud-spoken end-user... All right - what was my emergency again?
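
  • The drop-indexes-then-load advice above can be acted on from the command line with sqlcmd (shipped with SQL Server 2005). This is a sketch only; the index, column, table, and server names are hypothetical, and it should be run against the destination server:

    ```shell
    :: Hypothetical names throughout. Drop the nonclustered index before the
    :: load so the engine doesn't maintain it row by row during the insert.
    sqlcmd -S DESTSERVER -E -d MyDb -Q "DROP INDEX IX_BigTable_Col1 ON dbo.BigTable"

    :: BULK INSERT is the T-SQL counterpart of "bcp in" and reads the file
    :: exported earlier; batching and TABLOCK keep logging manageable.
    sqlcmd -S DESTSERVER -E -d MyDb -Q "BULK INSERT dbo.BigTable FROM 'C:\transfer\BigTable.dat' WITH (DATAFILETYPE = 'native', BATCHSIZE = 50000, TABLOCK)"

    :: Rebuild the index once, after all the data is in.
    sqlcmd -S DESTSERVER -E -d MyDb -Q "CREATE INDEX IX_BigTable_Col1 ON dbo.BigTable (Col1)"
    ```

    Rebuilding an index once over 35 million rows is far cheaper than maintaining it during 35 million inserts.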

  • BCP is faster than any other way I can think of. I did about 8 million each day when I was at the Gap. It still takes a while, but I never found anything faster. You can make little format files for each table as well.

    For me, I had to get them to a remote server, so I managed to get FTP access to the box, upload a compressed file, then run a script as a job on the server which extracted the file and then ran the BCP script locally. So... just in case you are working off the box somewhere, that was a big speed improvement. If you can get your data on the same server (actually, if you can get it on the same box, but a different drive) then you can speed things up quite a bit.

  • Another vote for BCP. Works wonderfully for me.
