Any helping hand

  • Hi Everyone,

    For the last 3 months we have been trying to work out the best way to transfer data.

    I have an OLAP server set up the same as one of my production servers. The production server is 200 km away from the OLAP server, and we connect over a leased line.

    Our production server runs 24x7 with 200+ users connected at any given time, so we only get a 1-hour window for the data transfer during the shift change.

    On my OLAP server I use 24 tables, each with more than 5,000,000 rows, and I have to transfer the data every night. These 24 tables are updated daily, as they are the main transaction tables in our ERP setup.

    So first we created linked server queries.

    Ex: DELETE FROM databasename.username.tablename

    INSERT INTO databasename.username.tablename SELECT * FROM [linkedservername].databasename.username.tablename

    (It took 13+ hours to transfer the data from the production server to the OLAP server; we tried this on holidays.)

    Then we tried creating BCP files and transferring them over the network.

    Ex: exec master..xp_cmdshell 'bcp epinav.dbo.salesline out c:\BCPFILES\salesline.bcp -n -S"networkname" -U -P'

    This took 7 hours to transfer, and then we ran into problems loading the data:

    string truncation errors, so we had to script out the production tables, run the scripts on my OLAP server, and reload the BCP files.

    In all, this process took 9 hours to complete.

    So this also doesn't work in our case.

    Then we tried a process which needs too much maintenance.

    The steps are as follows (a rough sketch of the whole job is shown after the list):

    (1) Create the BCP files (takes about 15 mins in total).

    (2) Schedule the automatic deletion and re-creation of the files (3 mins).

    (3) Create batch files to zip the BCP files (zipping the 24 BCP files takes approx. 5 mins, using WinZip).

    (4) Schedule the batch files under the Windows scheduler.

    (5) Transfer the zip files to the OLAP server (the transfer takes 5 hours).

    (6) Unzip them in a location on the OLAP server.

    (7) Load the data from the BCP files.

    These 7 steps take approximately 6 hours in total.
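
    For reference, this is roughly what the nightly job boils down to when strung together from xp_cmdshell calls. The paths, server names and the wzzip/wzunzip calls (the WinZip command-line add-on) are only assumptions to illustrate the flow; substitute whatever folders and zip tool you actually use, and repeat the bcp lines for each of the 24 tables.

    -- step 1: export one table to a native-mode BCP file (run on the production server)
    exec master..xp_cmdshell 'bcp epinav.dbo.salesline out c:\BCPFILES\salesline.bcp -n -S"networkname" -U -P'

    -- step 3: zip the BCP file (assumes the WinZip command-line add-on; use your own zip tool)
    exec master..xp_cmdshell 'wzzip c:\BCPFILES\salesline.zip c:\BCPFILES\salesline.bcp'

    -- step 5: copy the zip file across the leased line to the OLAP server
    exec master..xp_cmdshell 'copy c:\BCPFILES\salesline.zip \\olapserver\BCPFILES\salesline.zip'

    -- steps 6 and 7, run on the OLAP server: unzip and load
    exec master..xp_cmdshell 'wzunzip c:\BCPFILES\salesline.zip d:\loadfolder\'
    exec master..xp_cmdshell 'bcp olapdatabase.dbo.salesline in d:\loadfolder\salesline.bcp -n -S"olapservername" -U -P'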

    We tried this approach as well:

    We created a small database (DTS) containing the 24 tables used in the OLAP load, and built a DTS package that uses Execute SQL tasks and data driven queries to transfer the data from the production server to this DTS database. We then take a backup of the DTS database, zip it, and transfer it over the network. The database is 6 GB and takes a long time to transfer (time not measured).

    After the transfer completes we restore the database and move the data from the DTS database into the OLAP database.
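
    The backup and restore legs of that approach are plain T-SQL; a minimal sketch, assuming the file paths and the logical file names DTS_Data/DTS_Log (check yours with sp_helpfile), would be:

    -- on the production side: back up the 6 GB staging (DTS) database
    BACKUP DATABASE DTS TO DISK = 'c:\backup\DTS.bak' WITH INIT

    -- on the OLAP side, after the file has been zipped and copied across: restore it
    RESTORE DATABASE DTS FROM DISK = 'd:\copyfolder\DTS.bak'
    WITH MOVE 'DTS_Data' TO 'd:\data\DTS.mdf',
         MOVE 'DTS_Log' TO 'd:\data\DTS_log.ldf',
         REPLACE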

    Total time: 6+ hours.

    In the end we are left empty-handed, with nothing to show for all our hard work...

    Can you all please suggest the best and fastest method to transfer the data?

    Thanks for your help and time.

    We use Microsoft Navision Axapta as the ERP programme for the user interface.

    We have SQL Server 2000 SP4 on both machines.

    The production server has 3 CPUs and 6 GB RAM (RAID 5).

    From

    Killer

  • Instead of bcp, why don't you try BULK INSERT?
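
    Something along these lines, with the table name, file path, and options as placeholders for your own:

    -- load a native-mode BCP file straight into the target table
    BULK INSERT epinav.dbo.salesline
    FROM 'd:\loadfolder\salesline.bcp'
    WITH (DATAFILETYPE = 'native', TABLOCK)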

  • Just to make sure I'm understanding what you've posted, the total time for steps 1 through 7 is 6 hours?

    If so, then I'd look at the main bottleneck in your process, which seems to be the 5 hours it takes to copy the files to your OLAP server. How do you copy the files?

    You might try a different copying method that supports higher throughput. Have you tried something like ROBOCOPY from the Windows Resource Kit?
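
    For example, something along these lines from the command line (the paths are only illustrative; check the ROBOCOPY documentation for the switches available in your version):

    robocopy \\productionserver\BCPFILES d:\copyfolder *.zip /Z /R:3 /W:10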

     

    --------------------
    Colt 45 - the original point and click interface

  • Also, using data driven queries in the DTS package will slow things down a lot. It causes the package to process the data from the source row by row instead of in bulk mode.

    You should probably BULK INSERT the data into a staging table and then use stored procedures to perform the insert/update/delete actions on the OLAP database.
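
    A rough sketch of that pattern, assuming a staging table stg_salesline and hypothetical key/column names (salesline_id, amount), might be:

    -- 1. bulk load the exported file into an empty staging table
    TRUNCATE TABLE dbo.stg_salesline
    BULK INSERT dbo.stg_salesline
    FROM 'd:\loadfolder\salesline.bcp'
    WITH (DATAFILETYPE = 'native', TABLOCK)

    -- 2. update rows that already exist in the OLAP table
    UPDATE t
    SET    t.amount = s.amount
    FROM   dbo.salesline t
    JOIN   dbo.stg_salesline s ON s.salesline_id = t.salesline_id

    -- 3. insert rows that are new
    INSERT INTO dbo.salesline (salesline_id, amount)
    SELECT s.salesline_id, s.amount
    FROM   dbo.stg_salesline s
    WHERE  NOT EXISTS (SELECT 1 FROM dbo.salesline t WHERE t.salesline_id = s.salesline_id)

    -- 4. delete rows that no longer exist in the source
    DELETE t
    FROM   dbo.salesline t
    WHERE  NOT EXISTS (SELECT 1 FROM dbo.stg_salesline s WHERE s.salesline_id = t.salesline_id)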

     

    --------------------
    Colt 45 - the original point and click interface

  • If I am understanding your post correctly, you are trying to copy the whole transaction table(s) every 24 hours, when in fact you only need the transactions that have been generated in the ERP solution since the last copy.

    We have a main financial transaction table in our ERP system which we copy to SQL Server for analysis purposes. The data is exported from the ERP system by "Date Entered", i.e. only the transactions added in the previous two days are exported. This data is then appended to the SQL Server copy.

    If there is any concern over the integrity of the source data (timing issues or cut-off points), then extract the data that is 30 days old or less, delete the same time period in SQL Server, and then append the new extract.
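
    In T-SQL terms, the refresh on the SQL Server side boils down to something like this (the table and column names and the 30-day window are just illustrative, and it assumes the fresh extract has already been loaded into a staging table):

    -- remove the overlap window from the analysis copy...
    DELETE FROM dbo.FinancialTransactions
    WHERE  DateEntered >= DATEADD(day, -30, GETDATE())

    -- ...then append the freshly extracted rows for the same window
    INSERT INTO dbo.FinancialTransactions (TransactionID, DateEntered, Amount)
    SELECT TransactionID, DateEntered, Amount
    FROM   dbo.stg_FinancialTransactions
    WHERE  DateEntered >= DATEADD(day, -30, GETDATE())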

  • Hi,

    Thanks.

    My tables are updated daily and we don't know which rows have been updated.

    So I transfer all the records, and the table scripts as well.

    My problem is transferring the files over the network. I don't have a problem inserting the data.

    I am using this command to copy the zip files:

    exec master..xp_cmdshell 'copy \\servername\foldername\filename.zip d:\copyfolder\filename.zip'

    Hope you understand.

    from

    Killer

     

  • As mentioned in my earlier post, try using ROBOCOPY from the Windows Resource Kit.

     

     

    --------------------
    Colt 45 - the original point and click interface

  • Hi,

    OK, I will try that, but don't we have another option, I mean from within SQL Server only?

    Or some other way that could make my DTS easier?

     

    from

    killer

  • As mentioned in my earlier post, the main bottleneck you seem to be encountering is transferring the files to the OLAP server. There is nothing that DTS or SQL Server can do about that.

    You can execute ROBOCOPY from within a stored procedure using xp_cmdshell, but you are still going to be limited by network bandwidth.
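
    That is, something along these lines (the paths are only placeholders):

    exec master..xp_cmdshell 'robocopy \\productionserver\BCPFILES d:\copyfolder *.zip /Z /R:3 /W:10'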

     

    --------------------
    Colt 45 - the original point and click interface
