OK, I need help with an issue.

  • I have posted this over on Microsoft's forum too, but I still need some help with it.

    From my post on Microsoft's forum:

    Running SQL Server 2005 Enterprise 64-bit

    Windows 2003 SR2 64-bit

    The issue I'm having is with a standard update/insert process I'm building against a customer table that has over 5 million rows. The setup looks at the customer key and looks across to another server; both servers use SA logins. (Yes, I know this is not the preferred way, but this is how the firm I'm consulting for has it set up, and they will not use domain logins.) In the lookup I have it redirecting to an OLE DB destination with a maximum insert commit size of 10,000. The OLE DB Command has the following SQL:

    UPDATE [dbo].[DimCustomer]

    SET [BranchKey] = ?

    ,[PriceColumnKey] = ?

    ,[CustomerName] = ?

    ,[CustomerNumber] = ?

    ,[FirstSaleDateKey] = ?

    ,[LastSaleDateKey] = ?

    ,[PrimaryMarket] = ?

    ,[City] = ?

    ,[State] = ?

    ,[SalesRep] = ?

    ,[Warehouse] = ?

    WHERE CustomerKey = ?

    The fields are mapped correctly to the parameters created. The error I get when running the package against this large set of data is as follows:

    Error: 0xC0202009 at Data Flow Task, Dim Customer Updates [197]: SSIS Error Code DTS_E_OLEDBERROR. An OLE DB error has occurred. Error code: 0x80004005.

    An OLE DB record is available. Source: "Microsoft SQL Native Client" Hresult: 0x80004005 Description: "An error has occurred while establishing a connection to the server. When connecting to SQL Server 2005, this failure may be caused by the fact that under the default settings SQL Server does not allow remote connections.".

    An OLE DB record is available. Source: "Microsoft SQL Native Client" Hresult: 0x80004005 Description: "Communication link failure".

    An OLE DB record is available. Source: "Microsoft SQL Native Client" Hresult: 0x80004005 Description: "Named Pipes Provider: No process is on the other end of the pipe.".

    I have looked all over the web, and I found the following Microsoft KB:

    http://support.microsoft.com/kb/2006769

    This didn't work!

    I have also been looking at SQL Profiler, and the only messages I get there are "Changed database context to 'CaptainSCR_Dev'" and a language change. I can see that it does run the following SQL:

    exec sp_executesql N'UPDATE [dbo].[DimCustomer]

    SET [BranchKey] = @P1

    ,[PriceColumnKey] = @P2

    ,[CustomerName] = @P3

    ,[CustomerNumber] = @P4

    ,[FirstSaleDateKey] = @P5

    ,[LastSaleDateKey] = @P6

    ,[PrimaryMarket] = @P7

    ,[City] = @P8

    ,[State] = @P9

    ,[SalesRep] = @P10

    ,[Warehouse] = @P11

    WHERE CustomerKey = @P12' (I took the rest of this part out)

    I have also checked to make sure that the server does allow remote connections and that the timeout is set correctly for the server, and I changed the network packet size from the default of 4096 to 32K, as the KB mentioned, with a restart of SQL Server between the changes and a remote login validation.
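    For reference, the packet-size change described above can also be checked and made server-wide with sp_configure (32767 bytes is the maximum SQL Server accepts; it takes effect for new connections):

    ```sql
    -- View, then raise, the server-wide network packet size.
    -- It is an advanced option, so expose those first.
    EXEC sp_configure 'show advanced options', 1;
    RECONFIGURE;
    EXEC sp_configure 'network packet size (B)';        -- default is 4096
    EXEC sp_configure 'network packet size (B)', 32767; -- maximum allowed
    RECONFIGURE;
    ```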

    I have no problems doing this on smaller tables with the same connections in the same SSIS package.

    Also, the lookup is using full cache; I have tried none and partial, and this makes the package's data flow fail in other areas, at either the origination or the OLE DB destination.

    Any help would be greatly appreciated; I'm at a loss on this one and have not run into it before.

  • Are you certain that your OLE DB connection is using the correct server and database?

    What I would do is create a simple Execute SQL Task with hardcoded values:

    UPDATE [dbo].[DimCustomer]

    SET [BranchKey] = 10

    WHERE [CustomerKey] = 12345 -- a made-up key; without a WHERE clause this would update every row

    and then execute this task alone; this will help you isolate the issue.

  • I would advise you to consider modifying your update methodology. With large data sets, this RBAR (row-by-agonizing-row) method performs poorly.

    A faster alternative is to send the records to be updated to an intermediate staging table and then, at the end of the data flow, run a proper set-based update against it. As a desirable side effect, this could easily fix your problem too.
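    As a sketch of that staging pattern (the staging table name and the column types here are assumptions; the column list is taken from the UPDATE statement earlier in the thread):

    ```sql
    -- 1. In the data flow, point the rows that need updating at an OLE DB
    --    Destination loading this staging table (truncate it each run).
    CREATE TABLE [dbo].[DimCustomer_Staging]
    (
        [CustomerKey]    INT NOT NULL PRIMARY KEY,
        [BranchKey]      INT NULL,
        [PriceColumnKey] INT NULL,
        [CustomerName]   VARCHAR(100) NULL
        -- ...remaining columns from the original UPDATE follow the same pattern
    );

    -- 2. After the data flow, run one set-based UPDATE in an Execute SQL
    --    Task instead of one OLE DB Command call per row.
    UPDATE d
    SET    d.[BranchKey]      = s.[BranchKey],
           d.[PriceColumnKey] = s.[PriceColumnKey],
           d.[CustomerName]   = s.[CustomerName]
           -- ...remaining columns
    FROM   [dbo].[DimCustomer] AS d
           INNER JOIN [dbo].[DimCustomer_Staging] AS s
                   ON d.[CustomerKey] = s.[CustomerKey];
    ```

    One update of 5 million joined rows is typically far faster than 5 million single-row OLE DB Command calls, and it removes the per-row round trips that appear to be dropping the connection.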

    The absence of evidence is not evidence of absence
    - Martin Rees
    The absence of consumable DDL, sample data and desired results is, however, evidence of the absence of my response
    - Phil Parkin

  • Hello

    I have 15 companies, and each company has 10 tables, Bucket 1 through Bucket 10, so 150 tables in total.

    I want to export these tables, the 10 tables for each company, to one Excel workbook with 10 different worksheets.

    In that case the Excel file location stays the same for one company, but when I move on to the second company's 10 tables, the Excel file has to change as well.

    If I use the Export Wizard, it only lets me do one company, so I would have to build 15 Export Wizard runs or SSIS packages.

    Instead, I want to build just one SSIS package that can pick up the table name to export, put it in the source connection string, and then export the table to Excel.

    I am trying to use a Foreach Loop, but I don't know which variables to use or the correct method for this multiple export of tables to Excel.

    Please give me an example, or let me know your suggestions.
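    One common way to drive such a loop (a sketch only; the Bucket naming pattern and the variable names User::TableName and User::ExcelPath are assumptions) is to fill an SSIS object variable from a metadata query and iterate over it with a Foreach Loop:

    ```sql
    -- One row per table to export; load the result into an SSIS object
    -- variable with an Execute SQL Task, then iterate with a Foreach Loop
    -- (Foreach ADO Enumerator), mapping the columns to string variables.
    SELECT TABLE_SCHEMA,
           TABLE_NAME
    FROM   INFORMATION_SCHEMA.TABLES
    WHERE  TABLE_TYPE = 'BASE TABLE'
      AND  TABLE_NAME LIKE '%Bucket%'   -- assumption: bucket tables share this pattern
    ORDER  BY TABLE_NAME;
    ```

    Inside the loop, property expressions on the OLE DB source (the SQL statement) and on the Excel connection manager (ExcelFilePath) can be built from those variables, so a single package handles all 150 tables.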

  • vip_shah (5/31/2011)


    Hello

    I have 15 companies, and each company has 10 tables, Bucket 1 through Bucket 10, so 150 tables in total.

    I want to export these tables, the 10 tables for each company, to one Excel workbook with 10 different worksheets.

    In that case the Excel file location stays the same for one company, but when I move on to the second company's 10 tables, the Excel file has to change as well.

    If I use the Export Wizard, it only lets me do one company, so I would have to build 15 Export Wizard runs or SSIS packages.

    Instead, I want to build just one SSIS package that can pick up the table name to export, put it in the source connection string, and then export the table to Excel.

    I am trying to use a Foreach Loop, but I don't know which variables to use or the correct method for this multiple export of tables to Excel.

    Please give me an example, or let me know your suggestions.

    My suggestion? Please start up a new thread instead of hijacking an existing one.

    Your question isn't even remotely related to the original question in this topic.

    Need an answer? No, you need a question
    My blog at https://sqlkover.com.
    MCSE Business Intelligence - Microsoft Data Platform MVP

  • Koen Verbeeck (5/31/2011)


    vip_shah (5/31/2011)


    Hello

    I have 15 companies, and each company has 10 tables, Bucket 1 through Bucket 10, so 150 tables in total.

    I want to export these tables, the 10 tables for each company, to one Excel workbook with 10 different worksheets.

    In that case the Excel file location stays the same for one company, but when I move on to the second company's 10 tables, the Excel file has to change as well.

    If I use the Export Wizard, it only lets me do one company, so I would have to build 15 Export Wizard runs or SSIS packages.

    Instead, I want to build just one SSIS package that can pick up the table name to export, put it in the source connection string, and then export the table to Excel.

    I am trying to use a Foreach Loop, but I don't know which variables to use or the correct method for this multiple export of tables to Excel.

    Please give me an example, or let me know your suggestions.

    My suggestion? Please start up a new thread instead of hijacking an existing one.

    Your question isn't even remotely related to the original question in this topic.

    Well, apparently a new thread was already started up:

    http://www.sqlservercentral.com/Forums/Topic1117853-364-1.aspx

    Please do not crosspost on the forums. It wastes people's time and fragments the replies.

