July 22, 2009 at 8:13 am
John Rowan (7/21/2009)
I'm in agreement with the others. Why do you think you need a temporary table for SSIS?
There are a lot of situations.... We had a ton of pre-developed queries that I was tasked with running against a few hundred SQL Servers of mixed versions, storing the results in a repository (to gather metrics).
One such query was something like:
Declare @Dir Table(Results varchar(1000))
Insert Into @Dir
Exec xp_cmdshell 'dir e:\mssql\backups\*.bak /b'
--add @@ServerName to results set and store in repository
or
Create Table #Temp(Results varchar(1000))
Insert into #Temp
Exec xp_cmdshell 'dir e:\mssql\backups\*.bak /b'
--add @@ServerName to results set and store in repository
This is just one example, so please don't post and ask why I don't just use the FileSystemObject, etc. 😉
There was another solution that I ended up using, but it was painful to break the queries out into multiple steps (one for the results of xp_cmdshell and one for the SELECT @@ServerName), combine the results in a separate task, and create additional permanent tables. It would have been really nice to just use the queries as written, since the results of running them in a query window were exactly the output that needed to be stored (server name and backup file names, for example).
July 22, 2009 at 9:51 am
Jeremy Giaco (7/22/2009)
John Rowan (7/21/2009)
I'm in agreement with the others. Why do you think you need a temporary table for SSIS? There are a lot of situations.... We had a ton of pre-developed queries that I was tasked with running against a few hundred SQL Servers of mixed versions, storing the results in a repository (to gather metrics). [...] It would have been really nice to just use the queries as written, since the results of running them in a query window were exactly the output that needed to be stored (server name and backup file names, for example).
There are many situations where people choose to use cursors too, but that doesn't mean that's the way to go. I'll admit that there are some viable uses for staging (not temporary) tables in SSIS, but most of the time I see them used it is because, like cursors, the developer just didn't know how to do it otherwise. The ability exists within SSIS to create standalone ETL solutions. Staging tables should, for the most part, be unneeded in a true ETL environment. It's Extract-Transform-Load, not Extract-Load-Transform-ReLoad. The latter puts more of a burden on the database subsystem, whereas standalone ETL only consumes resources on the ETL server for ETL operations.
The scenario that you described could just as easily have been done using a data flow task to add the extra attributes, such as @@ServerName, to your DBA stats repository. Is what you did harmless? Yes. Would it be worth rewriting your package to not use a staging table? Not really.
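For illustration only, one way to get that combined result set from a single data flow source query, without a permanent staging table, might look like the sketch below (an assumption on my part, reusing the table variable and xp_cmdshell path from the earlier post; the BackupFile alias is made up, and as noted elsewhere in the thread the INSERT ... EXEC into a table variable works on 2005 but not 2000):
SET NOCOUNT ON;
DECLARE @Dir TABLE (Results varchar(1000));
INSERT INTO @Dir
EXEC xp_cmdshell 'dir e:\mssql\backups\*.bak /b';
-- The server name is added inline, so the data flow can load the combined
-- result straight into the repository without a staging table in the database.
SELECT @@SERVERNAME AS ServerName, Results AS BackupFile
FROM @Dir
WHERE Results IS NOT NULL;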
My point is that, like cursors, many folks choose to take the easy route and use staging tables, which results in ETL solutions that are less portable, less scalable, and more of a burden on the database subsystem than they need to be. My question to the OP was geared towards understanding why they think they need a temporary table, because they most likely do not, and coding the ETL package not to use one will remove all of the headaches that they are now facing.
July 23, 2009 at 10:04 am
Jeremy Giaco (5/7/2009)
I find that even when using SET FMTONLY ON (and NOCOUNT ON) I still have issues with using temp tables in general in SSIS. I can get around it by using table variables, until I connect to a 2000 box, in which case I cannot do things like:
--Works in 2005, not in 2000; can't use #Dir in SSIS, so I'm kind of forced to do other tricks in SSIS.
Declare @Dir Table(Results varchar(1000))
Insert Into @Dir
Exec xp_cmdshell 'dir e:\mssql\backups\*.bak /b'
--parse results here...
One thing that may be somewhat helpful for anyone out there is that you can wrap your code in a dummy IF statement like "If 1 = 1 Begin ... End"; that, combined with SET FMTONLY ON, has nearly worked for me. I was able to preview the data that way, but once I executed the package, I only got results for the first of my dynamic data sources... but that's another topic. All in all, I'm probably going to end up doing this in VBScript; I accomplished that in about half an hour two weeks ago, and I'm still working on the SSIS version, lol.
Further to John's post, I thought I'd post a screenshot which hopefully shows what you can do to SSIS-ify your logic. I totally agree with what he says about temp tables; I just think it is worth changing your stored procedures, seeing as you said you had to refactor them anyway!
From what you described, I think this will achieve the same thing. Put this within a Foreach Loop and execute it once for each server you have, dynamically changing the "Target" connection string and variable... it works on 2000 and 2005, there are no dummy If 1=1 statements, no SSIS trickery, no temp table required, and no FileSystemObject 😉
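The screenshot itself isn't reproduced here, but as a guess at the kind of data flow source query the approach might rely on (purely an assumption based on the description above, not the actual package):
-- Hypothetical source query: xp_cmdshell's result set is used directly, so no
-- temp table or table variable is needed; the server name can be added further
-- down the data flow with a Derived Column bound to a package variable.
SET NOCOUNT ON;
EXEC master..xp_cmdshell 'dir e:\mssql\backups\*.bak /b';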
HTH
Kindest Regards,
Frank Bazan
December 3, 2009 at 7:34 am
So, using a temp table is not a good idea.
What is the best way to solve the following problem, then?
I have two databases, A and B. DB A has an orders/order_rows structure where order_rows has no primary key, only a foreign key to orders on order_no.
DB B has an orders/order_rows structure too, which should be filled with data from DB A nightly on an incremental basis.
I was first thinking of creating a temp table in DB A called '#orders_in_db_b' that I would fill with all order numbers present in DB B (a rough sketch of this plan is at the end of this post). I would then:
1. Transfer orders from A to B where order_no not in #orders_in_db_b
2. Transfer order_rows from A to B where order_no not in #orders_in_db_b
Is there a better way to do this than using the temp table?
1. I don't want to include all orders in the flow and ignore those already present, as that would be a huge overhead of transferring unnecessary data between two servers (bandwidth is an issue).
2. Using that approach would fail on order_rows anyway: by the time order_rows is processed, all orders are already present in DB B's orders table, since orders has to be transferred first (otherwise the foreign key constraint between orders and order_rows would prevent inserting the order_rows). And since there is no primary key on DB A's order_rows, there is no way to match order_rows between the two databases, so I have to make sure an order is always transferred in its entirety and only once, or order_rows will be duplicated because the same rows can be inserted over and over again.
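For reference, a rough T-SQL sketch of the plan described above (table and column names are assumptions based on the post, order_no is assumed to be an int, and "ServerB" is a hypothetical linked server pointing at DB B):
-- Temp table on server A holding the order numbers already present in DB B
CREATE TABLE #orders_in_db_b (order_no int PRIMARY KEY);
INSERT INTO #orders_in_db_b (order_no)
SELECT order_no FROM ServerB.B.dbo.orders;
-- 1. Orders in A that are not yet in B
SELECT o.*
FROM dbo.orders o
WHERE NOT EXISTS (SELECT 1 FROM #orders_in_db_b x WHERE x.order_no = o.order_no);
-- 2. Their order rows
SELECT r.*
FROM dbo.order_rows r
WHERE NOT EXISTS (SELECT 1 FROM #orders_in_db_b x WHERE x.order_no = r.order_no);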
December 3, 2009 at 8:12 am
Can order rows be inserted/modified after creation? Or does this problem merely boil down to taking across order headers/details for those orders which do not already exist in the target database?
Is there a date created / date last modified field on the order header table?
The absence of evidence is not evidence of absence
- Martin Rees
The absence of consumable DDL, sample data and desired results is, however, evidence of the absence of my response
- Phil Parkin
December 3, 2009 at 8:18 am
Phil Parkin (12/3/2009)
Can order rows be inserted/modified after creation? Or does this problem merely boil down to taking across order headers/details for those orders which do not already exist in the target database?
Precisely, they are static once created.
Phil Parkin (12/3/2009)
Is there a date created / date last modified field on the order header table?
No, there isn't.
December 3, 2009 at 8:41 am
OK, what is the nature of the PK on the orders table - is it sequential/ascending?
(If it would help this problem) can you add any fields to either of the order header tables?
The absence of evidence is not evidence of absence
- Martin Rees
The absence of consumable DDL, sample data and desired results is, however, evidence of the absence of my response
- Phil Parkin
December 3, 2009 at 8:52 am
Phil Parkin (12/3/2009)
OK what is the nature of the PK on the orders table - is it sequential/ascending?
It's not. There is more than one source of orders feeding DB A, and they use different sequences.
Phil Parkin (12/3/2009)
(If it would help this problem) can you add any fields to either of the order header tables?
In DB A, no. In DB B, I could theoretically add a dummy field, is_newly_imported bit, set it to 1 when an order is imported, and then update it to 0 after the order_rows have been completed.
But I'm not fond of messing up the data with dummy values to circumvent this problem; I'd rather use a temp table.
December 3, 2009 at 11:25 am
Check the link, Using Temporary Tables with SSIS. It's simple, just a small trick:
1. Create a global temp table with the required structure first.
2. Set RetainSameConnection=TRUE on the connection manager.
3. Set DelayValidation=TRUE for the SQL and data flow tasks.
4. Convert the global temp table to a local one.
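As a minimal sketch of the first step (the table name and structure here are only assumptions based on the earlier posts), the global temp table (##, not #) is created up front so that, with RetainSameConnection and DelayValidation set as above, SSIS can validate against it and reuse it across tasks on the same connection:
IF OBJECT_ID('tempdb..##DirResults') IS NOT NULL
    DROP TABLE ##DirResults;
CREATE TABLE ##DirResults (Results varchar(1000));  -- hypothetical structure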
Sriram
April 13, 2010 at 7:36 am
Hi,
I'm not sure if you have solved your problem, but in case you haven't, or if somebody else out there has a similar problem, the easiest way around this is to use a table variable instead of a temp table.
One thing to note when using a table variable in SSIS is to add "SET NOCOUNT ON" to your SQL command; if you don't, the preview will return rows without any problem, but on execution no rows will be written.
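A minimal sketch of that pattern (the table and column names here are hypothetical, not from the thread):
SET NOCOUNT ON;  -- without this, the "rows affected" messages break the package at execution time
DECLARE @Work TABLE (id int, amount money);
INSERT INTO @Work (id, amount)
SELECT id, amount FROM dbo.SourceTable;  -- hypothetical source table
SELECT id, amount FROM @Work;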
April 14, 2010 at 3:21 am
Hmmm, table variables have some disadvantages that temp tables don't have (and of course the other way around).
You can't truncate a table variable and you can't use DDL statements against a table variable (like adding constraints or indexes).
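A quick illustration of that difference (the names here are just placeholders):
-- All of this works on a temp table:
CREATE TABLE #t (id int);
CREATE INDEX ix_t_id ON #t (id);
TRUNCATE TABLE #t;
DROP TABLE #t;
-- A table variable only accepts inline constraints at declaration;
-- the commented-out statements are not valid:
DECLARE @t TABLE (id int PRIMARY KEY);
-- CREATE INDEX ix_t_id ON @t (id);   -- not allowed
-- TRUNCATE TABLE @t;                 -- not allowed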
This article gives a nice overview:
http://www.sqlservercentral.com/articles/Temporary+Tables/66720/
It kind of depends on the use that you intend for your temporary table.
Need an answer? No, you need a question
My blog at https://sqlkover.com.
MCSE Business Intelligence - Microsoft Data Platform MVP
April 14, 2010 at 10:36 am
Agreed, da-zero. But table variables are meant to hold smaller data sets, so an index doesn't really help here; and as for truncating a table variable, it isn't required, since the variable is wiped out at the end of the batch just like any other variable.
However, only SSIS 2008 allows the use of table variables. Correct me if I am wrong.
Amol Naik
April 15, 2010 at 12:49 am
Amol Naik-681410 (4/14/2010)
Agreed, da-zero. But table variables are meant to hold smaller data sets, so an index doesn't really help here; and as for truncating a table variable, it isn't required, since the variable is wiped out at the end of the batch just like any other variable. However, only SSIS 2008 allows the use of table variables. Correct me if I am wrong.
True true. (I haven't read all the thread, so I'm not sure what the original requirements were. That's why I said "It depends" at the end :-))
In some cases I've encountered, I needed indexes on my temporary tables (which did hold a lot of data), and truncation can be useful if you use the table in a loop or something like that.
But it all depends, really 🙂
SQL Server 2005 allows the use of table variables, but whether SSIS does, I'm not sure; I'll have to look it up. I know you can store a result set in an object variable, although that's not really the same.
Need an answer? No, you need a question
My blog at https://sqlkover.com.
MCSE Business Intelligence - Microsoft Data Platform MVP
April 15, 2010 at 3:54 am
Hi Amol,
You can use them in SSIS 2005 also.
Al