January 17, 2007 at 3:32 pm
Hi,
I have a package which loads the contents of a text file into a staging table on the database. Upon completion, the workflow moves to an Execute SQL Task, in which I run a stored procedure that manipulates the staging table data and loads it into the respective data tables. The stored proc requires an input parameter, which is duly provided.
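For reference, the stored proc boils down to something shaped like the sketch below (hypothetical object names, not my actual code; only the single input parameter reflects my real setup):

CREATE PROCEDURE dbo.usp_LoadFromStaging
    @SourcePath varchar(200)  -- input parameter supplied by the package
AS
BEGIN
    -- manipulate the staged rows and load them into the destination table
    INSERT INTO dbo.DataTable (Col1, Col2, SourcePath)
    SELECT Col1, Col2, @SourcePath
    FROM dbo.StagingTable

    -- clear the staging table ready for the next run
    TRUNCATE TABLE dbo.StagingTable
END
GO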
Now, this package has been working a treat for the past 5 months...I have not had any problems with it.
But since last week, I noticed that when the package was run, it would say "Package Executed Successfully", yet the runtime of the Execute SQL Task was shown as 1 second (in reality more like 1 ms). This task normally takes 20 seconds or more. So I checked the data tables and found that the staging data had not been loaded.
So I ran the query which I use in the task in Query Analyzer, and it ran for about 20 seconds. I checked the data tables again...and this time they did have the correct data loaded.
I have tried every day this whole week and the same problem happens. I tried removing that task and putting in a new one (identical in makeup), but the same thing still happens.
I am at my wits end as to what to do to solve this.
Has anyone come across this before?
Any reply would be much appreciated as this is driving me up the wall.
Thanks a lot.
January 18, 2007 at 6:14 am
Run SQL Profiler while running the DTS package and look for the SQL statement from the ExecuteSQL task. Does it look like it has the right parameter values?
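If leaving the Profiler GUI running is awkward, a server-side trace captures the same information. A minimal sketch (SQL Server 2000 sp_trace_* syntax; the output file path is a placeholder):

DECLARE @TraceID int, @maxfilesize bigint, @on bit
SET @maxfilesize = 5
SET @on = 1

-- create the trace (writes to the named file; path is a placeholder)
EXEC sp_trace_create @TraceID OUTPUT, 0, N'C:\Traces\dts_trace', @maxfilesize, NULL

-- capture TextData (column 1) and Duration (column 13) for
-- RPC:Completed (event 10) and SQL:BatchCompleted (event 12)
EXEC sp_trace_setevent @TraceID, 10, 1, @on
EXEC sp_trace_setevent @TraceID, 10, 13, @on
EXEC sp_trace_setevent @TraceID, 12, 1, @on
EXEC sp_trace_setevent @TraceID, 12, 13, @on

-- start the trace, run the package, then stop (status 0) and close (status 2)
EXEC sp_trace_setstatus @TraceID, 1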
Russel Loski, MCSE Business Intelligence, Data Platform
January 18, 2007 at 9:05 pm
I checked via Profiler, and the parameters plus the query itself are correct. I ran what Profiler captured in Query Analyzer and it works fine, taking its sweet time, but when run via DTS it finishes in less than a second.
I then tried creating a new Execute SQL Task that takes no parameters from the Global Variables, with the parameter hardcoded instead. That time it worked...!!!
So...yeah...I don't know why that is happening. The two setups are sketched below.
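For clarity, the difference between the two setups is just this (my reconstruction; in a DTS Execute SQL Task the '?' placeholder is what gets mapped to a global variable):

-- parameterized version: '?' is mapped to the global variable (this one silently does nothing)
EXEC USP_DTS_BatchFile_FraudCheck ?

-- hardcoded version (this one works)
EXEC USP_DTS_BatchFile_FraudCheck 'D:\FTP\RPIBatch\Output\'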
January 19, 2007 at 6:31 am
Let me get this straight. You ran the DTS package with an Execute SQL Task that used a global variable. When you saw the statement in Profiler, the statement-completed event displayed, and the value from the global variable (not the variable name itself) was in the text of the statement. And when you hardcoded the parameter, exactly the same text showed up in Profiler as when the value came from the global variable?
Russel Loski, MCSE Business Intelligence, Data Platform
January 19, 2007 at 7:01 pm
Yes, that is correct
January 20, 2007 at 9:29 am
I'm stumped. Hopefully someone else can help you.
Russel Loski, MCSE Business Intelligence, Data Platform
January 21, 2007 at 9:35 pm
Ok, the 2 Profiler statements are NOT exactly the same, but essentially they are equivalent, since the params are the same.
The 2 statements are:
1. When hardcoded:
EXEC USP_DTS_BatchFile_FraudCheck 'D:\FTP\RPIBatch\Output\'
2. When using the global variable:
exec [PredatorRHB1]..sp_procedure_params_rowset N'USP_DTS_BatchFile_FraudCheck', 1, NULL, NULL
go
exec sp_executesql N'exec USP_DTS_BatchFile_FraudCheck @P1', N'@P1 varchar(200)', 'D:\FTP\RPIBatch\Output\'
go
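For what it's worth, the first statement in 2 is just the OLE DB provider asking for the procedure's parameter metadata; the real work is the sp_executesql call. One way to check whether that call actually runs the proc is to surface its return code in Query Analyzer (a sketch built from the statement above):

DECLARE @rc int
EXEC sp_executesql
    N'exec @rc = USP_DTS_BatchFile_FraudCheck @P1',
    N'@P1 varchar(200), @rc int OUTPUT',
    'D:\FTP\RPIBatch\Output\', @rc OUTPUT
SELECT @rc AS proc_return_code  -- 0 normally means the proc ran to completion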
January 22, 2007 at 3:58 am
You probably did this already, but just in case: you ran the entire statement in 2 (both batches) in Query Analyzer and it ran?
Russel Loski, MCSE Business Intelligence, Data Platform
January 22, 2007 at 5:01 pm
Yeah, I did mate. It ran fine.
January 23, 2007 at 6:24 am
When you run the query in Query Analyzer, do you have the same security context as when the DTS package failed (same NT login, or the same SQL login, whichever the package uses)?
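A quick way to compare the two contexts is to run something like this from both (built-in functions, nothing specific to your setup):

SELECT SUSER_SNAME() AS login_name,    -- the NT or SQL login in use
       USER_NAME()   AS database_user,
       DB_NAME()     AS current_database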
Russel Loski, MCSE Business Intelligence, Data Platform
January 23, 2007 at 6:38 pm
Yes, absolutely. Everything is done with an SA login