April 22, 2016 at 12:08 pm
I have two SSIS packages for which the job fails, and there is no information in Job History other than that the step failed.
There are tasks to TRUNCATE and reload the tables.
Other than creating tasks in the package to store information in a table for each table that is refreshed, is there a way to identify the cause of the failure?
For better, quicker answers on T-SQL questions, click on the following...
http://www.sqlservercentral.com/articles/Best+Practices/61537/
For better answers on performance questions, click on the following...
http://www.sqlservercentral.com/articles/SQLServerCentral/66909/
April 22, 2016 at 12:25 pm
Bit difficult to say... bit sparse on info... care to elaborate a bit?
________________________________________________________________
you can lead a user to data....but you cannot make them think
and remember....every day is a school day
April 22, 2016 at 12:30 pm
Welsh Corgi (4/22/2016)
I have two SSIS packages for which the job fails, and there is no information in Job History other than that the step failed. There are tasks to TRUNCATE and reload the tables.
Other than creating tasks in the package to store information in a table for each table that is refreshed, is there a way to identify the cause of the failure?
Use the Log, Welsh!
😎
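For anyone following along, a minimal sketch of what "use the log" can look like, assuming the packages are deployed to the SSIS catalog (SSISDB, project deployment model); the views below are the catalog's standard ones and the filter values are just examples.

-- Pull the error and task-failure messages the catalog logged for recent executions.
SELECT e.operation_id,
       e.message_time,
       e.message_source_name,   -- the task/component that raised the event
       e.message
FROM   SSISDB.catalog.event_messages AS e
WHERE  e.event_name IN (N'OnError', N'OnTaskFailed')
ORDER BY e.message_time DESC;

If the packages run from the file system instead (package deployment model), the catalog views will have nothing; see the sysssislog example further down the thread.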
April 22, 2016 at 12:31 pm
J Livingston SQL (4/22/2016)
Bit difficult to say... bit sparse on info... care to elaborate a bit?
The problem is that the job that executes the package fails with nothing in the job history except that the step failed.
April 22, 2016 at 12:38 pm
Check to see how long your history retention is; the failure could easily have been written but aged out by the time you looked at it.
If it is still at the server default, failed job steps could easily be pushed out/aged by other job steps (e.g., log backups every 15 minutes for 100 databases is 1,500 messages, 500 more than the 1,000-row default).
Lowell
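A minimal sketch of both checks, assuming the standard msdb objects; the job name is hypothetical. Retention itself is easiest to inspect and raise in SSMS under SQL Server Agent > Properties > History.

-- Whatever is still in the history for the job, including the failed step's message text.
SELECT j.name        AS job_name,
       h.step_id,
       h.step_name,
       h.run_date,
       h.run_time,
       h.message
FROM   msdb.dbo.sysjobhistory AS h
JOIN   msdb.dbo.sysjobs       AS j ON j.job_id = h.job_id
WHERE  j.name = N'Load_StagingTables'   -- hypothetical job name
  AND  h.run_status = 0                 -- 0 = Failed
ORDER BY h.instance_id DESC;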
April 22, 2016 at 12:38 pm
The most common cause of a TRUNCATE TABLE failure is that the login running the code doesn't have db_owner on the database or ALTER permissions. ISTR that even granting ALTER didn't work for me, so try changing the TRUNCATE TABLE to a DELETE FROM the table.
If it fails even then, there is another problem, and you'll have to look at the logs.
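A minimal sketch of the two options, with hypothetical table and user names. For reference, TRUNCATE TABLE requires at least ALTER permission on the table, while DELETE only needs DELETE permission.

-- Option 1: grant the minimum permission TRUNCATE TABLE needs (ALTER on the table).
GRANT ALTER ON OBJECT::dbo.StageCustomer TO LoadUser;   -- hypothetical names
TRUNCATE TABLE dbo.StageCustomer;

-- Option 2: fall back to DELETE, which needs only DELETE permission (but is fully logged row by row).
GRANT DELETE ON OBJECT::dbo.StageCustomer TO LoadUser;
DELETE FROM dbo.StageCustomer;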
April 22, 2016 at 12:59 pm
Brandie Tarvin (4/22/2016)
The most common cause of a TRUNCATE TABLE failure is that the login running the code doesn't have db_owner on the database or ALTER permissions. ISTR that even granting ALTER didn't work for me, so try changing the TRUNCATE TABLE to a DELETE FROM the table. If it fails even then, there is another problem, and you'll have to look at the logs.
I already did that.
Testing.
Thanks.
April 22, 2016 at 1:01 pm
Lowell (4/22/2016)
Check to see how long your history retention is; the failure could easily have been written but aged out by the time you looked at it. If it is still at the server default, failed job steps could easily be pushed out/aged by other job steps (e.g., log backups every 15 minutes for 100 databases is 1,500 messages, 500 more than the 1,000-row default).
I just created the job yesterday.
There is an entry for the job and the step but the only information that I get is that the step failed.
Thanks.
April 22, 2016 at 1:03 pm
You COULD always configure the SSIS packages to create logs, which will tell you what happened on error...
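A minimal sketch of reading such a log, assuming the package uses the built-in "SSIS log provider for SQL Server" (package deployment model), which writes to a dbo.sysssislog table in whichever database the log's connection manager points at; pick the OnError and OnTaskFailed events when configuring logging.

-- Error and task-failure events written by the SQL Server log provider.
SELECT executionid,
       starttime,
       source,     -- the task that raised the event
       event,
       message
FROM   dbo.sysssislog
WHERE  event IN (N'OnError', N'OnTaskFailed')
ORDER BY starttime DESC;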
April 22, 2016 at 1:12 pm
ok.
April 22, 2016 at 1:25 pm
You should still check what I mentioned.
Last run status can still be seen in Job Activity Monitor with a big red X, but the history could be erased.
Lowell
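The same check without the GUI, as a minimal sketch against the msdb activity and history tables; the job name is hypothetical.

-- Last run of the job: when it was requested, when it stopped, and its outcome.
SELECT TOP (1)
       j.name,
       a.run_requested_date,
       a.stop_execution_date,
       h.run_status              -- 0 = failed, 1 = succeeded
FROM   msdb.dbo.sysjobactivity AS a
JOIN   msdb.dbo.sysjobs        AS j ON j.job_id = a.job_id
LEFT JOIN msdb.dbo.sysjobhistory AS h ON h.instance_id = a.job_history_id
WHERE  j.name = N'Load_StagingTables'   -- hypothetical job name
ORDER BY a.run_requested_date DESC;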
April 22, 2016 at 1:36 pm
Lowell (4/22/2016)
You should still check what I mentioned. Last run status can still be seen in Job Activity Monitor with a big red X, but the history could be erased.
Yes sir, I thought I mentioned that I did check the job history.
April 22, 2016 at 1:39 pm
I executed the package interactively and it barfed at an Enable Trigger task.
I fixed the issue, then redeployed, and the job is running; it is on the last load, which is 37+ million rows.
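For anyone hitting the same thing, a minimal sketch of the kind of statement an enable/disable-trigger step typically wraps in an Execute SQL Task; the table name is hypothetical and this is not necessarily what the original package runs.

-- Disable triggers before the bulk load, re-enable them afterwards.
DISABLE TRIGGER ALL ON dbo.FactSales;    -- hypothetical table
-- ... bulk load runs here ...
ENABLE TRIGGER ALL ON dbo.FactSales;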
April 22, 2016 at 5:08 pm
Welsh Corgi (4/22/2016)
Lowell (4/22/2016)
Check to see how long your history retention is; the failure could easily have been written but aged out by the time you looked at it. If it is still at the server default, failed job steps could easily be pushed out/aged by other job steps (e.g., log backups every 15 minutes for 100 databases is 1,500 messages, 500 more than the 1,000-row default).
I just created the job yesterday.
There is an entry for the job and the step but the only information that I get is that the step failed.
Thanks.
If you go to the "advanced" tab there within each step of the job, I believe you might find the option to "Include Step Output to Log" for future debugging purposes. Can't say for sure for SSIS steps because I avoid SSIS whenever I can and have been successful at that for quite a while. 😀
--Jeff Moden
Change is inevitable... Change for the better is not.
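A minimal sketch of setting the same thing from T-SQL instead of the SSMS dialog, with a hypothetical job name; the Advanced tab's "Include step output in history" option corresponds to the @flags value of msdb.dbo.sp_update_jobstep (32 = write all output to job history).

-- Capture the step's full output in the job history for future debugging.
EXEC msdb.dbo.sp_update_jobstep
     @job_name = N'Load_StagingTables',   -- hypothetical job name
     @step_id  = 1,
     @flags    = 32;                      -- 32 = write all output to job history

After the next failed run, the captured output shows up under that step's row in View History.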
April 22, 2016 at 5:12 pm
Jeff Moden (4/22/2016)
Welsh Corgi (4/22/2016)
Lowell (4/22/2016)
Check to see how long your history retention is; the failure could easily have been written but aged out by the time you looked at it. If it is still at the server default, failed job steps could easily be pushed out/aged by other job steps (e.g., log backups every 15 minutes for 100 databases is 1,500 messages, 500 more than the 1,000-row default).
I just created the job yesterday.
There is an entry for the job and the step but the only information that I get is that the step failed.
Thanks.
If you go to the "advanced" tab there within each step of the job, I believe you might find the option to "Include Step Output to Log" for future debugging purposes. Can't say for sure for SSIS steps because I avoid SSIS whenever I can and have been successful at that for quite a while. 😀
Thank you Sir.