August 6, 2010 at 7:18 am
I have a simple insert query designed to go out and pull in prior day data and populate it into a table. When I run the query on its own it works just fine, completes, populates data. The problem is that when I take the EXACT same query and run it as a daily scheduled job, it completes successfully but doesn't populate any data.
I just did a test for yesterday's data for example. The job ran, finished, didn't fail, but didn't populate a single record. I then took the query from within the job, copied/pasted it into an SSMS query window, ran it, and it finished successfully and populated the data.
Any help getting this query to work as a scheduled job is appreciated.
Thanks!
August 6, 2010 at 8:07 am
I'm guessing permissions, maybe? When it runs under the SQL Agent, the job might not have permissions to some shared resource. What is the job doing?
Lowell
August 6, 2010 at 8:26 am
I don't know that it is a permissions issue. I have other jobs hitting the same back-end source and they're not having a problem.
All the query does is go out and pull in data from a table for the prior day, and insert it into a local table on my server.
I have had other jobs with permissions issues, but those were running BIDS/SSIS packages. I fixed those permissions errors, and when those jobs did hit the problem, they failed outright and gave error messages.
With this job, the SQL code is embedded directly in the job step, the job completes with no errors (it does not fail), and the only issue is that no data is being written into the destination table on my server.
August 6, 2010 at 8:30 am
Check the [Run As] box in the SQL job step. It is probably different from the one you use for SSIS. The step type is also different. Read up on Credentials and Proxy accounts.
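If you do end up needing a proxy for the SSIS-style steps, the setup looks roughly like this (the credential/proxy names, Windows account, and password below are placeholders, not anything from your environment):

-- Rough sketch only: names and the Windows account are placeholders.
-- 1) Create a credential that maps to a Windows account with the needed access.
CREATE CREDENTIAL AgentLoadCredential
    WITH IDENTITY = N'DOMAIN\SomeAgentUser',   -- hypothetical Windows account
         SECRET   = N'StrongPasswordHere';

-- 2) Create an Agent proxy on top of that credential.
EXEC msdb.dbo.sp_add_proxy
     @proxy_name      = N'AgentLoadProxy',
     @credential_name = N'AgentLoadCredential',
     @enabled         = 1;

-- 3) Grant the proxy to the subsystem the step uses
--    (check msdb.dbo.syssubsystems for the exact subsystem names on your version).
EXEC msdb.dbo.sp_grant_proxy_to_subsystem
     @proxy_name     = N'AgentLoadProxy',
     @subsystem_name = N'SSIS';

Once that is in place, the proxy shows up in the step's [Run As] dropdown.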
August 6, 2010 at 8:38 am
Let me clarify...I have jobs that are executing SSIS packages AND jobs that are just embedded SQL code...BOTH types are working fine. I realize the SSIS scheduled jobs are a different step type, and I have a proxy account set up that is used as the 'Run As' for them.
The job in question (the one not populating data) does not have a 'Run As' value. The jobs that ARE populating data and hitting the same back-end ALSO do not have a 'Run As' value.
And again, the code works 100% perfectly if I run it in a query window all on its own. It's only when scheduled as a job step that it is not populating data - but not giving any failure indications either.
August 6, 2010 at 8:43 am
OK, if you do not have a [Run As], then it is running as the service account.
[Shift + Right-click] on SSMS, select [Run As], enter the service account details, then run the query that's embedded in your job step. It sounds like you'll get a permission error.
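Another quick check, if the shift-and-run-as route is awkward: drop a couple of lines like these into the job step itself so it records exactly which login/user the step executes as (the logging table is just something made up for the example):

-- Hypothetical one-off logging table; create it once somewhere convenient.
IF OBJECT_ID(N'dbo.JobContextLog', N'U') IS NULL
    CREATE TABLE dbo.JobContextLog
    (
        logged_at  datetime NOT NULL DEFAULT (GETDATE()),
        login_name sysname  NOT NULL,
        user_name  sysname  NOT NULL
    );

-- Put this at the top of the job step: it captures the security context
-- the step really runs under, which you can compare to your own SSMS session.
INSERT INTO dbo.JobContextLog (login_name, user_name)
VALUES (SUSER_SNAME(), USER_NAME());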
August 6, 2010 at 8:43 am
What user account are you using for the SQL job?
Have you looked at the job history? Have you looked at the SQL logs? Is there a USE statement at the top of the query that may be pointing at the wrong database?
If all else fails, drop the job and re-create it. Use one of the successful jobs as a template.
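It is also worth pulling the step-level history directly from msdb - even a "successful" step sometimes logs a message there that the GUI summary does not surface. Something like this (the job name is just an example):

-- Step-level job history; swap in your own job name.
SELECT  j.name  AS job_name,
        h.step_id,
        h.step_name,
        h.run_status,     -- 0 = failed, 1 = succeeded, 2 = retry, 3 = canceled
        h.run_date,
        h.run_time,
        h.message
FROM    msdb.dbo.sysjobhistory AS h
JOIN    msdb.dbo.sysjobs       AS j ON j.job_id = h.job_id
WHERE   j.name = N'Daily Prior Day Load'   -- hypothetical job name
ORDER BY h.instance_id DESC;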
Andrew SQLDBA
August 6, 2010 at 9:22 am
There is not a USE statement at the beginning of the query. I prefaced all of my schemas/tables with the appropriate db information.
The job history simply shows it was completed successfully. As far as the entry for which user executed the job, it lists it as:
"NT AUTHORITY\SYSTEM".
I tried recreating the job using a template of one that is successfully populating data and I got the same result - job succeeds, but no data populated.
The only code difference between the job that populates and this one that doesn't (other than that they serve different purposes and pull different data from different tables) is that this one uses openquery logic.
August 6, 2010 at 9:29 am
That is it. There is your problem. The openquery is not getting any recordset back from whatever it is trying to select data from, and in turn it is not populating any other tables. I am assuming you are trying to select data from a text file.
Create a user account, give that account access to wherever the text file is sitting, and then use that same account to fire off the SQL job. You should never use the NT Auth account or the Network Auth accounts in SQL, or anywhere else really. You could also use the same Windows account that runs the SQL Service and give that account access to the directory.
I would actually create an SSIS package to handle all of this, and never use openquery.
Am I assuming correctly about the text file, and that the account running the SQL Service does not have access to that directory?
Andrew SQLDBA
August 9, 2010 at 12:02 am
Mr.Corn, are you populating data into a global temp table? If yes, are there any jobs scheduled at the same time that would drop that global temp table?
August 9, 2010 at 6:40 am
No, not a text file. I'm hitting an Oracle back-end db. I'm using the openquery because it seems to be the only way I can get the data extraction to run (otherwise it takes so long - hours - that it eventually would time out). And, I'm also hitting the SYS.DUAL table on the Oracle server to pull in the SYSDATE and the SYSDATE-1 for a date range of data to pull, and to also create a dynamic YYYYMM flag from the source db.
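Just to show the general shape of what the step does (the linked server, schema, table, and column names here are made up, not my real ones):

-- Illustrative pattern only - all names are placeholders. The Oracle side
-- resolves SYSDATE / SYSDATE-1 and the YYYYMM value itself.
INSERT INTO dbo.PriorDayStaging (order_id, order_date, period_yyyymm)
SELECT  order_id,
        order_date,
        period_yyyymm
FROM    OPENQUERY(ORACLE_LINK,
        'SELECT o.order_id,
                o.order_date,
                TO_CHAR(SYSDATE - 1, ''YYYYMM'') AS period_yyyymm
         FROM   some_schema.orders o
         WHERE  o.order_date >= TRUNC(SYSDATE) - 1
           AND  o.order_date <  TRUNC(SYSDATE)');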
It's just weird that I can run the exact same query, letter for letter, outside of the job and it completes and populates within minutes, but from within the job it completes in about the same time but doesn't populate. :unsure:
I tried recreating the embedded query using my SQL Server's date to build the dynamic date range and the YYYYMM value, without an openquery - that version of the job fails / times out after 12 hours, but the query will still run if I execute it in SSMS on its own.
December 6, 2015 at 10:58 am
Mr Corn Man
Several years ago you had an issue where a query in SQL Agent said it completed but the records were not in the table.
We have the same issue. I have a sproc run from SSIS, a stand-alone query in SSIS, and also a stand-alone query in SQL Agent. I also have a point where SQL Agent pauses for 5 minutes before running the query.
I have other queries that run fine, so it's not a permissions issue. This is a simple update to the Product table from the ProductOld table.
SSIS and SQL Agent both say the query ran correctly, but it did not update any rows. IF - BIG IF - I run the exact same SSIS job 2 hours after the original ran, the SSIS job runs and the query DOES update. We have been going crazy trying to find a solution.
Not permissions, no errors, just a simple table-to-table update within the same DB, inside a set of 15 steps in a SQL Agent job. The reason we know it does not update is that the next step in the SQL Agent job will fail IF the update does not run. So when step 21 fails, I know the query in step 20 did not update. If I start the SQL Agent job at step 20 to run the update, it finishes and then step 21 completes correctly. On some days when I ran the query manually in SQL Agent, I could see the rows that did not update in the nightly run, so without question the update is not occurring.
Any help, ideas appreciated.
December 6, 2015 at 12:25 pm
Well, it's been 5 years and unfortunately I don't recall the specifics of the issue or how I went about fixing it. Short of your issue being a permissions problem with the account used to run the job, and not knowing more about what your query is doing, I'm not sure what to tell you. Here are some thoughts:
When it fails, why do you wait 2 hours to try running it again?
--> Is it possible the source data is not available when you're running the job initially, but it is later?
When you rerun it and it is successful, how are you executing the job? Are you manually right clicking and selecting 'start job' or 'start job at step', or are you rerunning it through another scheduled start time?
--> If you're manually running it via the right click method, try scheduling it to run right away and see if it works or fails. If it fails, but you can right click and run it and it passes, it must be a permission/rights issue with the agent versus your ID.
Is it successful when you rerun it prior to 2 hours later? Meaning, can you rerun it immediately, in 5 minutes, 10 minutes, 30 minutes, etc... and it is successful?
--> If 2 hours have to pass prior to it running successfully, then it may be an issue with the source data not being available when you initially try to run the job.
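One more thought: you could make the update step fail loudly whenever it touches zero rows, so a silent "success" becomes a visible failure the moment it happens. Something along these lines (Product/ProductOld come from your description; the join column and updated column are guesses, so adjust to your real schema):

-- Sketch only: adjust column names to your real schema.
DECLARE @rows int;

UPDATE  p
SET     p.Price = po.Price
FROM    dbo.Product    AS p
JOIN    dbo.ProductOld AS po ON po.ProductID = p.ProductID;

SET @rows = @@ROWCOUNT;

IF @rows = 0
    RAISERROR('Product update affected 0 rows - failing the job step.', 16, 1);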
December 6, 2015 at 1:01 pm
I'm thinking that it still might be a privs thing for the login that SQL Server Agent uses.
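A couple of quick queries will tell you what you're actually dealing with there (the second DMV needs 2008 R2 SP1 or later):

-- Who owns the job? T-SQL steps run in the owner's context
-- (typically the Agent service account's if the owner is a sysadmin).
SELECT  name, SUSER_SNAME(owner_sid) AS job_owner
FROM    msdb.dbo.sysjobs;

-- Which Windows accounts do the SQL Server and SQL Server Agent services run as?
SELECT  servicename, service_account
FROM    sys.dm_server_services;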
--Jeff Moden
Change is inevitable... Change for the better is not.
December 6, 2015 at 8:49 pm
Best to start a new thread. In the new thread I suggest scripting out the job, obfuscating names via Replace All, posting the sanitized script as SQL, and inspecting and stating step 19's advanced options. Also post the new thread's URL here.