July 21, 2010 at 9:35 am
Hi, I have a pretty simple SSIS package that uses a Foreach Loop to look for files saved on a local server. It then takes those files and loads them into a SQL DB. The package runs perfectly in BIDS. Then I tried to schedule it with SQL Server Agent and ran into the usual problems. I now have it running successfully as a SQL Agent job, but it executes in 2 seconds and does not do its task. In other words, it reports success, but no files actually get copied to the DB.
I believe the problem is permission related. I don't think the SQL Agent can see any of the files in the designated folder (which is on another machine in the domain). I am using my Windows Authentication account as owner in the job step, and I created the package with the same username. The package is saved on the same SQL Server as the DB it copies to.
On an unrelated matter, is there an easy way to delete each file after the package reads it into the DB?
Any help would be greatly appreciated; I've been struggling with this for a few days now.
July 21, 2010 at 1:26 pm
Check to make sure the path to the folder and files in your package is not a relative one (a mapped drive, or one only visible that way from the local machine). I find it best to use a UNC path (\\hostname\sharename\path\folder).
The service account, or whatever other account you are using for SSIS, should have the proper permissions on the folder to do the tasks you are asking of it.
July 22, 2010 at 9:11 am
It's not the mapping, I'm using UNC. The only thing I can think of is the account running the Agent job doesn't have correct permissions to see the files in the folder on another server in the domain.
The problem with this theory is that I'm using my domain username and login to run the Agent job. My account has permissions both to run the job and to access that server, so theoretically it should work.
The two solutions I can think of are:
A. SQL Server Agent proxy account
B. Create a new domain user and give it access to the local server the files are on and SQL Server
Thoughts?
July 22, 2010 at 11:44 am
Also, how do you see the detailed error log for the Agent job? The history is not enough information.
July 22, 2010 at 11:52 am
If it were a security related issue (i.e. insufficient permissions), in many cases the package would just outright fail with a glaring indication of insufficient permissions. It might be helpful to add some sort of logging in the process to see what is going on. It sounds more like your package is suddenly looking at a folder where nothing matches the search criteria in your foreach container (it's happened to me several times). Perhaps writing out the values used for the folder and file settings in your foreach container and then writing out the files that match the criteria could give you some more insight. Maybe a configuration file or expression somewhere is changing the folder you expect to be looking at.
For deleting the file after importing, just add a File System Task inside your Foreach container after your import process and set its operation to Delete File; that should do the job just fine.
Kris
July 22, 2010 at 12:15 pm
How do I create a detailed log? And could the problem maybe be the file type it's trying to read? It's saved as a weird unknown file type. Maybe I need a config file?
July 22, 2010 at 12:43 pm
Adding and Configuring Logging from MSDN should get you started. They're actually instructions on how to enable logging for a tutorial but you can probably adapt it to your needs.
Basically it says: enable logging within BIDS by selecting SSIS-->Logging and adding a logging provider (for troubleshooting like this, a flat text file is as expedient as anything).
Select the containers on the left that you want to log events from (note that grayed-out check boxes seem to mean they inherit their settings from their parent container; to set them explicitly, just click the check box accordingly).
Enable the log provider that you just added.
Then for each container that you enabled, go to the details tab and select the events that you want to log. OnError and OnInformation will probably give you enough info to work with.
Be sure to disable or curtail this logging once you work out your problem, since these events will really expand the file size if left unchecked.
Hope it helps.
Kris
August 18, 2010 at 2:58 pm
I have a related question. I have a package that is working and logging correctly under BIDS 2005 (SS 2005). It logs what I want to a text file. But when I run the package under the scheduler, all it logs is "Package Started" and "Package Ended". How do I get it to log the same under the scheduler as under BIDS?
Thanks,
John O'Brien
kris386 (7/22/2010)
Adding and Configuring Logging from MSDN should get you started. They're actually instructions on how to enable logging for a tutorial, but you can probably adapt it to your needs. Basically it says: enable logging within BIDS by selecting SSIS-->Logging and adding a logging provider (for troubleshooting like this, a flat text file is as expedient as anything).
Select the containers on the left that you want to log events from (note the grayed out check boxes seem to mean that they inherit their settings from their parent container, to explicitly set them just hit the check box accordingly)
Enable the log provider that you just added.
Then for each container that you enabled, go to the details tab and select the events that you want to log. OnError and OnInformation will probably give you enough info to work with.
Be sure and disable or curtail this logging once you work out your problems since these events will really expand the file size if left unchecked.
Hope it helps.
Kris
February 1, 2011 at 8:06 am
I have the exact same issue. My package only logs the PackageStart, PackageEnd and OnError events when executed from the SQL Agent or dtexec, even though I enabled OnPreExecute and OnPostExecute on all containers. When I run it from BIDS or the SSIS service, it logs all events I enabled for the containers.
I noticed that when I schedule the package in the job, the /Reporting parameter for dtexec defaults to E, which could explain why it only logs the errors. But I can't override it! I've scripted the job, dropped it, and then recreated it with "/Reporting V", but when I open the job, the parameter defaults back to "/Reporting E"?!
Does anyone have any advice?!
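One workaround I've seen suggested (not verified against your setup) is to bypass the SSIS job step type entirely: make the step an Operating system (CmdExec) step that calls dtexec yourself, so the /REPORTING switch is whatever you type rather than what the job designer regenerates. A sketch of creating such a step in T-SQL; the job name, package path, and server name are placeholders:

```sql
-- Hypothetical CmdExec job step that runs the package via dtexec directly,
-- so the /REPORTING V (verbose) switch is preserved. Names are placeholders.
EXEC msdb.dbo.sp_add_jobstep
    @job_name  = N'LoadFilesJob',
    @step_name = N'Run SSIS package (verbose reporting)',
    @subsystem = N'CmdExec',
    @command   = N'dtexec /SQL "\MyFolder\MyPackage" /SERVER "MYSQLSERVER" /REPORTING V';
```

The trade-off is that the nice SSIS step UI goes away, but the command line is then stored verbatim and won't be rewritten when the job is reopened.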
February 1, 2011 at 1:46 pm
What may be happening is that the service account the scheduler runs your package under may not have write access to the folder the log file is being written to. The Package Started/Ended messages come from the process that starts the job under the scheduler (a different ID). I had to bug my DBA to set write access to the folder for the service account.
hennie.leach (2/1/2011)
I have the exact same issue. My package only logs the PackageStart, PackageEnd and OnError events when executed from the SQL Agent or dtexec, even though I enabled OnPreExecute and OnPostExecute on all containers. When I run it from BIDS or the SSIS service, it logs all events I enabled for the containers. I noticed that when I schedule the package in the job, the /Reporting parameter for dtexec defaults to E, which could explain why it only logs the errors. But I can't override it! I've scripted the job, dropped it, and then recreated it with "/Reporting V", but when I open the job, the parameter defaults back to "/Reporting E"?!
Does anyone have any advice?!
February 1, 2011 at 11:55 pm
Hi, thanks for the reply, but that is unfortunately not the issue. I use the SQL Server log provider. I tried logging to a file, gave the service account access to the file system, but I get the same issue.
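For what it's worth, with the SQL Server log provider the events land in a table in whatever database the logging connection manager points at (dbo.sysdtslog90 on SQL Server 2005; dbo.sysssislog on 2008), so you can at least confirm exactly which events made it in from each run. A quick check, assuming the default table name:

```sql
-- Inspect the most recent SSIS log entries written by the SQL Server log provider.
-- SQL Server 2005 table name shown; on 2008 the table is dbo.sysssislog.
SELECT TOP (50) event, source, starttime, message
FROM dbo.sysdtslog90
ORDER BY starttime DESC;
```

Comparing the rows from a BIDS run against an Agent run should show whether the missing events were filtered out or never fired.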
I've searched the web for a solution, where many have the same issue with no way forward...
February 2, 2011 at 8:46 am
I have a special domain user account reserved for this task which has the necessary Windows permissions on the places it copies files to and from, plus the SQL permissions necessary to run the jobs.
This is then set up as the proxy account and selected when creating the job steps. Everything works beautifully; the usual task is to copy a backup file to another server location or to run SSIS packages, for which it needs one of the DTS roles in msdb.
February 2, 2011 at 9:00 am
Can you give us some details on how this account is set up, and why the default service account that jobs run under doesn't work? And also how the proxies work?
P Jones (2/2/2011)
I have a special domain user account reserved for this task which has the necessary windows permissions on the places it copies files to and from plus the sql permissions necessary to run the jobs. This is then set up as the proxy account and selected when creating the jobsteps. Everything works beautifully - the usual task is to copy a backup file to another server location or to run ssis packages, for which it needs one of the dts roles in msdb.
February 3, 2011 at 6:58 am
The service accounts don't have the permissions across servers that the special account has, and you want to keep them that way for security. BOL or Google will tell you about setting up a credential (Server Security section in Management Studio) and then setting that credential as a proxy (under SQL Agent).
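The T-SQL behind those Management Studio clicks looks roughly like this; the account, password, proxy, and login names are all placeholders for whatever your security people hand you:

```sql
-- 1. Credential holding the special domain account (Security > Credentials in SSMS).
CREATE CREDENTIAL SSISFileCopy
    WITH IDENTITY = N'DOMAIN\ssis_runner', SECRET = N'StrongPasswordHere';

-- 2. SQL Agent proxy wrapping that credential.
EXEC msdb.dbo.sp_add_proxy
    @proxy_name = N'SSISFileCopyProxy',
    @credential_name = N'SSISFileCopy';

-- 3. Allow the proxy to run SSIS job steps.
EXEC msdb.dbo.sp_grant_proxy_to_subsystem
    @proxy_name = N'SSISFileCopyProxy',
    @subsystem_name = N'SSIS';

-- 4. Let a non-sysadmin login pick this proxy when creating job steps.
EXEC msdb.dbo.sp_grant_login_to_proxy
    @proxy_name = N'SSISFileCopyProxy',
    @login_name = N'DOMAIN\job_owner';
```

Once that's in place, the proxy shows up in the "Run as" dropdown on the job step, and the step executes under the domain account instead of the Agent service account.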
February 4, 2011 at 12:43 pm
I guess I'm just trying to avoid the hassle of getting our data security people to go along with setting up a special account and using it as a proxy. Also, the package is logging to a file on the database server itself so I don't see why the service account would have problems with permissions. Lastly, is there some way to get some kind of error message about why the logging is failing?
P Jones (2/3/2011)
The service accounts don't have the permissions across servers that the special account has and you want to keep them that way for security. BOL or Google will tell you about setting up a credential (server security section in mgmt studio) and then setting that credential as a proxy (under sql agent)