Really Bizarre Flat File Destination issue

  • I have a data flow task in an SSIS package that works when I run it on my workstation in SSDT, but when one of my coworkers runs it on his machine, or when I try to run it as a deployed package on the server, the file ends up being 0 KB. When I look at the messages in the All Executions report on the server, the SSIS package shows a successful completion, and the OnInformation event in the message log shows the following:

    To CSVs:Information: "SURVEYMMDDYY CSV"
    wrote 15471 rows.


    I thought it might be some sort of file permissions error, but the same package successfully writes two other files to the same directory without any problems, although those are generated from the reportgenerator SSIS task from http://reportgeneratortask.codeplex.com/.

    I am confused, because I thought that if the task could not write to the file, it would error out in some way, or at least say "wrote 0 rows". Does anybody have any idea why this would be happening?

  • Steven.Grzybowski - Tuesday, January 17, 2017 6:50 AM

    I have a data flow task in an SSIS package that works when I run it on my workstation in SSDT, but when one of my coworkers runs it on his machine, or when I try to run it as a deployed package on the server, the file ends up being 0 KB. When I look at the messages in the All Executions report on the server, the SSIS package shows a successful completion, and the OnInformation event in the message log shows the following:

    To CSVs:Information: "SURVEYMMDDYY CSV"
    wrote 15471 rows.


    I thought it might be some sort of file permissions error, but the same package successfully writes two other files to the same directory without any problems, although those are generated from the reportgenerator SSIS task from http://reportgeneratortask.codeplex.com/.

    I am confused, because I thought that if the task could not write to the file, it would error out in some way, or at least say "wrote 0 rows". Does anybody have any idea why this would be happening?

    Are all three files using data from the same source?

    The absence of evidence is not evidence of absence
    - Martin Rees
    The absence of consumable DDL, sample data and desired results is, however, evidence of the absence of my response
    - Phil Parkin

  • You said it occurs when the package is run on the server, by the proxy or service account, right? Or by another, specific user?

    Check the permissions on the share being written to; it might be that the permissions for READ and WRITE, but not MODIFY, are in place.
    That would explain how it creates a stub file but cannot reopen it and add the real data, I'd think.

    Lowell


    --help us help you! If you post a question, make sure you include a CREATE TABLE... statement and INSERT INTO... statement into that table to give the volunteers here representative data. with your description of the problem, we can provide a tested, verifiable solution to your question! asking the question the right way gets you a tested answer the fastest way possible!

  • Lowell - Tuesday, January 17, 2017 7:18 AM

    You said it occurs when the package is run on the server, by the proxy or service account, right? Or by another, specific user?

    Check the permissions on the share being written to; it might be that the permissions for READ and WRITE, but not MODIFY, are in place.
    That would explain how it creates a stub file but cannot reopen it and add the real data, I'd think.

    I took a look at the permissions, and read, write & modify permissions are present for the proxy account.

  • Phil Parkin - Tuesday, January 17, 2017 7:07 AM

    Steven.Grzybowski - Tuesday, January 17, 2017 6:50 AM

    I have a data flow task in an SSIS package that works when I run it on my workstation in SSDT, but when one of my coworkers runs it on his machine, or when I try to run it as a deployed package on the server, the file ends up being 0 KB. When I look at the messages in the All Executions report on the server, the SSIS package shows a successful completion, and the OnInformation event in the message log shows the following:

    To CSVs:Information: "SURVEYMMDDYY CSV"
    wrote 15471 rows.


    I thought it might be some sort of file permissions error, but the same package successfully writes two other files to the same directory without any problems, although those are generated from the reportgenerator SSIS task from http://reportgeneratortask.codeplex.com/.

    I am confused, because I thought that if the task could not write to the file, it would error out in some way, or at least say "wrote 0 rows". Does anybody have any idea why this would be happening?

    Are all three files using data from the same source?

    All the data comes from the same SQL table, but two files are an SSRS report being exported to an XLS and a PDF, while the one that is causing me trouble comes from a data flow task.

  • Steven.Grzybowski - Tuesday, January 17, 2017 7:20 AM

    Lowell - Tuesday, January 17, 2017 7:18 AM

    You said it occurs when the package is run on the server, by the proxy or service account, right? Or by another, specific user?

    Check the permissions on the share being written to; it might be that the permissions for READ and WRITE, but not MODIFY, are in place.
    That would explain how it creates a stub file but cannot reopen it and add the real data, I'd think.

    I took a look at the permissions, and read, write & modify permissions are present for the proxy account.

    Additionally, shouldn't it say that it wrote 0 rows if it could not write to the file?

  • Additionally, shouldn't it say that it wrote 0 rows if it could not write to the file?

    I think the Flat File Connection Manager will create a zero-byte file in the PreExecute/Validation phase, before it even gets to the processes that actually connect and process rows.
    That's where I thought you'd see a zero-byte file but no data if there was a permissions issue.
    Is it possible that some parameter or constraint is not being met, so it never runs the logic that populates the file?

    Lowell



  • Lowell - Tuesday, January 17, 2017 8:32 AM

    Additionally, shouldn't it say that it wrote 0 rows if it could not write to the file?

    I think the Flat File Connection Manager will create a zero-byte file in the PreExecute/Validation phase, before it even gets to the processes that actually connect and process rows.
    That's where I thought you'd see a zero-byte file but no data if there was a permissions issue.
    Is it possible that some parameter or constraint is not being met, so it never runs the logic that populates the file?

    The connection manager for the flat file connection has an expression that names the file SURVEYMMDDYY.CSV.
    The file is created when the data flow task runs, so I would think that the file it is attempting to populate would be the same one it created.
    As far as parameters/constraints, there are none that can be set on the server. All of the variables use the current date to populate.
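    For reference, a date-stamped filename like this is usually produced by a property expression on the Flat File Connection Manager's ConnectionString. The OP's actual expression isn't shown; a typical sketch (folder path omitted, names hypothetical) looks like:

```
"SURVEY"
+ RIGHT("0" + (DT_WSTR, 2) MONTH(GETDATE()), 2)
+ RIGHT("0" + (DT_WSTR, 2) DAY(GETDATE()), 2)
+ RIGHT((DT_WSTR, 4) YEAR(GETDATE()), 2)
+ ".CSV"
```

    One thing worth noting about expressions like this: GETDATE() is re-evaluated each time the expression is resolved, so in principle a validation-time evaluation and an execution-time evaluation could produce different names if they straddle midnight.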

  • Steven.Grzybowski - Tuesday, January 17, 2017 7:39 AM

    Steven.Grzybowski - Tuesday, January 17, 2017 7:20 AM

    Lowell - Tuesday, January 17, 2017 7:18 AM

    You said it occurs when the package is run on the server, by the proxy or service account, right? Or by another, specific user?

    Check the permissions on the share being written to; it might be that the permissions for READ and WRITE, but not MODIFY, are in place.
    That would explain how it creates a stub file but cannot reopen it and add the real data, I'd think.

    I took a look at the permissions, and read, write & modify permissions are present for the proxy account.

    Additionally, shouldn't it say that it wrote 0 rows if it could not write to the file?

    Modify is unticked in this screen shot.


  • Phil Parkin - Tuesday, January 17, 2017 8:44 AM

    Modify is unticked in this screen shot.

    I thought that at first, but that's Lowell's screenshot, not the OP's 🙂

    Thom~

    Excuse my typos and sometimes awful grammar. My fingers work faster than my brain does.
    Larnu.uk

  • After some tinkering, the hacky workaround that fixes the problem is to use a C# Script Task to create a blank file with the same filename, then have the data flow task run after that. I am wondering if for some reason there was some kind of lock on the file after creating it, but a simple File.Create(filename).Dispose() in C# seems to have fixed the problem.
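    For reference, a minimal sketch of that Script Task body (the path and filename here are hypothetical stand-ins; the real package derives SURVEYMMDDYY.CSV from the current date):

```csharp
using System;
using System.IO;

class CreateStubFile
{
    static void Main()
    {
        // Hypothetical path; the real package builds the name via the
        // connection manager's date expression and writes to a share.
        string fileName = "SURVEY" + DateTime.Now.ToString("MMddyy") + ".CSV";
        string path = Path.Combine(Path.GetTempPath(), fileName);

        // File.Create returns an open FileStream; disposing it immediately
        // releases the handle so a later writer is not locked out.
        File.Create(path).Dispose();

        Console.WriteLine(File.Exists(path)); // prints True
    }
}
```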

  • Steven.Grzybowski - Tuesday, January 17, 2017 11:58 AM

    After some tinkering, the hacky workaround that fixes the problem is to use a C# Script Task to create a blank file with the same filename, then have the data flow task run after that. I am wondering if for some reason there was some kind of lock on the file after creating it, but a simple File.Create(filename).Dispose() in C# seems to have fixed the problem.

    We have had trouble with expressions in connections (for files and databases) quite often. Sometimes delayed validation provides a fix; sometimes the problem keeps returning every now and then. As far as I know it is also related to a timing issue, which is why it may not occur on one machine (workstation or server) yet seem almost persistent on another. If the run does not end properly, the file 'connection' may remain open, and you cannot access its content or even see that it has any. I suggest looking for a dangling process and making sure that everything has finished running before you try to open this CSV file.

  • vliet - Wednesday, January 18, 2017 2:50 AM

    Steven.Grzybowski - Tuesday, January 17, 2017 11:58 AM

    After some tinkering, the hacky workaround that fixes the problem is to use a C# Script Task to create a blank file with the same filename, then have the data flow task run after that. I am wondering if for some reason there was some kind of lock on the file after creating it, but a simple File.Create(filename).Dispose() in C# seems to have fixed the problem.

    We have had trouble with expressions in connections (for files and databases) quite often. Sometimes delayed validation provides a fix; sometimes the problem keeps returning every now and then. As far as I know it is also related to a timing issue, which is why it may not occur on one machine (workstation or server) yet seem almost persistent on another. If the run does not end properly, the file 'connection' may remain open, and you cannot access its content or even see that it has any. I suggest looking for a dangling process and making sure that everything has finished running before you try to open this CSV file.

    The best guess I have is that my workstation tends to have a lot more things running on it, which makes it a bit more sluggish, while my coworker's machine is generally running one or two things at a time, as he mostly does QA/testing. Additionally, running directly on the server is faster than running it on my machine over the network.

    I will most likely just go ahead and use the File.Create(filename).Dispose() method for anything where I need to create a file on the fly when a package is run, just to make sure this issue can be avoided.

  • So as of the last few times testing this, even after creating an empty file first, the file still ends up blank, with the data flow task still saying that it has written X rows. I even tried putting two of the same data flow task in the package, just to see if the second time around would fix this, but no luck there either.

    I checked over the permissions, and the proxy being used to run the Agent job has full permissions on the folder I am attempting to write to. Additionally, two other files get written to this directory with no issues.

    I am very confused as to why I am having this issue.

  • Steven.Grzybowski - Monday, January 23, 2017 6:39 AM

    So as of the last few times testing this, even after creating an empty file first, the file still ends up blank, with the data flow task still saying that it has written X rows. I even tried putting two of the same data flow task in the package, just to see if the second time around would fix this, but no luck there either.

    I checked over the permissions, and the proxy being used to run the Agent job has full permissions on the folder I am attempting to write to. Additionally, two other files get written to this directory with no issues.

    I am very confused as to why I am having this issue.

    The latest thing I have done to attempt to diagnose this was to put a WAITFOR DELAY of one minute between the creation of the blank file and the writing of the rows. The timestamps on the file show a gap of one minute between the created and modified timestamps. I am at a complete loss as to why nothing will write to this file.
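    For reference, a delay like the one described is a one-line T-SQL statement, typically placed in an Execute SQL Task between the file-creation step and the data flow (the one-minute interval matches the test above):

```sql
-- Pause for one minute between the blank-file creation and the data flow.
WAITFOR DELAY '00:01:00';
```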
