SSIS Flatfile destination

  • I have a package where, after some T-SQL tasks, nearly 154 rows get written to a flat file destination as a .txt file.

    The problem is that I see all 154 rows in the preview when I explore the package,

    and I see them all when I run the T-SQL via SSMS,

    but when I put the code in the package it writes nothing.

    Sometimes the console shows

    154 rows written, and at other times it shows

    0 rows written.

    I don't get it. Why?

    The tasks above and below it work fine (they create the header and trailer for the text file).

    What could the possible troubleshooting steps be?

    Thanks in advance

    [font="Verdana"]
    Today is the tomorrow you worried about yesterday:-)
    [/font]

  • Do you receive any errors or warnings?

    Is the flat file destination green at the end of the package?

    Are you looking at the right file?

    Do you use any variables or expressions?
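    One way to answer the "right file" question when the file name comes from a variable or expression is to survey everything the package may have written. The sketch below (Python, with a hypothetical output folder path) lists every .txt file in a folder with its line count and last-modified time, so a file that received the 154 rows under an unexpected name stands out.

```python
# Sketch: when the file name is built from an SSIS variable/expression, the
# rows may land in a different file than the one being inspected. This scans
# an output folder (hypothetical path, e.g. r"C:\exports") for .txt files and
# reports their line counts and modification times.
import os
import time

def survey_output_files(folder):
    """Return (name, line_count, modified) for each .txt file in folder."""
    results = []
    for name in sorted(os.listdir(folder)):
        if not name.lower().endswith(".txt"):
            continue
        path = os.path.join(folder, name)
        with open(path, encoding="utf-8") as f:
            line_count = sum(1 for _ in f)  # count physical lines in the file
        modified = time.ctime(os.path.getmtime(path))
        results.append((name, line_count, modified))
    return results
```

    A file whose line count equals only the header plus trailer, next to a sibling with 156 lines, would point at the expression rather than the data flow.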

    Need an answer? No, you need a question
    My blog at https://sqlkover.com.
    MCSE Business Intelligence - Microsoft Data Platform MVP

    YES, the package is green.

    No, I don't get any warnings.

    YES, I am looking at the right file (the tasks that are supposed to create the header and footer are working fine; I see the header and footer content in the file).

    Yes, I am using a variable for creating the file name.

    [font="Verdana"]
    Today is the tomorrow you worried about yesterday:-)
    [/font]

  • I'm seeing something similar... no answer for you.

    The relevant log lines appear as follows:

    [DTS.Pipeline] Information: "component "Error Destination" (6253)" wrote 3 rows.

    [DTS.Pipeline] Information: "component "Flat File Destination" (51695)" wrote 0 rows.

    [DTS.Pipeline] Information: "component "Destination" (54)" wrote 0 rows.

    This file is designed to make the component fail and contains only 3 rows. I'm trying to figure out how to get the information to write to the file. I've been wondering if it has to do with being a Power User and not an Admin on the box.

    Are you an admin on the box?
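    Those "wrote N rows" lines can be scraped to compare components at a glance. A minimal sketch, assuming the exact [DTS.Pipeline] log format quoted above:

```python
# Sketch: pull per-component row counts out of SSIS pipeline log text so a
# destination that wrote 0 rows is easy to spot. Assumes the
# '"component "Name" (id)" wrote N rows.' pattern shown in this thread.
import re

ROW_LINE = re.compile(r'component "([^"]+)" \((\d+)\)" wrote (\d+) rows\.')

def rows_written(log_text):
    """Map component name -> rows written, parsed from pipeline log output."""
    counts = {}
    for match in ROW_LINE.finditer(log_text):
        name, _component_id, rows = match.groups()
        counts[name] = int(rows)
    return counts
```

    Running it over the three lines above would show the Error Destination receiving the 3 rows that the Flat File Destination did not, which is itself a hint that the rows are being redirected on error rather than dropped.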

    Jamie

  • Obviously the data records are not being sent down the flow, since only headers are being written to the file. Review the T-SQL statements or the source components at the beginning of the flow.
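    Since the header and trailer tasks write to the same file, counting the lines between them confirms whether the data flow contributed anything. A sketch, assuming a one-line header and one-line trailer (a hypothetical layout; adjust the counts to the real format):

```python
# Sketch: number of data rows actually present in the flat file, excluding
# the header and trailer written by the surrounding tasks. Assumes one header
# line and one trailer line (hypothetical; change the defaults if the real
# file uses more).
def data_row_count(path, header_lines=1, trailer_lines=1):
    """Data rows in the flat file, excluding header and trailer lines."""
    with open(path, encoding="utf-8") as f:
        total = sum(1 for _ in f)
    return max(total - header_lines - trailer_lines, 0)
```

    A result of 0 against an expected 154 narrows the problem to the data flow task rather than the file connection.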

  • Sometimes (like this instance with nearly 200 columns) it is easier to start over. Starting over works.

    Jamie
