Exporting Data to a CSV File

  • Hi,

    When I am trying to export data from a SQL Server view to a .csv file using the data flow task, the data in the output .csv file is always out of sync with the header, i.e. data is landing outside the last column; for example, data that belongs in the 20th column ends up in the 25th column.

    Could anyone please tell me how to get the data in exact sync with the header?

    Thank You,

    Best Regards,

    SQLBuddy.

    Unless the data you're exporting is *exactly* the same length as the headers for the columns, you're never going to 'align' the headers with the data in the rows. By its nature/design, a row within a CSV file is variable in length; the way you discern between 'fields' is through the use of the separator (e.g. comma, pipe, etc.).

    If your headers are "A", "B", "C" and your data is anything more than one character long, you'll be 'out of sync'.

    HTH,

    Steve.

  • Hi Steve,

    Thank You very much.

    Could you please tell me how to overcome this problem? Which parameters should I modify in the Flatfile connection string in order to fix this problem?

    If I am missing any thing, please guide me.

    Thank You,

    Best Regards,

    SQLBuddy.

    With a CSV (or any separated file), there is nothing to change; this is how it works 🙂

    If you truly want a file that aligns the data (say when viewed in Notepad) then you could try a fixed width file instead of csv. This will ensure that (say) column 4 *always* starts at position 50 and runs for 20 characters. The downside to this style of file is that you need to know, ahead of time, what the max width of any of the exported fields will be.
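    To make the fixed-width idea concrete, here is a minimal sketch (in Python rather than SSIS, purely for illustration; the column names and widths are made up):

    ```python
    # Sketch of a fixed-width export: every column is padded to a known
    # maximum width, so each value always starts at the same character
    # offset. Column names and widths here are hypothetical.
    widths = {"id": 6, "name": 20, "city": 15}

    def fixed_width_row(values, widths):
        """Pad each value with trailing spaces to its column's fixed width."""
        return "".join(str(v).ljust(w) for v, w in zip(values, widths.values()))

    header = fixed_width_row(widths.keys(), widths)
    row = fixed_width_row([1, "Steve", "Auckland"], widths)

    print(header)
    print(row)
    ```

    Because every row is the same length, the header lines up with the data in Notepad; the trade-off, as noted above, is that you must know the maximum width of each field in advance.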

    On a different tack, what are you trying to achieve (other than your initial statement about getting headers aligned with data)? Generally, in the scheme of things, the alignment of a header with the data is of little consequence. If you want to test/view the data in columns, import it into Excel and during the import, Excel will assign each value to its appropriate field. It's the same when importing to SQL Server, Oracle, any RDBMS and many OLAP tools (e.g. Cognos Transformer): the import recognizes the csv as variable width and loads the fields correctly.

    Steve.

  • Hi Steve,

    Thank you very much for your reply.

    But my company wants data only in .csv file format.

    Could you please tell if there is any workaround for this issue?

    Thank You,

    Best Regards,

    SQLBuddy

    I think I may have misunderstood the issue. When you say

    When I am trying to export data from a SQL Server view to a .csv file using the data flow task, the data in the output .csv file is always out of sync with the header, i.e. data is landing outside the last column; for example, data that belongs in the 20th column ends up in the 25th column.

    do you mean that, for some reason, data that should be wholly contained in the 20th field is, when importing/using in Excel etc., appearing in the 25th column?

    Assuming the answer is yes, can you confirm that your data doesn't contain commas? If it does, you will either want to choose a different separator (one that doesn't appear in the data itself, e.g. | [pipe] or ~ [tilde]), OR ensure that you have text quoting turned on, so any string containing the separator (comma ',') is enclosed in your string quotes. So

    <field_data_1>,Here, I see you have a problem, No?,<field_data_2>, etc

    would become

    <field_data_1>,"Here, I see you have a problem, No?",<field_data_2>, etc

    Note that this too can have problems if you choose double quotes as the string delimiter/identifier (e.g. ") and your data includes these too (most often found when referring to inches, as in "Steve is 6'0" tall, wow!"); this will break, as it has a double quote within the string.
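    A quick sketch of both behaviours, using Python's csv module as a stand-in for what a CSV-producing tool would emit (the field values are made up): a field containing the separator gets wrapped in the text qualifier, and an embedded double quote is escaped by doubling it.

    ```python
    import csv
    import io

    # A field containing the comma separator is wrapped in the text
    # qualifier ("); an embedded double quote is escaped by doubling it.
    buf = io.StringIO()
    writer = csv.writer(buf)  # defaults: comma separator, " qualifier
    writer.writerow(["field_data_1", "Here, I see you have a problem, No?", "field_data_2"])
    writer.writerow(["field_data_1", 'Steve is 6\'0" tall, wow!', "field_data_2"])
    print(buf.getvalue())

    # A conforming reader reverses both transformations.
    rows = list(csv.reader(io.StringIO(buf.getvalue())))
    ```

    The doubled quote ("") is the standard CSV escape, so a conforming reader recovers the original string; the breakage described above bites when a writer or reader doesn't apply that rule.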

    Steve.

  • Hi Steve,

    Thanks a lot for your help.

    You are absolutely correct! My data has a lot of commas. But the problem is that my company wants data to be exported to a .csv file which is then used by an application.

    Could you please help me with this?

    Thank You,

    Best Regards,

    SQLBuddy.

    Would it be possible for you to wrap the columns using a string character, e.g. ' (single quote)?

    Sample data:

    CREATE TABLE csv(id int, data varchar(30))

    INSERT INTO csv
    SELECT 1, 'single value: a' UNION ALL
    SELECT 2, 'double value: a,b' UNION ALL
    SELECT 3, 'triple value: a,b,c'

    A bcp command would look something like:

    bcp "SELECT id, ''''+ data + '''' FROM SSC_Test.dbo.csv" queryout "result.csv" -T -c -t ,

    and the data in the final flat file

    1,'single value: a'

    2,'double value: a,b'

    3,'triple value: a,b,c'



    Lutz
    A pessimist is an optimist with experience.

    How to get fast answers to your question
    How to post performance related questions
    Links for Tally Table, Cross Tabs and Dynamic Cross Tabs, Delimited Split Function

    What Lutz has described can also be done within SSIS. Look at your flat file destination, and modify the settings for which character(s) strings should be enclosed in.

    Steve.

  • Hi Steve and Lutz,

    Thank you very much for all your great help. Finally I was able to resolve this issue. Actually, we created a view on a table. According to the vendor, all the NULL values in the table had to be replaced by commas, so all the NULLs were replaced by commas in that view. I created an SSIS package to export data from that view.

    Later I came to know from Steve that we can't have stray commas in the data to be exported into the .csv file. So I replaced those commas in the view with a space. Then, to my surprise, I was able to get the data in the right format.

    Thank you once again for providing unparalleled guidance.

    Best Regards,

    SQLBuddy.

  • Hi.

    I think you can refer to the url below:

    http://www.dotnetspider.com/resources/701-Export-Data-CSV-Excel.aspx

    RAQ Report: Web-based Excel-like Java reporting tool

  • Hi,

    Again I have a small problem. In the output .csv file, 70% of the rows are good, but in the remaining 30%, the data goes out of sync.

    Could anyone please help me with this?

    Thank You,

    Best Regards,

    SQLBuddy

  • sqlbuddy123 (11/23/2009)


    Hi,

    Again I have a small problem. In the output .csv file, 70% of the rows are good, but in the remaining 30%, the data goes out of sync.

    Could anyone please help me with this?

    Thank You,

    Best Regards,

    SQLBuddy

    I would imagine that it will be a lot easier for people to help if you post your code and attach the .csv file.

    --Jeff Moden


    RBAR is pronounced "ree-bar" and is a "Modenism" for Row-By-Agonizing-Row.
    First step towards the paradigm shift of writing Set Based code:
    ________Stop thinking about what you want to do to a ROW... think, instead, of what you want to do to a COLUMN.

    Change is inevitable... Change for the better is not.


    Helpful Links:
    How to post code problems
    How to Post Performance Problems
    Create a Tally Function (fnTally)

  • becklery (11/22/2009)


    Hi.

    I think you can refer to the url below:

    http://www.dotnetspider.com/resources/701-Export-Data-CSV-Excel.aspx

    Heh... not exactly a T-SQL solution...

    --Jeff Moden



  • Hi Guys,

    The issue is solved. Actually there were some commas in the data. I replaced them with white spaces and the issue was solved.
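    For what it's worth, here is a small sketch (Python, for illustration only) contrasting the two options discussed in this thread: replacing commas with spaces alters the data permanently, while a text qualifier preserves it through a round trip.

    ```python
    import csv
    import io

    original = ["1", "double value: a,b"]

    # Workaround used in this thread: strip the commas out of the data.
    stripped = [v.replace(",", " ") for v in original]

    # Alternative: keep the commas and quote the field instead.
    buf = io.StringIO()
    csv.writer(buf).writerow(original)
    restored = next(csv.reader(io.StringIO(buf.getvalue())))

    print(stripped)   # the comma is gone for good
    print(restored)   # the comma survives the round trip
    ```

    Whether the stripped data is acceptable depends on the vendor's application; if the original commas ever need to come back, the qualifier route is the safer one.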

    Thank You,

    Best Regards,

    SQLBuddy
