November 19, 2009 at 9:42 am
Hi,
When I try to export data from a SQL Server view to a .csv file using the data flow task, the data in the output .csv file is always out of sync with the header, i.e. I am getting data outside the last column - for example, data that belongs in the 20th column ends up in the 25th column.
Could anyone please help me get the data exactly in sync with the header?
Thank You,
Best Regards,
SQLBuddy.
November 19, 2009 at 10:24 am
Unless the data you're exporting is *exactly* the same length as the headers for the columns, you're never going to 'align' the headers with the data in the rows. By its nature/design, a row within a CSV file is variable in length; the way you discern between 'fields' is through the use of the separator (e.g. comma, pipe, etc.).
If your headers are "a", "B", "C" and your data is anything more than 1 character long, you'll be 'out of sync'.
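As a quick made-up illustration, a csv like
a,B,C
apple,banana,cherry
is perfectly valid even though 'banana' starts nowhere near the 'B' header - the commas, not the character positions, decide where each field begins.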
HTH,
Steve.
November 19, 2009 at 10:41 am
Hi Steve,
Thank You very much.
Could you please tell me how to overcome this problem? Which parameters should I modify in the Flatfile connection string in order to fix this problem?
If I am missing anything, please guide me.
Thank You,
Best Regards,
SQLBuddy.
November 19, 2009 at 10:53 am
With a CSV (or any separated file), there is nothing to change - this is how it works 🙂
If you truly want a file that aligns the data (say when viewed in Notepad) then you could try a fixed width file instead of csv. This will ensure that (say) column 4 *always* starts at position 50 and runs for 20 characters. The downside to this style of file is that you need to know, ahead of time, what the max width of any of the exported fields will be.
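If you did go the fixed-width route, one way to build the padded rows is to CAST each column to a CHAR of the chosen width before exporting (the view and column names below are made up, just to show the idea):
-- pad each field to a fixed width so that, say, the 2nd field always starts at position 21
SELECT CAST(ISNULL(col_a, '') AS char(20))
     + CAST(ISNULL(col_b, '') AS char(20))
     + CAST(ISNULL(col_c, '') AS char(10)) AS fixed_width_row
FROM dbo.my_view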
On a different tack, what are you trying to achieve (other than your initial statement about getting headers aligned with data)? Generally, in the scheme of things, the alignment of a header with the data is of little consequence. If you want to test/view the data in columns, import it into Excel; during the import, Excel will assign each value to its appropriate field. It's the same when importing to SQL, Oracle, any RDBMS and many OLAP tools (e.g. Cognos Transformer) - the import recognizes the csv as variable width and loads the fields correctly.
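For instance, loading a csv back into SQL Server with BULK INSERT splits the rows on the separator, not on character positions (the file path and table name here are just placeholders):
-- the field terminator, not the column position, decides which column each value lands in
BULK INSERT dbo.staging_table
FROM 'C:\export\result.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2)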
Steve.
November 19, 2009 at 12:35 pm
Hi Steve,
Thank you very much for your reply.
But my company wants data only in .csv file format.
Could you please tell if there is any workaround for this issue?
Thank You,
Best Regards,
SQLBuddy
November 19, 2009 at 1:01 pm
I think I may have misunderstood the issue. When you say
When I try to export data from a SQL Server view to a .csv file using the data flow task, the data in the output .csv file is always out of sync with the header, i.e. I am getting data outside the last column - for example, data that belongs in the 20th column ends up in the 25th column.
do you mean that, for some reason, data that should be wholly contained in the 20th field is, when importing/using in Excel etc., appearing in the 25th column?
Assuming the answer is yes, can you confirm that your data doesn't contain commas? If it does, you will either want to choose a different separator (one that doesn't appear in the data itself, e.g. | [pipe] or ~ [tilde]), OR ensure that you have text quoting turned on, so any string containing the separator (comma ',') is enclosed in your string quotes. So
<field_data_1>,Here, I see you have a problem, No?,<field_data_2>, etc
would become
<field_data_1>,"Here, I see you have a problem, No?",<field_data_2>, etc
Note that this too can have problems if you choose double quotes as the string delimiter/identifier (e.g. ") and your data includes them too (most often found when referring to inches, as in "Steve is 6'0" tall, wow!") - this will break as it has a double quote within the string.
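To make that concrete (the surrounding fields are made up), a row like
<field_data_1>,"Steve is 6'0" tall, wow!",<field_data_2>
will usually be mis-read: most parsers treat the " after 6'0 as the end of the quoted field, so the comma after 'tall' gets taken as a separator and the remaining fields shift across.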
Steve.
November 19, 2009 at 1:42 pm
Hi Steve,
Thanks a lot for your help.
You are absolutely correct! My data has a lot of commas. But the problem is that my company wants data to be exported to a .csv file which is then used by an application.
Could you please help me with this?
Thank You,
Best Regards,
SQLBuddy.
November 19, 2009 at 2:22 pm
Would it be possible for you to wrap the columns using a string character e.g. " ' "?
Sample data:
CREATE TABLE csv(id int, data varchar(30))
INSERT INTO csv
SELECT 1, 'single value: a' UNION ALL
SELECT 2, 'double value: a,b' UNION ALL
SELECT 3, 'triple value: a,b,c'
A bcp command would look something like:
bcp "SELECT id, ''''+ data + '''' FROM SSC_Test.dbo.csv" queryout "result.csv" -T -c -t ,
and the data in the final flat file
1,'single value: a'
2,'double value: a,b'
3,'triple value: a,b,c'
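One caveat with that approach (this is an assumption about your data, not something in the sample above): if the data itself can contain a single quote, you would probably want to double it up inside the wrapper, along these lines:
bcp "SELECT id, '''' + REPLACE(data, '''', '''''') + '''' FROM SSC_Test.dbo.csv" queryout "result.csv" -T -c -t ,
Whether the receiving application understands doubled quotes is something you'd need to confirm on that side.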
November 20, 2009 at 11:44 am
What Lutz has described can also be done within SSIS. Look at your flat file destination and modify the settings for what character(s) strings should be enclosed in.
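If I recall correctly, the property is called 'Text qualifier' on the Flat File Connection Manager - set it to " (or ') and string columns will be wrapped in that character on output.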
Steve.
November 20, 2009 at 12:01 pm
Hi Steve and Lutz,
Thank you very much for all your great help. Finally I was able to resolve this issue. Actually we created a view for a table. According to the vendor, all the NULL values in the table have to be replaced by commas, so all the NULLs are replaced by commas in that view. I created an SSIS package to export data from that view to the .csv file.
Later I came to know, as Steve explained, that we can't have commas in the data to be exported into the .csv file. So I replaced all the commas in the view's data with a space, and then, to my surprise, I was able to get the data in the right format.
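In case it helps anyone else, the change to the view was along these lines (the view, table and column names here are made up):
-- turn NULLs into a blank and strip commas out of the data before it hits the .csv
CREATE VIEW dbo.vw_export
AS
SELECT REPLACE(ISNULL(col_a, ' '), ',', ' ') AS col_a,
       REPLACE(ISNULL(col_b, ' '), ',', ' ') AS col_b
FROM dbo.source_table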
Thank you once again for providing unparalleled guidance.
Best Regards,
SQLBuddy.
November 22, 2009 at 9:47 am
Hi.
I think you can refer to the url below:
http://www.dotnetspider.com/resources/701-Export-Data-CSV-Excel.aspx
November 23, 2009 at 2:21 pm
Hi,
Again I have a small problem. Actually, in the output .csv file 70% of the rows are fine, but in the remaining 30% of the rows the data is going out of sync.
Could anyone please help me with this?
Thank You,
Best Regards,
SQLBuddy
November 23, 2009 at 10:10 pm
sqlbuddy123 (11/23/2009)
Hi, again I have a small problem. Actually, in the output .csv file 70% of the rows are fine, but in the remaining 30% of the rows the data is going out of sync.
Could anyone please help me with this?
Thank You,
Best Regards,
SQLBuddy
I would imagine that it will be a lot easier for people to help if you post your code and attach the .csv file.
--Jeff Moden
Change is inevitable... Change for the better is not.
November 23, 2009 at 10:13 pm
becklery (11/22/2009)
I think you can refer to the url below:
http://www.dotnetspider.com/resources/701-Export-Data-CSV-Excel.aspx
Heh... not exactly a T-SQL solution...
--Jeff Moden
Change is inevitable... Change for the better is not.
November 24, 2009 at 7:49 am
Hi Guys,
The issue is solved. There were some commas in the data; I replaced them with white spaces and that fixed it.
Thank You,
Best Regards,
SQLBuddy