August 9, 2007 at 3:37 pm
I have two identical scripts that output the result set of a stored procedure to CSV files. However, nothing happens for procedure B. It used to work for both, and the only change between success and failure is that the number of output columns went from 664 to 1,052. I get no error messages when running the bcp command.
I have run the stored procedure directly and confirmed that it does return records.
EXEC master..xp_cmdshell 'bcp "exec storedprocedureB" queryout F:\path\output.csv -T -c -t,'
Thanks
KR
August 9, 2007 at 4:13 pm
Has this ever worked? According to the boys in Redmond:
http://office.microsoft.com/en-us/excel/HP052009261033.aspx?pid=CH010006341033
Important When you insert cells, rows, or columns, keep in mind that the maximum size of a worksheet is 65,536 rows by 256 columns.
How do you get the rest of the data into Excel? A single worksheet is not large enough for data over 256 columns, but you can split the data over two worksheets, or you can bring only the columns you need to work with into Excel.
August 9, 2007 at 4:18 pm
Yes, it worked when the procedure returned 664 columns. I opened the file in UltraEdit to make sure it looked fine. I don't need to use Excel, since this data is being sent to a vendor in CSV format, and the specifications come from the vendor.
So I need to be able to export it; I don't really care whether it can be opened in Excel.
I did confirm the column count is the problem: when I modified the stored procedure back to 664 columns, it worked!
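A quick way to confirm how many columns actually land in the exported file (rather than eyeballing it in an editor) is a short script. This is a minimal sketch, assuming a comma-delimited file with no embedded commas or quoting in the data; the path in the usage comment is a placeholder:

```python
import csv

def count_csv_columns(path, delimiter=","):
    """Return the number of fields in the first row of a delimited file."""
    with open(path, newline="") as f:
        first_row = next(csv.reader(f, delimiter=delimiter))
    return len(first_row)

# Hypothetical usage: check whether the bcp output really has all 1,052 columns.
# count_csv_columns(r"F:\path\output.csv")
```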
Hope someone can help me. I really need to be able to get this to the Vendor.
Thanks
KR
August 9, 2007 at 4:55 pm
Back in the dim and distant past, isql used to have a -w option to specify the width of the output. A -w shows up among bcp's options on the MS site too (though there it switches the export to Unicode characters):
bcp {dbtable | query} {in | out | queryout | format} datafile
[-m maxerrors] [-f formatfile] [-e errfile]
[-F firstrow] [-L lastrow] [-b batchsize]
[-n native type] [-c character type] [-w Unicode characters]
[-N keep non-text native] [-V file format version] [-q quoted id]
[-C code page specifier] [-t field terminator] [-r row terminator]
[-i inputfile] [-o outfile] [-a packetsize]
[-S server name\instance name] [-U username] [-P password]
[-T trusted connection] [-v version] [-R regional enable]
[-k keep null values] [-E keep identity values]
[-h "load hints"]
so you might give it a whirl in case it's not a column limit, just an output default.
Other than that, I'm out of ideas on this one.
August 9, 2007 at 5:00 pm
Actually, the size of the file is pretty small, although the number of columns is large. It is just a test file with a limited number of rows, so I am not sure whether setting that option will help. I'll try it anyway, just to see.
Thanks
KR
August 9, 2007 at 5:14 pm
Is this a pivot query where each row becomes a new column? If that's the case, then you may be able to write a VBScript to pivot it for you. I know it's not anyone's first choice... but at least it would work that way, unless you are facing a whole other problem.
August 9, 2007 at 5:17 pm
No, it is not a pivot query - a straightforward SELECT query with a bunch of joins, outputting 1,052 columns.
Thanks
KR
August 11, 2007 at 11:31 am
Found the cause!
BCP will not export more than 1,026 columns - I verified this through a couple of tests. Unfortunately, I did not find this documented anywhere. Anyway, I hope this helps someone else who encounters a similar issue.
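If bcp itself caps the column count, one possible workaround (not from this thread, just a sketch) is to split the query into two bcp exports that each stay under the limit, give both the same ORDER BY, and then paste the halves back together row by row with a small script before sending the file to the vendor. The function below assumes both files contain the same rows in the same order:

```python
import csv

def merge_column_halves(left_path, right_path, out_path, delimiter=","):
    """Paste two delimited files together side by side, row by row.

    Assumes both files hold the same rows in the same order, e.g. two
    bcp exports of the same query with an identical ORDER BY clause."""
    with open(left_path, newline="") as lf, \
         open(right_path, newline="") as rf, \
         open(out_path, "w", newline="") as out:
        writer = csv.writer(out, delimiter=delimiter)
        for left_row, right_row in zip(csv.reader(lf, delimiter=delimiter),
                                       csv.reader(rf, delimiter=delimiter)):
            writer.writerow(left_row + right_row)
```

If the two exports could get out of step (rows changing between runs, nondeterministic ordering), it would be safer to include a key column in both halves and join on it rather than relying on row position.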
Thanks
Kr
August 11, 2007 at 8:39 pm
Heh... but it IS documented in the SQL Specifications...
Columns per base table | 1,024 | 1,024 |
Why would BCP support more than that?
--Jeff Moden
Change is inevitable... Change for the better is not.
August 13, 2007 at 8:35 am
Hmmm, SSIS seems to, however.
Thanks
KR