May 1, 2008 at 8:12 am
We are migrating from an Access database to a SQL Server db, and our output is always to Excel.
Access happily output 1000-character columns to Excel and didn't care what the data format was, but in SSIS I have had to cast every text column, and now my description fields are too long for the output.
1) Is there a way to get the full 1000 characters into Excel?
2) Does anyone know why my OLEDB Source is forcing me to cast 1 column (and not all the others with the same specification)?
As further background to (2), I have an OLE DB source which is a SQL query. The result then goes into a Sort, a Derived Column and a Data Conversion. The data conversion is where I cast all my text columns before output to the spreadsheet. I also have to cast one column in the OLE DB query itself, to avoid a failure there. This one column is no different from any of the others that don't need casting until output.
There is no MS Office on the server, so I can't do anything from inside Excel.
TIA
May 1, 2008 at 9:25 am
While I was waiting I tried changing the text columns in the Data Conversion transformation to DT_NTEXT. This got rid of the warnings, but when I run the package it now fails on validation with:
[Write to AllData [714]] Error: SSIS Error Code DTS_E_OLEDBERROR. An OLE DB error has occurred. Error code: 0x80040E21.
[Write to AllData [714]] Error: Cannot create an OLE DB accessor. Verify that the column metadata is valid.
[DTS.Pipeline] Error: component "Write to AllData" (714) failed the pre-execute phase and returned error code 0xC0202025.
The column metadata is fine.
I've seen suggestions that I write to a flat file and then rename it, but I have to write to Excel. Because I don't have Excel on the server, I am using a workbook open event to finish the formatting.
May 1, 2008 at 3:00 pm
Hi,
I had the same problem... We used a Flat File destination for the Excel output: the connection string points to the actual Excel file, and the columns are mapped in the editor. If you hit a data conversion problem, then use a Data Conversion task in between. This worked fine for me...
Hope it helps!!
May 2, 2008 at 3:36 am
Thanks for the reply. Can you explain what you mean by editor please?
As far as I can see, the file connection manager couldn't write to my Excel template because it wouldn't know which sheet to write to. Therefore I am assuming you mean that I should write to a temporary flat file as an interim step. That still leaves me with the problem of getting it into the workbook.
Don't forget, I need to put the data into the workbook, because it already has graphs and pivots defined. There is a macro that runs on open to refresh the pivots.
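For context, that open macro is nothing exotic. A cut-down sketch of just the pivot-refresh part (the real one also does other formatting, and nothing here uses my actual names) would be:

' In the ThisWorkbook module of the template (simplified sketch only)
Private Sub Workbook_Open()
    Dim pc As PivotCache
    ' Refresh every pivot cache so the pivots pick up the newly loaded rows
    For Each pc In ThisWorkbook.PivotCaches
        pc.Refresh
    Next pc
End Sub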
Also, there is a data conversion before the Excel destination. Putting this in is what caused the 'Cannot create an OLE DB accessor' error message. If I don't put it in, I am stuck with not being able to output half my data.
Thanks again
May 2, 2008 at 5:10 am
Think I'm mainly talking to myself here, but it's probably worth posting a bit more, in case someone else has the problem.
My report consists of a parent package, a child package with a file system task that copies the Excel template file to the output location, and a child package to produce the report.
The report package's data flow consists of an OLE DB source, a Sort, a Derived Column, a Data Conversion and the Excel destination.
There are 3 columns in the data that can have up to 1024 characters. The data conversion was originally put in to convert all non-Unicode strings to Unicode strings, and if I run the parent package in debug, it runs through until it hits the first row where one of these columns exceeds 255 characters.
If I change the data conversion to make these 3 columns Unicode text streams (DT_NTEXT) and then run the parent package, the child package fails on opening with the 'Cannot create an OLE DB accessor' error. I've tried changing the ValidateExternalMetadata property of the destination to false, with no luck.
The above 2 options seem to be the only ones available in SSIS, and clearly neither is acceptable to the Excel destination. Nor is it acceptable to the business to output shortened fields.
As I've said above, I need the pivots, charts and macros in the template, so writing to a non-Excel file is not an option either. The Excel connection is set to delay validation.
My investigations so far lead me to one question. The Excel connection has a property that cannot be altered which might just be the culprit: ExcelVersionNumber, which is set to 3.
Is this pointing to an older version of Excel than 2003? If so, how could I override it?
May 2, 2008 at 2:09 pm
Hi,
What I am saying is: instead of using an Excel destination, use a Flat File destination as the data flow destination, with a Flat File connection manager (not an interim task). In the Flat File Connection Manager editor, point the file name to the actual Excel file (in our case we have an Excel file at the destination and it is overwritten every time). Make sure the mapping is all correct between source and destination.
-V
May 6, 2008 at 2:14 am
I've not tried your suggestion because I can't see how you would get it to write to the right worksheet. Did you first save the workbook with the ActiveSheet as the one where you want to write the data?
Fortunately, I found out how to do it whilst still writing to an Excel destination. If you are writing to an existing workbook, it needs to already contain data as long as the longest values you will ever write. I changed my workbook to have 1024 characters in row 2 of each column that needs to output data of this size. When I write to the workbook now, it adds records from row 3, but it will now output the long data.
I have populated row 2 of each of those columns with 1024 'a' characters and added an extra step to the workbook open macro. It searches for 'aaaaaaaaaaaaaaaaaaaa' and deletes the entire row if it finds it.
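For anyone who wants to copy the idea, the extra step is roughly this (a simplified sketch; 'AllData' stands in for whatever your data sheet is called, and in reality it runs as part of the Workbook_Open macro):

' Sketch of the clean-up step called from the Workbook_Open macro;
' "AllData" is only a placeholder for the sheet the package writes to.
Sub DeletePaddingRows()
    Dim ws As Worksheet
    Dim found As Range
    Set ws = ThisWorkbook.Worksheets("AllData")
    Do
        ' Find any cell still holding the 1024 * 'a' padding value
        Set found = ws.Cells.Find(What:="aaaaaaaaaaaaaaaaaaaa", _
                                  LookIn:=xlValues, LookAt:=xlPart)
        If found Is Nothing Then Exit Do
        found.EntireRow.Delete
    Loop
End Sub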