August 22, 2022 at 7:35 am
Hi everyone,
I have designed an SSIS package that transfers SQL table data into an Excel file.
I am fetching data via this query:
/****** Script for SelectTopNRows command from SSMS ******/
SELECT CONVERT(NVARCHAR(4000),[LIST_TYPE]) AS LIST_TYPE
,CONVERT(NVARCHAR(4000),[LIST_CATEGORY]) AS LIST_CATEGORY
,CONVERT(NVARCHAR(4000),[LIST_SUBCATEGORY]) AS LIST_SUBCATEGORY
,CONVERT(NVARCHAR(4000),[LIST_DESCRIPTION]) AS LIST_DESCRIPTION
,CONVERT(NVARCHAR(4000),[LIST_VERSION]) AS LIST_VERSION
,CONVERT(NVARCHAR(4000),[T2_CODE]) AS T2_CODE
,CONVERT(NVARCHAR(4000),[T2_TYPE]) AS T2_TYPE
,CONVERT(NVARCHAR(4000),[T2_NAME]) AS T2_NAME
,CONVERT(NVARCHAR(4000),[FATHER_HUSBAND_NAME]) AS FATHER_HUSBAND_NAME
,CONVERT(NVARCHAR(4000),[T2_NationalID]) AS T2_NationalID
,CONVERT(NVARCHAR(4000),[T2_Passport]) AS T2_Passport
,CONVERT(NVARCHAR(4000),[T2_DOB]) AS T2_DOB
,CONVERT(NVARCHAR(4000),[YOB]) AS YOB
,CONVERT(NVARCHAR(4000),[DISTRICT_TOWN]) AS DISTRICT_TOWN
,CONVERT(NVARCHAR(4000),[T2_CITIES_FROM_ADD]) AS T2_CITIES_FROM_ADD
,CONVERT(NVARCHAR(4000),[STATE_PROVINCE]) AS STATE_PROVINCE
,CONVERT(NVARCHAR(4000),[T2_COUNTRIES_FROM_ADD]) AS T2_COUNTRIES_FROM_ADD
,CONVERT(NVARCHAR(4000),[T2_ADDRESS_FROM_ADD]) AS T2_ADDRESS_FROM_ADD
,CONVERT(NVARCHAR(4000),[T2_AKA]) AS T2_AKA
,CONVERT(NVARCHAR(4000),[NAME_ORIGINAL]) AS NAME_ORIGINAL
,CONVERT(NVARCHAR(4000),[T2_Affiliation]) AS T2_Affiliation
,CONVERT(NVARCHAR(4000),[LISTED_ON]) AS LISTED_ON
,CONVERT(NVARCHAR(4000),[T2_ORIGINAL_DATA]) AS T2_ORIGINAL_DATA
,CONVERT(NVARCHAR(4000),[SOURCE_URL]) AS SOURCE_URL
,CONVERT(NVARCHAR(4000),[FPM_ARCHIVE_URL]) AS FPM_ARCHIVE_URL
,CONVERT(NVARCHAR(4000),[COMMENTS1]) AS COMMENTS1
,CONVERT(NVARCHAR(4000),[COMMENTS2]) AS COMMENTS2
,CONVERT(NVARCHAR(4000),[T2_Designation]) AS T2_Designation
,CONVERT(NVARCHAR(4000),[LIST_CODE]) AS LIST_CODE
,CONVERT(NVARCHAR(4000),[LIST_NAME]) AS LIST_NAME
,CONVERT(NVARCHAR(4000),[AGE]) AS AGE
,CONVERT(NVARCHAR(4000),[INDIVIDUAL_TITLE]) AS INDIVIDUAL_TITLE
,CONVERT(NVARCHAR(4000),[T2_Gender]) AS T2_Gender
FROM [COMPLIANCE_DATA_PROCESSING_DB].[dbo].[ADO NET Destination]
ORDER BY T2_CODE ASC
Now I am facing an issue when creating the Excel file: it will NOT accept data longer than 255 characters and throws truncation-related errors.
I have tried various solutions available online, but nothing seems to work.
What is the correct way to handle this?
Also, I have to create this file on a daily basis; for example, the file generated today would be named INDIVIDUAL_20220822.
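The date suffix I need can be built in T-SQL like this (a minimal sketch; the variable name is just for illustration, and CONVERT style 112 produces yyyymmdd):

```sql
-- Sketch: build today's file name with a yyyymmdd suffix;
-- CONVERT style 112 formats GETDATE() as yyyymmdd.
DECLARE @FileName NVARCHAR(50) =
    N'INDIVIDUAL_' + CONVERT(NCHAR(8), GETDATE(), 112);

SELECT @FileName AS FileName;  -- e.g. INDIVIDUAL_20220822 on 22 Aug 2022
```

In SSIS, the usual equivalent is an expression on the connection manager's ConnectionString property.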
August 22, 2022 at 2:49 pm
Try using exactly the same query as your source, but with none of the CONVERTs, and see what happens.
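For illustration, a sketch of what that could look like (column list shortened to a few of the columns from the original query):

```sql
-- Sketch: the source query with no CONVERTs, so SSIS reads the
-- table's native column metadata instead of NVARCHAR(4000).
SELECT [LIST_TYPE]
      ,[LIST_CATEGORY]
      ,[T2_CODE]
      ,[T2_NAME]
      -- ...remaining columns as in the original query...
FROM [COMPLIANCE_DATA_PROCESSING_DB].[dbo].[ADO NET Destination]
ORDER BY T2_CODE ASC;
```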
The absence of evidence is not evidence of absence
- Martin Rees
The absence of consumable DDL, sample data and desired results is, however, evidence of the absence of my response
- Phil Parkin
August 22, 2022 at 6:09 pm
The only way I have managed it is to start with an Excel file that already contains a row of dummy data with columns longer than 255 characters. This is far from ideal, but instead of creating a new file each time, you can keep a template in a sub-folder and copy it to the destination folder before exporting. It's also useful to set DelayValidation = True on the connection. If you can make this work, you may be able to delete the dummy data row, as long as you do nothing to encourage SSIS to check the metadata again. Any time you double-click on the connection, you need to hit Cancel, not OK, otherwise it wants to re-validate.
In my experience it doesn't matter what you do to the source, or whether you add Data Conversion tasks to the data flow. The metadata of the data flow can be 4000 for every column, but the input column definitions for the Excel connection will still be 255. You can't edit them in the advanced properties the way you can with a Flat File connection. SSIS believes it knows what Excel columns are, which is why it is such a pain to deal with.
August 23, 2022 at 11:16 am
If that is the only task in your SSIS package, why not use PowerShell?
dbatools
Johan
Learn to play, play to learn !
Don't drive faster than your guardian angel can fly ...
but keeping both feet on the ground won't get you anywhere :w00t:
- How to post Performance Problems
- How to post data/code to get the best help
- How to prevent a sore throat after hours of presenting ppt
press F1 for solution, press shift+F1 for urgent solution 😀
Need a bit of PowerShell? How about this
Who am I ? Sometimes this is me but most of the time this is me
August 23, 2022 at 11:22 am
Exporting to CSV may also solve the problem, because then you have full control over the target datatypes.
August 23, 2022 at 3:38 pm
Why not just create a stored procedure to produce the data and have Excel pull the data on demand? That way, you don't end up creating files for a bazillion years that people only used for a couple of months before they decided to use something else. 😉 It also provides a bit more security as to who can actually get to the data and you don't have a bazillion files stored because people always want a history that can go back a decade or two.
Of course, spreadsheets are their own form of security risk and have their own keep-history-for-decades issue. 🙁
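A minimal sketch of such a procedure (the procedure name is hypothetical, and the column list is shortened):

```sql
-- Sketch: wrap the export query in a stored procedure that Excel
-- (via Data > Get Data) or any other client can execute on demand.
CREATE OR ALTER PROCEDURE dbo.usp_GetIndividualList
AS
BEGIN
    SET NOCOUNT ON;

    SELECT [LIST_TYPE]
          ,[LIST_CATEGORY]
          ,[T2_CODE]
          ,[T2_NAME]
          -- ...remaining columns as in the original query...
    FROM [COMPLIANCE_DATA_PROCESSING_DB].[dbo].[ADO NET Destination]
    ORDER BY T2_CODE ASC;
END;
```

Permissions can then be granted on the procedure rather than the underlying table, which covers the security point above.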
--Jeff Moden
Change is inevitable... Change for the better is not.
September 2, 2022 at 8:39 am
Why are you converting everything to NVARCHAR(4000) in the first place?
What does the actual source look like?