December 15, 2016 at 8:24 am
There is nothing wrong with nvarchar/nchar fields.
Unless you consider all the double-byte operations that must occur, and the extra disk space required to store data as nvarchar/nchar.
My staging and data warehouse databases are large enough that converting to varchar saves gigabytes of disk space and reduces foreign key relationship times, not to mention the indexes, etc.
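To see the doubling directly, here is a minimal sketch (the string itself is arbitrary):

DECLARE @v varchar(50)  = 'Unicode vs. non-Unicode';
DECLARE @n nvarchar(50) = N'Unicode vs. non-Unicode';

SELECT DATALENGTH(@v) AS varchar_bytes,   -- 23 bytes (1 byte per character)
       DATALENGTH(@n) AS nvarchar_bytes;  -- 46 bytes (2 bytes per character)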
December 15, 2016 at 8:27 am
Thanks for the article! It is good info to see this capability of SSIS, and it is OK not to have specific info. I now understand the point: if you have a source field that is NVARCHAR, it is possible to bring the data into a VARCHAR destination.
December 15, 2016 at 8:33 am
There is nothing wrong with nvarchar/nchar fields. Unless you consider all the double-byte operations that must occur, and the extra disk space required to store data as nvarchar/nchar.
This is why I would advocate that each column be evaluated on its own. In practice, I find most columns can be set to varchar (and probably should have been in the source data), but some fields, particularly those that deal with names of any kind or addresses, need to remain nvarchar to preserve the data properly.
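A quick round-trip test can help with that evaluation. Something like the following (table and column names are made up for illustration) returns the rows that would actually lose characters in a varchar conversion; if it returns nothing, the column is a candidate for varchar:

-- Unmappable characters become '?' on the way to varchar, so the
-- round trip differs whenever a value genuinely needs nvarchar.
SELECT CustomerName
FROM   dbo.DimCustomer
WHERE  CustomerName <> CAST(CAST(CustomerName AS varchar(100)) AS nvarchar(100));

(A value that legitimately contains '?' can slip past this check, so treat it as a rough screen, not a guarantee.)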
December 15, 2016 at 9:23 am
It would be interesting to know how performance varies between the two options (CAST in source and data conversion in package).
For those with separate servers running SSIS, doing the data conversion in the package might be a useful way of keeping the load down on the main SQL instance.
Otherwise I agree with the majority: performing the CAST in T-SQL is the more straightforward solution and helps keep SSIS packages clean and streamlined.
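For reference, the "CAST in source" option is nothing more than a source query along these lines (the column names are illustrative, not taken from the article); the OLE DB Source then delivers non-Unicode columns and no Data Conversion transformation is needed:

SELECT CAST(EnglishProductName AS varchar(50)) AS EnglishProductName,
       CAST(Color AS varchar(15)) AS Color
FROM   dbo.DimProduct;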
The absence of evidence is not evidence of absence.
Martin Rees
You can lead a horse to water, but a pencil must be lead.
Stan Laurel
December 15, 2016 at 9:43 am
From my experience, not much if any.
The way the SSIS engine does these conversions, it has to create a separate stream.
You simply move the complexity from SQL Server (well documented and tunable) to the SSIS application server, which is, frankly, not as well-known an entity.
It also depends on the load, i.e., what other packages the application server is trying to process concurrently.
December 16, 2016 at 3:50 pm
Nice article, Thomas, thanks.
December 19, 2016 at 1:16 am
As per the other comments, it is easier to maintain the SQL code than an SSIS derivation task, and probably better to push the load onto the SQL Server.
I know this example was done for training; in a production environment I would do the same, except put the SQL SELECT statement into a stored procedure. This makes production changes easier: only the stored procedure needs to change, and there is no need to redeploy the SSIS package (project). The stored procedure should also run more optimally than the SELECT script in SSIS.
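Something like this rough sketch is what I mean (object names are made up):

CREATE PROCEDURE dbo.usp_GetProductExtract
AS
BEGIN
    SET NOCOUNT ON;

    -- The CASTs live here, so changing a length or adding a column
    -- means altering the proc, not redeploying the SSIS project.
    SELECT CAST(EnglishProductName AS varchar(50)) AS EnglishProductName,
           CAST(Color AS varchar(15)) AS Color
    FROM   dbo.DimProduct;
END;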
The environment I'm working in has all the sources coming in as text files or spreadsheets, which then need type conversion in a Data Conversion or Derived Column transformation. Is there a faster way around having to manually convert the types to deal with the Unicode/non-Unicode conversion?
December 19, 2016 at 7:35 am
Is there a faster way around having to manually convert the types to deal with the Unicode/non-Unicode conversion?
Not that I know of.
The absence of evidence is not evidence of absence.
Martin Rees
You can lead a horse to water, but a pencil must be lead.
Stan Laurel
December 19, 2016 at 8:42 am
Sean Peffer (12/19/2016)
As per the other comments, it is easier to maintain the SQL code than an SSIS derivation task, and probably better to push the load onto the SQL Server. I know this example was done for training; in a production environment I would do the same, except put the SQL SELECT statement into a stored procedure. This makes production changes easier: only the stored procedure needs to change, and there is no need to redeploy the SSIS package (project). The stored procedure should also run more optimally than the SELECT script in SSIS.
The environment I'm working in has all the sources coming in as text files or spreadsheets, which then need type conversion in a Data Conversion or Derived Column transformation. Is there a faster way around having to manually convert the types to deal with the Unicode/non-Unicode conversion?
Most times I import text files into a staging table first, with data types discovered by the Source component, and then use T-SQL to convert the staging tables into the destination tables.
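Roughly like this (names are hypothetical): the staging table is loaded as-is from the file, and T-SQL handles the type conversion on the way to the destination:

INSERT INTO dbo.DimProduct (EnglishProductName, Color)
SELECT CAST(ProductName AS varchar(50)),
       CAST(Color AS varchar(15))
FROM   staging.ProductFile;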
Thomas LeBlanc, MVP Data Platform Consultant
December 19, 2016 at 8:50 am
Thomas LeBlanc (12/19/2016)
Sean Peffer (12/19/2016)
As per the other comments, it is easier to maintain the SQL code than an SSIS derivation task, and probably better to push the load onto the SQL Server. I know this example was done for training; in a production environment I would do the same, except put the SQL SELECT statement into a stored procedure. This makes production changes easier: only the stored procedure needs to change, and there is no need to redeploy the SSIS package (project). The stored procedure should also run more optimally than the SELECT script in SSIS.
The environment I'm working in has all the sources coming in as text files or spreadsheets, which then need type conversion in a Data Conversion or Derived Column transformation. Is there a faster way around having to manually convert the types to deal with the Unicode/non-Unicode conversion?
Most times I import text files into a staging table first, with data types discovered by the Source component, and then use T-SQL to convert the staging tables into the destination tables.
Text files' data types can be controlled directly in SSIS, so this is less of a problem than it is with Excel.
The absence of evidence is not evidence of absence.
Martin Rees
You can lead a horse to water, but a pencil must be lead.
Stan Laurel