December 7, 2010 at 8:39 am
Hi,
I need your views/thoughts/inputs on the scenario below.
I have a table on one server with around 10,000,000 rows, and I want to load some of the columns from that table into another table on a different server. The rows on the source server change daily, and to pick up those updates I was supposed to use the Slowly Changing Dimension transformation. Unfortunately I can't, because the column that gets updated has the data type DT_NTEXT/DT_TEXT, which is not supported by the Slowly Changing Dimension transformation.
Is there any way I can achieve the same thing? Also, to improve the performance of the data flow task loading this many rows, I have already played with DefaultBufferMaxRows, DefaultBufferSize, RunInOptimizedMode, etc., but to no avail. Does anyone know of anything beyond this that would help me achieve the objective above?
December 9, 2010 at 9:12 am
Redirect the records that need to update the destination data into a separate staging table. Then use an Execute SQL Task after your data flow has finished to update the destination table, using the staging table as your source; a rough sketch is below. If you search a bit online there are a number of alternatives to the Slowly Changing Dimension transformation, many of which are much faster.
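Something like this for the Execute SQL Task, as a minimal sketch. The table and column names (dbo.DestTable, dbo.StagingChanges, BusinessKey, BigTextColumn) are just placeholders for illustration; substitute your own.

-- Apply the changed rows captured by the data flow to the destination.
-- dbo.StagingChanges is the staging table the changed rows were
-- redirected to; dbo.DestTable is the real destination.
UPDATE d
SET    d.BigTextColumn = s.BigTextColumn
FROM   dbo.DestTable AS d
       INNER JOIN dbo.StagingChanges AS s
               ON s.BusinessKey = d.BusinessKey;

-- Clear the staging table so the next run starts empty.
TRUNCATE TABLE dbo.StagingChanges;

A set-based UPDATE like this against 10,000,000 rows will generally outperform the SCD transformation's row-by-row OLE DB Command updates by a wide margin.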