Thanks John,
That's what I was trying to avoid, as it takes time and makes the log file very big.
Thanks anyway
March 22, 2007 at 2:34 am
Thanks John,
This is what I am doing to produce the format file.
But when trying to map 1 input field to 2 output fields, it's not working, or at least I am...
March 20, 2007 at 9:36 am
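A minimal sketch of the kind of non-XML bcp format file being discussed in the post above, with hypothetical column names, widths, and an assumed 8.0 version header; the real layout is not shown. Each host-file field line points at exactly one destination column, which is consistent with the difficulty of mapping one input field to two output columns inside the format file itself:

8.0
3
1   SQLCHAR   0   10   "\t"     1   CustomerID     ""
2   SQLCHAR   0   50   "\t"     2   CustomerName   SQL_Latin1_General_CP1_CI_AS
3   SQLCHAR   0   30   "\r\n"   3   City           SQL_Latin1_General_CP1_CI_AS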
Hi John,
Actually I checked the online books; they explain what to do when the source and destination have a different number of fields, or when the order is different, but they...
March 20, 2007 at 8:41 am
Yes John, thanks for your suggestion.
I am importing around 500,000 records, each record 1,000 characters wide (100 fields).
So I can do what you suggested, but this will almost double...
March 20, 2007 at 6:06 am
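If the suggestion being discussed above was along the lines of a staging table, a rough T-SQL sketch might look like the following; the table, column, and file names are placeholders, and loading the 500,000-row, 1,000-character-wide file into staging and then copying it into the destination is what roughly doubles the work:

-- Hypothetical staging table mirroring the legacy file layout (one column per input field)
CREATE TABLE dbo.LegacyStaging (
    Field001 varchar(10),
    Field002 varchar(50),
    -- ... remaining fields, up to Field100
    Field100 varchar(20)
);

-- Bulk load the raw file into the staging table
BULK INSERT dbo.LegacyStaging
FROM 'C:\import\legacy_extract.txt'
WITH (FORMATFILE = 'C:\import\legacy_extract.fmt', TABLOCK);

-- Copy into the destination, repeating Field001 to populate two output columns
INSERT INTO dbo.DestinationTable (ColumnA, ColumnB, ColumnC)
SELECT Field001, Field001, Field002
FROM dbo.LegacyStaging;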
Sergiy,
The first example I gave was to simplify the question and make it clear to the readers.
I said later:
QUOTE:
In fact there is one table which is imported on a daily basis from a legacy...
December 20, 2006 at 1:44 am
Dear Sergiy,
Just to wrap up the issue:
As I said before, there is only one table, no external tables, and the table is self-joined using a primary key or a...
December 19, 2006 at 6:11 am
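A hedged sketch of the set-based alternative to a per-row trigger: one self-join UPDATE run as the last step of the daily load (for example, the final task in the DTS package). The table and column names are placeholders, since the actual schema is only described in outline:

-- Run once after the daily import completes, instead of firing for every inserted row
UPDATE child
SET    child.ParentValue = parent.SourceValue
FROM   dbo.LegacyTable AS child
JOIN   dbo.LegacyTable AS parent
       ON parent.PrimaryKeyCol = child.ParentKey;   -- self-join on the primary key

A single statement like this touches each row once after the load, rather than paying trigger overhead for every row during the import.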
The whole legacy file is imported using DTS. At what step do you suggest the update be done: at the insert of every record, using a trigger? Please bear in mind...
December 19, 2006 at 5:15 am
I guess a smart guy should read well before answering.
I said that the table is imported on a daily basis from a legacy system. I didn't say that new records are inserted...
December 19, 2006 at 4:28 am
Hi and thanks all for your replies,
Now, to explain the issue further, the situation is the following:
In fact there is one table which is imported on a daily basis from a legacy system.
And links...
December 19, 2006 at 2:46 am
Thanks, guys, for your replies,
I guess what David and Daniel suggested will do, but I am worried about performance as the tables I am talking about can have up to...
December 18, 2006 at 7:58 am