June 2, 2021 at 7:26 am
Hello Guys,
I have an SSIS job that executes this query: SELECT convert(bigint, pbTextId) AS pbTextId_Big, * FROM dbo.PBT, and the table has 344,820,003 rows.
It spends a huge amount of time at this stage. How can I minimize the time, or what can I change?
Thanks in advance,
Hadrian
June 2, 2021 at 10:26 am
Well, you're having to scan the whole table, so reading 345 million rows is never going to be a trivial task, regardless of the data types and row sizes.
You need to get in and get out in the shortest time, so look at which columns can be dropped from the "*" part and pull only what you actually need.
Consider an incremental load process, where you only pull in data that has changed or been loaded since the last extract.
You would also do well to follow the link below on how to post performance problems; an execution plan and DDL are always good for seeing what is going on.
https://www.sqlservercentral.com/articles/how-to-post-performance-problems-1
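A minimal sketch of the incremental-load idea above, assuming (hypothetically) that dbo.PBT has a ModifiedDate column and that the last extract time is tracked in a watermark table; the names ETL_Watermark, LastLoadTime, and ModifiedDate are illustrative, not from the original post:

```sql
-- Hypothetical watermark pattern; table and column names are assumptions.
-- 1. Read the high-water mark recorded by the last successful load.
DECLARE @LastLoad datetime2 =
    (SELECT LastLoadTime FROM dbo.ETL_Watermark WHERE TableName = N'PBT');

-- 2. Pull only the columns the package actually uses, and only new/changed rows,
--    instead of SELECT * over all 345 million rows.
SELECT CONVERT(bigint, pbTextId) AS pbTextId_Big,
       pbTextId            -- list only the columns you really need here
FROM   dbo.PBT
WHERE  ModifiedDate > @LastLoad;

-- 3. After the load succeeds, advance the watermark for the next run.
UPDATE dbo.ETL_Watermark
SET    LastLoadTime = SYSUTCDATETIME()
WHERE  TableName = N'PBT';
```

With an index on ModifiedDate, each run then touches only the rows added or changed since the previous extract rather than scanning the whole table.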