January 27, 2012 at 2:52 pm
Good day all -
I have a scenario where I have to split a long fixed-length string into different columns. On average the string is 2,040 characters long, and it is stored in an nvarchar(max) column. There are currently about 41,400 rows in the table, and when I run the query that splits the string into multiple columns using the SUBSTRING function, the total execution time varies a lot: it has ranged from 4 seconds to 2 minutes, and I am not sure why it varies at all.
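For illustration, here is a minimal sketch of the kind of split I am running (the table name dbo.RawData, the column RawString, and the character offsets are placeholders, not my real schema):

-- Split a fixed-length string into columns by character position.
-- Table, column names, and offsets are placeholders for illustration.
SELECT
    SUBSTRING(RawString, 1, 10)  AS Field1, -- characters 1 to 10
    SUBSTRING(RawString, 11, 25) AS Field2, -- characters 11 to 35
    SUBSTRING(RawString, 36, 50) AS Field3  -- characters 36 to 85
FROM dbo.RawData;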
Since it's a dev box where I am testing my queries, other developers may have been using this database (or other databases on the same server), which could have contributed to the total execution time. I therefore tried running the query when no one else was using any database on the machine; this time the results ranged from 5 seconds to 22 seconds.
How can I get an estimated range of execution time for the query? The table I am extracting data from is going to explode in size once I set my query live.
Regards,
Kazim Raza
January 27, 2012 at 9:47 pm
Details help very much with this sort of question. At a minimum, please provide actual execution plans (*.sqlplan files) for a quick run and a slow run. Output from SET STATISTICS IO ON might be very useful too. Definitions for the tables and indexes, plus some example data, would also help. We also need to see the code you are using to perform the split.
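For example, something along these lines in SSMS will capture the I/O and timing numbers (the SELECT shown is just a placeholder; substitute your actual split query):

-- Capture I/O and timing statistics for the run.
SET STATISTICS IO ON;
SET STATISTICS TIME ON;

-- Your split query goes here (placeholder shown).
SELECT SUBSTRING(RawString, 1, 10) AS Field1
FROM dbo.RawData;

SET STATISTICS IO OFF;
SET STATISTICS TIME OFF;
-- For the plan: enable "Include Actual Execution Plan" (Ctrl+M) before
-- running, then right-click the plan and choose "Save Execution Plan As..."
-- to produce a *.sqlplan file.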