September 17, 2021 at 5:16 pm
Can I run an extended event session against one SQL Agent job? I have a job which runs at 7:00 every day and takes about 10 to 14 hours to run. It inserts 2.5 million records into a table through a bat file. To me, it should run fast, but I am just trying to educate myself on whether I can use Extended Events to capture some info or not.
September 17, 2021 at 8:41 pm
I don't see why you couldn't capture the data with XE. Create the XE session to capture the data you want, have it save to a file or a table or whatever makes sense to you, then have the first step of the job start the XE session and the last step of the job stop it.
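A minimal sketch of what that could look like, assuming a file target; the session name, events, and file path are placeholders rather than anything from your setup, so swap in whatever you actually want to capture:

-- Sketch only: session name, events, and file path are assumptions
CREATE EVENT SESSION [JobImportTrace] ON SERVER
ADD EVENT sqlserver.sql_batch_completed,
ADD EVENT sqlserver.rpc_completed
ADD TARGET package0.event_file (SET filename = N'C:\Temp\JobImportTrace.xel')
WITH (MAX_DISPATCH_LATENCY = 30 SECONDS, STARTUP_STATE = OFF);

-- First job step (T-SQL step):
ALTER EVENT SESSION [JobImportTrace] ON SERVER STATE = START;

-- Last job step (T-SQL step):
ALTER EVENT SESSION [JobImportTrace] ON SERVER STATE = STOP;

Afterwards you can read the .xel file with sys.fn_xe_file_target_read_file or just open it in SSMS.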
Alternatively, and what may be more beneficial, replicate it on a test system and capture the execution plans.
The duration could depend on any number of things, and since it seems to vary by hours, I would assume you have varying amounts of data going in, with your 2.5 million number being an average.
To me, a bat file feels like the wrong tool for the job. I am not certain what the job does, but a bat file that has to deal with 2.5 million pieces of information sounds like a good way to slow things down. It MAY be the only or the best method for what you are doing, but if you can move that over to an SSIS package or PowerShell you may get a performance boost. Mind you, I am just making assumptions here that the bat file is putting things into the database one row at a time; if you are doing a bulk insert, it may not make much difference. I do expect that SSIS will perform better than a bat file though, as bat files are very procedural and can only process one line of the script at a time, whereas with SSIS you can do parallel imports (where it makes sense... for example, writing to the same table from two different processes at the same time would just cause blocking).
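For comparison, if the bat file can hand SQL Server a flat file instead of individual inserts, a single bulk load is usually much faster. A rough sketch, assuming a hypothetical CSV file and staging table (both names made up):

-- Sketch only: file path, table name, and delimiters are assumptions about your data
BULK INSERT dbo.ImportStaging
FROM 'C:\Temp\import_data.csv'
WITH (
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n',
    BATCHSIZE = 100000,  -- commit in chunks rather than one giant transaction
    TABLOCK              -- can enable minimal logging under the right recovery model
);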
Now, the above being said, your bat file may be the only way to do what you are doing. I would not be surprised if the slowness is caused by the bat file. You could test this pretty easily by putting a few statements in the bat file that write to a logging file, like "echo script start - %date% %time% >> C:\temp\job.log" at the start, a similar line anywhere you want to check how long it took to get there (such as right before and right after any SQL calls), and then again at the end. The advantage of this is that you can use the bat file itself to determine whether SQL is the bottleneck, without adding any overhead on the SQL Server and with only minimal overhead in the bat file.
XE has lower overhead than Profiler, but it is still overhead on the SQL instance.
So, my process to troubleshoot what you are seeing would be to first modify the bat file to add some logging so I can see which step is slow. Once I prove it is SQL, I would replicate the problem on a test system and check the execution plan for the data being written. You could run an XE session during the import on the test system; you would not have to worry about impacting the live system, and you would still get the data you require and could review it as needed.
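On the test system, one option for grabbing the actual plans is an XE session on the plan-capture event. This is only a sketch with a placeholder database name, and that event is documented as expensive, which is one more reason to keep it off the live box:

-- Test system only: query_post_execution_showplan has significant overhead
CREATE EVENT SESSION [CapturePlans] ON SERVER
ADD EVENT sqlserver.query_post_execution_showplan
    (WHERE (sqlserver.database_name = N'YourDatabase'))  -- placeholder name
ADD TARGET package0.event_file (SET filename = N'C:\Temp\CapturePlans.xel')
WITH (STARTUP_STATE = OFF);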
Personally, I wouldn't run Profiler or XE against a live SQL instance, as they do cause performance hits. And with a test system, I am comfortable stopping and starting jobs repeatedly to troubleshoot and debug things. On live, I would not want to have to interrupt something that inserts 2.5 million records while debugging it, but I would end up interrupting it because I don't want to sit for 10+ hours trying to debug it.
The above is all just my opinion on what you should do.
As with all advice you find on a random internet forum - you shouldn't blindly follow it. Always test on a test server to see if there are negative side effects before making changes to live!
I recommend you NEVER run "random code" you found online on any system you care about UNLESS you understand and can verify the code OR you don't care if the code trashes your system.