April 29, 2009 at 10:43 am
:crying: Hi all,
Been pulling my hair out on this one. I need to do a complete audit of all jobs running on all my systems: which databases each job hits, and which other servers. I have been able to use sp_help_job to get the list of jobs and their details, but now I need to go deeper. For each of these 400+ enabled jobs, and for any that are SSIS jobs, I need to list what databases are involved and what other servers they are hitting.
Does anyone know how I can get a solid list of that information without having to look at every single package? I don't see where that is stored in msdb, or any views or procs that pull that information. Am I wishing for something that simply isn't stored there? Just for fun, note that I also have to do this for all DTS jobs on the 2000 boxes too.
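For what it's worth, msdb does store the raw step text for every job, so a query like the following is a reasonable starting point (a sketch only; T-SQL steps expose their target database directly, while SSIS/CmdExec steps just expose the command line, which you would still have to eyeball for package paths and /SERVER switches):

```sql
-- List every enabled job, each step's subsystem, the database a
-- T-SQL step runs in, and the raw step command text.
SELECT  j.name          AS job_name,
        s.step_id,
        s.step_name,
        s.subsystem,        -- TSQL, SSIS, CmdExec, ...
        s.database_name,    -- only meaningful for TSQL steps
        s.command           -- step text / DTExec arguments for SSIS steps
FROM    msdb.dbo.sysjobs     AS j
JOIN    msdb.dbo.sysjobsteps AS s
        ON s.job_id = j.job_id
WHERE   j.enabled = 1
ORDER BY j.name, s.step_id;
```

This won't resolve what an SSIS package touches internally, but it narrows 400+ jobs down to the ones that actually need a deeper look.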
April 29, 2009 at 12:22 pm
You can always set up a Profiler trace and write the events into a table. You can then query the details from that table. That is the easiest thing that comes to my mind.
-Roy
April 29, 2009 at 12:47 pm
A server-side trace that dumps to text files is the best option for that kind of thing. The reason to use a server trace rather than Profiler is that Profiler slows things down more. The same logic applies to dumping to text files instead of tables: writing the trace into tables would slow down every transaction and query on the server being traced, while writing to text files is much faster.
Querying the text files directly is slower, but with the trace functions you can easily dump it all into tables once you're done running the trace, and then slice-and-dice the data by querying the tables. That's the best of both worlds.
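The server-side trace GSquared describes can be sketched roughly like this (the file path, rollover size, and event/column choices here are illustrative, not prescriptive; adjust to taste):

```sql
-- Minimal server-side trace sketch: capture completed batches and RPCs,
-- including the database name, to rollover files on disk.
DECLARE @traceid int;
DECLARE @maxsize bigint;
DECLARE @on bit;
SET @maxsize = 512;   -- MB per rollover file
SET @on = 1;

-- Option 2 = file rollover; .trc extension is appended automatically.
EXEC sp_trace_create @traceid OUTPUT, 2, N'C:\traces\job_audit', @maxsize;

-- Event 12 = SQL:BatchCompleted, event 10 = RPC:Completed.
-- Columns: 1 TextData, 11 LoginName, 12 SPID, 35 DatabaseName.
EXEC sp_trace_setevent @traceid, 12, 1,  @on;
EXEC sp_trace_setevent @traceid, 12, 11, @on;
EXEC sp_trace_setevent @traceid, 12, 12, @on;
EXEC sp_trace_setevent @traceid, 12, 35, @on;
EXEC sp_trace_setevent @traceid, 10, 1,  @on;
EXEC sp_trace_setevent @traceid, 10, 35, @on;

EXEC sp_trace_setstatus @traceid, 1;   -- start the trace

-- Later, once the trace has run long enough, load the files into a
-- table and query at leisure:
-- SELECT DatabaseName, LoginName, SPID, TextData
-- INTO   dbo.JobAudit
-- FROM   fn_trace_gettable(N'C:\traces\job_audit.trc', default);
```

Stop and close the trace with sp_trace_setstatus (@status 0 then 2) when you're done collecting.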
- Gus "GSquared", RSVP, OODA, MAP, NMVP, FAQ, SAT, SQL, DNA, RNA, UOI, IOU, AM, PM, AD, BC, BCE, USA, UN, CF, ROFL, LOL, ETC
Property of The Thread
"Nobody knows the age of the human race, but everyone agrees it's old enough to know better." - Anon
April 29, 2009 at 12:55 pm
Please use GSquared's idea. It is way more efficient than mine. I stand corrected. 🙂
-Roy
April 29, 2009 at 1:02 pm
The trace would be good, but of course the powers that be want an answer now. I will set this up for future usage and tracking. Thanks!
April 29, 2009 at 1:07 pm
With SSIS/DTS you would have to check each package's connection data and queries. Within each connection, you would also have to search for linked-server queries, etc.
The same goes for any .exe or .vbs stuff you start in jobs.
For locally scoped jobs, Profiler may help out (SPID based).
It will be a heck of a job to put all of that into charts.
Maybe VSTS for DBAs may help out; I don't know if it has job intelligence.
Johan
Learn to play, play to learn !
Dont drive faster than your guardian angel can fly ...
but keeping both feet on the ground wont get you anywhere :w00t:
- How to post Performance Problems
- How to post data/code to get the best help
- How to prevent a sore throat after hours of presenting ppt
press F1 for solution, press shift+F1 for urgent solution 😀
Need a bit of Powershell? How about this
Who am I ? Sometimes this is me but most of the time this is me
April 29, 2009 at 2:06 pm
Yep, this is my nightmare. I am only doing it because the CIO has mandated it.
April 29, 2009 at 9:07 pm
I ran into a server that had that many jobs once. I disabled them all (except for known server maintenance) and would only turn them back on if someone called to find out why their stuff had stopped running. After two months, I had gotten rid of (archived) about 3/4 of the jobs. Evil but effective. 😛
--Jeff Moden
Change is inevitable... Change for the better is not.