March 19, 2008 at 3:15 am
Apologies in advance if this question has an obvious answer - I have had a look around but have drawn a blank.
For auditing purposes, I need to capture a record of all SQL that is executed against a database (ideally just a single database, but if it has to be at server level, then so be it).
The rationale is that some users have the ability to execute random SELECT statements for their management information purposes (this is a copy of the live database, on a separate server, with only those users accessing it). The business needs are such that pre-defining what they can do by (for example) using stored procedures is not going to be acceptable. The management wants to be able to record/monitor what queries are run - mainly as a deterrent against people executing queries that could lead to large quantities of data "leaving the building".
I think Profiler could do something along these lines (but I'm not sure how yet). I'm wondering whether there is a better way - preferably something built into SQL Server 2005.
The icing on the cake would be adding the ability to record the number of rows returned by each query too (although not the actual rows).
Thanks.
March 19, 2008 at 6:29 am
Not sure, but I think "C2" auditing may do this... I've never used it so I don't know all of the details... check Books Online.
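From a quick look at Books Online, switching it on seems to be just a server option - something like this, I believe (untested on my part, and note it's server-wide and needs a restart):

```sql
-- Enable C2 audit mode (server-wide setting; requires a service restart).
-- 'show advanced options' must be on before 'c2 audit mode' is visible.
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'c2 audit mode', 1;
RECONFIGURE;
-- Restart the SQL Server service for the change to take effect.
-- Audit trace files are written to the instance's default data directory.
```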
--Jeff Moden
Change is inevitable... Change for the better is not.
March 19, 2008 at 7:32 am
This may be over-simplifying it - but isn't this something Profiler does for you? Just continuously record a Profiler session, preferably tuned to only the events you want recorded so that it doesn't get out of control.
It would also give you a way to know when large volumes of data are being moved. Of course - by the time you get around to reading this stuff - that data will already be out of the building, possibly out of the country. Interestingly enough - this is something STORED PROCEDURES could help prevent...
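You don't even need the Profiler GUI running for this - you can script the same thing as a server-side trace. A rough sketch (the file path, size, and database name are placeholders for your own environment - double-check the event/column IDs in Books Online before relying on it):

```sql
-- Sketch: server-side trace capturing completed batches to a file.
DECLARE @TraceID int, @maxsize bigint, @on bit;
SET @maxsize = 500;   -- max file size in MB before rollover
SET @on = 1;

-- Option 2 = TRACE_FILE_ROLLOVER: start a new file when @maxsize is hit.
-- The .trc extension is appended automatically.
EXEC sp_trace_create @TraceID OUTPUT, 2, N'D:\Traces\audit', @maxsize;

-- Event 12 = SQL:BatchCompleted.
-- Columns: 1 = TextData, 11 = LoginName, 14 = StartTime,
--          48 = RowCounts (available in SQL Server 2005 traces).
EXEC sp_trace_setevent @TraceID, 12, 1,  @on;
EXEC sp_trace_setevent @TraceID, 12, 11, @on;
EXEC sp_trace_setevent @TraceID, 12, 14, @on;
EXEC sp_trace_setevent @TraceID, 12, 48, @on;

-- Column 35 = DatabaseName; operators: 0 = AND, 6 = LIKE.
-- Filters the trace down to the single reporting database.
EXEC sp_trace_setfilter @TraceID, 35, 0, 6, N'YourReportingDB';

EXEC sp_trace_setstatus @TraceID, 1;   -- 1 = start the trace
```

The RowCounts column would even cover the "number of rows returned" part of the original question.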
I've seen weirder - but I've never quite understood that thought process. Give someone unrestricted access, but don't trust them. If you're worried about large volumes of data leaving the building, then prevent that from happening - period.
Sorry - I'm ranting. You're just the implementer, not the policy maker:).
----------------------------------------------------------------------------------
Your lack of planning does not constitute an emergency on my part...unless you're my manager...or a director and above...or a really loud-spoken end-user..All right - what was my emergency again?
March 19, 2008 at 7:39 am
Thanks for the replies both - some things for me to take a look at there. The BOL documentation of C2 is fairly brief - tells you how to switch it on and that's about it. Profiler could be the way to go.
On the second point - these are trusted users that will be accessing the data, and they need pretty much free access to everything. However, the company wishes to be able to demonstrate that it took reasonable steps to monitor so that, in the event of one of those people "turning bad", they would be able to see what they had done after the event; and could not be accused of having done nothing at all up front.
March 19, 2008 at 8:01 am
Nick,
I've been looking into C2 Auditing because we need to set up this level of audit logging for PCI Compliance. There is a good article on this site which gives some starting info.
http://www.sqlservercentral.com/articles/Monitoring/basicsofc2auditing/1547/
These files can chew up a lot of space on a busy system and, as that article says, if SQL Server can't write the audit logs it will shut the service down, which ain't pretty!
This is also only available at the server level. So if you are running a busy, multiple database environment, get plenty of storage and pull the log files off the machine either daily or at an interval acceptable to your company's on-site/off-site data retention policy.
Luckily for us, we have to separate our credit card data into a completely separate server to meet Compliance which means we will only be audit logging calls into this sensitive data, which hopefully shouldn't be a huge amount.
As to running Profiler to capture this data on a production server I would be wary of the performance impact on the server if there is a lot of activity. Personally I don't like running Profiler unless I'm troubleshooting and use it with caution on busy production environments.
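If you do go down the trace-file route (GUI or server-side), the captured files can be pulled back into a table for review or archiving off the box. Something along these lines - the path is just an example:

```sql
-- Load a captured trace file into a result set for review/archiving.
-- 'default' tells the function to follow any rollover files in sequence.
-- Columns not collected by the trace come back as NULL.
SELECT StartTime, LoginName, TextData, RowCounts
FROM sys.fn_trace_gettable(N'D:\Traces\audit.trc', default)
ORDER BY StartTime;
```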
Good luck.