Things to be considered before implementing database auditing

  • Hi All,

    As part of security compliance, we have been asked to audit DML and DDL operations on a 4 TB database running SQL Server 2017 Enterprise Edition. Before implementing, I want to seek advice on setting up a server audit plus database audit on some highly volatile tables.

    Some tables see frequent bulk inserts and bulk deletes, so I am a little concerned about the disk space requirements and the CPU overhead auditing will add. What points should be discussed or put to management before implementing auditing on large databases and highly volatile tables?

    Please advise.

    Regards,

    Sam
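
    For reference, the kind of setup described here (a server audit writing to file, plus a database audit specification covering DML and DDL) might be sketched as below. The audit name, file path, database name, and size caps are all placeholders, not recommendations:

    ```sql
    USE master;

    -- The server audit defines the destination; MAXSIZE and MAX_ROLLOVER_FILES
    -- cap total disk usage (here roughly 100 GB).
    CREATE SERVER AUDIT ComplianceAudit
    TO FILE (
        FILEPATH = N'D:\Audit\',     -- placeholder path
        MAXSIZE = 1 GB,
        MAX_ROLLOVER_FILES = 100
    )
    WITH (QUEUE_DELAY = 1000, ON_FAILURE = CONTINUE);
    ALTER SERVER AUDIT ComplianceAudit WITH (STATE = ON);

    USE YourDb;  -- placeholder database name

    -- The database audit specification says what to capture:
    -- SCHEMA_OBJECT_CHANGE_GROUP covers DDL; INSERT/UPDATE/DELETE cover DML.
    CREATE DATABASE AUDIT SPECIFICATION ComplianceDbSpec
    FOR SERVER AUDIT ComplianceAudit
        ADD (SCHEMA_OBJECT_CHANGE_GROUP),
        ADD (INSERT, UPDATE, DELETE ON DATABASE::YourDb BY public)
    WITH (STATE = ON);
    ```

    Note that `ON_FAILURE = CONTINUE` trades audit completeness for availability; `ON_FAILURE = SHUTDOWN` is the stricter compliance choice.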

  • First off: does this audit need to be legally binding?

    Do you need to stand in a court of law and unequivocally say, "Person X did actions A, B, and C, and I can 100% guarantee that information"?

    If so, then what you want is an appliance to do the auditing for you (Imperva, Guardium, etc.) that sits between the network and the DB server. DON'T get auditing software; it must be an appliance, and it must be controlled by your security team, not the DBAs.

     

    Secondly, how tamper-proof does the audit need to be? Anyone with enough rights can stop a SQL Audit, do what they want to do, then re-enable the audit. Anyone with rights to the path/UNC that the .sqlaudit file is written to can likewise tamper with the files.

    For that, you may want software like Idera SQL Audit, which is again controlled by security, but it's not as legally binding as an appliance would be.
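
    One partial mitigation is to audit the audit itself: the server-level `AUDIT_CHANGE_GROUP` action group records any `CREATE`/`ALTER`/`DROP` of audit objects, including someone switching an audit off. A sketch, assuming a server audit named `ComplianceAudit` already exists (the specification name here is hypothetical):

    ```sql
    USE master;

    -- Any change to audit objects (including STATE = OFF) is itself logged.
    CREATE SERVER AUDIT SPECIFICATION AuditTamperSpec
    FOR SERVER AUDIT ComplianceAudit
        ADD (AUDIT_CHANGE_GROUP)
    WITH (STATE = ON);
    ```

    A sysadmin can still disable this specification too, which is exactly why the appliance route exists for the strictest requirements.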

     

    Thirdly, who is actually going to be looking at this data? What tools do you have, like syslog servers, Splunk, etc.? You're going to be generating thousands of events, so who's going to take the .sqlaudit file, parse it, and read it, or who's going to take the Windows Security log, parse it, and read it?
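
    For the .sqlaudit files specifically, `sys.fn_get_audit_file` is the built-in way to parse them, so whatever does the reading (a scheduled job, a Splunk forwarder feed, etc.) would run something along these lines. The path is a placeholder:

    ```sql
    -- Read all rollover files matching the pattern; DEFAULT, DEFAULT means
    -- start from the first file and the first record.
    SELECT event_time, server_principal_name, action_id, [statement]
    FROM sys.fn_get_audit_file(N'D:\Audit\*.sqlaudit', DEFAULT, DEFAULT)
    ORDER BY event_time;
    ```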


    As for overhead, yes, there is going to be some; as with anything, how much will depend drastically on your workload, but expect roughly 1%-5% additional load. You also need to ensure that whatever you're writing the audit data to can keep up with the volume being written. If you're going to the Security log, how quickly is it rolling over, and can syslog/Splunk keep up with the churn? The Security log is limited in size, so if you're churning through it faster than syslog can read it, you'll lose events.
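
    On the "can the target keep up" point, `sys.dm_server_audit_status` is worth watching: a steadily growing `queue_size` means events are being produced faster than they can be written out. A minimal check:

    ```sql
    -- One row per server audit: current state, write backlog, and file in use.
    SELECT name, status_desc, queue_size, audit_file_path, audit_file_size
    FROM sys.dm_server_audit_status;
    ```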

  • Thank you, Ant, for the suggestions.
