September 16, 2010 at 6:17 am
We have a secure shop running SQL Server 2008 EE on a couple of Microsoft Failover Clusters (Windows Server 2008 EE). The data security folks want to implement C2 level auditing as we migrate from SQL Server 2005 to SQL Server 2008.
We have 8 databases online in our Production (high-intensity write) SQL Server 2008 environment at the moment, and we are getting swamped with 200 GB of .trc files, which means that C2 level auditing is pretty much functioning as designed. 🙁
I'm looking for ideas for how to zip up the old .trc files and move them to a different server for storage. Unfortunately, the audittrace*.trc files appear to be randomly named, based on a date stamp and what appears to be a sequential trace file number. At this point, I cannot tell which .trc files belong to which database, and I cannot tell (programmatically) which files are open for write. (If a file is open for write, it cannot be deleted.)
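One idea I have not tried yet is attempting an exclusive open on each file, which should fail while SQL Server still has the trace open for write. A rough, untested sketch (the path is hypothetical and would need to match your instance's data directory):

```powershell
# Try an exclusive open on each trace file; it should only succeed
# once SQL Server has rolled over to the next trace file.
# The path below is hypothetical.
$TraceFiles = Get-ChildItem "L:\MSSQL10.instance_name\MSSQL\DATA\audittrace*.trc"
foreach ($File in $TraceFiles)
{
    try
    {
        # 'None' sharing requests an exclusive lock
        $Stream = [System.IO.File]::Open($File.FullName, 'Open', 'ReadWrite', 'None')
        $Stream.Close()
        Write-Host "$($File.Name) is closed and should be safe to archive"
    }
    catch
    {
        Write-Host "$($File.Name) is still open for write"
    }
}
```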
Does anyone have experience with this in their shop?
Alternatively, is there a way to move the default location for the creation of the .trc files?
September 16, 2010 at 7:26 am
Since C2 audit trace files are created in the default data directory for SQL Server, you may change the default data directory.
Pradeep Adiga
Blog: sqldbadiaries.com
Twitter: @pradeepadiga
September 16, 2010 at 10:52 am
Adiga (9/16/2010)
Since C2 audit trace files are created in the default data directory for SQL Server, you may change the default data directory.
Actually, this is part of the problem. The .trc files and the .mdf files are in the same directory. Can they be separated so that future .trc files go to a different disk?
September 16, 2010 at 1:04 pm
As far as I know they cannot be separated.
Pradeep Adiga
Blog: sqldbadiaries.com
Twitter: @pradeepadiga
September 17, 2010 at 7:02 am
In case anyone stumbles across this forum in the future, I thought I would add some information. I don't have a workaround yet, but here is what Microsoft says about how C2 level auditing works.
----------
1. The trace file is saved as <audittrace..Year..Month..Date...Timestamp hr:min:sec>.
2. There is only one trace file per instance for all databases. Only one trace file (200 MB) with the latest timestamp should be open for logging at any given time for a particular instance.
3. C2 log files always go to the instance's default data file location. You cannot change the path of the C2 audit files, but you can change the default data location to a different drive. To change the default path, right-click the server, choose Database Settings, and change the database default locations to where you want the files to go.
After a SQL Server restart, the trace file will be created in the new default location provided in Database Settings. This does not change the location of existing databases; however, you will need to make sure that the next time you create a database, you point it to the right location.
SQL Server databases are not supported on compressed volumes. As long as the trace log directory remains available and the databases themselves are not put on a compressed volume, you should be able to point the trace files to a compressed volume. I would recommend testing it in your environment to ensure the availability of the compressed volume, because if the drive is not available to write the trace log to for any reason, it can force the SQL Server service to stop.
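For what it's worth, the same default-data-location change can apparently be scripted with SMO instead of clicking through Management Studio. A sketch, untested in our environment (the server name and target path below are made up):

```powershell
# Untested sketch: change the instance's default data file location via SMO.
# 'SERVER\INSTANCE' and 'T:\SqlTraceFiles' are hypothetical.
[void][System.Reflection.Assembly]::LoadWithPartialName('Microsoft.SqlServer.Smo')
$Server = New-Object Microsoft.SqlServer.Management.Smo.Server 'SERVER\INSTANCE'
$Server.Settings.DefaultFile = 'T:\SqlTraceFiles'
$Server.Settings.Alter()
# SQL Server must be restarted before new audittrace*.trc files
# are created in the new location.
```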
September 17, 2010 at 7:16 am
shew (9/16/2010)
I'm looking for ideas for how to zip up the old .trc files and move them to a different server for storage. Unfortunately, the audittrace*.trc files appear to be randomly named, based on a date stamp and what appears to be a sequential trace file number. At this point, I cannot tell which .trc files belong to which database, and I cannot tell (programmatically) which files are open for write. (If a file is open for write, it cannot be deleted.)
Coming back to this question, what I can think of is writing a script that pulls all the .trc files from the folder where they are stored, sorts them by creation date/timestamp, and zips the files one by one, except the latest one. Schedule this script to run at an interval of a few hours (adjust the interval based on your observations).
I had earlier written a VBScript that sorts the files...
Dim TracePath, fso, folder, files, fileIdx
Dim allFiles(), allTimeStamps()
Dim i, j, tt
' Hypothetical path -- point this at your instance's default data directory
TracePath = "L:\MSSQL10.instance_name\MSSQL\DATA\"
Set fso = CreateObject("Scripting.FileSystemObject")
' Strip the trailing backslash for GetFolder
Set folder = fso.GetFolder(Left(TracePath, Len(TracePath) - 1))
Set files = folder.Files
' Collect the file names and their last-modified timestamps
i = 0
For Each fileIdx In files
    ReDim Preserve allFiles(i)
    ReDim Preserve allTimeStamps(i)
    allFiles(i) = fileIdx.Name
    allTimeStamps(i) = fileIdx.DateLastModified
    i = i + 1
Next
' Now sort both arrays ascending by modified timestamp (simple exchange sort)
For i = LBound(allTimeStamps) To UBound(allTimeStamps)
    For j = LBound(allTimeStamps) To UBound(allTimeStamps)
        If allTimeStamps(i) < allTimeStamps(j) Then
            tt = allTimeStamps(i)
            allTimeStamps(i) = allTimeStamps(j)
            allTimeStamps(j) = tt
            tt = allFiles(i)
            allFiles(i) = allFiles(j)
            allFiles(j) = tt
        End If
    Next
Next
' -1 so that we leave the latest file, which may still be open for write
For i = LBound(allFiles) To UBound(allFiles) - 1
    ' here you write code to zip the file
    ' code to move the zipped file
Next
I modified the script a bit; I haven't tested it.
September 17, 2010 at 10:14 am
I started working on something similar with PowerShell (our preferred scripting language--our DBA team is not skilled in VBScript), but, unfortunately, PowerShell calls the zip object asynchronously, and the delete statement whacks some of the files before they flow through the "queue" for zipping. (This code is loosely based on http://blogs.msdn.com/b/daiken/archive/2007/02/12/compress-files-with-windows-powershell-then-package-a-windows-vista-sidebar-gadget.aspx .) The following is a "condensed version" and has not been fully tested. However, it shows what I am trying to do at this point.
$SourceFileSpecification_STR="L:\MSSQL10.instance_name\MSSQL\DATA\audittrace*.trc"
# Note: Shell.Application's NameSpace() needs a full path, so use an
# absolute path here in practice
$DestinationZipFileSpecification_STR="my.zip"
# Cutoff for "old" files; the full script sets this elsewhere, so the
# value below is only an example (files last written over an hour ago)
$LastWrite_DTE = (Get-Date).AddHours(-1)
# List the candidate files, then capture them for processing
get-childitem $SourceFileSpecification_STR | Where {$_.LastWriteTime -le $LastWrite_DTE}
$File_NMEs = get-childitem $SourceFileSpecification_STR | Where {$_.LastWriteTime -le $LastWrite_DTE}
<###############################################################
#
# Create empty .zip file
#
###############################################################>
set-content $DestinationZipFileSpecification_STR ("PK" + [char]5 + [char]6 + ("$([char]0)" * 18))
(dir $DestinationZipFileSpecification_STR).IsReadOnly = $false
<###############################################################
#
# Zip and delete the files
#
###############################################################>
$ShellApplication = new-object -com shell.application
$ZipPackage = $ShellApplication.NameSpace($DestinationZipFileSpecification_STR)
foreach ($File in $File_NMEs)
{
<###############################################################
#
# Display file information
#
###############################################################>
write-host "Current file: $File; Length=$($File.Length)"
dir $File
<###############################################################
#
# Compress the file
#
###############################################################>
$PotentialFailureReason_DSC = "Error compressing $File"
$ZipPackage.CopyHere($File.FullName)
if (!$?) {job_failure}   # job_failure is a helper defined elsewhere in our scripts
Start-sleep -milliseconds 9000
<###############################################################
#
# Delete the original file--note that open files will fail on
# the delete, which is what you want it to do
#
###############################################################>
del $file
#if (!$?) {job_failure}
}
September 17, 2010 at 10:44 am
In fact, it's good to have it written in PowerShell 🙂
I suppose VBScript would have faced the same issue of the zipping being asynchronous.
Maybe you can write the move part as a separate job that moves all the zipped files, and run it before the zipping job runs.
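Something like this for the separate move job, an untested sketch (the source and target paths are made up):

```powershell
# Untested sketch: a separate job that moves completed .zip archives
# off to storage before the next zipping run starts.
# Both paths below are hypothetical.
$ZipSource     = "L:\MSSQL10.instance_name\MSSQL\DATA"
$ArchiveTarget = "\\storageserver\audit_archive"
Get-ChildItem "$ZipSource\audittrace*.zip" |
    Move-Item -Destination $ArchiveTarget
```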
September 18, 2010 at 4:22 pm
Can't you just write a batch file loop for this?
--Jeff Moden
Change is inevitable... Change for the better is not.
September 18, 2010 at 7:58 pm
Jeff Moden (9/18/2010)
Can't you just write a batch file loop for this?
That would be great. 🙂 Please post code for what you have in mind.
September 18, 2010 at 8:21 pm
shew (9/18/2010)
Jeff Moden (9/18/2010)
Can't you just write a batch file loop for this?
That would be great. 🙂 Please post code for what you have in mind.
Heh... jeez... I guess I've got to do that instead of you looking up how to loop in batch code, huh? 😉 Alright... it's been a thousand years or so since I've had to do such a thing but let me see what I can churn up.
--Jeff Moden
Change is inevitable... Change for the better is not.
September 19, 2010 at 5:42 am
Heh... jeez... I guess I've got to do that instead of you looking up how to loop in batch code, huh? 😉 Alright... it's been a thousand years or so since I've had to do such a thing but let me see what I can churn up.
Actually, I've spent several hours on this, and I do not have a suitable solution yet. You make it sound simple. Let's see what you've got. I have pasted the help output for the "for" command so you won't have to look it up.
Looping is not the issue. Zipping is the issue. Do you know of a way to zip a file using native Windows Server commands, without having to install any third party software?
Runs a specified command for each file in a set of files.
FOR %variable IN (set) DO command [command-parameters]
%variable Specifies a single letter replaceable parameter.
(set) Specifies a set of one or more files. Wildcards may be used.
command Specifies the command to carry out for each file.
command-parameters
Specifies parameters or switches for the specified command.
To use the FOR command in a batch program, specify %%variable instead
of %variable. Variable names are case sensitive, so %i is different
from %I.
If Command Extensions are enabled, the following additional
forms of the FOR command are supported:
FOR /D %variable IN (set) DO command [command-parameters]
If set contains wildcards, then specifies to match against directory
names instead of file names.
FOR /R [[drive:]path] %variable IN (set) DO command [command-parameters]
Walks the directory tree rooted at [drive:]path, executing the FOR
statement in each directory of the tree. If no directory
specification is specified after /R then the current directory is
assumed. If set is just a single period (.) character then it
will just enumerate the directory tree.
FOR /L %variable IN (start,step,end) DO command [command-parameters]
The set is a sequence of numbers from start to end, by step amount.
So (1,1,5) would generate the sequence 1 2 3 4 5 and (5,-1,1) would
generate the sequence (5 4 3 2 1)
FOR /F ["options"] %variable IN (file-set) DO command [command-parameters]
FOR /F ["options"] %variable IN ("string") DO command [command-parameters]
FOR /F ["options"] %variable IN ('command') DO command [command-parameters]
or, if usebackq option present:
FOR /F ["options"] %variable IN (file-set) DO command [command-parameters]
FOR /F ["options"] %variable IN ('string') DO command [command-parameters]
FOR /F ["options"] %variable IN (`command`) DO command [command-parameters]
file-set is one or more file names. Each file is opened, read
and processed before going on to the next file in file-set.
Processing consists of reading in the file, breaking it up into
individual lines of text and then parsing each line into zero or
more tokens. The body of the for loop is then called with the
variable value(s) set to the found token string(s). By default, /F
passes the first blank separated token from each line of each file.
Blank lines are skipped. You can override the default parsing
behavior by specifying the optional "options" parameter. This
is a quoted string which contains one or more keywords to specify
different parsing options. The keywords are:
eol=c - specifies an end of line comment character
(just one)
skip=n - specifies the number of lines to skip at the
beginning of the file.
delims=xxx - specifies a delimiter set. This replaces the
default delimiter set of space and tab.
tokens=x,y,m-n - specifies which tokens from each line are to
be passed to the for body for each iteration.
This will cause additional variable names to
be allocated. The m-n form is a range,
specifying the mth through the nth tokens. If
the last character in the tokens= string is an
asterisk, then an additional variable is
allocated and receives the remaining text on
the line after the last token parsed.
usebackq - specifies that the new semantics are in force,
where a back quoted string is executed as a
command and a single quoted string is a
literal string command and allows the use of
double quotes to quote file names in
file-set.
Some examples might help:
FOR /F "eol=; tokens=2,3* delims=, " %i in (myfile.txt) do @echo %i %j %k
would parse each line in myfile.txt, ignoring lines that begin with
a semicolon, passing the 2nd and 3rd token from each line to the for
body, with tokens delimited by commas and/or spaces. Notice the for
body statements reference %i to get the 2nd token, %j to get the
3rd token, and %k to get all remaining tokens after the 3rd. For
file names that contain spaces, you need to quote the filenames with
double quotes. In order to use double quotes in this manner, you also
need to use the usebackq option, otherwise the double quotes will be
interpreted as defining a literal string to parse.
%i is explicitly declared in the for statement and the %j and %k
are implicitly declared via the tokens= option. You can specify up
to 26 tokens via the tokens= line, provided it does not cause an
attempt to declare a variable higher than the letter 'z' or 'Z'.
Remember, FOR variables are single-letter, case sensitive, global,
and you can't have more than 52 total active at any one time.
You can also use the FOR /F parsing logic on an immediate string, by
making the file-set between the parenthesis a quoted string,
using single quote characters. It will be treated as a single line
of input from a file and parsed.
Finally, you can use the FOR /F command to parse the output of a
command. You do this by making the file-set between the
parenthesis a back quoted string. It will be treated as a command
line, which is passed to a child CMD.EXE and the output is captured
into memory and parsed as if it was a file. So the following
example:
FOR /F "usebackq delims==" %i IN (`set`) DO @echo %i
would enumerate the environment variable names in the current
environment.
In addition, substitution of FOR variable references has been enhanced.
You can now use the following optional syntax:
%~I - expands %I removing any surrounding quotes (")
%~fI - expands %I to a fully qualified path name
%~dI - expands %I to a drive letter only
%~pI - expands %I to a path only
%~nI - expands %I to a file name only
%~xI - expands %I to a file extension only
%~sI - expanded path contains short names only
%~aI - expands %I to file attributes of file
%~tI - expands %I to date/time of file
%~zI - expands %I to size of file
%~$PATH:I - searches the directories listed in the PATH
environment variable and expands %I to the
fully qualified name of the first one found.
If the environment variable name is not
defined or the file is not found by the
search, then this modifier expands to the
empty string
The modifiers can be combined to get compound results:
%~dpI - expands %I to a drive letter and path only
%~nxI - expands %I to a file name and extension only
%~fsI - expands %I to a full path name with short names only
%~dp$PATH:I - searches the directories listed in the PATH
environment variable for %I and expands to the
drive letter and path of the first one found.
%~ftzaI - expands %I to a DIR like output line
In the above examples %I and PATH can be replaced by other valid
values. The %~ syntax is terminated by a valid FOR variable name.
Picking upper case variable names like %I makes it more readable and
avoids confusion with the modifiers, which are not case sensitive.
September 19, 2010 at 5:10 pm
shew (9/19/2010)
Looping is not the issue. Zipping is the issue. Do you know of a way to zip a file using native Windows Server commands, without having to install any third party software?
No... I don't know of any way to do this using native Windows Server commands, although it looks like you've figured that out with PowerShell (which I also don't use). I was going to recommend something like 7-Zip, which is free and does have a command-line mode which is synchronous.
Is the only problem the fact that the zipping occurs asynchronously in your current attempt?
--Jeff Moden
Change is inevitable... Change for the better is not.
September 19, 2010 at 6:25 pm
...it looks like you've figured that out with Power Shell (which I also don't use).
We didn't use PowerShell either until this project to migrate from SQL Server 2005 to SQL Server 2008. It looks as if PowerShell is Microsoft's future, so we bit the bullet, and we are trying to implement everything we write as PowerShell scripts. So far, we have been pleased with the results. PowerShell is a bit like Unix scripting, and it is much more powerful than cmd.exe.
Is the only problem the fact that the zipping occurs asynchronously in your current attempt?
Yes, the asynchronous zip is the current problem. Our shop is a government site, and any third-party software (including freeware) has to survive endless scrutiny by our data security folks. It's just not worth trying to bring in third-party software if we can avoid it. Do you have any idea how to make it synchronous? Right now, we are just issuing a sleep command between calls to the zip code. However, that is dangerous: some day there will be a bigger file that takes longer than our sleep value. 🙁
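One variation we are considering is polling the zip package's item count instead of sleeping for a fixed interval, so the loop only proceeds once the Shell object has actually added the file. An untested sketch, reusing the $ZipPackage and $File variables from the script above:

```powershell
# Untested sketch: instead of a fixed Start-Sleep, wait until the
# Shell object has actually added the file to the archive.
# A production version should also have a timeout guard so a failed
# CopyHere cannot loop forever.
$CountBefore = $ZipPackage.Items().Count
$ZipPackage.CopyHere($File.FullName)
while ($ZipPackage.Items().Count -eq $CountBefore)
{
    Start-Sleep -Milliseconds 250
}
```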
September 19, 2010 at 7:41 pm
Heh... been there and done that. Used to work for a "little" DOD shop known as Raytheon. 😛
The only way that I can think of to make this synchronous is based on what the Windows native zip thing actually does. Do you need to precreate the zip file, or will Windows do that when it's done compressing? I was under the impression that Windows zip created a "working file" with a different name and, if it was successful, would rename that working file as the final zip name. If that's actually true, then all you have to do is wait for the correctly named zip file to show up before you do your delete.
Another way to do this would be to run two jobs, one after the other. The first one would do all the zips; the second one would do the deletes, but only for those files that had a zip file associated with them.
I just don't know enough about Power Shell to say "Here's the magic bullet". My apologies.
Considering that you're stuck with what's native (heh... would have been nice to know that earlier), the DOS loop with a nice 7-ZIP (free) or WINZIP PRO (not free but not expensive) isn't going to work for you. If you CAN use approved 3rd party software, you may find that WINZIP PRO is on the approved list (can't remember the name of that list anymore, it's been almost 20 years... I believe it starts with a "G" and is 3 letters long) of software.
--Jeff Moden
Change is inevitable... Change for the better is not.