May 24, 2007 at 9:52 am
Gentlemen, I'm not sure how to go about this, but I thought someone might have done something similar...
I need a way to tell, nightly, whether every folder in
my backup directory has a .BAK file less than two days old.
Any input/suggestions/guidance is welcome.
Mike
May 24, 2007 at 10:54 pm
Hi Mike...
Perhaps this will help... do read the comments... you'll figure out how to use it... could turn it into a table-valued function, I suppose...
USE Master
GO
CREATE PROCEDURE dbo.sp_GetDirectoryFileInfo
/****************************************************************************************
PURPOSE:
This utility stored procedure returns file information from any given directory.
INPUT PARAMETERS:
The unquoted \\MachineName\Path or d:\Path where "d:" is the drive letter. Wildcards
may be used for file names and extensions. Only path information is allowed. Inclusion
of anything not interpreted as a valid path will cause an empty result set to be
returned for security reasons.
OUTPUTS:
Column name DataType Description
----------- -------- ----------------------------------------------------------
RowNum INTEGER Sequential number
FileName VARCHAR(256) Filename and extension from the DIR command
CreateDate DATETIME Date the file was created on
LastModifiedDate DATETIME Date the file was last modified
Bytes BIGINT The number of bytes the file contains
If the path was not found, is empty, the parameter passed was not an actual path, or
the permissions to access a legitimate path do not exist for MS-SQL Server, the stored
procedure will return an empty result set. This is partially for security reasons...
if a hacker gets no return, they don't know if they're on the right track or not.
OPERATIONAL NOTES AND SAFEGUARDS:
1. This stored procedure is stored in the MASTER database for accessibility from all
databases using only one- or two-part naming.
2. Because this stored procedure uses xp_CmdShell, the path is encapsulated in quotes
during processing to prevent SQL Injection attacks by adding other commands along
with the path.
3. Only file info for the given path is returned. Again, for security reasons, no sub-
directory information is returned to prevent an attack by using C:\ as the path to
discover the directory structure of the drive.
4. As a further safeguard, since the path is encapsulated in quotes, any switches or
flags passed will simply cause an illegal path to be formed and an empty result set
will be returned.
Revision History:
Rev 00 - 04/26/2006 - Jeff Moden - Initial creation and unit test
****************************************************************************************/
--===== Declare I/O parameters
@pPath VARCHAR(256) --The path info and wildcards to be used with a DIR command
AS
--=======================================================================================
--===== Presets
--=======================================================================================
--===== Suppress the autodisplay of rowcounts for appearance and speed
SET NOCOUNT ON
--===== Declare local variables
DECLARE @Command VARCHAR (300) --Holds the dynamic DOS command for the DIR command
--===== If the temp table that holds the Directory output is not null, drop the table
IF OBJECT_ID('TempDB..#DosOutput') IS NOT NULL
DROP TABLE #DosOutput
--===== Create the temp table that holds the Directory output
CREATE TABLE #DosOutput
(
RowNum INT IDENTITY(1,1),
Data VARCHAR(300)
)
--===== If the temp table that holds the file information is not null, drop the table
IF OBJECT_ID('TempDB..#FileInfo') IS NOT NULL
DROP TABLE #FileInfo
--=======================================================================================
--===== Get the directory information and the LastModifiedDate for lines with files only.
--=======================================================================================
--===== Setup to do a "DIR" with the following switches
-- /TW = Date/Time file was last written to (LastModifiedDate)
-- /-C = List number of bytes without commas
-- Enclose the @pPath variable in quotes to prevent SQL Injection attacks
SET @Command = 'DIR "' + @pPath + '" /TW /-C'
--===== Execute the "DIR" command and save the output in #DosOutput
-- (order preserved by the Primary Key)
INSERT INTO #DosOutput (Data)
EXEC Master.dbo.xp_CmdShell @Command
--===== Parse the Dos output into the file info table.
-- The criteria in the WHERE clause ensures only file info is returned
SELECT
IDENTITY(INT,1,1) AS RowNum,
SUBSTRING(Data,40,256) AS [FileName],
CAST(NULL AS DATETIME) AS CreateDate, --Populated on next step
CONVERT(DATETIME,SUBSTRING(Data,1,23)) AS LastModifiedDate,
CAST(SUBSTRING(Data,22,17) AS BIGINT) AS Bytes
INTO #FileInfo
FROM #DosOutput
WHERE SUBSTRING(Data,15,1) = ':' --Row has a date/time on it
AND Data NOT LIKE '%<DIR>%' --Row is not a directory listing
--=======================================================================================
--===== Update each file's info with the CreateDate
--=======================================================================================
--===== Setup to do a "DIR" with the following switches
-- /TC = Date/Time File was created (CreateDate)
-- Enclose the @pPath variable in quotes to prevent SQL Injection attacks
SET @Command = 'DIR "' + @pPath + '" /TC'
--===== Clear the #DosOutput table
TRUNCATE TABLE #DosOutput
--===== Execute the "DIR" command and save the output in #DosOutput
-- (order preservation not important here)
INSERT INTO #DosOutput (Data)
EXEC Master.dbo.xp_CmdShell @Command
--===== Parse the DOS output table for the CreateDate and add it to the
-- file info table.
UPDATE #FileInfo
SET CreateDate = CONVERT(DATETIME,SUBSTRING(do.Data,1,23))
FROM #FileInfo fi,
#DosOutput do
WHERE fi.FileName = SUBSTRING(do.Data,40,256) --Filenames match
AND SUBSTRING(do.Data,15,1) = ':' --Row has a date/time on it
AND do.Data NOT LIKE '%<DIR>%' --Row is not a directory listing
--=======================================================================================
--===== Return a result set to the calling object
--=======================================================================================
SELECT * FROM #FileInfo
--=======================================================================================
--===== Housekeeping
--=======================================================================================
--===== If the temp table that holds the Directory output is not null, drop the table
IF OBJECT_ID('TempDB..#DosOutput') IS NOT NULL
DROP TABLE #DosOutput
--===== If the temp table that holds the file information is not null, drop the table
IF OBJECT_ID('TempDB..#FileInfo') IS NOT NULL
DROP TABLE #FileInfo
--===== Exit the proc with no error reporting for security reasons
RETURN
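For example (untested, and the path below is just a placeholder for one of your folders), you'd call it with the path and any wildcards you need, then look at LastModifiedDate in the result set... one caveat: because the proc uses INSERT/EXEC internally, I don't believe you can wrap another INSERT/EXEC around it to capture the output directly.
--===== Untested example call... the path and wildcard are just placeholders
   EXEC Master.dbo.sp_GetDirectoryFileInfo 'D:\Backups\MyDatabase\*.BAK'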
--Jeff Moden
Change is inevitable... Change for the better is not.
June 7, 2007 at 6:42 am
Thanks for your time, Jeff. I will be looking into the script.
June 7, 2007 at 9:12 am
Jeff, I'm curious why you went with xp_cmdshell over using the sp_OA~ procedures to drive the FileSystemObject?
June 11, 2007 at 6:39 am
If you're only looking for the existence of a single file, xp_fileexist might be the simplest way. It's undocumented, but you can get some details here:
xp_fileexist "c:\autoexec.bat"

File Exists File is a Directory Parent Directory Exists
----------- ------------------- -----------------------
          1                   0                       1
OR, with an output parm for existence:
declare @result int
exec xp_fileexist 'c:\autoexec.bat', @result output
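...and then something like this (untested sketch) to act on the result:
declare @result int
exec master.dbo.xp_fileexist 'c:\autoexec.bat', @result output
if @result = 1
    print 'File exists'
else
    print 'File is missing'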
June 11, 2007 at 7:21 am
Mostly habit, Andy... I heard (a very long time ago) that the sp_OA routines had memory leaks (never did test to find out if that was true, my bad) and got out of the habit of using them.
Also, I really should do a rewrite using xp_getfiledetails instead of parsing the DOS returns... xp_getfiledetails is still available in 2k5... dunno if it will be in 2k8...
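Something like this would be the starting point (untested here, and the path is just a placeholder... I'd have to double-check the exact columns it returns):
--===== Untested... returns size and date info for a single file
   EXEC Master.dbo.xp_getfiledetails 'C:\Backups\MyDatabase.BAK'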
--Jeff Moden
Change is inevitable... Change for the better is not.
June 11, 2007 at 7:46 am
I've never tested either; at best they are clunky, but useful. Don't know that I'd want to call them from a proc that was hit hard every day, but for jobs I think it's no problem.
June 11, 2007 at 11:47 am
Haven't tested it, but it looks like you may have some issues trying to run this in a 2005 environment. I know xp_cmdshell is turned off by default and would require some modifications to the 'surface area config' for it to work.
Also, I notice you're dropping temp tables by doing an OBJECT_ID check for the table in tempdb. This works for normal tables, but I believe with temp tables SQL Server appends some extra characters to the physical name, so I'm thinking that might not work properly. However, when using a temp table inside an sp, it will get dropped by default at the end of execution...
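If I remember right, the T-SQL equivalent of the Surface Area Configuration change is something like this (run as a sysadmin):
-- Enable xp_cmdshell on SQL Server 2005 (requires sysadmin)
EXEC sp_configure 'show advanced options', 1
RECONFIGURE
EXEC sp_configure 'xp_cmdshell', 1
RECONFIGURE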
hope this helps...
Ben Sullins
bensullins.com
Beer is my primary key...
June 11, 2007 at 7:28 pm
Ya gotta trust me on this one, Ben... things like the following tidbit have been tested, tried, and trued by me and about a thousand other folks on this and other forums...
--===== If the temp table that holds the Directory output is not null, drop the table
IF OBJECT_ID('TempDB..#DosOutput') IS NOT NULL
DROP TABLE #DosOutput
You're correct that temp tables have some extra characters after the name, which is why a trip to SysObjects in TempDB is a bit futile, but the above code works just fine.
And, yeaup, I agree that temp tables drop at the end of a session IF the session drops... dropping the table at the beginning just makes life a lot easier for troubleshooting in Query Analyzer because, until you close a given window, the session doesn't drop.
--Jeff Moden
Change is inevitable... Change for the better is not.
June 12, 2007 at 10:56 am
Hmmm... well, if it works, cool!
Ben Sullins
bensullins.com
Beer is my primary key...
June 14, 2007 at 1:55 am
With 2K5 I'd be tempted to use CLR. In 2000 I did write two extended procs, xp_direxists (check if dir exists) and xp_print (to write/append to output file).
I think a well-written extended proc has better integration, is less 'clunky', and has less chance of a memory leak.
Far away is close at hand in the images of elsewhere.
Anon.
June 14, 2007 at 8:47 am
I assume you are looking to make sure your database backups have worked.
Have you considered querying the system tables? Perhaps something like querying msdb..backupset and ..backupfile to get the latest datetime of your backup, then using the stored filename with xp_fileexist to make sure the file is there. This should be a lot easier than trying to handle lists of files.
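As a rough, untested sketch of the idea (the two-day threshold matches your requirement):
-- Untested sketch: databases whose most recent full backup is more than
-- two days old, or that have never been backed up at all
SELECT d.name AS DatabaseName,
       MAX(b.backup_finish_date) AS LastFullBackup
FROM master.dbo.sysdatabases d
LEFT JOIN msdb.dbo.backupset b
       ON b.database_name = d.name
      AND b.type = 'D'                 -- 'D' = full database backup
GROUP BY d.name
HAVING MAX(b.backup_finish_date) < GETDATE() - 2
    OR MAX(b.backup_finish_date) IS NULL
From there you could join to msdb..backupmediafamily on media_set_id to get the physical file name for the xp_fileexist check.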
Hope this helps
Mark
June 25, 2007 at 8:55 am
I agree that it might be easier to query the backup history tables to verify that all the backups that should have been written, were written successfully.
But this script will answer the original question the way it was stated. It assumes that the file names follow the maintenance-plan convention of containing "_db_YYYYMMDD", and that the usual directory-per-database structure is used.
if object_id('tempdb..#dirs') is not null drop table #dirs
if object_id('tempdb..#backups') is not null drop table #backups
go
create table #dirs (subdirectory varchar(1000), depth tinyint)
create table #backups (filename varchar(1000))
go
-- get the immediate subdirectories (one per database)
insert into #dirs
exec xp_dirtree '\\fileserver\SQLBackups'
-- get the full recursive list of .bak files under the backup root
insert into #backups
exec xp_cmdshell 'dir \\fileserver\SQLBackups\*.bak /s /b'
-- report any database folder that does NOT have a "_db_YYYYMMDD" backup
-- dated within the last two days
select Subdirectory FROM #dirs
where depth = 1 and subdirectory not in (
select Folder
from (
select right(Path, charindex('\', reverse(Path)) - 1) as Folder, DateStamp
from (
select filename,
left(filename, len(filename) - charindex('\', reverse(filename))) as Path,
substring(filename, charindex('_db_', filename) + 4, 8) as DateStamp
from #backups
where filename like '%_db_20[0-9][0-9][0-9][0-9][0-9][0-9]%'
) x
) y
group by Folder
having MAX(DateStamp) >= convert(char(8), getdate()-2, 112)
)
go
if object_id('tempdb..#dirs') is not null drop table #dirs
if object_id('tempdb..#backups') is not null drop table #backups