July 2, 2013 at 5:55 pm
Hi All-
I've got a block of code that is part of our build process. It already ran once; then one of the objects was changed, so a column it was trying to insert into wasn't there anymore. That's an issue, but not the one I would suspect. When we run the build again, the entire script should be skipped, since we already ran the script once and we have a version check that would catch that. However, because one of the columns is no longer in the table, the INSERT fails the validity check even though it should never actually execute. Here's the code:
USE P4SE;
GO
SET NOCOUNT ON;

DECLARE @V_VERSION        VARCHAR(10),
        @V_FILENAME       VARCHAR(50),
        @V_SchemaChangeID INT,
        @V_Description    VARCHAR(275);
DECLARE @Cumulative INTEGER;

SELECT @Cumulative = 2;
SELECT @V_VERSION = '2012-13',
       @V_FILENAME = 'SchemaUpd_1.03.sql';
SELECT @V_SchemaChangeID = 2,
       @V_Description = 'Preload config tables';

IF NOT EXISTS (SELECT *
               FROM VersionInfo
               WHERE version = @V_VERSION
                 AND schemachangeid = @V_SchemaChangeID)
BEGIN
    BEGIN TRANSACTION;
    -- *******************************************************************
    -- Begin your SQL Code
    -- *******************************************************************
    INSERT INTO dbo.CfgAssessment (SortOrder, CfgAssessmentTypeId, AssessmentName, MaxLength, HasDate)
    VALUES (12,
            (SELECT CfgAssessmentTypeId
             FROM P4SE.dbo.CfgAssessmentType
             WHERE [Type] = 'alphanumeric'),
            'Other Assessments (be specific)', 100, 0);
    -- *******************************************************************
    -- End your SQL Code here
    -- *******************************************************************
    SET @Cumulative = 0;

    IF @Cumulative = 0
    BEGIN
        INSERT INTO VersionInfo ([Version],
                                 [SchemaChangeId],
                                 [Filename],
                                 [Description],
                                 [DateAdded])
        VALUES (@V_VERSION,
                @V_SchemaChangeID,
                @V_FILENAME,
                @V_Description,
                GETDATE());
        COMMIT;
    END
    ELSE
    BEGIN
        PRINT '!!! Rolling back';
        ROLLBACK;
    END
END
ELSE
    PRINT 'Skipping ' + @V_Description;
I would think that the code would never run, so it wouldn't fail, but it does the same thing within Management Studio as well. Is there any way we can bypass the check when running the script?
Thanks for any help.
July 2, 2013 at 6:00 pm
It's failing during the metadata check at pre-compile, not during actual runtime. You usually run into this when dealing with altered #tmp tables in repeated script runs, when you don't explicitly drop the table at the end.
There's no real way to avoid it, unfortunately. The fact that you need a transaction rules out using GO between the statements to split the script into separate batches and isolate the failing portions.
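That said, if losing the compile-time safety on just that one statement is acceptable, dynamic SQL is a partial escape hatch: a batch passed to sp_executesql (or EXEC) isn't parsed and bound until it actually executes, so the version check gets a chance to skip it first. A rough sketch against your script (assuming the @V_* variables are declared as before; column names are taken from your INSERT and would need to match the current table):

```sql
IF NOT EXISTS (SELECT *
               FROM VersionInfo
               WHERE version = @V_VERSION
                 AND schemachangeid = @V_SchemaChangeID)
BEGIN
    -- The inner batch is only compiled when EXEC runs, so a missing
    -- column in dbo.CfgAssessment no longer breaks the outer script.
    EXEC sp_executesql N'
        INSERT INTO dbo.CfgAssessment (SortOrder, CfgAssessmentTypeId,
                                       AssessmentName, MaxLength, HasDate)
        VALUES (12,
                (SELECT CfgAssessmentTypeId
                 FROM P4SE.dbo.CfgAssessmentType
                 WHERE [Type] = ''alphanumeric''),
                ''Other Assessments (be specific)'', 100, 0);';
END
```

The trade-off is that errors in the dynamic batch only surface at run time, which is exactly the checking you'd be opting out of.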
Never stop learning, even if it hurts. Ego bruises are practically mandatory as you learn unless you've never risked enough to make a mistake.
For better assistance in answering your questions | Forum Netiquette
For index/tuning help, follow these directions. | Tally Tables
Twitter: @AnyWayDBA
July 2, 2013 at 6:17 pm
Thanks for the reply. I guess I'm still having a hard time with it conceptually since something like this:
if object_id('foo') is not null
BEGIN
    DROP TABLE foo;
END
ELSE
    PRINT 'NOT THERE';
will not fail even though the table does not exist in the database either.
July 3, 2013 at 1:13 am
That doesn't fail because of something called deferred compile. If a table isn't found at parse/compile time, the statement that refers to that table is not parsed or compiled (past basic syntax checking), instead the parse, bind and compile is deferred until the statement actually executes.
Deferred compile is only for missing objects (tables, views). If the object exists then the statement is parsed, bound and optimised regardless of whether it can ever run (the parser does not execute conditional code, the optimiser does not execute conditional code) and if there's a missing column the bind phase will fail.
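A quick way to see both behaviours side by side is a throwaway script like this (DemoTable and NoSuchTable are just placeholder names for a scratch database):

```sql
-- Batch 1: NoSuchTable does not exist, so the statement only gets a basic
-- syntax check and binding is deferred. The batch compiles and runs fine.
IF 1 = 0
    SELECT * FROM dbo.NoSuchTable;   -- never executes, never binds
GO

CREATE TABLE dbo.DemoTable (ExistingCol INT);
GO

-- Batch 2: DemoTable exists, so the statement is fully parsed and bound
-- even though the IF can never be true. The missing column is caught at
-- bind time and the batch fails before anything executes.
IF 1 = 0
    SELECT MissingCol FROM dbo.DemoTable;

GO
DROP TABLE dbo.DemoTable;
```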
Gail Shaw
Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability