November 10, 2022 at 11:14 am
Hi all
Background.....
We are moving to schema switching to reduce downtime on our reporting database. We are also using multiple schemas to separate the various datamarts.
Now, to try and be clever, I thought I'd create a trigger to build the schemas we are going to need.
As an example, if I create a Test schema, I want it to create the following:-
Test_etl - temporary storage for loading data
Test_dummy - empty schema for use in the switching
Test_shadow - for loading new data
Test - this will be the schema that everyone "sees" for reporting purposes.
I've built the trigger (it's ugly but it should work) but I'm getting a strange error message.
I've attached the trigger code.
The error I'm getting doesn't make any sense.
The error states:-
2759 CREATE SCHEMA failed due to previous errors.
(That's the error code and message)
As far as I can see there are no previous errors.
I've also added in sections for sorting out who can see what (users should only be able to see the main schema) and a section for removing schemas if one is dropped (unlikely, but added it for housekeeping).
Can someone have a look and let me know where I've gone wrong please?
TIA
Richard
::edit::
Just noticed I can't upload it as a ".sql" file so I renamed it to ".txt"
November 10, 2022 at 4:29 pm
It looks like you are attempting to create the new schema after it has already been created.
I modified the trigger to just create the @SchemaName schema and it failed because the schema already existed.
When I changed it again to just create the @SchemaName_etl schema it worked. I did have to SET RECURSIVE_TRIGGERS OFF to prevent trigger recursion and eventual rollback.
It may work if you remove the @SchemaName schema create. I think I would create a stored procedure to do this instead of a trigger. That requires someone to use the proc to create a new schema, but I assume you don't have users creating schemas very often.
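A rough sketch of what that stored procedure might look like (the proc name and dbo authorization are my assumptions, not from Richard's attachment). The SCHEMA_ID guard avoids the "already exists" failure I hit, and the nested EXEC is needed because CREATE SCHEMA must be the only statement in its batch:

```sql
CREATE OR ALTER PROCEDURE dbo.usp_CreateReportingSchemas
    @SchemaName sysname
AS
BEGIN
    SET NOCOUNT ON;

    -- The four companion schemas: base, _etl, _shadow, _dummy.
    DECLARE @suffixes TABLE (suffix nvarchar(10) NOT NULL);
    INSERT INTO @suffixes (suffix)
    VALUES (N''), (N'_etl'), (N'_shadow'), (N'_dummy');

    DECLARE @name sysname, @sql nvarchar(max);

    DECLARE c CURSOR LOCAL FAST_FORWARD FOR
        SELECT @SchemaName + suffix FROM @suffixes;
    OPEN c;
    FETCH NEXT FROM c INTO @name;
    WHILE @@FETCH_STATUS = 0
    BEGIN
        -- Skip schemas that already exist instead of erroring.
        IF SCHEMA_ID(@name) IS NULL
        BEGIN
            -- CREATE SCHEMA must be alone in its batch, hence EXEC.
            SET @sql = N'CREATE SCHEMA ' + QUOTENAME(@name)
                     + N' AUTHORIZATION [dbo]';
            EXEC (@sql);
        END
        FETCH NEXT FROM c INTO @name;
    END
    CLOSE c; DEALLOCATE c;
END
```

Called as `EXEC dbo.usp_CreateReportingSchemas @SchemaName = N'Test';` it would create Test, Test_etl, Test_shadow and Test_dummy in one go, with no trigger recursion to worry about.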
November 11, 2022 at 5:56 am
Hi @Ed - I did think about that, which is why I put in the TRY/CATCH blocks. No matter what I put in for @SchemaName, the CREATE fails (so at least the CATCH block is working).
I'll try switching off recursive triggers (thought I'd already done that but I'll check) and see what happens.
Using the stored procedure is a good idea; I can use the trigger to call the stored procedure with a parameter, so we get the best of both worlds.
We'll have a lot of schemas being created to start with but that will naturally slow down once a lot of the datamarts are in place.
Regards
Richard
November 11, 2022 at 3:41 pm
I think the trigger fails on error despite the presence of the TRY/CATCH block. I believe that with XACT_ABORT ON (the default inside a trigger), an error caught by TRY/CATCH leaves the transaction in an uncommittable state, so the entire transaction is rolled back.
I added SET XACT_ABORT OFF to the start of the trigger and SET XACT_ABORT ON to the end and it worked. I was working with a modified version that just created the schemas and did nothing else, so I don't know if the rest works.
The modified trigger also works if you comment out the exec (@sql) for the main schema. This prevents the first catch from firing. There is no reason to try to create the main schema as the trigger fires after create.
With XACT_ABORT OFF, this is the output of the schema-only trigger. (I added the 'CATCH BLOCK FIRED' message.)
New schema (NEW3) detected on 2022-11-11 at 08:30:39.6070000
Attempting to create main schema
CREATE SCHEMA [NEW3] AUTHORIZATION [dbo]
2759 CREATE SCHEMA failed due to previous errors.
Creation failed AND CATCH BLOCK FIRED
Attempting to create etl schema
Attempting to create shadow schema
Attempting to create dummy schema
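For reference, a skeleton of the modified trigger I was testing with (trigger name assumed; I'm reading the schema name from the EVENTDATA XML, which per the DDL event schema carries it for CREATE_SCHEMA events). The key points are the XACT_ABORT OFF at the top and no attempt to create the main schema, since the trigger fires after it already exists:

```sql
CREATE OR ALTER TRIGGER trg_CreateCompanionSchemas
ON DATABASE
AFTER CREATE_SCHEMA
AS
BEGIN
    -- Without this, an error caught by TRY/CATCH dooms the
    -- transaction and everything rolls back.
    SET XACT_ABORT OFF;

    DECLARE @data xml = EVENTDATA();
    DECLARE @SchemaName sysname =
        @data.value('(/EVENT_INSTANCE/SchemaName)[1]', 'sysname');

    -- Don't react to the companion schemas we create ourselves,
    -- otherwise the trigger fires on its own CREATEs.
    IF @SchemaName LIKE N'%[_]etl'
       OR @SchemaName LIKE N'%[_]shadow'
       OR @SchemaName LIKE N'%[_]dummy'
        RETURN;

    DECLARE @sql nvarchar(max);

    -- No CREATE for @SchemaName itself: it already exists,
    -- which is what raised error 2759 in the first version.
    SET @sql = N'CREATE SCHEMA ' + QUOTENAME(@SchemaName + N'_etl')
             + N' AUTHORIZATION [dbo]';
    EXEC (@sql);
    SET @sql = N'CREATE SCHEMA ' + QUOTENAME(@SchemaName + N'_shadow')
             + N' AUTHORIZATION [dbo]';
    EXEC (@sql);
    SET @sql = N'CREATE SCHEMA ' + QUOTENAME(@SchemaName + N'_dummy')
             + N' AUTHORIZATION [dbo]';
    EXEC (@sql);

    SET XACT_ABORT ON;
END
```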
November 12, 2022 at 1:43 am
We have a reporting database. It has 2 schemas... Build1 and Build2. We don't mess around with triggers or the schemas. We created 2 synonyms for each table... one for reporting and one for repopulating the table. We just rebuild the synonyms to swap between the two schemas. One day, Build1 will be pointed at for reporting and Build2 will be pointed at for repopulating. Once a repopulation is complete, we just repoint (rebuild) the synonyms at the other schema, so that Build1 becomes the repopulating schema and Build2 becomes the one for reporting.
--Jeff Moden
Change is inevitable... Change for the better is not.
November 17, 2022 at 11:20 am
Thanks all
@jeff - That would be nice but we don't have a reporting database as such. Users can connect directly to the main database through all sorts of mechanisms (SSRS, PowerBI, etc) and there's not a lot we can do about that.
Anyway.....
I've got the schema creation (and switching) working but now I've come across another snag (probably because I'm an idiot!).
When I run the sproc for the switch, I've noticed the foreign keys in the Fact table refer to the Dim in the other schema.
For example, I've got a foreign key that refers to a dim (with referential integrity).
Pre-switch:-
USE [DataWarehouseLive_RG]
GO
ALTER TABLE [theatres].[Fact_Patient_Cases] WITH CHECK ADD CONSTRAINT [FK_Fact_Theatres_Patient_Cases_Dim_Anaesthetic_Staff_Group] FOREIGN KEY([AssistingAnaesthetistGroupKey])
REFERENCES [theatres].[Dim_Theatres_Staff_Group] ([StaffGroupKey])
GO
ALTER TABLE [theatres].[Fact_Patient_Cases] CHECK CONSTRAINT [FK_Fact_Theatres_Patient_Cases_Dim_Anaesthetic_Staff_Group]
GO
Post-switch:-
USE [DataWarehouseLive_RG]
GO
ALTER TABLE [theatres].[Fact_Patient_Cases] WITH CHECK ADD CONSTRAINT [FK_Fact_Theatres_Patient_Cases_Dim_Anaesthetic_Staff_Group] FOREIGN KEY([AssistingAnaesthetistGroupKey])
REFERENCES [theatres_shadow].[Dim_Theatres_Staff_Group] ([StaffGroupKey])
GO
ALTER TABLE [theatres].[Fact_Patient_Cases] CHECK CONSTRAINT [FK_Fact_Theatres_Patient_Cases_Dim_Anaesthetic_Staff_Group]
GO
I had assumed (wrongly) that the foreign keys would update to the new schema as well.
I've tried transferring the FACT table first followed by the DIMs and vice versa but that hasn't helped.
It looks like I'll need to add in some code to drop/recreate the foreign keys unless anyone has any other ideas?
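If it does come to dropping and recreating them, something along these lines might do for generating the statements from the catalog views (a sketch only: it assumes single-column foreign keys, like the example above, and hard-codes the `theatres`/`theatres_shadow` schema names):

```sql
-- Generate DROP + re-CREATE statements for FKs that still point
-- at the shadow schema after the switch.
SELECT
    N'ALTER TABLE ' + QUOTENAME(SCHEMA_NAME(fk.schema_id)) + N'.'
        + QUOTENAME(OBJECT_NAME(fk.parent_object_id))
        + N' DROP CONSTRAINT ' + QUOTENAME(fk.name) + N';' AS drop_stmt,
    N'ALTER TABLE ' + QUOTENAME(SCHEMA_NAME(fk.schema_id)) + N'.'
        + QUOTENAME(OBJECT_NAME(fk.parent_object_id))
        + N' WITH CHECK ADD CONSTRAINT ' + QUOTENAME(fk.name)
        + N' FOREIGN KEY (' + QUOTENAME(pc.name) + N')'
        + N' REFERENCES [theatres].'
        + QUOTENAME(OBJECT_NAME(fk.referenced_object_id))
        + N' (' + QUOTENAME(rc.name) + N');' AS create_stmt
FROM sys.foreign_keys fk
JOIN sys.foreign_key_columns fkc
    ON fkc.constraint_object_id = fk.object_id
JOIN sys.columns pc
    ON pc.object_id = fkc.parent_object_id
   AND pc.column_id = fkc.parent_column_id
JOIN sys.columns rc
    ON rc.object_id = fkc.referenced_object_id
   AND rc.column_id = fkc.referenced_column_id
JOIN sys.objects ro
    ON ro.object_id = fk.referenced_object_id
WHERE SCHEMA_NAME(ro.schema_id) = N'theatres_shadow';
```

The output could then be executed as part of the switch sproc, after the tables have been transferred.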
Regards
Richard