January 23, 2020 at 2:09 pm
I have a table in database [a] (a.dbo.a) that gets reloaded every 10 minutes with fresh data, and I have been asked to recreate that table in a production database and reload it each time a.dbo.a gets reloaded. Both databases are on the same server. What is the most efficient way to accomplish this? I'm pretty sure that a trigger is the wrong way to go (too many locks, I think, among other things). A scheduled job might be OK, but I would prefer something that executes when a.dbo.a has completed a reload. I'm just not sure how to determine that with confidence.
January 23, 2020 at 2:43 pm
Transactional replication would probably work as well as anything, so long as your table has a primary key constraint.
John
January 23, 2020 at 2:47 pm
You could create a job and then have a trigger run sp_start_job.
I would not reload the production table; I would only INSERT/UPDATE/DELETE the differences.
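A minimal T-SQL sketch of that approach might look like the following. The job name, key column, and column list are all hypothetical placeholders; adjust them to the real schema.

```sql
-- Hypothetical job name 'SyncProdCopy'. The trigger fires after inserts
-- into the staging table a.dbo.a and kicks off the sync job.
CREATE TRIGGER trg_a_reloaded ON a.dbo.a
AFTER INSERT
AS
BEGIN
    SET NOCOUNT ON;
    EXEC msdb.dbo.sp_start_job @job_name = N'SyncProdCopy';
END;
GO

-- Job step: apply only the differences instead of reloading.
-- Assumes a key column [id] and a data column [col1] (placeholders).
MERGE prod.dbo.a AS tgt
USING a.dbo.a AS src
    ON tgt.id = src.id
WHEN MATCHED AND tgt.col1 <> src.col1 THEN
    UPDATE SET tgt.col1 = src.col1
WHEN NOT MATCHED BY TARGET THEN
    INSERT (id, col1) VALUES (src.id, src.col1)
WHEN NOT MATCHED BY SOURCE THEN
    DELETE;
```

Note that an AFTER INSERT trigger fires once per statement, so a multi-statement reload would start the job multiple times; sp_start_job simply returns an error if the job is already running, which is usually harmless here.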
January 23, 2020 at 7:39 pm
I have to ask - why do you need a second copy of the table in another database on the same instance? Just reference the table from the production DB directly.
Or - better yet, create a synonym in the production DB that references the table in the other database. For example:
CREATE SYNONYM dbo.a FOR a.dbo.a;
Now - from the production DB you can do this:
SELECT ... FROM dbo.a
And you will get the data from the dbo.a table in database a.
Jeffrey Williams
“We are all faced with a series of great opportunities brilliantly disguised as impossible situations.”
― Charles R. Swindoll
How to post questions to get better answers faster
Managing Transaction Logs
January 24, 2020 at 3:22 pm
This request came in as an attempt to eliminate the cross-database queries made by a stored procedure that gathers data from 11 different tables across 3 different databases and is run continuously by end users in an application.
January 24, 2020 at 7:51 pm
This request came in as an attempt to eliminate the cross-database queries made by a stored procedure that gathers data from 11 different tables across 3 different databases and is run continuously by end users in an application.
Why would cross-database queries be a problem? What is the issue this is attempting to resolve - is it performance, security or something else?
Jeffrey Williams
January 27, 2020 at 6:39 pm
Performance.
January 27, 2020 at 7:27 pm
Copying a table from one database to another on the same server will not improve performance unless the copy has different indexes, or there are specific blocking scenarios preventing access to the table in the other database.
Jeffrey Williams