November 6, 2007 at 7:43 am
Thank you for all your suggestions; I'm still trying to digest all the options.
Jeff, can you tell me how to do what you're suggesting?
November 6, 2007 at 8:25 am
Sure, John... kind of... hope you don't mind a bit of pseudocode because I don't ever use DTS except for ad-hoc transfers...
Create two identical tables to hold the data... let's just assume they're called MyTableA and MyTableB.
Create a passthrough (sometimes called a "surrogate" or "synonym") view using SELECT * from MyTableA and call it just MyTable. Of course, before either table is loaded, this will return nothing.
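To make that concrete, here's a minimal sketch of the setup in T-SQL. The table and view names come from the post above; the column layout is purely an assumption for illustration:

```sql
-- Minimal sketch; the two-column layout is assumed for illustration only
CREATE TABLE dbo.MyTableA (JobID INT, JobName VARCHAR(100))
CREATE TABLE dbo.MyTableB (JobID INT, JobName VARCHAR(100))
GO
-- The passthrough ("surrogate") view the customer-facing code queries
CREATE VIEW dbo.MyTable AS SELECT * FROM dbo.MyTableA
GO
```

All customer-facing queries hit dbo.MyTable, never the base tables, which is what makes the later flip invisible to them.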
Ok, now the initial load...
Load MyTableB with all the data you want.
Run an ALTER script on the MyTable view that says SELECT * FROM MyTableB and you're done.
Next time you need to do the update, MyTableB is the table the customers are using via the MyTable view, so load up MyTableA with the new data.
Once MyTableA has all the data you need, simply Alter the view again, but this time with SELECT * FROM MyTableA. That alter will take less than 65 milliseconds... since the customer-facing code is using the view, no code changes on the customer code are required and it will look like the table has been instantly updated.
Obviously, the code that populates one table or the other and also alters the view is going to need some way to keep track of which table the view is pointing at... the way I do that is to check SysComments for the code in the view, search for the table name it's currently pointed at using CHARINDEX, and then choose the opposite table as the one to load and eventually change the view to.
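A sketch of that SysComments check, assuming the SQL 2000-era syscomments table (on SQL 2005 and later, OBJECT_DEFINITION() or sys.sql_modules is the cleaner route):

```sql
-- Decide which table to load by searching the view's definition text
DECLARE @LoadTable SYSNAME
IF EXISTS (SELECT 1
             FROM syscomments
            WHERE id = OBJECT_ID('dbo.MyTable')
              AND CHARINDEX('MyTableA', [text]) > 0)
    SET @LoadTable = 'MyTableB'   -- view currently reads A, so load B
ELSE
    SET @LoadTable = 'MyTableA'   -- view currently reads B, so load A
```

The name not found in the view text is, by definition, the table that's safe to truncate and reload.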
--Jeff Moden
Change is inevitable... Change for the better is not.
November 6, 2007 at 10:07 am
Jeff... OK, since I'm a novice with this, I'm going to ask that you help me out in more detail.
OK, this is what I've got:
1. DTS package scheduled: truncate table, make calls to a DLL that gets the XML stream from the vendor API. In the DLL, the code uses a stored procedure, which does the bulk insert to tableA.
Sample from the stored procedure:
INSERT INTO tableA
SELECT * FROM OPENXML(@intPointer, '/root/job', 2) ...
2. Now I've created a second tableB and transferred the data.
3. Created a view called mytable with the following:
CREATE VIEW dbo.mytable AS SELECT * FROM dbo.tableA
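For reference, a fuller hedged sketch of what that stored procedure might look like. The procedure name, parameter name (@xmlDoc), and column names (JobID, JobTitle) are assumptions, not the actual schema; only @intPointer, tableA, and the '/root/job' path come from the post:

```sql
CREATE PROCEDURE dbo.LoadJobs @xmlDoc NTEXT
AS
BEGIN
    DECLARE @intPointer INT

    -- Parse the XML stream into an internal document handle
    EXEC sp_xml_preparedocument @intPointer OUTPUT, @xmlDoc

    -- Flag 2 = element-centric mapping, matching the original call
    INSERT INTO dbo.tableA (JobID, JobTitle)
    SELECT JobID, JobTitle
      FROM OPENXML(@intPointer, '/root/job', 2)
           WITH (JobID INT, JobTitle VARCHAR(100))

    -- Always release the handle, or it leaks server memory
    EXEC sp_xml_removedocument @intPointer
END
```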
I'm not really sure how to create the "alter" script, and do I have to do something special for a "surrogate" view? Is the "alter" script done in the view or a stored procedure?
TIA
November 6, 2007 at 12:11 pm
After TableB is populated:
ALTER VIEW dbo.mytable AS SELECT * FROM dbo.tableB
Then do the data update in tableA.
After it's finished:
ALTER VIEW dbo.mytable AS SELECT * FROM dbo.tableA
_____________
Code for TallyGenerator
November 6, 2007 at 12:49 pm
This is done in the stored procedure before the bulk load, then dropped after the bulk load, correct?
Do I replicate tableA with tableB afterwards?
TIA.. very much
November 6, 2007 at 1:06 pm
Let's make sure we're on the same page.
Definition
"surrogate view": a view that is used instead of a table. In this case we're using it so that you can easily flip/flop between two tables. So - your ASPX page (or rather, the stored procedure pulling the data for the ASPX page) would be based on the surrogate view (INSTEAD of the table, thus the word "surrogate"), so you don't have to change the code on the fly.
Process:
So, assuming you were running the job at 8am, 4pm, and midnight, let's take an example:
- We start the day with the "current data" in tableA and nothing in tableB. The view is defined to pull data from tableA.
- At 8AM, your process kicks off, doing the INSERT of your data into tableB and doing whatever it needs to get tableB's data into the right shape. Once tableB is "ready", you run the ALTER VIEW to switch your queries to using tableB. Once that's done, THEN you delete the data in tableA.
- At 4PM, the process kicks off again, doing the insert into tableA and getting the data ready. Once A is ready, run the ALTER VIEW again, switching the view to pulling from A. Then kill the data in B.
...and so goes the endless cycle. You would likely need to check syscomments OR keep a separate table to figure out which table is "live" and which is empty each time....
The entire thing happens within your DTS (just different steps within there).
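The whole cycle described above can be sketched as one T-SQL script. The table and view names come from the thread; everything else (variable names, the dynamic-SQL approach) is illustrative, and the actual bulk insert is elided because it lives in the DTS/OPENXML step:

```sql
DECLARE @Live SYSNAME, @Stage SYSNAME

-- 1. Work out which table the view currently points at
IF EXISTS (SELECT 1 FROM syscomments
            WHERE id = OBJECT_ID('dbo.mytable')
              AND CHARINDEX('tableA', [text]) > 0)
    SELECT @Live = 'tableA', @Stage = 'tableB'
ELSE
    SELECT @Live = 'tableB', @Stage = 'tableA'

-- 2. Clear and load the staging table (in practice, the DTS/OPENXML step)
EXEC ('TRUNCATE TABLE dbo.' + @Stage)
-- ... bulk insert into the staging table here ...

-- 3. Flip the view: the only "down time" is this ALTER, milliseconds
EXEC ('ALTER VIEW dbo.mytable AS SELECT * FROM dbo.' + @Stage)

-- 4. The old live table can now be emptied for the next cycle
EXEC ('TRUNCATE TABLE dbo.' + @Live)
```

Dynamic SQL via EXEC() is used only because the table name is decided at run time; with a hard-coded pair of scripts (one per direction), plain ALTER VIEW statements work just as well.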
----------------------------------------------------------------------------------
Your lack of planning does not constitute an emergency on my part...unless you're my manager...or a director and above...or a really loud-spoken end-user..All right - what was my emergency again?
November 6, 2007 at 2:24 pm
:D, I got it!!! thanks for all the help!
DTS package--> drop view mytable-->create view mytable (tbB) -->truncate tableA -->bulk import
-->drop view mytable -->create view mytable(tbA) --> truncate tableB --> transferdata from A to B
November 6, 2007 at 7:58 pm
No, no... no transfer of data from TableA to TableB... it'll take too long. Go back and look at what we wrote...
--Jeff Moden
Change is inevitable... Change for the better is not.
November 6, 2007 at 8:17 pm
Jeff Moden (11/6/2007)
No, no... no transfer of data from TableA to TableB... it'll take too long. Go back and look at what we wrote...
I understood what you wrote earlier; I was thinking it would be easier just to use tableA all the time and use tableB only while tableA is updating.
So my web page can't run queries against tableA while it's copying to tableB?
November 6, 2007 at 8:25 pm
Sure, it's easier... but that's not the name of the game... the name of the game is nearly-continuous availability of the data... depending on the size of the table, you may have several seconds of "down time" during the copy using the method you posted... not good for a Web site that users are hitting. Go look at Matt's last "scenario" that he posted... that's what you want... the total down time of flipping a surrogate view like that is measured in milliseconds... not seconds.
--Jeff Moden
Change is inevitable... Change for the better is not.