Anyone had any problems with Articles selection?

  • We have a lot of tables to replicate. Every time I tried to set up the article tables, it would say it needed to redo the snapshot (which is exactly what I wanted), but the GUI would just time out. It took me almost four attempts.

    Anyone know of a way to do this without the GUI?

  • 1. create a pub with at least one article

    2. generate replication script (drop+create)

    3. edit script to have many articles

    4. execute script

    although the initial script looks pretty ugly, you can pretty it up (see the sketch below)
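    For illustration only, the edited script essentially boils down to one sp_addpublication call plus one sp_addarticle call per table. The publication and table names here are invented, and the wizard-generated script carries many more parameters:

    -- shape of the edited script (names illustrative)
    exec sp_addpublication
        @publication = N'MyPub',
        @repl_freq   = N'continuous',   -- transactional
        @status      = N'active'

    exec sp_addarticle @publication = N'MyPub', @article = N'Department',
        @source_owner = N'HumanResources', @source_object = N'Department'

    exec sp_addarticle @publication = N'MyPub', @article = N'Employee',
        @source_owner = N'HumanResources', @source_object = N'Employee'

    -- ...repeat for each remaining table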

    for pubs with lots of articles (or special needs like Oracle-to-SQL in EE) I keep a control table

    - creating the pub then simply involves a cursor over this table (call sp_addarticle per iteration)

    however you should NOT be getting timeouts; that suggests some high degree of contention/blocking

    - suggest you run sp_who2 and/or Activity Monitor during such attempts
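    If sp_who2 output is too noisy, one alternative the thread doesn't mention is to query the blocking DMV directly on SQL 2005+; a generic sketch, not specific to replication:

    -- show only sessions that are currently blocked, and by whom
    select session_id, blocking_session_id, wait_type, wait_time
    from sys.dm_exec_requests
    where blocking_session_id <> 0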

    HTH

    Dick

  • 1. create a pub with at least one article

    2. generate replication script (drop+create)

    3. edit script to have many articles

    4. execute script

    So the script you mention is the one it puts in the files?

    In here: MSSQL\repldata\unc, i.e. the .sch files.

    You mean go through each one of these, then edit it for each table and run each one manually?

    Timeouts: it's almost like the GUI just stops responding.

  • Rather than creating the publisher/distributor/subscriber straight away, you can tell the GUI to script out the procedure instead (so it's just a something.sql file) and save this file wherever you like.

    So I think what dbaker is saying is to do this, but with only one article selected. You can then edit the .sql file, copy and paste the bit that references the one article you selected, and change the article name to cover the other tables/articles you want to include in the replication model.

    How many articles are you trying to include?

    Are they only tables or are you trying to include other things (such as Views) as well?

  • About 1500 tables

  • Replication is but one tool in the armoury, and there is a danger of those with hammers thinking all problems are nails!

    The business-led analysis should identify what data is REALLY needed on various downstream servers, and whether this should be all rows+columns or some partition thereof (Horizontal or Vertical).

    A technical analysis of volatility should show the "churn" and put it into perspective. It would be crazy to ship a 10M-row table that changes 10% daily if the downstream consumer only makes one query of 10 rows at month-end.
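    One rough way to gauge that churn, as an assumption-laden sketch: rowmodctr in the sys.sysindexes compatibility view counts modifications since the last statistics update, so treat it as an approximate signal only:

    -- approximate churn per table: total rows vs. modifications since last stats update
    select object_name(i.id) as tblName, i.rows, i.rowmodctr
    from sys.sysindexes i
    where i.indid in (0, 1)      -- heap or clustered index row only
    order by i.rowmodctr desc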

    Ditto, the needs of one subscriber probably differ from another's, so a one-size-fits-all approach should ring bells [especially for WANs where bandwidth is scant].

    so I am doubtful that 1500 tables (i.e. "articles") are really justified.

    but if they are, then you can have a control table of table-articles

    create table #t_arts (schName sysname, tblName sysname)

    insert into #t_arts
    select 'HumanResources','Department'
    UNION select 'HumanResources','Employee'
    UNION select 'Production','Product'
    UNION select 'Production','ProductInventory'
    -- etc

    and use a forward-only cursor to iterate through it, calling

    exec sp_addarticle @publication = @pubName, @article = @myart -- etc

    per iteration; a fuller sketch follows below.
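    Spelled out, that loop might look like this (a sketch only: @pubName is a made-up publication name, and sp_addarticle accepts many more optional parameters):

    declare @pubName sysname, @sch sysname, @tbl sysname
    set @pubName = N'MyPub'   -- hypothetical publication name

    declare art_cur cursor fast_forward for
        select schName, tblName from #t_arts

    open art_cur
    fetch next from art_cur into @sch, @tbl
    while @@fetch_status = 0
    begin
        exec sp_addarticle
            @publication   = @pubName,
            @article       = @tbl,
            @source_owner  = @sch,
            @source_object = @tbl
        fetch next from art_cur into @sch, @tbl
    end
    close art_cur
    deallocate art_cur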

    horses for courses!

    Dick

  • Never really thought about that. Hmmm, now I'm thinking over here: why replicate all 1500? Some of these are static tables that never change. Perhaps I should rethink this so that only the major tables that change drastically are replicated.

    The rest can just be restored from a backup and stored alongside the replicated data.

    Hmmmm, good advice. Now to convince the BI Group.

  • TRACEY (9/16/2008)


    Never really thought about that. Hmmm, now I'm thinking over here: why replicate all 1500? Some of these are static tables that never change. Perhaps I should rethink this so that only the major tables that change drastically are replicated.

    The rest can just be restored from a backup and stored alongside the replicated data.

    Hmmmm, good advice. Now to convince the BI Group.

    I've never had a practical reason to replicate over 255 articles, but out of curiosity I have tried, and I always seem to have issues creating the snapshot with any more than 255 articles. Consequently, if I ever needed to replicate more than 255 articles I would set up multiple publications.

    I have to agree that 1500 articles sounds like a lot to replicate, though.

    Sounds like you should really push to convince your BI Group that this is not the way to go!

    If this is a live replication model (and bearing in mind I have no idea whether you have 1 subscriber or 300, or otherwise) then any big changes you make could result in having to create a new snapshot.

    If static tables are already included in the replication model, then the subscribers should have these tables already and there might not be any reason to restore a backup.

    You can remove the unnecessary articles from the replication model, which will generate a new snapshot, but at least this snapshot will only contain the "changes" to the replication model, and won't involve creating a full snapshot of the entire DB.

    Worth testing before doing anything to your live system, however (i.e. don't just take my word for it! :D)

  • Hmm, thinking over here.

    (That's a good idea: set up multiple publications to the same subscriber database.)

    That would prevent the timeout.

    I was thinking that for the static tables I could set up a brand new publication but use SNAPSHOT replication only.

    Then if any static tables change, I can just reissue the snapshot (see the sketch below).

    Keep transactional replication on for the dynamic tables...
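    Reissuing the snapshot can also be scripted rather than clicked; a one-line sketch, with a made-up publication name, using sp_startpublication_snapshot to start the publication's Snapshot Agent job:

    exec sp_startpublication_snapshot @publication = N'StaticPub'  -- hypothetical pub name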

    MAINDB (multiple publications):

        static   ---- REPLICATIONDB (snapshot)
        dynamic1 ---- REPLICATIONDB (transactional)
        dynamic2 ---- REPLICATIONDB (transactional)

    Use a second dynamic publication (dynamic2) if there are more than x tables.

    (Now the only hard thing for me: if I make a change to MAINDB, like dropping tables, you cannot do this without deleting the subscriptions, so I have to rebuild all of the above each time. That means going through my 1500 articles, ticking accordingly. In SQL 2005 you can make a column change, but not a drop table.) I never worked out how to remove the replication flags from the database without deleting the subscriptions each time, which is a pain.

    What do you think about the snapshot publication for the static tables? That seems good to me.

    If you remove the table from the replication model, which can be done via Properties > Articles of the publication, you should then be able to drop the table.

    I've removed articles from replication using the GUI before, but there's probably a stored procedure or similar that does the same thing.

    The GUI will force a new snapshot to be created, but this basically only contains the changes to the snapshot (i.e. it just tells the subscribers that this article is no longer part of the replication) rather than being a complete snapshot of every article in the replication model.

    Again, I'd test it first!
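    The stored-procedure route does exist. A hedged sketch with made-up publication/article names, run in the publication database at the Publisher: drop the article's subscriptions with sp_dropsubscription first, then remove the article with sp_droparticle:

    -- names here are illustrative
    exec sp_dropsubscription
        @publication = N'MyPub',
        @article     = N'StaticTable1',
        @subscriber  = N'all'          -- remove the article's subscriptions first

    exec sp_droparticle
        @publication = N'MyPub',
        @article     = N'StaticTable1',
        @force_invalidate_snapshot = 1 -- removing an article invalidates the current snapshot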

  • you can have several pubs to several subs (e.g. N:M) but no overlaps (i.e. the same table T1 must not flow to the same subscriber S from multiple publications P)

    you might want to split a big (e.g. 1500-table) pub into several so that multiple Distribution Agents (DAs) can run simultaneously to help the initial outload, but my experience is that the network (especially a WAN) is the limiting factor, so you would see no gains (and would have multiple distrib.exe processes on the Distributor, so extra overhead).

    there is another problem if you are careless enough to have overlapping table-article dependencies, e.g. Customers on P1, Orders on P2, OrderDetails on P3, since an "INSERT Customers; INSERT Orders; INSERT OrderDetails" may not get replayed in the same order, thus breaking DRI (bad), although DAs are smart enough to attempt retries (hooray). But it is bad practice and would generate an alert (assuming you knew about these and remembered to enable them!).

    if you do decide to segment your 1500 articles into several pubs for admin convenience, you can elect to set

    sp_addpublication @publication = 'P1', @independent_agent = 'false'

    so P1..Pn could actually share the same DA (less overhead) even if they are a mix of snapshot/transactional. NB the default is for each pub to be independent (unwise).
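    As a sketch of that split, with made-up publication names PS1/PT1 and only the relevant parameters shown:

    -- one snapshot pub and one transactional pub sharing a Distribution Agent
    exec sp_addpublication @publication = N'PS1',
        @repl_freq = N'snapshot',   @independent_agent = N'false'

    exec sp_addpublication @publication = N'PT1',
        @repl_freq = N'continuous', @independent_agent = N'false'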

    apart from the initial peak of outloading the whole snapshot, there would be no subsequent overhead in having articles for static tables (i.e. if they don't change, the DA has nothing to do for them), so you could leave them within a single homogeneous transactional pub. However, I would get scared by seeing a 1500-table pub, so I would prefer to split into Snapshot and Transactional (e.g. PS1, PT1).

    You will gather that having multiple transactional pubs is harder to justify (see the dependency overlap above, which can give rise to obscure race conditions), so I would advise against having PS1+PT1+PT2 (but I suppose PS1+PSn+PT1 would be OK).

    As I said at the outset, you should examine your dataflows and business needs before you start making a Frankenstein (in PROD anyway)!

    As ever(!), once you have used the SSMS GUI to create the pub(s), you are strongly advised to get it to generate scripts that you can fine-tune for the final environments (e.g. DEV, UAT, PROD+DR).

    Dick

    remember the definition of "Pioneer": they're the ones with the arrows in their backs. Been there, done that [sometimes self-inflicted wounds due to sloppy UI work, hence pushing folk to use deterministic scripts]!

  • Thanks for your advice greatly appreciated.

    Yes, after reviewing the 1500 tables: there are 800 empty tables and 380 or so static ones, which gets me down to around 320. Of these, I know about another 100 that are not reported off, so that goes back to the BI Group: which ones do you need?

    Awesome: I get these down to 220 tables... and a snapshot publication for the static tables, even better.
