Change Data Capture.

  • Hello All,

    This is a new subject for me. I am at the moment reading up on Change Data Capture.

    What we want is to get all the data from a database (around 100 gigabytes) to another 'location', not in real time, but with a minimum of delay. We are hoping to use the Change Data Capture capability of SQL Server to keep the other 'location' up to date with a slight delay.
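
    For my own reference, here is a minimal sketch of what enabling CDC would look like (the database and table names are just placeholders):

        USE MyDatabase;   -- placeholder database name
        GO

        -- Enable CDC at the database level (requires sysadmin)
        EXEC sys.sp_cdc_enable_db;
        GO

        -- Enable CDC on a single table (requires db_owner);
        -- net-changes queries need a primary key on the table
        EXEC sys.sp_cdc_enable_table
            @source_schema        = N'dbo',
            @source_name          = N'Orders',   -- placeholder table name
            @role_name            = NULL,        -- no gating role
            @supports_net_changes = 1;
        GO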

    The change rate of the database is fairly limited. A rough estimate would be around 100 MB each day from 'normal' transactions (as said, this is a rough estimate). Batches might generate a bit more data, but batches are the exception, not the rule.

    As said, at the moment I am reading up on this.
    Issues:
    -- All tables (around 1,000): is this realistic/feasible? (See the rough script below.)
    -- Latency of the new location: 10 minutes.
    -- Versions of SQL Server (Standard/Enterprise, 2008/2012/2014).
    -- Loads of other issues.
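
    On the first point, a rough sketch of how enabling CDC across all user tables might be scripted (this assumes CDC is already enabled on the database, and falls back to no net-changes support for tables without a primary key):

        -- Enable CDC on every user table not yet tracked
        DECLARE @schema sysname, @table sysname, @has_pk bit;

        DECLARE table_cur CURSOR FAST_FORWARD FOR
            SELECT s.name, t.name,
                   CASE WHEN EXISTS (SELECT 1 FROM sys.indexes i
                                     WHERE i.object_id = t.object_id
                                       AND i.is_primary_key = 1)
                        THEN 1 ELSE 0 END
            FROM sys.tables t
            JOIN sys.schemas s ON s.schema_id = t.schema_id
            WHERE t.is_ms_shipped = 0
              AND t.is_tracked_by_cdc = 0;

        OPEN table_cur;
        FETCH NEXT FROM table_cur INTO @schema, @table, @has_pk;
        WHILE @@FETCH_STATUS = 0
        BEGIN
            EXEC sys.sp_cdc_enable_table
                @source_schema        = @schema,
                @source_name          = @table,
                @role_name            = NULL,
                @supports_net_changes = @has_pk;  -- only where a primary key exists
            FETCH NEXT FROM table_cur INTO @schema, @table, @has_pk;
        END
        CLOSE table_cur;
        DEALLOCATE table_cur;

    Whether 1,000 capture instances is sensible to manage is exactly the kind of thing I am trying to find out.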

    Yes, I do realise this is rather an open question; at the moment I am just coming to terms with the task. Any comments will be welcome. How huge is this task? How realistic is this scenario? Are we on the right track (or totally off)?

    Thanks for your time and attention,
    Ben

  • I'm not sure that CDC is really the best way to attack this.  

    What's the purpose of the second server? Is it going to be a DR failover, a reporting server or something else?

    How up to date does the database need to be?

    You might find that replication or log shipping is an easier way to do what you're after.

    https://sqlundercover.wordpress.com/

  • david.fowler 42596 - Thursday, August 10, 2017 8:37 AM

    I'm not sure that CDC is really the best way to attack this.  

    What's the purpose of the second server? Is it going to be a DR failover, a reporting server or something else?

    How up to date does the database need to be?

    You might find that replication or log shipping is an easier way to do what you're after.

    In the end the data goes to a BI/reporting environment.
    If only changed data is captured, this would (could) be around 100 MB a day for normal transactional processing.
    That is a volume we can handle without any problem, so we can control/extract/transform/load it easily.

    So we prefer, and will probably rely on, getting our hands on only the changed data.

    With log shipping or replication there is the complete database: if that has to be handled, it is around 100 GB every 10 minutes, which is probably far too much to process in that timespan. So we rely on some sort of Change Data Capture.
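
    The sort of 10-minute incremental pull I have in mind would, I think, look something like this (the capture instance name 'dbo_Orders' is a placeholder, and the start of the window would normally come from a control table recording where the previous load stopped):

        DECLARE @from_lsn binary(10), @to_lsn binary(10);

        -- Start of the window: on the very first run, the minimum LSN of the
        -- capture instance; on later runs, sys.fn_cdc_increment_lsn() applied
        -- to the high-water mark saved by the previous load.
        SET @from_lsn = sys.fn_cdc_get_min_lsn('dbo_Orders');
        -- End of the window: the highest LSN currently captured.
        SET @to_lsn   = sys.fn_cdc_get_max_lsn();

        -- Net changes per key over the window, ready for the BI load
        SELECT *
        FROM cdc.fn_cdc_get_net_changes_dbo_Orders(@from_lsn, @to_lsn, N'all');

        -- After a successful load, persist @to_lsn as the new high-water mark.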

    Suggestions are of course welcome.

    Ben
