SQL 2008 database replication

  • Dear all,

we have an installation with SQL 2008 Express, without an Internet connection. What we need is to replicate the database on a regular schedule. We use the Express edition and we don't have an Internet connection to the server.

One way to do it is to ask users to send us the SQL log files every day via email. The problem is that these files are generally big, and we may run into email size restrictions.

Are there any other ideas for how to replicate the DB, given that we use SQL Express?

    Ionas

  • Sending transaction logs via email is not going to work. Forget about that.

What I can suggest is setting up a merge publication on the server (I assume the data you want to replicate is on a server running an edition other than Express). If you don't have a direct connection to the publisher, use web synchronization to synchronize to the SQL Express subscriber.
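A rough sketch of that publication setup, assuming a hypothetical publication name `VesselPub` and publication database `VoyageDb`; `sp_addmergepublication` and `sp_addmergearticle` are the real system procedures, but verify the exact parameters against your build before relying on this:

```sql
-- On the publisher (non-Express edition): create a merge publication that
-- allows web synchronization. Names and the URL are assumptions.
USE VoyageDb;
EXEC sp_addmergepublication
    @publication = N'VesselPub',
    @retention = 14,                        -- days to keep change metadata
    @allow_web_synchronization = N'true',
    @web_synchronization_url = N'https://yourserver/WebSync/replisapi.dll';

-- Add each table you want replicated as an article (one shown here).
EXEC sp_addmergearticle
    @publication = N'VesselPub',
    @article = N'Readings',
    @source_object = N'Readings';
```

The SQL Express subscriber would then pull changes over HTTPS through the web synchronization endpoint instead of a direct connection.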

    Would that work for you?

    -- Gianluca Sartori

What we need is to replicate the database somewhere outside of the customer's DB and server. The customer DB is on vessels, so we do not have an Internet connection, only email with some limitations!

  • Then you simply cannot do that.

    -- Gianluca Sartori

One idea we have is to export the data to a script file and then send the file via email to build the remote DB. What do you think?

You could always download to a thumb drive, then get it to someone with Wi-Fi on a laptop. (Okay, security would have heart failure.)

  • Some problems with that approach:

A script file would be much bigger than the data in the database. How are you planning to send that via email?

How are you dealing with updates, conflicts, constraints, and deletes?

    I used to work for a company which reinvented merge replication (many years ago) and I can tell you that it never worked as expected. Replication is a complex problem, there's no way you can handle that with a couple of scripts.

    -- Gianluca Sartori

  • What I plan to do is:

1. We install the DB at the customer site with some initial data (DB 1)

2. We install the same DB (as above) at a remote location (DB 2)

3. On a daily basis, or every 3-4 days, we create script data files from DB 1 and send them to the remote location

4. We apply the script files to DB 2

    What do you think ?

  • Might work as long as:

1) You wipe your local database and import all the data from the remote database, without trying to synchronize only the changes

    2) The scripts are small enough to fit in an email attachment

Let me ask: why do you think that scripts will be better than a database backup? If you have access to email, you must have access to the Internet. Place a backup in a shared location like Dropbox and grab it from there. Why cause yourself trouble with fancy synchronization magic?

    -- Gianluca Sartori

  • Good question...

Because the DB is installed on an 'isolated machine', without any access to the outside world. So we take scripts from one machine and copy them to another machine that has email capabilities.

A DB backup might be a big file...

  • I infer from your description of the problem that you need a copy of the client's data at your site, without the need to synchronize changes.

    In this case what I would do is:

1) Take a backup of the database on the isolated machine, using compression

    2) Copy the backup to a thumb drive and bring it to a connected machine at the client's site

    3) Upload the backup to a shared drive on DropBox, GoogleDrive, OneDrive, <insert your file sharing provider>

    4) Download at your site

    5) Restore the database
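Steps 1 and 5 can be sketched as below, assuming a hypothetical database name `VoyageDb` and local paths. One caveat: native backup compression is not available in Express editions (it was introduced in SQL Server 2008 Enterprise and extended to Standard in 2008 R2), so on Express you would compress the `.bak` file with an external tool such as zip instead:

```sql
-- On the isolated machine: take a full backup.
BACKUP DATABASE VoyageDb
    TO DISK = N'D:\Backups\VoyageDb.bak'
    WITH INIT;  -- add COMPRESSION here on editions that support it

-- At your site: restore the copied backup.
RESTORE DATABASE VoyageDb
    FROM DISK = N'D:\Backups\VoyageDb.bak'
    WITH REPLACE;
```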

You're assuming that the scripts will be smaller than a backup, but I'm having a hard time believing that.

The scripts would be smaller only if you're exporting a subset of the data, which you can do more reliably with BCP.
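For example, a subset export with bcp (the real SQL Server bulk copy utility) might look like the sketch below; the database, table, column, and file names are all assumptions:

```shell
# On the isolated machine: export only recent rows in native format.
bcp "SELECT * FROM VoyageDb.dbo.Readings WHERE ReadingDate >= '20140101'" queryout readings.dat -S .\SQLEXPRESS -T -n

# At the receiving end: load the file into the remote copy of the table.
bcp VoyageDb.dbo.Readings in readings.dat -S RemoteServer -T -n
```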

    Just my 2 cents, hope this helps

    -- Gianluca Sartori

  • I am a bit late to this conversation, but here goes...

First, what are the business requirements? How much data will change, and how often will it change?

If you are capturing data from a merchant vessel then you already know a lot of the problems you will have; the trick is to reconcile the limitations with the requirements.

Do you only need to capture status after a vessel has left port, or do you need to capture data continuously throughout the voyage? Or can you send all changes when the vessel is next in port? Will there be peak times of data change followed by periods of low change, or will data change at a steady rate all the time? When you know the pattern of change and the amount of data likely to be generated, you can design your transport infrastructure.

You will also need to work out how changes you send can be acknowledged by the recipient. If there is no acknowledgement, how do you plan to identify missed data?

    Do not over-engineer your system. If all data can be sent when the vessel is in port, do not design a system to send data during the voyage - costs and complexity will be significantly different.

If you need to send data throughout the voyage, then you need a store-and-forward method of dealing with data transmission. You have suggested email, which does store-and-forward, and although this has many limitations it may form part of the solution. Other parts might be to use Service Broker to cope with the storage, and Microsoft Replication Objects to package the data, then email to transmit. You will obviously need dedicated email addresses, preferably a unique address for each vessel for sending and another for receipt. You will need to include a checksum process, and probably also encryption, so that you can be certain that a complete message has been received.

    This process would allow you to create Service Broker messages to capture data changes. When you have a transmission window, then read the Service Broker messages, package them with Replication Objects, and output to an email.
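The Service Broker plumbing for this might look like the minimal sketch below; all object names are assumptions, and the real design would need activation or an agent job to drain the queue:

```sql
-- A message type, contract, queue, and service to accumulate data changes
-- on the vessel until a transmission window opens.
CREATE MESSAGE TYPE VesselDataChange VALIDATION = WELL_FORMED_XML;
CREATE CONTRACT VesselDataContract (VesselDataChange SENT BY INITIATOR);
CREATE QUEUE VesselOutboundQueue;
CREATE SERVICE VesselOutboundService
    ON QUEUE VesselOutboundQueue (VesselDataContract);
-- Data-change procedures would SEND messages to this service; during a
-- transmission window a job RECEIVEs them, packages them, and hands the
-- package to the email step.
```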

You should also store the contents of each email sent until you get an acknowledgement. Give each email a unique Id (GUID), and when the recipient acknowledges receipt of a complete copy of an email (that is, after it has verified the checksum), that email Id can be archived or deleted from the database on the vessel. Each email should also contain the GUID of the preceding email plus the date and time it was sent. If the recipient discovers a missing email, it should send a message stating the last complete email Id, and the vessel can then retransmit all missing emails.
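The checksum part of this can be sketched with HASHBYTES, a real SQL Server function (SQL Server 2008 supports the 'SHA1' algorithm); the variable names are illustrative only:

```sql
-- On the vessel: compute a checksum over the outgoing message body.
DECLARE @EmailId uniqueidentifier = NEWID();
DECLARE @Body nvarchar(max) = N'...packaged data changes...';
DECLARE @Checksum varbinary(20) = HASHBYTES('SHA1', @Body);
-- Send @EmailId, @Body and @Checksum together. The recipient recomputes
-- HASHBYTES('SHA1', body) over what it received and acknowledges @EmailId
-- only when the two checksum values match.
```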

    This approach means you will need custom code to package your data, and custom code at the receiving end. I suggest that all data received is simply written to a database, and any required data merging, etc, done as a downstream process.

    Don't forget this idea is free, so there may be a number of holes in it. However, it does outline a method of sending data to a central location and being able to acknowledge receipt of data back to a vessel. It can cope with satellite transmission, and the gaps you will have with the signal. It will get the data to the recipient whenever you do have a signal and hold data on the vessel when there is no signal.

    You should also look at what is on the market. Other industries have similar issues - mineral exploration often relies on satellite to send data home - so you may find a framework on the market to deal with data storage and transmission via satellite, allowing you to plug in the data you need.

    Best of luck with your system, you are working in a tough environment.

Original author: https://github.com/SQL-FineBuild/Common/wiki/ (1-click install and best practice configuration of SQL Server 2019, 2017, 2016, 2014, 2012, 2008 R2, 2008 and 2005).

    When I give food to the poor they call me a saint. When I ask why they are poor they call me a communist - Archbishop Hélder Câmara

  • Thanks a lot for your time and your suggestion. We do need data during voyage. We will set up a test environment with your suggestion and I will come back to you if we have any questions.

Thinking about my reply a bit more, Service Broker possibly just adds complexity for little gain. You could get much the same effect by using an 'OutboundMessage' table to hold details of each message you want to send. This table might have the following structure:

(MessageId Int Identity Not Null
,DateCreated Date Not Null
,TimeCreated Time Not Null
,MessageType Int Not Null
,MessageText nvarchar(max) Not Null
,MessageChecksum Int Not Null
,BatchId Uniqueidentifier Null)

    The BatchId column would only get filled in when a transmission batch is created. You would also want a 'BatchControl' table to hold details for each batch:

(BatchId Uniqueidentifier Not Null
,DateCreated Date Not Null
,TimeCreated Time Not Null
,MessageCount Int Null
,BatchStatus Int Not Null
,BatchChecksum Int Not Null
,DateSent Date Null
,TimeSent Time Null
,DateAcknowledged Date Null
,TimeAcknowledged Time Null)

    Also for clarity, email is used to allow transmissions to be batched. You would send one email with an attached file that contained a number of messages. The table designs are just suggestions, you will probably need more or different columns.
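Creating a transmission batch from those two tables might look like the sketch below, assuming the OutboundMessage and BatchControl tables exist as described and that a BatchStatus of 1 means 'created, not yet sent':

```sql
DECLARE @BatchId uniqueidentifier = NEWID();

-- Sweep up every message that has not yet been assigned to a batch.
UPDATE OutboundMessage
SET    BatchId = @BatchId
WHERE  BatchId IS NULL;

-- Record the batch, with a count and an aggregate checksum over the
-- per-message checksums (CHECKSUM_AGG is a real T-SQL aggregate).
INSERT INTO BatchControl
    (BatchId, DateCreated, TimeCreated, MessageCount, BatchStatus, BatchChecksum)
SELECT @BatchId,
       CAST(GETDATE() AS date),
       CAST(GETDATE() AS time),
       COUNT(*),
       1,
       CHECKSUM_AGG(MessageChecksum)
FROM   OutboundMessage
WHERE  BatchId = @BatchId;
-- The batch's messages can now be written to a file and attached to an email.
```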

