December 23, 2014 at 9:07 am
Hello,
I need to sync several tables from a purchased software package to our own tables.
Not the complete tables, only selected fields.
Each source table has a timestamp, so I could use this to track what has changed.
But how do I store locally the last timestamp from the source database?
A local timestamp field would record my local timestamp instead of the one I want to remember.
Any suggestions?
December 23, 2014 at 9:38 am
HLEBOEUF (12/23/2014)
Hello, I need to sync several tables from a purchased software package to our own tables.
Not the complete tables, only selected fields.
Each source table has a timestamp, so I could use this to track what has changed.
But how do I store locally the last timestamp from the source database?
A local timestamp field would record my local timestamp instead of the one I want to remember.
Any suggestions?
Do you mean a column with the timestamp datatype or a datetime column?
If it is a datetime column, I would just store a last load date as part of my process, and then my ETL would use WHERE source.timestamp > LastLoadDate.
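A minimal sketch of that "last load date" pattern, using a small control table on the destination side; the table and column names (ETL_Control, Orders, ModifiedDate) are illustrative assumptions, not from the original posts:

```sql
-- Read the watermark stored by the previous run
DECLARE @LastLoadDate datetime2 =
    (SELECT LastLoadDate FROM dbo.ETL_Control WHERE SourceTable = 'Orders');

-- Pull only rows changed since then
INSERT INTO dbo.Orders_Local (OrderID, OrderDate, Amount)
SELECT s.OrderID, s.OrderDate, s.Amount
FROM SourceDb.dbo.Orders AS s
WHERE s.ModifiedDate > @LastLoadDate;

-- Advance the watermark; MAX(source.ModifiedDate) avoids clock-skew
-- issues between the two servers
UPDATE dbo.ETL_Control
SET LastLoadDate = (SELECT MAX(ModifiedDate) FROM SourceDb.dbo.Orders)
WHERE SourceTable = 'Orders';
```

Storing the watermark in a table (rather than a local timestamp column) is what answers the original question: you keep the source's last-seen value, not your own clock.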
Jack Corbett
Consultant - Straight Path Solutions
Check out these links on how to get faster and more accurate answers:
Forum Etiquette: How to post data/code on a forum to get the best help
Need an Answer? Actually, No ... You Need a Question
December 24, 2014 at 1:17 am
Hi,
Just a thought..!!
Why not use MERGE to get the data from these tables into your local database?
You can schedule a job using SQL Server Agent to run the MERGE statement, or you can develop an SSIS package to do the same task.
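For illustration, a MERGE along these lines could be wrapped in an Agent job; the table and column names here are hypothetical stand-ins for the poster's actual tables:

```sql
MERGE dbo.Orders_Local AS tgt
USING SourceDb.dbo.Orders AS src
    ON tgt.OrderID = src.OrderID
-- Update only rows that actually changed at the source
WHEN MATCHED AND src.ModifiedDate > tgt.ModifiedDate THEN
    UPDATE SET tgt.Amount = src.Amount,
               tgt.ModifiedDate = src.ModifiedDate
-- Insert rows that don't exist locally yet
WHEN NOT MATCHED BY TARGET THEN
    INSERT (OrderID, Amount, ModifiedDate)
    VALUES (src.OrderID, src.Amount, src.ModifiedDate);
```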
Hope it Helps...!!
Regards,
Shafat Husain
🙂
And Your Lord Never Forgets...!! (64:19 -- Quran)
December 29, 2014 at 2:02 am
Nice 🙂
December 29, 2014 at 6:33 am
Shafat Husain (12/24/2014)
Hi, Just a thought..!!
Why not use MERGE to get the data from these tables into your local database?
You can schedule a job using SQL Server Agent to run the MERGE statement, or you can develop an SSIS package to do the same task.
Hope it Helps...!!
You could use MERGE, but if you are pulling the entire set every time and using MERGE, performance is going to degrade pretty quickly. If you have a timestamp (datetime column) that you can use to know what is new or changed, incremental loading will be faster than MERGE.
Jack Corbett
Consultant - Straight Path Solutions
Check out these links on how to get faster and more accurate answers:
Forum Etiquette: How to post data/code on a forum to get the best help
Need an Answer? Actually, No ... You Need a Question
December 30, 2014 at 6:27 am
Note that the timestamp datatype (now called rowversion) is equivalent to binary(8); it holds a database-wide row version number, not an actual date or time.
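Which means the watermark approach still works with a rowversion column: you just store the last-seen binary(8) value instead of a date. A sketch, with hypothetical names (ETL_Control, RV):

```sql
-- Read the last rowversion value seen by the previous run
DECLARE @LastRV binary(8) =
    (SELECT LastRowVersion FROM dbo.ETL_Control WHERE SourceTable = 'Orders');

-- RV is the source table's rowversion/timestamp column;
-- binary(8) values compare in version order
SELECT OrderID, Amount, RV
FROM SourceDb.dbo.Orders
WHERE RV > @LastRV;

-- Advance the stored watermark
UPDATE dbo.ETL_Control
SET LastRowVersion = (SELECT MAX(RV) FROM SourceDb.dbo.Orders)
WHERE SourceTable = 'Orders';
```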
January 1, 2015 at 5:07 pm
You could include both datetime fields in your table design: call one something like DateTimeModified and the other DateTimeLoaded. In your (I assume) daily ETL you could select everything that has changed from midnight the previous day to midnight at the start of today. Since you already know what has changed, you don't need to rely on the MERGE operator.
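The midnight-to-midnight window above might look like this; DateTimeModified and the table names are assumptions for illustration:

```sql
-- Yesterday 00:00 (inclusive) up to today 00:00 (exclusive)
DECLARE @Today     date = CAST(SYSDATETIME() AS date);
DECLARE @Yesterday date = DATEADD(day, -1, @Today);

SELECT OrderID, Amount, DateTimeModified
FROM SourceDb.dbo.Orders
WHERE DateTimeModified >= @Yesterday
  AND DateTimeModified <  @Today;
```

Using a half-open range (>= start, < end) avoids missing or double-counting rows at the boundary between two daily runs.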
----------------------------------------------------
January 2, 2015 at 8:38 am
I always want a datetime here so that I can detect latency or potential issues with my process. I'd store this on the destination side, when I pull the data. The reason is that I may end up distributing this later, or using multiple servers to get changed data. In that case, each server needs to know what data it will update.