August 16, 2011 at 1:39 pm
This may not be the correct terminology for what I am describing and if that’s the case please let me know.
We are in the process of implementing a nightly SQL job that will automatically perform a process we currently do manually to update bank deposit information. I have the logic worked out so that for each row of data that needs to be updated, a DML statement is created; if the logic finds 25 deposits that need to be updated, I get 25 DML updates. There will be cases where everything could be updated in a single statement using more generic criteria, but because the accounting software only allows updating one bank deposit at a time, I want to model our job to work the same way, so I'm sticking with one update per deposit.
My question is about how to set up a DML/query execution queue that would store one or more DML commands (these updates) and go through the queue executing them one at a time. In the past I've always handled updates through batches or single statements, whether they updated one row or many. This time we need an audit trail, and storing each DML update seems like a good way to accomplish that.
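The kind of queue described above can be sketched as follows. This is a minimal illustration in Python with sqlite3 standing in for SQL Server; every table and column name here (`dml_queue`, `sql_text`, and so on) is invented for the example, not taken from the real system:

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# Stand-in deposits table (names are made up for illustration).
conn.execute("CREATE TABLE deposits (id INTEGER PRIMARY KEY, bank_deposit_number TEXT)")
conn.executemany("INSERT INTO deposits VALUES (?, ?)", [(1, "D2"), (2, "D3")])

# Queue table: one row per pending DML command, plus audit columns.
conn.execute("""
    CREATE TABLE dml_queue (
        id            INTEGER PRIMARY KEY,
        sql_text      TEXT NOT NULL,
        status        TEXT DEFAULT 'pending',   -- pending / done / failed
        rows_affected INTEGER,
        executed_at   TEXT)""")

# Enqueue one UPDATE per deposit, as described above.
conn.executemany("INSERT INTO dml_queue (sql_text) VALUES (?)", [
    ("UPDATE deposits SET bank_deposit_number = 'D1' WHERE id = 1",),
    ("UPDATE deposits SET bank_deposit_number = 'D1' WHERE id = 2",),
])

# Drain the queue one command at a time, recording the outcome of each
# execution so the queue doubles as an audit trail.
pending = conn.execute(
    "SELECT id, sql_text FROM dml_queue WHERE status = 'pending' ORDER BY id").fetchall()
for qid, sql_text in pending:
    try:
        cur = conn.execute(sql_text)
        conn.execute(
            "UPDATE dml_queue SET status = 'done', rows_affected = ?, "
            "executed_at = datetime('now') WHERE id = ?", (cur.rowcount, qid))
    except sqlite3.Error:
        conn.execute(
            "UPDATE dml_queue SET status = 'failed', "
            "executed_at = datetime('now') WHERE id = ?", (qid,))
    conn.commit()

print(conn.execute("SELECT status, rows_affected FROM dml_queue ORDER BY id").fetchall())
# [('done', 1), ('done', 1)]
```

In SQL Server the same shape would be a permanent table drained by an Agent job, with each executed statement kept as the audit record.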
Thoughts? If anyone knows of an article that outlines this I’d love to see it. I just don’t know if I am using the correct terms/words to describe what I’m looking for.
Thanks
Kindest Regards,
Just say No to Facebook!
August 16, 2011 at 1:46 pm
Are these DML statements constantly morphing, or are they pretty much the same structure with different parameters?
If they're the same structure, have you looked into Service Broker? Pretty much one of the things it was built for.
Never stop learning, even if it hurts. Ego bruises are practically mandatory as you learn unless you've never risked enough to make a mistake.
For better assistance in answering your questions | Forum Netiquette
For index/tuning help, follow these directions. | Tally Tables
Twitter: @AnyWayDBA
August 16, 2011 at 2:05 pm
I’m not sure if the way I interpret what you are asking is right, so let me give a quick breakdown (in pseudo code) of what happens; that should provide the answer you are seeking.
The current issue (what spawned this need to begin with) is that the accounting software generates a lot of paper/electronic deposits in our database while the banks themselves record only a few, generally one deposit per day per type, where Type is a 3-character code indicating whether the item was a check, cash, or other.
As an example, at Bank A for the date of 2011/08/01 I may have 3 deposits listed on my bank statement (the paper statement from the bank) whereas my accounting software shows 35 deposits across the 3 possible deposit types. This makes reconciling the bank statement unnecessarily long for our controllers. The software's deposit-merge feature helps, but it's limited to one at a time, so it too is tedious.
THE SOLUTION: Automate the manual Bank Deposit Merge via SQL Job using the same logic the current manual process uses.
The logic steps are:
1) Find the Min(DepositNumber) per Bank Account + Deposit Date + Category (Category is a 3-character code shared across many deposits, but not all)
2) Find all deposits not currently set to the Min(DepositNumber) and merge them into it by updating the BankDepositNumber field of the bank deposit transaction.
3) Repeat steps 1-2 for each bank account at each bank.
For the example at Bank A for 2011/08/01, I should end up with no more than 3 bank deposits, and that assumes I get at least 1 deposit in each of the 3 possible deposit types we have.
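The logic steps above can be sketched like this. It is a hypothetical illustration in Python with sqlite3 standing in for the real accounting database; the table and column names (`bank_deposits`, `deposit_number`, etc.) are invented, and the per-row inner loop mirrors the one-deposit-at-a-time behaviour described above:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# Stand-in table; the real schema belongs to the accounting system.
conn.execute("""
    CREATE TABLE bank_deposits (
        id             INTEGER PRIMARY KEY,
        account        TEXT,
        deposit_date   TEXT,
        category       TEXT,       -- 3-character deposit type code
        deposit_number INTEGER)""")
conn.executemany("INSERT INTO bank_deposits VALUES (?, ?, ?, ?, ?)", [
    (1, 'A', '2011-08-01', 'CHK', 101),
    (2, 'A', '2011-08-01', 'CHK', 102),
    (3, 'A', '2011-08-01', 'CHK', 103),
    (4, 'A', '2011-08-01', 'CSH', 200),
])

# Step 1: Min(DepositNumber) per Bank Account + Deposit Date + Category.
groups = conn.execute("""
    SELECT account, deposit_date, category, MIN(deposit_number)
    FROM bank_deposits
    GROUP BY account, deposit_date, category""").fetchall()

# Step 2: one UPDATE per deposit not already on the minimum.
for account, dep_date, category, min_num in groups:
    others = conn.execute("""
        SELECT id FROM bank_deposits
        WHERE account = ? AND deposit_date = ? AND category = ?
          AND deposit_number <> ?""",
        (account, dep_date, category, min_num)).fetchall()
    for (dep_id,) in others:
        conn.execute("UPDATE bank_deposits SET deposit_number = ? WHERE id = ?",
                     (min_num, dep_id))
conn.commit()

print(sorted(n for (n,) in conn.execute("SELECT DISTINCT deposit_number FROM bank_deposits")))
# [101, 200]
```

The three CHK deposits collapse onto deposit number 101 and the lone CSH deposit is untouched, matching the "no more than one merged deposit per type per day" outcome described above.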
Hopefully that answers your question; if not, please let me know.
BTW – I looked at Service Broker a while back (not for this, but for something else) and quickly stopped, as it started to remind me of SSIS and how it seemed designed to be intentionally difficult to follow/use. Then again, it could just have been a bad article on Service Broker.
Kindest Regards,
Just say No to Facebook!
August 16, 2011 at 4:02 pm
I'm not sure I'd go that route and update the original BankDepositNumber. I'd rather add another column to the original table, or even a separate table that matches the original number to the "grouped" one. But in your scenario there might be reasons not to do so...
What you could do is store the parameter values of the original SQL statement in a table together with the values you actually want to modify.
Something like
col_BankDepositNumber: 12346
col_new_BankDepositNumber: 12345
col_BankAccount: someValue1
col_DepositDate: someValue2
col_Category: someValue3
col_update_time: datetime
You could then build the SQL statement and run the code, including all the data validation and any additional handling (e.g. notification or the like).
I would also prefer to run it as a single statement rather than row by row (RBAR). But again, I don't know enough about the business case...
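That suggestion, a parameter/audit table plus one set-based update, might look roughly like this. It is a sketch in Python with sqlite3 standing in for SQL Server; the names (`deposit_merge_log`, `old_number`, `new_number`) are invented to mirror the col_* layout above:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE bank_deposits (
        id             INTEGER PRIMARY KEY,
        account        TEXT,
        deposit_date   TEXT,
        category       TEXT,
        deposit_number INTEGER)""")
conn.executemany("INSERT INTO bank_deposits VALUES (?, ?, ?, ?, ?)", [
    (1, 'A', '2011-08-01', 'CHK', 12345),
    (2, 'A', '2011-08-01', 'CHK', 12346),
])

# Audit table mirroring the col_* layout above: old number, new number,
# the grouping keys, and when the change was applied.
conn.execute("""
    CREATE TABLE deposit_merge_log (
        old_number   INTEGER,
        new_number   INTEGER,
        account      TEXT,
        deposit_date TEXT,
        category     TEXT,
        update_time  TEXT DEFAULT (datetime('now')))""")

# Record the intended changes first...
conn.execute("""
    INSERT INTO deposit_merge_log (old_number, new_number, account, deposit_date, category)
    SELECT d.deposit_number, m.min_num, d.account, d.deposit_date, d.category
    FROM bank_deposits d
    JOIN (SELECT account, deposit_date, category, MIN(deposit_number) AS min_num
          FROM bank_deposits
          GROUP BY account, deposit_date, category) m
      ON m.account = d.account AND m.deposit_date = d.deposit_date
         AND m.category = d.category
    WHERE d.deposit_number <> m.min_num""")

# ...then apply them as ONE set-based statement driven by the log,
# rather than row by row.
conn.execute("""
    UPDATE bank_deposits
    SET deposit_number = (SELECT l.new_number FROM deposit_merge_log l
                          WHERE l.old_number = bank_deposits.deposit_number)
    WHERE deposit_number IN (SELECT old_number FROM deposit_merge_log)""")
conn.commit()

print(conn.execute("SELECT old_number, new_number FROM deposit_merge_log").fetchall())
# [(12346, 12345)]
```

The log table gives the audit trail for free: every merged deposit leaves a row recording what it was, what it became, and when.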
August 16, 2011 at 4:05 pm
Let me break that down to what I think I understood. I'm going for the simplest format:
1) Find all accounts with a transaction on a date.
2) For that account, find the minimum DepositNumber for each category the transaction can fall into.
3) Assign that DepositNumber in BankDepositNumber to all other transactions on that day in that category.
You are looking to generate a unique update statement for EACH of those secondary transactions to assign the BankDepositNumber the DepositNumber of the first entry for the day in that category.
Sound about right?
August 16, 2011 at 5:19 pm
Evil Kraig F (8/16/2011)
Let me break that down to what I think I understood. I'm going for the simplest format:
1) Find all accounts with a transaction on a date.
2) For that account, find the minimum DepositNumber for each category the transaction can fall into.
3) Assign that DepositNumber in BankDepositNumber to all other Transactions on that day in that category.
You are looking to generate a unique update statement for EACH of those secondary transactions to assign the BankDepositNumber the DepositNumber of the first entry for the day in that category.
Sound about right?
The only small detail left out is to note the unique ID (the primary key) of the row updated. Otherwise you've got it. Also, the field being updated is not any kind of key but a text field, so there are no triggers or constraints to deal with.
Thanks
Kindest Regards,
Just say No to Facebook!
August 16, 2011 at 5:21 pm
LutzM (8/16/2011)
I'm not sure if I'd go that route and update the original BankDepositNumber. I'd rather either add another column to the original table or even a separate table that would match the original number and the "grouped" one. But in your scenario there might be reasons not to do so...
What you could do is store the parameter values of the original SQL statement in a table together with the values you actually want to modify.
Something like
col_BankDepositNumber: 12346
col_new_BankDepositNumber: 12345
col_BankAccount: someValue1
col_DepositDate: someValue2
col_Category: someValue3
col_update_time: datetime
You could then build the SQL statement and run the code including all the data validation and possible additional handling (e.g. notification or the like).
I would also prefer to run it as a single statement rather than row by row (RBAR). But again: I don't know enough about the business case....
Thanks for taking time to reply, but the update I'm doing is not of my own design; it's something already built into the system, so I know it's the proper change to make. I just want to find a way to automate it so that no one has to do it manually one deposit at a time, as we have thousands each month that would fall into this scenario.
Kindest Regards,
Just say No to Facebook!
August 16, 2011 at 6:06 pm
YSLGuru (8/16/2011)
Evil Kraig F (8/16/2011)
Let me break that down to what I think I understood. I'm going for the simplest format:
1) Find all accounts with a transaction on a date.
2) For that account, find the minimum DepositNumber for each category the transaction can fall into.
3) Assign that DepositNumber in BankDepositNumber to all other Transactions on that day in that category.
You are looking to generate a unique update statement for EACH of those secondary transactions to assign the BankDepositNumber the DepositNumber of the first entry for the day in that category.
Sound about right?
The only small detail left out is to note the unique ID (the primary key) of the row updated. Otherwise you've got it. Also, the field being updated is not any kind of key but a text field, so there are no triggers or constraints to deal with.
Thanks
What needs to happen with the UniqueID of the updated row? If I'm understanding this, you're leaving the rows in place.
Here's what would be my personal approach (mostly pseudocode):
SELECT
    AccountNumber,
    Category,
    MIN(DepositNumber) AS MinDepNum
INTO #tmp
FROM BankingTable
WHERE TransactionDate >= @DateToCheck
  AND TransactionDate < @DateToCheck + 1
GROUP BY
    AccountNumber,
    Category

SELECT
    bt.DepositUniqueID,
    t.MinDepNum AS NewBankDepositNumber
INTO #tmp2
FROM #tmp AS t
JOIN BankingTable AS bt
    ON t.AccountNumber = bt.AccountNumber
    AND t.Category = bt.Category
    AND t.MinDepNum <> bt.BankDepositNumber
WHERE bt.TransactionDate >= @DateToCheck
  AND bt.TransactionDate < @DateToCheck + 1

-- Drop the results of #tmp2 into Service Broker as an XML message for each row. Probably via cursor.
-- Have a receiver do the following:
UPDATE BankingTable
SET BankDepositNumber = servicebroker.NewBankDepositNumber
WHERE DepositUniqueID = servicebroker.DepositUniqueID
Let me know if that makes sense.
August 17, 2011 at 11:34 am
Thanks for the code suggestion; I will look at it when I next work on that project. As for the unique ID, nothing is done to it, and the rows themselves are not changing, just a text field within each row.
I know this sounds far from "best practices," and it is, but it's just mimicking what the product (the accounting software) does to merge deposits, and I'm just trying to automate that. I can't change how the data is changed or even how it works, or we risk losing product support, so this is not the first time we've had to implement a less than desirable solution with this software.
Thanks
Kindest Regards,
Just say No to Facebook!
October 5, 2011 at 12:15 pm
Evil Kraig F
I'm revisiting this project, and after going back over the thread I had a question. For some time now we (in IT) have been manually updating data as needed and recording those changes in a log we keep. While this works, I'd like a more automated process that would store DML commands, execute them, and note the change along with whether it succeeded, how many rows were updated, and so on. That way we could spend less time re-creating the same code (with different keys or IDs for the rows that must be changed) and also keep a consistent change log of what happened.
When this bank deposit project came up, I decided it might be time to look into automating the DML statements we've been running manually for years. This is why I asked how to implement a generic DML queue; the bank deposit change was simply the most recent DML statement we'd been asked to implement.
I've not worked with Service Broker at all, but from the little I have read I get the impression it's like SSIS in that it's a lot of overhead for something simple and straightforward. I'm not against using Service Broker; I just don't want to invest too much time in learning it only to realize in the end it was not the right choice.
That said, what are the benefits of using Service Broker to execute DML commands versus using SQL Server Agent and a SQL job?
Thanks
Kindest Regards,
Just say No to Facebook!