August 24, 2005 at 1:22 pm
Hi,
I need a set-based solution that will increment a column based on the MAX value of another column in another table; this will be used with an insert statement.
The database has poor design and uses this as a key for a table.
Thanks,
Chris
August 24, 2005 at 1:27 pm
You're really better off using IDENTITY for this... I think it would be time to do the transfer. If you choose this method you can end up having 100s of deadlocks/day. Not to mention that it can fail on inserts if 2 users manage to get the same max id at the same time.
August 24, 2005 at 1:31 pm
Remi, yes, I know everything you said is true. As I said, this database was poorly designed by the vendor that created it, and I can't change it. This is not for users; this is for processing batch data off hours.
Thanks
August 24, 2005 at 1:34 pm
Then there's another way... what are you trying to do, and what code are you using ATM?
August 24, 2005 at 1:42 pm
Currently there is user-created code in an Access db that gets the Max(someid), and VBA code loops through and adds one to the corresponding column in the Access table; the records from this Access table are then inserted into a table in SQL Server, the same table the MAX id came from.
There are actually 5 tables that get inserted into to create this new entity but I was trying to simplify what I need to do.
I use a local temp table with related data to process the inserts as one logical unit of work.
I am having a hard time trying to increment this column for insert with a set-based solution.
August 24, 2005 at 2:00 pm
I forgot to say that the access code is being converted to T-SQL so that it can run as a SQL job during off hours.
August 24, 2005 at 2:11 pm
Well, if you need to run batches, here's how I would do it:
Fetch the next valid id
Insert into the table (SELECT columns, @validId + COUNT(*) FROM the table self-joined on id <= id)
Fetch the new max id and update the ids table.
But you need to run all that under a single transaction, which can cause deadlocks, or at least blocking.
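The steps above might look something like this in T-SQL. This is only a sketch: TargetTable, #Staging, and all column names are placeholders, and #Staging is assumed to hold the batch rows with a unique BatchKey column.

```sql
DECLARE @validId int;

BEGIN TRANSACTION;

-- Step 1: fetch the next valid id (here, the current max in the target table)
SELECT @validId = MAX(SomeID) FROM TargetTable;

-- Step 2: number the batch rows with a self join; each row's rank is the
-- count of batch rows whose key is less than or equal to its own
INSERT INTO TargetTable (SomeID, SomeColumn)
SELECT @validId + COUNT(*), s1.SomeColumn
FROM #Staging s1
JOIN #Staging s2 ON s2.BatchKey <= s1.BatchKey
GROUP BY s1.BatchKey, s1.SomeColumn;

-- Step 3 (updating a separate ids table) would go here, if one exists

COMMIT TRANSACTION;
```

As noted, the whole thing has to run in one transaction so no other session can read the same MAX between the SELECT and the INSERT.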
August 24, 2005 at 3:31 pm
Thanks
August 24, 2005 at 5:15 pm
Do you need help with the query??
August 24, 2005 at 7:10 pm
First, I agree with Remi 100%, but those darned 3rd party dummies just keep writing junk and selling it. Worse yet, we keep buying it.
I'll probably get a real blast for using temp-anything for this, but we had a similar problem with some batch jobs an, apparently, close relative of your 3rd party vendor wrote. As Remi indicated, we had 100's (620 per day average) of deadlocks because of it and similar problems.
We used many different methods to try to get around this problem-that-we-couldn't-change... if you are inserting a large batch of new records like we needed to, you could write those records to either a permanent working table (must be truncated, not "delete"d, at the beginning of each run), a temp table, or even a table variable. It should have an extra column of the IDENTITY flavor starting at 1 and incrementing by 1.
The procedure steps would be as follows: populate the working table, capture the current MAX id from the target table into a variable (@MyMax), then insert into the target table using TempTable.RowNum + @MyMax as the new ID.
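A minimal sketch of that staging approach, assuming illustrative names (#Work, SourceData, TargetTable, and their columns are all placeholders):

```sql
-- Working table with the extra IDENTITY column, starting at 1, step 1
CREATE TABLE #Work
(
    RowNum     int IDENTITY(1,1),
    SomeColumn varchar(50)
);

-- Load the batch; RowNum numbers the rows 1, 2, 3, ...
INSERT INTO #Work (SomeColumn)
SELECT SomeColumn FROM SourceData;

DECLARE @MyMax int;

BEGIN TRANSACTION;

-- Capture the last used id in the target table
SELECT @MyMax = MAX(SomeID) FROM TargetTable;

-- Each batch row gets a unique, contiguous id above the current max
INSERT INTO TargetTable (SomeID, SomeColumn)
SELECT w.RowNum + @MyMax, w.SomeColumn
FROM #Work w;

COMMIT TRANSACTION;
```

The IDENTITY column does the row numbering for you, so the insert stays fully set-based with no loop.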
Do understand that with these types of 3rd party requirements of using MAX to get the last used ID, there is always a chance of a couple of new records sneaking in... as a result, they usually compound the problem by making an Oracle-like sequence table (a NextID table of sorts). YOU NEED TO MAKE SURE THEY AREN'T USING ONE OF THESE BEFORE ATTEMPTING TO INSERT INTO THE TABLE(s) DIRECTLY or you'll really make a mess of things. Anyway, they normally don't know how to write the code for that without creating the potential for deadlocks. This was our big fix that knocked out 100% of the NextID-related problems we had, and I'll be happy to share that bit of knowledge if you find a sequence table anywhere in the DB.
Hey Farrell and Remi... we couldn't find BSOFH but do a search on BOFH... some guy wrote a whole diary-like set of columns on the subject.
--Jeff Moden
Change is inevitable... Change for the better is not.
August 24, 2005 at 9:48 pm
BOFH = Bastard [system] operator from hell.
The kind that doesn't want to spare 50K for a numbers table even if you prove that it'll lessen the load on the server.
August 25, 2005 at 12:43 pm
Will performing the select inside the transaction prevent the id from increasing while the batch writes occur? Will it lock and block itself?
BEGIN TRANSACTION
SELECT @MaxShipToCode = MAX(SomeID)
FROM MyTable
INSERT INTO MyTable
INSERT INTO AnotherRelatedTable
--You know the rest
COMMIT TRANSACTION
August 25, 2005 at 7:58 pm
I don't believe that moving the SELECT into a transaction will, by itself, prevent others from writing to the target table and, therefore, will not prevent you from inserting a duplicate ID in the target table.
Have you determined whether or not a sequence table is in use? That is the most important first step...
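One common way to make the MAX read safe inside a transaction, which nobody in this thread has confirmed for this system (so treat it as a suggestion, with MyTable/SomeID as placeholder names), is to put locking hints on the read itself:

```sql
DECLARE @MaxShipToCode int;

BEGIN TRANSACTION;

-- UPDLOCK + HOLDLOCK make the read behave like a serializable write:
-- a second connection running this same statement blocks until the
-- first transaction commits, so both can never see the same MAX
SELECT @MaxShipToCode = MAX(SomeID)
FROM MyTable WITH (UPDLOCK, HOLDLOCK);

INSERT INTO MyTable (SomeID)
VALUES (@MaxShipToCode + 1);

COMMIT TRANSACTION;
```

A plain SELECT under the default READ COMMITTED isolation level releases its shared locks as soon as the read completes, which is why the bare transaction in the previous post doesn't protect the id.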
--Jeff Moden
Change is inevitable... Change for the better is not.
August 26, 2005 at 7:52 am
No sequence table is used.
August 26, 2005 at 8:21 am
If the method of generating new ids for the users is to just take the MAX(id) + 1 from the prod table, then you have no choice but to close down the prod table while the batch is doing its thing, to prevent some user sneaking in a record between the MAX(id) you have read for the batch to start with and the (same) MAX(id) a user will read while the batch is grinding away.
To get rid of such problems, you actually do need a sequence table (assuming identity is not used) to gain complete control over id assignments.
The trick lies in how the new ids are fetched, to prevent the same id being read by more than one user. To achieve this, you need a conformant way of generating ids via a procedure call (which I assume you can't do), but I thought I'd mention it anyway.
Bottom line is, to gain 100% control over your homegrown counters, and still have good concurrency, you need a counter table.
Using the MAX(id) method on the base table is very, very hard to code efficiently (sometimes even impossible) and is most likely to generate errors in the form of duplicate assignments and abysmal performance.
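A counter table of the kind described here is often implemented like this; all names (NextID, TargetTable, the column names) are illustrative, not taken from the poster's system:

```sql
-- One row per table whose ids we hand out
CREATE TABLE NextID
(
    TableName sysname PRIMARY KEY,
    NextValue int NOT NULL   -- next unused id
);

DECLARE @FirstID int, @BatchSize int;
SET @BatchSize = 100;

-- Reserve a block of @BatchSize ids in a single atomic statement.
-- The UPDATE takes an exclusive lock on the row, so two callers can
-- never read the same value; column references on the right-hand side
-- see the pre-update value, so @FirstID gets the start of the block.
UPDATE NextID
SET @FirstID = NextValue,
    NextValue = NextValue + @BatchSize
WHERE TableName = 'TargetTable';

-- The caller now owns ids @FirstID through @FirstID + @BatchSize - 1
```

Because the read and the increment happen in one statement, there is no window for another session to sneak in between them, which is the whole point of a counter table over MAX(id).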
though, just my .02
/Kenneth