July 20, 2017 at 2:43 pm
Instead of dropping the base table, can I do something to the existing table? The table is continuously having records inserted into it, so if I drop it I may lose some records until the rename of the new table is completed. No, the table does not have any foreign keys.
Thanks.
July 20, 2017 at 2:56 pm
SQL-DBA-01 - Thursday, July 20, 2017 2:43 PM: Instead of dropping the base table, can I do something to the existing table? The table is continuously having records inserted into it, so if I drop it I may lose some records until the rename of the new table is completed. No, the table does not have any foreign keys.
As others have suggested, you could simply ignore it. You have a long way to go before you run out of values.
If you MUST remove the gaps for whatever reason, I'm thinking a manual process is likely going to be your best bet... You'd re-insert the row with the highest identity value into a gap, then delete the original row and repeat until all of the gaps are filled.
It would be tricky (although not impossible) to script, but you'd end up blocking the table while the script ran.
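For what it's worth, here is a rough sketch of a single iteration of that manual gap-filling step. The table and column names (dbo.AuditLog, ID, Col1, Col2) are purely hypothetical, and it assumes you're allowed to use SET IDENTITY_INSERT on the table:

BEGIN TRANSACTION;

DECLARE @MaxId BIGINT = (SELECT MAX(ID) FROM dbo.AuditLog);  -- row to relocate
DECLARE @GapId BIGINT = 12345;                                -- a known unused ID inside a gap

SET IDENTITY_INSERT dbo.AuditLog ON;

INSERT INTO dbo.AuditLog (ID, Col1, Col2)   -- list every column explicitly
SELECT @GapId, Col1, Col2
FROM dbo.AuditLog
WHERE ID = @MaxId;

SET IDENTITY_INSERT dbo.AuditLog OFF;

DELETE FROM dbo.AuditLog WHERE ID = @MaxId; -- remove the relocated original

COMMIT TRANSACTION;

-- When finished, reseed so new inserts continue from the new maximum:
-- DBCC CHECKIDENT ('dbo.AuditLog', RESEED);

Repeat that for each gap, and keep in mind that every iteration moves a row that other processes may be reading at the time.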
But this also leads to the next question (which has been brought up already) - do you need that column? That is, does this identity column help you in any useful way?
The above is all just my opinion on what you should do.
As with all advice you find on a random internet forum - you shouldn't blindly follow it. Always test on a test server to see whether there are negative side effects before making changes to live!
I recommend you NEVER run "random code" you found online on any system you care about UNLESS you understand and can verify the code OR you don't care if the code trashes your system.
July 20, 2017 at 2:56 pm
Just a thought... probably way off base!
-- Build a small demo table with a sequential identity column
SELECT TOP (30) PG_ID = IDENTITY(INT, 1, 1)
INTO #PG
FROM sys.all_columns;

SELECT * FROM #PG;

-- Punch some gaps into the sequence
DELETE FROM #PG WHERE PG_ID % 5 = 0;

SELECT * FROM #PG;

-- Generate a gap-free numbering with ROW_NUMBER()
SELECT PG_ID,
       ROW_NUMBER() OVER (ORDER BY PG_ID) AS newid
FROM #PG;
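If you wanted to take that last query a step further, note that SQL Server won't let you UPDATE an identity column in place, so the renumbered values have to be written into a replacement table. A minimal sketch against the #PG demo table above (the #PG_NoGaps name is just for illustration):

-- Copy the rows into a new table with a gap-free sequence
SELECT ROW_NUMBER() OVER (ORDER BY PG_ID) AS PG_ID
INTO #PG_NoGaps
FROM #PG;

SELECT * FROM #PG_NoGaps;

On a real table, that is essentially the copy-and-rename approach discussed earlier in the thread.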
________________________________________________________________
you can lead a user to data....but you cannot make them think
and remember....every day is a school day
July 20, 2017 at 3:47 pm
Please explain the business case for trying to reuse the gaps created by deleting data. You are far from running out of identity values, so, as someone else said, this looks like a solution in search of a problem.
July 21, 2017 at 1:16 pm
Just going to ignore my question?
July 21, 2017 at 2:06 pm
Lynn Pettis - Friday, July 21, 2017 1:16 PM: Just going to ignore my question?
He did explain that his IDENTITY column was approaching the limits of BIGINT, which is a pretty good reason to re-sequence. A better thing to do, though, might be to just get rid of the IDENTITY column on this audit table altogether.
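If dropping the column were the route taken, the statement itself is trivial; the caveat (and this is only an assumed example, with dbo.AuditLog and ID as placeholder names) is that any primary key, index, or constraint built on the column has to be dropped first:

-- Hypothetical: remove the identity column entirely
-- (drop any PK, index, or constraint that references it first)
ALTER TABLE dbo.AuditLog DROP COLUMN ID;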
--Jeff Moden
Change is inevitable... Change for the better is not.
July 21, 2017 at 2:09 pm
Jeff Moden - Friday, July 21, 2017 2:06 PM: Lynn Pettis - Friday, July 21, 2017 1:16 PM: Just going to ignore my question?
He did explain that his IDENTITY column was approaching the limits of BIGINT, which is a pretty good reason to re-sequence. A better thing to do, though, might be to just get rid of the IDENTITY column on this audit table altogether.
If you look up the thread, I asked him what the max value of the identity column was and he replied: 2377688866 (2,377,688,866). I don't think he is anywhere near the limit.
July 22, 2017 at 5:47 pm
Lynn Pettis - Friday, July 21, 2017 2:09 PM: Jeff Moden - Friday, July 21, 2017 2:06 PM: Lynn Pettis - Friday, July 21, 2017 1:16 PM: Just going to ignore my question?
He did explain that his IDENTITY column was approaching the limits of BIGINT, which is a pretty good reason to re-sequence. A better thing to do, though, might be to just get rid of the IDENTITY column on this audit table altogether.
If you look up the thread, I asked him what the max value of the identity column was and he replied: 2377688866 (2,377,688,866). I don't think he is anywhere near the limit.
Heh... you're right. He claimed he was approaching the limit for BIGINT, but 2,377,688,866, as you and Phil have both pointed out, is nowhere close to the limit of BIGINT. Maybe he's confused about what the datatype of the column in question actually is, or about what the limits of INT and BIGINT actually are.
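For reference (taking the 2,377,688,866 figure quoted above at face value), the ceilings of the two datatypes in question are easy to check:

-- Upper limits of the two integer types under discussion
SELECT CAST(2147483647          AS INT)    AS MaxInt,     -- 2,147,483,647
       CAST(9223372036854775807 AS BIGINT) AS MaxBigInt;  -- 9,223,372,036,854,775,807

So the 2.38-billion figure is already past what an INT can hold, but it has used only a tiny fraction of the BIGINT range.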
--Jeff Moden
Change is inevitable... Change for the better is not.