April 17, 2024 at 3:16 pm
I have a table that I need to run against a different server/DB, and I'm looking for the most efficient way to call it from a stored procedure. I need to execute the field (sqlToExec), which is loaded with thousands of INSERT statements like the one below, to update this same table on a different server and DB.
If something errors, could we catch it in an error-log file or table and continue processing? And should the INSERTs be batched by ID, say 30,000 at a time?
Sample data:
INSERT INTO EngData (PartNbr,CostKey,PlantCd,Operation,Machine,RunHrs,SetHrs,ScrapRate,Tasking,Efficiency,Speed,NomOD,Weight,AvgOrder,Com1Wt,RecovCode,RecovDescr,RecovRate,SpecDesc1)
VALUES('814000000','Def','Tst','COLORING','FCOLOR',0.00193493,null,0.008,2.66,47,2100,0.255,0.05004,0,0,'05','No Copper',0,'1-RWP NAT')
Any help much appreciated.
CREATE TABLE [dbo].[crtInsert](
[id] [int] IDENTITY(1,1) NOT NULL,
[sqlToExec] [nvarchar](4000) NULL,
[timeAdded] [datetime] NULL
) ON [PRIMARY]
GO
ALTER TABLE [dbo].[crtInsert] ADD DEFAULT (getdate()) FOR [timeAdded]
GO
April 17, 2024 at 8:50 pm
Looks like you are going to need to use dynamic SQL, and I fear you are going to have a bad time with that. Why? Because what happens if someone puts something like "DROP DATABASE PRODUCTION" into the "sqlToExec" column?
BUT if this is what you HAVE to do and you don't care about the risks, then look up dynamic SQL; I expect you'll also need to use a cursor to loop through the table.
That said, I strongly advise against this approach, as it is high risk.
The above is all just my opinion on what you should do.
As with all advice you find on a random internet forum - you shouldn't blindly follow it. Always test on a test server to see if there is negative side effects before making changes to live!
I recommend you NEVER run "random code" you found online on any system you care about UNLESS you understand and can verify the code OR you don't care if the code trashes your system.
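To illustrate the cursor-plus-dynamic-SQL pattern mentioned above, here is a minimal sketch. The table and column names come from the original post; the cursor wiring and TRY/CATCH are my own illustration and have not been run against the poster's system:

```sql
-- Sketch only: loop through crtInsert and execute each stored statement.
-- Assumes the risks discussed above have been accepted.
DECLARE @sql NVARCHAR(MAX);

DECLARE stmtCursor CURSOR LOCAL FAST_FORWARD FOR
    SELECT sqlToExec
    FROM dbo.crtInsert
    ORDER BY id;

OPEN stmtCursor;
FETCH NEXT FROM stmtCursor INTO @sql;

WHILE @@FETCH_STATUS = 0
BEGIN
    BEGIN TRY
        EXECUTE sp_executesql @sql;
    END TRY
    BEGIN CATCH
        -- Log and continue rather than aborting the whole run
        PRINT ERROR_MESSAGE();
    END CATCH;

    FETCH NEXT FROM stmtCursor INTO @sql;
END

CLOSE stmtCursor;
DEALLOCATE stmtCursor;
```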
April 17, 2024 at 10:44 pm
The query that feeds this table is very specific (INSERT only) about what gets added to sqlToExec. Can it be done without using a cursor?
Thanks for your reply.
April 19, 2024 at 6:06 pm
I tried this, but it didn't delete the record:
DECLARE @Counter INT = 1;
DECLARE @MaxCount INT = 1; -- Set this to the number of times you want to run the insert
DECLARE @SQLString NVARCHAR(MAX);
DECLARE @ParmDefinition NVARCHAR(MAX);

WHILE @Counter <= @MaxCount
BEGIN
    -- Construct the SQL command
    SET @SQLString = N'select sqlToExec from sv1.cb.dbo.crtDelete where ID =1';

    -- Execute the SQL command
    EXECUTE sp_executesql @SQLString;

    -- Increment the counter
    SET @Counter = @Counter + 1;
END
April 20, 2024 at 1:44 pm
Why would it? Your dynamic SQL starts with a SELECT, so the query getting executed is a SELECT. You would want something more along the lines of
Select @SQLString = sqlToExec
From sv1.cb....
Sorry for the bad code example... I am doing this from my phone and it is not letting me edit the code, but you get the idea (hopefully).
Also, just an FYI, a while loop in SQL is essentially a cursor.
April 20, 2024 at 9:40 pm
I must have something wrong, as it's erroring.
This is what a single sqlToExec where ID=1 looks like
delete from cstSums WHERE PartNbr='81K000000' AND PlantCd='CORP' AND CostKey='Standard'
Thanks for reply.
April 20, 2024 at 10:29 pm
You say it is erroring; what is the error?
Also, try printing your dynamic SQL before you run it so you can verify it is what you expect.
April 20, 2024 at 11:32 pm
DECLARE @Counter INT = 1;
DECLARE @MaxCount INT = 1; -- Set this to the number of times you want to run the insert
DECLARE @SQLString NVARCHAR(MAX);
DECLARE @ParmDefinition NVARCHAR(MAX);

WHILE @Counter <= @MaxCount
BEGIN
    -- Construct the SQL command
    Select @SQLString = N'sqlToExec from cb.dbo.crtDelete where ID =1';

    -- Execute the SQL command
    Select @SQLString
    EXECUTE sp_executesql @SQLString

    -- Increment the counter
    SET @Counter = @Counter + 1;
END
The select of @sqlstring is:
sqlToExec from cb.dbo.crtDelete where ID =1
This is what the sqltoexec should be running.
delete from cstSums WHERE PartNbr='81K000000' AND PlantCd='CORP' AND CostKey='Standard'
April 22, 2024 at 2:22 pm
Yes, that is what I was saying. Your command for setting the SQL string is wrong.
Again, I STRONGLY encourage you to re-evaluate your solution as I am not sure if you and your team can support the solution. I'm not trying to question your SQL ability, but I want to make sure you can support this going forward. What will you do if/when this process fails? This forum can help, but it may take days to help you or we may not be able to help (we have no access to your systems or data).
BUT if you are 100% confident this is the solution you want to use and are 100% confident supporting it, the solution to your problem is:
DECLARE @Counter INT = 1;
DECLARE @MaxCount INT = 1; -- Set this to the number of times you want to run the insert
DECLARE @SQLString NVARCHAR(MAX);
DECLARE @ParmDefinition NVARCHAR(MAX);

WHILE @Counter <= @MaxCount
BEGIN
    -- Construct the SQL command
    Select @SQLString = CAST(sqlToExec AS NVARCHAR(MAX)) from cb.dbo.crtDelete where ID = 1;

    -- Execute the SQL command
    PRINT @SQLString

    -- EXECUTE sp_executesql @SQLString

    -- Increment the counter
    SET @Counter = @Counter + 1;
END
Things I changed:
1- the SELECT for the @SQLString now gets the correct value back
2- I changed the SELECT to a PRINT so you can review the results and it will handle larger data sets better
3- I commented out the EXECUTE as you should ALWAYS validate that your SQL is valid prior to running dynamic SQL.
Another thing to note - your sqlToExec sample is a bit risky because you are NOT including the schema in the table reference. This is risky because you CAN have two tables with the same name in different schemas. IF user A is expecting mySchema.cstSums to have data deleted, as that is the table they usually access, BUT your job runs as a user with dbo as the default schema AND you also have a table called "cstSums" in the dbo schema, it'll delete from the dbo schema. IF there is no such table, you will get an error.
You should ALWAYS (100% of the time) include the schema when you work with a table. Including the database should ONLY be done IF you are running a cross-database query; otherwise it runs the risk of you running the query against the wrong database.
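As a small illustration of the two-part naming advice (the table and column names are the ones from earlier in the thread; which schema is actually correct for the poster's system is an assumption):

```sql
-- Risky: resolves against the caller's default schema, which may vary by login
delete from cstSums WHERE PartNbr = '81K000000' AND PlantCd = 'CORP' AND CostKey = 'Standard'

-- Safer: schema-qualified, always resolves to the same table
DELETE FROM dbo.cstSums WHERE PartNbr = '81K000000' AND PlantCd = 'CORP' AND CostKey = 'Standard';
```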
April 22, 2024 at 2:29 pm
Including the database should ONLY be done IF you are running a cross-database query.
Otherwise it runs the risk of you running the query against the wrong database.
… Or of failing, if the database (or a copy of it) is ever renamed.
… Or of thoroughly confusing Visual Studio from within a database project.
April 22, 2024 at 3:00 pm
Thanks for the replies and the warnings about this type of approach -- definitely something that has been discussed.
With the approach above, how will it scale when removing the "where ID = 1", since that was just for testing? There are going to be thousands of requests in this queue (sqlToExec). Can it be batched? And how about catching failures; can that be logged?
Many thanks again
April 22, 2024 at 5:09 pm
You can do this however you like. It's your system. You want it batched? Then write your code to batch it. You want it to capture failures? Toss in some error handling. As for scaling, it will not scale well. Cursors and loops are HUGE performance sucks, which is why database people avoid them as much as possible.
If you want this to scale, you need to change the approach and likely the application.
One big way to get it to scale better - don't throw your SQL into a table to be run at a future date. Have the application run the SQL directly. OR, better yet, have a stored procedure that takes parameters of what is to be deleted (or inserted or selected or updated) and have the application call the stored procedure. That way your query can hit the plan cache rather than being a bunch of ad-hoc queries.
Also, with your approach, you would want to do "something" with a query after it has been run. What I mean is: right now, let's say you have 10 rows in that table and you run your process to execute those. You will loop through 10 times. Now you run your process again immediately after it completes - still running 10 times. Now users add 90 rows and you run it again; do you want it to run 100 times (old rows + new rows) or 90 times (just the new rows)?
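A minimal sketch of the parameterized-procedure alternative described above. The procedure name and parameter types are hypothetical; the cstSums columns come from earlier in the thread:

```sql
-- Hypothetical procedure: the application passes values instead of raw SQL,
-- so the statement is parameterized and can reuse a cached plan.
CREATE PROCEDURE dbo.DeleteCstSums
    @PartNbr VARCHAR(20),
    @PlantCd VARCHAR(10),
    @CostKey VARCHAR(20)
AS
BEGIN
    SET NOCOUNT ON;

    DELETE FROM dbo.cstSums
    WHERE PartNbr = @PartNbr
      AND PlantCd = @PlantCd
      AND CostKey = @CostKey;
END
GO

-- The application would then call something like:
-- EXEC dbo.DeleteCstSums @PartNbr = '81K000000', @PlantCd = 'CORP', @CostKey = 'Standard';
```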
April 22, 2024 at 7:04 pm
The table will be cleared out before this new process runs, but I do need to process by ID.
Could it multi-process the table?
Thanks
April 22, 2024 at 7:41 pm
The table will be cleared out before this new process runs, but I do need to process by ID.
Could it multi-process the table?
Thanks
I would not recommend "multi-processing" the table (if I understand you to mean handling multiple IDs concurrently). The reasons are the potential for a missed ID or duplicate IDs being run (low risk if you code it well) and blocking (medium risk if you have multiple scripts deleting from the same table, but that will just pause one process until the other completes). Worst case, you'll hit a deadlock and something will get rolled back that you will need to re-process. I would NOT recommend multi-processing.
BUT if you want to, there's nothing I (or anyone else on this forum) can do to stop you! It's your code :). Just set things up so you can control the start and stop ID and run multiple queries in multiple SSMS windows or with multiple jobs.
Still not something I'd do, though. IF I had to do something like this, I'd do it as a data export to Excel where I can review all the commands and put some sanity checking in there. Then, if all scripts pass my sanity checks, I can copy-paste them into SSMS, run them, and capture all errors. I would not have an automated system running arbitrary SQL, which could be manipulated to be malicious and break my system. What if someone forgets a WHERE clause, for example? Are you OK with the script deleting ALL records from a table? Can you recover in that scenario? Or what if a malicious person puts a DROP DATABASE in there? Can you recover? Or they sneak in a CREATE LOGIN and grant it sysadmin access... so much can go wrong with automated dynamic SQL. At least if I dump it to Excel, I have a record of what was run and I can verify it before execution. I can run it through all my sanity checks (does LOGIN exist in the script? Does DROP exist? If DELETE exists, is there a WHERE? etc.). That doesn't help recover from errors, but it helps me make sure the scripts are accurate and complete prior to running them.
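The sanity checks described above could be sketched as a simple review query. This is a hedged illustration only; the pattern matching is deliberately crude, and the crtInsert table and column names come from the first post:

```sql
-- Flag obviously risky statements for manual review before anything runs.
-- LIKE checks are crude (they match substrings inside literals too), so this
-- is a first filter for human review, not a security control.
SELECT id,
       sqlToExec,
       CASE
           WHEN sqlToExec LIKE '%DROP%'   THEN 'Contains DROP'
           WHEN sqlToExec LIKE '%LOGIN%'  THEN 'Contains LOGIN'
           WHEN sqlToExec LIKE '%DELETE%'
            AND sqlToExec NOT LIKE '%WHERE%' THEN 'DELETE without WHERE'
           ELSE 'Passed crude checks'
       END AS ReviewFlag
FROM dbo.crtInsert
ORDER BY id;
```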
April 23, 2024 at 11:52 pm
Thanks, Brian, for all of your replies and suggestions. Can you look at this code and tell me whether it would handle processing the table with a batch size of 50,000 and process my entire input table?
I ran it a few times with the PRINT statement, but it doesn't look like I'm processing the entire table, and it doesn't seem to remove records.
Thanks again
DECLARE @BatchSize INT = 50000;
DECLARE @IdControl INT = 1;
DECLARE @Results INT;
DECLARE @SQLString NVARCHAR(MAX);
DECLARE @ParmDefinition NVARCHAR(MAX);

WHILE (@Results IS NULL OR @Results > 0)
BEGIN
    -- Insert records in a batch
    Select @SQLString = CAST(sqlToExec AS NVARCHAR(MAX))
    from CB_Staging.dbo.crtDelete
    where ID > @IdControl AND ID <= (@IdControl + @BatchSize)
    order by ID;

    Print @SQLString

    -- Execute the SQL command
    Begin Try
        EXECUTE sp_executesql @SQLString
    END TRY
    BEGIN CATCH
        -- Error handling code
        SELECT
            ERROR_NUMBER() AS ErrorNumber,
            ERROR_SEVERITY() AS ErrorSeverity,
            ERROR_STATE() AS ErrorState,
            ERROR_PROCEDURE() AS ErrorProcedure,
            ERROR_LINE() AS ErrorLine,
            ERROR_MESSAGE() AS ErrorMessage;
        -- You can log the error details or take other actions as needed
    END CATCH

    -- Update the @IdControl for the next batch
    SET @Results = @@ROWCOUNT
    SET @IdControl = @IdControl + @BatchSize
END
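The thread ends without a resolution, so as a hedged postscript: the loop above assigns sqlToExec to a single variable, so at most one statement per batch actually executes, and with @IdControl starting at 1 the "ID > @IdControl" filter skips ID 1. One possible correction (a sketch only; the dbo.errorLog table is hypothetical, and nothing here has been run against the poster's system) is to walk each batch with a cursor and log failures:

```sql
-- Sketch: execute every statement in ID order, in batches, logging failures.
-- dbo.errorLog is a hypothetical logging table, e.g.:
-- CREATE TABLE dbo.errorLog (id INT, errMsg NVARCHAR(4000), loggedAt DATETIME DEFAULT GETDATE());
DECLARE @BatchSize INT = 50000;
DECLARE @IdControl INT = 0;   -- start below the first ID so ID 1 is included
DECLARE @Rows INT = 1;
DECLARE @id INT, @sql NVARCHAR(MAX);

WHILE @Rows > 0
BEGIN
    SET @Rows = 0;

    DECLARE batchCursor CURSOR LOCAL FAST_FORWARD FOR
        SELECT ID, CAST(sqlToExec AS NVARCHAR(MAX))
        FROM CB_Staging.dbo.crtDelete
        WHERE ID > @IdControl AND ID <= @IdControl + @BatchSize
        ORDER BY ID;

    OPEN batchCursor;
    FETCH NEXT FROM batchCursor INTO @id, @sql;

    WHILE @@FETCH_STATUS = 0
    BEGIN
        SET @Rows = @Rows + 1;
        BEGIN TRY
            EXECUTE sp_executesql @sql;
        END TRY
        BEGIN CATCH
            -- Log the failing ID and message, then keep going
            INSERT INTO dbo.errorLog (id, errMsg)
            VALUES (@id, ERROR_MESSAGE());
        END CATCH;
        FETCH NEXT FROM batchCursor INTO @id, @sql;
    END

    CLOSE batchCursor;
    DEALLOCATE batchCursor;

    -- Note: this assumes IDs are reasonably contiguous; an entirely empty
    -- batch range ends the loop.
    SET @IdControl = @IdControl + @BatchSize;
END
```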