January 6, 2020 at 5:41 pm
Hi
I have a business need where a user can execute a proc, but if there are any DML commands in it, those should not be executed (throwing an error is fine with me, or a message can be printed).
I can't create triggers, as I don't know which tables the DML in the proc might hit.
I also can't wrap it in BEGIN TRAN/ROLLBACK, as a huge amount of data is involved and doing the DML and then rolling it back would take a huge amount of time. The equivalent explicit transaction approach is ruled out for the same reason.
Can some trick be done by setting permissions inside the proc, or is there some other way to do it?
thanks
Online Trainer For SQL DBA and Developer @RedBushTechnologies with 18 yrs exp.
January 6, 2020 at 5:47 pm
If the procs don't contain DML, what do they contain?
The absence of evidence is not evidence of absence
- Martin Rees
The absence of consumable DDL, sample data and desired results is, however, evidence of the absence of my response
- Phil Parkin
January 6, 2020 at 6:33 pm
Seems to me that rather than giving them access to this proc, there should be a view created with what they actually need access to.
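Something along these lines, as a rough sketch (the view, table, and user names are just placeholders):
-- Expose only the columns and rows they need, read-only
CREATE VIEW dbo.vReportData
AS
SELECT Id, Amount, CreatedDate
FROM dbo.TargetTable
WHERE IsActive = 1;
GO
GRANT SELECT ON dbo.vReportData TO ReportUser;
DENY INSERT, UPDATE, DELETE ON dbo.TargetTable TO ReportUser;
GO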
January 7, 2020 at 5:56 am
If the procs don't contain DML, what do they contain?
OK, let me elaborate a bit more.
The idea I am working on is to add a debug flag as a parameter to my procs. When the debug flag is 1, the timings of the most important queries are printed and, with the help of IF/ELSE, the DML commands are run as SELECT COUNT(*) instead. If the debug flag is 0, all DML operations are performed and no timings are printed.
The idea is to run a copy of the proc with debug flag = 1 in prod before deployment, so that we are sure of the time it will take when actually deployed. We are doing this because we have burned our hands many times when a proc worked fine in the Performance environment but ran extremely slowly in prod because of index mismatches, differences in data volume, server load, etc.
But we have a risk here: what if the copy of the proc is run with debug = 0 by the person who runs it in prod for the performance check? It would modify the actual data, which would be a disaster. So I wanted to prevent the copy of the proc from doing any DML even though it has DML commands in it.
I hope I made it clear (a rough sketch of the pattern is below). Please ask questions if not.
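A minimal sketch of the debug-flag pattern I have in mind (dbo.Source and dbo.TargetTable are just placeholder names):
CREATE PROC dbo.MyProc_Copy @DebugFlag bit = 1
AS
BEGIN
    DECLARE @Start datetime2 = SYSUTCDATETIME();

    IF @DebugFlag = 1
    BEGIN
        -- Debug run: count the rows the DML would touch instead of changing anything
        SELECT COUNT(*) AS RowsThatWouldBeInserted
        FROM dbo.Source s
        WHERE s.IsReady = 1;

        PRINT 'Insert step elapsed ms: ' + CONVERT(varchar(20), DATEDIFF(ms, @Start, SYSUTCDATETIME()));
    END
    ELSE
    BEGIN
        -- Real run: the actual DML, no timings printed
        INSERT INTO dbo.TargetTable (Id, Amount)
        SELECT s.Id, s.Amount
        FROM dbo.Source s
        WHERE s.IsReady = 1;
    END
END
GO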
Online Trainer For SQL DBA and Developer @RedBushTechnologies with 18 yrs exp.
January 7, 2020 at 7:30 am
Hi S_Kumar_S,
Even if you explicitly DENY INSERT/UPDATE/DELETE on the underlying tables to that user, the procedure, once executed by the same user, will still perform the DML, because ownership chaining skips the table-level permission checks when the proc and the tables share an owner.
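A quick sketch of that behavior (dbo.Demo and TestUser are placeholder names):
CREATE TABLE dbo.Demo (Id int);
GO
CREATE PROC dbo.InsertDemo AS INSERT INTO dbo.Demo (Id) VALUES (1);
GO
DENY INSERT ON dbo.Demo TO TestUser;
GRANT EXECUTE ON dbo.InsertDemo TO TestUser;
GO
EXECUTE AS USER = 'TestUser';
EXEC dbo.InsertDemo;  -- succeeds: the DENY on the table is never checked through the proc
REVERT;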
As an alternative suggestion:
Create a replica of the database in the prod environment and do the testing during off hours. You can also use Profiler or the T-SQL below to capture the CPU time and total elapsed time of the SP.
SET STATISTICS TIME ON
SET STATISTICS IO ON
EXEC (your SP)
SET STATISTICS TIME OFF
SET STATISTICS IO OFF
January 7, 2020 at 10:27 am
For now, I have come up with this, as I couldn't find a nicer way to achieve it:
CREATE PROC MytestProc_V1 @DebugFlag bit = 0
AS
BEGIN
    IF @DebugFlag = 0
    BEGIN
        PRINT 'You are running the proc with @DebugFlag = 0. Pls run it with @DebugFlag = 1';
        RETURN;
    END

    -- Keep doing other DML things
    SELECT 1;
END
GO
Online Trainer For SQL DBA and Developer @RedBushTechnologies with 18 yrs exp.
January 7, 2020 at 5:07 pm
So, what you are saying is you create procedures that are tested in development, and when moved into production, various issues cause them to perform poorly, or cause incorrect results, and so forth?
If so, then you are solving the wrong problem. It sounds as if you are not performing proper QA and testing prior to moving things into production.
Michael L John
If you assassinate a DBA, would you pull a trigger?
To properly post on a forum:
http://www.sqlservercentral.com/articles/61537/
January 13, 2020 at 12:14 pm
Michael L John wrote: So, what you are saying is you create procedures that are tested in development, and when moved into production, various issues cause them to perform poorly, or cause incorrect results, and so forth?
If so, then you are solving the wrong problem. It sounds as if you are not performing proper QA and testing prior to moving things into production.
I would say YES on some points. It has happened that performance was fine in the Performance environment but degraded in PROD. It's always a challenge to keep Perf in sync with PROD. So we wanted a way to actually run the proc in prod without doing any DML there, so that we are convinced it won't perform poorly once deployed to PROD.
We have figured out one more way to do it: do the DML only in a parent proc (we just have one INSERT at the end of the proc) and do all the calculations and temp-table filling in a child proc. We hand over only the child proc for testing, so that DML is never done accidentally. This seems to serve our purpose fine; a rough sketch is below.
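A rough sketch of the split (proc, table, and column names are just placeholders):
-- Child proc: calculations and temp-table work only, no writes to permanent tables
CREATE PROC dbo.LoadSales_Child
AS
BEGIN
    CREATE TABLE #Staged (Id int, Amount decimal(18,2));

    INSERT INTO #Staged (Id, Amount)
    SELECT s.Id, SUM(s.Amount)
    FROM dbo.SalesRaw s
    GROUP BY s.Id;

    -- Return the rows so a test run can see what would be written
    SELECT Id, Amount FROM #Staged;
END
GO

-- Parent proc: the only place where permanent-table DML happens
CREATE PROC dbo.LoadSales_Parent
AS
BEGIN
    CREATE TABLE #ToWrite (Id int, Amount decimal(18,2));

    -- Capture the child's result set, then do the single INSERT here
    INSERT INTO #ToWrite (Id, Amount)
    EXEC dbo.LoadSales_Child;

    INSERT INTO dbo.SalesSummary (Id, Amount)
    SELECT Id, Amount FROM #ToWrite;
END
GO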
Online Trainer For SQL DBA and Developer @RedBushTechnologies with 18 yrs exp.
January 13, 2020 at 12:48 pm
So ... you think that 'calculations and filling temp tables' are not DML?
The absence of evidence is not evidence of absence
- Martin Rees
The absence of consumable DDL, sample data and desired results is, however, evidence of the absence of my response
- Phil Parkin
January 13, 2020 at 4:20 pm
Michael L John wrote:So, what you are saying is you create procedures that are tested in development, and when moved into production, various issues cause them to perform poorly, or cause incorrect results, and so forth?
If so, then you are solving the wrong problem. It sounds as if you are not performing proper QA and testing prior to moving things into production.
I would say YES on some points. It has happened that performance was fine in the Performance environment but degraded in PROD. It's always a challenge to keep Perf in sync with PROD. So we wanted a way to actually run the proc in prod without doing any DML there, so that we are convinced it won't perform poorly once deployed to PROD.
We have figured out one more way to do it: do the DML only in a parent proc (we just have one INSERT at the end of the proc) and do all the calculations and temp-table filling in a child proc. We hand over only the child proc for testing, so that DML is never done accidentally. This seems to serve our purpose fine.
It really sounds as if you are solving the wrong problem. If changes to production are not in sync with changes in lower environments, then your change control process needs to be looked at.
If your performance is fine in the testing environments, but not production, then your testing is flawed.
Michael L John
If you assassinate a DBA, would you pull a trigger?
To properly post on a forum:
http://www.sqlservercentral.com/articles/61537/