September 8, 2009 at 11:34 am
Good morning/afternoon everyone
I'm new to the community, and pretty new to MS SQL, not having used it in several years and having been immersed in the LAMP environment for far too long.
My situation is this: I have a lot of SQL queries that each return one row/one column (pass, fail, or some other message). I need to run all these queries against my MS SQL 2008 environment and populate a table with their results.
Is there a method (say, through a table that lists the queries to be run, with the filename of each query) to run these queries and have their results inserted?
A simple point in the right direction should be more than enough and I can figure it out from there if this is possible.
Thanks in advance and if I'm asking this in the wrong place or in the wrong format please feel free to slam me however needed to get the point across 🙂
Thanks!
September 8, 2009 at 11:48 am
Before I answer I want to ask you a question of my own. Why do they have to exist as files? Do they change a lot? Why wouldn't you roll them up into a stored procedure? It seems really inefficient to have to call out to SQLCMD to execute SQL files in the file system.
Without further information from you I strongly question the methodology you are considering, and think it might be your inexperience that is leading you to look at it this way.
Now, the answer to your question. Yes, you could have a table with the filenames and, using xp_cmdshell, call out to SQLCMD (since this is a SQL 2008 topic) to execute the files one by one. Those queries should contain the logic to insert their records into the table themselves.
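Something like this minimal sketch is what I have in mind. All the table, column, and server names here are made up for illustration, and xp_cmdshell has to be enabled via sp_configure before any of it will run:

```sql
-- Hypothetical driver table of script filenames, executed one by one
-- via xp_cmdshell + SQLCMD.
CREATE TABLE dbo.QueryFiles
(
    QueryFileID int IDENTITY(1,1) PRIMARY KEY,
    FileName    nvarchar(260) NOT NULL   -- full path to the .sql file
);

DECLARE @file nvarchar(260), @cmd nvarchar(4000);

DECLARE file_cursor CURSOR LOCAL FAST_FORWARD FOR
    SELECT FileName FROM dbo.QueryFiles ORDER BY QueryFileID;

OPEN file_cursor;
FETCH NEXT FROM file_cursor INTO @file;
WHILE @@FETCH_STATUS = 0
BEGIN
    -- Each script is expected to INSERT its own pass/fail row
    -- into the results table.
    SET @cmd = N'sqlcmd -S MyServer -E -i "' + @file + N'"';
    EXEC xp_cmdshell @cmd;
    FETCH NEXT FROM file_cursor INTO @file;
END
CLOSE file_cursor;
DEALLOCATE file_cursor;
```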
No slams, just constructive questions.
CEWII
September 8, 2009 at 11:56 am
Elliott W (9/8/2009)
Before I answer I want to ask you a question of my own. Why do they have to exist as files? Do they change a lot? Why wouldn't you roll them up into a stored procedure? It seems really inefficient to have to call out to SQLCMD to execute SQL files in the file system.
Without further information from you I strongly question the methodology you are considering, and think it might be your inexperience that is leading you to look at it this way.
Now, the answer to your question. Yes, you could have a table with the filenames and, using xp_cmdshell, call out to SQLCMD (since this is a SQL 2008 topic) to execute the files one by one. Those queries should contain the logic to insert their records into the table themselves.
No slams, just constructive questions.
CEWII
Thanks for your reply, Elliott. Looking at xp_cmdshell, it looks like it will do what I was after. But your questions about the methodology I'm considering actually make me hesitant to go down this path now; I'd hate to start something the wrong way when there are better ways to do it.
The reason for the files is that they are changed frequently and are contributed by a lot of separate individuals, not just myself, so they all need a way to easily add their queries to the queue that is run. There will always be new ones, some removed, and some changed.
September 8, 2009 at 12:02 pm
Do you have any change control methodology? Who decides these queries are ok?
From my perspective, what you describe sounds a lot like chaos.
Why are they changing so much?
What is the purpose of this process?
What is the problem you are trying to solve?
It could be that I don't understand what you are really after, but as a DBA I could never accept what you are describing in a production environment.
CEWII
September 8, 2009 at 12:24 pm
Elliott has some really valid questions here. It's one thing to just build something that works, and another to build out a production process that is best for system integrity, maintenance, and performance.
September 8, 2009 at 1:28 pm
Just to throw another wrench in the mix: you will also have to work out the order the scripts are processed in, either by a standard naming convention or by using the last-modified/create date. If your environment is anything like ours, you will have scripts that have to be run after other scripts complete.
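One way to make that order explicit, rather than relying on file names or dates, is a sequencing column in the driver table. A sketch only; the table and column names are illustrative:

```sql
-- Hypothetical: an explicit run-order column controls execution sequence.
CREATE TABLE dbo.ScriptQueue
(
    ScriptID int IDENTITY(1,1) PRIMARY KEY,
    FileName nvarchar(260) NOT NULL,
    RunOrder int NOT NULL              -- lower numbers run first
);

-- The driver loop would read the queue in this order.
SELECT FileName
FROM dbo.ScriptQueue
ORDER BY RunOrder, ScriptID;
```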
September 8, 2009 at 2:38 pm
Matt Wilhoite (9/8/2009)
Just to throw another wrench in the mix: you will also have to work out the order the scripts are processed in, either by a standard naming convention or by using the last-modified/create date. If your environment is anything like ours, you will have scripts that have to be run after other scripts complete.
Also true; inter-script dependencies and ordering might be important.
I'm hoping he responds soon. I'd really like to understand what he is actually trying to accomplish.
CEWII
September 8, 2009 at 3:20 pm
Elliott W (9/8/2009)
Do you have any change control methodology? Who decides these queries are ok?
From my perspective, what you describe sounds a lot like chaos.
Why are they changing so much?
What is the purpose of this process?
What is the problem you are trying to solve?
It could be that I don't understand what you are really after, but as a DBA I could never accept what you are describing in a production environment.
CEWII
Sorry for the delay in response. I should have mentioned that this is not for production but for a staging environment.
The purpose of the process is to have a harness into which QA can load all the SQL queries written for a release (each one reviewed and approved) that just do data/change validation for ETL processes. Each query would return its status (pass, or a fail message).
The problem right now is that validation is a manual process where each query is run by hand, one at a time.
In the past, in my non-MS SQL days, I'd have just written a harness in Perl (or any other language) that made read-only connections to the database and logged everything separately, but it would be nice if this could be self-contained within MS SQL itself (as it is only running pre-written, pre-approved queries).
Edit: Looking at xp_cmdshell again, it looks like it won't actually do what I need, and I might have to do it the old-fashioned way unless all the queries are just wrapped up together, since these queries would never be placed on the actual SQL server itself. They just need to be executed and the results logged.
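For the self-contained route, one sketch would be to store the approved query text itself in a table and loop over it with sp_executesql, logging each result. This assumes each query is a single SELECT returning one row/one column (no trailing semicolons or batches); all object names below are made up:

```sql
-- Hypothetical harness: approved validation queries kept in a table and
-- executed entirely inside SQL Server, with each result logged.
CREATE TABLE dbo.ValidationQuery
(
    QueryID   int IDENTITY(1,1) PRIMARY KEY,
    QueryText nvarchar(max) NOT NULL   -- must return a single pass/fail value
);

CREATE TABLE dbo.ValidationResult
(
    QueryID int NOT NULL,
    RunDate datetime NOT NULL DEFAULT GETDATE(),
    Result  nvarchar(4000) NULL
);

DECLARE @id int, @sql nvarchar(max), @result nvarchar(4000);

DECLARE q CURSOR LOCAL FAST_FORWARD FOR
    SELECT QueryID, QueryText FROM dbo.ValidationQuery ORDER BY QueryID;

OPEN q;
FETCH NEXT FROM q INTO @id, @sql;
WHILE @@FETCH_STATUS = 0
BEGIN
    -- Wrap the query as a scalar subquery so its one value lands in @result.
    SET @sql = N'SELECT @r = (' + @sql + N')';
    BEGIN TRY
        EXEC sp_executesql @sql, N'@r nvarchar(4000) OUTPUT', @r = @result OUTPUT;
    END TRY
    BEGIN CATCH
        SET @result = N'ERROR: ' + ERROR_MESSAGE();
    END CATCH;
    INSERT dbo.ValidationResult (QueryID, Result) VALUES (@id, @result);
    FETCH NEXT FROM q INTO @id, @sql;
END
CLOSE q;
DEALLOCATE q;
```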