May 13, 2008 at 3:10 am
Hi,
I would like to know if there is a way to execute stored procedures every second. I have tried with jobs, but the minimum schedule interval is one minute. So the only solution I have found is to execute the procedure once and control the repetition inside it with a delay.
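Something like this (dbo.LoadCsv is just a placeholder name for my procedure):

WHILE 1 = 1
BEGIN
    EXEC dbo.LoadCsv;          -- placeholder for the bulk-load procedure
    WAITFOR DELAY '00:00:01';  -- pause one second between runs
END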
But doing that, I do not know how much time is needed. Do you know if there is any function in SQL Server 2005 to find out how many milliseconds a stored procedure needs when a job executes it?
In my case I have a stored procedure that bulk loads a csv file into a table every second. Executing it manually, I can see how many milliseconds it takes (using a function inside the stored procedure). But when a job executes it, I can only see when it starts and when it finishes, so I do not know how much time the job needs to execute the stored procedure.
Thank you in advance:P
May 13, 2008 at 6:12 am
There's no way to know in advance how long a query will take. You can look at an execution plan for the estimated cost, but because of I/O contention, CPU, memory swapping, blocking, locking... a query can run in 5ms one time and 500 seconds the very next time. You just can't know in advance which one you're going to get.
"The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
- Theodore Roosevelt
Author of:
SQL Server Execution Plans
SQL Server Query Performance Tuning
May 13, 2008 at 6:31 am
The stored procedure that does the bulk load takes 120-500 milliseconds. But since I am doing the bulk load every second, that is 12%-50% of each second, which is too much, because at the same time there are other queries running against the database. So I would like to know if there is a function in SQL Server to measure that.
I know that I cannot measure the time needed in advance, because it is not always the same. I just want to know how much time a job needs to execute a stored procedure.
Thanks
May 13, 2008 at 6:39 am
You could use Profiler to capture the actual execution time.
"The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
- Theodore Roosevelt
Author of:
SQL Server Execution Plans
SQL Server Query Performance Tuning
May 13, 2008 at 2:49 pm
If you want the exact actual execution time of a stored procedure, to calculate its average, minimum, and maximum execution times, I can also suggest using the sys.dm_exec_query_stats system view. This is an indirect measurement, because it reads data from the plan cache. It has columns such as min_elapsed_time, max_elapsed_time, total_elapsed_time, and execution_count.
In order to filter for a specific SQL statement, join (the official term is CROSS APPLY) with the function sys.dm_exec_sql_text.
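For example, something like this (the LIKE filter is just a placeholder for your procedure's name; the elapsed-time columns are reported in microseconds):

SELECT SUBSTRING(st.text, 1, 100)                 AS statement_start,
       qs.execution_count,
       qs.min_elapsed_time,
       qs.max_elapsed_time,
       qs.total_elapsed_time / qs.execution_count AS avg_elapsed_time
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
WHERE st.text LIKE '%LoadCsv%';  -- placeholder procedure name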
May 13, 2008 at 2:58 pm
You can get a relatively accurate run time by setting a variable to GETDATE() before you execute, and another to GETDATE() after you execute, and comparing the difference. Would that do what you need?
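Something like this (dbo.LoadCsv is a placeholder; note that DATETIME is only accurate to about 3.33 ms):

DECLARE @start DATETIME, @end DATETIME;
SET @start = GETDATE();
EXEC dbo.LoadCsv;  -- the procedure being timed
SET @end = GETDATE();
SELECT DATEDIFF(ms, @start, @end) AS ElapsedMs;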
- Gus "GSquared", RSVP, OODA, MAP, NMVP, FAQ, SAT, SQL, DNA, RNA, UOI, IOU, AM, PM, AD, BC, BCE, USA, UN, CF, ROFL, LOL, ETC
Property of The Thread
"Nobody knows the age of the human race, but everyone agrees it's old enough to know better." - Anon
May 13, 2008 at 9:57 pm
etxeb (5/13/2008)
Hi, I would like to know if there is a way to execute stored procedures every second. I have tried with jobs, but the minimum schedule interval is one minute. So the only solution I have found is to execute the procedure once and control the repetition inside it with a delay.
But doing that, I do not know how much time is needed. Do you know if there is any function in SQL Server 2005 to find out how many milliseconds a stored procedure needs when a job executes it?
In my case I have a stored procedure that bulk loads a csv file into a table every second. Executing it manually, I can see how many milliseconds it takes (using a function inside the stored procedure). But when a job executes it, I can only see when it starts and when it finishes, so I do not know how much time the job needs to execute the stored procedure.
Thank you in advance:P
If the existing scheduler does not help, you can create your own app which does a similar thing.
Using VS, create a Windows Service application (File->New->VB->Windows->Windows Service).
Google how to connect to the DB and execute a SQL command, something like below:
' Requires Imports System.Data and Imports System.Data.SqlClient
Dim objCon As New SqlConnection(sConnection)   ' sConnection = your connection string
Dim objCmd As New SqlCommand(sSPName, objCon)  ' sSPName = stored procedure name
objCmd.CommandType = CommandType.StoredProcedure
objCon.Open()
objCmd.ExecuteNonQuery()                       ' run the procedure (no result set needed)
objCon.Close()
Then record the start time, keep checking the current time, and run the SP each time a second has passed.
If an execution takes longer than a second, you may have to skip the interval consumed by that execution.
Otherwise, you can create a thread, place the SP execution code in it, and fire it from the main scheduler; see the sketch below.
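In practice a System.Timers.Timer handles the once-a-second firing and the threading for you; a minimal sketch (OnTick and the SP-execution code are placeholders):

Dim timer As New System.Timers.Timer(1000)  ' fire every 1000 ms
AddHandler timer.Elapsed, AddressOf OnTick
timer.Start()

Private Sub OnTick(ByVal sender As Object, ByVal e As System.Timers.ElapsedEventArgs)
    ' Execute the SP here (see the code above). Elapsed runs on a
    ' thread-pool thread, so a slow run does not block the next tick.
End Sub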
Then compile the code and register it with Windows as a service. Again, google how to register a Windows service.
The stuff above is outside the scope of this forum, and of the MSSQL 2005 area in general.
Good luck
M.Sc.IT, M.B.A, MCTS BI 2008, MCITP BI 2008, MCTS SQL Dev, CSM, CDVDM
May 14, 2008 at 4:34 am
Hello,
My purpose is to read some csv files from an application. To do that, I thought the best way was to bulk load the csv into the database that is used by the same application. Maybe it is not the best option.
So, as the csv file can change at any time, I must bulk load it each second (using delays and whiles). That bulk load is written in an SP that is executed by a JOB.
I measure the time inside the SP using this:
select getdate() - @start as ElapsedTime
With that we can only measure the time inside the SP. But my question was whether there is an application or something that measures, very precisely, the milliseconds a JOB needs to execute an SP.
I have been learning a little about Profiler, but I do not know how to use it yet; I did not know that there was a tool like this. I must apply some filters to see the useful information.
Sorry for my written English, I am still learning it.
May 14, 2008 at 5:34 am
It sounds like you may be on the right track. Just remember that Profiler is only going to show you what happened, not what could happen.
"The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
- Theodore Roosevelt
Author of:
SQL Server Execution Plans
SQL Server Query Performance Tuning
May 14, 2008 at 6:49 am
I know that we cannot know what is going to happen, because a database is non-deterministic. So, as you have told me, a query can take between 5 and 500 milliseconds. :)
May 14, 2008 at 7:34 am
Not 100% sure I follow:
1. You have an application that drops csv files to a specific folder on an ad-hoc basis
2. You need to read those csv's as soon as possible after they are created
Correct?
If the application puts out a different csv each time it writes, why not use SSIS to loop over the files in that location and read them in?
May 14, 2008 at 9:10 am
mtassin (5/14/2008)
Not 100% sure I follow:
1. You have an application that drops csv files to a specific folder on an ad-hoc basis
2. You need to read those csv's as soon as possible after they are created
Correct?
If the application puts out a different csv each time it writes, why not use SSIS to loop over the files in that location and read them in?
I do not understand the meaning of "ad-hoc basis". It is a local computer.
But the fact is that every day a csv file is created (only one per day), and the number of lines in that csv file keeps increasing during the day. So the problem is that I cannot know when the csv file is going to grow.
That is why I thought the best option was to bulk load the csv file each second, because the information in the csv file is very important and I cannot wait. Moreover, I cannot use xp_filesize, because the owner of the database does not allow me to install any DLL.
😛
May 14, 2008 at 9:24 am
etxeb (5/14/2008)
mtassin (5/14/2008)
Not 100% sure I follow:
1. You have an application that drops csv files to a specific folder on an ad-hoc basis
2. You need to read those csv's as soon as possible after they are created
Correct?
If the application puts out a different csv each time it writes, why not use SSIS to loop over the files in that location and read them in?
I do not understand the meaning of "ad-hoc basis". It is a local computer.
But the fact is that every day a csv file is created (only one per day), and the number of lines in that csv file keeps increasing during the day. So the problem is that I cannot know when the csv file is going to grow.
That is why I thought the best option was to bulk load the csv file each second, because the information in the csv file is very important and I cannot wait. Moreover, I cannot use xp_filesize, because the owner of the database does not allow me to install any DLL.
😛
If the CSV shows up once per day/gets modified just once in a while, running something continuously in the hopes of catching the mod time is, well - overkill. Something firing every second to maybe catch something once a day is a god-awful waste of resources. That's worse than the old memory-resident programs that would take over your DOS-based machines.....
I'd consider taking it on from the other side. Set up a .NET application which uses a FileSystemWatcher event (watching that particular CSV, or the folder it's in). When the file is copied in/modified, etc., it fires the FSW event, which triggers your bulk insert/whatever. Damn near NO resource utilization except when the file is actually modified.
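A minimal VB.NET sketch (the folder, file name, and RunBulkLoad routine are placeholders; RunBulkLoad would execute the SP via SqlCommand, as in the service snippet earlier in the thread):

Imports System.IO

' Watch the daily CSV and fire on every write to it
Dim watcher As New FileSystemWatcher("C:\data", "daily.csv")
watcher.NotifyFilter = NotifyFilters.LastWrite Or NotifyFilters.Size
AddHandler watcher.Changed, AddressOf OnCsvChanged
watcher.EnableRaisingEvents = True

Private Sub OnCsvChanged(ByVal sender As Object, ByVal e As FileSystemEventArgs)
    RunBulkLoad()  ' placeholder: run the bulk-insert stored procedure
End Sub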
----------------------------------------------------------------------------------
Your lack of planning does not constitute an emergency on my part...unless you're my manager...or a director and above...or a really loud-spoken end-user..All right - what was my emergency again?
May 14, 2008 at 9:52 am
Matt Miller (5/14/2008)
If the CSV shows up once per day/gets modified just once in a while, running something continuously in the hopes of catching the mod time is, well - overkill. Something firing every second to maybe catch something once a day is a god-awful waste of resources. That's worse than the old memory-resident programs that would take over your DOS-based machines.....
I'd consider taking it on from the other side. Set up a .NET application which uses a FileSystemWatcher event (watching that particular CSV, or the folder it's in). When the file is copied in/modified, etc., it fires the FSW event, which triggers your bulk insert/whatever. Damn near NO resource utilization except when the file is actually modified.
It does not pick up just one line per day; it can pick up lots of lines each second, but sometimes there are minutes during which there is no change to the csv file. I know that it can be a waste of resources.
And I think that your idea is better than mine. Tell me more about those triggers. It is interesting!!
I have understood that I have to detect when the file is modified and execute the stored procedure at that moment. It could be the best solution for my case :w00t: thanks
May 14, 2008 at 10:00 am
etxeb (5/14/2008)
Hello, my purpose is to read some csv files from an application. To do that, I thought the best way was to bulk load the csv into the database that is used by the same application. Maybe it is not the best option.
So, as the csv file can change at any time, I must bulk load it each second (using delays and whiles). That bulk load is written in an SP that is executed by a JOB.
I measure the time inside the SP using this:
select getdate() - @start as ElapsedTime
With that we can only measure the time inside the SP. But my question was whether there is an application or something that measures, very precisely, the milliseconds a JOB needs to execute an SP.
I have been learning a little about Profiler, but I do not know how to use it yet; I did not know that there was a tool like this. I must apply some filters to see the useful information.
Sorry for my written English, I am still learning it.
Instead of constantly re-importing the file into the database, have you considered querying it through OpenRowset or a linked server connection? That might be considerably better, and definitely easier to set up.
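A minimal sketch using the Jet text driver (the folder and file name are placeholders, and the server must have 'Ad Hoc Distributed Queries' enabled):

SELECT *
FROM OPENROWSET('Microsoft.Jet.OLEDB.4.0',
                'Text;Database=C:\data\;HDR=Yes',
                'SELECT * FROM daily.csv');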
- Gus "GSquared", RSVP, OODA, MAP, NMVP, FAQ, SAT, SQL, DNA, RNA, UOI, IOU, AM, PM, AD, BC, BCE, USA, UN, CF, ROFL, LOL, ETC
Property of The Thread
"Nobody knows the age of the human race, but everyone agrees it's old enough to know better." - Anon