SQL Server Memory Problems

  • Hi,

    We are having trouble with SQL Server using memory and then not releasing it back to the OS. In Performance Monitor you can see the memory usage just grow and grow. We are using user-defined functions and dynamically generated SQL queries (sp_executesql).

    Any ideas on what is going wrong?

    Thanks

  • What issues are you actually seeing as a result of SQL Server consuming memory? Do any other applications run on the same server?

    By default, SQL Server is configured to manage memory dynamically: it will use as much of the available memory as it needs, and it holds onto that memory until other applications or the OS need it.

    Quoted from BOL:

    "When an instance of SQL Server starts, it typically acquires 8 to 12 MB of memory to complete the initialization process. After the instance has finished initializing, it acquires no more memory until users connect to it and start generating a workload. The instance then keeps acquiring memory as required to support the workload. As more users connect and run queries, SQL Server acquires the additional memory required to support the demand. The instance will keep acquiring memory until it reaches its memory allocation target, it will not free any memory until it reaches the lower limit of the target.

    To acquire as much memory as possible without generating excess paging I/O, each instance of SQL Server sets a target of acquiring memory until free physical memory on the computer is in the range of 4 MB to 10 MB. This range was chosen because testing has shown that Windows NT and Windows 2000 have minimal memory swapping until the memory allocations equal the available physical memory minus 4 MB. An instance of SQL Server that is processing a heavy workload keeps the free physical memory at the lower end (4 MB) of the range; an instance that is processing a light workload keeps the free memory at the higher end of the range (10 MB).

    An instance of SQL Server will vary its target as the workload changes. As more users connect and generate more work, the instance will tend to acquire more memory to keep the available free memory down at the 4 MB limit. As the workload lightens, the instance will adjust its target towards 10 MB of free space, and will free memory to the operating system. Keeping the amount of free space between 10 MB and 4 MB keeps Windows NT or Windows 2000 from paging excessively, while at the same time allowing SQL Server to have the largest buffer cache possible that will not cause extra swapping."
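
    If you do need SQL Server to leave headroom for other applications, you can cap the buffer pool instead of letting it grow dynamically. A minimal sketch, assuming a 512 MB cap suits your box (the value is in MB, and this setting takes effect without a restart):

    EXEC sp_configure 'show advanced options', 1
    RECONFIGURE
    EXEC sp_configure 'max server memory', 512
    RECONFIGURE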

  • I had a similar problem in one of my projects and found the following solution.

    Thought it might be of help to you.

    That project was similar to yours: it ran a backend process that used to run for hours. The process called hundreds of stored procedures, and each stored procedure was full of dynamic SQL (sp_executesql). There was severe memory consumption, and the memory never got released even after the connection was closed.

    The problem was as follows.

    All the SQL statements were constructed dynamically.

    For example "Select * from table1 where key = key1"

    where key1 changes depending on the current context. SQL Server caches every SQL statement in the procedure cache before executing the query. When a new query needs to be executed, SQL Server checks this cache to see whether the query already exists. If it does, SQL Server will not compile the query; instead it takes the existing query plan from the procedure cache and executes the query.

    Now this works fine for the normal scenario. But think of the above example. When that dynamic query runs in a loop, key1 is different each time and is concatenated into the SQL statement, so each statement is textually different and SQL Server never finds a match in the procedure cache. It therefore stores the new SQL again. This goes on and on until there are thousands of instances of the same SQL that differ only in the key values (you can see this for yourself with the diagnostic query after the list below).

    This has the following effects:

    - Memory consumption increases (due to growth of the procedure cache)

    - The process becomes slower and slower, because the search through the procedure cache takes longer and longer.
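
    You can watch this happen on SQL Server 2000 with a query along these lines against syscacheobjects (illustrative only; use whatever LIKE pattern matches your statements):

    SELECT usecounts, sql
    FROM master.dbo.syscacheobjects
    WHERE sql LIKE 'Select * from table1%'
    ORDER BY usecounts

    With concatenated SQL you will see thousands of rows, each with usecounts = 1. With the parameterized fix below, you should see a single row whose usecounts keeps climbing.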

    So what is the solution?

    The main bug in the dynamic SQL was concatenating the values into the SQL string. This should not be done. Instead, parameterized SQL has to be used, i.e., for the above example, "Select * from table1 where key = @p1". Call this SQL with sp_executesql, supplying the parameter values separately, as in the sketch below. This makes sure that only one instance of the SQL exists in the cache and the plan is reused.
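
    A minimal sketch, assuming the key column is an int and 42 is a sample value:

    DECLARE @sql nvarchar(200)
    -- [key] is bracketed here because KEY is a reserved word in T-SQL
    SET @sql = N'SELECT * FROM table1 WHERE [key] = @p1'
    EXEC sp_executesql @sql, N'@p1 int', @p1 = 42

    However many different values are supplied for @p1, the statement text never changes, so the cached plan is reused.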

    Hope this helps

  • SQL Server will stabilize at some point, but it usually expects and likes to be the only app on the server. Is this a shared server? Other applications running?

    How much memory on the server?

    Steve Jones

    sjones@sqlservercentral.com

    http://www.sqlservercentral.com/columnists/sjones

    http://www.dkranch.net

  • Sometimes it might be due to long-running queries that lock resources; these are released only on commit. If it's SQL 2000, see if you can set the /3GB switch in the boot.ini file (an illustrative entry is shown below). Note that Standard Edition cannot take advantage of it.
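
    For illustration only (the ARC path and description vary from machine to machine), the switch goes at the end of the operating system line in boot.ini:

    multi(0)disk(0)rdisk(0)partition(1)\WINNT="Microsoft Windows 2000 Advanced Server" /fastdetect /3GB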

    Good luck.

  • Have you seen any specific issues?

    SQL Server will eat up as much memory as possible, depending on the queries and how often they are executed; it keeps data and execution plans in memory to speed up access and execution. As the server reaches its target it will start making decisions to release data, but not memory. If the OS determines that an application, or the OS itself, needs memory, it will force SQL Server to release memory as needed (as long as there are no outstanding issues).

    I have a server with 1GB of memory, and SQL stabilizes at around 828MB.

    I would not be concerned with memory usage unless you start seeing major issues with delays, or the server hangs for long periods. If you are suffering these problems, then look at the buffer cache hit ratio and the average disk queue length (a sample check is below), and use Profiler to catch poor queries that need to be rewritten. Otherwise don't sweat it.
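
    If you would rather check the buffer cache hit ratio from Query Analyzer than from Performance Monitor, something like this works on SQL Server 2000 (the actual ratio is the counter value divided by its base, times 100):

    SELECT object_name, counter_name, cntr_value
    FROM master.dbo.sysperfinfo
    WHERE counter_name IN ('Buffer cache hit ratio', 'Buffer cache hit ratio base')

    As a common rule of thumb, a sustained ratio above roughly 90% is considered healthy on an OLTP box.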

  • Rajesh,

    Great post. Whether it will solve the problem or not remains to be seen, but adding the parameter as you noted will give you quite a boost. One query plan generated vs hundreds or thousands.

    Andy

    http://www.sqlservercentral.com/columnists/awarren/

  • I had posted the same problem, recently, under SQL Server - General, as shown below:

    "Has anybody come across this curious problem:

    I am running SQL Server 2000 on various P3 and Celeron based PCs without any problems, except on one PC; this PC is an IBM NetVista P4 1.5GHz with 512MB RAM on Windows 2000 (with no SP's for either Win2K or SQL Server 2K).

    Whenever I run a query using QA, the memory used up is not released after exiting. I need to stop and restart the SQL Server service via the Service Manager in order to release the memory (which I monitor through the Windows Task Manager)."

    Thanks to Allen_Cui, who helped me. The rest of the responses in this topic have also helped. However, I am running concatenated SQL queries from within VB, and the program typically runs for 1-3 days. The program usually crashes (before the run completes) with an ODBC timeout.

    What happens is that SQL Server just hangs. This is what led me to find out about the memory not being released. I don't know how this can be solved from VB, using Rajesh's proposal. Any ideas?

    Edited by - mahgoub on 04/21/2003 03:45:13 AM

  • This is with reference to mahgoub's first query regarding releasing memory when using QA.

    There is an option to clear the unreleased memory without restarting SQL Server:

    DBCC FREEPROCCACHE

    DBCC DROPCLEANBUFFERS

    The first one releases all the memory in the procedure cache that is not currently in use, including the plans for ad hoc SQL.

    The second one releases the memory used for clean (unmodified) data pages in the buffer cache.

    This helped me when I had the problem with dynamic SQL.

  • This is regarding mahgoub's second query

    <<However, I am running concatenated SQL queries from within VB and the program typically runs for 1-3 days>>

    I am not fully clear on this, so let me give one example. Suppose there is an employee table with key empid, containing 10000 rows.

    In VB you write code to process each employee record

    Recordset.Open "Select * from employee where empid = " & lngEmpId

    If you are doing this in a loop where lngEmpId keeps changing, then this is certainly an issue.

    Changing the above to use an ADO Command object with a parameterized query boosts performance; see the sketch below for what the server receives.

    Note that if the loop executes 10000 times, the first approach leaves 10000 plans in the cache, while the second leaves only one.
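
    For illustration, assuming empid is an int and 42 is a sample value, a parameterized ADO Command arrives at the server as something like:

    EXEC sp_executesql N'SELECT * FROM employee WHERE empid = @empid', N'@empid int', @empid = 42

    Because the statement text is identical on every iteration, SQL Server finds the match in the cache and reuses the single plan.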

    Hope this helps

  • Thanks Rajesh for your very valuable help

  • mahgoub, you are most welcome. I am glad it helped

  • For this reason I think it's best that SQL Server should be the only thing running on a production box. As a DBA you then have a better grasp on resources like memory.
