August 25, 2010 at 10:56 pm
Hi,
I am trying to run a SELECT query that joins three tables, one of which has 41 million rows while the other two are in the hundreds of thousands. The query uses the primary keys defined on all the tables, but it also builds a concatenated string based on a CASE statement. After about 16 million rows I keep getting the above error. The server is quite powerful: 32 GB RAM, running Windows Server 2008 and SQL Server 2008 64-bit. Can anyone suggest how to overcome this? There was one other user accessing the server, but I doubt that matters on such powerful hardware. I even tried to use a query hint to make the optimizer choose the correct index, but...
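To give a rough idea of the shape of the query, here is a minimal sketch; the table and column names are made up, not my real schema:
[font="Courier New"]
-- Hypothetical tables: BigTable (~41 million rows), Lookup1 and Lookup2 (hundreds of thousands of rows each),
-- joined on their primary keys, with a string concatenated from a CASE expression.
SELECT  b.BigTableID,
        l1.Code,
        CASE b.StatusFlag
            WHEN 'A' THEN 'Active-'   + l2.RegionName
            WHEN 'I' THEN 'Inactive-' + l2.RegionName
            ELSE 'Unknown'
        END AS DerivedLabel
FROM    dbo.BigTable AS b
JOIN    dbo.Lookup1  AS l1 ON l1.Lookup1ID = b.Lookup1ID
JOIN    dbo.Lookup2  AS l2 ON l2.Lookup2ID = b.Lookup2ID;
[/font]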
Any help is appreciated.
Thanks
Venky
August 25, 2010 at 11:21 pm
System.OutOfMemoryException is a .NET error message. Are you using a CLR procedure or function in your query? How are you running the query?
August 26, 2010 at 12:18 pm
Hello Mr. Denny,
Thanks for your response. To answer your question: no, I am not using a CLR procedure, a function, or a T-SQL stored procedure. I just ran the query as a batch in a query window, and this is the exact error I get:
[font="Courier New"]An error occurred while executing batch. Error message is: Exception of type 'System.OutOfMemoryException' was thrown. [/font]
Does this help?
Thanks again for your response.
Regards,
Venky
August 26, 2010 at 12:34 pm
Sounds like a client issue -- like your client doesn't have enough memory to handle the result set that is returned from the server.
August 26, 2010 at 12:53 pm
That's a problem with SQL Server Management Studio running out of memory. How much RAM is on the machine you are running Management Studio on? How much data is coming back from the query you are running?
August 26, 2010 at 1:15 pm
Considering Jeff's suggestion, I ran the query on the server itself, but with the same result. As Denny points out, it is a problem with SSMS (which is built on .NET :-(, which would explain why the error message is a .NET exception).
I even tried to run the same query through sqlcmd, but the results came back with nothing but the header row. I am flummoxed! BTW, the query is supposed to return about 42 million rows with just three columns.
The server has 32 GB RAM and 8 CPUs, and SQL Server has been configured to use as much memory as possible, yet the error persists. I also searched other sites, one of which had some pointers from a Microsoft expert. He suggested testing with TOP n rows; I tried 10,000 and it runs perfectly fine and very fast, but the full query stops at around 33 million rows. Sigh!
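In case it is relevant, the TOP test was just the same query with a row cap, something like this (same made-up names as in the sketch above):
[font="Courier New"]
-- Caps the result set so SSMS only has to buffer 10,000 rows.
SELECT TOP (10000) b.BigTableID, l1.Code, l2.RegionName
FROM   dbo.BigTable AS b
JOIN   dbo.Lookup1  AS l1 ON l1.Lookup1ID = b.Lookup1ID
JOIN   dbo.Lookup2  AS l2 ON l2.Lookup2ID = b.Lookup2ID;
[/font]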
Should you have any other ideas I can explore (for example, would running the query as a stored procedure make a difference? The SP could write the results into a table), please feel free to suggest them.
Thanks again for your support.
Venky
August 26, 2010 at 1:18 pm
By "client" I did mean client app, so I was saying the same thing about SSMS. Why do you need 16 million plus rows to appear on your client? Wouldn't outputting to a file be sufficient? Client apps aren't designed to hold that much data.
August 26, 2010 at 1:28 pm
Venky Subramaniam (8/26/2010)
The server has 32 GB RAM, 8 CPUs, and SQL Server has been configured to utilize as much as possible
That's most likely your issue. There is not enough room outside of the cache to store all the rows. You need to leave some space for the OS to breathe. Also keep in mind that having a logged-in session plus SSMS open requires a few hundred megs before you even run a query.
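If "as much as possible" means max server memory was left at or near the default, capping it so the OS and local tools have some headroom is worth a look. A sketch of the change (the 28 GB value is only an illustration for a 32 GB box, not a recommendation):
[font="Courier New"]
-- Cap SQL Server's memory so the OS and local tools keep some RAM for themselves.
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'max server memory (MB)', 28672;  -- ~28 GB on a 32 GB server (illustrative value)
RECONFIGURE;
[/font]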
August 26, 2010 at 3:19 pm
Thank you all for your prompt responses. Despite the initial failure of the process, I could at least gauge how much data SSMS can hold for display before it bursts at the seams 🙂
BTW, I got my work done through a simple transfer to a table.
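For anyone who hits the same wall, something along the lines of a SELECT ... INTO does the job (again with made-up names):
[font="Courier New"]
-- Materialize the three-column result set on the server instead of pulling 42 million rows into SSMS.
SELECT  b.BigTableID,
        l1.Code,
        CASE b.StatusFlag WHEN 'A' THEN 'Active-' + l2.RegionName ELSE 'Other' END AS DerivedLabel
INTO    dbo.QueryResults
FROM    dbo.BigTable AS b
JOIN    dbo.Lookup1  AS l1 ON l1.Lookup1ID = b.Lookup1ID
JOIN    dbo.Lookup2  AS l2 ON l2.Lookup2ID = b.Lookup2ID;
[/font]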
Thanks again,
Venky