February 5, 2010 at 10:03 pm
Hi, I want to know whether there is any limit on the amount of data a single SELECT stored procedure can return. Is it right to query all tables in one stored procedure, or should I use multiple SPs when dealing with a large amount of data? Are there any performance issues? Does it depend on network capability or other factors? Please help with this topic.
February 6, 2010 at 3:10 am
Since your question is vague, my answer will be too: it depends.
Usually, the number of tables or the number of rows affected doesn't really determine whether there should be one procedure or more.
If the proc deals with data on the same server, network issues won't matter.
If you need help with a specific scenario, please describe it in more detail.
February 6, 2010 at 3:33 am
Hi lmu92, everything is on the same server. Suppose I have more than 1000 tables, and each table can have millions of records. I have to show these records in my .NET applications.
What exactly is right: should I use one SP for this or more than one, from a performance point of view?
If I use one SP to do so, will network bandwidth affect it?
February 6, 2010 at 4:21 am
The answer to that question is: It Depends.
You will more than likely need numerous queries to support the various aspects of your application. Some queries will require joining multiple tables within the proc; some will not.
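As a rough illustration only (the table, column, and proc names below are invented, not from your system), each proc would serve one specific need, e.g.:

-- Hypothetical example: one narrow proc serving one screen
CREATE PROCEDURE dbo.GetCustomerOrders
    @CustomerID int
AS
BEGIN
    SET NOCOUNT ON;

    -- Join only the tables this particular query needs
    SELECT c.CustomerName,
           o.OrderID,
           o.OrderDate
    FROM dbo.Customers AS c
    INNER JOIN dbo.Orders AS o
        ON o.CustomerID = c.CustomerID
    WHERE c.CustomerID = @CustomerID
    ORDER BY o.OrderDate DESC;
END;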
Jason...AKA CirqueDeSQLeil
_______________________________________________
I have given a name to my pain...MCM SQL Server, MVP
SQL RNNR
Posting Performance Based Questions - Gail Shaw
Learn Extended Events
February 6, 2010 at 6:57 am
sanjeev.k (2/6/2010)
Hi lmu92, everything is on the same server. Suppose I have more than 1000 tables, and each table can have millions of records. I have to show these records in my .NET applications. What exactly is right: should I use one SP for this or more than one, from a performance point of view?
If I use one SP to do so, will network bandwidth affect it?
I'd question the business case in general!
Exactly right would be NOT to show thousands of tables with millions of rows each!
You should think about what data your application really needs, define some business rules that will define the details and work from there.
Moving that amount of data across the network to transform it with a .NET app seems the wrong way to go to me....
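A sketch of what I mean (table, column, and proc names are made up): filter and aggregate inside the proc so only the result the application actually needs crosses the network.

-- Hypothetical example: aggregate on the server, return a small result
CREATE PROCEDURE dbo.GetOrderSummary
    @CustomerID int,
    @FromDate   date,
    @ToDate     date
AS
BEGIN
    SET NOCOUNT ON;

    SELECT o.OrderDate,
           COUNT(*)      AS OrderCount,
           SUM(o.Amount) AS TotalAmount
    FROM dbo.Orders AS o
    WHERE o.CustomerID = @CustomerID
      AND o.OrderDate >= @FromDate
      AND o.OrderDate <  @ToDate
    GROUP BY o.OrderDate
    ORDER BY o.OrderDate;
END;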
February 6, 2010 at 7:11 am
sanjeev.k (2/6/2010)
Everything is on the same server. Suppose I have more than 1000 tables, and each table can have millions of records. I have to show these records in my .NET applications.
Who is going to read thousands of millions of rows?
Seriously, this is not a SQL Server capacity issue. It's a common sense issue.
While you can use a single stored proc to send thousands of millions of rows to a single client, doing so will likely kill any network in existence and bring just about any client PC to its knees (memory usage), and no user in their right mind will read all of that, even if their machine could handle it.
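If users really do need to browse a large table, page the results on the server so each call returns just one screenful. A rough sketch (object names invented for illustration) using ROW_NUMBER():

-- Hypothetical example: return one page of rows at a time
CREATE PROCEDURE dbo.GetProductsPage
    @PageNumber int,
    @PageSize   int
AS
BEGIN
    SET NOCOUNT ON;

    WITH Numbered AS
    (
        SELECT p.ProductID,
               p.ProductName,
               p.Price,
               ROW_NUMBER() OVER (ORDER BY p.ProductID) AS RowNum
        FROM dbo.Products AS p
    )
    SELECT ProductID, ProductName, Price
    FROM Numbered
    WHERE RowNum BETWEEN (@PageNumber - 1) * @PageSize + 1
                     AND @PageNumber * @PageSize
    ORDER BY RowNum;
END;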
Gail Shaw
Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability
February 6, 2010 at 8:39 am
I believe there might be a bit of a language barrier problem here... I believe the OP is saying that, yes, he has thousands of tables and millions of rows, but he probably only needs to show one set of information, based on criteria, at a time. I don't believe he actually wants to display millions of rows all at once.
--Jeff Moden
Change is inevitable... Change for the better is not.