April 9, 2018 at 7:44 am
saptek9 - Thursday, April 5, 2018 8:57 AM
Grant Fritchey - Thursday, April 5, 2018 6:22 AM
There is absolutely ZERO way to tune a query that is selecting all columns and all rows. You can't do it. What you can do is tune the hardware and the network. By "tune" I mean, of course, buy bigger, faster hardware. Throw lots of money at this problem and you'll fix it.
Otherwise, go back to the business and try to figure out what they really need, because it's very seldom 400 million rows of data.
Is there any way I can show them a screenshot to prove that they need faster hardware or more memory?
They don't need faster hardware or memory.
What they need is some brains.
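If you do want something concrete to show them, a minimal sketch (dbo.mybigtable stands in for the real table name) is to let the DMVs spell out how much data an unbounded select * has to read and push across the wire:

-- Rows and reserved size for one table (pages are 8 KB).
-- Counting rows only from the heap/clustered index avoids double counting.
SELECT
    SUM(CASE WHEN ps.index_id IN (0, 1) THEN ps.row_count ELSE 0 END) AS total_rows,
    SUM(ps.reserved_page_count) * 8 / 1024.0 / 1024.0 AS reserved_gb
FROM sys.dm_db_partition_stats AS ps
WHERE ps.object_id = OBJECT_ID(N'dbo.mybigtable');

For the table in this thread that comes out to roughly 43 GB in one result set, which makes the point better than any screenshot would.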
_____________
Code for TallyGenerator
April 9, 2018 at 8:26 am
Teach them how
select top 10 * from mybigtable
works. I bet someone just wants to look at some data and wonders why it takes so long.
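The difference is easy to demonstrate (dbo.mybigtable is a placeholder name, as above):

-- TOP lets the engine stop after the first 10 rows it finds,
-- so this comes back almost instantly even against 443 million rows.
-- Without an ORDER BY, *which* 10 rows you get is arbitrary.
SELECT TOP (10) *
FROM dbo.mybigtable;

-- The unbounded version must read and transmit every row:
-- SELECT * FROM dbo.mybigtable;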
April 9, 2018 at 11:15 am
Sean Lange - Monday, April 9, 2018 7:28 AM
Jeff Moden - Friday, April 6, 2018 2:09 PM
sgmunson - Friday, April 6, 2018 12:49 PM
Sean Lange - Thursday, April 5, 2018 9:44 AM
saptek9 - Thursday, April 5, 2018 9:01 AM
saptek9 - Wednesday, April 4, 2018 1:58 PM
I have a table with the following stats:
443,114,382 rows,
45,511,728 KB reserved,
680,616 KB index size, and
248 KB unused. One of the reporting apps runs select * directly against this table with no WHERE condition, and it takes too long to execute. It's just a simple select statement. When I talk to them, they insist on querying that way and ask me to find the reason and a fix.
The execution plan shows an index scan since the user is querying select *.
Please advise if you have any suggestions. My max memory size is 64 GB and min memory size is 32 GB. I am asking them to increase the memory to run this kind of query successfully. I saw 4 direct select queries retrieving more than a billion records. But they want it that way. How can I prove that we need more memory or hardware to do so?
Ask them if it is ok to email them a spreadsheet with the data being returned by that query. They will likely freak out and tell you that is entirely too much data for an email. To which your response is "exactly".
A spreadsheet with a BILLION rows?
1.) Won't fit on one tab.
2.) Will need roughly 1,000 tabs, plus or minus a few (see the quick arithmetic after this list).
3.) They can forget about any kind of analysis across the entire data set.
4.) Who's going to spend the time to figure out how to generate those 1,000 different tabs? (Here, maybe you get lucky with bulk copy, but I doubt it...)
5.) What possible value would such a behemoth of a spreadsheet have for anyone, assuming it even manages to avoid the tab-count limit, or for that matter, the Windows file size limit?
6.) Anyone care to try adding a formula to all of those rows, on all tabs, and once you have, anyone wanna press F9 (calc)? After all, with a spreadsheet that size, you DID remember to turn off automatic recalculation, didn't you?
7.) And 6 assumes you can even OPEN the spreadsheet without waiting a few years for it to load, or maybe just crashing Excel or your computer as a whole...
8.) You'd better have at least 64 GB of RAM... and a lot of $$$... and even then you might well be in trouble. Heck, you might even need a server machine just to be capable of holding enough RAM; current desktop and laptop motherboards top out at 64 GB, and only the very high end accommodates that much.
Need I go on? You can reference the limits on Excel specs here: https://support.office.com/en-us/article/excel-specifications-and-limits-1672b34d-7043-467e-8e27-269d656771c3
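The tab count falls straight out of Excel's documented 1,048,576-rows-per-worksheet limit; a quick sanity check (T-SQL used here only as a handy calculator):

-- Sheets needed at 1,048,576 rows per tab:
SELECT CEILING(1000000000.0 / 1048576) AS tabs_for_a_billion,   -- 954
       CEILING(443114382.0 / 1048576)  AS tabs_for_this_table;  -- 423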
Be careful what you ask for, Jeff... someone out there is bound to be foolish enough to try and take you up on it. Maybe you could make this a fun project, just to see if it's even possible... (ha ha, tee hee, tee hee) :hehe:
It wasn't me that made the suggestion... although I thought the suggestion was appropriately humorous. 🙂
It was me that made the suggestion. And all the absurdity of how completely stupid and unmanageable it would be as a spreadsheet was exactly the point.
+443 million... same as the number of rows. 😀
--Jeff Moden
Change is inevitable... Change for the better is not.
April 10, 2018 at 10:46 am
Jeff Moden - Monday, April 9, 2018 11:15 AM
+443 million... same as the number of rows. 😀
Okay, ... 443 Million... so there's maybe four to five hundred tabs instead of 1,000. Pretty sure that's gonna be irrelevant in the long run....
I'm still pretty sure I'd love to see you try it though, Jeff...
ya know... just for gits and shiggles ... :hehe::hehe::hehe:
Steve (aka sgmunson) 🙂 🙂 🙂
Rent Servers for Income (picks and shovels strategy)