October 18, 2016 at 9:07 pm
Comments posted to this topic are about the item One Million
October 19, 2016 at 12:04 am
What impresses me is when Teradata was formed, what they set out to do, and when they achieved it: 1992!
I was installing 20 MB Western Digital file cards into PCs and we couldn't imagine ever filling one of those! We still backed up to 1.44 MB floppy disks. You probably booted from a floppy disk if you didn't have a hard drive, and both Word and Multiplan (the forerunner to Excel) would fit on a single floppy!
Teradata remains deeply impressive, but they don't seem to shout about it.
October 19, 2016 at 7:22 am
1 million is a pretty good number. When we look at Sentry, we do see write bytes in around that range on our environment.
On another note, will you be at PASS this year?
-Roy
October 19, 2016 at 8:32 am
Roy Ernest (10/19/2016)
1 million is a pretty good number. When we look at Sentry, we do see write bytes in around that range on our environment. On another note, will you be at PASS this year?
Yes, see you there, Roy. Looking forward to it.
Is your 1 million sustained or burst? And is it 1 million write bytes/sec, or 1 MB of writes/sec?
October 20, 2016 at 7:45 am
One million seems a bit of an arbitrary number. In some cases it's very impressive, but in others not at all.
Gaz
-- Stop your grinnin' and drop your linen...they're everywhere!!!
October 20, 2016 at 8:14 am
I have a demo, which I gave at a recent precon (my first precon... YAY!), that processed all 7 million rows in a table (each row about 1,040 bytes wide) in 4 seconds. It included the creation of 5 different aggregates based on 5 different temporal ranges, compared 1 of those against the other 4 to come up with "percent of period" calculations, and then ranked the 4 resulting percentages... on my laptop! It's nothing special, either... just a little ol' i5-based system with 6 GB of RAM and no SSDs.
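Roughly, the shape of that calculation looks something like this (the table and column names here are made up for illustration... the real demo uses a different schema):

-- Hypothetical sketch of the "percent of period" pattern described above.
-- dbo.Sales, SaleDate, and Amount are invented names, not the demo's schema.
WITH Totals AS
(
    SELECT  Last1Mo  = SUM(CASE WHEN SaleDate >= DATEADD(mm, -1, GETDATE()) THEN Amount END)
           ,Last3Mo  = SUM(CASE WHEN SaleDate >= DATEADD(mm, -3, GETDATE()) THEN Amount END)
           ,Last6Mo  = SUM(CASE WHEN SaleDate >= DATEADD(mm, -6, GETDATE()) THEN Amount END)
           ,Last12Mo = SUM(CASE WHEN SaleDate >= DATEADD(mm,-12, GETDATE()) THEN Amount END)
           ,AllTime  = SUM(Amount) -- the 5th aggregate; the baseline the other 4 are compared to
      FROM dbo.Sales
)
SELECT  v.Period
       ,PctOfPeriod = 100.0 * v.Total / NULLIF(t.AllTime, 0)
       ,PeriodRank  = RANK() OVER (ORDER BY 100.0 * v.Total / NULLIF(t.AllTime, 0) DESC)
  FROM Totals t
 CROSS APPLY (VALUES ('Last1Mo', t.Last1Mo), ('Last3Mo', t.Last3Mo)
                    ,('Last6Mo', t.Last6Mo), ('Last12Mo', t.Last12Mo)) v (Period, Total)
;

The CROSS APPLY (VALUES ...) unpivots the 4 period aggregates so all 4 percentages can be ranked in a single pass over the table.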
One of the other things that Ed Wagner (my co-presenter) and I taught was how to build million-row test tables in seconds. While a million rows isn't really that big anymore, the people who don't have a million rows to test with really appreciated it because, if something works on a million rows, it's going to work on anything smaller. And, if you need to test against more than a million rows, the same techniques make it all easy and fast, especially if you use "Minimal Logging" techniques to build the test tables.
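For anyone who hasn't seen it, here's a rough sketch of one common flavor of the technique (a cascading-CTE row source; the object names below are just examples, not the exact code from the session):

-- Build a million-row test table in seconds using cascading CTEs (no loops).
-- dbo.MillionRowTest and its columns are example names only.
WITH
  E1(N) AS (SELECT 1 FROM (VALUES (1),(1),(1),(1),(1),(1),(1),(1),(1),(1)) t(N)) --        10 rows
, E3(N) AS (SELECT 1 FROM E1 a CROSS JOIN E1 b CROSS JOIN E1 c)                   --     1,000 rows
, E6(N) AS (SELECT 1 FROM E3 a CROSS JOIN E3 b)                                   -- 1,000,000 rows
 SELECT  N        = ROW_NUMBER() OVER (ORDER BY (SELECT NULL))
        ,SomeDate = DATEADD(dd, ABS(CHECKSUM(NEWID())) % 3653, '2010-01-01') -- random date in ~10 years
        ,SomeInt  = ABS(CHECKSUM(NEWID())) % 50000 + 1                       -- random int from 1 to 50,000
   INTO dbo.MillionRowTest
   FROM E6
;

The "Minimal Logging" part is what makes the final write cheap: SELECT/INTO (and, on 2008 and up, INSERT ... WITH (TABLOCK) into an empty heap) can be minimally logged under the SIMPLE and BULK_LOGGED recovery models, so a million rows hit disk without a million rows of log.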
Long live the "million row test table". 🙂
--Jeff Moden
Change is inevitable... Change for the better is not.
October 20, 2016 at 10:48 am
In the small state that I work in, 1 million of anything is a big number.