December 4, 2006 at 9:36 pm
Comments posted here are about the content posted at temp
December 18, 2006 at 5:46 am
Nice article, David, as usual...
As you are aware, I suspect that any performance differences will only show up where joins between different data types force implicit conversions... but I'm with you... size the columns correctly, if for nothing else than to reduce the size of backups (the OTHER thing that designers always forget about).
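Just to illustrate what I mean (table and column names made up), here's the kind of mismatch that forces the implicit conversion at join time:

-- Hypothetical tables: the join key is INT on one side and BIGINT on the other.
CREATE TABLE dbo.Orders     (OrderID INT NOT NULL PRIMARY KEY, CustomerID INT NOT NULL);
CREATE TABLE dbo.OrderNotes (NoteID  INT NOT NULL PRIMARY KEY, OrderID BIGINT NOT NULL);

SELECT o.OrderID, n.NoteID
FROM dbo.Orders AS o
JOIN dbo.OrderNotes AS n
  ON o.OrderID = n.OrderID;  -- the INT side is implicitly converted to BIGINT here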
--Jeff Moden
Change is inevitable... Change for the better is not.
December 18, 2006 at 6:00 am
What happened to TinyInt? Was it too small and thus overlooked? It wasn't even mentioned in the article even though it was included in the tag line.
December 18, 2006 at 7:35 am
I like the article, but I think JOIN performance is important, too.
In my database, I have a table that has 14 foreign keys, so I have a view that gets the information from this table joined with the 14 other tables. Maybe the performance difference between these types would show up there.
So I always try to use the smallest type possible.
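For example (names made up), something like this for the lookup keys:

-- A small lookup table never needs more than SMALLINT,
-- and the foreign key column uses the same narrow type on both sides.
CREATE TABLE dbo.Country
(
    CountryID SMALLINT     NOT NULL PRIMARY KEY,
    Name      VARCHAR(100) NOT NULL
);

CREATE TABLE dbo.Customer
(
    CustomerID INT      NOT NULL PRIMARY KEY,
    CountryID  SMALLINT NOT NULL REFERENCES dbo.Country (CountryID)
);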
Josep.
December 18, 2006 at 7:49 am
I think the point you are missing about the different datatypes, though, is that you should use the smallest datatype that makes sense in order to keep your row size to a minimum. The smaller your row size, the more rows you can fit on a page, which definitely does help performance.
See http://www.sql-server-performance.com/nb_saving_space.asp for more info
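To put some rough numbers on it (a made-up row layout, ignoring row overhead): a SQL Server data page holds about 8,060 bytes of row data, so:

-- Hypothetical row: one key column plus ~50 bytes of other columns.
-- BIGINT key: 8 + 50 = 58 bytes per row  ->  8060 / 58 = ~138 rows per page
-- INT key:    4 + 50 = 54 bytes per row  ->  8060 / 54 = ~149 rows per page
-- Same data, fewer pages to read, fewer logical reads.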
Cheers
Ian
December 18, 2006 at 8:29 am
However, I would say plan for the long term. If you know there is a potential you will need the larger size, be prepared so you don't have to reconsider the decision later. And as for the negative values, I have used the following to deal with printouts so I could still use the negative side of the spectrum:
A = -
B = +
Your application can handle the translation, or it can even be handled in a stored procedure, but the latter means a longer string must be passed instead of the shorter integer value.
There are many ways to conserve space that are often overlooked, and integer size is no different. Consider the fact that no matter how big or small your data is, the value that is stored will always take the entire length of the datatype. So tinyint will be 1 byte, smallint 2, int 4, and bigint 8; even if the value is 0 it will be represented by all the bytes for the datatype.
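You can see that for yourself with a quick throwaway query:

SELECT DATALENGTH(CAST(0 AS TINYINT))  AS TinyIntBytes,   -- 1
       DATALENGTH(CAST(0 AS SMALLINT)) AS SmallIntBytes,  -- 2
       DATALENGTH(CAST(0 AS INT))      AS IntBytes,       -- 4
       DATALENGTH(CAST(0 AS BIGINT))   AS BigIntBytes;    -- 8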
Decent article; I would just say it probably needs a little more focus on its point.
December 18, 2006 at 9:00 am
Worthless in my opinion?
December 18, 2006 at 9:05 am
Care to formulate a more complete idea?!?!?
December 18, 2006 at 11:07 am
I wouldn't overlook the "natural" integer size of the processor (i.e. 32-bit). During my numerical analysis classes they always stressed staying with float unless you needed the extra precision of a double. Later, when I got my first 386, I compiled my libraries with floats and doubles. The functions using doubles were actually smaller, because to do a float calculation the internals would promote the float to a double, do the calc, then demote the answer.
December 18, 2006 at 11:47 am
Thanx for the tip. Is there any link you can provide with information on low-level performance tuning like this?
December 18, 2006 at 12:12 pm
Good article, but needs a spell check.
December 18, 2006 at 12:27 pm
No, I don't have any links handy. You may want to try some "C" performance tuning articles. Unless number range or memory size dictated otherwise, you were generally better off just using the processor's int size to avoid all the extra handling involved with other sizes (promoting, demoting, word alignment...). In fact, in C the size of int is not fixed; it is tied to the size of the processor.
December 18, 2006 at 12:40 pm
A long time ago I came across something that said that VB always stored integers as 32-bit values regardless of whether they were Byte, Short, or Long (Long in those days being 32-bit), and therefore you might as well use 32-bit integers in any case.
I believe that this is no longer the case but I haven't got anything to back this up.
Why didn't I run tests on TINYINT? Because with only 256 possible values I didn't think I could get any meaningful measurements with such a small sample. It would be a bit like trying to measure the time taken for an industrial water pump to empty a teacup.
December 18, 2006 at 1:05 pm
So what?! Just use microseconds instead of milliseconds in the results.
December 18, 2006 at 1:13 pm
By publishing incomplete results, you may mislead readers. I would have preferred you did a more complete analysis of the issue you raised.