July 27, 2004 at 12:16 pm
I have a table in my database with almost 45 columns, and the row size is 10,468 bytes. Most of the columns have varchar datatypes, and I think that because of poor knowledge of the data, most of the varchar columns were given more length than they need. Now I want to decrease the size of those columns so that the row size comes down to around 8K bytes. If I do this now, will it affect table performance much? In fact, can I do this at all, given that there is a lot of data (almost 2 million rows) in the table? If it is possible, is there anything to be taken care of before changing the column lengths?
Thanks.
July 27, 2004 at 1:19 pm
It will improve the performance, but be sure you do not truncate existing data.
Try to find out the longest data actually stored in each column with select max(len(columnname)) against your table.
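For a table with many varchar columns, one statement can check several at once; a minimal sketch, where yourtable and the column names are placeholders rather than the poster's actual schema:

-- longest value actually stored in each varchar column
-- (len() ignores trailing spaces; use datalength() if those matter)
select max(len(col1)) as max_col1,
       max(len(col2)) as max_col2
from yourtable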
July 27, 2004 at 2:58 pm
I doubt it will have any effect on performance. The size defined is just the MAX, not the actual storage, and varchar is varchar whether it is (20) or (8000). So all you will do is limit the field sizes for any future data loaded.
A row cannot exceed 8K (roughly) no matter what the declared sizes add up to.
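If you do decide to trim the declared sizes, note that SQL Server will reject the change when existing data is longer than the new maximum, so check the data first. A minimal sketch, assuming placeholder table and column names:

-- confirm the longest existing value fits the new size
select max(len(somecolumn)) from yourtable

-- then reduce the declared maximum; restate NULL / NOT NULL here,
-- otherwise the database default nullability is applied
alter table yourtable alter column somecolumn varchar(100) null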
Maybe send the designers to class so they don't do something like this again.
KlK
July 28, 2004 at 12:54 am
You'll have to check with your developer(s) to see if they really need all the possible bytes.
Maybe they've worked out a combination so they don't exceed 8,060 bytes in a row (e.g. in one case they use columnX and not columnY, so the row length stays under control).
You can remind them of the fact that rows longer than 8,060 bytes cannot be inserted (or updated!).
So, if you can, bring it down to the minimum needed! It may prevent problems later on, when you no longer expect them (on an update!).
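A rough way to see what the declared column sizes add up to, assuming SQL Server 2000 as in this thread (yourtable is a placeholder, and the sum ignores per-row storage overhead):

-- sum of the defined maximum lengths of all columns
select sum(length) as declared_row_bytes
from syscolumns
where id = object_id('yourtable')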
Once you know how carefully they've designed their data model, you know which other things to keep in mind:
- avoid usage of the datatypes float, text, ...
- avoid NULL if, to the applications, the datatype's default value has the same effect as NULL (e.g. null or a space makes no difference to the users)
Johan
July 28, 2004 at 2:03 am
Hi,
If the row size is longer than 8,060 bytes, inserts and updates will fail, with or without a warning!
If the programmers really need the big field size, let's say varchar(5000) for one field, try normalizing the design: move that field to another table linked by a key.
For example, if ref is the primary key of table 1, create another table to store the remarks and link it with that primary key.
create table tbl1 (
ref int primary key,   -- key shared by both tables
dob char(8),
crt_dt datetime
)

create table tbl2 (
ref int primary key references tbl1 (ref),   -- one remarks row per tbl1 row
remarks varchar(5000)
)
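When an application needs the wide field back, a join reassembles the row; a minimal example against the tables above:

select t1.ref, t1.dob, t1.crt_dt, t2.remarks
from tbl1 t1
left join tbl2 t2 on t2.ref = t1.ref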
Regards,
kokyan