November 19, 2008 at 8:05 am
I use an nvarchar field to identify records in a database table, even though an identity field is used as the primary key.
My partner thinks we must change our model to identify records by the primary key, but the identity value is not constant. He told me this model has a lot of performance problems.
I don’t think so, because this table doesn’t grow; it has at most 100 rows.
I want to know whether this model could cause significant performance problems.
The problem is that if I must change this model, I will need to change a lot of code. I need a good reason to keep my current model, because we are already deep into the project timeline.
Thanks a lot 🙂
November 20, 2008 at 9:58 am
Honestly, from my experience, a 100 row table won't be a major factor in performance. If it's never expected to grow to more than 100 rows, it should be fine. If your row size is less than 80 bytes per row, then the entire table can fit on one database page, and even a full table scan would still be fairly efficient in the big picture of things.
(Oops, bad math day: 100 × 80 = 8,000, so row size less than 80 bytes, not 800.)
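If you want to check that page-fits-in-one estimate against your actual data, a quick sanity check is possible with `DATALENGTH`. This is only a sketch: the table and column names (`dbo.LookupTable`, `LookupCode`, `Description`) are hypothetical placeholders, and the totals ignore per-row storage overhead, so treat the result as a rough lower bound.

```sql
-- Hypothetical table/column names; substitute your own schema.
-- Sums the per-row byte counts to see whether the whole table
-- fits comfortably inside a single 8 KB data page (~8,060 usable bytes).
SELECT
    COUNT(*)                                              AS row_count,
    MAX(DATALENGTH(LookupCode) + DATALENGTH(Description)) AS max_row_bytes,
    SUM(DATALENGTH(LookupCode) + DATALENGTH(Description)) AS total_bytes
FROM dbo.LookupTable;
```

If `total_bytes` comes in well under 8,060, even a full scan of the table reads a single page.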
November 20, 2008 at 10:01 am
Hi all,
This is a double post.
arluna, as Jeff and some others mentioned in your other post,
this shouldn't be a problem, but it's best to make sure your column is indexed.
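For example, an index on the nvarchar lookup column could look like the sketch below. The names are hypothetical (not from the original post); making the index unique also enforces that the business key stays unique, independently of the identity primary key.

```sql
-- Hypothetical names; adjust to your schema.
-- A unique nonclustered index on the nvarchar column keeps lookups
-- as cheap index seeks and guarantees the business key is unique.
CREATE UNIQUE NONCLUSTERED INDEX IX_LookupTable_LookupCode
    ON dbo.LookupTable (LookupCode);
```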
Thanks
Chris
----------------------------------------------
Try to learn something about everything and everything about something. - Thomas Henry Huxley
:w00t:
Posting Best Practices
Numbers / Tally Tables
November 21, 2008 at 4:00 am
I agree with the previous writers.
But there is a guideline you could consider in your next designs:
"Always try to use the smallest data type possible."
- The rows will be smaller.
- The indexes will be smaller.
- The performance will be better.
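As an illustration of that guideline, here is a hedged sketch (table and column names are invented for the example, not from the thread): for a table capped at 100 rows, a one-byte tinyint key is plenty, whereas an nvarchar(50) value can cost up to 100 bytes in every row and in every index that references it.

```sql
-- Hypothetical example of "smallest data type possible".
-- tinyint holds 0-255, more than enough for a 100-row lookup table.
CREATE TABLE dbo.StatusLookup
(
    StatusId   tinyint       NOT NULL PRIMARY KEY,  -- 1 byte per value
    StatusCode nvarchar(50)  NOT NULL UNIQUE,       -- up to 100 bytes per value
    StatusName nvarchar(100) NOT NULL
);
```

Every foreign key and index built on `StatusId` then carries 1 byte per row instead of up to 100.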
November 21, 2008 at 7:10 am
Thank you so much!