September 24, 2019 at 10:38 am
Hi All,
Having inherited a poorly performing database, we've redesigned it and come up against some potential issues.
On one table we have a key made up of 16 columns, and previously the join had a clause on all 16 columns. We changed this to a CHECKSUM column, but it became apparent that duplicate checksum values appeared for distinct keys. As a result we are now moving to a HASHBYTES column using the SHA2_256 algorithm.
What we are now concerned with is the performance of the join, given the column is varbinary(8000).
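One thing worth noting: SHA2_256 always produces a 32-byte digest, so the hash column doesn't need to be varbinary(8000) — it can be stored and joined as a fixed 32-byte value. A minimal Python sketch (column names and delimiter are hypothetical; in T-SQL the equivalent would be hashing a delimited concatenation of the 16 columns) illustrates the fixed output length:

```python
import hashlib

# Hypothetical subset of the 16 key columns, joined with a delimiter
# that cannot appear inside the data itself.
key_columns = ["cust01", "2019-09-24", "GBP", "region-7"]
raw_key = "|".join(key_columns).encode("utf-8")

digest = hashlib.sha256(raw_key).digest()
print(len(digest))  # SHA-256 always yields 32 bytes, regardless of input length
```

So declaring the column as varbinary(32) rather than varbinary(8000) keeps the key compact for indexing and joining.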
September 25, 2019 at 1:22 pm
Wow! Can you find the original designers and yell at them a bit?
I would look for a completely different alternative if possible. Is this a data warehouse? If possible, I would create a key on a single identity column and then manage data integrity within a load procedure/process. In other words, I would shift the responsibility for managing data integrity from database objects like keys to the loading process, and I would offload referential integrity to the loading process as well. Again, since I don't know the context you are operating within, none of this may be possible.
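To make the suggestion concrete, here is a minimal Python sketch of the idea (all names and key values are hypothetical): during the load, each distinct natural key is resolved once to a small integer surrogate, so downstream joins use a single integer column instead of 16 columns or a hash:

```python
class SurrogateKeyMap:
    """Assigns a compact integer surrogate key to each distinct natural key
    during the load, instead of enforcing uniqueness with a 16-column index."""

    def __init__(self):
        self._keys = {}      # natural key tuple -> surrogate integer
        self._next_id = 1

    def get_or_assign(self, natural_key: tuple) -> int:
        # natural_key is a tuple of the natural-key column values.
        if natural_key not in self._keys:
            self._keys[natural_key] = self._next_id
            self._next_id += 1
        return self._keys[natural_key]

# Usage: the load process resolves each incoming row's key once,
# then all later joins use the small integer surrogate.
km = SurrogateKeyMap()
row_a = ("cust01", "2019-09-24", "GBP")   # hypothetical key values
row_b = ("cust02", "2019-09-25", "USD")

print(km.get_or_assign(row_a))  # → 1
print(km.get_or_assign(row_b))  # → 2
print(km.get_or_assign(row_a))  # → 1 (same natural key, same surrogate)
```

In SQL Server terms the surrogate would typically be an IDENTITY column on the dimension table, with the lookup done against the existing rows during the load rather than enforced by a wide unique index.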