Database Optimization

  • Hey guys,

My company is undertaking a database optimization project: optimizing the schema, the code, etc. I'd like to ask for your help with the following:

    1. What risks are there? What are the pitfalls?

2. My company is hesitant to freeze the database and stop all new development until our vendor (who is restructuring tables and changing database objects) delivers a stable database for us; only then could we continue development on the newer copy. My question: how can we either shorten the database code freeze or work in parallel?

    3. Can anyone point me to other sources of information? Another thread? A book? A URL?

    Thanks a bunch!

  • I'll reply to your first post of this question in order to consolidate the responses for you.

    -PatP

  • JessJenn (4/4/2008)


    Hey guys,

My company is undertaking a database optimization project: optimizing the schema, the code, etc. I'd like to ask for your help with the following:

    1. What risks are there? What are the pitfalls?

2. My company is hesitant to freeze the database and stop all new development until our vendor (who is restructuring tables and changing database objects) delivers a stable database for us; only then could we continue development on the newer copy. My question: how can we either shorten the database code freeze or work in parallel?

It depends. If I understand correctly, there is a vendor who owns the database model and is actively developing it, and on top of it your company has built a custom model, which it is now trying to optimize.

Such a setup is always a pain, unless both design teams work together to minimize incompatibilities. If the level of dependency is low, you can abstract those dependencies away and avoid the nightmare of synchronizing parallel development.

Freeze? If the vendor delivers changes in milestones without previews or notifications, you should try to fix that relationship first.

I suggest unit testing all the stored procedures and UDFs (assuming you don't do much direct access to tables or views). This ensures that you are only optimizing the structure of the data, not changing the output. You can use NUnit with a little glue code to hook up all the stored procedures for unit testing. This worked for me on my last project and greatly helps maintenance down the road.

    BY
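The output-regression idea above can be sketched without the NUnit harness the poster describes. This is a minimal illustration in Python against SQLite (not SQL Server); the schema, query, and "optimization" are hypothetical: capture a query's result set before a structural change, apply the change, and assert the output is byte-for-byte identical.

```python
import sqlite3

# Hypothetical schema standing in for the real database.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (id INTEGER PRIMARY KEY, customer TEXT, total REAL);
    INSERT INTO orders VALUES (1, 'acme', 120.0), (2, 'globex', 75.5), (3, 'acme', 30.0);
""")

# Stand-in for a stored procedure's result set.
QUERY = "SELECT customer, SUM(total) FROM orders GROUP BY customer ORDER BY customer"

# Baseline: capture the output before the structural change.
before = conn.execute(QUERY).fetchall()

# The "optimization": structure changes, output must not.
conn.execute("CREATE INDEX ix_orders_customer ON orders(customer)")

after = conn.execute(QUERY).fetchall()
assert before == after, "optimization changed the query's output"
```

The same pattern scales to a whole database: enumerate the procedures, record each one's output against fixed test data, and rerun the comparison after every schema change.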

I think to an extent you need to understand the coupling between modules, and which module you are targeting for optimisation.

Most optimisation ends in database alterations, because most databases suffer from poor design and poorly written queries.

  • What are your objectives in doing this work?

    If you simply want to get naming conventions and coding standards to comply with current site requirements, IMHO this is a total waste of company time and money. You spend a lot to stand still.

If you want to improve the performance of critical business queries, then go for it! Provided, of course, that the time spent on this is more beneficial than the cost of lost opportunities elsewhere.

If fixing performance issues is the best use of time, then you need to analyse your business queries to see what causes the most pain. Include talking to your end-users to find where the pain is, as taking 50% off an online query that responds in 2 seconds may give more business benefit than taking 50% off a batch report that responds in 2 hours.

Original author: https://github.com/SQL-FineBuild/Common/wiki/ 1-click install and best practice configuration of SQL Server 2019, 2017, 2016, 2014, 2012, 2008 R2, 2008 and 2005.

    When I give food to the poor they call me a saint. When I ask why they are poor they call me a communist - Archbishop Hélder Câmara

Thanks for the great advice, people. The aim of this project is refactoring and optimization, through schema changes, stored procedure changes, function changes, etc. Ideally, we would look at this from the business process level, but that would take too long. It is a big database, and a rearchitecture based on optimized business processes would be a lengthy project (2-3 years).

So, why do I bring up the db freeze? Because the vendor says it will take 3 months to wrap up the job (he's been working on this for a while now). In those 3 months we will make many changes to objects in the current database, which means that when the 3 months are up, we can't use the new db right away: we first have to port 3 months of changes onto the vendor's database. What if that porting takes another 2 months, while changes are still being made to the old database? Those 2 months of changes get applied to the new database, and yet again it's out of sync with the old one, because changes were still taking place.

  • Actually, the solution is not so ugly.

Simply log all your development changes into a script, and when you get a new version from the vendor, reapply the script, fixing any incompatibilities.

Unless the vendor is modifying the same objects as you are. In that case the vendor should give you their change script, so you can review both and adjust accordingly.

    Also, you can use a database compare tool.
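The "log your changes, reapply on the new baseline" workflow above can be sketched as a tiny migration runner. This is an illustration in Python/SQLite, not a production tool; the script names and schema are invented. Each local change is an ordered, named script, and a tracking table records which scripts have been applied, so the whole set can be replayed against any fresh copy of the vendor database.

```python
import sqlite3

# Ordered log of local development changes (names and SQL are hypothetical).
CHANGE_SCRIPTS = [
    ("001_add_customer_email", "ALTER TABLE customers ADD COLUMN email TEXT"),
    ("002_add_orders_index",   "CREATE INDEX ix_orders_cust ON orders(customer_id)"),
]

def apply_pending(conn):
    # Tracking table makes replay idempotent.
    conn.execute("CREATE TABLE IF NOT EXISTS schema_log (name TEXT PRIMARY KEY)")
    applied = {row[0] for row in conn.execute("SELECT name FROM schema_log")}
    for name, sql in CHANGE_SCRIPTS:
        if name in applied:
            continue  # already replayed on this copy
        conn.execute(sql)  # incompatibilities with vendor changes surface here
        conn.execute("INSERT INTO schema_log VALUES (?)", (name,))
    conn.commit()

# Simulate receiving a fresh vendor baseline and replaying local changes onto it.
vendor_db = sqlite3.connect(":memory:")
vendor_db.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER);
""")
apply_pending(vendor_db)
apply_pending(vendor_db)  # second run is a no-op: nothing is applied twice
```

A failed script in the loop is exactly the incompatibility Robert mentions: it tells you which local change collides with the vendor's restructuring and needs a manual review.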

  • Hello,

I have to say I would wait until the vendor has finished working on the DB. Unless they are not touching the same areas of the DB that you intend to change, I cannot see how you can get maximum benefit out of a re-optimization.

    [Assuming like Robert that you have a system that you have developed that is working on top of the vendor database]

Trivial example: a reorganization of an index. The vendor has optimized an index to improve insert times for a transactional process. You've optimized a stored procedure based on the original indexes and table stats, and when you run it for the first time on the upgraded system it performs like a dog, because the index you were relying on no longer exists.

If you attempt to add the original index back, you risk undoing the time and money the vendor has spent improving transactional performance, so you end up refactoring the stored procedure all over again.

Realistically, can you not wait 12 weeks for this work to be done? Given that you say this is a large db (and I'm assuming by that you mean it covers a wide business scope, and has therefore been in place for quite some time), I can't believe it has suddenly become essential to do this work.

If one or two sp's are causing you serious grief that really can't wait, then Robert's advice is the way to go; otherwise I would say the risks outweigh the benefits at this point in time. If it were me, I'd spend some of those 12 weeks working out exactly what the vendor has done, so that you have a plan and code in place to tackle any potential problems brought about by the upgrade (and there are bound to be some), and then worry about making improvements later.

    K.
