Hello,

    Thanks for your answers, everyone. I'll address the questions above:

    The current size of the database is around 2 TB, and each monthly filegroup holds about 300 GB of data, most of which is static and unnormalised.

    At the moment the fastest full backup takes around 30 hours, and that is the best we can achieve.
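
    Since the older monthly filegroups are mostly static, one option worth exploring is filegroup-level backups: mark closed-out monthly filegroups read-only, back each one up once, and let the routine backup cover only the current read-write filegroup. A sketch only; the database and filegroup names below are made up for illustration:

    -- Hypothetical names: database [Archive], filegroups [FG_2007_01], [FG_CURRENT]
    -- Mark a closed-out monthly filegroup read-only so its data can no longer change
    ALTER DATABASE [Archive] MODIFY FILEGROUP [FG_2007_01] READONLY;

    -- Back that filegroup up once; it should not need backing up again
    BACKUP DATABASE [Archive]
        FILEGROUP = 'FG_2007_01'
        TO DISK = 'E:\Backups\Archive_FG_2007_01.bak';

    -- Routine backups then only cover the active (current month) filegroup
    BACKUP DATABASE [Archive]
        FILEGROUP = 'FG_CURRENT'
        TO DISK = 'E:\Backups\Archive_FG_Current.bak';

    The trade-off is a more involved restore sequence (piecemeal restore from the filegroup backups plus log backups), so it is worth rehearsing the restore before relying on it.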

    The maintenance jobs in place include rebuilding the indexes and defragmenting them as well.
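
    For reference, on SQL Server 2005 and later that kind of maintenance is typically expressed as below (table and index names here are hypothetical):

    -- Light, online defragmentation for moderately fragmented indexes
    ALTER INDEX [IX_Orders_Date] ON [dbo].[Orders] REORGANIZE;

    -- Full rebuild for heavily fragmented indexes (also refreshes their statistics)
    ALTER INDEX [IX_Orders_Date] ON [dbo].[Orders] REBUILD;

    -- Refresh column statistics after large monthly loads
    UPDATE STATISTICS [dbo].[Orders] WITH FULLSCAN;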

    Normalisation would always be the right thing to do, but the problem is that it would take a considerable amount of time (we are talking months here), and besides, a lot has been built on top of the unnormalised data: any change now would mean a complete rebuild of the entire framework, the scheduled reports, all user reports, the DTS/SSIS packages and crucial system processes, as well as retraining users on the new structure.

    Taking all of the above into consideration, I hope someone can suggest a solution.

    Thanks