• I mean, this is the 2nd time I'm doing this. It's a pain to keep checking the SHRINKFILE SPIDs for blocking (the check I run is shown below). It takes 6-7 hours for 1 data file, and I still have 8 files to go on 1 database, plus 6 more databases in the same condition on different servers.

    In your experience, how should I go about maintaining these huge databases that get fragmented so often?
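
    For reference, this is roughly the check I keep running against each shrink session. It's a minimal sketch using sys.dm_exec_requests; as far as I know DBCC SHRINKFILE shows up there as the 'DbccFilesCompact' command, but treat that filter as an assumption for your version:

    ```sql
    -- Minimal sketch: watch active SHRINKFILE sessions for progress and blocking.
    -- Assumes SHRINKFILE appears in sys.dm_exec_requests as command 'DbccFilesCompact'.
    SELECT
        r.session_id,
        r.command,
        r.percent_complete,                                -- progress reported by the shrink
        r.estimated_completion_time / 60000.0 AS est_minutes_left,
        r.blocking_session_id,                             -- non-zero means something is blocking the shrink
        r.wait_type,
        r.wait_resource,
        DB_NAME(r.database_id) AS database_name
    FROM sys.dm_exec_requests AS r
    WHERE r.command = N'DbccFilesCompact';
    ```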