We maintain a denormalized table that is updated from 3 source tables twice a day. The data in those 3 tables changes frequently, and the denormalized table has over 150 million records. Every 12 hours, an SSIS package gathers the new and changed rows from the 3 source tables, loads them into lookup tables, and then applies the changes to the denormalized table using batch inserts and deletes.

A normal run (fewer than 100,000 updated records) completes in about 15 minutes. Twice a month, however, a different process increases the volume to roughly 3 to 4 million records that must be denormalized at once, and that run sometimes takes over 15 hours. The denormalized table currently has over 10 indexes because it is hit heavily by other processes, and the 10+ hour delay that happens occasionally is becoming a major problem.

Is there a more efficient way to maintain a large denormalized table than what we currently have? I read about indexed views, but they are not recommended for tables whose data changes frequently. I would appreciate any recommendations.
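For context, the batch delete/insert apply step works conceptually like the sketch below. All table names, column names, key columns, and the batch size here are simplified placeholders for illustration, not our actual schema:

```sql
-- Illustrative sketch only: real schema, keys, and batch size differ.
-- Delete changed rows in small batches to limit lock and log pressure.
DECLARE @batch INT = 50000;  -- assumed batch size

WHILE 1 = 1
BEGIN
    DELETE TOP (@batch) d
    FROM dbo.DenormalizedTable AS d
    JOIN dbo.ChangedKeys AS k          -- lookup table of changed keys
        ON d.BusinessKey = k.BusinessKey;

    IF @@ROWCOUNT < @batch BREAK;      -- stop when the last batch is partial
END;

-- Re-insert the current versions of the changed rows,
-- already joined/denormalized into a staging table by the SSIS package.
INSERT INTO dbo.DenormalizedTable (BusinessKey, ColA, ColB)
SELECT s.BusinessKey, s.ColA, s.ColB
FROM dbo.StagedChanges AS s;
```

In a normal run this touches fewer than 100,000 rows; on the twice-monthly runs it has to churn through 3 to 4 million rows while all 10+ indexes on the target table are maintained row by row, which is where the time goes.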