Thanks! What I'm trying to accomplish is to update the stats, since they are not being kept up to date. I have thought about rebuilding the indexes, since some are pretty fragmented (>50%), but one concern is that our tables are huge (over 10 million rows), so transaction log growth can get pretty crazy. Is rebuilding an index recommended based on how fragmented it is? Our transaction log backups run every 15 minutes.
1) It is funny what people think is "huge" these days. I have a 131M row table on my laptop I use for data warehousing and column store index demos.
2) You cannot successfully manage a 1TB+ database on any RDBMS without doing quite a few things right. Index maintenance and statistics updates are two of those things. The proper solution for HOW to best manage those things for YOUR SYSTEM cannot be determined without a lot more knowledge about your apps, data access/processing, maintenance window(s), SLAs, etc. I could tell you things that are general best practices but that could totally fubar your system. I recommend you get a professional to help you determine what your needs really are, get things set up, and mentor you on how to monitor and react to your system's needs.
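That said, as a starting point for the fragmentation question: the commonly cited rule of thumb from Microsoft's guidance is to REORGANIZE between roughly 5% and 30% average fragmentation and REBUILD above 30% (and often to ignore small indexes entirely, where fragmentation numbers are meaningless). Treat the thresholds below as defaults to tune, not gospel. A sketch of checking fragmentation with the standard DMV:

```sql
-- Rough sketch: average fragmentation per index in the current database
-- (assumes SQL Server 2005+). The 5%/30% cutoffs are the commonly cited
-- starting point, not values tuned for your system.
SELECT OBJECT_NAME(ips.object_id)        AS table_name,
       i.name                            AS index_name,
       ips.avg_fragmentation_in_percent,
       ips.page_count,
       CASE
         WHEN ips.avg_fragmentation_in_percent >= 30 THEN 'REBUILD'
         WHEN ips.avg_fragmentation_in_percent >= 5  THEN 'REORGANIZE'
         ELSE 'leave alone'
       END AS suggested_action
FROM sys.dm_db_index_physical_stats(DB_ID(), NULL, NULL, NULL, 'LIMITED') AS ips
JOIN sys.indexes AS i
  ON i.object_id = ips.object_id
 AND i.index_id  = ips.index_id
WHERE ips.index_id > 0          -- skip heaps
  AND ips.page_count > 1000     -- ignore tiny indexes
ORDER BY ips.avg_fragmentation_in_percent DESC;
```

Note that REORGANIZE is relevant to your log-growth worry: it works incrementally and its log records get picked up by your 15-minute log backups as it goes, whereas a REBUILD of a 10M+ row index is one large operation (and fully logged under the FULL recovery model that log shipping requires).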
3) Large tlogs are part and parcel of managing a large database. Plan for it or suffer the consequences.
4) Given the log shipping to another datacenter you mentioned, are you compressing the log backups before shipping (assuming you have the bandwidth/latency issues that usually come with that scenario)?
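And to your original point about stats not being kept up, a minimal sketch (the table name is hypothetical):

```sql
-- sp_updatestats walks the whole database but only refreshes statistics
-- the engine considers out of date, so it is usually a cheap first pass.
EXEC sp_updatestats;

-- Or target a specific large table explicitly. FULLSCAN gives the most
-- accurate histogram but reads the entire 10M+ row table, so weigh it
-- against your maintenance window; a SAMPLE percentage is the lighter option.
UPDATE STATISTICS dbo.YourBigTable WITH FULLSCAN;
```

Whether auto-update stats is keeping up, and whether you need FULLSCAN or a sample, again depends on your data distribution and workload, which is part of why I suggest getting someone to look at the whole picture.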
Kevin G. Boles
SQL Server Consultant
SQL MVP 2007-2012
TheSQLGuru at GMail