Lowell (1/25/2013)
Also, my rule of thumb is that any @TableVariable with more than a few hundred rows should be a temp table instead, so the system can automatically do things like generate statistics. From the estimated number of rows in the estimated execution plans, I see 200K+ rows for every query in there.
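For context (the table and column names below are illustrative, not from the original query): a @table variable carries no column statistics, so the optimizer typically assumes very few rows, while a #temp table gets statistics created automatically on first use:

```sql
-- Table variable: no column statistics; the optimizer often estimates 1 row,
-- which can lead to bad join choices once the real row count is large
DECLARE @Results TABLE (BankName VARCHAR(100), FactDate DATE);

-- Temp table: statistics are auto-created when the columns are first queried,
-- so the optimizer can estimate row counts and pick better join strategies
CREATE TABLE #Results (BankName VARCHAR(100), FactDate DATE);
```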
I see a missing-index suggestion for every query in the plan as well.
Since this was not an actual execution plan, I'm concerned that the high estimated row counts might be due to bad statistics.
I see non-SARGable predicates that force a table scan, like:
AND(DATEDIFF(DD,AF.FCT_DT,AF.LST_MFS_EVNT_DT)>=60)
AND(DATEDIFF(DD,AF.FCT_DT,AF.LST_MFS_EVNT_DT)>=30)
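As a sketch using the same columns, the usual fix is to move the date arithmetic off the filtered column. Note that both sides here are columns, so a plain index seek still may not be possible, but the rewrite gives the optimizer more options (e.g. an indexed computed column); also verify the boundary semantics, since DATEDIFF(DD, ...) counts midnight boundaries crossed:

```sql
-- Non-SARGable: the function wraps the columns, forcing a scan
-- AND (DATEDIFF(DD, AF.FCT_DT, AF.LST_MFS_EVNT_DT) >= 60)

-- Equivalent logic with the arithmetic moved to one side
AND (AF.LST_MFS_EVNT_DT >= DATEADD(DD, 60, AF.FCT_DT))
```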
Since they are all going into the same table with the same structure, I'd change this to be a CTE with UNION ALL instead of a @TableVariable.
Also, the single highest cost in the most expensive queries is the table insert into the table variable. So why not perform the aggregate (the final query) on each individual query, like this:
;WITH TMP AS (
SELECT
[userType]= 'Total Registered Subscriber Base', -- 31%
[status]= '',
[bank]= PED.PYMNT_ENTTY_NM,
[date]= AF.FCT_DT,
[WEEK]= dt.MIC_WK_OF_MO_NM
FROM DBO.AR_MFS_CL_FCT AF
INNER JOIN DBO.AR_MFS_PRFL_DIM PD
ON AF.AR_MFS_PRFL_KEY = PD.AR_MFS_PRFL_KEY
INNER JOIN DBO.PYMNT_ENTTY_DIM PED
ON PED.PYMNT_ENTTY_KEY = PD.PYMNT_ENTTY_KEY
INNER JOIN DBO.DT_DIM dt
ON dt.DT_KEY = AF.FCT_DT_KEY
WHERE (dt.YR = @MLLCMBI_GH_YR)
AND (dt.MO_OF_YR_NM = @MLLCMBI_GH_MNTH)
)
SELECT
[BankName]= TMP.bank,
[YearName]= @MLLCMBI_GH_YR,
[MonthName] = @MLLCMBI_GH_MNTH,
TMP.[WEEK],
TMP.[date],
'Total Registered Subscriber Base' = COUNT(*)
INTO #TMPVW -- must be a #temp table; SELECT...INTO cannot target a table variable
FROM TMP
GROUP BY
TMP.[date],
tmp.bank,
TMP.[WEEK],
TMP.userType