• fergfamster (7/22/2016)


    Thank you for the effort and trials! The temp table is 23 million rows, but the primary table is 23 billion, which is why I thought the date scan would make it faster. What I have found is that MERGE breaks all of this up into separate upsert statements and still scans the entire table regardless, so what took 14 seconds on your side still takes 20-30 minutes on mine, even with enhanced resources. I am now playing with the following to compare the two approaches and put together a better overall test plan. If you have any other ideas that might work better, I would love to hear them. I also tried MERGE with OUTPUT, with little luck; a sketch of that form follows the script below.

    SET TRANSACTION ISOLATION LEVEL SERIALIZABLE;

    BEGIN TRANSACTION;

    -- Update existing rows from the staging table. The date filter on the
    -- target is there so the optimizer can limit the scan.
    UPDATE Hourly
    SET SDP_EffBegDate       = temp.SDP_EffBegDate,
        Meter_Read_Status_Id = temp.Meter_Read_Status_Id,
        Src_Del_Flag         = temp.Src_Del_Flag,
        ETL_Batch_Id         = temp.ETL_Batch_Id,
        Meter_Read           = temp.Meter_Read
    FROM Meter_Read_KWH_Hourly AS Hourly
    INNER JOIN Meter_Read_KWH_Hourly_Temp AS temp
            ON Hourly.SDP_Id = temp.SDP_Id
           AND Hourly.Meter_Read_Dttm = temp.Meter_Read_Dttm
    WHERE Hourly.Meter_Read_Dttm >= '2015-05-28 07:00:00';

    -- Caveat: @@ROWCOUNT = 0 only triggers the insert when the update touched
    -- no rows at all, which really only holds for single-row upserts; a fully
    -- set-based version would instead insert the staging rows with no match.
    IF @@ROWCOUNT = 0
    BEGIN
        INSERT Meter_Read_KWH_Hourly (
            SDP_Id,
            SDP_EffBegDate,
            Meter_Read_Status_Id,
            ETL_Batch_Id,
            Src_Del_Flag,
            Meter_Read_Dttm,
            Meter_Read)
        SELECT temp.SDP_Id,
               temp.SDP_EffBegDate,
               temp.Meter_Read_Status_Id,
               temp.ETL_Batch_Id,
               temp.Src_Del_Flag,
               temp.Meter_Read_Dttm,
               temp.Meter_Read
        FROM Meter_Read_KWH_Hourly_Temp AS temp;
    END;

    COMMIT TRANSACTION;
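
    For reference, the MERGE-with-OUTPUT form I was comparing against looked roughly like this. It is only a sketch (the column list is assumed from the insert above), and note the caveat in the comment about filtering the target inside the ON clause:

    -- Sketch of the MERGE/OUTPUT variant. Caveat: putting the date filter in
    -- the ON clause makes older target rows look "not matched", so staging
    -- rows for them could be re-inserted; pre-filtering the target through an
    -- updatable CTE is the safer way to limit the scan.
    MERGE Meter_Read_KWH_Hourly AS Hourly
    USING Meter_Read_KWH_Hourly_Temp AS temp
       ON Hourly.SDP_Id = temp.SDP_Id
      AND Hourly.Meter_Read_Dttm = temp.Meter_Read_Dttm
      AND Hourly.Meter_Read_Dttm >= '2015-05-28 07:00:00'
    WHEN MATCHED THEN
        UPDATE SET SDP_EffBegDate       = temp.SDP_EffBegDate,
                   Meter_Read_Status_Id = temp.Meter_Read_Status_Id,
                   Src_Del_Flag         = temp.Src_Del_Flag,
                   ETL_Batch_Id         = temp.ETL_Batch_Id,
                   Meter_Read           = temp.Meter_Read
    WHEN NOT MATCHED BY TARGET THEN
        INSERT (SDP_Id, SDP_EffBegDate, Meter_Read_Status_Id, ETL_Batch_Id,
                Src_Del_Flag, Meter_Read_Dttm, Meter_Read)
        VALUES (temp.SDP_Id, temp.SDP_EffBegDate, temp.Meter_Read_Status_Id,
                temp.ETL_Batch_Id, temp.Src_Del_Flag, temp.Meter_Read_Dttm,
                temp.Meter_Read)
    OUTPUT $action;  -- reports whether each row was an UPDATE or an INSERT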

    Perhaps you should look at partitioning the table(s) in question. If you really only need to look at the previous day's data, partitioning by day (or even by hour) would let you break that 23 billion row table into pieces small enough for your merge operation to work against. If I get the time, I will try to come up with a construct based on what I already have, but if you could post the DDL, it would be very helpful. In the meantime, a rough sketch of the partitioning idea is below.
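
    This is an illustration only: the function, scheme, and index names and the boundary dates here are made up, it assumes Meter_Read_Dttm is a datetime and the clustering key, and the real boundaries would come from your data range.

    -- Hypothetical daily partition function/scheme; names and boundary
    -- values are illustrative only.
    CREATE PARTITION FUNCTION pf_MeterReadDay (datetime)
    AS RANGE RIGHT FOR VALUES ('2015-05-27', '2015-05-28', '2015-05-29');

    CREATE PARTITION SCHEME ps_MeterReadDay
    AS PARTITION pf_MeterReadDay ALL TO ([PRIMARY]);

    -- Rebuilding the clustered index on the scheme aligns the table with it,
    -- so a predicate like Meter_Read_Dttm >= '2015-05-28 07:00:00' only has
    -- to touch the partitions from that day forward. DROP_EXISTING assumes a
    -- clustered index with this (made-up) name already exists.
    CREATE CLUSTERED INDEX CIX_Meter_Read_KWH_Hourly
        ON Meter_Read_KWH_Hourly (Meter_Read_Dttm)
        WITH (DROP_EXISTING = ON)
        ON ps_MeterReadDay (Meter_Read_Dttm);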