How to get rid of implicit conversions?

  • Hi Experts,

    How do I get rid of implicit conversions in execution plans?

    CREATE TABLE dbo.T1 (
        BigIntColumn BIGINT NOT NULL,
        IntColumn INT NOT NULL ---/// int datatype
    );

    CREATE TABLE dbo.T2 (
        BigIntColumn BIGINT NOT NULL,
        VarcharColumn varchar(10) NOT NULL ---/// varchar datatype
    );

    --- insert data into t1
    -- load 1000 row(s)
    ;WITH Nums
    AS (SELECT TOP (1000)
            ROW_NUMBER() OVER (ORDER BY (SELECT 1)) AS n
        FROM master.sys.all_columns ac1
        CROSS JOIN master.sys.all_columns ac2
       )
    INSERT INTO dbo.t1 (BigIntColumn, IntColumn)
    SELECT Nums.n, Nums.n
    FROM Nums;
    go

    --- insert data into t2
    -- load 1000 row(s)
    insert into t2
    select * from t1;

    select top 10 * from t1;
    select top 20 * from t2;

    --- Type conversion warnings appear in the actual execution plan. How do I avoid them?

    -- try 1: implicit conversion warning
    select t1.IntColumn, t2.VarcharColumn
    from t1 inner join t2 on t1.IntColumn = t2.VarcharColumn;
    go

    -- try 2: still seeing the warning. How do I get rid of it?
    select t1.IntColumn, t2.VarcharColumn
    from t1 inner join t2 on t1.IntColumn = cast(t2.VarcharColumn as int);
    go

    -- try 3: still seeing the warning. How do I get rid of it?
    select t1.IntColumn, t2.VarcharColumn
    from t1 inner join t2 on t1.IntColumn = convert(int, t2.VarcharColumn);

    Thanks,
    Sam

  • Try using a Tally table where you define the numeric type explicitly in the table definition. Normally INT is big enough... then when you join to another table with the type INT, you're okay. If you need a BIGINT for some reason, I'd declare and store it in the Tally table - then at least you're joining on numeric columns that are the same size.
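    A minimal sketch of that idea, hedged: the name dbo.Tally, the row count, and the INT choice are illustrative assumptions, not something from this thread.

    IF OBJECT_ID(N'dbo.Tally') IS NOT NULL DROP TABLE dbo.Tally;
    CREATE TABLE dbo.Tally (N INT NOT NULL PRIMARY KEY CLUSTERED); -- numeric type pinned down explicitly

    INSERT INTO dbo.Tally (N)
    SELECT TOP (1000000)
           CONVERT(INT, ROW_NUMBER() OVER (ORDER BY (SELECT NULL))) -- ROW_NUMBER returns BIGINT, so convert once at load time
    FROM master.sys.all_columns ac1
    CROSS JOIN master.sys.all_columns ac2;

    -- From here on, joining Tally.N to any INT column compares like to like,
    -- so no conversion (implicit or explicit) shows up in the plan.

    If you genuinely need BIGINT values, declare N as BIGINT instead, per the advice above.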

  • Quick thoughts:

    This is normal, and there isn't much one can (or has to) do about it: the optimiser may skip explicit conversions and introduce implicit ones, since nothing says it has to do exactly what's written in the query as long as it produces the same results. This becomes apparent if we look at two examples, the first without an insert and the second with one.

    Sample 1, no implicit conversion found in this plan.

    USE TEEST;
    GO
    SET NOCOUNT ON;

    DECLARE @SAMPLE_SIZE BIGINT = 10;

    SET STATISTICS PROFILE ON;
    -- Evaluate the data types and conversions
    ;WITH T(N) AS (SELECT N FROM (VALUES (0),(0),(0),(0),(0),(0),(0),(0),(0),(0)) AS X(N))
    , NUMS(N) AS (SELECT TOP(@SAMPLE_SIZE + 1) ROW_NUMBER() OVER (ORDER BY (SELECT NULL)) AS N FROM T T1,T T2,T T3,T T4,T T5,T T6)
    SELECT
        NM.N AS BINT_N
       ,CONVERT(INT,NM.N,0) AS INT_N               -- INT
       ,DATALENGTH(NM.N) AS DLEN                   -- BIGINT is 8
       ,DATALENGTH(CONVERT(INT,NM.N,0)) AS IDLEN   -- INT is 4
       ,CONVERT(VARCHAR(20),NM.N,0) AS VCH_VAL
    FROM NUMS NM;
    SET STATISTICS PROFILE OFF;

    Profile output (trimmed here to the Rows, Executes, Warnings and StmtText columns)

    Rows  Executes  Warnings           StmtText
    ----  --------  -----------------  ------------------------------------------------------------------
       0         0  NULL               |--Compute Scalar(DEFINE:([Expr1069]=CONVERT(int,[Expr1068],0), [Expr1070]=datalength([Expr1068]), [Expr1071]=datalength(CONVERT(int,[Expr1068],0)), [Expr1072]=CONVERT(varchar(20),[Expr1068],0)))
      11         1  NULL                    |--Top(TOP EXPRESSION:([@SAMPLE_SIZE]+(1)))
      11         1  NULL                         |--Sequence Project(DEFINE:([Expr1068]=row_number))
      11         1  NULL                              |--Segment
      11         1  NO JOIN PREDICATE                      |--Nested Loops(Inner Join)
       2         1  NO JOIN PREDICATE                           |--Nested Loops(Inner Join)
       1         1  NO JOIN PREDICATE                           |    |--Nested Loops(Inner Join)
       1         1  NO JOIN PREDICATE                           |    |    |--Nested Loops(Inner Join)
       1         1  NO JOIN PREDICATE                           |    |    |    |--Nested Loops(Inner Join)
       1         1  NULL                                        |    |    |    |    |--Constant Scan
       1         1  NULL                                        |    |    |    |    |--Constant Scan
       1         1  NULL                                        |    |    |    |--Constant Scan
       1         1  NULL                                        |    |    |--Constant Scan
       2         1  NULL                                        |    |--Constant Scan
      11         2  NULL                                        |--Constant Scan

    Sample 2: despite the explicit conversion to varchar(20), the optimiser introduces an implicit conversion to varchar(10) for the insert.

    USE TEEST;
    GO
    SET NOCOUNT ON;

    IF OBJECT_ID(N'dbo.T2') IS NOT NULL DROP TABLE dbo.T2;
    CREATE TABLE dbo.T2
    (
        BigIntColumn BIGINT NOT NULL
       ,VarcharColumn varchar(10) NOT NULL ---/// varchar datatype
    );

    DECLARE @SAMPLE_SIZE BIGINT = 10;

    SET STATISTICS PROFILE ON;
    -- Using an inline Tally table rather than sys.all_columns; the output column (N) is BIGINT
    ;WITH T(N) AS (SELECT N FROM (VALUES (0),(0),(0),(0),(0),(0),(0),(0),(0),(0)) AS X(N))
    , NUMS(N) AS (SELECT TOP(@SAMPLE_SIZE + 1) ROW_NUMBER() OVER (ORDER BY (SELECT NULL)) AS N FROM T T1,T T2,T T3,T T4,T T5,T T6)
    INSERT INTO dbo.T2(BigIntColumn,VarcharColumn)
    SELECT
        NM.N
       ,CONVERT(VARCHAR(20),NM.N,0)
    FROM NUMS NM;
    SET STATISTICS PROFILE OFF;

    Profile output has the implicit conversion, CONVERT_IMPLICIT(varchar(10), ...) (trimmed to the same columns)

    Rows  Executes  Warnings           StmtText
    ----  --------  -----------------  ------------------------------------------------------------------
      11         1  NULL               |--Table Insert(OBJECT:([TEEST].[dbo].[T2]), SET:([TEEST].[dbo].[T2].[BigIntColumn] = RaiseIfNullInsert([Expr1071]),[TEEST].[dbo].[T2].[VarcharColumn] = RaiseIfNullInsert([Expr1072])))
       0         0  NULL                    |--Compute Scalar(DEFINE:([Expr1072]=CONVERT_IMPLICIT(varchar(10),CONVERT(varchar(20),[Expr1071],0),0)))
      11         1  NULL                         |--Top(TOP EXPRESSION:([@SAMPLE_SIZE]+(1)))
      11         1  NULL                              |--Sequence Project(DEFINE:([Expr1071]=row_number))
      11         1  NULL                                   |--Segment
      11         1  NO JOIN PREDICATE                           |--Nested Loops(Inner Join)
       2         1  NO JOIN PREDICATE                                |--Nested Loops(Inner Join)
       1         1  NO JOIN PREDICATE                                |    |--Nested Loops(Inner Join)
       1         1  NO JOIN PREDICATE                                |    |    |--Nested Loops(Inner Join)
       1         1  NO JOIN PREDICATE                                |    |    |    |--Nested Loops(Inner Join)
       1         1  NULL                                             |    |    |    |    |--Constant Scan
       1         1  NULL                                             |    |    |    |    |--Constant Scan
       1         1  NULL                                             |    |    |    |--Constant Scan
       1         1  NULL                                             |    |    |--Constant Scan
       2         1  NULL                                             |    |--Constant Scan
      11         2  NULL                                             |--Constant Scan

  • This is a data warehouse system where we have a Cleansing database into which we pull data as-is from the source systems, so I can't change the table design; this is just an example.

    We have a lot of joins similar to this one, so I'm trying to figure out a way to avoid the implicit datatype conversions.

  • The best approach would probably be to use a CTE on the lower-cardinality table to do the conversion before the join, giving the server the opportunity to make better use of the indices. The truth is that one has to live with some implicit conversions; some are bad, but others may not carry any extra cost. The trick is to identify the costly ones.

    😎

  • The only way to avoid implicit conversions is to change the data types in the tables, or to make the conversions explicit, which isn't an improvement over implicit ones.

    You could try Eirikur's method, but even if you write the query with a CTE to stage the conversion first, there's no guarantee the query optimizer will interpret it that way. A sketch of the first option follows below.
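    As an illustration only, a minimal sketch against the thread's dbo.T2 (it assumes every VarcharColumn value converts cleanly to INT, which is not something verified in this thread):

    -- Fix the type where the data lives, so the join compares INT to INT
    -- and no conversion of any kind appears in the plan.
    -- NOTE: assumes all existing VarcharColumn values are valid integers
    -- and that the column is not indexed; the ALTER fails otherwise.
    ALTER TABLE dbo.T2 ALTER COLUMN VarcharColumn INT NOT NULL;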

    Luis C.
    General Disclaimer:
    Are you seriously taking the advice and code from someone from the internet without testing it? Do you at least understand it? Or can it easily kill your server?

    How to post data/code on a forum to get the best help: Option 1 / Option 2
  • Thanks for sharing your thoughts, Eric.

  • Hi Eirikur Eiriksson, thanks for your input.

    If you don't mind, could you please provide a demo script for my example that uses CTEs to avoid implicit conversions?

  • Cast and Convert Functions are found here.

    CTE examples are found here.

    __________________________________________________

    Against stupidity the gods themselves contend in vain. -- Friedrich Schiller
    Stop, children, what's that sound? Everybody look what's going down. -- Stephen Stills

  • Here is a quick example of using a CTE to avoid an implicit conversion and producing a decent execution plan.

    😎

    USE TEEST;
    GO
    SET NOCOUNT ON;

    IF OBJECT_ID(N'dbo.T1') IS NOT NULL DROP TABLE dbo.T1;
    CREATE TABLE dbo.T1
    (
        BigIntColumn BIGINT NOT NULL
       ,IntColumn INT NOT NULL ---/// int datatype
    );

    IF OBJECT_ID(N'dbo.T2') IS NOT NULL DROP TABLE dbo.T2;
    CREATE TABLE dbo.T2
    (
        BigIntColumn BIGINT NOT NULL
       ,VarcharColumn varchar(10) NOT NULL ---/// varchar datatype
    );

    DECLARE @SAMPLE_SIZE BIGINT = 100000;

    -- Using an inline Tally table rather than sys.all_columns; the output column (N) is BIGINT
    ;WITH T(N) AS (SELECT N FROM (VALUES (0),(0),(0),(0),(0),(0),(0),(0),(0),(0)) AS X(N))
    , NUMS(N) AS (SELECT TOP(@SAMPLE_SIZE) ROW_NUMBER() OVER (ORDER BY (SELECT NULL)) AS N FROM T T1,T T2,T T3,T T4,T T5,T T6)
    INSERT INTO dbo.T1(BigIntColumn,IntColumn)
    SELECT
        NM.N
       ,CONVERT(INT,NM.N,0)
    FROM NUMS NM
    WHERE NM.N % 17 = 0;

    -- Using an inline Tally table rather than sys.all_columns; the output column (N) is BIGINT
    ;WITH T(N) AS (SELECT N FROM (VALUES (0),(0),(0),(0),(0),(0),(0),(0),(0),(0)) AS X(N))
    , NUMS(N) AS (SELECT TOP(@SAMPLE_SIZE) ROW_NUMBER() OVER (ORDER BY (SELECT NULL)) AS N FROM T T1,T T2,T T3,T T4,T T5,T T6)
    INSERT INTO dbo.T2(BigIntColumn,VarcharColumn)
    SELECT
        NM.N
       ,CONVERT(VARCHAR(20),NM.N,0)
    FROM NUMS NM;

    CREATE UNIQUE NONCLUSTERED INDEX UNQNCLIDX_DBO_T2_VARCHARCOLUMN_INCL_BIGINTCOLUMN ON [dbo].[T2] ([VarcharColumn]) INCLUDE ([BigIntColumn]);

    -- SIMPLE TEST QUERY, RESULTS IN AN INDEX SCAN AND A MERGE JOIN, NO IMPLICIT CONVERSION IN THE PLAN
    SET STATISTICS PROFILE ON;
    ;WITH BASE_DATA AS
    (
        SELECT
            TA.BigIntColumn
           ,CONVERT(VARCHAR(20),TA.IntColumn,0) AS VCVAL
        FROM dbo.T1 TA
    )
    SELECT
        TB.BigIntColumn
       ,BD.BigIntColumn
       ,TB.VarcharColumn
    FROM dbo.T2 TB
    INNER JOIN BASE_DATA BD
    ON TB.VarcharColumn = BD.VCVAL
    WHERE BD.BigIntColumn > 5000
    OPTION (MAXDOP 1);
    SET STATISTICS PROFILE OFF;

    Profile output (trimmed to the Rows, Executes, Warnings and StmtText columns)

    Rows    Executes  Warnings  StmtText
    ------  --------  --------  ------------------------------------------------------------------
      5588         1  NULL      |--Merge Join(Inner Join, MERGE:([TB].[VarcharColumn])=([Expr1005]), RESIDUAL:([TEEST].[dbo].[T2].[VarcharColumn] as [TB].[VarcharColumn]=[Expr1005]))
     99996         1  NULL           |--Index Scan(OBJECT:([TEEST].[dbo].[T2].[UNQNCLIDX_DBO_T2_VARCHARCOLUMN_INCL_BIGINTCOLUMN] AS [TB]), ORDERED FORWARD)
      5588         1  NULL           |--Sort(ORDER BY:([Expr1005] ASC))
         0         0  NULL                |--Compute Scalar(DEFINE:([Expr1005]=CONVERT(varchar(20),[TEEST].[dbo].[T1].[IntColumn] as [TA].[IntColumn],0)))
      5588         1  NULL                     |--Table Scan(OBJECT:([TEEST].[dbo].[T1] AS [TA]), WHERE:([TEEST].[dbo].[T1].[BigIntColumn] as [TA].[BigIntColumn]>(5000)))

  • That's awesome, Eirikur Eiriksson. One other thing came to mind while looking at the execution plan: I see a Sort happening and a Merge Join being used.

    Is there a way to avoid the sort + merge and get a nested loop or hash physical join instead? If so, how can we achieve it?

  • vsamantha35 (8/3/2016)


    That's awesome, Eirikur Eiriksson. One other thing came to mind while looking at the execution plan: I see a Sort happening and a Merge Join being used.

    Is there a way to avoid the sort + merge and get a nested loop or hash physical join instead? If so, how can we achieve it?

    You can force a type of join with a hint, but I would advise you against it. Probably 99.9% of the time, the optimiser knows better than you the most expeditious way to implement a join: it bases its choice on table statistics, while users are usually just guessing. You will probably see performance suffer because of the hint and wind up undoing it.

    If you truly believe a join choice is incorrect, it is likely the result of parameter sniffing; you can google a dozen articles on that problem. The net of it is: if you are usually going to run a query to get a dozen rows, don't make the first run one that fetches 10,000 rows, because the plan produced may differ depending on volume. If a query is sometimes used to get a single row and sometimes used to get 100,000 rows, the RECOMPILE option can help. Both options are sketched below.
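    To make both options concrete, here is a hedged sketch using the thread's tables (a syntax illustration, not a recommendation):

    -- Forcing the physical join with a hint; note that a join hint also
    -- fixes the join order as written, which is another reason to avoid it.
    SELECT t1.IntColumn, t2.VarcharColumn
    FROM dbo.T1 t1
    INNER HASH JOIN dbo.T2 t2
        ON t1.IntColumn = CONVERT(INT, t2.VarcharColumn);

    -- Recompiling on every execution so the plan always matches the current
    -- values, which sidesteps the parameter-sniffing problem described above.
    SELECT t1.IntColumn, t2.VarcharColumn
    FROM dbo.T1 t1
    INNER JOIN dbo.T2 t2
        ON t1.IntColumn = CONVERT(INT, t2.VarcharColumn)
    OPTION (RECOMPILE);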

    __________________________________________________

    Against stupidity the gods themselves contend in vain. -- Friedrich Schiller
    Stop, children, what's that sound? Everybody look what's going down. -- Stephen Stills

  • Thank you Dixie.

  • vsamantha35 (8/3/2016)


    That's awesome, Eirikur Eiriksson. One other thing came to mind while looking at the execution plan: I see a Sort happening and a Merge Join being used.

    Is there a way to avoid the sort + merge and get a nested loop or hash physical join instead? If so, how can we achieve it?

    There's nothing wrong with a Merge Join, if it's appropriate for the data and the size of the data.

    As for not being able to change the datatypes in a DW, I'd find out how many of the people involved like pork chops, especially if the change is the right thing to do instead of all these workarounds. Part of performance relies on proper design, and it sounds like the DW has some design problems that were created for the sake of display purposes.

    There are exceptions to everything but, usually, the absolute best and most reliable way to get performance out of SQL Server is to enable it to do what it does best instead of trying to force it past design flaws that cause such things as implicit conversions. To convince others, don't talk about it because they're convinced they did the right thing to begin with. Instead, write a coded demonstration that proves they didn't and that they need to make a change.
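    For instance, here is a minimal, hedged sketch of such a demonstration against the dbo.T2 table from earlier in this thread (the literal 500 is illustrative):

    -- INT outranks VARCHAR in datatype precedence, so comparing the varchar
    -- column to an int literal forces CONVERT_IMPLICIT on the COLUMN itself,
    -- which defeats the index on VarcharColumn and produces a scan.
    SELECT BigIntColumn FROM dbo.T2 WHERE VarcharColumn = 500;

    -- With a literal of the matching type, the column is left untouched
    -- and the same query gets an index seek.
    SELECT BigIntColumn FROM dbo.T2 WHERE VarcharColumn = '500';

    Comparing the two actual plans side by side usually makes the case better than any argument.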

    If you want to quickly create large volumes of constrained random data for your demonstrations, here are a couple of articles to help.

    http://www.sqlservercentral.com/Authors/Articles/Jeff_Moden/80567/

    http://www.sqlservercentral.com/articles/Test+Data/88964/

    --Jeff Moden


    RBAR is pronounced "ree-bar" and is a "Modenism" for Row-By-Agonizing-Row.
    First step towards the paradigm shift of writing Set Based code:
    ________Stop thinking about what you want to do to a ROW... think, instead, of what you want to do to a COLUMN.

    Change is inevitable... Change for the better is not.


    Helpful Links:
    How to post code problems
    How to Post Performance Problems
    Create a Tally Function (fnTally)

  • Thanks, Jeff, for the detailed reply.
