Yeah, I'm not surprised.
If the maximum iteration depth isn't that high, the while loop's performance won't be that bad.
The recursive CTE can't tell which matches it has already found (I tried to build that in, but it did not go well), so it keeps linking down to the maximum recursion depth and then cleans up the duplicates in the grouping step.
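For reference, the recursive CTE version I'm describing looks roughly like this (a sketch only, assuming the same #orderPartyIds/#personIds temp tables from your code; the depth cap of 10 is arbitrary). Note there is no way in the recursive member to say "skip pairs already found," which is why it churns to the limit and deduplicates afterwards:

```sql
;WITH linked AS (
    /* anchor: the seed rows at increment 0 */
    SELECT o.orderid, o.partykey, 0 AS incrementNbr
    FROM #orderPartyIds o
    WHERE o.incrementNbr = 0

    UNION ALL

    /* recursive step: follow partykey -> personId -> partykey links;
       duplicates can't be filtered out here, so the recursion keeps
       producing them until the depth cap is hit */
    SELECT l.orderid, p1.partykey, l.incrementNbr + 1
    FROM linked l
    INNER JOIN #personIds p ON l.partykey = p.partykey
    INNER JOIN #personIds p1 ON p.personId = p1.personId
    WHERE l.incrementNbr < 10
)
/* the DISTINCT here is the "fix it in the grouping" part */
SELECT DISTINCT orderid, partykey
FROM linked
OPTION (MAXRECURSION 10);
```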
I don't think dynamic SQL would be a performance boost either. Maybe if hundreds of iterations were possible, one of these approaches could become worth it.
You might be better off tweaking the loop for performance gains instead of replacing it.
Try this out (only the loop is shown below; it needs the rest of your code):
WHILE @increment < 10 AND (@didinsert > 0 OR @increment <= 2)
BEGIN
    INSERT INTO #orderPartyIds (orderid, partykey, incrementNbr)
    SELECT DISTINCT o.orderid, p1.partykey, @increment
    FROM #orderPartyIds o
    INNER JOIN #personIds p ON o.partykey = p.partykey
    INNER JOIN #personIds p1 ON p.personId = p1.personId
    WHERE o.incrementNbr = @increment - 1
    /* skip order/party pairs we have already recorded */
    AND NOT EXISTS (
        SELECT o1.orderid
        FROM #orderPartyIds o1
        WHERE o1.orderid = o.orderid
        AND o1.partykey = p1.partykey
    )

    SET @didinsert = @@ROWCOUNT

    /* old approach: set the variable back to one if data was inserted */
    --SELECT @didinsert = 1
    --FROM #orderPartyIds
    --WHERE incrementNbr = @increment

    /* increment the counter by one */
    SET @increment = @increment + 1
END /* end of WHILE loop */
In other words, you should not need to query your temp table to find out whether you are at max depth. @@ROWCOUNT will tell you if anything was inserted on the current increment.