To Infinity and Beyond With 1=1

  • Comments posted to this topic are about the item To Infinity and Beyond With 1=1

  • One of my absolute pet hates.  I'm not quite sure whether this is because it's an abomination in itself (it just smells of "I couldn't be bothered to actually think about what I'm doing") or learned behaviour from the abysmal quality of the code you generally find it in.

    I'm a DBA.
    I'm not paid to solve problems. I'm paid to prevent them.

  • You've got to be damn sure of your exit condition, that's for sure. It can be very expensive on serverless platforms such as AWS Lambda or GCP Cloud Functions.

  • I wouldn't throw the baby out with the bathwater just yet. I deal with a lot of legacy code in both Oracle and MS-SQL. My bane is cursors. The client or the timeline prevents refactoring into set operations. Oracle has an elegant FOR…LOOP statement that handles these situations well. In MS-SQL, the client usually followed the standard cursor pattern with two FETCH statements (sketched below for comparison). For maintenance reasons, I prefer to fetch only once, and WHILE (1=1) facilitates this pattern: the one and only FETCH is the first statement within the WHILE loop, the status is tested, and the loop is exited on a non-zero result.

    I agree with Mr. Poole: you have to know all your exit conditions.
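
    For readers who have not seen it, a minimal sketch of the "standard" two-FETCH cursor pattern referred to above (cursor, table, and variable names here are illustrative only); the FETCH duplicated at the bottom of the loop is exactly what the single-FETCH WHILE (1=1) variant shown later in this thread avoids:

        DECLARE @name sysname;

        DECLARE c CURSOR LOCAL FAST_FORWARD FOR
            SELECT name FROM sys.tables;
        OPEN c;

        FETCH NEXT FROM c INTO @name;        -- first FETCH, before the loop
        WHILE @@FETCH_STATUS = 0
        BEGIN
            PRINT @name;                     -- DoSomething

            FETCH NEXT FROM c INTO @name;    -- second FETCH, duplicated at the end of the loop
        END;

        CLOSE c;
        DEALLOCATE c;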

  • I still think WHILE (1=1) is poor coding, even with cursors. Using a test that can end is a better practice.

  • Well, there's a difference between a poor design choice and a legitimate bug. And I don't see how using WHILE (1=1) without a proper break condition is any worse than WHILE (x < y) and not incrementing x properly.

  • OK, theoretically you could, for the cursor "problem", declare an additional variable, increment it inside the loop, and write WHILE @i < 999999999 instead of 1=1 (sketched below).

    But what if someone comments out the SET @i += 1; line by mistake, or a misplaced IF skips it, or someone decides it fits better at the end of the loop instead of the beginning and a CONTINUE somewhere in the code bypasses it? The chances of that are about the same as, or even higher than, someone invalidating the IF @@fetch_status <> 0 BREAK; (after the FETCH), but you carry the additional overhead of a variable (that someone could misuse in future code) for no real benefit.
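
    To make that concrete, a sketch of the counter-based variant (the variable name and limit are taken from the text above); the safety of the whole loop hangs on the single SET line surviving future edits:

        DECLARE @i int = 0;

        WHILE @i < 999999999
        BEGIN
            -- FETCH / DoSomething

            SET @i += 1;    -- comment this out, or jump over it with an IF or a
                            -- CONTINUE, and the loop never ends
        END;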

    Even when using WHILE @@fetch_status = 0 it can break if the cursor is not declared STATIC and someone inside the loop writes the wrong stuff to the base table. And WHILE EXISTS (SELECT ...) is even more prone to break if you do not update/delete the base row (because of error handling, a CONTINUE, something commented out, whatever).

    So I still think that the following code is the best way to work with a cursor (when you have to use one, e.g. because of VERY BIG data):

    DECLARE @dummy varchar(50);     -- target variable for the single FETCH

    DECLARE CurName CURSOR LOCAL FORWARD_ONLY STATIC READ_ONLY FOR
        SELECT 'dummy';
    OPEN CurName;

    WHILE 1 = 1
    BEGIN
        FETCH NEXT FROM CurName INTO @dummy;
        IF @@FETCH_STATUS <> 0 BREAK;    -- single exit point, directly after the only FETCH

        -- DoSomething
    END;

    CLOSE CurName;
    DEALLOCATE CurName;

    God is real, unless declared integer.

  • I've seen a recursive CTE get thrown into an infinite loop that also had OPTION (MAXRECURSION 0). The end result was that the query ran until the drive that held TempDB ran out of space.
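
    A hypothetical reconstruction of that failure mode (the CTE and column names are made up): the recursive member never changes n, so the WHERE clause never becomes false, and OPTION (MAXRECURSION 0) removes the default limit of 100 recursion levels, leaving the worktable in TempDB to grow until the drive fills:

        WITH Runaway AS
        (
            SELECT 1 AS n
            UNION ALL
            SELECT n                 -- bug: n is never incremented...
            FROM Runaway
            WHERE n < 10             -- ...so this test never becomes false
        )
        SELECT n
        FROM Runaway
        OPTION (MAXRECURSION 0);     -- lifts the default 100-level recursion limit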
