Cursor replacement suggestions

  • I have a process that uses SQL Agent jobs to process data coming from many sources, and I'm seeing performance issues trying to keep up. I'm looking for suggestions on how to process the data table. I currently have a single SP processing from the table, and it takes an equip_id parameter, so I break the SQL Agent jobs up to process each piece of equipment individually. If I have records from equip 205v1, 205v2 and 205v3, I make 3 SQL Agent jobs for my SP and call it like Import_Katt '205v1%', Import_Katt '205v2%', Import_Katt '205v3%'.

    Records for these pieces of equipment hit the data tables every 20-30 seconds, so I try to break up the processing by creating the multiple SQL Agent jobs, each processing from the queue. I listed the cursor below that pulls from the data table. After capturing these records, the data goes through other SPs that process from the cursor. I can list other components as needed, but I'm looking for ideas to make the processing faster and more efficient.

    Thanks.

    DECLARE Quality_Cursor CURSOR LOCAL DYNAMIC READ_ONLY FOR
    SELECT DISTINCT [Key], [Date], [Time], Small_Beads, Large_Beads, Continuity, Speed, Continuity_Voltage,
    Continuity_Amp, Small_Bead_Threshold, Large_Bead_Threshold, Equip_ID, XLarge_Beads, XLarge_Bead_Threshold
    FROM Import_Katt_3
    WHERE (Processed = N'FALSE') AND (Equip_ID LIKE @Equipment)
    ORDER BY [Date], [Time]
  • It might be helpful if you would elaborate on the 'performance issues'. Are we talking CPU, disk access, locking/concurrency, parallelism ...?

    What makes you think that the cursor is causing the issue?


  • The processing can't keep up with the number of records hitting the queue... parallelism?

    Based on the design, do you think pulling records from the table using a cursor is efficient?

  • Do you execute one or more procs for every row hitting the queue?


  • There are multiple SQL Agent jobs processing from the queue, but I tried to break them out so they were only picking up individual records from the queue.

    If I have records from equip 205v1, 205v2 and 205v3, I make 3 SQL Agent jobs for my SP and call it like Import_Katt '205v1%', Import_Katt '205v2%', Import_Katt '205v3%'.

    After the data is loaded into the cursor (BEGIN - start transaction), other SPs are used in the process before a final END.

    Thanks.

  • OK, what I was angling towards is this: can your process be re-engineered to process all currently unprocessed rows in the queue, rather than picking them off one by one?
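
    For example, something along these lines (just a rough sketch, assuming the [Key] column is unique and that the downstream logic can be rewritten to work on a whole set rather than one row at a time):

    -- Claim every currently unprocessed row in one set-based statement,
    -- then hand the whole batch to the downstream processing as a set
    -- instead of looping a cursor over it.
    DECLARE @Batch TABLE ([Key] INT PRIMARY KEY);   -- adjust the type to match Import_Katt_3.[Key]

    UPDATE ik
    SET    Processed = N'TRUE'
    OUTPUT inserted.[Key] INTO @Batch ([Key])
    FROM   dbo.Import_Katt_3 AS ik
    WHERE  ik.Processed = N'FALSE';
    -- (Keep an Equip_ID LIKE filter here only if you still want to split the work by equipment.)

    -- The downstream procs would then join to @Batch (or a temp table holding
    -- the same keys) and process all of the claimed rows in one pass.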


  • As we don't know what the processing is, no one will be able to say whether or not a DYNAMIC READ_ONLY CURSOR is the best way to process your data. Can you supply details of what the processing does? It might be that you are just missing some indexes, or need to rewrite some of the queries in the processing to speed it up.

  • @Phil   I was thinking about the queue too. Do you have any examples or suggestions to get me started?

    Thanks.

  • Bruin wrote:

    @Phil   I was thinking about the queue too. Do you have any examples or suggestions to get me started?

    Thanks.

    As Jonathan AC Roberts mentioned, this is difficult without knowing more. Whether or not you can implement my suggestion depends on your processing needs.

    Or are you simply looking for guidance relating to how to handle queue processing and concurrency (which can be a little tricky if you have multiple instances of the same proc running)?


  • Or are you simply looking for guidance relating to how to handle queue processing and concurrency (which can be a little tricky if you have multiple instances of the same proc running)?

    -- Yes.

  • Here is a detailed article on the topic: http://rusanu.com/2010/03/26/using-tables-as-queues/
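
    The core pattern from that article, applied to your table, would look something like this (a minimal sketch; the batch size and hints are illustrative, and the article covers the trade-offs in detail):

    -- Each running instance atomically claims a small batch of unprocessed rows.
    -- READPAST skips rows already locked by another instance, so several copies
    -- of the proc can run concurrently without blocking each other or grabbing
    -- the same rows twice.
    DECLARE @Claimed TABLE ([Key] INT PRIMARY KEY);   -- match the type of Import_Katt_3.[Key]

    UPDATE TOP (100) ik
    SET    Processed = N'TRUE'
    OUTPUT inserted.[Key] INTO @Claimed ([Key])
    FROM   dbo.Import_Katt_3 AS ik WITH (ROWLOCK, UPDLOCK, READPAST)
    WHERE  ik.Processed = N'FALSE';

    -- Process the rows identified in @Claimed as a set, then loop back for the
    -- next batch until nothing is claimed.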


  • Bruin wrote:

    After the data is loaded into the cursor (BEGIN - start transaction), other SPs are used in the process before a final END. Thanks.

    You posted the code that creates the cursor and nothing else.  There's nothing for us to even make a suggestion on.

    --Jeff Moden



  • I need to measure how many records I'm processing in my Import table, which has a Date field (DATE) and a Time field (TIME(7)).

    I'd like to know records per minute and records per hour.
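
    Something like this is roughly what I'm picturing, if it helps clarify (a rough sketch; I'm assuming grouping on the Date column plus the hour and minute of the Time column is the right idea, and that the Import table is Import_Katt_3):

    -- Records per minute, grouped on the DATE column plus the hour/minute of the TIME(7) column.
    SELECT  [Date],
            DATEPART(HOUR,   [Time]) AS Hr,
            DATEPART(MINUTE, [Time]) AS Mn,
            COUNT(*)                 AS RecordsPerMinute
    FROM    dbo.Import_Katt_3
    GROUP BY [Date], DATEPART(HOUR, [Time]), DATEPART(MINUTE, [Time])
    ORDER BY [Date], Hr, Mn;

    -- For records per hour, drop the DATEPART(MINUTE, ...) expression from the
    -- SELECT list and the GROUP BY.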

     

    Thanks.

  • Go back to your first post... see that cursor definition you posted?  Yeah... we need to know what you're trying to do with the data that cursor gathers.  We don't need to know about the jobs that you're currently using because (speaking only for myself) I think that's the wrong way to approach any of this just like I believe using a cursor is the wrong way to do this.

    Let's get back to the basics so we don't have to speculate.  We need to know the gazintas and the gazottas... see the first link in my signature line below and provide some relevant readily consumable data and a reasonable description of what you're going to do with the data and maybe even a picture of what you'd like it to look like.

    Yep... I know you're a bit frustrated with this problem... so are we.  😉  You're not describing things correctly and I'm here to tell you that any use of a cursor will probably continue to frustrate you for your simple "queue" processing. 😉

    First link in my signature line below... just do it.

     

    --Jeff Moden


    RBAR is pronounced "ree-bar" and is a "Modenism" for Row-By-Agonizing-Row.
    First step towards the paradigm shift of writing Set Based code:
    ________Stop thinking about what you want to do to a ROW... think, instead, of what you want to do to a COLUMN.

    Change is inevitable... Change for the better is not.


    Helpful Links:
    How to post code problems
    How to Post Performance Problems
    Create a Tally Function (fnTally)

  • I would be interested to hear, if not SQL Agent jobs, what method you would suggest to allow processing of multiple Import records in parallel.

     

    Thanks.
