SQL procedure runs fast in Management Studio but slow when called from VB.

  • First time running up against this. I have a VB app calling a stored procedure with two parameters, a start date and an end date. I changed the parameters in the procedure to local variables with no success, and tried a recompile with no success either. If I put in a few months, it drags the SQL Server almost to a halt. I have no idea why it would do this; again, it works fantastically in Management Studio with no issue. Any insight would be great. Thanks.

  • Connection settings. Also look at the actual execution plan in both situations and compare them.

    Steve (aka sgmunson) 🙂 🙂 🙂
    Rent Servers for Income (picks and shovels strategy)
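
    To compare the connection settings Steve mentions, you can look at both sessions side by side while the app and SSMS are each connected. A sketch (run from SSMS; the column list is a selection, and ARITHABORT is the setting that most often differs between SSMS and a .NET client):

    ```sql
    -- List user sessions with the SET options most likely to differ
    -- between SSMS and an application connection.
    SELECT session_id,
           program_name,
           arithabort,
           ansi_nulls,
           quoted_identifier
    FROM sys.dm_exec_sessions
    WHERE is_user_process = 1;
    ```

    If the `arithabort` column shows 1 for the SSMS session and 0 for the VB app's session, the two are compiling against different cached plans.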

  • The connection strings are the same as in other programs. See below:
    Dim constring As String = "Data Source=" + Server + ";Initial Catalog=" + _
                      database + ";Integrated Security=SSPI"
    Dim con As SqlConnection = New SqlConnection(constring)
    con.Open()

    Can you help me with the execution plan? This is something new to me. Thanks.

  • See Erland Sommarskog's article for a good explanation of what's probably going on here.

    http://www.sommarskog.se/query-plan-mysteries.html

    Thomas Rushton
    blog: https://thelonedba.wordpress.com

  • A bit over my head, but I will try to get through it.

  • The Sommarskog article is a very good one. If you're having problems working through it, I'd say the first thing to check is that the sessions running the query have the same connection properties, such as ARITHABORT. Microsoft says ARITHABORT should always be set to ON, as it is in SSMS, even though for .NET and other applications the default is OFF:
    https://docs.microsoft.com/en-us/sql/t-sql/statements/set-arithabort-transact-sql

    If the ARITHABORT setting is the problem, then trying to solve parameter sniffing won't really help. You can configure an instance to default all connections to ARITHABORT ON using SSMS: server properties, Connections page, "arithmetic abort" setting:
    https://docs.microsoft.com/en-us/sql/database-engine/configure-windows/server-properties-connections-page
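
    A quick way to test this theory is to mimic the .NET default inside SSMS and see whether the slowness reproduces there too. A sketch (the procedure and parameter values are placeholders, not from the original post):

    ```sql
    -- Reproduce the application's session settings in SSMS.
    -- If the proc is now slow here as well, ARITHABORT (and the
    -- separate plan cached under it) is the likely culprit.
    SET ARITHABORT OFF;
    EXEC dbo.YourProc @lstartdate = '2018-01-01', @lenddate = '2018-06-30';
    SET ARITHABORT ON;  -- restore the SSMS default
    ```

    Note this only confirms that the two settings produce two different cached plans; the underlying slowness is still usually a parameter-sniffing issue that the setting difference exposes.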

  • bswhipp - Thursday, October 18, 2018 12:47 PM

    a bit over my head but I will try to get through it.

    If you have plans for both of those, you can pass the plan handle to sys.dm_exec_plan_attributes. It shows all the options set for the query, which is part of what Erland's article discusses, what Steve was mentioning, and what Chris is referring to in the post above.

    Sue
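
    A sketch of what Sue describes, pulling the plan handles for a procedure out of the cache and feeding them to sys.dm_exec_plan_attributes ("YourProc" is a placeholder for the actual procedure name):

    ```sql
    -- Find cached plans for the procedure and show their set_options.
    -- Two rows with different set_options values means two separate
    -- plans were compiled under different session settings.
    SELECT cp.plan_handle, pa.attribute, pa.value
    FROM sys.dm_exec_cached_plans AS cp
    CROSS APPLY sys.dm_exec_sql_text(cp.plan_handle) AS st
    CROSS APPLY sys.dm_exec_plan_attributes(cp.plan_handle) AS pa
    WHERE st.text LIKE '%YourProc%'
      AND pa.attribute = 'set_options';
    ```

    The value is a bitmask of the session SET options in effect at compile time; the ARITHABORT bit is 4096, so two plans whose values differ by exactly 4096 differ only in that setting.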

  • I found the code that is offending. It is a SELECT statement, and within it there is a subquery against a very, very large table. I know a select within a select is not the best way to do it, but for some reason, when I join the table instead I get different/incorrect results even though I am using the same criteria. The execution plan suggested I put a nonclustered index on the table. I am not sure of the impact on the table if I do this; the table is part of our ERP system and I don't want to mess with that if I don't have to.

  • Can you take a look at this for me? Maybe I am overlooking something.

    This is the offending code:
    SELECT DISTINCT fsono, fprodcl, fpartno, frev, fccompany, lstclassify,
        COALESCE((SELECT SUM(ABS(fnamount)) / 2 FROM m2mdata01..ocdist WHERE fjob_so = fsono
            AND #PMARevenueTemp.fpartno = ocdist.fpartno AND frev = fcpartrev
            AND fdate >= @lstartdate AND fdate <= @lenddate), 0) AS Cogs
    INTO #PMACogs
    FROM #PMARevenueTemp

    This yields 359.00 for Cogs, which is correct.
    I tried to change it to this, and it yields 1750.00 for Cogs:

    SELECT DISTINCT #PMARevenueTemp.fsono, #PMARevenueTemp.fprodcl, #PMARevenueTemp.fpartno, #PMARevenueTemp.frev,
        #PMARevenueTemp.fccompany, #PMARevenueTemp.lstclassify,
        SUM(ABS(fnamount)) / 2 AS Cogs
    INTO #PMACogs
    FROM #PMARevenueTemp
    INNER JOIN m2mdata01..ocdist ON fjob_so = fsono
        AND #PMARevenueTemp.fpartno = ocdist.fpartno AND #PMARevenueTemp.frev = ocdist.fcpartrev
        AND fdate >= @lstartdate AND fdate <= @lenddate
    GROUP BY fsono, fprodcl, #PMARevenueTemp.fpartno, frev, fccompany, lstclassify

  • OK, now I know why: there is more than one record in #PMARevenueTemp with the same fsono, fpartno, and frev, so the join multiplies the sum. But how do I get around that? I only want to pick up that information once per combination and then sum from the other table.

  • Without the structure, I'm not going to attempt to write the T-SQL, but you could use a TOP 1 value for the matching between your temporary table and the actual table. That's going to work better with indexes (assuming good ones are in place) than a DISTINCT will.

    For more thorough help, I'd need to see the two execution plans to fully understand.

    "The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
    - Theodore Roosevelt

    Author of:
    SQL Server Execution Plans
    SQL Server Query Performance Tuning
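
    One way to sketch the idea Grant describes, using the table and column names from the earlier posts: collapse the temp table to one row per key first, then compute the sum once per row with OUTER APPLY, so the join can no longer multiply it. This is an illustration under the assumption that (fsono, fpartno, frev) is the key you want one row for:

    ```sql
    -- One row per distinct key from the temp table; the COGS sum is
    -- computed once per row, so duplicates can't inflate it.
    SELECT r.fsono, r.fprodcl, r.fpartno, r.frev, r.fccompany, r.lstclassify,
           COALESCE(c.Cogs, 0) AS Cogs
    INTO #PMACogs
    FROM (SELECT DISTINCT fsono, fprodcl, fpartno, frev, fccompany, lstclassify
          FROM #PMARevenueTemp) AS r
    OUTER APPLY (SELECT SUM(ABS(o.fnamount)) / 2 AS Cogs
                 FROM m2mdata01..ocdist AS o
                 WHERE o.fjob_so = r.fsono
                   AND o.fpartno = r.fpartno
                   AND o.fcpartrev = r.frev
                   AND o.fdate >= @lstartdate
                   AND o.fdate <= @lenddate) AS c;
    ```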

  • I got around it by selecting a 0 as the field, then doing an UPDATE on the table that sets the field from a select statement. It runs lickety-split now. Thanks to all, though.
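
    A sketch of that two-step approach with the names from the earlier posts (the money type for the zeroed column is an assumption):

    ```sql
    -- Step 1: materialise one row per key with a placeholder Cogs of 0.
    SELECT DISTINCT fsono, fprodcl, fpartno, frev, fccompany, lstclassify,
           CAST(0 AS money) AS Cogs
    INTO #PMACogs
    FROM #PMARevenueTemp;

    -- Step 2: fill Cogs in with a single correlated UPDATE, so the
    -- sum is computed exactly once per row.
    UPDATE c
    SET Cogs = COALESCE((SELECT SUM(ABS(o.fnamount)) / 2
                         FROM m2mdata01..ocdist AS o
                         WHERE o.fjob_so = c.fsono
                           AND o.fpartno = c.fpartno
                           AND o.fcpartrev = c.frev
                           AND o.fdate >= @lstartdate
                           AND o.fdate <= @lenddate), 0)
    FROM #PMACogs AS c;
    ```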
