Update statistics job is failing

  • I've set up a Maintenance Plan (yeah, yeah, code it up myself and create the Agent job, I know... :hehe:) to run an Update Statistics once a week on a server here.

    Well, the past couple of times it's run, it's come back with an error, and I'm not sure how to resolve it:

    Executing the query "UPDATE STATISTICS [dbo].[TABLENAME]
    WITH FULLSCAN
    " failed with the following error: "Error converting data type varchar to numeric.". Possible failure reasons: Problems with the query, "ResultSet" property not set correctly, parameters not set correctly, or connection not established correctly.

    No, the table in question isn't really named "TABLENAME." It does have quite a few computed columns, but there's nothing in BOL to indicate that these should be a problem. Yes, the table is wide (106 columns, 39 of which are computed), but it was (I believe) inherited from a mainframe system.

    I know I could exclude this DB from the update job, but I'd rather not. Further, the job has worked in the recent past (about 3 weeks back), and the dev responsible for the table hasn't made any schema changes in months.

    Any suggestions?

    Thanks,

    Jason

  • DBCC CheckDB WITH NO_INFOMSGS, ALL_ERRORMSGS, DATA_PURITY

    If you do a SELECT * FROM TableName do you get the same error?
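
    Something along these lines, with your own database and table names plugged in (both are placeholders here):

    -- integrity plus data purity check against the database in question
    DBCC CHECKDB ('YourDatabaseName') WITH NO_INFOMSGS, ALL_ERRORMSGS, DATA_PURITY;

    -- and the quick conversion test: selecting every column forces the computed columns to be evaluated
    SELECT * FROM [dbo].[TABLENAME];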

    Gail Shaw
    Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
    SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability

    We walk in the dark places no others will enter
    We stand on the bridge and no one may pass
  • While I do have an Agent job that runs DBCC CheckDB once a week as well (at a time that doesn't impact any other jobs), I'll manually run it as you suggest this weekend. I have to wait that long because there are Agent jobs that fire every 30 minutes to load data throughout the week; they don't run on the weekends...

    As for the select *, it's about 3/4 done (8.4 million rows in the table) with no problems so far...

    And it just finished with the same error. Sounds like the DBCC CheckDB job isn't catching this problem. I'll run your code this weekend and see what it spits out.

  • Update stats on that table manually, see if you get the same error. If so, update stats one by one, see which one gives you the error. Look at what columns that stats object is defined on.
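
    For a single stats object it's just this (the stats name below is purely a placeholder):

    -- update one named statistics object on its own
    UPDATE STATISTICS [dbo].[TABLENAME] ([_WA_Sys_00000029_0425A276]) WITH FULLSCAN;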

    Gail Shaw
    Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
    SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability

    We walk in the dark places no others will enter
    We stand on the bridge and no one may pass
  • That's going to be fun...

    😉

    56 auto-created stats to check, plus one on the primary key. Well, no one said this job would be easy!

    Hmm, time to see if I can do this by RBARing through an output of sys.stats, so I can automate this some...

    🙂

    Thanks much Gail!

  • OK, found a better way than the cursor, and already tried it.

    Used this:

    select
        name
        , STATS_DATE(object_id, stats_id) as LastUpdated
    from sys.stats
    where object_id = 735341684
    order by LastUpdated asc;

    to see which stats hadn't been updated this weekend, then manually tried to update each of those. The one that failed is an auto-created stats on one of the computed columns.
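
    Something along these lines shows which column each of those stats objects is defined on (just a sketch, using the same object_id as above):

    select s.name as StatsName, c.name as ColumnName
    from sys.stats s
    join sys.stats_columns sc
        on sc.object_id = s.object_id
        and sc.stats_id = s.stats_id
    join sys.columns c
        on c.object_id = sc.object_id
        and c.column_id = sc.column_id
    where s.object_id = 735341684
    order by s.name;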

    The datatype of the column is Numeric, and the computed text is: (CONVERT([numeric],[ColumnName],(0))/(100.0))

    Now, I checked ColumnName, and I think I've found the problem (time to give the dev a headache! 😉 ): ColumnName is a varchar(10). So I'm betting someone, somewhere, entered something like 1234ABC.123 in there.

    Time to do a select ColumnName from... and see what comes up.
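
    Roughly this kind of check (ISNUMERIC is only a rough filter; on SQL 2012 or later TRY_CONVERT is cleaner, and the table name below is the usual placeholder):

    -- rough first pass: rows whose varchar value won't pass a numeric check
    select ColumnName
    from [dbo].[TABLENAME]
    where ISNUMERIC(ColumnName) = 0;

    -- on SQL Server 2012 and later, TRY_CONVERT pinpoints exactly what the conversion chokes on
    select ColumnName
    from [dbo].[TABLENAME]
    where TRY_CONVERT(numeric, ColumnName) is null
      and ColumnName is not null;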

    And, found the problem child...

    Somehow a value of '16923951 0' got entered...

    Chatting with the Dev about making the computed column better able to deal with things like this...

  • jasona.work (10/7/2013)


    Somehow a value of '16923951 0' got entered...

    When you insert the value '16923951 0' into it, it will throw an error like "String or binary data would be truncated".

    -------Bhuvnesh----------
    I work only to learn Sql Server...though my company pays me for getting their stuff done;-)

  • Bhuvnesh (10/8/2013)


    jasona.work (10/7/2013)


    Somehow a value of '16923951 0' got entered...

    When you insert the value '16923951 0' into it, it will throw an error like "String or binary data would be truncated".

    That's a 10-character string, it'll insert without error into a varchar(10).

    It won't convert to numeric, but that will only be done when someone queries the calculated column, not at insert time, so the insert will succeed but subsequent selects will fail.
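
    Easy enough to see on a scratch table (all names here are made up):

    -- throwaway repro: the insert succeeds, selecting the computed column fails
    create table dbo.ConvertRepro
    (
        RawValue  varchar(10),
        CalcValue as (convert(numeric, RawValue, 0) / 100.0)
    );

    insert into dbo.ConvertRepro (RawValue) values ('16923951 0');   -- 10 characters, inserts fine

    select CalcValue from dbo.ConvertRepro;   -- fails: Error converting data type varchar to numeric.

    drop table dbo.ConvertRepro;   -- clean up (run separately if the failed select ends the batch)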

    Gail Shaw
    Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
    SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability

    We walk in the dark places no others will enter
    We stand on the bridge and no one may pass
  • Once more, thanks Gail!

    The Dev put together a function that checks for non-numeric characters and, if any are found, doesn't produce the computed value. In theory, the application that uses this DB is eventually going away, but who knows when that might really be...
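
    The general shape of the idea, as a rough sketch only (not his actual function; the computed column name "CalcValue" is a placeholder, and the pattern is a rough check rather than full validation):

    -- only attempt the conversion when the string looks numeric
    alter table [dbo].[TABLENAME] drop column CalcValue;
    alter table [dbo].[TABLENAME] add CalcValue as
    (
        case when ColumnName not like '%[^0-9.]%' and len(ColumnName) > 0
             then convert(numeric, ColumnName, 0) / 100.0
        end
    );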

  • A better solution might be to change the column to numeric. Or is that not possible?

    Gail Shaw
    Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
    SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability

    We walk in the dark places no others will enter
    We stand on the bridge and no one may pass
  • GilaMonster (10/8/2013)


    A better solution might be to change the column to numeric. Or is that not possible?

    That was my first suggestion, but according to the Dev, the people who send him the data basically told him "we make no promises it'll be in the correct format, and you're on your own once we send it to you."

    I've suggested long-term it would be better to catch "bad" data on the import side, but he's thinking the app is going away soon enough that it wouldn't be worth the effort to fix / improve / replace the BCP import with an SSIS package.

  • In that case may I suggest a simple after/instead of insert trigger to check for (and fix) invalid data?
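
    Something along these lines, as a rough sketch (table, column and key names are all placeholders, and it only strips embedded spaces since that's what bit you here; extend the REPLACE and filter for other bad characters as needed):

    create trigger trg_TABLENAME_CleanColumnName
    on [dbo].[TABLENAME]
    after insert
    as
    begin
        set nocount on;

        -- clean up the just-inserted rows so the computed column can convert
        update t
           set ColumnName = replace(t.ColumnName, ' ', '')
        from [dbo].[TABLENAME] t
        join inserted i
            on i.KeyColumn = t.KeyColumn
        where t.ColumnName like '% %';
    end;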

    Gail Shaw
    Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
    SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability

    We walk in the dark places no others will enter
    We stand on the bridge and no one may pass
  • I'm presuming that such a trigger would have better performance than the function (the function has to work on every row where the computed column is selected, while the trigger catches the problem at the beginning).

    I'll look into this and work with the Dev on implementing one: something to check for and replace anything non-numeric (or, depending on the requirements, maybe just strip it out).

    I suspect we can start with the function as a foundation, just improve it some.

    Once more, thanks Gail!
