Load Monitor

  • We have a Transaction Server writing machine-data records to SQL. The process writes records from different equipment to a table every 15 minutes. I would like some way to monitor that:

    1) it wrote to SQL -- there should be at least 3 groups per hour

    2) the temp data actually contains data and is not NULL

    There have been problems on the Transaction Server side where it writes records, but no temp values are passed. I would like an alert/email to run hourly, capture the previous hour's data, and do some sort of validation that data is actually processing.

    If the data passed is empty or NULL, send an alert to an email group so someone can check the Transaction Processor.

    Thanks.

    CREATE TABLE [dbo].[mact_data](
        [dtstamp] [datetime] NOT NULL,
        [equipment_id] [varchar](6) NOT NULL,
        [temperature] [int] NULL,
        [status] [int] NULL,
        CONSTRAINT [PK_mact_data_dtstamp] PRIMARY KEY CLUSTERED
        (
            [dtstamp] ASC,
            [equipment_id] ASC
        ) WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
    ) ON [PRIMARY]

    GO


  • I'm not sure what you're looking for in terms of help. Write a query that satisfies the criteria you outlined. Schedule that query to run using SQL Agent. Put an alert on the Agent job. Done. Unless I'm missing something.
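    The Agent plumbing described above can be sketched like this. A rough outline only: the job name, database name, and wrapper procedure are placeholders, not anything from this thread, and the schedule values come from the documented `sp_add_jobschedule` codes.

    ```sql
    -- Sketch: create an hourly Agent job that runs a validation procedure.
    -- Job name, database name, and usp_check_mact_data are placeholders.
    USE msdb;
    GO
    EXEC dbo.sp_add_job
        @job_name = N'Check mact_data hourly';

    EXEC dbo.sp_add_jobstep
        @job_name = N'Check mact_data hourly',
        @step_name = N'Validate last hour of data',
        @subsystem = N'TSQL',
        @database_name = N'YourDatabase',
        @command = N'EXEC dbo.usp_check_mact_data;';  -- wrap the validation query in a proc

    EXEC dbo.sp_add_jobschedule
        @job_name = N'Check mact_data hourly',
        @name = N'Hourly',
        @freq_type = 4,              -- daily
        @freq_interval = 1,
        @freq_subday_type = 8,       -- repeat in units of hours
        @freq_subday_interval = 1;   -- every 1 hour

    EXEC dbo.sp_add_jobserver
        @job_name = N'Check mact_data hourly';  -- attach the job to the local server
    ```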

    "The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
    - Theodore Roosevelt

    Author of:
    SQL Server Execution Plans
    SQL Server Query Performance Tuning

  • "Write a query that satisfies the criteria you outlined"

    Not sure how to setup the query to gather the information to alert groups of a problem.

    Looking back at previous time window to detect the problem.

    Thanks.

     

  • It's two steps. One, gather the info. Two, based on that info, send an alert. So, a rough outline of pseudo-code would be something like:

    SELECT COUNT(*)
    FROM dbo.mact_data
    WHERE dtstamp > DATEADD(MINUTE, -60, GETDATE());

    Then just set up the Agent job with the query. Again, that's not a precise query, but it should give you the general outline to go from.
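    Since the original question asks about "the previous hour's data", a variation worth noting: a check anchored to the clock hour rather than a sliding 60-minute window. A sketch, using the standard DATEADD/DATEDIFF truncation idiom:

    ```sql
    -- Count rows written during the previous complete clock hour.
    -- DATEDIFF(HOUR, 0, GETDATE()) truncates the current time to an hour count,
    -- and DATEADD rebuilds it as a datetime at the hour boundary.
    DECLARE @hour_start datetime = DATEADD(HOUR, DATEDIFF(HOUR, 0, GETDATE()) - 1, 0);
    DECLARE @hour_end   datetime = DATEADD(HOUR, 1, @hour_start);

    SELECT COUNT(*) AS rows_last_hour
    FROM dbo.mact_data
    WHERE dtstamp >= @hour_start
      AND dtstamp <  @hour_end;
    ```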

    "The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
    - Theodore Roosevelt

    Author of:
    SQL Server Execution Plans
    SQL Server Query Performance Tuning

  • Bruin wrote:

    We have a Transaction server writing records to SQL recording machine data. The process writes records every 15 minutes to a table from different equipment.  

    I'd take out the proverbial middle-man.  Why not have the "Transaction Server" do a quick check on the data and send an email if the data is NULL?

    Another possibility is to set up a job on SQL Server to pull the data from the "Transaction Server", and that job could check for content.  Checking for at least 3 per hour would then become "inherent" because, if the "Transaction Server" were down, the job (scheduled to execute every 15 minutes) would know it and you wouldn't need a separate occurrence-count check.  Of course, the job could also count the number of rows for each run and do a whole bunch more like, maybe, pre-aggregation of data for a given time period for reporting purposes, etc., as well as sending alert emails if anything went haywire.
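    The pre-aggregation idea mentioned above might look something like this. A sketch only: the rollup table dbo.mact_data_hourly and its columns are made up for illustration.

    ```sql
    -- Hypothetical hourly rollup for reporting; dbo.mact_data_hourly is an assumed target table.
    INSERT INTO dbo.mact_data_hourly (hour_start, equipment_id, row_count, avg_temperature)
    SELECT
        DATEADD(HOUR, DATEDIFF(HOUR, 0, dtstamp), 0) AS hour_start,  -- truncate to the hour
        equipment_id,
        COUNT(*)         AS row_count,
        AVG(temperature) AS avg_temperature
    FROM dbo.mact_data
    -- Restrict to the previous complete clock hour.
    WHERE dtstamp >= DATEADD(HOUR, DATEDIFF(HOUR, 0, GETDATE()) - 1, 0)
      AND dtstamp <  DATEADD(HOUR, DATEDIFF(HOUR, 0, GETDATE()), 0)
    GROUP BY DATEADD(HOUR, DATEDIFF(HOUR, 0, dtstamp), 0), equipment_id;
    ```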

    --Jeff Moden


    RBAR is pronounced "ree-bar" and is a "Modenism" for Row-By-Agonizing-Row.
    First step towards the paradigm shift of writing Set Based code:
    ________Stop thinking about what you want to do to a ROW... think, instead, of what you want to do to a COLUMN.

    Change is inevitable... Change for the better is not.


    Helpful Links:
    How to post code problems
    How to Post Performance Problems
    Create a Tally Function (fnTally)

  • Create a job to check on the data.  Even though it will check the last hour, you could schedule it to run as often as you wanted to check:

    DECLARE @mins_back_to_check int
    SET @mins_back_to_check = 60

    IF EXISTS(
        SELECT 1
        FROM (
            SELECT
                COUNT(*) AS total_rows_found,
                SUM(CASE WHEN equipment_id IS NULL OR equipment_id = '' THEN 1 ELSE 0 END) AS equipment_id_missing_count,
                SUM(CASE WHEN temperature IS NULL THEN 1 ELSE 0 END) AS temperature_missing_count,
                SUM(CASE WHEN status IS NULL THEN 1 ELSE 0 END) AS status_missing_count
            FROM dbo.mact_data
            WHERE dtstamp >= DATEADD(MINUTE, -@mins_back_to_check, GETDATE())
        ) AS derived1
        WHERE total_rows_found < 3 OR equipment_id_missing_count > 0 OR
            temperature_missing_count > 0 OR status_missing_count > 0
    )
    BEGIN
        PRINT '' --dummy statement; remove once you add code to the block
        /* set up email param values
        DECLARE @recipients varchar(max)
        DECLARE @subject nvarchar(255)
        DECLARE @... ...
        SET @subject = 'Missing and/or invalid temperature data for the past hour.'
        SET @... = ...
        EXEC msdb.dbo.sp_send_dbmail @recipients=@recipients, @subject=@subject, ...
        */
    END /*IF*/

    SQL DBA,SQL Server MVP(07, 08, 09) A socialist is someone who will give you the shirt off *someone else's* back.

  • Thanks Scott, that's what I was looking for, since the check runs where the data is pushed to SQL rather than pulled.

  • I found a temperature within the past hour and it didn't report back in the query.  Will I just be able to output derived1 in the sendmail?

    Thanks.

  • Yes, definitely.  That's another reason I wrote the query that way, to allow sample value(s) to be pulled for use in the email, if needed.

    For example:

    DECLARE @min_number_of_entries_expected int
    DECLARE @mins_back_to_check int
    SET @min_number_of_entries_expected = 3
    SET @mins_back_to_check = 60

    DECLARE @equipment_id_missing_count int
    DECLARE @temperature_missing_count int
    DECLARE @status_missing_count int
    DECLARE @total_rows_found int

    SELECT
        @equipment_id_missing_count = SUM(CASE WHEN equipment_id IS NULL OR equipment_id = '' THEN 1 ELSE 0 END),
        @status_missing_count = SUM(CASE WHEN status IS NULL THEN 1 ELSE 0 END),
        @temperature_missing_count = SUM(CASE WHEN temperature IS NULL THEN 1 ELSE 0 END),
        @total_rows_found = COUNT(*)
    FROM dbo.mact_data
    WHERE dtstamp >= DATEADD(MINUTE, -@mins_back_to_check, GETDATE())

    IF @total_rows_found < @min_number_of_entries_expected OR @equipment_id_missing_count > 0
        OR @temperature_missing_count > 0 OR @status_missing_count > 0
    BEGIN
        DECLARE @body nvarchar(max)
        DECLARE @recipients varchar(max)
        DECLARE @subject nvarchar(255)
        SET @recipients = '...'
        SET @subject = 'Missing and/or invalid temperature data for the past ' + CAST(@mins_back_to_check AS varchar(5)) + ' minutes.'
        SET @body = 'In the past ' + CAST(@mins_back_to_check AS varchar(5)) + ' minutes, there have been: ' +
            CASE WHEN @total_rows_found < @min_number_of_entries_expected THEN
                CAST(@total_rows_found AS varchar(5)) + ' total entries found but expected a minimum of ' +
                CAST(@min_number_of_entries_expected AS varchar(5)) + '. ' ELSE '' END +
            CASE WHEN @equipment_id_missing_count > 0 THEN
                CAST(@equipment_id_missing_count AS varchar(5)) + ' missing equipment id(s). ' ELSE '' END +
            CASE WHEN @temperature_missing_count > 0 THEN
                CAST(@temperature_missing_count AS varchar(5)) + ' missing temperature(s). ' ELSE '' END +
            CASE WHEN @status_missing_count > 0 THEN
                CAST(@status_missing_count AS varchar(5)) + ' missing status(es). ' ELSE '' END;
        EXEC msdb.dbo.sp_send_dbmail @recipients=@recipients, @subject=@subject, @body = @body
    END /*IF*/
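    If you also want the offending rows themselves in the email body, one common technique is building an HTML table with FOR XML PATH. A sketch; the Database Mail profile name and recipient address are placeholders:

    ```sql
    -- Sketch: list rows with NULL temperatures from the last hour as an HTML table
    -- in the email body. Profile name and recipient below are placeholders.
    DECLARE @rows nvarchar(max);
    SET @rows = CAST((
        SELECT
            CONVERT(varchar(19), dtstamp, 120) AS td, '',   -- each "td" column becomes a <td> cell
            equipment_id AS td, '',
            ISNULL(CAST(temperature AS varchar(11)), 'NULL') AS td
        FROM dbo.mact_data
        WHERE dtstamp >= DATEADD(MINUTE, -60, GETDATE())
          AND temperature IS NULL
        FOR XML PATH('tr'), TYPE) AS nvarchar(max));        -- each row becomes a <tr>

    DECLARE @html nvarchar(max) =
        N'<table border="1"><tr><th>dtstamp</th><th>equipment_id</th><th>temperature</th></tr>'
        + ISNULL(@rows, N'') + N'</table>';

    EXEC msdb.dbo.sp_send_dbmail
        @profile_name = N'YourMailProfile',
        @recipients   = N'alerts@yourdomain.com',
        @subject      = N'mact_data rows with NULL temperature',
        @body         = @html,
        @body_format  = N'HTML';
    ```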

    SQL DBA,SQL Server MVP(07, 08, 09) A socialist is someone who will give you the shirt off *someone else's* back.

  • Nice script; it captures all the requirements.

    Many Thanks

  • It looks like you got what you needed.  I would suggest having an error table / file to capture the error records with a time stamp.  Then you can run queries/reports against this table.
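    The error-table suggestion could be sketched like this; the table and column names are illustrative, not from the thread.

    ```sql
    -- Illustrative audit table for failed validation runs.
    CREATE TABLE dbo.mact_data_errors (
        logged_at    datetime     NOT NULL DEFAULT (GETDATE()),  -- when the check logged it
        dtstamp      datetime     NOT NULL,
        equipment_id varchar(6)   NOT NULL,
        error_reason varchar(100) NOT NULL
    );

    -- Inside the validation job: capture rows with missing temperatures
    -- from the last hour, then report against dbo.mact_data_errors later.
    INSERT INTO dbo.mact_data_errors (dtstamp, equipment_id, error_reason)
    SELECT dtstamp, equipment_id, 'temperature is NULL'
    FROM dbo.mact_data
    WHERE dtstamp >= DATEADD(MINUTE, -60, GETDATE())
      AND temperature IS NULL;
    ```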

    Cheers

    DBASupport
