Counting records over several tables

  • Hi - still a newbie, still trying to learn! 😉

    Hopefully you can help me with this one. In my database there are several tables (around 90 to be exact).

    Each table contains the column 'RECORD_ModificationDate'.

    I would like to query each table and count the number of records that have a 'RECORD_ModificationDate' after, say, midnight (I'm still trying to work out what the best cut-off time is).

    Ideally the output needs to contain the table name, followed by the number of records.

    What is the quickest way of getting this info?

    Many thanks for your help.

  • select 'Table1' as TableName, count(*) as Qty -- First Table
    from dbo.Table1
    where RECORD_ModificationDate >= @StartDate_in
    and RECORD_ModificationDate < @EndDate_in

    union all

    select 'Table2' as TableName, count(*) -- Second Table
    from dbo.Table2
    where RECORD_ModificationDate >= @StartDate_in
    and RECORD_ModificationDate < @EndDate_in

    union all

    -- Duplicate query for each table

    That's one way to do it fairly efficiently. I'm defining @StartDate_in and @EndDate_in as datetime input parameters. You can put whatever is appropriate for your query in the WHERE clauses.
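
    For completeness, here is a minimal sketch of how that query could be wrapped in a stored procedure taking the two datetime parameters (the procedure name is just a placeholder; extend the table list to match your schema):

    CREATE PROCEDURE dbo.CountModifiedRecords
        @StartDate_in DATETIME,
        @EndDate_in   DATETIME
    AS
    BEGIN
        SET NOCOUNT ON;

        SELECT 'Table1' AS TableName, COUNT(*) AS Qty
        FROM dbo.Table1
        WHERE RECORD_ModificationDate >= @StartDate_in
          AND RECORD_ModificationDate < @EndDate_in

        UNION ALL

        SELECT 'Table2' AS TableName, COUNT(*)
        FROM dbo.Table2
        WHERE RECORD_ModificationDate >= @StartDate_in
          AND RECORD_ModificationDate < @EndDate_in;
        --...repeat the UNION ALL block for each remaining table
    END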

    - Gus "GSquared", RSVP, OODA, MAP, NMVP, FAQ, SAT, SQL, DNA, RNA, UOI, IOU, AM, PM, AD, BC, BCE, USA, UN, CF, ROFL, LOL, ETC
    Property of The Thread

    "Nobody knows the age of the human race, but everyone agrees it's old enough to know better." - Anon

  • Why would you need to know how many records in every table have been changed? That seems strange. I could see a few of them, but wouldn't lots of tables be unrelated?

    In any case, you can union up the data as suggested above, querying each table. That might take a long time, and be cumbersome.

    I'd probably build a temp table or build a real table, and then have a stored procedure that queries each table separately and inserts the data (table name, count, and the current GETDATE()) into this table that you can query. It gives you history, though you can manage that and delete older data regularly. It also lets you stagger the load, potentially keep going if you have issues (set up TRY...CATCH blocks), and better manage things than a single query.
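
    As a rough sketch of that idea (the table and procedure names here are just placeholders), the logging table and one per-table step of the procedure might look like this:

    CREATE TABLE dbo.RowCountHistory
    (
        TableName  SYSNAME  NOT NULL,
        NumRows    INT      NOT NULL,
        CapturedAt DATETIME NOT NULL DEFAULT (GETDATE())
    )
    GO

    CREATE PROCEDURE dbo.CaptureRowCounts
        @StartDate_in DATETIME
    AS
    BEGIN
        SET NOCOUNT ON;

        BEGIN TRY
            INSERT INTO dbo.RowCountHistory (TableName, NumRows, CapturedAt)
            SELECT 'Table1', COUNT(*), GETDATE()
            FROM dbo.Table1
            WHERE RECORD_ModificationDate >= @StartDate_in;
        END TRY
        BEGIN CATCH
            --log the failure and keep going with the next table
            PRINT 'Table1 failed: ' + ERROR_MESSAGE();
        END CATCH

        --...repeat a TRY...CATCH block like the one above for each table
    END
    GO

    That way each table gets its own step, a failure in one does not stop the rest, and every run leaves a timestamped row you can query (and purge) later.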

    This is a possible solution. I am not sure how to avoid the row-by-row approach, but I'm sure someone will post another solution that doesn't go RBAR, which would definitely be preferable.

    CREATE TABLE #TableNames (TableName VARCHAR(100))
    CREATE TABLE #RowCounts (TableName VARCHAR(100), NumRows INTEGER)

    DECLARE @vSELECT VARCHAR(1000)
    DECLARE @vFROM VARCHAR(1000)
    DECLARE @vWHERE VARCHAR(1000)
    DECLARE @vINSERT VARCHAR(1000)
    DECLARE @vTableName VARCHAR(100)

    --get all of the tables with a column named RECORD_ModificationDate
    INSERT INTO #TableNames(TableName)
    SELECT SO.Name
    FROM sysobjects SO
    WHERE SO.Xtype='U'
      AND EXISTS (SELECT TOP 1 SC.Name
                  FROM syscolumns SC
                  WHERE SC.id=SO.id AND SC.Name='RECORD_ModificationDate')

    --the INSERT and WHERE clauses won't change from table to table, so build them once
    SET @vINSERT = 'INSERT INTO #RowCounts(TableName, NumRows) '
    SET @vWHERE = 'WHERE RECORD_ModificationDate > CAST(FLOOR(CAST(GETDATE() AS FLOAT)) AS DATETIME)' --strips the time portion, i.e. midnight of the current day

    WHILE EXISTS (SELECT TOP 1 * FROM #TableNames)
    BEGIN
        --get the name of the first table that has not already been counted
        SET @vTableName = (SELECT TOP 1 TableName FROM #TableNames)

        --update the SELECT and FROM statements
        SET @vSELECT = 'SELECT '''+@vTableName+''' AS TableName, COUNT(1) AS NumRows '
        SET @vFROM = 'FROM '+@vTableName+' '

        --execute the combined insert/select/from/where statements
        EXEC (@vINSERT+@vSELECT+@vFROM+@vWHERE)

        --remove the table name from the table to signify it has already been counted
        DELETE FROM #TableNames WHERE TableName=@vTableName
    END

    --return the counts
    SELECT * FROM #RowCounts

    --cleanup after yourself
    DROP TABLE #TableNames
    DROP TABLE #RowCounts

    *edit* I forgot to close the parentheses on the exec
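
    For reference, a sketch of one way to avoid the WHILE loop is to concatenate a single UNION ALL statement from the metadata and run it with one EXEC (this assumes SQL Server 2005 or later for VARCHAR(MAX), and that all of the tables live in the dbo schema):

    DECLARE @vSQL VARCHAR(MAX)

    --build one big 'SELECT ... UNION ALL SELECT ...' statement, one branch per table
    SELECT @vSQL = ISNULL(@vSQL + ' UNION ALL ', '')
        + 'SELECT ''' + SO.Name + ''' AS TableName, COUNT(1) AS NumRows '
        + 'FROM dbo.[' + SO.Name + '] '
        + 'WHERE RECORD_ModificationDate > CAST(FLOOR(CAST(GETDATE() AS FLOAT)) AS DATETIME)'
    FROM sysobjects SO
    WHERE SO.Xtype='U'
      AND EXISTS (SELECT 1 FROM syscolumns SC
                  WHERE SC.id=SO.id AND SC.Name='RECORD_ModificationDate')

    EXEC (@vSQL)

    It still reads every table, but it builds and executes one statement instead of looping, and it needs no temp tables.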

  • Dear all,

    Thank you very much for your replies, your help is greatly appreciated.
