Count(*)

  • Hi,

    A table contains approximately 6 billion rows. If I run a command like 'SELECT COUNT(*) FROM table;' to get the actual count, will it slow down my system? Will it do a full scan of such a big table? What are safe methods to get the table's row count? I think I should be able to get this from a standard report; I will work on that, but let me know your thoughts as well.

    Sincerely!

  • If you have an identity column that is also the clustered index key, you could SELECT MAX(id) FROM the table.
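    A minimal sketch of that idea, assuming a hypothetical table dbo.BigTable whose identity column id is the clustered index key:

    ```sql
    -- Hypothetical names: dbo.BigTable and its identity column id.
    -- MAX() on the clustered index key is a single seek to the end of the
    -- index, so it returns almost instantly even on a 6-billion-row table.
    SELECT MAX(id) AS ApproximateRowCount
    FROM dbo.BigTable;
    ```

    Keep in mind this only approximates the row count when no rows have ever been deleted and the identity values have no gaps.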

    You can also use this script to check the row count, page count, and more for your tables.

    SELECT
        t.name AS TableName,
        --i.name AS IndexName,
        p.[rows] AS [Rows],
        SUM(a.total_pages) AS TotalPages,
        SUM(a.used_pages) AS UsedPages,
        SUM(a.data_pages) AS DataPages,
        (SUM(a.total_pages) * 8) / 1024 AS TotalSpaceMB,
        (SUM(a.used_pages) * 8) / 1024 AS UsedSpaceMB,
        (SUM(a.data_pages) * 8) / 1024 AS DataSpaceMB
    FROM sys.tables AS t
    INNER JOIN sys.indexes AS i
        ON t.object_id = i.object_id
    INNER JOIN sys.partitions AS p
        ON i.object_id = p.object_id
        AND i.index_id = p.index_id
    INNER JOIN sys.allocation_units AS a
        ON p.partition_id = a.container_id
    WHERE t.name NOT LIKE 'dt%'
        AND i.object_id > 255          -- skip system objects
        AND i.index_id <= 1            -- heap (0) or clustered index (1) only
    GROUP BY t.name, i.object_id, i.index_id, i.name, p.[rows]
    ORDER BY OBJECT_NAME(i.object_id);
    GO

    If your table has many indexes it may show rows per index as well, so you might have to tweak the script a bit. This script is widely known; I can't recall the source and I didn't write it, but it has helped me quite a lot.

  • Thanks, I was able to use the Disk Usage by Table report, which is what I had in mind, and it shows the record count for that table.

  • You could also right-click the object (in this case the table), choose Properties, then go to the Storage page and look at 'Row Count'.

  • Yeah! thank you!

  • Alejandro Santana wrote:

    If you have an identity column that is also the clustered index key, you could SELECT MAX(id) FROM the table.

    That only works if no rows have ever been deleted from the table. We have a number of tables whose MAX(id) values are many times the number of rows.

  • sizal0234 wrote:

    Hi,

    A table contains approximately 6 billion rows. If I run a command like 'SELECT COUNT(*) FROM table;' to get the actual count, will it slow down my system? Will it do a full scan of such a big table? What are safe methods to get the table's row count? I think I should be able to get this from a standard report; I will work on that, but let me know your thoughts as well.

    Sincerely!

    Everything has a cost. As to whether or not an index can supply this value, I'd suggest getting the estimated plan before you run the query. It'll let you know if you're going to see a scan or not and where you'll see that scan.
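    One way to do that check (table name here is hypothetical): in SSMS press Ctrl+L, or use SHOWPLAN to return the estimated plan without executing the statement:

    ```sql
    -- Returns the estimated plan as XML instead of running the query,
    -- so you can see whether it will scan and which index it will use.
    SET SHOWPLAN_XML ON;
    GO
    SELECT COUNT(*) FROM dbo.BigTable;  -- hypothetical table name
    GO
    SET SHOWPLAN_XML OFF;
    GO
    ```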

    Also, starting with SQL Server 2019, you could take advantage of APPROX_COUNT_DISTINCT to reduce the overhead radically. You won't get a perfect count, but you will avoid pain.
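    A sketch with hypothetical names (note that APPROX_COUNT_DISTINCT takes a column, so you would point it at a unique key to approximate the total row count):

    ```sql
    -- Approximates the number of distinct id values; when id is a unique key,
    -- this approximates the row count. Documented to stay within roughly 2%
    -- error with 97% probability, at a fraction of the memory of COUNT(DISTINCT).
    SELECT APPROX_COUNT_DISTINCT(id) AS ApproxRowCount
    FROM dbo.BigTable;  -- hypothetical table and column names
    ```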

    "The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
    - Theodore Roosevelt

    Author of:
    SQL Server Execution Plans
    SQL Server Query Performance Tuning

  • @Grant Fritchey: Thank you, sir, for your response and time; I truly appreciate your advice. It always helps!

    I learned about APPROX_COUNT_DISTINCT 🙂

  • Or this; hopefully I wrote it correctly, as I currently don't have access to a SQL Server database to test:

    SELECT
        tab.name AS TableName,
        ca.RowCnt AS [RowCount]
    FROM sys.tables AS tab
    CROSS APPLY (              -- CROSS APPLY (not CROSS JOIN) so the subquery can reference tab
        SELECT SUM(par.rows) AS RowCnt
        FROM sys.partitions AS par
        WHERE par.object_id = tab.object_id
          AND par.index_id IN (0, 1)   -- heap or clustered index only
    ) AS ca
    WHERE tab.object_id = OBJECT_ID('dbo.yourtablename'); -- dbo may be a different schema
                                                          -- and yourtablename is the
                                                          -- name of the table
