I’m sure most people have identity values lurking somewhere in their databases. In fact, I would guess that most people have a lot of them.
An often overlooked fact is that while an identity is an ever-increasing (or decreasing) value, it does have limits.
If a table with an identity column receives a high volume of inserts, it is entirely possible to exhaust the available values. What happens when you reach that point? Well, let’s try it:
--Create an example Schema and table
CREATE SCHEMA Example AUTHORIZATION dbo
GO
CREATE TABLE Example.incTest (ID TINYINT IDENTITY(1,1), Value VARCHAR(15))
--Populate up to the limit
DECLARE @i INT
SET @i = 1
WHILE @i <= 255
BEGIN
    INSERT INTO Example.incTest
    SELECT 'Number: ' + CONVERT(VARCHAR(3), @i)
    SET @i = @i + 1
END
--Validate that all the rows are there
--and that the maximum ID value is 255
SELECT ID, Value
FROM Example.incTest
ORDER BY ID DESC
--Now try to insert a new value
INSERT INTO Example.incTest
SELECT 'Uh-Oh Broke it'
After the final insert you should see:
Msg 8115, Level 16, State 1, Line 1
Arithmetic overflow error converting IDENTITY to data type tinyint.
Arithmetic overflow occurred.
Not the most informative of error messages, I’m sure you will agree, but at least it mentions IDENTITY.
What can we do about it? Well, with the use of a few system tables and some maths we can quite easily see which tables are near their limits.
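To make the arithmetic concrete, here is a worked example with illustrative numbers: a TINYINT IDENTITY(1,1) column whose current identity value is 200 has 254 possible increments from its seed, 55 of them still unused:

```sql
-- Illustrative numbers only: TINYINT IDENTITY(1,1), current identity value 200
SELECT FLOOR((255 - 1) / 1.0)                     AS TotalVals,       -- 254
       FLOOR((255 - 1) / 1.0) - ((200 - 1) / 1.0) AS RemainingValues, -- 55
       (FLOOR((255 - 1) / 1.0) - ((200 - 1) / 1.0))
           / FLOOR((255 - 1) / 1.0) * 100.00      AS PercentageFree   -- roughly 21.65
```

The query below performs the same calculation for every identity column it finds.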
--Identify tables containing identities and how close they are to their maximum values
SELECT TableSchema, TableName, ColumnName, Datatype,
       IdentitySeed, IdentityIncrement, CurrentIdentity, MaxIdentityValue,
       FLOOR((MaxIdentityValue - IdentitySeed) / IdentityIncrement) AS TotalVals,
       FLOOR((MaxIdentityValue - IdentitySeed) / IdentityIncrement)
           - ((CurrentIdentity - IdentitySeed) / IdentityIncrement) AS RemainingValues,
       (FLOOR((MaxIdentityValue - IdentitySeed) / IdentityIncrement)
           - ((CurrentIdentity - IdentitySeed) / IdentityIncrement))
           / FLOOR((MaxIdentityValue - IdentitySeed) / IdentityIncrement) * 100.00 AS PercentageFree
FROM (SELECT TABLE_SCHEMA AS TableSchema, TABLE_NAME AS TableName, idt.name AS ColumnName,
             typ.name AS Datatype,
             IDENT_SEED(TABLE_SCHEMA + '.' + TABLE_NAME) AS IdentitySeed,
             IDENT_INCR(TABLE_SCHEMA + '.' + TABLE_NAME) AS IdentityIncrement,
             IDENT_CURRENT(TABLE_SCHEMA + '.' + TABLE_NAME) AS CurrentIdentity,
             CASE WHEN IDENT_INCR(TABLE_SCHEMA + '.' + TABLE_NAME) >= 0
                  THEN CASE typ.name
                           WHEN 'int' THEN 2147483647
                           WHEN 'bigint' THEN 9223372036854775807
                           WHEN 'smallint' THEN 32767
                           WHEN 'tinyint' THEN 255
                       END
                  ELSE CASE typ.name
                           WHEN 'int' THEN -2147483648
                           WHEN 'bigint' THEN -9223372036854775808
                           WHEN 'smallint' THEN -32768
                           WHEN 'tinyint' THEN 0
                       END
             END AS MaxIdentityValue
      FROM INFORMATION_SCHEMA.TABLES tab
      JOIN (SELECT [object_id], name, system_type_id
            FROM sys.columns
            WHERE is_identity = 1) idt
        ON OBJECT_ID(TABLE_SCHEMA + '.' + TABLE_NAME) = idt.[object_id]
      JOIN sys.types typ
        ON idt.system_type_id = typ.system_type_id
      WHERE OBJECTPROPERTY(OBJECT_ID(TABLE_SCHEMA + '.' + TABLE_NAME), 'TableHasIdentity') = 1
        AND table_type = 'BASE TABLE' AND typ.is_user_defined = 0) drv
ORDER BY PercentageFree ASC
--DROP TABLE Example.incTest
--DROP SCHEMA Example
We have had tables use up all their available values a number of times in the past, so we wrap code similar to the above in a stored procedure with a parameter for a percentage-free threshold. A job executes this stored procedure hourly, and if anything is getting near its limit it emails the database administrator team.
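A minimal sketch of such a wrapper is below. The procedure, view, mail profile, and recipient names are all hypothetical, and it assumes the monitoring query above has been saved into a view and that Database Mail is configured:

```sql
-- Hypothetical sketch: alert when any identity column drops below a free-space threshold
CREATE PROCEDURE dbo.CheckIdentityHeadroom
    @PercentFreeThreshold NUMERIC(5, 2) = 10.00
AS
BEGIN
    SET NOCOUNT ON;

    -- dbo.IdentityHeadroom is assumed to wrap the monitoring query shown earlier
    IF EXISTS (SELECT 1
               FROM dbo.IdentityHeadroom
               WHERE PercentageFree < @PercentFreeThreshold)
    BEGIN
        -- Profile and recipient are placeholders; adjust to your environment
        EXEC msdb.dbo.sp_send_dbmail
            @profile_name = 'DBA Mail Profile',
            @recipients   = 'dba-team@example.com',
            @subject      = 'Identity columns nearing their limit',
            @body         = 'One or more identity columns are below the free-space threshold.';
    END
END
```

Scheduling this from a SQL Server Agent job on an hourly cadence gives the team warning well before inserts start failing.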
After you have identified tables nearing their limits you can plan how to remedy the situation. In our case, as we were removing old values from the table anyway, we just used DBCC CHECKIDENT to re-seed the table; it’s a simple command and Books Online covers it well, so I won’t re-hash it here.
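For reference, re-seeding the example table back to the start looks like this (assuming the rows that used the low ID values have already been deleted, otherwise subsequent inserts can collide with existing keys):

```sql
-- Reset the identity so the next insert into Example.incTest receives ID = 1
DBCC CHECKIDENT ('Example.incTest', RESEED, 0);
```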
One limitation of the code is that it doesn’t take into account any gaps in the sequence of numbers caused by re-seeding, deletions and so on. In my case this is fine: once the current value hits the limit it will prevent any further inserts regardless.
Either way, try the code out and see if you find any surprising results!