Generating a Sequential Pattern
Posted Wednesday, May 17, 2006 12:34 PM
Valued Member


Group: General Forum Members
Last Login: Tuesday, November 25, 2008 9:48 AM
Points: 57, Visits: 8

It should be a separate column and concatenated into the computed column that is displayed to the user.

 

I think that's the most important post in this discussion. One of the worst things you can do to a relational DB is to store MORE THAN ONE piece of information in a single field. And if you "force" the field into the wrong datatype, it gets even worse.

 

Take the Year-Number example...

 SET @tmp_invoice_id = (SELECT MAX(SUBSTRING(invoice_id, 4, 5) + 1)
                        FROM Invoice WHERE (invoice_id LIKE @tmp_date + '%'))


To find all "year entries" you can utilize an index with LIKE '06%', but after that you are using a function to find the maximum value across ALL invoices for that year. That means in the worst case you have to run SUBSTRING 99,999 times?

 

I think the overhead of creating the sequential invoices can be neglected if you are using a sequence table... Also, here is a small suggestion for the self-updating of the sequence table. When I am faced with this problem, I usually do it the "lazy way": update the row I need to update, and if I get a rowcount of 0, then I know I actually need to insert the row. Updating 0 rows has no negative performance effect on SQL Server (assuming correct indexes), whereas if I first check whether the row exists and then update, I need to access the index twice: once for the check and once for the update. So I can skip the second index access most of the time, and the insert is fired automatically only when it's needed.
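A minimal sketch of that "lazy" update-then-insert pattern, assuming a hypothetical InvoiceSequence table keyed by year (the table and column names are just made up for illustration, and concurrency handling such as retrying on a duplicate key is left out):

-- Assumed table: InvoiceSequence(seq_year int primary key, next_number int)
DECLARE @year int, @next int
SET @year = YEAR(GETDATE())

BEGIN TRAN

-- Try the cheap path first: bump the counter row for this year
UPDATE InvoiceSequence
SET    @next = next_number = next_number + 1
WHERE  seq_year = @year

-- Only if no row was touched do we pay for an insert
IF @@ROWCOUNT = 0
BEGIN
    SET @next = 1
    INSERT INTO InvoiceSequence (seq_year, next_number) VALUES (@year, @next)
END

COMMIT TRAN

SELECT @next AS new_sequence_number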

 

But when the reporting season starts, this (string) system will hurt you. It is not possible to build a "good" index on these fields, and once you start running queries it will start to hurt. It might only start with a concatenated invoice number, but in an application I am currently working on this ran rampant... They needed to change the datatype from varchar to binary to store all the "extra" information (yes, I tried to convince them that this is a bad idea... trust me, I tried... I am also looking for a job where people actually listen).

P.S.: The suggested solution can still be valid if the number of invoices remains small. You always have to think about which compromises you can make...

 

Post #280824
Posted Wednesday, May 17, 2006 3:08 PM
SSCertifiable


Group: General Forum Members
Last Login: Friday, November 21, 2014 6:34 AM
Points: 6,259, Visits: 2,031

Hehe...

I was referring to the Identity column with a primary key constraint.

Should you need this "sequencing", use the table; should you not care about it, use the Identity instead.

The sequence table is very good for "reserving" client-side batches of candidate keys in one go (see the sketch below).

I am certain that scalability is excellent if you apply all the recommendations I posted (I have!)
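A rough sketch of that batch reservation idea, assuming a hypothetical KeySequence table (the names here are illustrative, not taken from the posts above):

-- Assumed table: KeySequence(seq_name varchar(50) primary key, next_value int)
DECLARE @batch_size int, @first int, @last int
SET @batch_size = 100

-- Reserve a whole block of candidate keys in a single statement;
-- the client can then hand out @first .. @last without touching the server again
UPDATE KeySequence
SET    @last = next_value = next_value + @batch_size
WHERE  seq_name = 'invoice'

SET @first = @last - @batch_size + 1

SELECT @first AS first_reserved, @last AS last_reserved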

 




* Noel
Post #280878
Posted Wednesday, May 17, 2006 8:26 PM
Grasshopper


Group: General Forum Members
Last Login: Thursday, February 14, 2008 2:21 PM
Points: 18, Visits: 3
Thank you all for taking the time to post your comments... I am glad to see a healthy discussion here on the topic which I started.

Regards,
Asif
Post #280927
Posted Thursday, May 18, 2006 12:59 AM
Valued Member


Group: General Forum Members
Last Login: Tuesday, November 25, 2008 9:48 AM
Points: 57, Visits: 8

I was thinking a bit about the performance problem I mentioned above when creating a sequence.

 

The query above could be rewritten so it can correctly use the index.

 

The following query would do the trick:

 

 SET @tmp_invoice_id = (SELECT TOP 1 (SUBSTRING(invoice_id, 4, 5) + 1)
                        FROM Invoice
                        WHERE (invoice_id LIKE @tmp_date + '%')
                        ORDER BY invoice_id DESC)

 

This limits the query to retrieving only one record, so it does only one SUBSTRING and one implicit cast on the data.

Post #280961
Posted Thursday, May 18, 2006 2:31 AM
SSC-Enthusiastic


Group: General Forum Members
Last Login: Friday, March 8, 2013 5:22 AM
Points: 119, Visits: 9
How about this: string keys are to be avoided (not least because they become foreign keys), and the indexes will be larger and less efficient than integer indexes. This method uses an identity key and a derived column that calculates the invoice id from the identity column. An alternative would be to calculate the invoice id using the function and store it alongside the identity column. It might also be worth storing and indexing the year, since queries are likely to group by year, and check constraints can be used to ensure the values stay in sync.

create table tbl(id int identity(1,1), dt datetime)
go

-- Derives the displayed invoice id (e.g. '06-000001') from the identity value and year
create function dbo.GetInvoiceId(@i int, @yr int) returns varchar(10)
as
begin
    declare @r varchar(10)

    -- Sequence number within the year = identity minus the highest identity of earlier years
    set @r = @i - coalesce((select max(id) from tbl where year(dt) < @yr), 0)

    -- Two-digit year, a dash, then the sequence number left-padded with zeros
    return substring(cast(@yr as varchar(4)), 3, 2) + substring('-000000', 1, 7 - len(@r)) + @r
end
go

begin
    alter table tbl add MyId as dbo.GetInvoiceId([id], year(dt))

    insert tbl(dt)
    select top 100 '2004-01-01' from sysobjects
    insert tbl(dt)
    select top 100 '2005-01-01' from sysobjects
    insert tbl(dt)
    select top 100 '2006-01-01' from sysobjects

    select * from tbl
end
go

drop table tbl
go
drop function dbo.GetInvoiceId



Phil Nicholas
Post #280979
Posted Thursday, May 18, 2006 8:39 AM
Grasshopper


Group: General Forum Members
Last Login: Thursday, February 14, 2008 2:21 PM
Points: 18, Visits: 3

I liked your revised approach of letting SQL Server pick the proper index to help speed up performance. The goal of this approach was to totally eliminate the need to "maintain any sort of counters", and I think I have mentioned that the solution I proposed is for a small to medium-sized operation; believe it or not, it has worked for me for about 8 years.

 

I will give it another shot using CLR integration and see how much of a difference it makes.

 

Regards,

Asif

Post #281104
Posted Saturday, June 17, 2006 11:57 AM


SSCrazy


Group: General Forum Members
Last Login: Tuesday, October 28, 2014 11:02 AM
Points: 2,397, Visits: 3,419

Iron clad?

-- prepare test data
declare @test table (t varchar(50))

insert @test
select 'ov1' union all
select 'ov2' union all
select 'ov4' union all
select 'ov31' union all
select 'ssss1' union all
select 'ssss99'

-- show data before inserting new values
select * from @test

-- do the work
declare @prefix varchar(50)

select @prefix = 'ov' -- Throw anything in!

insert @test
select @prefix + convert(varchar, t.n + 1)
from (
select max(convert(int, substring(t, 1 + len(@prefix), 50))) n
from @test
where t like @prefix + '%'
and not substring(t, 1 + len(@prefix), 50) like '%[^0-9]%'
) t
where not @prefix like '%[0-9]%'
and not t.n like '%[^0-9]%'

-- show data after inserting new values
select * from @test




N 56°04'39.16"
E 12°55'05.25"
Post #288342
Posted Wednesday, May 16, 2007 10:30 PM
Grasshopper


Group: General Forum Members
Last Login: Thursday, April 3, 2008 6:08 AM
Points: 11, Visits: 2
Hmmm...
Well, it does solve the issue momentarily.

However, if it is really deployed over a vast network, where insert transactions are fired very frequently, the user is definitely going to keep getting those already-familiar errors... I am sure...

And besides, there are better alternatives already available...
Anyway, this definitely helped, though.
Post #366628
Posted Thursday, May 17, 2007 5:41 AM
SSC Rookie


Group: General Forum Members
Last Login: Tuesday, May 12, 2009 10:03 AM
Points: 27, Visits: 108
We've had a similar situation and deployed a solution like Noel's, which I think is the best. It gives you something to lock on to prevent the concurrency problem (which will show up once you get a lot of users and data on the system). It will also perform much faster, because you only have to read one row out of a table that will have only a couple of rows (one per year) rather than constantly scanning your invoice table for the max.
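For reference, a minimal sketch of that kind of locked read against a small per-year counter table (the table and column names are assumed for illustration, not taken from Noel's posts):

-- Assumed table: YearCounter(counter_year int primary key, last_number int)
DECLARE @year int, @number int
SET @year = 2006

BEGIN TRAN

-- UPDLOCK/HOLDLOCK serializes concurrent callers on this one small row,
-- so two sessions cannot read the same value
SELECT @number = last_number + 1
FROM   YearCounter WITH (UPDLOCK, HOLDLOCK)
WHERE  counter_year = @year

UPDATE YearCounter
SET    last_number = @number
WHERE  counter_year = @year

COMMIT TRAN

SELECT @number AS next_invoice_number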
Post #366723