November 29, 2015 at 11:30 pm
Hi Friends,
We are designing the database for a project. We have:
1) Customer master table
2) Item master table
a) Item sub-master table
Each customer will have a minimum of 3,000 items, and each item will have a minimum of 30 sub-items,
i.e., for each customer we will have 9,000 rows in the transaction table.
So if we have 500 customers, that is 4,500,000 rows.
We thought of keeping each customer's data in a text file (JSON format).
Kindly suggest the best way to handle the transaction DB. The table will be updated frequently.
November 29, 2015 at 11:42 pm
Sounds like you need to think about normalisation before continuing; do you really need 3,000 attributes for a customer?
...
November 29, 2015 at 11:47 pm
Hi HappyGeek,
Thanks for the reply.
Each customer has 3,000 attributes.
November 30, 2015 at 12:02 am
Agreed, sounds like the customer table needs some normalisation. Though I don't see how 3000 attributes = 3000 rows, an attribute is a column, so 3000 columns. But yeah, you probably need to break that table down.
If you want to store stuff in JSON documents, go use something like DocumentDB (Microsoft), MongoDB, or one of the other document databases. SQL Server is a relational DB; it should have normalised table designs if you want it to work well.
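For illustration only, here is a minimal sketch of what a normalised layout along those lines might look like. All table and column names are hypothetical, since the original posts don't show the actual schema:

```sql
-- Hypothetical names; adjust to the real model.
CREATE TABLE CustomerMaster (
    CustomerID   int IDENTITY PRIMARY KEY,
    CustomerName nvarchar(100) NOT NULL
);

CREATE TABLE ItemMaster (
    ItemID   int IDENTITY PRIMARY KEY,
    ItemName nvarchar(100) NOT NULL
);

CREATE TABLE ItemSubMaster (
    SubItemID   int IDENTITY PRIMARY KEY,
    ItemID      int NOT NULL REFERENCES ItemMaster (ItemID),
    SubItemName nvarchar(100) NOT NULL
);

-- One narrow row per customer/sub-item combination,
-- instead of thousands of columns on the customer table.
CREATE TABLE CustomerItem (
    CustomerID int NOT NULL REFERENCES CustomerMaster (CustomerID),
    SubItemID  int NOT NULL REFERENCES ItemSubMaster (SubItemID),
    Price      decimal(10, 2) NULL,
    CONSTRAINT PK_CustomerItem PRIMARY KEY (CustomerID, SubItemID)
);
```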
Gail Shaw
Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability
November 30, 2015 at 11:49 am
GilaMonster (11/30/2015)
Agreed, sounds like the customer table needs some normalisation. Though I don't see how 3000 attributes = 3000 rows, an attribute is a column, so 3000 columns. But yeah, you probably need to break that table down.
I thought the same until I read "We will have 9000 rows in the transaction table", which made me wonder whether you really mean attributes!
There may be much more to the design; I think the poster needs to supply more detail, otherwise there will be little we can do to help.
...
November 30, 2015 at 10:22 pm
Let me explain in detail.
We are developing an application for grocery stores on the SaaS model.
Each customer sells items (more than 3,000 of them), and each item is sold based on its UOM (unit of measure).
For example, a rice bag can be a 1, 3, 5, 25, or 50 kg bag, so the transaction table will have 5 rows for rice.
Similarly, if each item has 5 UOMs, that is 3,000 × 5 = 15,000 rows for each customer.
For 500 customers it will be 15,000 × 500 = 7,500,000 rows.
The DB is accessed from a mobile app through a WCF service.
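The item/UOM layout described above (one row per customer per item per UOM) might be sketched as follows; all names here are hypothetical, not taken from the actual project:

```sql
CREATE TABLE Uom (
    UomID   int IDENTITY PRIMARY KEY,
    UomName varchar(20) NOT NULL   -- e.g. '1 kg bag', '25 kg bag'
);

-- Which pack sizes each item is sold in.
CREATE TABLE ItemUom (
    ItemID int NOT NULL,           -- FK to the item master
    UomID  int NOT NULL REFERENCES Uom (UomID),
    CONSTRAINT PK_ItemUom PRIMARY KEY (ItemID, UomID)
);

-- 3,000 items x 5 UOMs x 500 customers = 7,500,000 rows here.
CREATE TABLE CustomerItemUom (
    CustomerID int NOT NULL,
    ItemID     int NOT NULL,
    UomID      int NOT NULL,
    Price      decimal(10, 2) NOT NULL,
    CONSTRAINT PK_CustomerItemUom PRIMARY KEY (CustomerID, ItemID, UomID)
);
```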
December 1, 2015 at 12:59 am
Really, the transaction table will eventually grow far beyond the number of items (7,500,000). But I see nothing wrong with the above design. You need proper indexing, and probably partitioning (an Enterprise Edition feature).
December 1, 2015 at 1:55 am
In that case, it sounds fine. I'm not seeing any problems with it. 7.5 million rows is pretty small, so nothing to be concerned about there.
Proper indexing, definitely. Partitioning, maybe; remember it's not for performance, it's for fast data loads, fast data removal, and index maintenance on parts of a table. Well-written queries and a decently spec'd server if you're expecting lots of transactions/sec.
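As a rough illustration of the indexing and partitioning points above, assuming a hypothetical transaction table keyed per customer (names are made up for the sketch):

```sql
-- Cluster on the most common access path: all rows for one customer.
CREATE CLUSTERED INDEX CIX_Transactions
    ON dbo.Transactions (CustomerID, ItemID, UomID);

-- Partitioning (Enterprise Edition at the time of this thread) is
-- mainly for fast loads/removals and piecemeal index maintenance,
-- e.g. sliding-window by month:
CREATE PARTITION FUNCTION pfByMonth (date)
    AS RANGE RIGHT FOR VALUES ('2015-01-01', '2015-02-01', '2015-03-01');

CREATE PARTITION SCHEME psByMonth
    AS PARTITION pfByMonth ALL TO ([PRIMARY]);
```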
Gail Shaw
Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability
December 1, 2015 at 9:05 am
venkatesh.naikd (11/30/2015)
Let me explain in detail. We are developing an application for grocery stores on the SaaS model.
Each customer sells items (more than 3,000 of them), and each item is sold based on its UOM (unit of measure).
For example, a rice bag can be a 1, 3, 5, 25, or 50 kg bag, so the transaction table will have 5 rows for rice.
Similarly, if each item has 5 UOMs, that is 3,000 × 5 = 15,000 rows for each customer.
For 500 customers it will be 15,000 × 500 = 7,500,000 rows.
The DB is accessed from a mobile app through a WCF service.
Well, is this database just supposed to be a list of what each customer has available to sell, or is it meant to track actual sales and inventory?