I'm looking for a best-practice approach. I've been charged with redesigning a leads management system that can yield between 10 and 100 million records within a quarter, depending on the customer base.
The current architecture creates a new database for each customer. Each customer database generates approximately 9 million records per month, or about 108 million per year. There are currently 150 customers, i.e., 150 databases at roughly 108 million records each per year. It's been forecasted that 200 more customers will be added by the end of next year, which would mean 350 databases at 108 million records each. At this rate the current architecture will not be able to handle the load (in my opinion).
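To make the scale concrete, here is a quick back-of-the-envelope calculation using only the figures above (9 million records per customer per month, 150 customers today, 350 forecast):

```python
# Back-of-the-envelope volume projection from the figures in the question.
RECORDS_PER_MONTH = 9_000_000               # per customer database
RECORDS_PER_YEAR = RECORDS_PER_MONTH * 12   # 108 million per customer per year

current_customers = 150
forecast_customers = 150 + 200              # 350 databases by end of next year

current_yearly_total = current_customers * RECORDS_PER_YEAR
forecast_yearly_total = forecast_customers * RECORDS_PER_YEAR

print(f"Per-customer yearly volume:        {RECORDS_PER_YEAR:,}")
print(f"Current yearly ingest (150 DBs):   {current_yearly_total:,}")
print(f"Forecast yearly ingest (350 DBs):  {forecast_yearly_total:,}")
```

So the system is heading from roughly 16.2 billion new rows per year across all databases to roughly 37.8 billion, which is why I doubt the one-database-per-customer design will keep up.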
I've never dealt with this data volume, even in ERP systems. How do I begin to tackle and fully optimize this system? I'm an Application Architect, not a DBA or Data Architect.
How can I apply best-practice design to this problem? Any references or books to review would be greatly appreciated.
I can provide additional details; I only found out about the data explosion today.