That sounds like quite a job. 500,000 records a day is no small amount of data to deal with.
It would probably help if you provided a general breakdown of the table's schema. Are we dealing with a lot of columns of comma-delimited data, or is it mostly well-constrained numbers?
I'm not sure it would be necessary to store the table in a separate database instance. Having a separate table in the same db shouldn't really hurt performance, and if it would be accessed very heavily in the future, you could look at storing it on its own filegroup.
Shooting from the hip, my suggestion would be to build a nicely normalized table to hold the aggregate information, then run a SQL Server Agent job sometime after midnight containing an insert statement that aggregates the previous day's data.
INSERT AggregateTable (SaleDate,ProductID,AvePrice)
SELECT CAST(b.SaleDate AS date), b.ProductID, AVG(b.Price) /*Price column name is assumed*/
/*could even store the date as integer primary key in aggregate table*/
FROM dbo.YourBigTable b
WHERE b.SaleDate >= DATEADD(day, -1, CAST(GETDATE() AS date)) AND b.SaleDate < CAST(GETDATE() AS date)
GROUP BY CAST(b.SaleDate AS date), b.ProductID
That of course depends on the indexes you have on YourBigTable. The SELECT part is the fun part you get to optimize.
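For instance, since the nightly job only scans the previous day's rows, something along these lines would let the aggregate query seek on the date range instead of scanning the whole table (the Price column and index name here are just placeholders, adjust to your actual schema):

CREATE INDEX IX_YourBigTable_SaleDate
ON dbo.YourBigTable (SaleDate)
INCLUDE (ProductID, Price);

The INCLUDE columns make it a covering index for the aggregate SELECT, so the job never has to touch the base table.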
Let me know how it goes.