• yasserbatsh (10/10/2013)


    When you say "data aggregator", I'm thinking of something like a data mart: a database that aggregates data from multiple sources and possibly different structures. To leverage the strengths of a relational database like SQL Server, I would recommend creating a fixed number of conformed and normalized tables. For example, there should be one (and only one) Customer table, Invoice table, Campaign table, etc.

    What if the user wants to import a new data structure that is not yet normalized in the database... say the data is for assets. Is it best to handle such cases by creating the new structure in the database programmatically, or by storing new structures as XML data types? Are there better ways?

    I have seen a similar case in MS Dynamics CRM. They have an import wizard that lets you import new structures and define the data types of fields, and foreign keys as well, but I could not find out how they handle this at the database level... any clues?

    Thanks.

    My experience is in the healthcare, banking, and government sectors. When there is a need to import data, a business analyst gathers requirements about the exact format of the source input file and also documents which attributes the organization requires to be imported into the database. This scenario of letting non-IT end users import data from generic sources (anything from invoices to restaurant menus) into a database has no such context.

    If all you basically need to do is ingest and archive this data somewhere, and you're not sure how the organization will actually use it, then perhaps MongoDB would be a better choice. You could use MongoDB as a staging database, and then have an ETL process that feeds specific data items into SQL Server as the organization requires them.
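    A rough sketch of that staging pattern, using plain Python dicts in place of MongoDB documents and SQLite in place of SQL Server purely for illustration (the document shapes, field names, and the `load_customers` helper are all hypothetical):

```python
import sqlite3

# Hypothetical staged documents, as they might sit in a MongoDB staging
# collection: heterogeneous shapes, only some fields conformed.
staged_docs = [
    {"_id": 1, "name": "Acme Ltd", "email": "info@acme.example"},
    {"_id": 2, "name": "Globex", "extra": {"invoice_total": 12.5}},
    {"_id": 3, "menu": ["soup", "salad"]},  # e.g. a restaurant menu, not a customer
]

def load_customers(docs, conn):
    """ETL step: pull only the attributes the organization requires into a
    fixed, conformed Customer table; leave everything else in staging."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS Customer ("
        " CustomerId INTEGER PRIMARY KEY,"
        " Name TEXT NOT NULL,"
        " Email TEXT)"
    )
    for doc in docs:
        if "name" not in doc:  # skip documents that don't conform
            continue
        conn.execute(
            "INSERT INTO Customer (CustomerId, Name, Email) VALUES (?, ?, ?)",
            (doc["_id"], doc["name"], doc.get("email")),
        )
    conn.commit()

conn = sqlite3.connect(":memory:")
load_customers(staged_docs, conn)
rows = conn.execute("SELECT CustomerId, Name, Email FROM Customer").fetchall()
# rows → [(1, 'Acme Ltd', 'info@acme.example'), (2, 'Globex', None)]
```

    The point of the split is that the staging side accepts anything the end user throws at it, while the relational side only ever grows in controlled, analyst-reviewed steps.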

    "Do not seek to follow in the footsteps of the wise. Instead, seek what they sought." - Matsuo Basho