While...Union

  • I've searched everywhere for a method of combining *many* tables in a database, but every time I search for something like "union while loop sql server" all I see are responses telling people this is a bad idea. My circumstances differ from all of theirs, however...

    I have a database with over 800 pertinent tables, all with the same format, that I would like to report on as one recordset. Basically, each client gets his own table(s), but each has identical columns, indexes, etc. I know this is horrible database design, but it's the back end of a software suite we use - it isn't our design and we can't change it.

    What I want to do, generally, is select the list of tables from INFORMATION_SCHEMA.TABLES:

    DECLARE @TableName NVARCHAR(256);
    SET @TableName = '';

    WHILE @TableName IS NOT NULL
    BEGIN
        SET @TableName =
            (SELECT MIN(QUOTENAME(TABLE_SCHEMA) + '.' + QUOTENAME(TABLE_NAME))
             FROM INFORMATION_SCHEMA.TABLES
             WHERE TABLE_TYPE = 'BASE TABLE'
               AND QUOTENAME(TABLE_SCHEMA) + '.' + QUOTENAME(TABLE_NAME) > @TableName
               -- Filter on TABLE_NAME, not QUOTENAME(TABLE_NAME): QUOTENAME adds
               -- brackets, so a pattern like 'i3[_]%' would never match '[i3_...]'
               AND TABLE_NAME LIKE 'i3[_]%'
               AND TABLE_NAME LIKE '%[_]CH%'
               AND TABLE_NAME NOT LIKE 'i3[_]test%'
            );

        --Union next table here?
    END

    Then I'd like to union each additional table to the last... perhaps make a view of it. I realize I might run into performance issues with so many tables, but I have to at least try it - I can't think of another way to do any sort of standardized reporting on this monster.

    I'd appreciate any input.

    Thanks!

  • One possibility is to create a warehouse-type table with the structure of your tables plus an additional column for a source-table identifier. Then you could have a nightly (?) batch populate this table for reporting purposes.
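    A minimal sketch of that approach, assuming the client tables share a simple layout - the table and column names (ClientData_Warehouse, ClientId, EventDate, Amount) are hypothetical stand-ins for whatever your schema actually has:

    ```sql
    -- Reporting copy of the shared client-table structure, plus a column
    -- recording which source table each row came from.
    CREATE TABLE dbo.ClientData_Warehouse
    (
        SourceTable SYSNAME       NOT NULL,  -- e.g. 'i3_Acme_CH'
        ClientId    INT           NOT NULL,  -- hypothetical shared columns
        EventDate   DATETIME      NOT NULL,
        Amount      DECIMAL(18,2) NULL
    );
    GO

    -- Nightly batch: truncate and reload from every matching client table.
    DECLARE @sql NVARCHAR(MAX) = N'TRUNCATE TABLE dbo.ClientData_Warehouse;';

    SELECT @sql = @sql + N'
    INSERT INTO dbo.ClientData_Warehouse (SourceTable, ClientId, EventDate, Amount)
    SELECT ''' + TABLE_NAME + N''', ClientId, EventDate, Amount
    FROM ' + QUOTENAME(TABLE_SCHEMA) + N'.' + QUOTENAME(TABLE_NAME) + N';'
    FROM INFORMATION_SCHEMA.TABLES
    WHERE TABLE_TYPE = 'BASE TABLE'
      AND TABLE_NAME LIKE 'i3[_]%[_]CH%'
      AND TABLE_NAME NOT LIKE 'i3[_]test%';

    EXEC sp_executesql @sql;
    ```

    Because the load runs off-hours, the per-query cost of hitting 800 tables is paid once a night instead of on every report.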

    Another, riskier, solution would be to put all of the data in one table and then have the 800 (yowza) tables be views instead.
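    A hedged sketch of that riskier option - one real table keyed by client, with each former table name recreated as a view so the application still sees "its" table (all names here are hypothetical):

    ```sql
    -- Single physical table holding every client's rows.
    CREATE TABLE dbo.AllClientData
    (
        ClientTable SYSNAME       NOT NULL,  -- which 'virtual' table the row belongs to
        ClientId    INT           NOT NULL,
        EventDate   DATETIME      NOT NULL,
        Amount      DECIMAL(18,2) NULL
    );
    GO

    -- One view per former table, filtered to that client's rows.
    CREATE VIEW dbo.i3_Acme_CH AS
    SELECT ClientId, EventDate, Amount
    FROM dbo.AllClientData
    WHERE ClientTable = 'i3_Acme_CH';
    ```

    For the application to keep writing through those views you would also need INSTEAD OF triggers (or updatable-view rules to hold), which is part of what makes this option risky with a third-party back end.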

    Working with 3rd party applications is sooo much fun.

    good luck

    daryl

  • Use something like this to create your view (it generates a script that creates a view combining all of my _Contacts tables, obviously):

    DECLARE @stmt VARCHAR(1000)

    PRINT 'CREATE VIEW Ginormous_Recordset AS '

    DECLARE rs CURSOR FOR
        SELECT 'SELECT * FROM ' + TABLE_NAME + ' UNION ALL '
        FROM INFORMATION_SCHEMA.TABLES
        WHERE TABLE_TYPE = 'BASE TABLE'
          AND TABLE_NAME LIKE '%[_]Contacts%'

    OPEN rs
    FETCH NEXT FROM rs INTO @stmt

    WHILE @@FETCH_STATUS = 0
    BEGIN
        PRINT @stmt
        FETCH NEXT FROM rs INTO @stmt
    END

    CLOSE rs
    DEALLOCATE rs

    Remove the last "UNION ALL", then take the output and paste it into another query window to get your CREATE VIEW code. Adding another table? Run this code again to include the new tables. Now you can use this view to test whether it is feasible for any kind of reporting scenario (it won't be).
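    With hypothetical table names, the pasted output would look something like this after trimming the trailing UNION ALL:

    ```sql
    CREATE VIEW Ginormous_Recordset AS
    SELECT * FROM i3_Acme_Contacts UNION ALL
    SELECT * FROM i3_Bravo_Contacts UNION ALL
    SELECT * FROM i3_Carta_Contacts
    ```

    Note that SELECT * unions require every table to have the same column count and compatible types in the same order, which holds here only because the tables are identical by design.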

    You can also use this view to pump the data into a reporting table nightly, like Daryl suggested. Probably your best bet.
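    Assuming a view like the one generated above exists, the nightly load could be as simple as this (the reporting table name is hypothetical):

    ```sql
    -- Refresh the reporting copy from the combined view each night.
    TRUNCATE TABLE dbo.Contacts_Reporting;

    INSERT INTO dbo.Contacts_Reporting
    SELECT * FROM dbo.Ginormous_Recordset;
    ```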

    ______________________________________________________________________________
    How I want a drink, alcoholic of course, after the heavy lectures involving quantum mechanics.

  • Daryl-273852 (6/15/2011)


    One possibility is to create a warehouse-type table with the structure of your tables plus an additional column for a source-table identifier. Then you could have a nightly (?) batch populate this table for reporting purposes.

    Another, riskier, solution would be to put all of the data in one table and then have the 800 (yowza) tables be views instead.

    I would recommend looking at either of these options instead of trying to create a view over all of those tables. In fact, a view over all of those tables will probably put such a performance hit on the database that it may not be usable while your queries are running.

    Create a new database separate from the live database (preferably on another server), export the data from each table, add an identifier (if needed) to identify each one, and import it into the new database.

    There are a lot of advantages to this approach, especially in how you index the table - which you can optimize for your reports.
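    For example, once the data sits in one reporting table you can index it purely for the queries you actually run - something the 800 source tables never allow (table and column names here are hypothetical):

    ```sql
    -- Index tuned for a typical report: filter by date range, group by client.
    CREATE NONCLUSTERED INDEX IX_ClientData_EventDate_Client
    ON dbo.ClientData_Reporting (EventDate, ClientId)
    INCLUDE (Amount, SourceTable);
    ```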

    Jeffrey Williams
    “We are all faced with a series of great opportunities brilliantly disguised as impossible situations.”

    ― Charles R. Swindoll

    How to post questions to get better answers faster
    Managing Transaction Logs
