Loop columns of a table to perform data integrity checks

  • Hello,
    I am looking for some advice on the best way to approach this task.
    I will be receiving files from multiple sources and I want to put them through a data validation routine before importing them into my staging process.

    I am using the following to identify the specific constraints for each column:

    SELECT
        TABLE_CATALOG,
        TABLE_SCHEMA,
        TABLE_NAME,
        COLUMN_NAME,
        CASE WHEN is_nullable = 'NO' THEN 'YES' ELSE 'NO' END AS 'MANDATORY',
        data_type AS 'DATA_TYPE',
        CHARACTER_MAXIMUM_LENGTH AS 'LENGTH'
    FROM INFORMATION_SCHEMA.COLUMNS
    WHERE TABLE_NAME IN ('Fact_Corelines_Staging', 'Fact_Client_Refs')
      AND COLUMN_NAME NOT IN ('UIDENT', 'LOAD_DATE', 'UPDATED_DATE', 'CHECK_SUM', 'RECORD_ID')
    ORDER BY TABLE_NAME, ORDINAL_POSITION

    My first idea was to use a cursor to loop through each column and load the relevant criteria from the query above into variables.
    I should add that I have around 112 columns to check.
    I know that by mentioning the dreaded word "cursor" I have split the group in two, so apologies for that :).
    What other ideas are worth investigating for this task? (One set-based alternative is sketched after this post.)

    Cheers

    Dave
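
    For reference, here is a minimal set-based sketch of the kind of alternative being asked about: instead of looping with a cursor, use the metadata query to generate one NULL check per mandatory column and run them as a single batch. The pre-staging table name (dbo.Fact_Corelines_PreStaging) is a hypothetical stand-in, and STRING_AGG needs SQL Server 2017 or later (use FOR XML PATH on older versions).

    -- Sketch only: dbo.Fact_Corelines_PreStaging is a hypothetical, loosely typed
    -- table holding the raw file before any constraints are applied.
    DECLARE @sql nvarchar(max);

    SELECT @sql = STRING_AGG(CAST(
               'SELECT ''' + COLUMN_NAME + ''' AS failing_column, COUNT(*) AS failures'
             + ' FROM dbo.Fact_Corelines_PreStaging'
             + ' WHERE ' + QUOTENAME(COLUMN_NAME) + ' IS NULL'
               AS nvarchar(max)), ' UNION ALL ')
    FROM INFORMATION_SCHEMA.COLUMNS
    WHERE TABLE_NAME = 'Fact_Corelines_Staging'
      AND IS_NULLABLE = 'NO'
      AND COLUMN_NAME NOT IN ('UIDENT', 'LOAD_DATE', 'UPDATED_DATE', 'CHECK_SUM', 'RECORD_ID');

    -- One result set listing every mandatory column and how many rows fail it.
    EXEC sys.sp_executesql @sql;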

  • Are you just loading into those 2 tables? How is the data being loaded? Could you create a staging table with these constraints already on it and then use SSIS to load? Any failures could be redirected to an error file (a sketch of such a table follows this post).

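    A minimal sketch of that constrained staging table idea (the column names and rules below are purely hypothetical; the real definition would mirror the 112 columns returned by the INFORMATION_SCHEMA.COLUMNS query above):

    -- Hypothetical columns for illustration only. NOT NULL enforces the
    -- "mandatory" rule, and the declared types/lengths enforce data type and
    -- maximum length, so bad rows fail the insert and can be redirected by the
    -- SSIS error output.
    CREATE TABLE dbo.Fact_Corelines_Staging_Validated
    (
        CLIENT_REF varchar(20)    NOT NULL,  -- mandatory, max length 20
        TRADE_DATE date           NOT NULL,  -- mandatory, must be a valid date
        AMOUNT     decimal(18, 2) NULL,      -- optional numeric column
        CONSTRAINT CK_Amount_NonNegative CHECK (AMOUNT IS NULL OR AMOUNT >= 0)
    );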

  • Hi Mike, yes I can load to a pre-staging table with these constraints already in place and then output the errors to a file. I was wondering if there was a more elegant way to achieve it (one set-based option is sketched after this post).

    I will be loading around 16 dimensions alongside the 2 fact tables.

    I am currently speccing out this project and so have no idea yet of the number of records to be processed, hence my desire for the slickest process I can put together.

    Many thanks for replying to me,

    Dave
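
    For what it is worth, one set-based way to get that "errors to a file" step without a cursor is to split the pre-staging rows in two passes: copy the failures (with a reason) into an error table that can be exported, and load only the clean rows into the constrained staging table. The table and column names below are hypothetical and only cover a couple of example rules; TRY_CONVERT needs SQL Server 2012 or later.

    -- Hypothetical names: Fact_Corelines_PreStaging (raw file, all varchar),
    -- Fact_Corelines_Load_Errors (rejects), Fact_Corelines_Staging_Validated (clean rows).
    INSERT INTO dbo.Fact_Corelines_Load_Errors (CLIENT_REF, TRADE_DATE, ERROR_REASON)
    SELECT CLIENT_REF,
           TRADE_DATE,
           CASE
               WHEN CLIENT_REF IS NULL                    THEN 'CLIENT_REF is mandatory'
               WHEN LEN(CLIENT_REF) > 20                  THEN 'CLIENT_REF exceeds 20 characters'
               WHEN TRY_CONVERT(date, TRADE_DATE) IS NULL THEN 'TRADE_DATE is not a valid date'
           END
    FROM dbo.Fact_Corelines_PreStaging
    WHERE CLIENT_REF IS NULL
       OR LEN(CLIENT_REF) > 20
       OR TRY_CONVERT(date, TRADE_DATE) IS NULL;

    INSERT INTO dbo.Fact_Corelines_Staging_Validated (CLIENT_REF, TRADE_DATE, AMOUNT)
    SELECT CLIENT_REF,
           TRY_CONVERT(date, TRADE_DATE),
           TRY_CONVERT(decimal(18, 2), AMOUNT)
    FROM dbo.Fact_Corelines_PreStaging
    WHERE CLIENT_REF IS NOT NULL
      AND LEN(CLIENT_REF) <= 20
      AND TRY_CONVERT(date, TRADE_DATE) IS NOT NULL;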
