Importing large text file to table

  • Hi there,

    I was given a text file with 750 columns and 5,000,000,000 rows. The size of the text file is about 5GB. Is it possible to put all of this into a single table? I was only told that none of the columns exceed VARCHAR(2000).

    I tried using the "SQL Server Import and Export Wizard" but I get an error saying there is not enough memory.

    How can I resolve this issue?

    Thx

    MR

  • This might help:

    https://technet.microsoft.com/en-us/library/ms190421(v=sql.105).aspx

    bcp will attempt to load the entire file in a single transaction by default, so use the -b <batch_size> option to commit in batches, e.g. -b 10000, as in the sketch below.
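
    A minimal sketch, assuming a pipe-delimited source file, Windows authentication (-T), and placeholder database, table, server, and file names:

        bcp MyDatabase.dbo.MyBigTable in "C:\data\bigfile.txt" -S MyServer -T -c -t"|" -b 10000 -e "C:\data\bcp_errors.log"

    Here -c loads the data as character data, -t sets the field terminator, -b 10000 commits every 10,000 rows instead of one huge transaction, and -e writes rows that fail to load to an error file.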

    - Damian

  • rash3554 (10/26/2016)


    Hi there,

    I was given a text file with 750 columns and 5,000,000,000 rows. The size of the text file is about 5GB. Is it possible to put all of this into a single table? I was only told that none of the columns exceed VARCHAR(2000).

    I tried using the "SQL Server Import and Export Wizard" but I get an error saying there is not enough memory.

    How can I resolve this issue?

    Thx

    MR

    Something is wrong here. If there are 5 billion rows and the file is only 5GB, then each row would average only about 1 byte, which couldn't possibly hold 750 columns.

    --Jeff Moden


    RBAR is pronounced "ree-bar" and is a "Modenism" for Row-By-Agonizing-Row.
    First step towards the paradigm shift of writing Set Based code:
        Stop thinking about what you want to do to a ROW... think, instead, of what you want to do to a COLUMN.

    Change is inevitable... Change for the better is not.


    Helpful Links:
    How to post code problems
    How to Post Performance Problems
    Create a Tally Function (fnTally)
