Simple OPENROWSET questions

  • I am running the following statement:

    SELECT cast(field1 as varchar(10)) as field1, cast(field2 as varchar(10)) as field2
    INTO file1table
    FROM OPENROWSET('MSDASQL', 'Driver={Microsoft Text Driver (*.txt; *.csv)};DefaultDir=D:\;'
        , 'SELECT * from D:\file1.csv')

    However, file1.csv contains 2 fields: field1 is a long (bigint) integer that I want stored as a varchar, and field2 is a smaller integer (one that fits into an int type).

    Of course, I could import this in other ways, but I want to import it as specified.

    The problem is that field1 ends up being NULL for any value above the maximum value for integer.

    Does anyone know how I can have this import all the data and not lose any?

  • IMO, BULK INSERT (check BOL) might serve you better.

    e.g.

    BULK INSERT mydb.myschema.myobject

    from '\\UNC_path\myobjectdata.txt'

    WITH

    ( FIELDTERMINATOR = ','

    --,CODEPAGE = 'raw'

    , FIRSTROW = 2

    )

    go
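
    To preserve all the digits and the leading zeros, the target table can be created up front with varchar columns, so BULK INSERT never converts the values to a numeric type. A sketch using the names from the first post; the column widths are guesses for illustration:

    ```sql
    -- Hypothetical target table; widen the varchar columns as needed.
    CREATE TABLE dbo.file1table
    (
        field1 varchar(30) NOT NULL,   -- character data keeps leading zeros and bigint-range digits
        field2 varchar(15) NULL
    );

    BULK INSERT dbo.file1table
    FROM 'D:\file1.csv'
    WITH
    (
        FIELDTERMINATOR = ',',
        ROWTERMINATOR   = '\n',
        FIRSTROW        = 2            -- skip the header row
    );
    ```

    Because the columns are already varchar, no implicit numeric conversion takes place during the load.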

    Johan

    Learn to play, play to learn !

    Dont drive faster than your guardian angel can fly ...
    but keeping both feet on the ground wont get you anywhere :w00t:

    - How to post Performance Problems
    - How to post data/code to get the best help

    - How to prevent a sore throat after hours of presenting ppt

    press F1 for solution, press shift+F1 for urgent solution 😀

    Need a bit of Powershell? How about this

    Who am I ? Sometimes this is me but most of the time this is me

  • I definitely agree, and I appreciate the response.

    However, my situation is that they are 'telling me' to use OPENROWSET even after numerous discussions.

    However, it does not work when the numbers are in the bigint range: it imports the rows but makes field1 NULL. Is there any way to control the data type? I would actually prefer it imported as varchar, because the values have leading 0's that I want to keep.

    Thoughts?
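
    One way to control the types the Microsoft Text Driver infers (not raised in the thread, so treat this as a sketch): put a schema.ini file in the same directory as the CSV (the DefaultDir) and declare field1 as Text. The driver then reads the column as character data, which keeps the leading zeros and avoids the numeric inference that NULLs out big values. The widths below are illustrative guesses:

    ```ini
    [file1.csv]
    Format=CSVDelimited
    ColNameHeader=True
    Col1=field1 Text Width 30
    Col2=field2 Text Width 15
    ```

    With the columns declared as Text, the cast to varchar in the original OPENROWSET statement should work without NULLs; just make sure the varchar length is at least as wide as the longest value.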

  • I believe you may be experiencing something similar to this bug:

    http://support.microsoft.com/kb/194124

  • Here are the results of the test I performed:

    (SQL 2005 Dev Edition, SP2 CU5)

    set nocount on

    -- is "Ad Hoc Distributed Queries" enabled ? if not, enable it

    declare @SQLConfigOption varchar(256)

    Create table #tmpConfigOption (configName varchar(128), MinValue varchar(15), MaxValue varchar(15), ConfigValue varchar(15), RunValue varchar(15))

    Set @SQLConfigOption = 'sp_configure @configname = ''Ad Hoc Distributed Queries'''

    insert into #tmpConfigOption

    exec (@SQLConfigOption)

    if exists (select * from #tmpConfigOption where configName = 'Ad Hoc Distributed Queries' and RunValue = '0' )

    begin

    exec sp_configure @configname = 'Ad Hoc Distributed Queries', @configvalue = '1' ;

    RECONFIGURE WITH OVERRIDE;

    End

    go

    /*

    * Now the test : http://www.sqlservercentral.com/Forums/Topic449625-338-1.aspx#bm449892

    */

    /*

    -- content of Import_OpenRowset.csv

    field1,field2

    1,a

    2,b

    3,c

    9223372036,10Pos

    92233720368,11Pos

    922337203685,12Pos

    9223372036854,13Pos

    92233720368547,14Pos

    922337203685477,15Pos

    9223372036854775,16Pos

    922337203685477580,TheMaxBigInt

    */

    SELECT cast(field1 as varchar(10)) as field1

    , cast(field2 as varchar(15)) as field2

    INTO dbo.file1table

    FROM OPENROWSET('MSDASQL','Driver={Microsoft Text Driver (*.txt; *.csv)};DefaultDir=X:\;'

    , 'SELECT * from X:\Test\Import_OpenRowset.csv')

    /*

    -- result

    Msg 232, Level 16, State 2, Line 5

    Arithmetic overflow error for type varchar, value = 9223372036.000000.

    The statement has been terminated.

    Msg 208, Level 16, State 1, Line 2

    */

    go

    SELECT cast(field1 as bigint) as field1

    , cast(field2 as varchar(15)) as field2

    INTO dbo.file1table2

    FROM OPENROWSET('MSDASQL','Driver={Microsoft Text Driver (*.txt; *.csv)};DefaultDir=X:\;'

    , 'SELECT * from X:\Test\Import_OpenRowset.csv')

    go

    Select * from dbo.file1table;

    /*

    Invalid object name 'dbo.file1table'.

    */

    go

    Select * from dbo.file1table2;

    /*

    field1 field2

    -------------------- ---------------

    1 a

    2 b

    3 c

    9223372036 10Pos

    92233720368 11Pos

    922337203685 12Pos

    9223372036854 13Pos

    92233720368547 14Pos

    922337203685477 15Pos

    9223372036854774 16Pos

    922337203685477632 TheMaxBigInt

    */

    go

    IF OBJECT_ID('dbo.file1table','U') IS NOT NULL

    DROP TABLE dbo.file1table

    GO

    IF OBJECT_ID('dbo.file1table2','U') IS NOT NULL

    DROP TABLE dbo.file1table2

    GO

    /*

    * test has ended

    */

    -- did we enable "Ad Hoc Distributed Queries" ? If yes, disable it

    if exists (select * from #tmpConfigOption where configName = 'Ad Hoc Distributed Queries' and RunValue = '0' )

    begin

    exec sp_configure @configname = 'Ad Hoc Distributed Queries', @configvalue = '0' ;

    RECONFIGURE WITH OVERRIDE;

    End

    go

    drop table #tmpConfigOption

    go
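
    A side note on the last result row: 922337203685477580 came back as 922337203685477632 because the text driver hands the column over as float (the earlier error message, "Arithmetic overflow error for type varchar, value = 9223372036.000000", shows the float conversion), and a float's 53-bit mantissa cannot represent every 18-digit integer exactly. The rounding can be reproduced directly:

    ```sql
    -- Near 9.2e17, adjacent float values are 128 apart, so the
    -- literal rounds to the nearest representable value.
    SELECT CAST(CAST(922337203685477580 AS float) AS bigint);
    -- 922337203685477632
    ```

    So even the bigint cast that "works" is silently losing precision on values this large; only a character-based import preserves the digits exactly.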

    Johan

  • Sounds as though the IMEX flag might be a concern.

    However, based on some reading, I believe that only applies to Excel files.

    I'm using the text driver (due to this file having 3 million rows). Again, I know this should be done with a BULK operation, but the 'powers that be' state they want it this way.

    I've 'worked around' it (did it right) with a BULK operation until I can find a solution, but no luck as of yet.

    I guess putting quotes around everything might fix it, but that would also bump up the file size, not to mention I'd have to write something to add the quotes.

    Thank you for any additional assistance.
