Resetting default field length on file import (varchar)

  • When I import a file (which I do a great deal) I get a default field length of 8000 for varchar fields. Is there a way to change the default to 255? I have used a machine set up that way in the past, but the person before me in the job made that change and they are long gone now.

    Please be as specific as possible, as I am a newbie. (This is a stand-alone installation that won't affect anyone else in the organization. The machine is not clustered and I am running Developer Edition on Windows XP Professional.)

    I need to do this because I often import text files with 50 to 150 fields. At 8000 characters a field (the current default) the machine returns an "Out of storage space" error. I'd like to avoid having to change each field individually, as I do now.

    Thank you in advance.

    Marshall Deixler

  • Hello.

    If your fields are specified as VARCHAR and not CHAR, then you should not run into any error as long as your actual data is less than 8060 bytes per row.

    VARCHAR(8000) only means that you cannot store more than 8000 characters in the field. It doesn't mean that it actually uses 8000 bytes. The storage is roughly the number of characters you insert into the field plus a little overhead for handling VARIABLE-sized types. You will get a warning:

    Warning: The table 'myTable' has been created but its maximum row size (1144324) exceeds the maximum number of bytes per row (8060). INSERT or UPDATE of a row in this table will fail if the resulting row length exceeds 8060 bytes.

    ...but as it says, you will have no problem inserting into it as long as your total row size is less than or equal to 8060 bytes.
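    A quick sketch of that behaviour (the table and column names here are made up for illustration):

    -- Hypothetical demo: three VARCHAR(8000) columns define a maximum
    -- row size of 24000 bytes, so CREATE TABLE raises the 8060-byte
    -- warning quoted above, but the table is still created.
    create table myTable (
        col1 varchar(8000),
        col2 varchar(8000),
        col3 varchar(8000)
    )

    -- Succeeds: the actual data in this row is only a few dozen bytes,
    -- well under the 8060-byte row limit.
    insert into myTable (col1, col2, col3)
    values ('short', 'values', 'only')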

    And you probably do not want all your fields as VARCHAR anyway, since it is a performance killer, often awkward to work with, and plain wrong in a relational-design sense.

    Regards,

    Hans 

  • It's true that only the actual data is stored, so a row can be defined with many varchar(8000) fields as long as the total row size fits in 8060 bytes, but it is difficult for the server to create good execution plans for tables where every field is varchar(8000).

    You can import the data with the default size and fix the columns later. The script below generates SELECT commands, which in turn generate ALTER TABLE commands that redefine each column to fit the largest value in it. It is also written to change NVARCHAR to VARCHAR to save space; change that if it is not a good idea in your case. It could be made to run automatically by building the strings and using EXEC, once you have tried it out and trust the results.

    You could do something with "HAVING MIN(CASE WHEN <col> > '' THEN IsNumeric(<col>) END) = 1" (HAVING rather than WHERE, since MIN is an aggregate) to find numeric columns and create different ALTER COLUMN statements; a sketch follows the script.

    set quoted_identifier off
    select
        replace(replace("select 'alter table <tbl> alter column <col> varchar(' + cast(max(len(<col>)) as varchar) + ')' from <tbl>",
            '<tbl>', tbl),
            '<col>', col)
    from (
        select quotename(table_schema) + '.' + quotename(table_name) as tbl,
            quotename(column_name) as col
        from information_schema.columns
        where table_name = 'AreaDefaults' and data_type like '%varchar' and character_maximum_length > 50
    ) x
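    And a rough sketch of that numeric-column idea (untested; the choice of int as the target type is just an assumption for illustration):

    set quoted_identifier off
    select
        -- The generated query emits its ALTER statement only when every
        -- non-empty value in the column passes IsNumeric; otherwise the
        -- HAVING filter returns nothing.
        replace(replace("select 'alter table <tbl> alter column <col> int' from <tbl> having min(case when <col> > '' then isnumeric(<col>) end) = 1",
            '<tbl>', tbl),
            '<col>', col)
    from (
        select quotename(table_schema) + '.' + quotename(table_name) as tbl,
            quotename(column_name) as col
        from information_schema.columns
        where table_name = 'AreaDefaults' and data_type like '%varchar'
    ) x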

  • I'm not sure that I've been clear on my problem.

    The files are being imported from text or csv files. Apparently the stock import package takes a tab- or comma-delimited file in as varchar fields upon import. When there are too many fields (with the maximum field length set to the default of 8000) the server gives me a "not enough storage" error, preventing me from importing the data.

    You stated "You can import the data with the default size and fix the columns later." My issue is that in order to import this table at all, I have to manually change the size of EVERY one of those 50 to 150 columns in the "Transform" submenu of the destination in the default import. Otherwise the import generates an error and the file does not import.

    There are no more than 30 to 60 characters in any given field, but the default is set to 8000. I can safely reduce the field size significantly without affecting the data.

    Doing something with ALTER TABLE won't solve my problem, because the file hasn't been imported and the table hasn't been created yet.

  • Sometimes I use the "Check SQL" button to get the table creation SQL, copy that to another editor and play with it.

    You can do a global search-and-replace to change all the 8000s to 100, then copy it back to the dialog you got it from, or into Query Analyzer to execute directly.
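    For example, the wizard's generated SQL might look something like this (a made-up table with illustrative column names):

    -- As generated: every text column defaulted to varchar (8000).
    CREATE TABLE [ImportedFile] (
        [Col001] varchar (8000) NULL,
        [Col002] varchar (8000) NULL,
        [Col003] varchar (8000) NULL
    )

    -- After a global search-and-replace of 8000 with 100:
    CREATE TABLE [ImportedFile] (
        [Col001] varchar (100) NULL,
        [Col002] varchar (100) NULL,
        [Col003] varchar (100) NULL
    )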

    Your other solution is to quit using the wizard and write something more intelligent yourself.
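    For instance, a minimal hand-rolled import might look like the sketch below; the file path, table name, and column list are assumptions, not your actual layout:

    -- Define the destination columns at a sensible size up front...
    create table dbo.MyImport (
        Field1 varchar(255) null,
        Field2 varchar(255) null,
        Field3 varchar(255) null
    )

    -- ...then bulk load the tab-delimited text file into it.
    bulk insert dbo.MyImport
    from 'C:\data\myfile.txt'           -- assumed path to your file
    with (
        fieldterminator = '\t',         -- tab-delimited, per the thread
        rowterminator = '\n'            -- handles CR+LF row endings
    )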

  • Hmm.

    Okay, could it be that it is not VARCHAR you are talking about, but the type CHAR?

    Or could it be that your field or row terminator has changed, so you are in fact importing the data wrongly? I mean, maybe the importer does not interpret the rows as you do. Maybe it tries to squeeze some number of your perceived fields into one field in the DB and so reaches the limit?

    There must be some other error than what you think, because even 1000 fields defined as VARCHAR(8000) are fine as long as the data in a row does not exceed the maximum row size in SQL Server (8060 bytes).

  • If this posts twice please forgive me. I ran into an error message trying to post it.

    Your questions are good ones. Here is what I can tell you for sure.

    1) Once imported, the fields in the file show up in Enterprise Manager as varchar 8000.

    2) The field delimiter is definitely a tab; I checked that in a text utility called J Write.

    3) I don't know for certain that the end-of-record character is CR+LF, but it imports properly with that setting once the field size is changed to something down around 200 to 255 or so.

    4) The exact wording of the message is "Not enough storage is available to complete this operation." There is 99 GB spare on the drive, and the file I just tested with is 24 MB of text with 70,000 records and 164 fields.

    5) That file is a known quantity (I actually imported this one previously by manually changing the field lengths on the import). It has 164 fields, and the longest data record in it is no more than 1,800 characters (well within the 8060 limit).

    6) I definitely had a machine previously that defaulted to 255 instead of 8000, and I did not run into these problems on that one. I wish I knew how it had been set up to do that, but the person who did it is long gone.

    Is there anything else I could add to this information to help?

    Thank you all for the help you are offering. I truly appreciate it.

  • Okay, I have a little too much to do right now, but if you'd like to change the default column max size for a package:

    1) Open the package in the designer and select Disconnected Edit from the Package menu within the designer window.

    2) In the Edit All Package Properties dialog, expand the object tree for Connections. You can then identify your problem text file connection among those shown.

    3) Expand your connection, then expand OLE DB Properties.

    4) You should then see the property Max characters per delimited column. Select it and change the Value property in the right-hand pane.

    Try to import your data now and please tell me if the data seems ABSOLUTELY correct. Maybe there was/is some problem with the settings for the file anyway?

    Regards,

    Hanslindgren
