Number of fields in SQL Server table

  • Hello,

    Looking at the SQL Server specifications, I see that the maximum number of fields per table is 1,024 in SQL Server 2000. I have created a table with about 150 fields, but when I "design" the table in Enterprise Manager it can take about 10 seconds to show the fields. I have now been asked to create a table with over 400 fields in it and am a bit concerned about how SQL Server (and any apps using the table) will perform. All the fields are logically related to the key and the table is properly normalised. Will SQL Server handle it easily, or is there another strategy I can use for the design of the table? Someone suggested splitting the table, but as the table is normalised, this doesn't seem sensible to me.

    Any suggestions gratefully accepted.

    Michael Jennings.

    As always, it depends on the usage of your table.

    400 columns per table, and normalised? I'm in no position to judge, but that seems like a lot: against the roughly 8,060-byte row size limit, that averages only about 20 bytes per column.

    Prepare for huge wasted space when updates occur!

    Did you figure out how many indexes you'd provide/foresee, and on which columns?

    We've got some insert-only tables with up to 580 columns and only one key field (datetime). Access is always over a range of datetime, selecting up to 15 columns at a time. These tables capture PLC readings for valves, cylinder speed, and so on. In that context I did some testing, pivoting the data into tables of the shape (datetime, valueKey, value); it ate up to 4 times the space and, because of the way the data was handled, gave no performance gain, so that effort was stopped over there.
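    For reference, a minimal sketch of the narrow shape that was tested; the table name, the int key type, and the decimal precision are my assumptions, not the actual production schema:

    ```sql
    -- Hypothetical narrow (key/value) layout: one row per reading instead of
    -- one column per reading in the 580-column table.
    CREATE TABLE dbo.plc_readings (
        reading_time datetime     NOT NULL,  -- the single key field of the wide table
        value_key    int          NOT NULL,  -- identifies which measurement this row holds
        [value]      decimal(7,5) NULL,
        CONSTRAINT PK_plc_readings PRIMARY KEY CLUSTERED (reading_time, value_key)
    );
    ```

    Repeating the datetime key and per-row overhead for every single value is where the roughly fourfold space growth comes from.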

    Alteration of these tables is only ever done with "alter table mytable add mynewcolumn decimal(7,5) NULL"; this way the existing data is not touched at alter time. Rebuilds are only done during planned downtime!
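    As a sketch of that pattern (the table and column names are just the placeholders from the statement above):

    ```sql
    -- A nullable column with no default is a metadata-only change, so the
    -- existing rows are not rewritten.
    ALTER TABLE mytable ADD mynewcolumn decimal(7,5) NULL;

    -- By contrast, NOT NULL plus a DEFAULT forces every existing row to be
    -- updated, which is the kind of change that waits for planned downtime:
    -- ALTER TABLE mytable ADD anothercolumn decimal(7,5) NOT NULL DEFAULT 0;
    ```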

    These tables are not handled (designed) using EM, because it tends to use a load of "rename, create, insert into" sequences.


    Johan


  • I'd be willing to bet that the table is anything but "properly normalized", but taking your assertion at face value...

    SQLDMO (which is what EM and QA Object Explorer use) is known to have performance problems when enumerating large numbers of objects. Your application(s), on the other hand, are not likely to be using SQLDMO and should not suffer the same problems. After all, your application doesn't need to list all the columns before it can query them...hopefully.
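    If you do need to inspect wide tables without going through EM, a plain metadata query sidesteps SQLDMO entirely; a minimal sketch:

    ```sql
    -- Column counts per table (and view), straight from the ANSI metadata
    -- views, with no SQLDMO enumeration involved.
    SELECT TABLE_NAME, COUNT(*) AS column_count
    FROM INFORMATION_SCHEMA.COLUMNS
    GROUP BY TABLE_NAME
    ORDER BY column_count DESC;
    ```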


    Hello, thank you for your replies. The information I am recording relates to school inspections. The table is normalised in that each data item in the table relates to an individual inspection and only that inspection. The table's key is an inspection identifier, which is an autonumber. There will not be a great volume of records: about 1,000 schools, each inspected once every 7 years. There may well be a lot of wasted space per record, as many of the data items are not mandatory, but with that volume of records I do not see it as a problem.

    The vast majority of the fields contain answers to questions such as

    'Opportunities for Leaders to Develop Further Skills'

    Valid values are 1 to 5, with 1 being poor and 5 being excellent. I intend to use tinyints to store these values, so the table should not exceed SQL Server's row size limit of 8,060 bytes.

    To be honest, when I created the first table I mentioned above, with 150 fields, I was surprised to see how much EM slowed down when I designed the table, especially since the SQL Server spec allows 1,024 columns per table. This is new territory for me (tables of this size), and I know that other tables of similar size will be needed further down the line.
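    As a rough check on the arithmetic: 400 tinyint columns take about 400 bytes of data, plus the null bitmap and fixed row overhead, so the row sits comfortably under the 8,060-byte limit. A minimal sketch of the pattern, with hypothetical column names standing in for the real questions:

    ```sql
    -- Illustrative fragment only: the real table would carry roughly 400
    -- such rating columns, one per inspection question.
    CREATE TABLE Inspections (
        Inspection_ID int IDENTITY(1,1) NOT NULL PRIMARY KEY,
        Leader_Skills_Opportunities tinyint NULL
            CHECK (Leader_Skills_Opportunities BETWEEN 1 AND 5),
        Teaching_Quality tinyint NULL
            CHECK (Teaching_Quality BETWEEN 1 AND 5)
    );
    ```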


    You really should suggest doing something like this (a T-SQL sketch of the design follows the list).

    Schools (details about the school):
    - School
    - Address
    - ...

    Surveys (the base details of the survey):
    - Survey_ID
    - Date_Performed
    - Performed_By

    Survey_Answers (the answers):
    - Survey_ID (foreign key to Surveys.Survey_ID)
    - Question_ID (foreign key to Questions.Question_ID)
    - Answer_ID (foreign key to Answers.Answer_ID)

    Questions (the question list):
    - Question_ID
    - Question_Desc

    Answers (the answer list):
    - Answer_ID
    - Answer_Desc

    Then of course you will need to consider how to display the questions and related answers so they are reusable. I have several different survey systems myself, and I can guarantee you that 400 will not be the limit; you have to anticipate future change in both the questions and the available answers.
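    A minimal T-SQL sketch of that design. The data types are assumptions (the post only names the columns), and the School_ID column in Surveys is my guess at how a survey ties back to its school:

    ```sql
    -- Sketch of the suggested survey schema; types and sizes are illustrative.
    CREATE TABLE Schools (
        School_ID int IDENTITY(1,1) NOT NULL PRIMARY KEY,
        School    varchar(100) NOT NULL,
        Address   varchar(200) NULL
    );

    CREATE TABLE Questions (
        Question_ID   int IDENTITY(1,1) NOT NULL PRIMARY KEY,
        Question_Desc varchar(255) NOT NULL
    );

    CREATE TABLE Answers (
        Answer_ID   int IDENTITY(1,1) NOT NULL PRIMARY KEY,
        Answer_Desc varchar(255) NOT NULL
    );

    CREATE TABLE Surveys (
        Survey_ID      int IDENTITY(1,1) NOT NULL PRIMARY KEY,
        School_ID      int NOT NULL REFERENCES Schools (School_ID),  -- assumed link
        Date_Performed datetime NOT NULL,
        Performed_By   varchar(100) NOT NULL
    );

    CREATE TABLE Survey_Answers (
        Survey_ID   int NOT NULL REFERENCES Surveys (Survey_ID),
        Question_ID int NOT NULL REFERENCES Questions (Question_ID),
        Answer_ID   int NOT NULL REFERENCES Answers (Answer_ID),
        PRIMARY KEY (Survey_ID, Question_ID)  -- one answer per question per survey
    );
    ```

    With this shape, adding a new question is an INSERT into Questions rather than an ALTER TABLE against a 400-column table.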
