Multiple processes, locking

  • I have to load about 10 million records into a table. I have split the load into 8 processes to improve performance, and I plan to run 8 insert queries simultaneously. The target table is not partitioned.

    Could I run into deadlock issues? How can I avoid them?

    Does SQL Server 2005 allow hash partitioning of tables?

    Thanks!

    Shailendra

  • Just a question: what's the source of these 10 million records? You will want to bulk-load the table while the database is in bulk-logged recovery mode; otherwise, logging to the transaction log is what will slow you down.
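    As a rough sketch, switching recovery models looks like the following ("MyDatabase" is a placeholder; note that in SQL Server 2005 only certain operations, such as BULK INSERT or SELECT INTO, are minimally logged, so a plain INSERT...SELECT may not benefit):

    ```sql
    -- Switch to bulk-logged recovery so qualifying bulk operations
    -- are minimally logged. "MyDatabase" is a placeholder name.
    ALTER DATABASE MyDatabase SET RECOVERY BULK_LOGGED;

    -- ... run the bulk load here ...

    -- Switch back, then take a log backup to keep the log chain intact.
    ALTER DATABASE MyDatabase SET RECOVERY FULL;
    ```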

    Also, I don't think you'll run into any deadlocking issues, but you could potentially run into blocking, which would render your 8 parallel inserts useless.

    Not sure what you mean about hash partitioning, but SQL Server does allow for horizontal partitioning. You could potentially break up the table into 8 separate partitions spread across separate physical disks and try your parallel inserts that way. I've never done it like that, but in theory it seems possible.
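    A hypothetical sketch of that setup, assuming an integer key column and filegroups FG1 through FG8 on separate disks (all names and boundary values here are made up for illustration):

    ```sql
    -- Split the key range into 8 partitions via a range partition function.
    CREATE PARTITION FUNCTION pf_LoadRange (int)
    AS RANGE LEFT FOR VALUES (1250000, 2500000, 3750000, 5000000,
                              6250000, 7500000, 8750000);

    -- Map each partition to its own filegroup (ideally one per disk).
    CREATE PARTITION SCHEME ps_LoadRange
    AS PARTITION pf_LoadRange
    TO (FG1, FG2, FG3, FG4, FG5, FG6, FG7, FG8);

    -- Create the target table on the partition scheme.
    CREATE TABLE SQL_Table
    (
        Id int NOT NULL
        -- ... other columns ...
    ) ON ps_LoadRange (Id);
    ```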

    Honestly though, 10 million records doesn't seem like a lot... Are you sure you don't have any indexes on this table that could be slowing the insert down? If so, you can disable them during the insert and rebuild them after it's complete.
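    For example, something along these lines ("IX_SQL_Table_Col" is a placeholder index name; only disable nonclustered indexes, since disabling the clustered index makes the table inaccessible):

    ```sql
    -- Disable a nonclustered index before the load.
    ALTER INDEX IX_SQL_Table_Col ON SQL_Table DISABLE;

    -- ... load the data ...

    -- Rebuild it once the load is complete.
    ALTER INDEX IX_SQL_Table_Col ON SQL_Table REBUILD;
    ```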

  • The source is an Oracle database. My query is:

    ```sql
    Insert into SQL_Table (columns)
    Select columns from openquery (Linked_server, 'Query_Oracle')
    ```

    I ran multiple queries without any partitioning and ran into deadlocks. After I range-partitioned the target table, the processes went through.
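    One way to read that result: with the table range-partitioned, each of the 8 processes can be given a distinct key range so its rows land in its own partition and the inserts don't contend. A hedged sketch of one such process (the "Id" column, the Oracle table name, and the boundary value are all placeholders):

    ```sql
    -- Process 1 of 8: pull only the rows whose key falls in
    -- this partition's range, so parallel loads don't overlap.
    INSERT INTO SQL_Table (columns)
    SELECT columns
    FROM OPENQUERY(Linked_server,
         'SELECT columns FROM Oracle_Table WHERE Id <= 1250000');
    ```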
