SQL Server 2017: INSERT and UPDATE lock database

  • Hello friends, I have a database with a table containing millions of records. I update my database with INSERT or UPDATE statements from a JSON file, but I have one problem:

    while I update my tables, users are blocked.

    I attach the table schema, the script, and the JSON file that I use.

    This operation must always be performed while the database is in use.

    Please help me.

     

    Attachments:
  • If your users are blocked while running SELECTs against your tables, RCSI (read committed snapshot isolation) may help with the issue; a quick sketch of enabling it is below the link.

    link:

    https://sqlperformance.com/2014/05/t-sql-queries/read-committed-snapshot-isolation
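
    For reference, turning it on looks roughly like this. The database name is a placeholder, and note that RCSI keeps row versions in tempdb, so check the extra tempdb load before enabling it in production:

        -- Check whether RCSI is already on for the database
        SELECT name, is_read_committed_snapshot_on
        FROM sys.databases
        WHERE name = N'YourDatabase';   -- placeholder name

        -- Enable RCSI; WITH ROLLBACK IMMEDIATE rolls back open transactions,
        -- otherwise the ALTER waits for exclusive access to the database.
        ALTER DATABASE [YourDatabase]
        SET READ_COMMITTED_SNAPSHOT ON
        WITH ROLLBACK IMMEDIATE;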

  • You'll find it unlikely that many of us are going to download that zip file, but for those who won't, I did download it to a Pi to extract instead. It contains a 6MB JSON file, and some DDL and batch SQL.

    Considering that the JSON file is 6MB, it's pretty big. Does this operation have to be done while users are using the system? If you're updating that many rows, normally you would try to do this as an "offline" process.

    Thom~

    Excuse my typos and sometimes awful grammar. My fingers work faster than my brain does.
    Larnu.uk

  • Hi,

    Yes, this must be done while they are working, unfortunately.

  • Hi,

    how "offline" process?

  • emanuele 8962 wrote:

    Hi, how "offline" process?

    As in, when the system isn't being used by users. Another phrase would be "out of hours". So, you might do the process between 01:00 and 03:00, instead of 11:00 and 13:00, which might be peak hours for your users.

    Thom~

    Excuse my typos and sometimes awful grammar. My fingers work faster than my brain does.
    Larnu.uk

  • Unfortunately that is impossible; it is an online app that is always in use.

  • When doing large updates from files, I usually load the records into a staging table first, which guarantees that I have all updates loaded and nothing missing from the file.  If any part of the load to staging fails, I can remove any records loaded and try again.  This way, I have an "all or nothing" process from the file.

    The next step is to apply the changes from the staging table to the production table.  I do that in batches, so that each batch completes quickly and doesn't interfere with other users.  Inserts are easy; use a DELETE from staging with an OUTPUT to production.  That single atomic statement acts like a "move".  The updates are a little more challenging, but they can be batched as well.  I use a temp table to get a small set, and in a transaction UPDATE production, then DELETE from staging, then COMMIT.
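
    Very roughly, the whole pattern looks like the sketch below. The table names, column names, JSON paths, batch size, and file path are all placeholders rather than the actual schema from the attachment, so treat it as an outline, not something to run as-is:

        -- 1) Load the JSON file into staging (the "all or nothing" step).
        --    OPENJSON needs database compatibility level 130 or higher.
        DECLARE @json NVARCHAR(MAX);

        SELECT @json = BulkColumn
        FROM OPENROWSET(BULK 'C:\Import\users.json', SINGLE_CLOB) AS src;  -- placeholder path

        INSERT INTO dbo.Users_Staging (UserId, Email, FullName)
        SELECT UserId, Email, FullName
        FROM OPENJSON(@json)
             WITH (UserId   INT           '$.userId',
                   Email    NVARCHAR(256) '$.email',
                   FullName NVARCHAR(256) '$.fullName');

        -- 2) New rows: "move" them from staging to production in small batches.
        --    DELETE ... OUTPUT INTO is a single atomic statement (the OUTPUT target
        --    can't have enabled triggers or take part in a foreign key).
        WHILE 1 = 1
        BEGIN
            DELETE TOP (5000) s
            OUTPUT deleted.UserId, deleted.Email, deleted.FullName
            INTO dbo.Users (UserId, Email, FullName)
            FROM dbo.Users_Staging AS s
            WHERE NOT EXISTS (SELECT 1 FROM dbo.Users AS u WHERE u.UserId = s.UserId);

            IF @@ROWCOUNT = 0 BREAK;   -- nothing left to insert
        END;

        -- 3) Existing rows: update in small batches inside short transactions,
        --    then remove the applied rows from staging.
        DECLARE @batch TABLE (UserId INT PRIMARY KEY);

        WHILE 1 = 1
        BEGIN
            DELETE FROM @batch;

            INSERT INTO @batch (UserId)
            SELECT TOP (5000) UserId
            FROM dbo.Users_Staging;

            IF @@ROWCOUNT = 0 BREAK;   -- staging is empty, all changes applied

            BEGIN TRANSACTION;

                UPDATE u
                SET    u.Email    = s.Email,
                       u.FullName = s.FullName
                FROM dbo.Users AS u
                JOIN dbo.Users_Staging AS s ON s.UserId = u.UserId
                JOIN @batch AS b            ON b.UserId = s.UserId;

                DELETE s
                FROM dbo.Users_Staging AS s
                JOIN @batch AS b ON b.UserId = s.UserId;

            COMMIT TRANSACTION;
        END;

    Keeping each batch small means each statement takes locks only briefly, which is what stops other users from being blocked for the whole load.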

