When doing large updates from files, I usually load the records into a staging table first, which guarantees that every record from the file is loaded before anything touches production. If any part of the load to staging fails, I can clear out whatever was loaded and try again. This way, getting the data out of the file is an "all or nothing" process.
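Something like this T-SQL sketches the idea; the staging table name, file path, and format options are placeholders for illustration:

    -- Load the file into staging; on any failure, empty the staging
    -- table so the whole load can simply be rerun from scratch.
    BEGIN TRY
        BULK INSERT dbo.UpdateStaging
        FROM 'C:\loads\updates.csv'
        WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2);
    END TRY
    BEGIN CATCH
        TRUNCATE TABLE dbo.UpdateStaging;
        THROW;
    END CATCH;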
The next step is to apply the changes from the staging table to the production table. I do that in batches, so that each batch completes quickly and doesn't interfere with other users. Inserts are easy: a DELETE from staging with an OUTPUT INTO clause that writes the deleted rows to production. That single atomic statement acts like a "move", so a row is always either still in staging or already in production, never half-copied. The updates are a little more challenging, but they can be batched as well: I use a temp table to grab a small set of rows, then in a transaction UPDATE production, DELETE those rows from staging, and COMMIT.
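For the inserts, one batched "move" might look like the sketch below. The table and column names are made up, and IsNew is a hypothetical flag distinguishing new rows from updates; note also that the OUTPUT ... INTO target can't have enabled triggers or participate in foreign keys:

    -- Move staged rows into production 5,000 at a time until none
    -- are left; each DELETE ... OUTPUT INTO is a single atomic
    -- statement, so rows are never half-moved.
    WHILE 1 = 1
    BEGIN
        DELETE TOP (5000) FROM dbo.UpdateStaging
        OUTPUT DELETED.CustomerID, DELETED.Name, DELETED.Balance
            INTO dbo.Customers (CustomerID, Name, Balance)
        WHERE IsNew = 1;    -- hypothetical flag for new rows

        IF @@ROWCOUNT = 0 BREAK;
    END;

And one update batch, with the same hypothetical names, could look like this; repeat it until staging has no update rows left:

    -- Pick a small set of staged updates into a temp table.
    SELECT TOP (5000) s.CustomerID, s.Balance
    INTO #Batch
    FROM dbo.UpdateStaging AS s
    WHERE s.IsNew = 0;

    BEGIN TRANSACTION;

    -- Apply the batch to production ...
    UPDATE c
    SET c.Balance = b.Balance
    FROM dbo.Customers AS c
    JOIN #Batch AS b ON b.CustomerID = c.CustomerID;

    -- ... and remove exactly those rows from staging.
    DELETE s
    FROM dbo.UpdateStaging AS s
    JOIN #Batch AS b ON b.CustomerID = s.CustomerID;

    COMMIT;

    DROP TABLE #Batch;

Keeping the TOP value small is what keeps each batch's locks short-lived; tune it to whatever your system handles without blocking other users.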