May 15, 2019 at 12:57 pm
Hello friends, I have a database with a table of millions of records. I update my database with INSERT or UPDATE statements from a JSON file, but I have one problem:
while I am updating my tables, users are blocked.
I have attached the table schema, the script, and the JSON file that I use.
This operation must always be performed while the database is in use.
Please help me.
May 15, 2019 at 1:04 pm
If your users are blocked running SELECTs against your tables, Read Committed Snapshot Isolation (RCSI) may help you with the issue.
Link: https://sqlperformance.com/2014/05/t-sql-queries/read-committed-snapshot-isolation
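If you want to try it, enabling RCSI looks something like this (a minimal sketch; the database name is a placeholder, and it changes read behaviour for the whole database, so test it on a non-production copy first):

-- Enable Read Committed Snapshot Isolation database-wide.
-- "YourDb" is a placeholder name; WITH ROLLBACK IMMEDIATE rolls back
-- open transactions so the change can take effect without waiting.
ALTER DATABASE YourDb
    SET READ_COMMITTED_SNAPSHOT ON
    WITH ROLLBACK IMMEDIATE;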
May 15, 2019 at 1:13 pm
You'll find it unlikely that many of us are going to download that zip file, but for those who won't, I downloaded it to a Pi to extract it instead. It contains a 6 MB JSON file, plus some DDL and batch SQL.
A 6 MB JSON file is a fair amount of data. Does this operation have to be done while users are using the system? If you're updating that many rows, you would normally try to run this as an "offline" process.
Thom~
Excuse my typos and sometimes awful grammar. My fingers work faster than my brain does.
Larnu.uk
May 15, 2019 at 1:16 pm
Hi,
yes, this must be done while they are working, unfortunately.
May 15, 2019 at 1:27 pm
Hi,
what do you mean by an "offline" process?
May 15, 2019 at 1:30 pm
Hi, what do you mean by an "offline" process?
As in, when the system isn't being used. Another phrase would be "out of hours". So you might do the process between 01:00 and 03:00, instead of between 11:00 and 13:00, which might be peak hours for your users.
Thom~
Excuse my typos and sometimes awful grammar. My fingers work faster than my brain does.
Larnu.uk
May 15, 2019 at 1:45 pm
Unfortunately that is impossible; it is an online app that is always in use.
May 15, 2019 at 5:15 pm
When doing large updates from files, I usually load the records into a staging table first, which guarantees that I have all updates loaded and nothing missing from the file. If any part of the load to staging fails, I can remove any records loaded and try again. This way, I have an "all or nothing" process from the file.
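A rough sketch of that first step, assuming SQL Server 2016+ and made-up table, column, and file names (the real ones are in your attached zip):

-- Load the whole JSON file into staging as an all-or-nothing step.
-- dbo.Staging, its columns, and the file path are invented for illustration.
BEGIN TRY
    BEGIN TRANSACTION;

    DECLARE @json NVARCHAR(MAX) =
        (SELECT BulkColumn
         FROM OPENROWSET(BULK 'C:\feeds\update.json', SINGLE_CLOB) AS f);

    INSERT INTO dbo.Staging (Id, Name, Price)
    SELECT Id, Name, Price
    FROM OPENJSON(@json)
         WITH (Id    INT            '$.id',
               Name  NVARCHAR(100)  '$.name',
               Price DECIMAL(10, 2) '$.price');

    COMMIT TRANSACTION;
END TRY
BEGIN CATCH
    -- Any failure rolls the whole load back, so staging never holds a partial file.
    IF @@TRANCOUNT > 0 ROLLBACK TRANSACTION;
    THROW;
END CATCH;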
The next step is to apply the changes from the staging table to the production table. I do that in batches, so that each batch completes quickly and doesn't interfere with other users. Inserts are easy; use a DELETE from staging with an OUTPUT to production. That single atomic statement acts like a "move". The updates are a little more challenging, but they can be batched as well. I use a temp table to get a small set, and in a transaction UPDATE production, then DELETE from staging, then COMMIT.
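Sketched out with the same invented names (one caveat: OUTPUT ... INTO can't target a table with enabled triggers or foreign keys, so check your schema first):

-- Inserts: "move" new rows from staging to production in small batches.
-- Each DELETE ... OUTPUT is a single atomic statement.
WHILE 1 = 1
BEGIN
    DELETE TOP (5000) s
    OUTPUT deleted.Id, deleted.Name, deleted.Price
        INTO dbo.Production (Id, Name, Price)
    FROM dbo.Staging AS s
    WHERE NOT EXISTS (SELECT 1 FROM dbo.Production AS p WHERE p.Id = s.Id);

    IF @@ROWCOUNT = 0 BREAK;
END;

-- Updates: pull a small batch into a temp table, then update production
-- and clear staging inside one short transaction per batch.
WHILE 1 = 1
BEGIN
    DROP TABLE IF EXISTS #batch;

    SELECT TOP (5000) s.Id, s.Name, s.Price
    INTO #batch
    FROM dbo.Staging AS s
    INNER JOIN dbo.Production AS p ON p.Id = s.Id;

    IF @@ROWCOUNT = 0 BREAK;

    BEGIN TRANSACTION;

    UPDATE p
    SET p.Name  = b.Name,
        p.Price = b.Price
    FROM dbo.Production AS p
    INNER JOIN #batch AS b ON b.Id = p.Id;

    DELETE s
    FROM dbo.Staging AS s
    INNER JOIN #batch AS b ON b.Id = s.Id;

    COMMIT TRANSACTION;
END;

Keeping each batch small means each transaction is short, so other sessions are blocked only briefly, if at all.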