January 14, 2016 at 10:31 am
Hi, our devs are testing a new application which bulk inserts 2.4 million rows of data. It is failing because the instance is running out of memory to obtain locks (I think!). In the error log I can see:
"The instance of the SQL Server Database Engine cannot obtain a LOCK resource at this time. Rerun your statement when there are fewer active users. Ask the database administrator to check the lock and memory configuration for this instance, or to check for long-running transactions."
Looking at the sys.dm_tran_locks view I can see the session is generating around 4 million locks.
Looking at sys.dm_os_memory_clerks I can see OBJECTSTORE_LOCK_MANAGER is using just over 3GB of memory. The instance has 5GB of memory allocated.
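For reference, those checks can be run with queries along these lines (a sketch; the memory columns in sys.dm_os_memory_clerks vary by version, with pages_kb available from SQL Server 2012 onwards):

-- Locks held per session
SELECT request_session_id, COUNT(*) AS lock_count
FROM sys.dm_tran_locks
GROUP BY request_session_id
ORDER BY lock_count DESC;

-- Memory consumed by the lock manager
SELECT type, SUM(pages_kb) / 1024 AS memory_mb
FROM sys.dm_os_memory_clerks
WHERE type = 'OBJECTSTORE_LOCK_MANAGER'
GROUP BY type;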
Firstly, is this expected behaviour with BULK INSERT? I have asked the devs to batch their imports, along the lines sketched below.
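For what batching might look like with BULK INSERT itself (a sketch; the table name and file path are placeholders), the BATCHSIZE option commits after every N rows, so each batch's locks are released at commit rather than accumulating for the whole load:

-- Commit every 100,000 rows; locks are released at each batch commit
BULK INSERT dbo.TargetTable
FROM 'C:\loads\data.dat'
WITH (BATCHSIZE = 100000);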
Secondly, is there any way to calculate the memory requirements for the loads the application will see in real life?
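As a rough starting point, Microsoft's documentation puts each lock at about 96 bytes of memory, so a back-of-envelope estimate (ignoring lock-owner blocks, which push real usage higher) would be:

-- ~96 bytes per lock; 4 million locks is roughly 366 MB before owner-block overhead
SELECT 4000000 * 96 / 1024.0 / 1024.0 AS approx_lock_memory_mb;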
Any help with this would be great.
Thanks.
January 14, 2016 at 1:52 pm
For a BULK INSERT, you'd typically want to take a table lock (TABLOCK) anyway, both to allow minimal logging and to reduce locking and logging overhead as much as possible. I'd suggest doing that if at all possible.
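Something along these lines (a sketch; the table name and file path are placeholders), keeping in mind that minimal logging also requires the SIMPLE or BULK_LOGGED recovery model and a suitable target table (e.g. a heap or an empty table):

-- A single table-level lock instead of millions of row/page locks
BULK INSERT dbo.TargetTable
FROM 'C:\loads\data.dat'
WITH (TABLOCK);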
SQL DBA,SQL Server MVP(07, 08, 09) "It's a dog-eat-dog world, and I'm wearing Milk-Bone underwear." "Norm", on "Cheers". Also from "Cheers", from "Carla": "You need to know 3 things about Tortelli men: Tortelli men draw women like flies; Tortelli men treat women like flies; Tortelli men's brains are in their flies".