October 2, 2003 at 4:22 am
Hi all,
I'm just wondering whether anyone has a solution to this -
In my DTS package I'm using a Transform Data task to import data from a local CSV into a table. There are several steps both before and after this that all work fine, but yesterday I added another Execute SQL task, which performs an update on the table I imported the data into. Essentially, I am populating a foreign key in the table that isn't present in the data when I first import it, which will let the rest of the application work much more easily and quickly. The problem is that I'm using the following SQL statement:
UPDATE [dbo].[tblVehicleSearch]
SET [DealershipID] = [D].[DealershipID]
FROM [DBName].[dbo].[tblDealerships] AS [D]
, [DBName].[dbo].[tblVehicleSearch] AS [V]
WHERE [V].[DealershipName] = [D].[DealershipName]
As you can see, the update involves reading from the tblVehicleSearch table as well as writing to it. tblVehicleSearch is the table I originally import the data into, and it seems that SQL Server 2K doesn't release its lock on the table when it has finished the import. I have read a little about this problem, but short of using staging tables etc. I don't see an elegant way around the locking issue.
I have confirmed that I can write to, and read from the table, just as long as I don't do both at the same time.
Any advice on how I can circumvent the issue?
TIA
Mark
October 2, 2003 at 4:25 am
Sorry, it seems the forum didn't like my table aliases and replaced them with smilies, lol. The table alias is just "V".
October 3, 2003 at 7:03 pm
Hey Mulletboy! There are a couple of options, depending on the cause of your trouble.
Quick and dirty: use the (NOLOCK) query hint (I've updated your syntax too; it's easier to read this way):
UPDATE [V1]
SET [DealershipID] = [D].[DealershipID]
FROM [DBName].[dbo].[tblDealerships] AS [D] (NOLOCK)
JOIN [DBName].[dbo].[tblVehicleSearch] AS [V1] (NOLOCK)
ON [V1].[DealershipName] = [D].[DealershipName]
Second option: look at your transactions in DTS. Under the "Workflow" properties for any of your tasks you'll see various transaction choices. Play around with these. Also, when you're using a Transform Data task you can choose to "Lock Table" when importing. This is a good idea if possible, as it speeds the import up considerably. However, if you have the read task in the same transaction as the update task, you'll run into trouble.
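For example, rather than hinting every table individually, an Execute SQL task can drop the whole batch to read uncommitted. This is just a sketch of the same update as above; setting the isolation level is roughly equivalent to putting NOLOCK on each table the batch reads from:

SET TRANSACTION ISOLATION LEVEL READ UNCOMMITTED

UPDATE [V1]
SET [DealershipID] = [D].[DealershipID]
FROM [DBName].[dbo].[tblDealerships] AS [D]
JOIN [DBName].[dbo].[tblVehicleSearch] AS [V1]
ON [V1].[DealershipName] = [D].[DealershipName]

SET TRANSACTION ISOLATION LEVEL READ COMMITTED

Same caveat as NOLOCK, of course: you can read dirty (uncommitted) data, so only do this when you know the import has actually finished writing.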
Keep fiddling around and you'll figure it out.
Signature is NULL
October 6, 2003 at 11:21 am
Hi Calvin,
Excellent, thanks for that 🙂
Two of the other tasks in the DTS are there to set a flag in a separate table, which tells the rest of the application that the search data is currently being updated and may not be used at that time. This isn't a problem as it's an internal application rather than a website, and the import takes roughly 10 seconds including the download. Therefore I think the NOLOCK option should be ideal.
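For anyone curious, those flag tasks are just a pair of Execute SQL steps along these lines (the table and column names here are illustrative, not my real ones):

-- first step in the package: take the search offline
UPDATE [dbo].[tblAppStatus] SET [SearchAvailable] = 0

-- ... import and update steps run here ...

-- last step in the package: bring the search back
UPDATE [dbo].[tblAppStatus] SET [SearchAvailable] = 1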
In the meantime I've used a temporary measure of a staging table for the import, but I'll have a go at this method as soon as I get a few minutes.
Thanks again
Mark
October 6, 2003 at 12:18 pm
You know, using a staging table isn't necessarily a bad idea with DTS. You can use optimizations (like the "table lock" referenced above) that you couldn't use on a live table. Plus, you can "wash" the data before importing it, and the table-to-table inserts are quite fast.
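For what it's worth, the staging approach only needs a handful of statements around the import. A sketch only; the staging table name here is made up:

-- empty the staging table before the Transform Data task runs
TRUNCATE TABLE [dbo].[tblVehicleSearch_Staging]

-- (the Transform Data task loads the CSV into the staging table here,
--  with the fast "Lock Table" option switched on)

-- "wash" the data: populate the foreign key while it's still in staging
UPDATE [S]
SET [DealershipID] = [D].[DealershipID]
FROM [dbo].[tblVehicleSearch_Staging] AS [S]
JOIN [dbo].[tblDealerships] AS [D]
ON [S].[DealershipName] = [D].[DealershipName]

-- then swap the clean data into the live table in one quick transaction
BEGIN TRANSACTION
DELETE FROM [dbo].[tblVehicleSearch]
INSERT INTO [dbo].[tblVehicleSearch]
SELECT * FROM [dbo].[tblVehicleSearch_Staging]
COMMIT TRANSACTION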
Good luck; DTS can be incredibly useful (even if it is a little finicky).
Signature is NULL