April 12, 2017 at 3:09 am
I checked it in the Oracle database and exported the IDs with the old links into an Excel sheet.
It's a migration project: the old links in Oracle are being migrated to the new links in SQL Server.
April 12, 2017 at 3:52 am
OK. The way I see it is this - and please correct me if I'm wrong: you have an Excel-generated script of individual UPDATE statements for a huge number of rows, and it's too big to run as a single batch.
Two options therefore occur to me:
(1) Put a "GO" statement (batch terminator) into this statement every 1000 rows or so, or
(2) Rewrite the query as an UPDATE ... FROM ... query:
UPDATE bt
SET bt.URL = it.URL
FROM BigTable bt
INNER JOIN ImportTable it ON bt.ID = it.ID
WHERE bt.URL <> it.URL
If your server still complains, you can wrap that up in a loop that updates 1000 (or whatever) rows at a time and repeats until there are no more rows to update.
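Something along these lines should do it - a rough sketch of that loop, using the same placeholder table and column names as above:

DECLARE @rows INT = 1;
WHILE @rows > 0
BEGIN
    UPDATE TOP (1000) bt
    SET bt.URL = it.URL
    FROM BigTable bt
    INNER JOIN ImportTable it ON bt.ID = it.ID
    WHERE bt.URL <> it.URL;   -- already-updated rows drop out, so the loop terminates

    SET @rows = @@ROWCOUNT;
END;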
Just out of interest, are you trying to run the update script from SSMS which is running on the server itself?
Thomas Rushton
blog: https://thelonedba.wordpress.com
April 12, 2017 at 4:28 am
Hi Thomas
Answers to your questions
Kal > The table from Oracle contains the old URLs, and the IDs actually come from the new SQL Server table, so I am matching the URLs in both tables. I was originally updating as follows:
UPDATE dbo.MEMOPOOL_DATA SET MEMODATA='https://thehub.dolphin.co/pages/ExternalLinksPage.aspx?LivelinkID=2360074&VersionID=5' WHERE MEMODATA like '%https://ezdelcomuae.dolphinenergy.com/livelink/livelink/fetch/2254013/2359262/2359206/2359000/Project_HSE_Management_Plan.pdf?nodeid=2360074&vernum=5%';
But then I modified the query to use the MEMOPOOLIDs and created a non-clustered index on that column.
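Roughly like this, I mean (the MEMOPOOLID value and index name here are just placeholders):

CREATE NONCLUSTERED INDEX IX_MEMOPOOL_DATA_MEMOPOOLID ON dbo.MEMOPOOL_DATA (MEMOPOOLID);

UPDATE dbo.MEMOPOOL_DATA
SET MEMODATA = 'https://thehub.dolphin.co/pages/ExternalLinksPage.aspx?LivelinkID=2360074&VersionID=5'
WHERE MEMOPOOLID = 12345;   -- placeholder ID; one generated statement per link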
Kal> The SQL Server table is all I have now, as I created the script using Excel.
Kal> No, I need to update it using my Excel sheet values; there is no table. I combined all the links and IDs into one Excel sheet.
Kal> I built the script using Excel and Oracle scripts.
Kal> Yes
Your script below:
UPDATE bt
SET bt.URL = it.URL
FROM BigTable bt
INNER JOIN ImportTable it ON bt.ID = it.ID
WHERE bt.URL <> it.URL
It's similar to using a temporary table; I can load the new URLs along with the corresponding MEMOPOOLIDs into a temp table using BULK INSERT and then update the URLs with a query like:
UPDATE original
SET original.URL = tt.URL
FROM OriginalTable original           -- placeholder name for the table being updated
INNER JOIN TempTable tt ON original.ID = tt.ID
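For the load step before that update, a rough sketch might look like this (the file path, temp table name, and column types are assumptions):

CREATE TABLE #TempTable (ID INT, URL NVARCHAR(2000));

BULK INSERT #TempTable
FROM 'C:\import\new_links.csv'   -- placeholder path; the Excel sheet would be saved as CSV first
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2);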
I believe this might work, but will it be faster than what I'm doing now?
Only the execution plan will answer that question, eh?
Kal
April 12, 2017 at 7:36 am
I get that there isn't a table, but the data is in a format that can be moved to a table and then you can do what Thomas is suggesting. That would be the best approach here, not using Excel to script UPDATE statements individually for huge numbers of rows.
"The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
- Theodore Roosevelt
Author of:
SQL Server Execution Plans
SQL Server Query Performance Tuning
April 12, 2017 at 12:36 pm
Grant Fritchey - Wednesday, April 12, 2017 7:36 AM
I get that there isn't a table, but the data is in a format that can be moved to a table and then you can do what Thomas is suggesting. That would be the best approach here, not using Excel to script UPDATE statements individually for huge numbers of rows.
I recommended making this table-driven in my first post on this thread yesterday morning. It would likely be even more helpful since many rows seem to get the same value. Range seeks are possible there with proper indexing, and you can update potentially many rows per page with each page hit that way.
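A generic sketch of that pattern (these table, column, and index names are illustrative, not from this thread):

CREATE NONCLUSTERED INDEX IX_BigTable_URL ON dbo.BigTable (URL);

UPDATE bt
SET bt.URL = map.NewURL
FROM dbo.BigTable bt
INNER JOIN dbo.URL_Map map ON bt.URL = map.OldURL;   -- each OldURL value becomes a range seek, updating many rows per page hit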
Best,
Kevin G. Boles
SQL Server Consultant
SQL MVP 2007-2012
TheSQLGuru on googles mail service
April 13, 2017 at 6:08 am
Hi everyone
First, I wanted to say thanks for all the help.
I was able to run the script I created, and it took 5 minutes.
Here is what I did.
The production database is called PTW3 and the testing database is called PTW_LINKS
1. I created a new database with the same collation as the original, as I was getting collation issues.
2. I created a new table in PTW_LINKS using a SELECT ... INTO statement to get the MEMOPOOLIDs and old URLs.
3. I created another table called UPDATE_LINKS and loaded the Excel sheet holding the columns OLD URL and NEW URL.
4. I modified the new table in PTW_LINKS, added a column called NEW_LINKS, then updated that column with the NEW URL field from UPDATE_LINKS.
5. Finally, I ran the update statement below (a rough sketch of steps 1-4 follows it):
UPDATE PTW3.dbo.MEMOPOOL_DATA
SET MEMODATA = PTW_LINKS.dbo.ID_LINKS.LINK_HUB
FROM PTW3.dbo.MEMOPOOL_DATA, PTW_LINKS.dbo.ID_LINKS
WHERE PTW3.dbo.MEMOPOOL_DATA.MEMOPOOLID = PTW_LINKS.dbo.ID_LINKS.MEMOPOOLID
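For reference, a rough sketch of what steps 1-4 might look like (the collation name, file path, and column types are assumptions, and the new column is called LINK_HUB here to match the UPDATE above):

USE master;
CREATE DATABASE PTW_LINKS COLLATE SQL_Latin1_General_CP1_CI_AS;   -- assumed collation; it just needs to match PTW3
GO
USE PTW_LINKS;
GO
-- Step 2: copy the MEMOPOOLIDs and old URLs out of production
SELECT MEMOPOOLID, MEMODATA AS OLD_URL
INTO dbo.ID_LINKS
FROM PTW3.dbo.MEMOPOOL_DATA;

-- Step 3: load the Excel data (saved as CSV here) into a mapping table
CREATE TABLE dbo.UPDATE_LINKS (OLD_URL NVARCHAR(2000), NEW_URL NVARCHAR(2000));
BULK INSERT dbo.UPDATE_LINKS
FROM 'C:\import\update_links.csv'   -- placeholder path
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2);

-- Step 4: add the new column and populate it (a join on the old URL is assumed)
ALTER TABLE dbo.ID_LINKS ADD LINK_HUB NVARCHAR(2000);

UPDATE il
SET il.LINK_HUB = ul.NEW_URL
FROM dbo.ID_LINKS il
INNER JOIN dbo.UPDATE_LINKS ul ON il.OLD_URL = ul.OLD_URL;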
Took me five minutes and, like every good technical person, I ran tests on the row counts before and after the update to see how many rows were updated, and so on.
I guess I used two staging tables to resolve this issue, but it worked. Thanks to everyone who helped.
Have a good week
Kal
April 13, 2017 at 6:43 am
Thanks for posting the solution that worked for you. Glad it all worked out too.
"The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
- Theodore Roosevelt
Author of:
SQL Server Execution Plans
SQL Server Query Performance Tuning
April 13, 2017 at 6:47 am
There's certainly nothing wrong with staging tables when doing ETL. Although it's sometimes "nice" to do every transformation in one hit, it's often a lot to ask to do everything all in one go. Doing it over a couple of steps can end up being faster, but also (and possibly more importantly) much more easily maintained.
Thom~
Excuse my typos and sometimes awful grammar. My fingers work faster than my brain does.
Larnu.uk