December 17, 2014 at 4:44 am
Brandie Tarvin (12/17/2014)
Brandie Tarvin (12/16/2014)
WHOO! #HappySnoopyDance
I just spent 2 1/2 days reworking a package and its supporting tables / stored procedures. Previous execution time: between 8 and 12 hours.
Current execution time ... < 5 minutes.
"Celebration Time, come on!"
Sigh. So I presented coworker with my changes and asked for a peer review. Partially to make sure I didn't miss anything, partially to give coworker a learning opportunity since coworker was the one who designed the package. There were 7 files / queries involved and lots of tables. The response I got when I told coworker was: "You redesigned all the queries? But only the one was giving us problems!"
Me, I'm looking at the package and seeing opportunities for cleaning up extra variables, cutting the number of containers in half by moving File System Tasks from one container to another and deleting the excess, and stripping out code and variables that were built in the initial development and never used. Also, I'm seeing 7 procs where excruciating RBAR WHILE loops are slowing down processing. Granted, 6 of those procs weren't having issues because they usually work on files that are pretty small. But that doesn't mean they won't cause a problem in the future. After all, the one problem we did have wasn't a problem when this process was first put into place.
And to be fair, this package was the first major ETL process designed by my coworker. Coworker had a huge learning curve not only for SSIS but for the 7+ files that were being sent from a different system. Using the code I got from the forums here, he redesigned one file into a separate package that worked effectively. The other package (with the other files) was left alone.
But still... To my mind, since I'm already fixing one part of the package, why shouldn't I proactively fix the rest of it to make sure the other 6 procs and ETL paths do NOT become a problem down the road?
Or am I just being too efficient?
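For anyone following along, the kind of rewrite being described, swapping an RBAR WHILE loop for one set-based statement, looks roughly like this sketch. The table and column names here are made up for illustration; the actual procs aren't shown in the thread.

```sql
-- Hypothetical example only; real tables/procs are not shown in the thread.
-- RBAR version: one UPDATE per staged row, driven by a WHILE loop.
DECLARE @ID int;
SELECT @ID = MIN(StagingID) FROM dbo.FileStaging;

WHILE @ID IS NOT NULL
BEGIN
    UPDATE t
    SET    t.Amount = s.Amount
    FROM   dbo.TargetTable t
    JOIN   dbo.FileStaging s ON s.BusinessKey = t.BusinessKey
    WHERE  s.StagingID = @ID;

    -- Walk to the next staged row.
    SELECT @ID = MIN(StagingID) FROM dbo.FileStaging WHERE StagingID > @ID;
END;

-- Set-based version: one statement, one pass over the data, same result
-- (assuming BusinessKey is unique in the staging table).
UPDATE t
SET    t.Amount = s.Amount
FROM   dbo.TargetTable t
JOIN   dbo.FileStaging s ON s.BusinessKey = t.BusinessKey;
```

On small files both versions finish quickly, which is why the other six procs never drew attention; the loop's per-row overhead only shows up as row counts grow.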
I do the same.
Small bug in a T-SQL script?
Whoops, I reformatted all of the code and did some performance tuning 😀
Need an answer? No, you need a question
My blog at https://sqlkover.com.
MCSE Business Intelligence - Microsoft Data Platform MVP
December 17, 2014 at 4:47 am
That makes me feel somewhat better. Thanks, Koen.
December 17, 2014 at 4:55 am
BTW, can anyone with Visual Studio coding experience take a look at my question here?
I've been fighting this on and off for several months now and just can't seem to find the information I'm looking for.
December 17, 2014 at 5:16 am
Koen Verbeeck (12/17/2014)
I do the same.
Small bug in a T-SQL script?
Whoops, I reformatted all of the code and did some performance tuning 😀
+1. Isn't that normal? When you have a problematic process, you should consider the whole process and tune it. Making it handle larger sets more efficiently is always a good thing. So the files are small now... that doesn't mean they'll stay that way. In general, data tends to grow over time, so why not design and code properly in the first place?
December 17, 2014 at 5:33 am
Brandie Tarvin (12/17/2014)
I think what you did was the professional and appropriate thing, without a shadow of a doubt. The files may be small *usually*, but then what about the time(s) they're not?
I'm a DBA.
I'm not paid to solve problems. I'm paid to prevent them.
December 17, 2014 at 5:57 am
Brandie Tarvin (12/17/2014)
Doing all of them at once while everything is fresh in your mind is the smart thing to do, especially since you said you modified the underlying tables.
And "Isn't giving us problems" isn't the same as "fast". You probably saved yourself headaches in the future by getting this right now.
--------------------------------------
When you encounter a problem, if the solution isn't readily evident go back to the start and check your assumptions.
--------------------------------------
It’s unpleasantly like being drunk.
What’s so unpleasant about being drunk?
You ask a glass of water. -- Douglas Adams
December 17, 2014 at 6:05 am
... Mark one off, 73 days on the calendar to go. 73 days on the calendar to go, 73 days to go, ...
December 17, 2014 at 6:35 am
Lynn Pettis (12/17/2014)
... Mark one off, 73 days on the calendar to go. 73 days on the calendar to go, 73 days to go, ...
What is the big news? What is this counting for? 🙂
December 17, 2014 at 6:37 am
Divine Flame (12/17/2014)
Lynn Pettis (12/17/2014)
... Mark one off, 73 days on the calendar to go. 73 days on the calendar to go, 73 days to go, ...
What is the big news? What is this counting for? 🙂
I believe this is how many days he has left on his Afghanistan contract.
December 17, 2014 at 6:42 am
Brandie Tarvin (12/17/2014)
Divine Flame (12/17/2014)
Lynn Pettis (12/17/2014)
... Mark one off, 73 days on the calendar to go. 73 days on the calendar to go, 73 days to go, ...
What is the big news? What is this counting for? 🙂
I believe this is how many days he has left on his Afghanistan contract.
Oh, getting back to home 🙂 That sure is a thing to count for 🙂
December 17, 2014 at 6:49 am
Divine Flame (12/17/2014)
Brandie Tarvin (12/17/2014)
Divine Flame (12/17/2014)
Lynn Pettis (12/17/2014)
... Mark one off, 73 days on the calendar to go. 73 days on the calendar to go, 73 days to go, ...
What is the big news? What is this counting for? 🙂
I believe this is how many days he has left on his Afghanistan contract.
Oh, getting back to home 🙂 That sure is a thing to count for 🙂
Yes, I am coming home from Afghanistan. Even better, I still have a job with my company when I get home.
December 17, 2014 at 6:50 am
Koen Verbeeck (12/17/2014)
I do the same.
Small bug in a T-SQL script?
Whoops, I reformatted all of the code and did some performance tuning 😀
I'm the same way. Just understand that it IS the co-worker's work and since everything was rewritten, that co-worker is feeling seriously wounded.
--Jeff Moden
Change is inevitable... Change for the better is not.
December 17, 2014 at 7:14 am
Jeff Moden (12/17/2014)
I'm the same way. Just understand that it IS the co-worker's work and since everything was rewritten, that co-worker is feeling seriously wounded.
And that PC stuff is a skill that I haven't yet mastered. "Yeah, your stuff sucked, so I used the opportunity and my expertise to give it a tune-up."
Wayne
Microsoft Certified Master: SQL Server 2008
Author - SQL Server T-SQL Recipes
December 17, 2014 at 7:19 am
WayneS (12/17/2014)
And that PC stuff is a skill that I haven't yet mastered. "Yeah, your stuff sucked, so I used the opportunity and my expertise to give it a tune-up."
Heck, I'm more diplomatic than this, and that's saying something.
But I agree Brandie, you didn't do anything wrong.
"The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
- Theodore Roosevelt
Author of:
SQL Server Execution Plans
SQL Server Query Performance Tuning
December 17, 2014 at 7:21 am
WayneS (12/17/2014)
And that PC stuff is a skill that I haven't yet mastered. "Yeah, your stuff sucked, so I used the opportunity and my expertise to give it a tune-up."
I wouldn't say it sucked. It was a damn good first try. The process had been in production for almost 2 years before it started having issues. Then our boss told us it needed to be optimized. We were both working on other projects at the time, and this fell to the bottom of the list. I just happened to get to it before coworker did.
But yes, hurt feelings I get. I've had my boss call me over to his desk because my code forgot something or did something it wasn't supposed to do.