August 7, 2013 at 12:02 am
Comments posted to this topic are about the item Building an Incremental Load Package – Level 3
Andy Leonard, Chief Data Engineer, Enterprise Data & Analytics
August 7, 2013 at 4:11 am
Andy, superb article! Informative, nicely written, and illustrated with appropriate sample data. Can't wait to read your next article on BIML!
Thanks a ton. :)
August 8, 2013 at 10:41 am
Questions from a newbie who's only been scripting queries for 2+ years:
You have it set up so that rows in table 2 that are not yet in table 1 get added to table 1, which is a great idea, and I can see its potential in the ETL that we do. My question is: what happens to the rows that are in table 1 but no longer in table 2? We update a personnel system here, and some people get deleted from the source system, so I wonder whether this process can also remove those rows when the data is refreshed. I have not studied this series in detail, so if the answer is already covered somewhere, please point me to it; apologies for the inconvenience.
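To make the question concrete, here is a rough sketch of the kind of clean-up step I have in mind, written as a Biml Execute SQL Task. The connection string, table names (tblDest, tblSource), and key column (ColID) are only my guesses at the article's sample schema, so treat it as illustrative rather than tested code:

<Biml xmlns="http://schemas.varigence.com/biml.xsd">
  <Connections>
    <!-- Destination database; this sketch assumes the source database is on the same instance -->
    <OleDbConnection Name="SSISIncrementalLoad_Dest"
      ConnectionString="Provider=SQLNCLI11;Data Source=(local);Initial Catalog=SSISIncrementalLoad_Dest;Integrated Security=SSPI;" />
  </Connections>
  <Packages>
    <Package Name="RemoveDeletedRows" ConstraintMode="Linear">
      <Tasks>
        <!-- Delete destination rows whose keys no longer exist in the source table -->
        <ExecuteSQL Name="Delete Rows Missing From Source" ConnectionName="SSISIncrementalLoad_Dest">
          <DirectInput>
            DELETE d
            FROM dbo.tblDest AS d
            WHERE NOT EXISTS (SELECT 1
                              FROM SSISIncrementalLoad_Source.dbo.tblSource AS s
                              WHERE s.ColID = d.ColID);
          </DirectInput>
        </ExecuteSQL>
      </Tasks>
    </Package>
  </Packages>
</Biml>

A softer variant would stage the missing keys and flag the rows instead of hard-deleting them, but the DELETE above shows the basic idea behind my question.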
Related to the above: we get text files that are imported into tables in a staging database. Could this work without creating additional pre_load tables? If not, I guess we can create a pre_stage table and use this process to increment our actual staging tables.
Someone suggested I explore this process as a possible alternative to using bcp queryout and/or Data Flow Tasks for a bunch of files created from SQL scripts. Would this be part of your Stairway discussion down the road?
Thanks.
August 8, 2013 at 11:34 am
Hi jbm6401,
You will want to check out the Stairway to Integration Services for answers to these types of questions.
Hope this helps,
Andy
Andy Leonard, Chief Data Engineer, Enterprise Data & Analytics
August 9, 2013 at 7:56 am
Thanks.
August 9, 2013 at 11:57 pm
Awesome post, Andy. This will help tremendously in trying to explain some SSIS patterns to my team!
August 20, 2013 at 3:22 pm
Finally got around to working through the examples. Very thorough and well-documented. I'm now looking forward to the BIMLScript class before the PASS Conference, and to the rest of the series.
The only odd issue I had was that the test package from article #2 had a different encryption level. I rebuilt it so everything had the same encryption level, but the package wouldn't debug until I deleted that first test package.
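In case it helps anyone else, the mismatch appears to be the package ProtectionLevel, and Biml lets you set it explicitly on the Package element so every generated package matches. This is just a sketch; the package name is made up, and EncryptSensitiveWithUserKey is simply the designer default I happened to need to match:

<Biml xmlns="http://schemas.varigence.com/biml.xsd">
  <Packages>
    <!-- Use the same ProtectionLevel on every package in the project -->
    <Package Name="IncrementalLoadTest" ConstraintMode="Linear"
             ProtectionLevel="EncryptSensitiveWithUserKey" />
  </Packages>
</Biml>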
I tried to anticipate the code you listed so I'd get a bit more experience seeing what was available. Sometimes successfully, sometimes not so much, but it was a good learning experience going through the example. Now on to #4.
September 9, 2013 at 8:07 am
Andy this is turning into a really interesting series and I can't wait to see what further BIML wizardry you've got in store for us.
November 14, 2014 at 4:00 pm
I just returned from PASS, where I first heard about BIML, so this article was very timely. Thank you for this! It's very powerful stuff. I noticed that the Biml compiler apparently does a lot of things "behind the scenes". For example, we never specified the column mapping between the output of the Conditional Split and the stgUpdates destination. I guess that since the column names matched in this case, they were simply mapped by name. But what if they were different, which I suspect is a more common scenario? We would probably need to write much more code.
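If I'm reading things right, explicit mappings would look something like the sketch below. The Src* column names are invented just to show renaming, and I haven't verified every attribute, so this is a guess at the syntax rather than tested code:

<Biml xmlns="http://schemas.varigence.com/biml.xsd">
  <Connections>
    <OleDbConnection Name="SSISIncrementalLoad_Dest"
      ConnectionString="Provider=SQLNCLI11;Data Source=(local);Initial Catalog=SSISIncrementalLoad_Dest;Integrated Security=SSPI;" />
  </Connections>
  <Packages>
    <Package Name="ExplicitMappingDemo" ConstraintMode="Linear">
      <Tasks>
        <Dataflow Name="Load stgUpdates">
          <Transformations>
            <OleDbSource Name="Changed Rows" ConnectionName="SSISIncrementalLoad_Dest">
              <DirectInput>SELECT SrcColID, SrcColA, SrcColB, SrcColC FROM dbo.SomeSourceView;</DirectInput>
            </OleDbSource>
            <OleDbDestination Name="stgUpdates" ConnectionName="SSISIncrementalLoad_Dest">
              <ExternalTableOutput Table="dbo.stgUpdates" />
              <!-- Map by hand when source and target column names differ -->
              <Columns>
                <Column SourceColumn="SrcColID" TargetColumn="ColID" />
                <Column SourceColumn="SrcColA" TargetColumn="ColA" />
                <Column SourceColumn="SrcColB" TargetColumn="ColB" />
                <Column SourceColumn="SrcColC" TargetColumn="ColC" />
              </Columns>
            </OleDbDestination>
          </Transformations>
        </Dataflow>
      </Tasks>
    </Package>
  </Packages>
</Biml>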
November 16, 2014 at 8:37 pm
First off, great series. My problem is this: I built the Lookup operation verbatim, but it's giving me the following error:
"The component has detected a potential metadata corruption during validation. Error at Load tblDest [Correlate [28]]: The Correlate.Outputs[Lookup Match Output] has an invalid error or truncation row disposition.
It may not be possible to recover to a valid state using a component specific editor. Do you want to use the Advanced Editor dialog box for editing this component?"
I opened the Advanced Editor for the Lookup, and I think the problem is that NoMatchBehavior is being set to "Treat rows with no matching entries as errors" even though the Biml file explicitly tells it to send them to the no match output:
<Lookup Name="Correlate" OleDbConnectionName="SSISIncrementalLoad_Dest" NoMatchBehavior="RedirectRowsToNoMatchOutput">
Any ideas?
November 26, 2014 at 5:35 pm
The BIML engine does some of the same kind of "invisible hand" stuff that the SSIS designer does for you when you wire up objects in a data flow.
If you've ever tried to programmatically manipulate an SSIS package (either directly or via EzAPI), it's easier to envision what BIML is actually doing when you "compile" your BIML file into an SSIS package. You'll also understand how easy it is to blow things up when you don't hook things up right...
BIML is a nice abstraction over the process, much more so than EzAPI, which itself is a big step up from using the SSIS libraries directly in the .NET world.
I'd rather type a bunch of stuff than endure the click-fest that is the SSIS designer.
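To give a sense of how little typing that abstraction buys you, here's a minimal, complete Biml file that compiles to a package containing a single Execute SQL Task. The connection string and names are illustrative only:

<Biml xmlns="http://schemas.varigence.com/biml.xsd">
  <Connections>
    <OleDbConnection Name="Dest"
      ConnectionString="Provider=SQLNCLI11;Data Source=(local);Initial Catalog=tempdb;Integrated Security=SSPI;" />
  </Connections>
  <Packages>
    <Package Name="HelloBiml" ConstraintMode="Linear">
      <Tasks>
        <!-- One task; the engine wires up everything else the designer normally does for you -->
        <ExecuteSQL Name="Say Hello" ConnectionName="Dest">
          <DirectInput>SELECT 1 AS Hello;</DirectInput>
        </ExecuteSQL>
      </Tasks>
    </Package>
  </Packages>
</Biml>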
August 12, 2015 at 6:16 am
Thanks for this series!
I was recently handed the specs for twenty-five separate flat files and told to build a process to import them into our data warehouse.
Because of this series, I was able to use BIML to build 25 SSIS packages and a master package to run them in less than 8 hours.
🙂
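For anyone curious, the pattern is roughly the BimlScript below. The file list and package bodies are simplified stand-ins for my real specs, and the ExecutePackage reference syntax is from memory, so double-check it against the Biml documentation before relying on it:

<#@ template language="C#" #>
<# // Stand-ins for the 25 flat-file specs; the real list came from the specs I was handed
   var extracts = new[] { "Customers", "Orders", "Products" }; #>
<Biml xmlns="http://schemas.varigence.com/biml.xsd">
  <Packages>
    <# foreach (var extract in extracts) { #>
    <Package Name="Load_<#=extract#>" ConstraintMode="Linear">
      <!-- flat file source to staging destination dataflow goes here -->
    </Package>
    <# } #>
    <Package Name="Master" ConstraintMode="Linear">
      <Tasks>
        <# foreach (var extract in extracts) { #>
        <ExecutePackage Name="Run Load_<#=extract#>">
          <ExternalProjectPackage Package="Load_<#=extract#>.dtsx" />
        </ExecutePackage>
        <# } #>
      </Tasks>
    </Package>
  </Packages>
</Biml>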
August 12, 2015 at 1:17 pm
Thanks Andy for the great article!
August 13, 2015 at 10:32 pm
Andy, I have used Biml in one of my projects to create packages dynamically. The script used SQL Server's XQuery to generate the Biml code by fetching the appropriate package configurations and source queries, and it helped us create 50 complex packages in no time. However, our package template contains many custom tasks we have built, and we have not been able to configure Biml for these. Can you help here?
I will be publishing an article soon on the dynamic script I created.
--Divya