January 28, 2010 at 6:17 am
Folks,
I am creating an SSIS 2008 package that will periodically import and shape SQL Server data from a number of sources. A number of transforms and calculations need to be applied to the data before storing it. These transforms include business logic.
My choices would appear to be:
1. Carry out all transforms via TSQL in sprocs and views.
2. Embed some code in the CLR with classes to implement the logic/calculations.
Some teammates would prefer using CLR assemblies, as the code will be easier to unit test; however, I am old-school SQL Server and I am reluctant to add custom assemblies to the db if a pure SQL solution is possible. Maybe I am being a bit of a luddite. I am aware that ideally we do not want to be storing logic in the db, but should I add custom code to SSIS to keep the logic out of the db?
Any advice is appreciated.
Thanks
M
January 28, 2010 at 6:46 am
mick L (1/28/2010)
I am aware that ideally we do not want to be storing logic in the db
My two cents: those who tell you that business logic can completely live outside the DB are lying.
Tables, constraints and views are part of the business logic and I can't imagine a database without tables and constraints.
That said, I would suggest using the right tool for the job: SQL procedures, functions and views are powerful enough to hold the business rules, and CLR comes into play when complex calculations that would be hard to code in SQL are involved.
Regards
Gianluca
-- Gianluca Sartori
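To illustrate the point, here's a minimal sketch (the table, column and constraint names are all invented for the example, not taken from the original poster's schema): a simple business rule lives naturally in a CHECK constraint, and a derived calculation in a view, with no CLR needed:

```sql
-- Hypothetical example: a business rule ("discount may not exceed 20%")
-- enforced as a CHECK constraint, right next to the data
ALTER TABLE dbo.Orders
    ADD CONSTRAINT CK_Orders_Discount CHECK (Discount BETWEEN 0 AND 0.20);

-- A derived calculation kept in a view
CREATE VIEW dbo.OrderTotals
AS
SELECT  OrderId,
        SUM(Quantity * UnitPrice * (1 - Discount)) AS NetTotal
FROM    dbo.OrderLines
GROUP BY OrderId;
```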
January 28, 2010 at 6:52 am
Thanks Gianluca,
If there were complex calculations in there then I would definitely add assemblies/clr functions. Currently I have a number of views in which I have implemented the logic/calculations. Any advice on how I should make such a solution (lots of inter-dependent views) more unit-testable?
January 28, 2010 at 7:16 am
I'm sorry, I'm not a unit testing expert. I hope somebody else can give you good advice on the subject.
-- Gianluca Sartori
January 28, 2010 at 7:27 am
Maybe you can replace some views with CTEs / subqueries instead of having a view for each calculation.
For unit testing you could use the OUTPUT clause and store the results in intermediate table(s).
It's hard to tell if this is an option without knowing more details...
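A rough sketch of both ideas together, with invented table and column names (not the poster's actual schema): the per-step views are collapsed into a chain of CTEs, and the OUTPUT clause captures the intermediate results into a table you can assert against in a test:

```sql
-- Hypothetical sketch: chained calculations as CTEs instead of stacked views
WITH Base AS (
    SELECT OrderId, Quantity * UnitPrice AS Gross
    FROM   dbo.OrderLines
),
Discounted AS (
    SELECT OrderId, Gross * 0.9 AS Net   -- stand-in for the real discount rule
    FROM   Base
)
INSERT INTO dbo.StagingTotals (OrderId, Net)
OUTPUT inserted.OrderId, inserted.Net    -- capture what was written...
INTO   dbo.TestCapture (OrderId, Net)    -- ...into a table a unit test can query
SELECT OrderId, Net
FROM   Discounted;
```

A test then just compares dbo.TestCapture against a table of expected rows for a known input set.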
January 28, 2010 at 10:16 am
I don't know if this helps at all, but we are currently testing a "mixed approach". The approach entails SQLCLR procedures making calls to an External Rules engine. We haven't seen the need yet to put it into triggers, etc... but it's not impossible to do.
I work in the insurance field, so some of the logic is fairly arcane and isn't easy to express in T-SQL. Additionally, there has been a need to centralize the rules, so that a single change to how we do business doesn't force changes in 25 different touch points in our validation process. At the same time, using SQLCLR objects seems to be a decent compromise: SQL-native objects that CAN be incorporated into the data layer to enforce some constraints, while still keeping the rules in an easily managed repository outside of the DB itself.
Again - it's still being vetted (scalability, for one), but it's looking fairly good so far. This would work well for the "complex" stuff, so you're still back to Gianluca's point: right tool for the right task.
----------------------------------------------------------------------------------
Your lack of planning does not constitute an emergency on my part...unless you're my manager...or a director and above...or a really loud-spoken end-user..All right - what was my emergency again?