May 29, 2008 at 4:06 pm
Hello all,
We author a COTS enterprise software package that has many multi-site installations. Several of these installations have sites in different countries, and therefore different culture codes.
One of the features of the application is tracking and performing calculations on complex calibration measurement data. We are currently rolling these calculations up into a C# assembly for various reasons (which I can provide if anyone is interested).
I have the assembly performing all the calculations and have verified my results. The last issue I am grappling with is setting the culture for the user providing the data. The decimal separator, grouping separator, and number of decimal places are the three culture values of most interest to these calculations.
Since the DB holds the data for all of the sites, it is often the case that the preferred culture settings for a site are not the server's current default settings. This means we need to change the culture code to match the site of the executing user. To make matters worse, our application allows each value to be overridden: some companies may want to, for example, use the German date format while company policy states that all data shall use the period as the decimal separator, etc.
As it stands now I can switch the thread's culture as long as I use WITH PERMISSION_SET = UNSAFE, but this implies that any assemblies we write will also require this permission level, and I want to be sure there isn't a better way to handle this.
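For context, here is a stripped-down sketch of the pattern I am describing (illustrative names only, not our real assembly):

using System.Data.SqlTypes;
using System.Globalization;
using System.Threading;
using Microsoft.SqlServer.Server;

public class CalibrationFunctions
{
    [SqlFunction]
    public static SqlDouble ParseMeasurement(SqlString rawValue, SqlString cultureName)
    {
        if (rawValue.IsNull || cultureName.IsNull)
            return SqlDouble.Null;

        // Switching the executing thread's culture is the part that, as far as
        // I can tell, pushes the assembly to PERMISSION_SET = UNSAFE.
        Thread.CurrentThread.CurrentCulture = new CultureInfo(cultureName.Value);

        // With the thread culture set, the existing parsing code picks up the
        // right decimal and grouping separators automatically.
        return new SqlDouble(double.Parse(rawValue.Value, NumberStyles.Number));
    }
}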
How do others set the culture code when running CLR code in SQL Server?
Is there a way to change the culture of the thread for assemblies compatible with PERMISSION_SET = SAFE?
Is there a way to set the culture code without requiring System.Threading?
Can I set the Culture in T-SQL and have the C# code inherit this setting in some manner for the current transaction?
Are there any good reads out there on managing multiple cultures in SQL CLR development?
I have to admit my biggest worry is flak from customers as soon as they see the assembly is marked as UNSAFE, so maybe all I need are good answers to the inevitable questions. You would think this would be a fairly common problem and that I have to be missing something, but Google doesn't seem to bear that out.
Any help is always appreciated.
Regards,
Bill
May 29, 2008 at 5:37 pm
I'm not sure why you would need that. The culture is going to be important on the front-end, since it's going to determine how you interact with the end-user. However - once the input data leaves the UI - there should be no culture component involved. Numbers come back through as numbers, in a format that the DATABASE can understand, so the decimal place will come back predictably no matter what the culture is. Same with datetime settings.
Assuming you respect the N-tier layers and do your formatting in your presentation layer, the data layer should not need to know what the Culture is on the UI.
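Roughly what I mean, as a sketch with made-up names:

using System.Data;
using System.Data.SqlClient;

class DataLayerSketch
{
    static void SaveReading(string connectionString, decimal reading)
    {
        using (SqlConnection conn = new SqlConnection(connectionString))
        using (SqlCommand cmd = new SqlCommand("dbo.usp_SaveReading", conn))
        {
            cmd.CommandType = CommandType.StoredProcedure;
            // The parameter is a decimal, not a formatted string, so whether
            // the user typed "1.234,56" or "1,234.56" is purely a
            // presentation-layer concern.
            cmd.Parameters.Add("@Reading", SqlDbType.Decimal).Value = reading;
            conn.Open();
            cmd.ExecuteNonQuery();
        }
    }
}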
Perhaps I'm missing how you envision this working, but it just seems like that doesn't belong in your data layer at all.
----------------------------------------------------------------------------------
Your lack of planning does not constitute an emergency on my part...unless you're my manager...or a director and above...or a really loud-spoken end-user..All right - what was my emergency again?
May 29, 2008 at 9:23 pm
Matt Miller (5/29/2008)
I'm not sure why you would need that. The culture is going to be important on the front-end, since it's going to determine how you interact with the end-user. However - once the input data leaves the UI - there should be no culture component involved. Numbers come back through as numbers, in a format that the DATABASE can understand, so the decimal place will come back predictably no matter what the culture is. Same with datetime settings. Assuming you respect the N-tier layers and do your formatting in your presentation layer, the data layer should not need to know what the Culture is on the UI.
Perhaps I'm missing how you envision this working, but it just seems like that doesn't belong in your data layer at all.
Thank you Matt, I appreciate (and expected) your response. Unfortunately, it is not "how I envision this working"; it is the reality I am faced with. So here are the whys behind the how.
First let me say that our database is as strongly typed as possible. We have a requirement stating that all attributes guaranteed to be of a certain type shall be stored in the database-native format (or ISO 8601 format for datetimes), regardless of whether the underlying column is of the same type (our "user defined fields" are nvarchar, but can be more narrowly typed via configuration).
Unfortunately, the next requirement states: "Data of a mixed type that cannot be guaranteed to be of a certain type shall be stored in the appropriate culturally sensitive format."
That means that unless we can guarantee that an attribute will store only data of a given type, we cannot store it in the native/ISO 8601 format. Calibration data is this type of data. We cannot guarantee that the data is going to be numeric. The result entered could just as easily be "Blue", "28.05.2008 12:34:56.789" or "B3754". Because I cannot say it will always be a number and run it through our formatting routines, it is required to be stored in the user's local culture. Anything else would require checking various data types on a row-by-column basis (neither desirable nor fast).
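To make that concrete, the kind of per-value check we would otherwise be forced into looks roughly like this (purely illustrative, not our code):

using System.Globalization;

static class MixedValueCheck
{
    // Decide whether a raw entry is numeric in the entering user's culture.
    // "Blue" and "B3754" fail the parse and stay as text; "28,05" parses as
    // 28.05 under de-DE.
    static bool TryGetNumeric(string raw, string cultureName, out double number)
    {
        CultureInfo culture = CultureInfo.GetCultureInfo(cultureName);
        return double.TryParse(raw, NumberStyles.Number, culture, out number);
    }
}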
Now we introduce the problem. Users trying to batch-generate work over slow WAN links are experiencing unsatisfactory execution times when generating > X000 monthly calibrations (it takes too long to run the job over their 64 to 512 kb/s leased lines).
Solution: develop a DB-centric job management system where users can generate data sheets as needed without coordination with the (often unresponsive) central office.
Did I mention I have to do it yesterday??? Oh yeah, and I have to use as much verified code as possible until we can fit a proper refactor into a sprint...
So basically I am stuck moving code that properly handled all of this on each of the middle tiers to a central DB implementation using the existing C# code until it can be / needs to be refactored.
So, all that says yeah I academically agree with you, but...
In all honesty this math can get complex and is not best done in SQL, and since we have existing customers it would be very hard to guarantee that we can produce identical results in all cases after a port, which suggests it is dangerous to switch calculation libraries without more testing than we currently have resources for...
Any answers to my questions are very much appreciated.
Regards,
Bill
May 30, 2008 at 8:11 am
Argh... you're running an EAV setup. That explains it.... Sorry - I just had to ask the question...:D
I was going to suggest using a backdoor around resetting the culture (i.e. picking up the numeric format info from the UI's Culture, storing it in the client profile, and accessing it directly), but the Regex Replace logic that it would take for that is ultimately worse than marking the Assembly as Unsafe. Especially since it sounds like you have a LOT of regional settings/Cultures to deal with.
The one thought I had was around the idea of messing with the "standard" EAV model, to have "type purity" going on. (I will spare you the whole "EAV sucks and is a headache to the data model" discussion - I'm sure you already know all that). The cleanest scenario is to have multiple EAV tables, whose only difference is that the "Value" part is a specific data type. So - you'd have EAVChar, EAVInt, EAVCurrency, EAVDatetime, etc. - each storing their own share of the data. (The other scenario, which IMO works less well, is the single EAV table with multiple value columns: ValueChar, ValueInt, etc....). The right mapping process (since it sounds like you have config files controlling the data types of your attribs) would simply translate into selecting from multiple tables.
That would put your data back in the "I can guarantee type now" bucket...
(I know - I'm reaching. I just don't see a way to avoid the Culture setting otherwise).
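Roughly what I had in mind on the mapping side, with all names invented:

using System.Collections.Generic;

enum AttribType { Char, Int, Decimal, DateTime }

static class TypedEavMapping
{
    // The attribute config decides which typed table a value lives in, so
    // each Value column keeps a native SQL type.
    static readonly Dictionary<AttribType, string> TableFor =
        new Dictionary<AttribType, string>
        {
            { AttribType.Char,     "dbo.EAVChar" },
            { AttribType.Int,      "dbo.EAVInt" },
            { AttribType.Decimal,  "dbo.EAVDecimal" },
            { AttribType.DateTime, "dbo.EAVDatetime" }
        };

    static string SelectFor(AttribType type)
    {
        return "SELECT EntityId, AttributeId, Value FROM " + TableFor[type] +
               " WHERE EntityId = @EntityId AND AttributeId = @AttributeId";
    }
}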
----------------------------------------------------------------------------------
Your lack of planning does not constitute an emergency on my part...unless you're my manager...or a director and above...or a really loud-spoken end-user..All right - what was my emergency again?
May 30, 2008 at 9:00 am
Hi Matt,
The entire application isn't EAV; in fact the majority of it isn't. However, in the land of asset management it would be foolish to assume you can granularly define the entire set of data all of your customers are going to want to track, so we have employed an EAV-esque approach to handle the individual needs of each customer. Ahh, the world of COTS software... I will say that our model is working out very well and all config is stored in the DB. We have developed our own metadata framework and it is performing better than expected. We have achieved a nice balance and still have the large majority of our business rules modeled in the schema, ESPECIALLY those that pertain to the regulatory environment the majority of our customers work in.
That being said, it is even trickier with measurement data from calibrations. This table isn't EAV; each column in the table is special purpose and only holds one attribute for the MD relation. Here is the trick though: as an example, if you are doing chromatographic assays, pH, etc. you may end up comparing colors to reference card standards (with codes on them). The very next set point may measure the amount of light absorbed by the sample compared to a known standard, so in the same record details you have a human doing a visual color comparison alongside numeric data from a piece of test equipment. Both are right, both are needed, and they both represent the same attribute (Instrument Value), but the numeric data needs to have a std dev calculated, limit tolerances need to be tested, and ultimately the data needs to be reviewed (by a separate human) to make sure the human visual comparison matches up with the optical sensor comparison data. (Think calibrating the calibrator...)
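As a purely illustrative sketch (not our actual routines), the numeric side of that means parsing the instrument readings in the site's culture and ignoring the visual codes before anything like a std dev can be computed:

using System;
using System.Collections.Generic;
using System.Globalization;

static class SetPointStats
{
    // Sample standard deviation over the numeric readings in a mixed column,
    // parsed in the site's culture; non-numeric entries are skipped.
    static double? StdDev(IEnumerable<string> rawValues, string cultureName)
    {
        CultureInfo culture = CultureInfo.GetCultureInfo(cultureName);
        List<double> numbers = new List<double>();
        foreach (string raw in rawValues)
        {
            double d;
            if (double.TryParse(raw, NumberStyles.Number, culture, out d))
                numbers.Add(d); // "0,9934" under de-DE; "Blue" is skipped
        }
        if (numbers.Count < 2)
            return null;

        double mean = 0;
        foreach (double d in numbers) mean += d;
        mean /= numbers.Count;

        double sumSq = 0;
        foreach (double d in numbers) sumSq += (d - mean) * (d - mean);
        return Math.Sqrt(sumSq / (numbers.Count - 1));
    }
}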
So MD is by far the most complex portion of our application, and as such I guess it makes sense for it to be the first place we try this CLR integration stuff in our production software. I have strongly resisted it up to this point, as I have never found a compelling need for it, but I admit at the moment it is the best answer.
If I have to live with the assembly marked as UNSAFE to gain thread-level access I can do that, but you would think that with MS touting the CLR for complex string manipulation in the DB there would be a way to set the local culture so you have context for those strings, etc. I admit I am completely new to the CLR and found a way to do it, but that doesn't mean I found the best way to do it. So I thought I would ask and hopefully find the missing link, and maybe even learn a little...
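For what it's worth, the only SAFE-friendly pattern I can picture is passing the culture into each entry point and using it as an explicit IFormatProvider instead of touching the thread, something like the sketch below (illustrative only) - but that is exactly the kind of refactor of verified code I am trying to avoid right now:

using System.Data.SqlTypes;
using System.Globalization;
using Microsoft.SqlServer.Server;

public class CultureAwareFormatting
{
    [SqlFunction]
    public static SqlString FormatResult(SqlDouble value, SqlString cultureName)
    {
        if (value.IsNull || cultureName.IsNull)
            return SqlString.Null;

        // The culture is handed directly to ToString as an IFormatProvider;
        // nothing on the thread is changed. The "N" format picks up the
        // culture's decimal separator, grouping separator, and decimal digits.
        CultureInfo culture = CultureInfo.GetCultureInfo(cultureName.Value);
        return new SqlString(value.Value.ToString("N", culture));
    }
}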
Anyway, please don't think I don't appreciate the thoughts; I have thought of them all myself. I was just trying to avoid writing a book to explain this particular case...
Regards,
Bill