Coupon data validation logic - Heavy design stuff

  • I am working on a coupon redemption project that appears to require dynamic data validation, and I need some help. For example, a coupon could require a person to buy XXX number of XXX items from XXX manufacturer before the transaction is approved. The logical operator between validations could be either “AND” or “OR”. My first attempt has been along the lines of a coupon table with a child table of validation logic. However, applying this logic (and deciding at what layer to apply it) has turned out to be a challenge. I am very concerned about speed, since all requirements have to be met before a transaction is approved. Any help figuring out the best solution would be great!

    Coupons
      C_ID
      C_DESC
      MFG_ID
      MFG_OFFERNUM

    Coupon_Validation
      CV_ID
      C_ID
      MFG_ID (Manufacturer)
      MFG_PARTNUM (Manufacturer part number)
      CV_QTYREQUIRED (Quantity required to meet the rule)
      CV_ITEMLOGIC (AND, OR)
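    As a starting point for discussion, here is a minimal sketch of how the child-table rules above might be evaluated set-based in T-SQL at redemption time. The `@Basket` table variable standing in for the transaction's line items is hypothetical, and the AND/OR semantics are an assumption: every AND row must be satisfied, and if any OR rows exist, at least one of them must be satisfied.

    ```sql
    -- Hypothetical basket for the transaction being approved
    DECLARE @C_ID int = 1;
    DECLARE @Basket TABLE (MFG_ID int, MFG_PARTNUM varchar(30), Qty int);

    DECLARE @Approved bit =
        CASE WHEN
            -- No unmet AND rule for this coupon
            NOT EXISTS (
                SELECT 1
                FROM Coupon_Validation cv
                WHERE cv.C_ID = @C_ID
                  AND cv.CV_ITEMLOGIC = 'AND'
                  AND cv.CV_QTYREQUIRED >
                      (SELECT COALESCE(SUM(b.Qty), 0)
                       FROM @Basket b
                       WHERE b.MFG_ID = cv.MFG_ID
                         AND b.MFG_PARTNUM = cv.MFG_PARTNUM))
            -- Either no OR rules exist, or at least one is met
            AND (NOT EXISTS (SELECT 1 FROM Coupon_Validation cv
                             WHERE cv.C_ID = @C_ID AND cv.CV_ITEMLOGIC = 'OR')
                 OR EXISTS (
                     SELECT 1
                     FROM Coupon_Validation cv
                     WHERE cv.C_ID = @C_ID
                       AND cv.CV_ITEMLOGIC = 'OR'
                       AND cv.CV_QTYREQUIRED <=
                           (SELECT COALESCE(SUM(b.Qty), 0)
                            FROM @Basket b
                            WHERE b.MFG_ID = cv.MFG_ID
                              AND b.MFG_PARTNUM = cv.MFG_PARTNUM)))
        THEN 1 ELSE 0 END;
    ```

    Two `EXISTS` probes against an index on (C_ID, CV_ITEMLOGIC) should stay fast even with many rules per coupon, though mixing AND and OR at arbitrary nesting depths would need a richer rule model than a single logic column.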

  • I had to meet a similar need. We figured out that there were only a few constants, so instead of trying to design a storage engine for data that just wasn't going to be very flexible, we concentrated on storing enough data to build a dynamic query that performed as fast as possible. It worked well, except that there was no way to set up end-user maintenance, since we had to create the ad hoc queries and test them in T-SQL. Very few of the users who would have done the entry were even aware that there was a database behind the app.

    If I remember correctly (I last worked on this about seven years ago), we had a pair of tables: one defined a test, with a forced minimum set of parameters (IsActive bit, a description, a couple of others) and a text field to hold the query itself; the second simply recorded which coupon had been used by which user. We got an agreement that as long as two criteria were met, the business could define anything else they could think of. First, we could only work from data that was already in the database, rather than fishing for extra parameters on the fly; second, while multiple offers might apply, no offer was directly related to another offer, meaning they were applied more or less in parallel, with none in series.

    It worked well; we processed 2 GB worth of transactions daily. I realize this is a bit short on specifics, but I hope it helps.
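    A rough sketch of what the pair of tables and the dynamic execution described above might look like; every table, column, and variable name here is illustrative rather than the original design, and the stored query is assumed to return an output bit indicating whether the offer applies.

    ```sql
    -- One row per test/offer, with the query stored as text
    CREATE TABLE OfferTest (
        OfferTestID int IDENTITY PRIMARY KEY,
        [Description] varchar(200) NOT NULL,
        IsActive bit NOT NULL DEFAULT 1,
        TestQuery nvarchar(max) NOT NULL  -- sets @Applies = 1 when the offer applies
    );

    -- Which coupon/offer was used by which user
    CREATE TABLE OfferRedemption (
        OfferTestID int NOT NULL REFERENCES OfferTest,
        UserID int NOT NULL,
        RedeemedAt datetime2 NOT NULL DEFAULT SYSUTCDATETIME()
    );

    -- Each active offer is evaluated independently (none in series)
    DECLARE @OfferTestID int = 1, @UserID int = 42;
    DECLARE @sql nvarchar(max), @Applies bit;

    SELECT @sql = TestQuery
    FROM OfferTest
    WHERE OfferTestID = @OfferTestID AND IsActive = 1;

    EXEC sp_executesql @sql,
         N'@UserID int, @Applies bit OUTPUT',
         @UserID = @UserID, @Applies = @Applies OUTPUT;
    ```

    Using `sp_executesql` with parameters (rather than concatenating values into the text) keeps the stored queries plan-cacheable and avoids injection through the user ID, though the query text itself still has to be written and tested by someone who knows the schema, which matches the "no end-user maintenance" trade-off noted above.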

    "The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
    - Theodore Roosevelt

    Author of:
    SQL Server Execution Plans
    SQL Server Query Performance Tuning
