July 5, 2014 at 5:51 pm
I have a process where I reload 20 reports on a remote server each morning. This completes anywhere between 4:30 and 5:30 am, and after each reload a row is inserted into a SQL Server 2008 table with success/failure info. At this point, users know to wait until after 6 am to use their reports, but I'd like to send an email as soon as the reports have reloaded with the latest day's data.
What is the best way to do this?
Some of the columns in the table that are populated after the report reloads are as follows:
,[scheduled_reload]
,[successful_reload] -- when successful_reload = 1, this means success
,[failed_reload] -- when failed_reload = 1, this means failure
,[package_execution_id]
,[end_date]
,[memory_usage]
,[elapsed_seconds]
Can a trigger be written that waits for all 20 inserts before sending a Database Mail notification? Is this possible? Is it a good method, or is there a better way?
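To be concrete, the check I have in mind is basically "have 20 rows with successful_reload = 1 been written for today?". A rough sketch of just that check (dbo.reload_log is only a stand-in name for the real table, which has the columns listed above):

-- sketch only: dbo.reload_log is a stand-in name for the real table
SELECT COUNT(*) AS reloads_done_today
FROM dbo.reload_log
WHERE successful_reload = 1
  AND CAST(end_date AS DATE) = CAST(GETDATE() AS DATE); -- today's rows only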
July 5, 2014 at 8:43 pm
Why not just make the last step of the "process" check your log table and send an email either way (obviously, with different targets for success and failure)?
--Jeff Moden
Change is inevitable... Change for the better is not.
July 6, 2014 at 11:24 am
Jeff Moden (7/5/2014)
Why not just make the last step of the "process" check your log table and send an email either way (obviously, with different targets for success and failure)?
Hi Jeff, that 'process' is QlikView, and it doesn't send a notification when the reports are finished reloading (actually, this may have more to do with the features included in the license and the company's desire to stay within budget, i.e. not buy the Notification Connector for QlikView).

By the way, I have worked out an SSIS package that makes calls to QlikView via 20 process tasks (using psexec on the command line) to issue a remote command to reload each report, retry on failure, and send an email on completion (and that works great!), EXCEPT that it takes longer than QlikView, which determines how much parallelization to allow when it's reloading its own reports. Therefore, I'm seeking another way to detect when the reloads are done, which is why I want to harness a SQL table that someone other than myself built, and which is populated by a query that reads QlikView's logs.
I need a way to listen for all 20 reports to finish loading, and I want to do it in the way that makes the most sense.
That's why I am thinking of using a trigger, but it's the waiting for all 20 rows to be inserted that's got me stuck (and the fact that I'm not a programmer). Can this be done with a trigger?
July 6, 2014 at 11:41 am
You say you've "worked out an SSIS package that makes calls to Qlikview via 20 process tasks...". How does that SSIS package currently know when it's done and stops running? 😉
--Jeff Moden
Change is inevitable... Change for the better is not.
July 6, 2014 at 12:10 pm
The last task following all the execute process tasks is a Send Mail task. So, back to the matter I'm requesting assistance on (NOT the SSIS package).
Is there a way to build a trigger to wait for all 20 rows to populate? Could someone kindly advise on links that may help me build a 'listening' process to send a DB email after 20 inserts meeting specific conditions, i.e. successful_reload = 1, have been inserted into a SQL Server 2008 table?
July 6, 2014 at 9:58 pm
anyone?
July 6, 2014 at 10:06 pm
...
July 6, 2014 at 11:10 pm
KoldCoffee (7/6/2014)
The last task following all the execute process tasks is a Send Mail task. So, back to the matter I'm requesting assistance on (NOT the SSIS package). Is there a way to build a trigger to wait for all 20 rows to populate? Could someone kindly advise on links that may help me build a 'listening' process to send a DB email after 20 inserts meeting specific conditions, i.e. successful_reload = 1, have been inserted into a SQL Server 2008 table?
This should be fairly straightforward, i.e. an AFTER INSERT trigger counting the members of the set that the inserted value belongs to. For a daily job, that could be "day of year", etc.
Here is something to get you started
😎
USE tempdb;
GO
CREATE TABLE dbo.TBL_COUNT_INSERT
(
CI_ID INT IDENTITY(1,1) PRIMARY KEY CLUSTERED NOT NULL
,CI_DATE DATE NOT NULL DEFAULT (GETDATE())
,CI_VAL INT NOT NULL
);
GO
CREATE TRIGGER TRG_TBL_COUNT_INSERT_NOTIFY
ON dbo.TBL_COUNT_INSERT
AFTER INSERT
AS
BEGIN
    SET NOCOUNT ON;

    -- count how many rows share the date of the row(s) just inserted
    DECLARE @COUNT INT = 0;

    SELECT @COUNT = COUNT(*)
    FROM dbo.TBL_COUNT_INSERT CI
    WHERE CI.CI_DATE = (SELECT MAX(CI_DATE) FROM inserted);

    IF @COUNT = 3   -- demo threshold; this would be 20 in the real scenario
    BEGIN
        -- SEND REPORT (placeholder -- swap in sp_send_dbmail or similar)
        EXEC sp_who2;
    END
END;
GO
INSERT INTO dbo.TBL_COUNT_INSERT (CI_VAL) VALUES(1);
INSERT INTO dbo.TBL_COUNT_INSERT (CI_VAL) VALUES(2);
INSERT INTO dbo.TBL_COUNT_INSERT (CI_VAL) VALUES(3);
INSERT INTO dbo.TBL_COUNT_INSERT (CI_VAL) VALUES(4);
DROP TABLE dbo.TBL_COUNT_INSERT;
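Adapted to the table described in the first post, the same pattern might look something like this. To be clear, this is only a sketch: the table name dbo.reload_log and the Database Mail profile/recipients are assumptions, while successful_reload and end_date come from the column list posted above. Note that sp_send_dbmail only queues the message, so the insert isn't held up waiting on SMTP, but it's still wise to keep the work done inside the trigger to a minimum.

CREATE TRIGGER TRG_reload_log_notify
ON dbo.reload_log                 -- assumed name for the reload-status table
AFTER INSERT
AS
BEGIN
    SET NOCOUNT ON;

    DECLARE @DONE INT;

    -- count today's successful reloads
    SELECT @DONE = COUNT(*)
    FROM dbo.reload_log
    WHERE successful_reload = 1
      AND CAST(end_date AS DATE) = CAST(GETDATE() AS DATE);

    IF @DONE = 20   -- all 20 reports have reported success
    BEGIN
        EXEC msdb.dbo.sp_send_dbmail
             @profile_name = 'ReportMail',          -- placeholder profile
             @recipients   = 'users@example.com',   -- placeholder recipients
             @subject      = 'All 20 reports reloaded',
             @body         = 'The QlikView reloads finished successfully.';
    END
END;
GO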
July 7, 2014 at 12:47 am
I appreciate it very much!
July 7, 2014 at 8:01 am
KoldCoffee (7/6/2014)
The last task following all the execute process tasks is a Send Mail task. So, back to the matter I'm requesting assistance on (NOT the SSIS package). Is there a way to build a trigger to wait for all 20 rows to populate? Could someone kindly advise on links that may help me build a 'listening' process to send a DB email after 20 inserts meeting specific conditions, i.e. successful_reload = 1, have been inserted into a SQL Server 2008 table?
Yes, absolutely there is a way to use a trigger to do this. But why? Have the last task in the SSIS package examine the table and give you a full up report on all the reports.
--Jeff Moden
Change is inevitable... Change for the better is not.
July 12, 2014 at 1:40 pm
Because, as I explained at least a couple of times, I can't use the SSIS package: it is a slow solution. Therefore I'm not using SSIS.
July 13, 2014 at 5:00 pm
KoldCoffee (7/12/2014)
Because, as I explained at least a couple of times, I can't use the SSIS package: it is a slow solution. Therefore I'm not using SSIS.
Heh... OK. I'm officially confused. You said the process was being controlled by SSIS, correct? Can you change that?
And, even if the answer is "no", a trigger isn't going to make things faster for you whether you're using SSIS or not. Again, I might be misunderstanding something but, no matter what the process is, the process is done when the process is done. That's the only time you should query the table to see if everything was done correctly, not every time you make an entry into a table.
--Jeff Moden
Change is inevitable... Change for the better is not.
July 13, 2014 at 5:32 pm
KoldCoffee (7/12/2014)
Because, as I explained at least a couple of times, I can't use the SSIS package: it is a slow solution. Therefore I'm not using SSIS.
You haven't made yourself entirely clear, but it seems like you are saying that you have two ways to refresh these reports.
1) Qlikview can do it but doesn't send a notification.
2) SSIS can do it, but is slower than the native way.
You haven't said how soon after the last report is refreshed the email should go out... so I am going to suggest you have a SQL Agent job that checks that table for 20 success messages every X minutes between 4:30 am and 6 am and sends an email when ready.
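A rough sketch of that job step follows (dbo.reload_log and dbo.reload_notified are assumed names, and the mail profile/recipients are placeholders). The small "already notified" table is there so the job doesn't keep emailing you every X minutes once the reloads are done:

-- one-time setup: a tiny table to remember that today's mail has been sent
IF OBJECT_ID('dbo.reload_notified') IS NULL
    CREATE TABLE dbo.reload_notified (notified_date DATE PRIMARY KEY);

IF NOT EXISTS (SELECT 1 FROM dbo.reload_notified
               WHERE notified_date = CAST(GETDATE() AS DATE))
BEGIN
    DECLARE @DONE INT;

    SELECT @DONE = COUNT(*)
    FROM dbo.reload_log           -- assumed name for the reload-status table
    WHERE successful_reload = 1
      AND CAST(end_date AS DATE) = CAST(GETDATE() AS DATE);

    IF @DONE = 20
    BEGIN
        EXEC msdb.dbo.sp_send_dbmail
             @profile_name = 'ReportMail',          -- placeholder
             @recipients   = 'users@example.com',   -- placeholder
             @subject      = 'All reports reloaded',
             @body         = 'All 20 reports have reloaded successfully.';

        -- remember that today's notification has gone out
        INSERT INTO dbo.reload_notified (notified_date)
        VALUES (CAST(GETDATE() AS DATE));
    END
END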
MM
select geometry::STGeomFromWKB(0x0106000000020000000103000000010000000B0000001000000000000840000000000000003DD8CCCCCCCCCC0840000000000000003DD8CCCCCCCCCC08408014AE47E17AFC3F040000000000104000CDCCCCCCCCEC3F9C999999999913408014AE47E17AFC3F9C99999999991340000000000000003D0000000000001440000000000000003D000000000000144000000000000000400400000000001040000000000000F03F100000000000084000000000000000401000000000000840000000000000003D0103000000010000000B000000000000000000143D000000000000003D009E99999999B93F000000000000003D009E99999999B93F8014AE47E17AFC3F400000000000F03F00CDCCCCCCCCEC3FA06666666666FE3F8014AE47E17AFC3FA06666666666FE3F000000000000003D1800000000000040000000000000003D18000000000000400000000000000040400000000000F03F000000000000F03F000000000000143D0000000000000040000000000000143D000000000000003D, 0);
July 13, 2014 at 9:36 pm
Right now, this data warehousing nightly ETL process has to complete before the QlikView reports can be refreshed. So, a SQL Agent job is already engaged in the overall process and knows to kick off the QlikView Service Manager after the main ETL work is done.
QlikView's 'Publisher' then orchestrates the reloads (determines the degree of parallelization) and seems to be faster.
For a while (which is where my SSIS package came into being) the idea was to see if SSIS could replace Publisher. SQL Server Agent, in that scenario, would kick off a dtsx package after the nightly ETL. The SSIS package could issue a series of psexec calls to the QlikView Service Manager, one per distinct report. But when I give full rein to SSIS to determine the degree of parallelization for rerunning 16 huge QVWs (reports), it fails all over the place. When I control the parallel tasks (using precedence constraints and a sequence container), limiting it to 2 at a time, the whole thing takes twice as long as when it's run by the QlikView Service Manager.
That's 2 1/2 hours longer.
That's why having SQL Server Agent sniff this stupid SQL table every 15 minutes (a table that is the end result of a sniffing sproc that runs every 15 minutes against a QlikView log file) for the existence of 20 new rows for the day is looking like the better option. And that is why it's an either/or.