October 10, 2011 at 8:11 am
Hello!
I'm new to this forum, and new to SQL Server 2005 programming too. I hope I can find some help with my issue.
Here is my environment:
I have a production application that connects to the database through a DSN; with a proper DSN, I can populate a table on my SQL 2005 server. What I need to achieve is to take the freshly inserted data and dump it into a file for further processing.
I created a trigger that runs AFTER INSERT on the table in question. This trigger creates an external directory and dumps the data (from the inserted table) into a file; I use "xp_cmdshell" to echo the data into the file.
When I run an insert as 'sa' from the SQL management tools, everything works as expected. When I use the production software, the select I make inside the trigger (select @data = (select field1 + '|' + field2 from inserted)) seems to return NULL (I traced the values). This behaviour happens even if I use the "sa" user on the DSN connection.
I hope someone can help me 🙂
regards,
Raphael.
October 10, 2011 at 9:32 am
raphael.kong (10/10/2011)
Hello! I'm new to this forum, and new to SQL Server 2005 programming too. I hope I can find some help with my issue.
Here is my environment:
I have a production application that connects to the database through a DSN; with a proper DSN, I can populate a table on my SQL 2005 server. What I need to achieve is to take the freshly inserted data and dump it into a file for further processing.
I created a trigger that runs AFTER INSERT on the table in question. This trigger creates an external directory and dumps the data (from the inserted table) into a file; I use "xp_cmdshell" to echo the data into the file.
When I run an insert as 'sa' from the SQL management tools, everything works as expected. When I use the production software, the select I make inside the trigger (select @data = (select field1 + '|' + field2 from inserted)) seems to return NULL (I traced the values). This behaviour happens even if I use the "sa" user on the DSN connection.
I hope someone can help me 🙂
regards,
Raphael.
I would suggest perhaps a different design. I would add some sort of indicator column to the table; perhaps you could call it process_ind and default it to 'N'. Then I would create a SQL Agent job that runs at regular intervals. This job would change all rows with process_ind = 'N' to process_ind = 'I', write out all the process_ind = 'I' rows, and finally update all process_ind = 'I' rows to process_ind = 'Y'.
In general I would not build complicated processes in triggers. First, triggers that take a while to run can cause other processes to fail. If you make modifications to the table where the trigger exists, and those modifications cause the table to be dropped and recreated, your trigger will go away. You also need to make sure your trigger code is set-based, since some people forget that triggers can fire for one record or for many records.
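As a rough sketch of the job logic (the table and column names here are just placeholders for whatever you actually have):

UPDATE dbo.YourTable SET process_ind = 'I' WHERE process_ind = 'N';

-- write out the 'I' rows here (bcp, SSIS, xp_cmdshell, etc.)
SELECT field1, field2
FROM dbo.YourTable
WHERE process_ind = 'I';

UPDATE dbo.YourTable SET process_ind = 'Y' WHERE process_ind = 'I';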
Anyway, just my opinion.
October 10, 2011 at 9:35 am
Thanks for your opinion, but I really need the data to be synced almost in real time.
October 10, 2011 at 9:40 am
I'm guessing here, so you may want to wait for someone with more knowledge on the subject, but I think that what you're describing would use a BULK INSERT instead of an INSERT. That means you'd have to specify FIRE_TRIGGERS, otherwise no insert triggers execute.
http://msdn.microsoft.com/en-us/library/ms188365.aspx
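For example (the table name and file path here are just placeholders):

BULK INSERT dbo.Results
FROM 'e:\PV\incoming.dat'
WITH (FIELDTERMINATOR = '|', ROWTERMINATOR = '\n', FIRE_TRIGGERS);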
October 10, 2011 at 9:47 am
Thanks for your answer!
What I don't understand is why, when I write the SQL insert from SQL Server Management Studio, the trigger runs properly, with every value as expected.
When the data are inserted from the production software, the trigger runs too, but some data are missing, namely the values coming from this code:
select @data = (select field1 from inserted)
October 10, 2011 at 9:54 am
raphael.kong (10/10/2011)
Thanks for your answer! What I don't understand is why, when I write the SQL insert from SQL Server Management Studio, the trigger runs properly, with every value as expected.
When the data are inserted from the production software, the trigger runs too, but some data are missing, namely the values coming from this code:
select @data = (select field1 from inserted)
Because when you write it in SSMS, you're writing an INSERT statement instead of the BULK INSERT that I guess the DSN connection is using.
--EDIT--
Bear in mind I am completely guessing, and in fact if any of the data is hitting the trigger then I'm incorrect, because without FIRE_TRIGGERS set during a BULK INSERT nothing would have been passed through to the trigger at all.
October 10, 2011 at 10:01 am
Hmm ...
In the case of a bulk insert, no triggers are fired, OK. But when the software inserts data into the table (using the DSN), the trigger IS fired, since some of the actions done by the trigger (i.e. the log insert, folder creation, etc.) are executed properly.
Only the variable set by retrieving the inserted values (from the inserted table) is NULL.
I hope I'm clearer; my English is quite poor.
Thanks for answering 🙂
October 10, 2011 at 10:05 am
raphael.kong (10/10/2011)
Thanks for your answer! What I don't understand is why, when I write the SQL insert from SQL Server Management Studio, the trigger runs properly, with every value as expected.
When the data are inserted from the production software, the trigger runs too, but some data are missing, namely the values coming from this code:
select @data = (select field1 from inserted)
So, as I mentioned, you need set-based code in your trigger. The code you have assumes that only one record exists in the inserted table. If multiple records are inserted all at once, your code will at best pick up one of them (and with the subquery form you are using, it can error out instead).
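For example, instead of assigning from a single-row subquery, you could push every inserted row into a queue table. A rough sketch (the queue table name is hypothetical):

INSERT INTO dbo.ExportQueue (OutputLine)
SELECT field1 + '|' + field2
FROM inserted;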
October 10, 2011 at 10:38 am
As said above, the logic of your trigger is not going to work for multiple-row inserts; it needs to be flexible enough to handle that. If you want some help, there are lots of people on here willing and able to help. Just post up some DDL and what you have so far. We can modify what you have to handle this without too much difficulty.
October 10, 2011 at 11:00 am
raphael.kong (10/10/2011)
Hmm ... In the case of a bulk insert, no triggers are fired, OK. But when the software inserts data into the table (using the DSN), the trigger IS fired, since some of the actions done by the trigger (i.e. the log insert, folder creation, etc.) are executed properly.
Only the variable set by retrieving the inserted values (from the inserted table) is NULL.
I hope I'm clearer; my English is quite poor.
Thanks for answering 🙂
Try running Profiler to see what the application is actually doing.
E.g. the application may insert blank values first and then update the columns afterwards.
I agree with the comment about not doing too much in a trigger.
If you need the process to be near real time, I would be inclined to:
1. Use a trigger to add the inserted/updated columns to a working table.
2. Write a service in .NET to process the working table.
The service would have a callback on the working table so it can process rows as soon as they are added. A rough sketch of the trigger half is below.
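A minimal sketch of the trigger half, assuming a hypothetical working table (all names here are placeholders):

CREATE TABLE dbo.WorkingTable
(
    WorkId    int IDENTITY(1,1) PRIMARY KEY,
    Field1    varchar(100),
    Field2    varchar(100),
    Processed char(1) NOT NULL DEFAULT 'N'
);
GO
CREATE TRIGGER trg_QueueRows ON dbo.YourTable
AFTER INSERT
AS
BEGIN
    SET NOCOUNT ON;
    -- copy every inserted row into the working table; the .NET service
    -- (or a job) then picks up the rows where Processed = 'N'
    INSERT INTO dbo.WorkingTable (Field1, Field2)
    SELECT field1, field2 FROM inserted;
END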
October 11, 2011 at 8:02 am
I'm not at the office this afternoon; tomorrow I will come back with the trigger source.
Thanks all for your help 🙂
October 12, 2011 at 1:46 am
Here is the trigger code snippet.
My environment: SQL Server 2005; PV_TEST2 is the database; it contains 2 tables, Results and DebugTrace.
DSN is like : DRIVER={SQL Server};DATABASE=PV_TEST2;SERVER=FQDNofTheServer;UID=PV_sun;PWD=****
USE [PV_TEST2]
GO
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE TRIGGER items_insert ON Results
AFTER INSERT
AS
BEGIN
    SET XACT_ABORT ON
    SET REMOTE_PROC_TRANSACTIONS OFF

    DECLARE @OUTPUT AS VARCHAR(8000);
    DECLARE @OUTPUT2 AS VARCHAR(8000);
    DECLARE @CMD AS VARCHAR(8000);
    DECLARE @SEPARATOR AS VARCHAR(5);
    DECLARE @SERVERPATH AS VARCHAR(100);
    DECLARE @BATCHFILE AS VARCHAR(100);

    -- Variables for the folder name
    DECLARE @TodayDate AS VARCHAR(40);
    DECLARE @TodayHour AS VARCHAR(40);
    DECLARE @TodayMinu AS VARCHAR(40);
    DECLARE @TodaySec AS VARCHAR(40);
    DECLARE @TodayMSec AS VARCHAR(40);
    DECLARE @NewFileName AS VARCHAR(100);
    DECLARE @GenericFileName AS VARCHAR(100);

    -- Initialise the variables
    SELECT @TodayDate = CONVERT(VARCHAR(10), GETDATE(), 112)
    SELECT @TodayHour = DATEPART(hh, GETDATE())
    SELECT @TodaySec  = DATEPART(ss, GETDATE())
    SELECT @TodayMinu = DATEPART(mi, GETDATE())
    SELECT @TodayMSec = DATEPART(ms, GETDATE())
    SELECT @NewFileName = @TodayDate + '_' + @TodayHour + '_' + @TodayMinu + '_' + @TodaySec + '_' + @TodayMSec

    -- Other variables, for file creation
    SELECT @SEPARATOR = '|';
    SELECT @SERVERPATH = 'e:\PV\';
    SELECT @BATCHFILE = 'uploadPV.bat';
    SELECT @GENERICFILENAME = 'output.dat'

    SET XACT_ABORT ON
    SET REMOTE_PROC_TRANSACTIONS OFF

    -- @OUTPUT should receive the concatenation of the data, separated with a pipe
    SELECT @OUTPUT = (SELECT LEFT(SerialNumber, 30) + @SEPARATOR +
                             ModuleName + @SEPARATOR +
                             CONVERT(VARCHAR, Reduction, 120)
                      FROM inserted)

    -- This insert is for debugging; when the trigger is fired by hand with an INSERT statement, the output is good, as expected.
    -- When the trigger is fired from the application using the DSN, this output is NULL.
    INSERT DebugTrace SELECT 'PV', USER_NAME(), GETDATE(), 'OUTPUT ' + @OUTPUT

    SET @CMD = 'mkdir ' + @SERVERPATH + @NewFileName
    -- In all cases, the folder is properly created
    EXEC xp_cmdshell @CMD, NO_OUTPUT;
    -- Debug insert; always with the expected values
    INSERT DebugTrace SELECT 'PV', USER_NAME(), GETDATE(), 'MKDIR ' + @CMD

    SET @CMD = 'echo "' + @OUTPUT + '">' + @SERVERPATH + @NewFileName + '\' + @GENERICFILENAME
    -- Here again, manual insert: OK, "DSN insert": NOK, @CMD is NULL
    INSERT DebugTrace SELECT 'PV', USER_NAME(), GETDATE(), 'ECHO ' + @CMD
    EXEC xp_cmdshell @CMD, NO_OUTPUT;

    -- The batch file is called with these parameters
    SELECT @BATCHFILE = @SERVERPATH + @BATCHFILE + ' ' + @SERVERPATH + ' ' + @NewFileName + ' ' + @GENERICFILENAME
    -- The batch file always runs
    EXEC xp_cmdshell @BATCHFILE;
    INSERT DebugTrace SELECT 'PV', USER_NAME(), GETDATE(), 'BATCHFILE ' + @BATCHFILE
END
October 12, 2011 at 6:26 am
In the log table, I got NULL instead of 'OUTPUT ' + @OUTPUT.
This strange NULL makes me think that maybe the concatenation I've done fails somewhere and the query then fails, but without an error message. So I removed almost all of the concatenated fields, and then it works!
I have to investigate why, and which field fails.
There are a lot of float fields (almost 45) that I cast to get a string... the issue may come from there.
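A quick check in Management Studio illustrates what I suspect (with the usual CONCAT_NULL_YIELDS_NULL setting, a single NULL operand makes the whole concatenated string NULL):

SELECT 'field1' + '|' + 'field2';          -- 'field1|field2'
SELECT 'field1' + '|' + NULL + 'field3';   -- NULL: one NULL operand nulls the whole string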
I will keep you updated.
Raphael.
October 12, 2011 at 7:37 am
Yeah, I think if your fields might be NULL, you'd get unexpected results, right?
If you wrap ISNULL around the columns in your concatenation, does it show where the issue lies?
-- @OUTPUT should receive the concatenation of the data, separated with a pipe
SELECT @OUTPUT = (SELECT LEFT(ISNULL(SerialNumber, 'NOSERIAL!'), 30) + @SEPARATOR +
                         ISNULL(ModuleName, 'NOMODULE!') + @SEPARATOR +
                         CONVERT(VARCHAR, ISNULL(Reduction, 0), 120)
                  FROM inserted)
Lowell
October 12, 2011 at 7:39 am
Thanks for your answer!
I will rewrite my whole query and keep you posted!