June 23, 2005 at 10:44 am
I am using the following code in a batch file to process ANSI text files as they are FTP'ed across to my server (every 15 minutes):
rem Performs bulk copy operations into real-time staging tables
for /r G:\FTPServer\ADAM6Upload\Realtime\ %%i in (*.ais) do bcp sis2_dfo..rt_aisstaging in %%i -c -t"\|" -r"\n" /U"sa" /P"pword" /S"server"
for /r G:\FTPServer\ADAM6Upload\Realtime\ %%i in (*.nav) do bcp sis2_dfo..rt_navstaging in %%i -c -t"\|" -r"\n" /U"sa" /P"pword" /S"server"
for /r G:\FTPServer\ADAM6Upload\Realtime\ %%i in (*.edt) do bcp sis2_dfo..rt_edtstaging in %%i -c -t"\|" -r"\n" /U"sa" /P"pword" /S"server"
However, some data is being merged if it tries to process two sets of files simultaneously. For example, if it is currently loading the files BKNL05-031.ais, BKNL05-031.nav and BKNL05-031.edt, and all of a sudden another batch of files is FTP'ed across to my server (for example BKSF05-118.ais, BKSF05-118.nav and BKSF05-118.edt), then data from, say, the .ais files is merged during the bulk copy operation. Is there a way to fix this given my current code? Is there something I can 'wrap around' the code above to ensure that one bulk copy completes before another starts? Any help would be greatly appreciated. Thanks.
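(One way to 'wrap around' the code would be a sentinel lock file that makes an overlapping run exit early. A minimal untested sketch; the lock-file name is made up:)

```bat
@echo off
rem Sketch: skip this run if a previous one is still in progress (lock-file name is illustrative)
if exist G:\FTPServer\ADAM6Upload\Realtime\bcp.lock goto :eof
echo locked > G:\FTPServer\ADAM6Upload\Realtime\bcp.lock

rem ... the three bcp loops from above go here ...

del G:\FTPServer\ADAM6Upload\Realtime\bcp.lock
```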
June 24, 2005 at 6:21 am
How is the data moved from the staging tables?
* If you have access to osql.exe you could always check if the tables are empty.
* If you can convert your script to VBscript or something you could use application locks in SQL Server.
* You could always create one staging table for every file.
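For the application-lock option, a minimal T-SQL sketch (the resource name 'rt_staging_load' is made up; sp_getapplock/sp_releaseapplock are available from SQL Server 2000 on):

```sql
BEGIN TRAN
DECLARE @rc int
-- Try to take an exclusive application lock; fail immediately if another load already holds it
EXEC @rc = sp_getapplock @Resource = 'rt_staging_load',
                         @LockMode = 'Exclusive',
                         @LockTimeout = 0
IF @rc >= 0
BEGIN
    -- ... process the staging tables here ...
    EXEC sp_releaseapplock @Resource = 'rt_staging_load'
END
COMMIT TRAN
```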
June 24, 2005 at 6:28 am
The data is moved from the staging tables using stored procedures such as:
CREATE procedure rt_edt_sp
as
SET ANSI_NULLS ON
SET NOCOUNT OFF

declare @trg_sub_code varchar(8)
declare @mid varchar(10)
declare @callsign varchar(6)
declare @src varchar(1)
declare @id varchar(4)
declare @cid varchar(6)
declare @utc varchar(10)
declare @tlat varchar(8)
declare @tlng varchar(9)
declare @TCC varchar(3)
declare @zone varchar(3)
declare @io varchar(1)
declare @distio varchar(6)
declare @va varchar(2)
declare @nat varchar(3)
declare @fish varchar(3)
declare @crs varchar(5)
declare @spd varchar(5)
declare @vid varchar(6)
declare @name varchar(32)
declare @side varchar(12)
declare @call varchar(10)
declare @comm varchar(100)
declare @row varchar(400)

-- clear the production table before reloading
exec delete_rt_edt

declare e_cursor cursor for
    select data from rt_edtstaging

open e_cursor
fetch next from e_cursor into @row

WHILE (@@FETCH_STATUS <> -1)
BEGIN
    IF (@@FETCH_STATUS <> -2)
    BEGIN
        IF len(@row) <= 16
        BEGIN
            -- header row: remember mid and callsign for the detail rows that follow
            set @mid = substring(@row, 1, 10)
            set @callsign = substring(@row, 11, 6)
        END
        ELSE
        BEGIN
            -- detail row: parse the fixed-width fields
            set @src = substring(@row, 1, 1)
            set @id = substring(@row, 2, 4)
            set @cid = substring(@row, 6, 5)
            set @utc = substring(@row, 11, 10)
            set @tlat = substring(@row, 38, 8)
            set @tlng = substring(@row, 46, 9)
            set @TCC = substring(@row, 75, 3)
            set @zone = substring(@row, 78, 3)
            set @io = substring(@row, 81, 1)
            set @distio = substring(@row, 82, 6)
            set @va = substring(@row, 88, 2)
            set @nat = substring(@row, 90, 3)
            set @fish = substring(@row, 97, 3)
            set @crs = substring(@row, 101, 5)
            set @spd = substring(@row, 106, 5)
            set @vid = substring(@row, 111, 6)
            set @name = substring(@row, 117, 32)
            set @side = substring(@row, 149, 12)
            set @call = substring(@row, 161, 10)
            set @comm = substring(@row, 171, 100)

            select @trg_sub_code = f.subcode
            from fn_sub_desc(@src, @nat, @tcc) f

            insert into rt_edt
            values (@mid, @callsign, @src, @id, @cid, @utc, @tlat, @tlng, @tcc, @zone, @io, @distio, @va, @nat,
                    @fish, @crs, @spd, @vid, @name, @side, @call, @comm, @trg_sub_code)

            -- deletes real time edt data from rt_edt table if mid has changed on current update - June 17, 2005 12:44PM
            delete from rt_edt where callsign = @callsign and mid <> @mid
        END
    END
    -- fetch outside the @@FETCH_STATUS <> -2 check so a skipped row cannot cause an infinite loop
    fetch next from e_cursor into @row
END

close e_cursor
deallocate e_cursor
GO
There are three separate stored procedures, one for each of the file types (.ais, .edt and .nav), each containing distinct data, that are used to populate the production tables.
June 24, 2005 at 6:29 am
Oops! I forgot to mention in my last post that I am using a separate staging table for each of the three file types.
June 24, 2005 at 11:10 am
When the files are sent by FTP, how do you initiate the bcp? Do you have the bcp commands in a batch file?
Like I mentioned before, before initiating the batch command just check IF EXISTS (SELECT * FROM StageTable). Use your stored procedure to clear all staging table rows once processed.
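(In a batch file, that check could look something like this untested sketch; osql's -b switch makes the RAISERROR set the DOS ERRORLEVEL:)

```bat
rem Sketch: abort this run if the staging table still holds unprocessed rows
osql -S server -U sa -P pword -b -Q "IF EXISTS (SELECT * FROM sis2_dfo..rt_edtstaging) RAISERROR('previous load still processing', 16, 1)"
if errorlevel 1 goto :eof
rem ... safe to run the bcp commands here ...
```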
http://www.sqlservercentral.com/forums/shwmessage.aspx?forumid=8&messageid=192677
Regards,
gova
June 24, 2005 at 11:24 am
Well, in his first post he actually wrote that he started BCP from his batch file...
June 24, 2005 at 12:15 pm
Very well. How is the batch file executed? If he initiates the batch file himself then he can check whether there are any rows in the staging table. Or, if he uses some automated process to execute the batch file, that process should check whether there are any rows. The idea is to make sure there are no rows in the staging table when the batch file is executed.
Regards,
gova
June 24, 2005 at 2:18 pm
The .bat file that initiates the Bulk Copy is executed from a Task Scheduling program and is set to run every minute.
June 24, 2005 at 2:43 pm
Can that program check whether there are any rows in the staging tables?
1. Create one step to check whether there are still rows in the staging table that need processing. If there are rows, report that the previous process is still running and exit; else report success and go to step 2.
2. Execute your batch file that loads the data files. On success go to step 3.
3. Execute your stored procedure that processes the data in the staging table. On success, delete the rows in the staging table.
If it is executed via a SQL Server job it would be a lot easier.
Regards,
gova
June 26, 2005 at 9:04 am
Sure Hanslindgren, it doesn't matter. I would say it is easier to implement in some places than in others.
As I mentioned, the idea is to make sure there are no rows in the staging table when loading the data via bcp. I would prefer to do it in a SQL Server job.
Yes we can use OSQL.EXE to execute SQL commands.
Step 1. Use a batch file with OSQL.exe and check whether there are any rows in the staging table. (Compared to a SQL Server job, it will be a little more difficult to go to the next step based on the returned results.)
Step 2. If there are no rows then load data to staging tables.
Step 3. Execute the stored procedure the original poster already has to transfer the data to the concerned tables.
Step 4. Remove the data in staging tables.
Step 5. Rename or move files from FTP folder so that same file is not processed again.
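(Put together, the five steps might look like this in a batch file. An untested sketch shown for the .edt files only; the 'Processed' folder name is made up:)

```bat
@echo off
rem Step 1: skip this run if the staging table still contains unprocessed rows
osql -S server -U sa -P pword -b -Q "IF EXISTS (SELECT * FROM sis2_dfo..rt_edtstaging) RAISERROR('busy', 16, 1)"
if errorlevel 1 goto :eof
rem Step 2: load the files into the staging table
for /r G:\FTPServer\ADAM6Upload\Realtime\ %%i in (*.edt) do bcp sis2_dfo..rt_edtstaging in %%i -c -t"\|" -r"\n" -U"sa" -P"pword" -S"server"
rem Step 3: run the existing stored procedure to move the data to the production tables
osql -S server -U sa -P pword -b -Q "EXEC sis2_dfo..rt_edt_sp"
rem Step 4: clear the staging table
osql -S server -U sa -P pword -b -Q "DELETE FROM sis2_dfo..rt_edtstaging"
rem Step 5: move the processed files out of the FTP folder
for /r G:\FTPServer\ADAM6Upload\Realtime\ %%i in (*.edt) do move "%%i" "G:\FTPServer\ADAM6Upload\Processed\"
```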
Using a SQL Server job, or a service that uses SQL-DMO, will be a lot easier than doing it via OSQL.EXE, in my opinion. But we can do it all with OSQL.EXE as well.
Regards,
gova