June 25, 2003 at 1:33 am
Hi,
I'm trying to insert a large amount of data. Normally I would just use a DTS package. However, I need to process this data after it's uploaded (using a stored proc) and create a kind of overall batch record for each upload. And since it all happens over the internet, a lot of people can upload batches at the same time.
Does anybody have any idea how I can do this so that I always know which records belong to which upload?
Help!
Dawn
June 25, 2003 at 2:05 am
Hi there
You're in luck, we do this all the time. Not huge numbers of users, mind you (perhaps 5 concurrent max at times), but the files are big. We use ASP upload and then call some stored proc routines that do something like...
IF @Lodgement_No IS NOT NULL AND @UploadedFilename IS NOT NULL
BEGIN
    -- Build and run a dynamic BULK INSERT into a temp table, driven by a format file
    SET @v_sql = 'BULK INSERT #lodgement_enrolment FROM ''' + @file_upload_path + @UploadedFilename + ''' WITH (FORMATFILE = ''' + @file_upload_path + 'LodgementUploadFile.fmt'')'
    EXEC(@v_sql)
    SET @ErrorNum = @@ERROR
    IF @ErrorNum > 0 GOTO EndSP

    -- Index the temp table before any further processing
    CREATE CLUSTERED INDEX ix_enrolment1 ON #lodgement_enrolment (lodgeenrol_cps_no, lodgeenrol_module_code, lodgeenrol_trs_no)
    CREATE INDEX ix_enrolment2 ON #lodgement_enrolment (lodgeenrol_id)
    <etc>
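As for knowing which records belong to which upload: each #temp table is private to the connection that created it, so concurrent uploads never see each other's staged rows. You can then stamp everything with a batch ID as it moves into the permanent tables. A rough sketch of the idea (the upload_batch and enrolment tables and their columns are invented here for illustration, not our real schema):

DECLARE @BatchID int

BEGIN TRAN
    -- One header row per upload; the IDENTITY column hands out a unique batch ID
    INSERT INTO upload_batch (uploaded_by, uploaded_file, upload_date)
    VALUES (SUSER_SNAME(), @UploadedFilename, GETDATE())

    SET @BatchID = SCOPE_IDENTITY()

    -- Copy the staged rows across, each one tagged with the batch ID
    INSERT INTO enrolment (batch_id, cps_no, module_code, trs_no)
    SELECT @BatchID, lodgeenrol_cps_no, lodgeenrol_module_code, lodgeenrol_trs_no
    FROM #lodgement_enrolment
COMMIT TRAN

Wrapping the header insert and the copy in one transaction means a failed upload leaves no half-tagged batch behind.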
The user requires bulk load privileges at the instance level of course, and in our case the account running the stored proc also required high-level privileges on tempdb.
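In SQL 2000 the bulk load right comes from the bulkadmin fixed server role, so something along these lines (the login name is just a placeholder):

-- Give the web application's login permission to run BULK INSERT
EXEC sp_addsrvrolemember @loginame = 'WebAppLogin', @rolename = 'bulkadmin'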
The scheme above works a treat and is very quick.
Cheers
Ck
Chris Kempster
www.chriskempster.com
Author of "SQL Server Backup, Recovery & Troubleshooting"
Author of "SQL Server 2k for the Oracle DBA"