February 24, 2011 at 2:34 pm
Hi all,
Is there a way to bulk insert multiple files from a shared folder into a table without using xp_cmdshell? Using SSIS and turning xp_cmdshell on is not an option.
Thanks
February 24, 2011 at 2:39 pm
the BULK INSERT command doesn't use xp_cmdshell; it provides essentially the same functionality as bcp. (bcp is a command-line utility, so running it from within T-SQL is what would require xp_cmdshell.)
here's an example of loading multiple files via BULK INSERT...
this example assumes I had four specific directories of comma-delimited TXT files, and all the files had identical formats, just lots of them.
my example below uses xp_cmdshell to get the list of files... so if you have a set list of files, you can skip that section.
--BULK INSERT MULTIPLE FILE Example
--a table to loop thru filenames
--drop table ALLFILENAMES  --run first if rerunning
CREATE TABLE ALLFILENAMES(WHICHPATH VARCHAR(255), WHICHFILE VARCHAR(255))
--the source table: yours already exists, but needed for this example.
CREATE TABLE BULKACT(RAWDATA VARCHAR(8000))
--some variables
declare @filename varchar(255),
        @path     varchar(255),
        @sql      varchar(8000),
        @cmd      varchar(1000)
--get the list of files to process:
--#########################################
SET @path = 'C:\DB\'
SET @cmd = 'dir ' + @path + '*.txt /b'
INSERT INTO ALLFILENAMES(WHICHFILE)
EXEC Master..xp_cmdShell @cmd
UPDATE ALLFILENAMES SET WHICHPATH = @path where WHICHPATH is null
SET @path = 'C:\DB2\'
SET @cmd = 'dir ' + @path + '*.txt /b'
INSERT INTO ALLFILENAMES(WHICHFILE)
EXEC Master..xp_cmdShell @cmd
UPDATE ALLFILENAMES SET WHICHPATH = @path where WHICHPATH is null
SET @path = 'C:\DB3\'
SET @cmd = 'dir ' + @path + '*.txt /b'
INSERT INTO ALLFILENAMES(WHICHFILE)
EXEC Master..xp_cmdShell @cmd
UPDATE ALLFILENAMES SET WHICHPATH = @path where WHICHPATH is null
SET @path = 'C:\DB4\'
SET @cmd = 'dir ' + @path + '*.txt /b'
INSERT INTO ALLFILENAMES(WHICHFILE)
EXEC Master..xp_cmdShell @cmd
UPDATE ALLFILENAMES SET WHICHPATH = @path where WHICHPATH is null
--#########################################
--cursor loop
declare c1 cursor for
    SELECT WHICHPATH, WHICHFILE FROM ALLFILENAMES where WHICHFILE like '%.txt%'
open c1
fetch next from c1 into @path, @filename
while @@FETCH_STATUS = 0
begin
    --BULK INSERT won't take a variable for the file name, so build the SQL and execute it instead:
    set @sql = 'BULK INSERT BULKACT FROM ''' + @path + @filename + ''' '
             + ' WITH (
                    DATAFILETYPE    = ''char'',
                    FIELDTERMINATOR = '','',
                    ROWTERMINATOR   = ''\n'',
                    FIRSTROW        = 2
                ) '
    print @sql
    exec (@sql)
    fetch next from c1 into @path, @filename
end
close c1
deallocate c1
Lowell
February 24, 2011 at 3:04 pm
Thank you. So if I don't have a set list of files, I still need xp_cmdshell? If so, then I am out of luck; I can't turn on xp_cmdshell.
February 24, 2011 at 3:11 pm
Lowell's example uses xp_cmdshell to find the list of files to BULK INSERT from specific directories. If the files you'll be BULK INSERTing never change, or the list can be sent to you (for example through a proc parameter, or loaded into a staging table for you to read), then you will not need xp_cmdshell.
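To illustrate the staging-table idea, here is a minimal sketch: the caller populates a table of full file paths (the table name FILELIST, the paths, and the target table BULKACT are assumptions for this example), and a cursor drives BULK INSERT with no call to xp_cmdshell at all.

```sql
--hypothetical staging table the caller fills with full paths:
CREATE TABLE FILELIST (FULLPATH VARCHAR(500))
INSERT INTO FILELIST VALUES ('C:\DB\file1.txt')
INSERT INTO FILELIST VALUES ('C:\DB\file2.txt')

DECLARE @fullpath VARCHAR(500), @sql VARCHAR(8000)

DECLARE c2 CURSOR FOR SELECT FULLPATH FROM FILELIST
OPEN c2
FETCH NEXT FROM c2 INTO @fullpath
WHILE @@FETCH_STATUS = 0
BEGIN
    --BULK INSERT cannot take a variable for the file name, so build dynamic SQL:
    SET @sql = 'BULK INSERT BULKACT FROM ''' + @fullpath + ''''
             + ' WITH (FIELDTERMINATOR = '','', ROWTERMINATOR = ''\n'', FIRSTROW = 2)'
    EXEC (@sql)
    FETCH NEXT FROM c2 INTO @fullpath
END
CLOSE c2
DEALLOCATE c2
```

The caller still needs some way to produce the list (an application, an SSIS-free job step, etc.), but the SQL Server side never touches xp_cmdshell.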
There are no special teachers of virtue, because virtue is taught by the whole community.
--Plato
February 24, 2011 at 11:05 pm
e90fleet (2/24/2011)
Is there a way to bulk insert multiple files from a shared folder into a table without using xp_cmdshell? Using SSIS and turning xp_cmdshell on is not an option.
Another option is to use a SQLCLR function to enumerate the file list. You could also perform the whole operation via SQLCLR using the SqlBulkCopy interface, but that might be more than you need.
February 24, 2011 at 11:08 pm
Thank you all, i found a way to do this using powershell
February 25, 2011 at 10:05 am
SQLkiwi (2/24/2011)
e90fleet (2/24/2011)
Is there a way to bulk insert multiple files from a shared folder into a table without using xp_cmdshell? Using SSIS and turning xp_cmdshell on is not an option.
Another option is to use a SQLCLR function to enumerate the file list. You could also perform the whole operation via SQLCLR using the SqlBulkCopy interface, but that might be more than you need.
Be careful when using the SqlBulkCopy class to load files. Most of the WriteToServer methods accept memory-resident data structures (e.g. DataSet), meaning that if you try to load a very large amount of data from a file in one shot, it can use a lot of server resources.
Instead, if you want to use SqlBulkCopy and there is a chance that you'll receive very large flat files, I would recommend using the WriteToServer(IDataReader) method. Exposing a flat file via an IDataReader interface can be done using a text/CSV ODBC driver with System.Data.Odbc.OdbcDataReader.
I like the CLR for a lot of things, but I am actually talking myself out of it for this task as I go... At this point, if I have to employ the text/CSV ODBC driver, I might as well just use OPENROWSET or a Linked Server and keep it all in T-SQL 🙂
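For reference, a minimal sketch of the OPENROWSET(BULK ...) route mentioned above; the file path and format-file name are hypothetical, and like BULK INSERT this requires the ADMINISTER BULK OPERATIONS permission but not xp_cmdshell.

```sql
--hypothetical path and format file; OPENROWSET(BULK ...) with row-by-row
--loading requires a format file describing the columns:
INSERT INTO BULKACT (RAWDATA)
SELECT *
FROM OPENROWSET(
        BULK 'C:\DB\file1.txt',
        FORMATFILE = 'C:\DB\file1.fmt',
        FIRSTROW = 2
     ) AS t
```

Unlike BULK INSERT, OPENROWSET(BULK ...) is a rowset provider, so it can participate in a SELECT with joins and WHERE clauses before the data lands in the target table.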
There are no special teachers of virtue, because virtue is taught by the whole community.
--Plato