Multiple Excel files to be transferred to database

  • Is there any way to transfer multiple files into the database through a DTS package?

    Let's say I have more than 100 Excel files and I want to transfer the data in one shot using a DTS package or any other technique. It will take a long time if I go file by file.

    I'd appreciate it if anyone could help me.

  • You will still need to go one at a time, especially if the imports have to occur in a specific order. If not, you can set up several separate, unlinked tasks in the same package and set the simultaneous-task count in the DTS package properties to run more than one piece at a time.

    "Don't roll your eyes at me. I will tape them in place." (Teacher on Boston Public)

  • I have a similar situation where I receive a lot of Excel files from remote offices and needed to import them.

    I created a proc to walk a directory and dynamically attach each Excel file as a linked server, aliasing the name, and then import the data. The files all have the same structure and are all placed in a single directory by another process, so it works well for me. If your process is similar enough that it sounds like this will work for you, I'll post the script.

  • Scorpion, can you please post the script?

    Thank you.

  • Scorpion, can you please post the script and also let me know where I can find it?

    Thanks a lot.

  • I apologize. This thread slipped by me, and I just remembered to check. Of course. Let me clean it up slightly, and it will be my next post. Again, I apologize for the delay.

  • Here is a reduced code set which I hope will be useful to you. Modify it to your needs. I quickly concatenated it together and stepped through it once, but you should include your own error checking and logging procedures.

    I use a slightly more complex process built on this basic logic to import roughly 4000 Excel files a day, from 10 KB up to 400 MB, and it generally finishes in under 30 minutes.

    ---------------------------------------------------------------------------------------------------------------------------------

    ---------------------------------------------------------------------------------------------------------------------------------

    --DISCLAIMER:
    --
    --I present this as an example, and expect it to be modified to fit your needs,
    --NOT as a working proc, as I stripped out the items specific to myself and my
    --company and replaced the items which I could not strip out with generics.
    --Also, I concatenated three processes to create this example, as mine has grown
    --complex enough to require three jobs to run it:
    --one job populates a queue table,
    --the next processes what is in the queue,
    --and the last archives the file and removes it from the queue.
    --It is necessary to have permissions on the file directories involved.
    --If run through a job, the SQL Agent login must have appropriate rights as well.
    --Step through it carefully one step at a time, and have fun. It shouldn't take
    --more than an hour or so to modify it to your purposes.

    ---------------------------------------------------------------------------------------------------------------------------------

    ---------------------------------------------------------------------------------------------------------------------------------

    Declare @FilePath as Varchar(4000),
            @FullPath as Varchar(4000),
            @Command as Varchar(4000)

    --I actually have a table which holds many different directories to import from, along with
    --the archive directories and the destination tables for the data, though the nesting makes
    --it difficult to follow for some.

    Select @Command = 'Dir "C:\Temp" /B' --Path to import files; /B lists bare file names with no details

    Create table #text_ret(
        cmdoutput varchar(4000)
    )

    Insert into #text_ret
    exec master..xp_cmdshell @Command

    --We now have a table containing a list of files at the given path.
    --Let's process them.

    StartLoop:

    Select Top 1 @FilePath = cmdoutput
    From #text_ret
    Where cmdoutput Is Not Null

    If Len(@FilePath) > 4 --must be at least 5 characters to be a valid Excel file, e.g. 1.xls
    Begin
        If Right(@FilePath, 4) = '.xls' --check the extension to verify it is an Excel file
        Begin
            --I had logic here to check for the existence of a previous linked server and drop it
            --before going on, in case of previous failures.

            --Dir /B returns bare file names, so rebuild the full path for the linked server
            Select @FullPath = 'C:\Temp\' + @FilePath

            --Link an Excel file
            EXEC sp_addlinkedserver
                'ExcelSource', --Alias name for the Excel file
                'Jet 4.0',
                'Microsoft.Jet.OLEDB.4.0',
                @FullPath, --Path to the Excel file (can be UNC)
                NULL,
                'Excel 5.0'
            --Error checking for success or failure here

            --Create the linked logins
            EXEC sp_addlinkedsrvlogin
                'ExcelSource', --Alias name used for the Excel file
                'false',
                'LinkUser', --Login used to access SQL Server or perform the job
                'Admin', --Default user for Excel files (swap to a secured user if security is on in the file)
                NULL --Default password for Excel files (swap out for secured files)
            --Error checking for success or failure here

            --The actual acquisition of the data into a temp table
            Insert Into #TempTable ( FieldList )
            SELECT FieldList
            FROM ExcelSource...Sheet1$ --Take note of the three periods after the alias
            --Error checking for success or failure here
            --I log statistical information here to give me info on length of time to import, etc...

            EXEC sp_droplinkedsrvlogin --Remove the logins for the linked server
                'ExcelSource',
                NULL
            --Error checking for success or failure here

            EXEC sp_dropserver
                'ExcelSource', --Drop the linked server
                'droplogins' --Optionally drop logins as well (not necessary in the step above if this is used)
            --Error checking for success or failure here
        End
    End

    --And if everything is successful all the way through, finish up.
    /*
    --I archive the file to an archive directory and attach the import date to the end.
    --My process goes through some extra steps to ensure success, but I am not including
    --these as the complexity increases.
    --I parse the file path and file name into the separate variables necessary to rename
    --and move the file. (The variables below are declared and populated elsewhere in my
    --full proc.)
    --FYI: a move is much, much faster than a copy and delete.
    --I will leave this as an EXAMPLE of what I do, though.

    Select 'Creating Directory: ' + @NewPath
    Select @Command = 'MKDIR "' + @NewPath + '"'
    exec master.dbo.xp_cmdshell @Command

    Select @Filename = @FlatFilePath + @FlatFileName
    Select @Command = 'move "' + @Filename + '" "' + @NewPath + '\' + @Convdate + '_' + Replace(ltrim(rtrim(Substring(Convert(varchar(20),getdate()),13,5))),':','_') + '_' + @FlatFileName + '"'
    Select 'Moving File: ' + @Command
    exec master.dbo.xp_cmdshell @Command

    --Let's make sure the file was moved before deleting the job
    Select @echo_text = 'File_Exists'
    Set @Answer = null
    Select @Command = N'if exist "' + @Filename + N'" echo ' + @echo_text

    create table #textret2(cmdoutput nvarchar(20) null)
    insert into #textret2 exec @retcode = master..xp_cmdshell @Command
    Select @Answer = cmdoutput from #textret2 where cmdoutput is not null
    drop table #textret2
    Select @Answer = ISNULL(@Answer, 'File_Does_Not_Exist')

    --And based on that information, we know whether we are done or not.
    If @Answer = 'File_Does_Not_Exist'
    Begin
        Delete FROM #text_ret
        WHERE cmdoutput = @FilePath
    End
    */

    --And of course, check for another in the table and loop to do it again
    Delete FROM #text_ret
    WHERE cmdoutput = @FilePath

    If Exists (Select cmdoutput From #text_ret Where cmdoutput Is Not Null)
    Begin
        Goto StartLoop
    End

    Drop table #text_ret

