January 13, 2010 at 1:16 am
I have a situation similar to the example linked below, in which I need to import all of the CSV files in a directory without hard-coding the file names, because the file names include a date.
I have not been able to get the outside parties to change their naming convention.
For some data feeds the naming convention is, for example, SalesYYYYMMDD, and these files have to be loaded in date order.
How can I accomplish this using BCP in T-SQL? Unfortunately, I'm accustomed to being able to set file standards, and in the past I usually would have handled this with the File System Object in an ETL tool rather than in T-SQL.
http://www.nigelrivett.net/SQLTsql/ImportTextFiles.html
January 13, 2010 at 2:53 am
:laugh:
The sort part of the question was a no-brainer: all I need to do to accomplish that is load the file names into a temp table and use an ORDER BY clause.
I regret asking that question.
I believe that I can use xp_cmdshell to read all of the files in a directory and then somehow load them into a table?
-- Read the file names
exec master.dbo.xp_cmdshell 'dir c:\import\*.csv'
-- Create a temp table
-- Then loop through the directory, or something, to load the file names into the temp table?
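Something like this rough, untested sketch is what I have in mind (the c:\import path, the Sales*.csv pattern and the #FileList name are just placeholders):

-- Capture the bare file names (dir /b) into a temp table
CREATE TABLE #FileList (FileName varchar(255));

INSERT INTO #FileList (FileName)
EXEC master.dbo.xp_cmdshell 'dir /b c:\import\Sales*.csv';

-- xp_cmdshell returns a trailing NULL row; remove it
DELETE FROM #FileList WHERE FileName IS NULL;

-- The YYYYMMDD in each name sorts as a string into date order
SELECT FileName FROM #FileList ORDER BY FileName;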
January 13, 2010 at 3:00 am
Sorry, I found a solution to the problem.
Please disregard...
January 13, 2010 at 4:07 am
Would you mind sharing how you'll do it?
One reason is to provide a solution for someone having the same issue.
The second reason: there might be better/faster/cleaner solutions out there, but the person who'd suggest them didn't have the time between your initial post and your last one.
January 13, 2010 at 7:42 am
The approach taken in the example script linked in my first post may be more practical.
I can't hard-code the file names, so I believe I could use xp_cmdshell to read the files in the import directory, load the file names into a temp table, and sort them by date.
Does this sound like a practical approach or do you know of a better way?
I'm open to suggestions.
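For what it's worth, here is a rough sketch of the load step I'm picturing, using dynamic SQL with BULK INSERT (the T-SQL counterpart of bcp). The #FileList table from my earlier sketch, the dbo.SalesStaging target and the comma-delimited/header-row options are all just assumptions:

DECLARE @FileName varchar(255), @sql nvarchar(max);

-- Walk the file list in date order (the YYYYMMDD in the name sorts naturally)
DECLARE file_cursor CURSOR LOCAL FAST_FORWARD FOR
    SELECT FileName FROM #FileList ORDER BY FileName;

OPEN file_cursor;
FETCH NEXT FROM file_cursor INTO @FileName;

WHILE @@FETCH_STATUS = 0
BEGIN
    -- Assumes a comma-delimited file with a single header row
    SET @sql = N'BULK INSERT dbo.SalesStaging
                 FROM ''c:\import\' + @FileName + N'''
                 WITH (FIELDTERMINATOR = '','', ROWTERMINATOR = ''\n'', FIRSTROW = 2);';
    EXEC (@sql);

    FETCH NEXT FROM file_cursor INTO @FileName;
END

CLOSE file_cursor;
DEALLOCATE file_cursor;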
January 13, 2010 at 8:48 am
I've often done this using the following code.
-- xp_dirtree with depth = 1 and the file flag = 1 returns three columns:
-- the name, the depth, and a 1/0 flag for file vs. folder
create table #data_extract_files
(
    filename varchar(255),
    depth tinyint,
    fileorfolder tinyint
)

insert into #data_extract_files
exec master..xp_dirtree 'C:\data_extracts\',1,1
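From there you can pick out just the files and put them in date order; something like this (the LIKE pattern is only a guess at your Sales naming convention):

select filename
from #data_extract_files
where fileorfolder = 1   -- files only, skip subfolders
  and filename like 'Sales[0-9][0-9][0-9][0-9][0-9][0-9][0-9][0-9].csv'
order by filename        -- YYYYMMDD in the name sorts into date order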
Hope this helps!
Simon 🙂
January 13, 2010 at 9:09 am
Cool!
Thank you very much!
January 13, 2010 at 9:15 am
Simon,
I assume that there is a way to gather other file attributes, such as DateCreated, DateModified, size, etc.?
I usually use the FSO to get this information.
January 13, 2010 at 9:23 am
I thought you were going to use the date in the file name. If not, I can't think of a way without using xp_cmdshell, in which case you could do it in one go with your original plan of using DIR.
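For example, you could capture the full DIR listing (which carries date, time and size) into a table and parse it from there; a rough sketch, with c:\import as a placeholder path:

-- Capture the full DIR output, one line per row
create table #dir_output (dir_line varchar(1000))

insert into #dir_output
exec master.dbo.xp_cmdshell 'dir c:\import\*.csv'

-- Lines that start with a date are the file entries (the exact date format
-- depends on the server's regional settings); date, size and name would
-- still need to be parsed out of dir_line with SUBSTRING/CHARINDEX
select dir_line
from #dir_output
where dir_line like '[0-9][0-9]/[0-9][0-9]/[0-9][0-9][0-9][0-9]%'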
January 13, 2010 at 1:30 pm
Yes, you're correct, but I may also need to check the size of each file to ensure that it is within an expected range.
Also, this is only one report and there are many more. Because this system was managed outside of IT (by Finance & Accounting), the importance of consistent file naming conventions for automating processes was overlooked.
Currently there are a lot of manual processes.
For example, if the Calls-to-Sales-Leads ratio jumps one week from 70% to 85%, the super-smart Finance & Accounting business owner mentally recognizes that CSRs entered some calls that were incorrectly assigned.
As another example, if other columns are low, then he knows in his head that one of the 18 files did not come in. The bottom line is that there is a ton of automation to be done, working with some bad data and inconsistencies.