September 20, 2009 at 6:56 am
I am frequently called upon to import data from flat files into SQL tables. Historically I have used BULK INSERT and fought with CSV files that have commas embedded in fields. I recently started working with OPENROWSET to import from Excel files and discovered that it also works with CSV files far better than BULK INSERT, for several reasons, not the least of which is that it handles embedded commas automatically, at least in the cases I've tried so far.

The problem I am currently having is that it tries to determine the field type on its own, with only marginal success. I have several cases where decimal (usually money) fields are coming in as integers, dropping everything to the right of the decimal point. All of my research to date indicates that I need to use a format file to control the field types, but I have not been able to find examples that address importing a CSV file into SQL. Does anyone have a simple example of a format file (preferably XML, but I'll appreciate anything that works) that can be used to import a CSV into SQL while defining the field types? I don't need anything fancy like excluding fields or changing the order.
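For anyone landing here with the same question, here is a minimal sketch of what such an XML format file can look like. The table layout (three columns: a name, a money amount, and a date) and all file paths and column names are made-up placeholders; adjust the FIELD/COLUMN entries to match your actual CSV. The key point for the decimal-truncation problem is the xsi:type on the COLUMN elements (e.g. SQLMONEY), which tells the bulk loader the target type instead of letting it guess:

```xml
<?xml version="1.0"?>
<BCPFORMAT xmlns="http://schemas.microsoft.com/sqlserver/2004/bulkload/format"
           xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
  <RECORD>
    <!-- How the file is laid out: comma-terminated fields, CRLF at end of row -->
    <FIELD ID="1" xsi:type="CharTerm" TERMINATOR="," MAX_LENGTH="50"/>
    <FIELD ID="2" xsi:type="CharTerm" TERMINATOR="," MAX_LENGTH="30"/>
    <FIELD ID="3" xsi:type="CharTerm" TERMINATOR="\r\n" MAX_LENGTH="30"/>
  </RECORD>
  <ROW>
    <!-- How the fields map to column names and SQL types -->
    <COLUMN SOURCE="1" NAME="CustomerName" xsi:type="SQLVARYCHAR"/>
    <COLUMN SOURCE="2" NAME="Amount"       xsi:type="SQLMONEY"/>
    <COLUMN SOURCE="3" NAME="OrderDate"    xsi:type="SQLDATETIME"/>
  </ROW>
</BCPFORMAT>
```

You would then reference it from OPENROWSET(BULK ...) something like this (again, paths are placeholders):

```sql
SELECT *
FROM OPENROWSET(
        BULK 'C:\temp\orders.csv',
        FORMATFILE = 'C:\temp\orders.xml',
        FIRSTROW = 2   -- skip the header row
     ) AS t;
```

One caveat: a plain comma TERMINATOR will not, on its own, handle commas embedded inside quoted fields, so this addresses the typing problem rather than the quoting problem.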
Thanks
Chuck
September 20, 2009 at 1:36 pm
Microsoft LogParser does a great job handling embedded commas inside CSV files and controlling the types of what goes into SQL.
September 20, 2009 at 4:22 pm
dbo.benyos (9/20/2009)
Microsoft LogParser does a great job handling embedded commas inside CSV files and controlling the types of what goes into SQL.
Tal... I've not heard of the Microsoft LogParser. How do you use it?
--Jeff Moden
Change is inevitable... Change for the better is not.
September 21, 2009 at 4:08 am
It's a SQL-like log querying tool from Microsoft, originally developed for querying W3C log files, but it has many more capabilities, supporting CSV, TSV, FileSystem, and many other input formats. The most powerful part is its VBScript-like functions.
Useful info:
Also have a quick look at the CHM file that ships with it. It has lots of options and samples.
Just for a taste, here are 2 sample scripts:
Print the 10 largest files on the C: drive into a CSV file:
LogParser "SELECT TOP 10 Path, Name, Size INTO c:\temp\LargestFiles.csv FROM C:\*.* ORDER BY Size DESC" -i:FS
RSS: Read XML directly from HTTP into CSV:
LogParser "SELECT title INTO c:\temp\MyRSS.csv FROM http://blogs.msdn.com/MainFeed.aspx#/rss/channel/item" -i:XML -fMode:Tree
This thing is so powerful that I use it at least twice a day. I just recently imported free-text files into a SQL table: I had to generate a word list from a directory of text files. It fetched 12,000 distinct words and converted them into 12,000 records in a dictionary table in less than 10 seconds...
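To tie this back to the original question, LogParser can also load a CSV straight into a SQL Server table using its -i:CSV input and -o:SQL output formats, and you can force the type of a column by converting it in the query itself rather than relying on inference. A rough sketch (the server, database, table, file path, and column names below are all placeholders):

```
LogParser "SELECT CustomerName, TO_REAL(Amount) AS Amount INTO OrdersTable FROM C:\temp\orders.csv" -i:CSV -o:SQL -server:MyServer -database:MyDB -createTable:ON
```

Here TO_REAL() keeps the fractional part of the amount instead of letting it collapse to an integer, and -createTable:ON builds the target table if it doesn't already exist.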
September 21, 2009 at 6:38 am
Outstanding. Thanks for the tips, Tal. I'll give it a whirl.
--Jeff Moden
Change is inevitable... Change for the better is not.
September 21, 2009 at 10:20 am
I bet you won't be sorry.
Good luck, and contact me if you need any help...
Tal