July 14, 2010 at 8:22 pm
I am going to attach an Excel workbook with three worksheets. The worksheets contain:
1. A picture of the product and the measurements that will be provided in a CSV file from an equipment vendor. This will help in understanding the data within the SQL DB.
2. A layout in Excel showing the columns of information that will be provided to me in the CSV file. There can be a varying number of text fields that will be imported into a SQL DB table.
3. The third worksheet contains a layout of the SQL DB table that I am trying to import the data into.
Since some of the CSV fields are fixed and specific, they do not fit an array description. There can be anywhere from 12 to 256 data values at the end of each row in the CSV file, and these need to be handled properly to fit into a properly designed SQL DB table. See the description in the attached Excel workbook for more details.
Thanks for any help and advice someone can provide. I've used the SQL Server Central site to solve many of my problems on my own with its well-documented solutions, but I could not find one for this problem.
Dean
July 14, 2010 at 11:36 pm
dsabjp (7/14/2010)
I am going to attach an Excel workbook with three worksheets. The worksheets contain:
1. A picture of the product and the measurements that will be provided in a CSV file from an equipment vendor. This will help in understanding the data within the SQL DB.
2. A layout in Excel showing the columns of information that will be provided to me in the CSV file. There can be a varying number of text fields that will be imported into a SQL DB table.
3. The third worksheet contains a layout of the SQL DB table that I am trying to import the data into.
Since some of the CSV fields are fixed and specific, they do not fit an array description. There can be anywhere from 12 to 256 data values at the end of each row in the CSV file, and these need to be handled properly to fit into a properly designed SQL DB table. See the description in the attached Excel workbook for more details.
Thanks for any help and advice someone can provide. I've used the SQL Server Central site to solve many of my problems on my own with its well-documented solutions, but I could not find one for this problem.
Dean
Heh... it bloody well figures. I'm on vacation until Sunday and don't have access to a SQL Server box, and NOW the fun problems come out. 😛 I've had to do something similar to this a couple of times in the last year, and it's a seriously fun task with an awesome T-SQL solution.
Dean, you did a heck of a nice job explaining the requirements, and the spreadsheet makes it easy to understand the problem. To help others help you better, could you attach one of the CSV files AND post the target table's CREATE TABLE statement so we can start working on this puppy for real? Thanks. Of course, no private information, please.
If no one answers this post, I'll give it a shot without access to SQL Server if you act as my "eyes". Also, if no one answers in another 24 hours or so, then "BUMP" this thread simply by posting to it with the message "Jeff Moden said to bump this thread in 24 hours." That way I'll be alerted and the "forum etiquette" watchdogs won't bust your chops. 😀
--Jeff Moden
Change is inevitable... Change for the better is not.
July 17, 2010 at 8:42 pm
Good ideas. I seem to have found an option that works pretty well. Using OPENROWSET, my import works okay as long as the table has the same number of columns as the CSV file or more. Something I failed to mention: it is possible that different rows have different numbers of fields in the CSV file. So far, in the examples I've played with, OPENROWSET lets the records be added and any missing fields simply leave the trailing columns empty. I can live with that, since I can delete those rows later if needed. The few records that were probably causing the whole import to crash no longer stop it; I now get the whole import and can search for and delete the bad rows afterwards. Sorry for the delayed response; I've just got back to my computer. Next time I post something I will add the CREATE TABLE commands, etc.
If there is any negative issue with OPENROWSET, I haven't run into it yet. Any suggestions or cautions would be appreciated.
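For reference, a minimal sketch of the kind of OPENROWSET text-driver import described above might look like the following. The provider name, folder, staging table name, and connection-string settings are illustrative assumptions, not details taken from this thread (only the Challenge.csv file name is borrowed from a later post):
-- Ad hoc distributed queries must be enabled before OPENROWSET can be used this way:
-- EXEC sp_configure 'show advanced options', 1; RECONFIGURE;
-- EXEC sp_configure 'Ad Hoc Distributed Queries', 1; RECONFIGURE;
SELECT *
INTO #CsvStaging                                          -- hypothetical staging table
FROM OPENROWSET(
        'Microsoft.ACE.OLEDB.12.0',                       -- text/CSV OLE DB provider ('Microsoft.Jet.OLEDB.4.0' on older 32-bit builds)
        'Text;Database=C:\Import\;HDR=YES;FMT=Delimited', -- folder holding the file; first row treated as a header
        'SELECT * FROM Challenge.csv') AS csv;            -- file name assumed for illustration
With the text driver, rows that are short a few fields come through with NULLs in the trailing columns, which appears to match the behavior described above.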
Thanks 🙂
Dean
July 18, 2010 at 12:11 pm
Does that mean that you're all set?
--Jeff Moden
Change is inevitable... Change for the better is not.
July 19, 2010 at 4:11 pm
This sort of approach may be of interest to you. For testing purposes, I created a comma-separated file called Challenge.csv based on your Excel sheet CSV_File_Format. The next step is to bulk load the text file into a staging table using BULK INSERT. Then, using a double CROSS APPLY, the data is resolved into the desired format corresponding to the SQL DB Table layout. The trick would then be to create a version which dynamically builds the SQL to accommodate different formats of the incoming file (a rough sketch of that idea follows the query below).
-- Rebuild the staging table that mirrors the raw CSV layout
-- (column names taken from the Excel sheet CSV_File_Format)
IF OBJECT_ID('tempdb.dbo.#ChallengeTemp', 'U') IS NOT NULL
DROP TABLE #ChallengeTemp
CREATE TABLE #ChallengeTemp
(
[Date - Time] varchar(50),
[Batch Number] varchar(50),
[Batch Product # (10 products per batch)] varchar(50),
[Number of Tracks = N (measurement tracks)] varchar(50),
[Number Of Measurement Locations Per Track = L] varchar(50),
[Track 1 Measurement 1] varchar(50),
[Track 1 Measurement 2] varchar(50),
[Track 1 Measurement 3] varchar(50),
[Track 1 Measurement 4] varchar(50),
[Track 2 Measurement 1] varchar(50),
[Track 2 Measurement 2] varchar(50),
[Track 2 Measurement 3] varchar(50),
[Track 2 Measurement 4] varchar(50),
[Track 3 Measurement 1] varchar(50),
[Track 3 Measurement 2] varchar(50),
[Track 3 Measurement 3] varchar(50),
[Track 3 Measurement 4] varchar(50)
)
-- Load the raw CSV into the staging table; FIRSTROW = 2 skips the header row
BULK INSERT #ChallengeTemp FROM 'C:\Challenge.csv'
WITH
(
FIELDTERMINATOR = ',',
FIRSTROW = 2, TABLOCK
)
-- First CROSS APPLY: unpivot the three Track column groups into one row per track
;WITH cte AS
(
SELECT C.[Date - Time], C.[Batch Number], C.[Batch Product # (10 products per batch)], C.[Number of Tracks = N (measurement tracks)], C.[Number Of Measurement Locations Per Track = L],
Z.[Track Number], Z.[Track Measurement 1], Z.[Track Measurement 2], Z.[Track Measurement 3], Z.[Track Measurement 4]
FROM #ChallengeTemp AS C
CROSS APPLY
(SELECT 1, [Track 1 Measurement 1], [Track 1 Measurement 2], [Track 1 Measurement 3], [Track 1 Measurement 4]
UNION ALL SELECT 2, [Track 2 Measurement 1], [Track 2 Measurement 2], [Track 2 Measurement 3], [Track 2 Measurement 4]
UNION ALL SELECT 3, [Track 3 Measurement 1], [Track 3 Measurement 2], [Track 3 Measurement 3], [Track 3 Measurement 4])
AS Z ([Track Number], [Track Measurement 1], [Track Measurement 2], [Track Measurement 3], [Track Measurement 4])
)
SELECT cte.[Date - Time], cte.[Batch Number], cte.[Batch Product # (10 products per batch)], cte.[Number of Tracks = N (measurement tracks)], cte.[Number Of Measurement Locations Per Track = L],
cte.[Track Number], X.[Measurement Location], X.[Thickness]
FROM cte
-- Second CROSS APPLY: unpivot each track row into one row per measurement location
CROSS APPLY
(SELECT 1, [Track Measurement 1]
UNION ALL SELECT 2, [Track Measurement 2]
UNION ALL SELECT 3, [Track Measurement 3]
UNION ALL SELECT 4, [Track Measurement 4])
AS X ([Measurement Location], [Thickness])
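A rough sketch of the dynamic-SQL idea mentioned above might look like the following. It assumes the staging table keeps the "Track n Measurement m" column naming and shows only a couple of the fixed columns; the number parsing and the single-level unpivot are assumptions for illustration rather than part of the posted solution:
-- Build one UNION ALL branch per "Track n Measurement m" column found in the
-- staging table, then run a single CROSS APPLY unpivot over them.
DECLARE @Branches nvarchar(MAX), @Sql nvarchar(MAX);
SELECT @Branches =
    STUFF((SELECT ' UNION ALL SELECT '
                  -- parse e.g. "Track 1 Measurement 2" into track number and measurement location
                  + PARSENAME(REPLACE(REPLACE(c.name, 'Track ', ''), ' Measurement ', '.'), 2) + ', '
                  + PARSENAME(REPLACE(REPLACE(c.name, 'Track ', ''), ' Measurement ', '.'), 1) + ', '
                  + QUOTENAME(c.name)
           FROM tempdb.sys.columns AS c
           WHERE c.object_id = OBJECT_ID('tempdb..#ChallengeTemp')
             AND c.name LIKE 'Track % Measurement %'
           ORDER BY c.column_id
           FOR XML PATH(''), TYPE).value('.', 'nvarchar(MAX)'), 1, 11, '');
SET @Sql = N'
SELECT C.[Date - Time], C.[Batch Number],
       X.[Track Number], X.[Measurement Location], X.[Thickness]
FROM #ChallengeTemp AS C
CROSS APPLY (' + @Branches + N') AS X ([Track Number], [Measurement Location], [Thickness]);';
EXEC (@Sql);   -- the remaining fixed columns would be added to the SELECT list the same way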
July 19, 2010 at 7:06 pm
I'm good with what I've got. Thanks for checking. Have a great morning, noon, or afternoon, whenever you get this.
Dean
🙂