November 16, 2016 at 8:10 pm
I would like to ask the experts about this matter. I was tasked with creating a process to import data from a CSV file into a database. My first thought was SSIS, but the process has some additional characteristics:
1) The process should import every day.
2) It may have to import additional columns (in some cases the data has 4 columns, in others 6).
Then I thought of DTS; but I have read in an article that DTS was deprecated after SQL Server 2000. If so, are there any other alternatives in SQL Server 2008 R2 Express for importing data from a file into the database? I should emphasize that I am using the Express edition to convey my limitations and the tools available in my DBMS. What I am thinking is that my only solution would be manually importing the data from the file into the database: I would have to check the data every day, then load it into the database, and if the data has additional columns, I would have to add a column to the table before inserting values into it. So DTS seemed like my answer, because I could automate that process if I could use DTS.
November 17, 2016 at 8:14 am
In the Express edition, you still have access to things like BCP, SQLCMD, the BULK INSERT command, etc., so there are options. The challenging part will be dealing with 4 vs. 6 columns. Do you have a way to tell easily, by filename or day of week or something, when the CSV file will have 4 columns and when it will have 6?
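As a quick sketch of what BULK INSERT looks like in Express (the file path, table name, and 6-column layout here are placeholder assumptions, not your actual schema):

```sql
-- Hypothetical staging table matching a 6-column CSV export.
CREATE TABLE dbo.TempReadings_Staging
(
    ReadingDate  VARCHAR(50),
    DeviceID     VARCHAR(50),
    Temp1        VARCHAR(50),
    Temp2        VARCHAR(50),
    Temp3        VARCHAR(50),
    Temp4        VARCHAR(50)
);

-- Load the whole CSV in one pass; BULK INSERT is available in Express edition.
BULK INSERT dbo.TempReadings_Staging
FROM 'C:\DataFiles\readings.csv'   -- assumed path
WITH
(
    FIELDTERMINATOR = ',',
    ROWTERMINATOR   = '\n',
    FIRSTROW        = 2            -- skip the header row, if the file has one
);
```

Once the data is in a staging table, you can validate it and move it to the real table with plain T-SQL, and the whole thing can be scheduled from a batch file via SQLCMD since Express has no SQL Agent.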
Is it possible to always export the CSV from its source with 6 columns, with the extra 2 left empty in the cases where you only want the 4 columns?
November 17, 2016 at 5:06 pm
I'm not quite sure when or where the data will have 4 or 6 columns, since the data source is a scientific device that measures temperatures; it has its own program that can export its data to CSV. What I am thinking is: could there be a command or a "reader" in SQL Server 2008 R2 Express that could somehow "read" how many columns a CSV file has? Then, when the CSV has more columns than the database table, SQL would adjust: it would add the new columns and then update the table. This is somewhat possible in DTS, since we could create a package in which, step by step, we declare what the DBMS should do (but I think counting the columns of a CSV file is NOT POSSIBLE even in DTS).
November 18, 2016 at 10:05 pm
could anybody answer this please?
November 19, 2016 at 6:14 pm
joshua 15769 (11/17/2016)
I'm not quite sure when or where the data will have 4 or 6 columns, since the data source is a scientific device that measures temperatures; it has its own program that can export its data to CSV. What I am thinking is: could there be a command or a "reader" in SQL Server 2008 R2 Express that could somehow "read" how many columns a CSV file has? Then, when the CSV has more columns than the database table, SQL would adjust: it would add the new columns and then update the table. This is somewhat possible in DTS, since we could create a package in which, step by step, we declare what the DBMS should do (but I think counting the columns of a CSV file is NOT POSSIBLE even in DTS).
Yes. Read just the first row as a single blob and count the number of delimiters. If there are 3, then there are 4 columns; if there are 5, then there are 6 columns. You can then make a path decision as to how to import the file from there.
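One way to sketch that "first row as a blob" idea (the table and file names here are made up for illustration):

```sql
-- One-column staging table to hold the first row as raw text.
CREATE TABLE dbo.FirstRow (RawText VARCHAR(8000));

-- Pull in only the first row; with no field terminator specified,
-- the whole line lands in the single column.
BULK INSERT dbo.FirstRow
FROM 'C:\DataFiles\readings.csv'   -- assumed path
WITH (ROWTERMINATOR = '\n', LASTROW = 1);

-- Count the commas: 3 commas => 4 columns, 5 commas => 6 columns.
DECLARE @ColumnCount INT;
SELECT @ColumnCount = LEN(RawText) - LEN(REPLACE(RawText, ',', '')) + 1
FROM dbo.FirstRow;

IF @ColumnCount = 6
    PRINT 'Use the 6-column import path';
ELSE
    PRINT 'Use the 4-column import path';
```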
If that first row is also a column header row and you've loaded your staging table, the column names from it can be used to map your columns the way you need to.
Now, here's my disclaimer... I don't do such things with DTS, SSIS, SSDT, or any other GUI-based tool. I actually find it more complex to use such tools. It can all be done quite easily using BULK INSERT and a little dynamic SQL, especially since you're getting the output from a machine of one sort or another, and such data tends to be very well formed and extremely consistent.
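A minimal sketch of the dynamic SQL half, assuming you already have the column count from the first row and that staging tables with these made-up names exist:

```sql
-- Branch to a staging table that matches the detected column count,
-- then run BULK INSERT through dynamic SQL.
DECLARE @ColumnCount INT = 6;          -- value detected from the first row
DECLARE @TargetTable SYSNAME;
DECLARE @Sql NVARCHAR(MAX);

SET @TargetTable = CASE @ColumnCount
                       WHEN 6 THEN N'dbo.Staging6Col'
                       ELSE        N'dbo.Staging4Col'
                   END;

SET @Sql = N'BULK INSERT ' + @TargetTable + N'
             FROM ''C:\DataFiles\readings.csv''
             WITH (FIELDTERMINATOR = '','', ROWTERMINATOR = ''\n'', FIRSTROW = 2);';

EXEC (@Sql);
```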
--Jeff Moden
Change is inevitable... Change for the better is not.
November 20, 2016 at 5:13 pm
Thank You Mr. Jeff Moden
And I would like to take this opportunity to say that your Profile Photo never fails to amuse me, even RBAR!
And, back to the topic, I would also like to ask at least how SQL will read the CSV file without using any GUI tools? That information would be enough to get this project started.
Thank you!
November 20, 2016 at 7:43 pm
I just learned open rowset!
SELECT *
FROM OPENROWSET('Microsoft.ACE.OLEDB.12.0',
'Excel 12.0 Xml;
Database=C:\DataFiles\EmployeeData1.xlsx',
[vEmployee$]);
November 21, 2016 at 4:52 am
joshua 15769 (11/20/2016)
I just learned open rowset!
SELECT *
FROM OPENROWSET('Microsoft.ACE.OLEDB.12.0',
'Excel 12.0 Xml;
Database=C:\DataFiles\EmployeeData1.xlsx',
[vEmployee$]);
That's correct for an Excel file (although I typically add some other settings, like IMEX=1). That's not the way you'd read a CSV file, though. You could still use OPENROWSET for such a thing by changing a couple of the parameters, but I believe you'll find that BULK INSERT will import a consistent CSV much faster than the ACE drivers.
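For reference, reading a CSV through the ACE provider looks something like this (the folder and file name are assumptions, and BULK INSERT remains the faster route for consistent files):

```sql
-- With the ACE text driver, 'Database' is the FOLDER;
-- the file name goes in the query itself.
SELECT *
FROM OPENROWSET('Microsoft.ACE.OLEDB.12.0',
                'Text;Database=C:\DataFiles\;HDR=YES',
                'SELECT * FROM readings.csv');
```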
--Jeff Moden
Change is inevitable... Change for the better is not.