June 12, 2013 at 4:41 pm
Which tool do you prefer to use to import flat files into SQL Server tables, and why? We're trying to migrate some DTS packages into stored procedures, but some contain data pumps that do this kind of import.
We've had lots of problems creating SSIS packages due to our lack of knowledge, which is why we're looking for alternatives.
Thanks for all the ideas we might get.
June 13, 2013 at 9:08 am
June 13, 2013 at 9:42 am
Each method has advantages and disadvantages, depending on what you want to do.
Where I currently work, I inherited a system that was very heavily OPENROWSET-based, because it allowed (lazy?) developers to import any file using a generic process built around a SELECT * INTO ... FROM OPENROWSET statement. The fact that this caused all manner of validation errors and data-control problems later was overlooked.
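For anyone who hasn't seen that pattern, it looks roughly like this (table name, path, and provider are illustrative only; OPENROWSET against a file also requires 'Ad Hoc Distributed Queries' to be enabled and a suitable text driver installed):

```sql
-- Hypothetical example of the generic "import anything" pattern.
-- Every column arrives with whatever type the text driver guesses,
-- which is exactly where the downstream validation problems start.
SELECT *
INTO dbo.ImportedFile
FROM OPENROWSET(
    'Microsoft.ACE.OLEDB.12.0',
    'Text;Database=C:\Imports\;HDR=YES',
    'SELECT * FROM inbound.csv'
);
```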
For pure imports, I like BULK INSERT because you can do everything within T-SQL. That said, I've had a rather quirky problem with it on a "free text" type file that had a very wide last column: one file we received was a column short, and because the final column was so wide, the entire next row ended up being imported into that final column, with its row terminator happily being picked up as the end of the record.
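A minimal sketch of that kind of BULK INSERT (table name, path, and delimiters are assumptions); the ROWTERMINATOR here is the one that got swallowed into the wide final column when a row arrived a delimiter short:

```sql
-- Plain delimited import, entirely in T-SQL.
BULK INSERT dbo.FreeTextFile
FROM 'C:\Imports\inbound.txt'
WITH (
    FIELDTERMINATOR = '|',
    ROWTERMINATOR   = '\n',
    FIRSTROW        = 2   -- skip the header row
);
```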
bcp is powerful, but you can't call it from T-SQL unless you invoke xp_cmdshell.
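By way of illustration, calling bcp from T-SQL looks something like this (xp_cmdshell must be enabled, and the database, table, server name, and path are all hypothetical):

```sql
-- Shell out to bcp from T-SQL. -c = character mode, -t = field
-- terminator, -T = trusted connection, -S = target server.
DECLARE @cmd varchar(4000) =
    'bcp MyDb.dbo.Outbound out "C:\Exports\outbound.dat" -c -t"|" -T -S MyServer';
EXEC master..xp_cmdshell @cmd;
```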
Follow me on twitter @EvoDBA. Check out my blog, Natural Selection DBA.
June 13, 2013 at 11:01 am
Thanks Matthew,
We're currently working with bcp via xp_cmdshell, but we might try BULK INSERT to avoid possible DBA complaints when promoting code to production. There are several things to learn, but fortunately the files we're receiving have a defined format and won't change, since it's all an automated process (using bcp :-)).
The different advantages and disadvantages are exactly why I asked for personal preferences.
June 13, 2013 at 11:39 am
Well, if you're going the bcp route with xp_cmdshell, you could consider using certificates: create a login mapped to a certificate and grant EXECUTE on xp_cmdshell to that login only. Then any stored procedures that need to use bcp get signed with that certificate. This means the bcp functionality can take place inside the stored procedure, but users wouldn't be able to call xp_cmdshell directly themselves.
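A rough sketch of that module-signing setup (all names and the password are hypothetical, and the procedure dbo.ExportViaBcp is assumed to exist):

```sql
-- In master: create the certificate, a login from it, and grant
-- EXECUTE on xp_cmdshell to that login only.
USE master;
CREATE CERTIFICATE XpCmdShellCert
    ENCRYPTION BY PASSWORD = 'StrongP@ssw0rd!'
    WITH SUBJECT = 'Sign procedures that call xp_cmdshell';
CREATE LOGIN XpCmdShellLogin FROM CERTIFICATE XpCmdShellCert;
GRANT EXECUTE ON xp_cmdshell TO XpCmdShellLogin;

-- In the user database, sign the wrapper procedure. The certificate
-- (with its private key) must also exist here, e.g. via BACKUP
-- CERTIFICATE in master and CREATE CERTIFICATE ... FROM FILE.
USE MyDb;
ADD SIGNATURE TO dbo.ExportViaBcp
    BY CERTIFICATE XpCmdShellCert
    WITH PASSWORD = 'StrongP@ssw0rd!';
```

The signature grants the permission only while the code path runs through the signed procedure, which is what keeps xp_cmdshell out of users' direct reach.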
The other annoyance with using bcp in this manner is when someone populates the table they're trying to export within a transaction and then calls xp_cmdshell to bcp it out within that same transaction.....
June 13, 2013 at 12:34 pm
I just keep laughing at your post, but only because it's completely right and won't fit in here. 😀
We have multiple users using the same login (at least it's not sa or any other sysadmin). No user should call the SPs; everything should be done by an automatic process every night, and the only user intervention is when something fails.
And what keeps me ROFLing is that we're not allowed to use explicit transactions. I don't know why, but that's a silly idea of our DBAs.
June 13, 2013 at 1:01 pm
Well, considering some of the transaction handling I've seen in the past, I could see why someone might get pernickety about them! But I think education is always a better plan than banning the use of something.
If it were me, even if no user should call the SP, I'd enforce it so they can't.
June 13, 2013 at 1:06 pm
Whenever I have to import data more than once, I much prefer BULK INSERT. I create the format file in XML, write the SQL to do the work, and I'm off and running. I have several cases where we receive external data from clients regularly, and the consistency and performance of BULK INSERT have proven themselves repeatedly. I've found it very stable and reliable - perfect for repeated operations.
An added bonus is that it doesn't suffer from the ailment that plagues the basic INSERT statement: "String or binary data would be truncated", with no indication of which field. :crazy: BULK INSERT tells you the row and column that caused the problem.
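The repeatable pattern described above can be sketched like this (paths, table name, and format-file name are assumptions):

```sql
-- BULK INSERT driven by an XML format file, suitable for a
-- scheduled, repeated import of a fixed client feed.
BULK INSERT dbo.ClientFeed
FROM 'C:\Imports\clientfeed.dat'
WITH (
    FORMATFILE = 'C:\Imports\clientfeed.xml',  -- column layout lives here
    ERRORFILE  = 'C:\Imports\clientfeed.err',  -- rejected rows land here
    TABLOCK                                    -- helps bulk-load performance
);
```

Keeping the column layout in the format file means the SQL itself never changes when the feed definition is versioned.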
I know it isn't a one-size-fits-all solution, but it's my favorite approach by far.
June 14, 2013 at 4:04 am
My advice, for what it's worth: stick with SSIS and get the knowledge. I agree it can be a pain (it is frequently a pain), but it has matured a lot in recent iterations, and there's a big online community. I don't know of any other product that will give you the same possibilities.
Also, dtsx files are XML files... if you've got XML expertise, get to know them. This will allow you to make blanket changes that the designer would make painful.
June 14, 2013 at 6:00 am
I prefer bcp over BULK INSERT.
It doesn't require a server-level role grant, the SQL Server service account doesn't have to be able to see the file, and you won't run into account delegation issues.
ETL processes don't belong inside SQL Server; it's much better for everyone if they're scheduled, run, and monitored from another machine.
As David says, if you have the time to learn SSIS then you should try to use it; it offers way more options and possibilities than bcp or BULK INSERT.