Excel Import

  • I have to design a stored procedure that imports an Excel file (don't ask why; the boss wants it that way). I have already been successful importing CSV files with BULK INSERT. I've heard I can do it with OPENDATASOURCE and OPENROWSET. Any hints?

    Please post an example.

  • See the topic in SQL BOL, but here is an example of each.

    OPENDATASOURCE

    SELECT *
    FROM OPENDATASOURCE('Microsoft.Jet.OLEDB.4.0',
        'Data Source="c:\Finance\account.xls";User ID=Admin;Password=;Extended Properties=Excel 5.0')...xactions

    OPENROWSET

    SELECT o.*
    FROM OPENROWSET('Microsoft.Jet.OLEDB.4.0',
        'Excel 5.0;Database=c:\Finance\account.xls', xactions)
    AS o

    "Don't roll your eyes at me. I will tape them in place." (Teacher on Boston Public)

  • Thanks pal

  • Unfortunately, this SQL via OPENROWSET or OPENDATASOURCE produces an error:

    OLE DB provider 'Microsoft.Jet.OLEDB.4.0' reported an error. The provider did not give any information about the error.

    Any thoughts?

  • As an aside, can anyone explain why someone would require a stored procedure when it's so easy to use DTS? Could it be portability?

    Don

    http://www.biadvantage.com

  • Try this version - we use it successfully to import an Excel file:

    SELECT *
    FROM OPENROWSET
        ('MSDASQL',                              -- provider name
         'Driver=Microsoft Excel Driver (*.xls);
          DBQ=xxxx.xls',                         -- data source; file name
         'SELECT * FROM [sheet1$]') AS NewRates  -- sheet name goes in here

  • The way I did it was to create a linked server pointing to the Excel file, and a stored procedure to insert the rowset into the table.
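    As a rough sketch of that linked-server approach (the server name, target table, and sheet name below are hypothetical, borrowing the account.xls path from the earlier examples):

    ```sql
    -- Register the workbook as a linked server (SQL Server 2000 / Jet 4.0)
    EXEC sp_addlinkedserver
        @server     = 'ExcelSource',           -- hypothetical linked server name
        @srvproduct = 'Jet 4.0',
        @provider   = 'Microsoft.Jet.OLEDB.4.0',
        @datasrc    = 'c:\Finance\account.xls',
        @provstr    = 'Excel 5.0'

    -- Excel has no real logins; map everyone to Admin with a blank password
    EXEC sp_addlinkedsrvlogin 'ExcelSource', 'false', NULL, 'Admin', NULL

    -- The proc then just copies the rowset into the table
    INSERT INTO dbo.Xactions
    SELECT * FROM ExcelSource...xactions
    ```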

  • In answer to landrake: in my case, I go with procs over DTS at every opportunity, and yes, it's mostly for portability, but there are definite performance gains to be had as well. In most cases, DTS is used to perform functions not available natively in SQL, or to simplify the process if it does exist. As an example, exporting a file in Excel format used to be performed by BCP to a tab-delimited file; now we have that function as a DTS ability. I can tell you from my experience that the BCP method is faster, with MUCH less overhead on the system, than the DTS ability to do the same thing.

    I also see DTS as having many moving parts, meaning there's much more to go wrong. Up until 2000, DTS just wasn't a robust enough solution for me to rely on it too heavily. With this new version, I'm finding that it's much more usable and not quite so buggy, but what ER software handles it? How do you integrate the documentation of it into your schema docs? Don't get me wrong, I think it's a good solution in some cases, but I have yet to find anything it does that cannot be accomplished in a more scalable and robust manner by someone with the skill to do it. I see it as a tool to be used to simplify things, or a crutch, if you will.

    I worked on a job where they had hundreds of DTS packages written in 7.0 that had to be rewritten for a new server, because the connection information was stored in the packages. A very bad practice to start with, in my opinion, but it was the way they were done. I do not want to do that again. I don't blame DTS, but the DBA who did it, and in rewriting the processes I found that in most cases it was unnecessary anyway. In the end, I had four or five procs, and a handful of DTS packages to perform things I tested and found would be better in DTS.

    On talking to the DBA about it, he explained that although he wasn't able to write the procs to do it, DTS had a nice GUI that allowed him to. He had no clue, but because DTS was easy, it allowed him to perform functions he otherwise wouldn't be able to. OMG, if that doesn't sound like a recipe for disaster, I don't know what does. I would argue for learning the required skills rather than jumping into DTS because it's easy.

    Over the years I have asked DBAs I worked with what their reason for using DTS was, and the answer I've received most often was that it was easier than coding. You even mention the ease of use in your comment. In my mind, that's not a legitimate answer. That's like saying that you don't need DBA skills because Enterprise Manager does it. Try to create a unioned view in EM, or use a CASE statement. And I'll put my indexes up against the Index Wizard's any day. I guess what it comes down to is whether you're willing to rely on wizards and easy stuff, or whether you're willing to learn to perform it without them.

  • Scorpion, thanks for the great explanation. I also have converted DTS packages to BCP. Not only is it a little faster, but I hate the "black box" of DTS.

    You mentioned that storing connection info in the packages was a questionable practice. Where else can it be stored?

    Don

    http://www.biadvantage.com

  • When it's needed, I use a secured table in my DBA database to hold connection information, and a Dynamic Properties task to pass it. On password changes, I update a row rather than open DTS packages and change connection information. It's a little more complicated writing the package to start with, but when you're looking at hundreds of these, changing connection info can become a major job in and of itself.
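    A minimal sketch of such a secured table (the names and columns here are made up; the real thing would restrict SELECT rights to the DBA role and ideally not store passwords in plain text):

    ```sql
    CREATE TABLE dbo.ConnectionInfo (
        ConnName    sysname      NOT NULL PRIMARY KEY,  -- logical name the packages look up
        ServerName  sysname      NOT NULL,
        LoginName   sysname      NOT NULL,
        Pwd         varchar(128) NOT NULL,              -- secure/encrypt this in practice
        LastUpdated datetime     NOT NULL DEFAULT GETDATE()
    )

    -- A password change is then one UPDATE instead of editing hundreds of packages
    UPDATE dbo.ConnectionInfo
    SET Pwd = 'newpassword', LastUpdated = GETDATE()
    WHERE ConnName = 'Finance'
    ```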

    Another way of going about it is to create linked server connections on the fly using procs and mapped users. In that case, your user information can be static but mapped to the proper security on connection; your DTS can then reference the newly created connection, and you destroy the connection and alias on completion. Again, I use a secured table to store the connection and user info. You can create a linked server connection to practically any data source this way.
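    The on-the-fly version might look roughly like this inside a proc (the alias, server, and credentials below are placeholders; in practice they would come from the secured table):

    ```sql
    -- Create the connection, use it, then tear it down
    EXEC sp_addlinkedserver
        @server     = 'TempLink',        -- hypothetical throwaway alias
        @srvproduct = '',
        @provider   = 'SQLOLEDB',
        @datasrc    = 'REMOTESERVER'     -- looked up from the secured table

    EXEC sp_addlinkedsrvlogin
        @rmtsrvname  = 'TempLink',
        @useself     = 'false',
        @locallogin  = NULL,             -- map all local users
        @rmtuser     = 'someuser',
        @rmtpassword = 'somepwd'

    -- ... DTS or queries reference TempLink here ...

    EXEC sp_dropserver 'TempLink', 'droplogins'  -- destroy the connection and mappings
    ```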

    I go to great lengths to keep from hard-coding user information or connection information, because there is no doubt that it WILL change. Chances are, it will be during an emergency. It has been for me.

    I hope my previous explanation didn't seem "dark" or anything. I forget that writing doesn't convey facial expressions, and a lot of what I say is kinda "dry", I'm told, but I don't mean it badly at all.

    I'd be interested in knowing how others go about it as well.

    Edited by - scorpion_66 on 12/31/2002 9:45:57 PM

  • I need to use dynamic file names, and I am inserting the data into dynamically named tables. When I generate the statements and try to EXEC them, I get the error:

    Heterogeneous queries require the ANSI_NULLS and ANSI_WARNINGS options to be set for the connection. This ensures consistent query semantics. Enable these options and then reissue your query.

    I have tried setting them outside of the EXEC and inside of the string that I pass to the EXEC, with no luck.

    Have you all dealt with this error before?

  • I found the answer. You have to turn the options on when creating the procedure.

    SET ANSI_NULLS ON
    SET ANSI_WARNINGS ON
    GO

    CREATE PROCEDURE xxx(...
    GO
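    Putting the pieces together, a dynamic-name import proc might look roughly like this (the proc name, parameters, and the Excel 8.0 provider string are a sketch under assumptions, not a tested drop-in):

    ```sql
    SET ANSI_NULLS ON
    SET ANSI_WARNINGS ON
    GO

    CREATE PROCEDURE dbo.ImportSheet
        @file  varchar(260),   -- full path to the .xls file
        @sheet varchar(128),   -- worksheet name, without the trailing $
        @table varchar(128)    -- target table to create
    AS
    BEGIN
        DECLARE @sql varchar(2000)
        -- Build the heterogeneous query as a string so names can vary at run time
        SET @sql = 'SELECT * INTO ' + @table +
                   ' FROM OPENROWSET(''Microsoft.Jet.OLEDB.4.0'',' +
                   '''Excel 8.0;Database=' + @file + ''',' +
                   '''SELECT * FROM [' + @sheet + '$]'')'
        EXEC (@sql)
    END
    GO
    ```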
