Data transformation from an Excel file

  • Dear friends,

    Require some help regarding this -

    I have an Excel file with 6 columns and 100 rows. I want to execute a stored procedure once per row, i.e. 100 times, with the values from the 6 columns of each row as its parameters.

    Kindly let me know the best way to accomplish this through SQL 2000 DTS.

    Many thanks in advance,

    KG



  • Unfortunately DTS is not built to handle data row by row, except for transformation purposes, which it is very good at.

    I cannot think of a way of passing each row into a stored procedure as parameters, and I would guess it would also be hard to troubleshoot if anything untoward occurs.

    As a workaround, I would use the DTS package to import the data into a dummy table. You can then write an Execute SQL task within the package and call it at the end of the import. This task would use a cursor to step through all the rows, calling the procedure for each one. Update the rows in the dummy table as you step through them so that you can find out which row caused an error.
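    A minimal T-SQL sketch of that cursor workaround, assuming the import lands in a hypothetical dbo.DummyImport table with six data columns plus a Processed flag, and a hypothetical procedure dbo.usp_ProcessRow:

    ```sql
    -- Assumes the DTS import loaded the Excel rows into dbo.DummyImport,
    -- which has six data columns plus a Processed flag defaulting to 0.
    DECLARE @c1 int, @c2 int, @c3 int, @c4 int, @c5 int, @c6 int

    DECLARE cur CURSOR LOCAL FOR
        SELECT Col1, Col2, Col3, Col4, Col5, Col6
        FROM dbo.DummyImport
        WHERE Processed = 0

    OPEN cur
    FETCH NEXT FROM cur INTO @c1, @c2, @c3, @c4, @c5, @c6

    WHILE @@FETCH_STATUS = 0
    BEGIN
        -- Call the stored procedure with this row's values
        EXEC dbo.usp_ProcessRow @c1, @c2, @c3, @c4, @c5, @c6

        -- Mark the row as done, so a failure points at the first
        -- row that is still unprocessed
        UPDATE dbo.DummyImport
        SET Processed = 1
        WHERE CURRENT OF cur

        FETCH NEXT FROM cur INTO @c1, @c2, @c3, @c4, @c5, @c6
    END

    CLOSE cur
    DEALLOCATE cur
    ```

    The column types and names are placeholders; adjust them to match the actual spreadsheet and procedure signature.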

    It's all a little messy and time consuming, but maybe someone else can come up with a more viable solution.


    ------------------------------
    The Users are always right - when I'm not wrong!

  • Thanks Jonathan. I don't know if this can be done in any way through ActiveX scripts?

    Thanks,

    kg



  • In theory it is possible to call a stored procedure through ActiveX, but you would have to find a way of looping through a row, calling the SP, looping through the next row, and so on.

    Alternatively, you could apply the logic from the stored procedure inside an ActiveX task, i.e. take the logic out of the SP, so you don't have to call it, and apply it as a transformation in an ActiveX script, which will run for each row without you having to specify a loop. Again, this makes it very hard to check for errors if they occur.

    Without knowing exactly what this SP does, it's hard to say.

    So many ways to skin that cat.


    ------------------------------
    The Users are always right - when I'm not wrong!

  • Without wishing to assume too much about the task, this does sound like a row-based vs. set-based problem. I agree with Jonathan's first post up to a point - bring the data into a table and then process it. However, it should then be possible to process the data as a set - rather than run the SP 100 times, rewrite it to run once.

    Procedural programmers (and I am one) can find it hard to switch between the row-based and set-based paradigms. It is just so easy to forget that code like UPDATE tblA SET colA = colB * colC WHERE colD = colE does what seems like a row-by-row job in one go.
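    To illustrate, a 100-iteration per-row loop can often collapse into one set-based statement. The table and column names below are purely hypothetical, standing in for whatever the SP actually does per row:

    ```sql
    -- Hypothetical per-row logic "Total = Qty * UnitPrice", applied to
    -- every imported row in a single statement instead of 100 EXECs.
    UPDATE t
    SET    Total = i.Qty * i.UnitPrice
    FROM   dbo.TargetTable t
    INNER JOIN dbo.ImportTable i
            ON i.KeyCol = t.KeyCol
    ```

    One statement like this is also far easier to troubleshoot than a cursor loop, since it either succeeds or fails as a whole within one transaction.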

    My experience is that data manipulation by ADO within an ActiveX script is slower than running an SP via an Execute SQL task, and ActiveX transformations are slower still. OK for 100 rows, but not scalable.

    HTH

     

    Bill.

  • KG,

    If I am not misunderstanding your original post, you can do your job easily in DTS.

    1. Create the DTS package (say, called ABC) by importing the .xls into a temp table and saving the package.

    2. Edit ABC in Enterprise Manager -> Data Transformation Services -> Local Packages -> ABC.

    3. Add an "Execute SQL Task" to the design window.

    4. Then you can edit it.

       4.1 Give it Description

       4.2 Point the Connection to the one that connects to the temp table

       4.3 Write your SQL, like "EXEC SP_ABCD", or even a script that declares a cursor and loops.

    5. Add an "On Success" workflow through the menu, connecting from the connection for the temp table to the SQL task from step 3.

    6. Finally, execute the DTS package.

    It is quite flexible to do it in DTS, and the SQL task gives you a lot of potential.

    You can also schedule it if you need to.

    Quite Powerful?

     
