June 16, 2004 at 5:21 am
I'm updating a DTS package to include a step to copy a SQL table to a flat file as a backup. Mind you this package already has several such steps, so I know I've done it before.
I define the SQL Database source object.
I define the flat file ("Text File") destination object. I previously exported a copy of the table as a flat file in CSV format so I'd have something to point to in the object navigation. I set the properties to Delimited and the usual delimiters.
I define the Transform Data Task between the SQL source and the flat file. I go into the properties on the task, set the source table, preview it, click on the Destination tab, and I get the Define Columns dialog. It lists all the columns in the source table. I click "Populate from Source", then "Execute". The dialog closes and I'm in the Destination tab for the flat file, but there are no columns defined. It shows the appropriate "table name" (the flat file), but no columns. So, being stubborn, I try again and click the "Define Columns" button. I get an error in MMC.EXE: "The instruction at "0x77c47e2f" referenced memory at "0x00000000"..." and MMC (Enterprise Manager) goes away.
Am I missing something? As I said above this package already has several similar steps so I've built these before and I thought I had the process down cold.
Any thoughts?
June 16, 2004 at 5:27 am
Additional Info:
I went back to one of the steps in the existing job flow that runs successfully every night, and when I click on the Destination tab of the Transform Data task there are no columns!! Where'd they go?? I can execute the step, but when I try to define the columns for the existing, working step I get the same error as above.
HELP!!
June 16, 2004 at 7:49 am
Is the path valid on the machine you are editing it on? When the package runs it will be executed by the job on the calling server. But say the destination points to an E:\ drive, and you are editing on a machine that does not have that drive or a valid path — then it will fail.
June 16, 2004 at 8:01 am
All my paths to the flat files use full UNC paths (\\server\share\directory\file.csv).
I also tried going to the machine on which the DTS job runs and using standard Windows paths (D:\directory\file.csv) with the same result....
June 16, 2004 at 10:29 am
I tried it and it worked fine.
Have you recently applied a service pack to the server?
Alternatively, the dts package may be corrupt.
Have you tried the same task in a new package?
Hope this gives you a pointer in the right direction
June 16, 2004 at 10:39 am
This has happened to me several times. CHANGING an existing job does not work that well. Many times I have deleted the step, saved the DTS job, reopened it, and recreated the step, and all seems to work. It seems to be a 'feature' from MS. Try that.
June 16, 2004 at 10:54 am
All good ideas. All things I've tried. The problem happens with two similar packages on different SQL Servers. It happens from various client platforms including both XP and Win2K. I've tried just creating a new job and setting up a single export to flat file step and I get the same problem.
Oddly enough I can use EM's "All Tasks->Export" to create a package that exports a table to a flat file, save that package and then go into DTS designer and all the properties are right, I can change them and everything.
Very odd.....
June 17, 2004 at 4:52 am
Check your service pack settings and MDAC version on both servers. It sounds like it is pointing in that direction somehow.
June 17, 2004 at 5:12 am
I have that question in to the SQL SA group. They may have changed something and not told us...
I'll keep everyone posted.
March 16, 2005 at 8:33 am
I'm encountering the same problem when exporting to a flat file. I need to reconfigure the transform quite often, so using the Import/Export wizard each time to reconfigure is not ideal for me. I've been searching online for a solution but haven't found a definite one yet.
I did figure out a work-around. It seems like I only run into the error when there is a high number of destination columns, and the destination column names are collectively too long. So instead I rename the destination columns to 'A', 'B', 'C', and so forth... by doing so, the error doesn't happen anymore.
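To make the renaming workaround concrete, here is an illustrative sketch outside of DTS — the table, column, and file names are hypothetical, and Python's sqlite3 stands in for the SQL Server source. The idea is the same: write single-letter header names in place of the long originals so the collective header length stays short.

```python
import csv
import sqlite3
import string

# Hypothetical example: an in-memory table standing in for the SQL source.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (CustomerAccountNumber TEXT, "
    "OrderFulfillmentTimestamp TEXT, ShippingWarehouseLocation TEXT)"
)
conn.execute("INSERT INTO orders VALUES ('C001', '2004-06-16', 'East')")

cur = conn.execute("SELECT * FROM orders")
# Rename the destination columns to 'A', 'B', 'C', ... as in the workaround.
short_names = list(string.ascii_uppercase[: len(cur.description)])

with open("orders_backup.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(short_names)       # short header row instead of long names
    writer.writerows(cur.fetchall())   # data rows are unchanged
```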
Hope this helps.
June 20, 2005 at 12:29 am
I am glad to see it is not just me having this problem. It seems to be a very strange bug in the Enterprise Manager UI.
You can use the 'long' column names in your select statement if you use an 'AS' clause to alias them as shorter output names. However, as soon as you drop the 'AS' clause and redo the 'execute' part of populating the column names, it craps itself. This is only a problem if you want the proper headings in the first line of your text output file.
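The aliasing trick can be illustrated like this (hypothetical names, with sqlite3 standing in for the SQL Server source): the `AS` clause shortens only the output column names, not the underlying table.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE t (VeryLongDescriptiveColumnNameOne TEXT, "
    "VeryLongDescriptiveColumnNameTwo TEXT)"
)
conn.execute("INSERT INTO t VALUES ('x', 'y')")

# Alias the long names down to short output names with AS.
cur = conn.execute(
    "SELECT VeryLongDescriptiveColumnNameOne AS c1, "
    "VeryLongDescriptiveColumnNameTwo AS c2 FROM t"
)
# The result set now exposes only the short aliases.
out_names = [d[0] for d in cur.description]
print(out_names)
```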
One possible workaround (which I haven't tried yet) is:
1. do not select to use headings in the first row of the output file
2. select the column names as text constants in your select statement
3. UNION this with the results of the real select statement, using the AS clause to shorten the column names
There may be problems with data types not matching in the select statement though... in which case you can CAST all data types as text. Bit of a pain in the arse though!
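The three steps above can be sketched as follows — illustrative only, with hypothetical names and sqlite3 standing in for the SQL Server source; in the real DTS task the query would be the T-SQL of the Transform Data source.

```python
import csv
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE sales (LongColumnNameAmount INTEGER, LongColumnNameRegion TEXT)"
)
conn.execute("INSERT INTO sales VALUES (100, 'East'), (200, 'West')")

# Steps 2 and 3: select the real column names as text constants, then UNION
# with the data rows, CASTing everything to text so the types match.
query = """
    SELECT 'LongColumnNameAmount' AS a, 'LongColumnNameRegion' AS b
    UNION ALL
    SELECT CAST(LongColumnNameAmount AS TEXT), CAST(LongColumnNameRegion AS TEXT)
    FROM sales
"""
rows = conn.execute(query).fetchall()

# Step 1: write the file with no separate header row -- the first
# "data" row from the UNION is the header.
with open("sales_backup.csv", "w", newline="") as f:
    csv.writer(f).writerows(rows)
```

Note the `UNION ALL` rather than a plain `UNION`: a plain `UNION` deduplicates (and may reorder) rows, which could move the header row out of first position.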