November 30, 2005 at 10:53 am
I just posted to the wrong category, so here I am. I don't mean to have multiple postings.
Hi all,
I used DTS to create a package to do auto daily import.
The package is very simple: delete all data in the table, then import data from a text file into the table. Everything looks OK except the last column, Reissue. The char 'Y' did not import for three rows (only three rows have the flag 'Y'; all others have NULL), so those three rows still contain NULL. The column definition looks like the following:
[Reissue] [char] (1) NULL
But when I manually delete the data in the table and import the data (DTS-->all tasks-->import data) from the text file into the table, I do see the char 'Y' in the last column for those three rows.
What's going on? I am completely confused.
Can someone help? Please.
Betty
November 30, 2005 at 11:31 am
When you are running your manual import, are you using the same package as your job, or are you using the wizard? Is this a bulk import job? Does it use a format file?
November 30, 2005 at 11:36 am
Hi Journeyman,
I used the wizard. It uses a format file, with the first row containing the column names and each column separated (delimited) by "|".
I don't know if that makes any difference.
Betty
November 30, 2005 at 11:41 am
When you are manually doing your import, are you using the wizard again, or running the actual scheduled job?
Can you post your table DDL, a copy of your format file and a few rows of sample data?
November 30, 2005 at 11:56 am
Yes, when I do the import manually, I use the wizard again.
DDL:
CREATE TABLE [TIS_StudyUSA_DataEase_test] (
[AgentCode] [char] (8) COLLATE Chinese_Taiwan_Stroke_CI_AS NULL ,
[InsuredSurname] [char] (20) COLLATE Chinese_Taiwan_Stroke_CI_AS NOT NULL ,
[InsuredFirstname] [char] (20) COLLATE Chinese_Taiwan_Stroke_CI_AS NOT NULL ,
[InsuredBirthday] [datetime] NOT NULL ,
[Street] [char] (40) COLLATE Chinese_Taiwan_Stroke_CI_AS NULL ,
[City] [char] (25) COLLATE Chinese_Taiwan_Stroke_CI_AS NULL ,
[State] [char] (2) COLLATE Chinese_Taiwan_Stroke_CI_AS NOT NULL ,
[Zip] [char] (15) COLLATE Chinese_Taiwan_Stroke_CI_AS NULL ,
[Email] [char] (50) COLLATE Chinese_Taiwan_Stroke_CI_AS NULL ,
[DayPhone] [char] (15) COLLATE Chinese_Taiwan_Stroke_CI_AS NULL ,
[EveningPhone] [char] (15) COLLATE Chinese_Taiwan_Stroke_CI_AS NULL ,
[InsuredArrivalDate] [datetime] NOT NULL ,
[PassportNumber] [char] (15) COLLATE Chinese_Taiwan_Stroke_CI_AS NOT NULL ,
[PassportCountry] [char] (2) COLLATE Chinese_Taiwan_Stroke_CI_AS NOT NULL ,
[School] [varchar] (30) COLLATE Chinese_Taiwan_Stroke_CI_AS NOT NULL ,
[Visa] [char] (15) COLLATE Chinese_Taiwan_Stroke_CI_AS NULL ,
[CurrentExpDate] [datetime] NOT NULL ,
[PlanChosen] [char] (1) COLLATE Chinese_Taiwan_Stroke_CI_AS NULL CONSTRAINT [DF__TIS_Study__PlanC__27AED5D5] DEFAULT (''),
[EffectiveDate] [datetime] NOT NULL ,
[SpouseSurname] [char] (20) COLLATE Chinese_Taiwan_Stroke_CI_AS NULL ,
[SpouseFirstname] [char] (20) COLLATE Chinese_Taiwan_Stroke_CI_AS NULL ,
[SpouseBirthday] [datetime] NULL ,
[SpouseArrivalDate] [datetime] NULL ,
[SpouseEffDate] [datetime] NULL ,
[SpouseExpDate] [datetime] NULL ,
[Child1Surname] [char] (20) COLLATE Chinese_Taiwan_Stroke_CI_AS NULL ,
[Child1Firstname] [char] (20) COLLATE Chinese_Taiwan_Stroke_CI_AS NULL ,
[Child1Birthday] [datetime] NULL ,
[Child1ArrivalDate] [datetime] NULL ,
[Child1EffDate] [datetime] NULL ,
[Child1ExpDate] [datetime] NULL ,
[Child2Surname] [char] (20) COLLATE Chinese_Taiwan_Stroke_CI_AS NULL ,
[Child2Firstname] [char] (20) COLLATE Chinese_Taiwan_Stroke_CI_AS NULL ,
[Child2Birthday] [datetime] NULL ,
[Child2ArrivalDate] [datetime] NULL ,
[Child2EffDate] [datetime] NULL ,
[Child2ExpDate] [datetime] NULL ,
[Child3Surname] [char] (20) COLLATE Chinese_Taiwan_Stroke_CI_AS NULL ,
[Child3Firstname] [char] (20) COLLATE Chinese_Taiwan_Stroke_CI_AS NULL ,
[Child3Birthday] [datetime] NULL ,
[Child3ArrivalDate] [datetime] NULL ,
[Child3EffDate] [datetime] NULL ,
[Child3ExpDate] [datetime] NULL ,
[total] [int] NOT NULL CONSTRAINT [DF__TIS_Study__total__28A2FA0E] DEFAULT (0),
[AMDoption] [varchar] (5) COLLATE Chinese_Taiwan_Stroke_CI_AS NULL ,
[inoroutbound] [varchar] (5) COLLATE Chinese_Taiwan_Stroke_CI_AS NOT NULL CONSTRAINT [DF__TIS_Study__inoro__29971E47] DEFAULT ('in'),
[CertificateNumber] [char] (7) COLLATE Chinese_Taiwan_Stroke_CI_AS NOT NULL CONSTRAINT [DF__TIS_Study__Certi__2A8B4280] DEFAULT (8888888),
[NotRenewable] [char] (1) COLLATE Chinese_Taiwan_Stroke_CI_AS NULL CONSTRAINT [DF__TIS_Study__NotRe__2B7F66B9] DEFAULT (''),
[Reissue] [char] (1) COLLATE Chinese_Taiwan_Stroke_CI_AS NULL
) ON [PRIMARY]
GO
Sample data (you will see 'Y' in the last column for a couple of records):
32701|Sun|Yi|11/05/86|18760 E. Colima Road|Rowland Heights|CA|91748|YiSun@sbcglobal.net|(626)-964-9588||10/05/05|1234|China|Oxford School|J1|07/04/06|A|10/05/05|||||||||||||||||||||||||351.00||in|0001387||
32701|Okeke|Susan|01/13/84|3902 College Main St.|Bryan|TX|77081|mary.wong@travelinsure.com|(979)-574-6602||03/22/06|006567047|NI|Texas A&M University|F1|06/21/06|A|03/22/06|||||||||||||||||||||||||117.00||out|0001400||
340087|Gutierrez|Charlene|05/19/74|43-14 40th St.|Long Island City|NY|11104|mary.wong@travelinsure.com|(718)-472-1051||10/22/05|00TRK-830417|TU|CNY Graduate|J1|12/21/05|A|10/22/05|||||||||||||||||||||||||94.00||in|0001401||Y
100295|Sun|Betty|05/07/81|2160 Lundy Ave.|San Jose|CA|95131||(408)-435-8989||05/20/06|00200328607|TW|Silicon Valley|F1|07/19/06|A|05/20/06|||||||||||||||||||||||||92.00|A|in|0001402||Y
32701|Chen|Jenica|09/15/80|47900 Fremont Blvd.|Fremont|CA|94306|JenicaChen@sbcglobal.net|(650)-251-9515||10/17/05|123456789|TW|MSU|B-1|12/16/05|A|10/17/05|Lu|Lu|04/07/78|10/17/05|10/17/05|12/16/05|||||||||||||||||||488.00|A|in|0010010||
00000|Si|Bob|03/07/86|193 Hemlock Ct.|Palo Alto|CA|94306|BobSi@sbcglobal.net|(650)-222-2222||10/20/05|12345|US|Oxford University|Other|07/19/06|B|10/20/05|||||||||||||||||||||||||468.00|B|out|0010011||
162000|Hu|Qi Jia|02/22/89|18760 E. Colima Road|Rowland Heights|CA|91748||(626)-964-9588||10/18/05|G04001195|CN|Oxford School|F-1|09/17/06|A|10/18/05|||||||||||||||||||||||||429.00||in|0010012||
November 30, 2005 at 12:21 pm
What I was getting at when asking whether you are using the wizard again is that the package the wizard creates for your manual import must be different from the package you have set up to run from your job. The DTS wizard will allow you to save your package after your manual run; I suggest that you do this. I would then run the package from DTS to make sure that it still works. If so, compare it to your original package and you should see a difference somewhere. I would guess that if your original DTS package is failing when run as a job, it would also fail if you executed it from DTS. This seems to be a data/formatting problem. Try this. In the meantime, can you post your format (.fmt) file?
November 30, 2005 at 12:31 pm
What is a .fmt file? I am just starting to learn DTS. Is it a generated SQL script? I saw somewhere that you can generate it, but I forgot where.
Betty
November 30, 2005 at 1:06 pm
If you don't know what a format file is, you are probably not using one. That's OK; it can be done many ways. Don't worry about the format file. Follow my suggestion of having the wizard save off your import package. Once your package is created, truncate your table data and try to run the package independent of the wizard. In Enterprise Manager, go to the DTS folder and look under Local Packages. You should see the package that the wizard created for you. Execute the package and verify that your data is OK. If it is, use this package as your daily scheduled package instead of the original package that is not correctly importing your last column.
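For reference, a bcp-style (non-XML) format file for a pipe-delimited feed like this one looks roughly like the sketch below. This is an abbreviated, hypothetical example showing only the first two columns and the last one; a real file would list every column in order, and the data lengths would match the actual DDL.

```
8.0
3
1   SQLCHAR   0   8    "|"      1   AgentCode        Chinese_Taiwan_Stroke_CI_AS
2   SQLCHAR   0   20   "|"      2   InsuredSurname   Chinese_Taiwan_Stroke_CI_AS
3   SQLCHAR   0   1    "\r\n"   3   Reissue          Chinese_Taiwan_Stroke_CI_AS
```

Each row gives the host field order, data type, prefix length, maximum data length, field terminator, target column order, target column name, and collation. The "8.0" on the first line is the format-file version for SQL Server 2000.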
Just curious, how big do you anticipate these daily imports getting? Who created the original package that is not working correctly, the wizard or a human?
November 30, 2005 at 1:34 pm
Hi Journeyman,
It worked out this time.
I deleted the table data with a SQL command first, then followed your instructions: I manually recreated the DTS package through the wizard, saved it, and ran it immediately without the delete step. I checked the data in the table and the 'Y' is there. Then I clicked package-->design package, added the SQL command to delete the table data, created the workflow, and saved the package. (I also chose package-->properties and enabled logging for the package.)
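The delete step Betty describes can be a one-line Execute SQL task; a minimal sketch against the table from the DDL above:

```sql
-- Run as an Execute SQL task before the import step in the DTS package.
-- TRUNCATE is faster than DELETE for clearing the whole table;
-- DELETE FROM works too if you lack permission to TRUNCATE.
TRUNCATE TABLE [TIS_StudyUSA_DataEase_test]
```

In the DTS designer, an "on success" workflow arrow from this task to the data pump task ensures the table is emptied before each import.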
I scheduled the package to run and the 'Y' values are there. But I still don't know how it differs from the package I set up before, which was also created through the wizard.
One very interesting thing: after I created the new package, I went to the SQL Server Agent jobs and tried to set up the new job, but I didn't see the new package there. I had to use the menu action-->all tasks-->manage job categories to manually add the new package before I could schedule it to run. The problem is that I didn't see a log entry for it; I can only see the result through the job history, where I noticed it was successful. But I can see the log for another package. Does that mean only one package can be logged?
Thank you so much for your consistent support!
Betty
November 30, 2005 at 1:52 pm
There is probably a data transformation missing in the original package. Regarding your log file, do you mean the Windows application log? If you open up the scheduled job and go to the 'Notifications' tab, there is an option there that should allow you to have failure/success messages written out to the application log. Change this to notify you whenever the job completes and you should get both success and failure messages in your Windows Application log.
Just a suggestion, but depending on the size of your daily files, you may want to consider creating a Bulk Insert package or using bcp to import your data. These tasks are designed for large inserts and will perform them more efficiently than the package created by the wizard. I recommend looking into these as an option in case your daily files grow too large to import in a timely fashion.
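A sketch of that alternative, using the table and delimiter from this thread. The file path is a placeholder, not the real file, and FIRSTROW = 2 assumes the header row of column names mentioned earlier:

```sql
-- Hedged sketch of a BULK INSERT for the pipe-delimited feed.
-- 'C:\imports\studyusa.txt' is a hypothetical path.
BULK INSERT [TIS_StudyUSA_DataEase_test]
FROM 'C:\imports\studyusa.txt'
WITH (
    FIELDTERMINATOR = '|',
    ROWTERMINATOR   = '\n',
    FIRSTROW        = 2    -- skip the header row of column names
)
```

This could be wrapped in an Execute SQL task, preceded by the TRUNCATE step, in place of the wizard-generated data pump task.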
November 30, 2005 at 2:07 pm
I don't know the limit on the number of records DTS can handle.
My record count is around 10,000 for now. It could grow to several tens of thousands later on.
OK, good idea. Initially I didn't pay attention to that option; it was set to the default of notifying only when the job fails. With this option, I can see whenever the job is done.
Betty