October 17, 2017 at 2:21 am
Hi friends,
I have this set of .txt files
and I want to create a stored procedure that does a BULK INSERT of each of those .txt files into a temp table,
with FIELDTERMINATOR = '|' and ROWTERMINATOR = '\n'.
How do I write the stored procedure and loop through all of the .txt files?
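Roughly, this is the shape of what I'm trying to write (just a sketch of my intent; the folder path, the placeholder columns, and the use of xp_dirtree to list the files are assumptions):

CREATE PROCEDURE dbo.LoadTxtFiles
AS
BEGIN
    -- Placeholder landing table; the real column list depends on the file layout
    CREATE TABLE #Staging (Col1 VARCHAR(255), Col2 VARCHAR(255));

    -- xp_dirtree (undocumented) is one common way to enumerate files in a folder
    DECLARE @Files TABLE (FileName NVARCHAR(260), Depth INT, IsFile BIT);
    INSERT INTO @Files (FileName, Depth, IsFile)
    EXEC master.sys.xp_dirtree 'C:\Import\', 1, 1;

    DECLARE @File NVARCHAR(260), @Sql NVARCHAR(MAX);

    DECLARE FileCursor CURSOR LOCAL FAST_FORWARD FOR
        SELECT FileName FROM @Files WHERE IsFile = 1 AND FileName LIKE '%.txt';

    OPEN FileCursor;
    FETCH NEXT FROM FileCursor INTO @File;
    WHILE @@FETCH_STATUS = 0
    BEGIN
        -- BULK INSERT won't take a variable for the file name, so build the statement dynamically
        SET @Sql = N'BULK INSERT #Staging FROM ''C:\Import\' + @File + N'''
                     WITH (FIELDTERMINATOR = ''|'', ROWTERMINATOR = ''\n'');';
        EXEC sys.sp_executesql @Sql;
        FETCH NEXT FROM FileCursor INTO @File;
    END;
    CLOSE FileCursor;
    DEALLOCATE FileCursor;
END;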
thank you
October 17, 2017 at 2:27 am
Providing a screenshot of the file names isn't really helpful. It tells us nothing about the files' contents. Do all the files have the same format; same columns, same data types, etc.? Can you provide sample file(s)?
Do you have to do this in T-SQL? Would alternatives, such as SSIS, be an option?
Thom~
Excuse my typos and sometimes awful grammar. My fingers work faster than my brain does.
Larnu.uk
October 17, 2017 at 2:42 am
Thom A - Tuesday, October 17, 2017 2:27 AM:
Hi Thom,
All the files have different formats, for example: XLcardproduct, xlcardatm.
For SSIS, I want to process those files in a Foreach Loop, insert them into a temporary table, and then finally insert into the real staging table.
October 17, 2017 at 2:55 am
unas_sasing - Tuesday, October 17, 2017 2:42 AM:
Pictures are also not helpful, I'm afraid; we can't interact with your data.
Will each file always be the same? So, for example, will the file xlcardatm always contain the same columns and data types? As each file is different, I don't think you need to loop through them. Looping is generally done when you have the same task to complete for different items; you're doing different tasks (as they have different definitions) for different files, so each iteration would be different.
Will the files be stored in the same folder each time, overwriting the previous files, or is it a different directory each time?
Why do you want to store them in a temporary table and then a staging table? I'm not sure that'll work with SSIS, as the temporary table will be dropped once the session is closed; SSIS would load the data, disconnect, and your temporary table would be dropped. Load the data directly into your staging table, and then from there into your production table (doing any additional transformations).
For SSIS, you'll need to have x flat file connections, where x is the number of different files you have. Then, you can either build a different data flow for each file, or use one data flow, and have x Sources and Destinations (which do not interact with each other). After your data flow task(s) you'll need to use a T-SQL Task to complete the movement from staging to production.
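For example, the task at the end could run something like this (a sketch only; table and column names are placeholders):

BEGIN TRANSACTION;

-- Move the loaded rows from staging to production; transformations go here
INSERT INTO dbo.Production_xlcardatm (MemberID, FirstName, LastName)
SELECT MemberID, FirstName, LastName
FROM dbo.Staging_xlcardatm;

-- Empty the staging table ready for the next load
TRUNCATE TABLE dbo.Staging_xlcardatm;

COMMIT TRANSACTION;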
Most likely you'll then want to use a File System Task to archive the files. How you archive depends on your setup. For example, if you want the whole process to fail if one load fails, then a single Data Flow Task would be better, and then you can use a Foreach Loop on your File System Task. If you want each file to be run and treated separately, independently of the others, you'd be better off having a separate task and data flow for each file; that way, if one ETL process for a file fails, the others are unaffected and will still archive.
Thom~
Excuse my typos and sometimes awful grammar. My fingers work faster than my brain does.
Larnu.uk
October 17, 2017 at 3:45 am
Thom A - Tuesday, October 17, 2017 2:55 AM:
1. Will the file xlcardatm always contain the same columns and data types? Yes.
2. Will the files be stored in the same folder each time, overwriting the previous files, or is it a different directory each time? The files will be stored in the same folder.
3. Why do you want to store them in a temporary table and then a staging table? I have used SSIS for all of the .txt files, but some of them got errors.
4. The temporary table that I want to use is on the dbo schema.
October 17, 2017 at 3:53 am
unas_sasing - Tuesday, October 17, 2017 3:44 AM:
Temporary tables are stored in TempDB, not on the schema; their names start with a # (or ## if global).
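For example:

-- Local temp table: lives in tempdb, visible only to this session, dropped when it ends
CREATE TABLE #CardAtm (MemberID INT, FirstName VARCHAR(30));

-- Global temp table: visible to all sessions until the last one using it closes
CREATE TABLE ##CardAtm (MemberID INT, FirstName VARCHAR(30));

-- A table on the dbo schema is a permanent table, not a temporary one
CREATE TABLE dbo.CardAtm (MemberID INT, FirstName VARCHAR(30));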
When you tried with SSIS, what were the errors you were getting?
Thom~
Excuse my typos and sometimes awful grammar. My fingers work faster than my brain does.
Larnu.uk
October 17, 2017 at 5:27 pm
What did you do? Download from Equifax?
I'm truly hoping that none of these files actually contain credit card numbers and other PII in plain text.
--Jeff Moden
Change is inevitable... Change for the better is not.
October 17, 2017 at 8:45 pm
I have this flow:
But I always get this error:
And even when I fix the error, I still get the same error.
October 17, 2017 at 8:50 pm
unas_sasing - Tuesday, October 17, 2017 8:45 PM:
This is the flat file source:
"Name" "Data Type" "Precision" "Scale" "Length" "Code Page" "Sort Key Position" "Comparison Flags" "Source Component"
"MemberXID" "DT_I2" "0" "0" "0" "0" "0" "" "Flat File Source"
"MemberProfileXID" "DT_I1" "0" "0" "0" "0" "0" "" "Flat File Source"
"MemberID" "DT_I4" "0" "0" "0" "0" "0" "" "Flat File Source"
"FirstName" "DT_STR" "0" "0" "30" "1252" "0" "" "Flat File Source"
"MiddleName" "DT_STR" "0" "0" "20" "1252" "0" "" "Flat File Source"
"LastName" "DT_STR" "0" "0" "255" "1252" "0" "" "Flat File Source"
"Alias" "DT_STR" "0" "0" "255" "1252" "0" "" "Flat File Source"
"BirthDate" "DT_I4" "0" "0" "0" "0" "0" "" "Flat File Source"
"Email" "DT_STR" "0" "0" "30" "1252" "0" "" "Flat File Source"
"HP" "DT_STR" "0" "0" "14" "1252" "0" "" "Flat File Source"
"Remark" "DT_STR" "0" "0" "19" "1252" "0" "" "Flat File Source"
"Background" "DT_STR" "0" "0" "255" "1252" "0" "" "Flat File Source"
"Photo" "DT_STR" "0" "0" "255" "1252" "0" "" "Flat File Source"
"Signature" "DT_STR" "0" "0" "255" "1252" "0" "" "Flat File Source"
"CreationTimeStamp" "DT_DATE" "0" "0" "0" "0" "0" "" "Flat File Source"
"ModifiedTimeStamp" "DT_DBDATE" "0" "0" "0" "0" "0" "" "Flat File Source"
"Folder" "DT_STR" "0" "0" "255" "1252" "0" "" "Flat File Source"
"InstXID" "DT_I1" "0" "0" "0" "0" "0" "" "Flat File Source"
"AutoDebitAccountNo" "DT_STR" "0" "0" "255" "1252" "0" "" "Flat File Source"
"OverridePresentmentTime" "DT_STR" "0" "0" "255" "1252" "0" "" "Flat File Source"
"OverridePresentmentCycleParam" "DT_I1" "0" "0" "0" "0" "0" "" "Flat File Source"
"OverridePaymentTime" "DT_STR" "0" "0" "255" "1252" "0" "" "Flat File Source"
"OverridePaymentCycleParam" "DT_I1" "0" "0" "0" "0" "0" "" "Flat File Source"
"PresentmentTimeStamp" "DT_DBTIMESTAMP" "0" "0" "0" "0" "0" "" "Flat File Source"
"PaymentTimeStamp" "DT_DBTIMESTAMP" "0" "0" "0" "0" "0" "" "Flat File Source"
"AutoDebit" "DT_I1" "0" "0" "0" "0" "0" "" "Flat File Source"
"OverridePresentment" "DT_I1" "0" "0" "0" "0" "0" "" "Flat File Source"
"OverridePayment" "DT_I1" "0" "0" "0" "0" "0" "" "Flat File Source"
"MemberStatusXID" "DT_I1" "0" "0" "0" "0" "0" "" "Flat File Source"
"CIF" "DT_STR" "0" "0" "255" "1252" "0" "" "Flat File Source"
"MotherName" "DT_STR" "0" "0" "40" "1252" "0" "" "Flat File Source"
October 18, 2017 at 1:45 am
unas_sasing - Tuesday, October 17, 2017 8:50 PM:
Rather than pasting, please attach a copy of the file.
Edit: Also, in your original post you said your files were pipe (|) delimited. The above is clearly tab delimited. Which is it...?
Thom~
Excuse my typos and sometimes awful grammar. My fingers work faster than my brain does.
Larnu.uk
October 18, 2017 at 5:52 am
unas_sasing - Tuesday, October 17, 2017 8:50 PM:
Again, I don't know much about SSIS but, like Thom says, that's NOT pipe delimited. It's TAB separated. Set up your import to use TAB as the delimiter and tick the checkbox that allows you to use a quoted text qualifier.
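If you end up doing it in T-SQL rather than SSIS, the equivalent would be something like this (a sketch; the file and table names are placeholders, and FORMAT = 'CSV' with FIELDQUOTE needs SQL Server 2017 or later — older versions need a format file for the text qualifier):

BULK INSERT dbo.CardAtmStaging
FROM 'C:\Import\xlcardatm.txt'
WITH (
    FORMAT          = 'CSV',  -- enables quoted-field handling (SQL Server 2017+)
    FIELDQUOTE      = '"',    -- the quoted text qualifier
    FIELDTERMINATOR = '\t',   -- TAB, not pipe
    ROWTERMINATOR   = '\n',
    FIRSTROW        = 2       -- skip the header row
);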
--Jeff Moden
Change is inevitable... Change for the better is not.
October 19, 2017 at 1:50 pm
And just assuming that you have your data files properly described, the error you are getting clearly indicates that there's data in your file that doesn't fit into your data type for that column. Your only choice is to either manually correct that data, or change the data type to a character-based one and then create an error-checking process to validate that every value coming in is valid, and if not, flag it as an error row and mark the staging table record in some fashion as not valid or in need of correction. There's a lot more work involved, because then you also need a business process to follow up with the data provider and either get a corrected file or a data correction that someone has to then apply to the staging table data. Your load to production would also have to be on a regular basis, and would have to take into account that some records are not ready for production load.
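A sketch of what that validation could look like, assuming an all-character staging table with an IsValid flag (the tables, columns, and flag names here are illustrative, not an actual design):

-- Flag rows whose values won't convert to the target data types
-- (assumes IsValid defaults to 1 when the row is loaded)
UPDATE s
SET IsValid = 0,
    ErrorReason = CASE
                      WHEN TRY_CONVERT(INT, s.MemberID) IS NULL
                          THEN 'MemberID is not a valid INT'
                      WHEN TRY_CONVERT(DATETIME, s.CreationTimeStamp) IS NULL
                          THEN 'CreationTimeStamp is not a valid DATETIME'
                  END
FROM dbo.StagingRaw AS s
WHERE TRY_CONVERT(INT, s.MemberID) IS NULL
   OR TRY_CONVERT(DATETIME, s.CreationTimeStamp) IS NULL;

-- Only rows that pass validation move on
INSERT INTO dbo.Staging (MemberID, CreationTimeStamp)
SELECT TRY_CONVERT(INT, s.MemberID),
       TRY_CONVERT(DATETIME, s.CreationTimeStamp)
FROM dbo.StagingRaw AS s
WHERE s.IsValid = 1;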
Steve (aka sgmunson) 🙂 🙂 🙂
Rent Servers for Income (picks and shovels strategy)
October 19, 2017 at 5:24 pm
sgmunson - Thursday, October 19, 2017 1:50 PM:
The "overflowed the datatype" problem will also occur if you have the wrong delimiter selected because it tries to load the whole shebang into the first column.
--Jeff Moden
Change is inevitable... Change for the better is not.
October 20, 2017 at 1:35 pm
Jeff Moden - Thursday, October 19, 2017 5:24 PM:
Hi guys,
Thank you so much, I've found the solution for this. I created a new table named dbo.beforestaging,
and then created a stored procedure to BULK INSERT each text file into that table.
After that, I use SSIS to load from dbo.beforestaging into dbo.afterstaging, which has the final column design.
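The procedure is roughly like this (a simplified sketch; the full column list is longer, and the all-VARCHAR columns, file path, and procedure name here are illustrative):

-- Wide, all-character landing table so BULK INSERT never fails on data type conversion
CREATE TABLE dbo.beforestaging
(
    MemberXID VARCHAR(255),
    MemberID  VARCHAR(255),
    FirstName VARCHAR(255),
    LastName  VARCHAR(255)
    -- ... one VARCHAR column per field in the file
);
GO

CREATE PROCEDURE dbo.Load_beforestaging
    @FilePath NVARCHAR(260)
AS
BEGIN
    -- BULK INSERT needs a literal file name, so the statement is built dynamically
    DECLARE @Sql NVARCHAR(MAX) =
        N'BULK INSERT dbo.beforestaging FROM ''' + @FilePath + N'''
          WITH (FIELDTERMINATOR = ''\t'', ROWTERMINATOR = ''\n'', FIRSTROW = 2);';
    EXEC sys.sp_executesql @Sql;
END;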
October 20, 2017 at 1:57 pm
unas_sasing - Friday, October 20, 2017 1:35 PM:
Glad you got it worked out.
Steve (aka sgmunson) 🙂 🙂 🙂
Rent Servers for Income (picks and shovels strategy)