November 19, 2008 at 7:31 pm
Hi,
I would like to ask for advice regarding mass loading of data from a text file (called a batch file) into a maintenance database. The batch file contains Add (A), Delete (D), and Modify (M) items, with comma-separated values for the per-column data. Each line also carries a code indicating which table to Add to, Modify, or Delete from.
The batch file will look like this:
A,1,1,12,c:\test.txt,22,the quick brown fox,....
A,1,1,12,c:\test.txt,22,the quick brown fox,....
A,1,1,12,c:\test.txt,22,the quick brown fox,....
D,1,1,12,c:\test.txt,22,the quick brown fox,....
D,1,1,12,c:\test.txt,22,the quick brown fox,....
A,1,1,12,c:\test.txt,22,the quick brown fox,....
A,1,1,12,c:\test.txt,22,the quick brown fox,....
A,1,1,12,c:\test.txt,22,the quick brown fox,....
M,1,1,12,c:\test.txt,22,the quick brown fox,....
M,1,1,12,c:\test.txt,22,the quick brown fox,....
M,1,1,12,c:\test.txt,22,the quick brown fox,....
M,1,1,12,c:\test.txt,22,the quick brown fox,....
M,1,1,12,c:\test.txt,22,the quick brown fox,....
A,1,1,12,c:\test.txt,22,the quick brown fox,....
What I've considered is using SSIS for this, but the problem is that SSIS won't run on SQL Server 2005 Express, which is the only edition we have available for development.
Any thoughts and suggested solutions are highly appreciated.
Thanks.
November 19, 2008 at 7:55 pm
Hi,
You could look at using BULK INSERT, specifying the column and row delimiters...
This could be useful:
http://blog.sqlauthority.com/2008/02/06/sql-server-import-csv-file-into-sql-server-using-bulk-insert-load-comma-delimited-file-into-sql-server/
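To make the suggestion concrete, here is a minimal sketch of BULK INSERT against the batch file format shown above. The staging table name, column names, and file path are assumptions for illustration, not from the original posts:

```sql
-- Hypothetical staging table matching the batch file layout;
-- table and column names are assumptions for illustration.
CREATE TABLE dbo.BatchStage (
    ActionCode  CHAR(1),        -- A / D / M
    Col1        INT,
    Col2        INT,
    Col3        INT,
    FilePath    VARCHAR(260),
    Col5        INT,
    Description VARCHAR(255)
);

-- Load the whole file in one shot: comma-delimited fields,
-- one row per line.
BULK INSERT dbo.BatchStage
FROM 'C:\batch\batchfile.txt'   -- path is an assumption
WITH (
    FIELDTERMINATOR = ',',
    ROWTERMINATOR   = '\n'
);
```

Once the rows are staged, the A/D/M action code can drive separate INSERT/DELETE/UPDATE statements against the real tables.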
November 19, 2008 at 7:59 pm
P_DBA (11/19/2008)
Hi, You could look at using BULK INSERT specifying column and row delimiters...
this could be useful:
http://blog.sqlauthority.com/2008/02/06/sql-server-import-csv-file-into-sql-server-using-bulk-insert-load-comma-delimited-file-into-sql-server/
Before doing this, you can look at changing the recovery model to BULK_LOGGED.
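A minimal sketch of that recovery-model switch, assuming a database named MaintenanceDB (the name is illustrative):

```sql
-- Switch to bulk-logged recovery so the bulk load is minimally logged,
-- then switch back once the load finishes.
ALTER DATABASE MaintenanceDB SET RECOVERY BULK_LOGGED;

-- ... run the BULK INSERT here ...

ALTER DATABASE MaintenanceDB SET RECOVERY FULL;
```

Note that after switching back to FULL you should take a log backup, since point-in-time recovery is limited across the bulk-logged window.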
November 19, 2008 at 9:26 pm
Wow, thank you for this idea, though I am wondering if this is the fastest and safest way to do a mass load of data from a text file. What if I need to check the validity of each column's data (like PK constraints, etc.) after the mass load? Is there any better solution for that?
Is SSIS of any use at all here?
The information you gave is really helpful, thanks for that.
November 19, 2008 at 10:03 pm
Hi,
Yes, this is the fastest way to load bulk data into a table. It works smoothly when the target table and the input file have the same number of columns. However, if the columns differ, you should probably think about using a format file, or you can create a view with only the columns that you need and bulk import the data into the view.
BULK INSERT only inserts the data; it does not validate your data.
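A sketch of the view approach, for when the target table has extra columns (say an IDENTITY key or audit column) that the file does not supply. All object names here are assumptions:

```sql
-- View exposing only the columns present in the batch file,
-- so BULK INSERT lines up without a format file.
CREATE VIEW dbo.vBatchLoad AS
SELECT ActionCode, Col1, Col2, Col3, FilePath, Col5, Description
FROM dbo.BatchTarget;
GO

-- Bulk import through the view; columns the view omits
-- (IDENTITY, defaults) are filled in by the table itself.
BULK INSERT dbo.vBatchLoad
FROM 'C:\batch\batchfile.txt'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n');
```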
If you are creating a new table, you have two options:
You can create a temp table with the PK and then load the data, OR
you can create the temp table, load the data, and then create the PK.
Comparing the two: a bulk load into an indexed table is slower than into a non-indexed table.
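The second option (load first, constrain afterwards) can be sketched like this. It also gives you a chance to validate the data before the PK goes on; key columns and names are assumptions for illustration:

```sql
-- Load into an unindexed heap first (fastest path).
BULK INSERT dbo.BatchStage
FROM 'C:\batch\batchfile.txt'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n');

-- Check for would-be PK violations before adding the constraint.
SELECT Col1, Col2, COUNT(*) AS DupeCount
FROM dbo.BatchStage
GROUP BY Col1, Col2
HAVING COUNT(*) > 1;

-- If the check comes back empty, add the PK afterwards.
ALTER TABLE dbo.BatchStage
    ADD CONSTRAINT PK_BatchStage PRIMARY KEY (Col1, Col2);
```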
November 20, 2008 at 8:34 am
P_DBA (11/19/2008)
Hi, Yes, this is the fastest way to load bulk data into a table. It works smoothly when the target table and the input file have the same number of columns. However, if the columns differ, you should probably think about using a format file, or you can create a view with only the columns that you need and bulk import the data into the view.
BULK INSERT only inserts the data; it does not validate your data.
If you are creating a new table, you have two options:
You can create a temp table with the PK and then load the data, OR
you can create the temp table, load the data, and then create the PK.
Comparing the two: a bulk load into an indexed table is slower than into a non-indexed table.
Is there a way to bulk load the data in batches? Any ideas on this?
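BULK INSERT does support batched commits through its BATCHSIZE option: each batch is committed as its own transaction, so a failure only rolls back the batch in flight rather than the whole load. A sketch, with names and the batch size chosen for illustration:

```sql
-- Commit every 10,000 rows as a separate transaction.
BULK INSERT dbo.BatchStage
FROM 'C:\batch\batchfile.txt'
WITH (
    FIELDTERMINATOR = ',',
    ROWTERMINATOR   = '\n',
    BATCHSIZE       = 10000
);
```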
Thanks for the useful information and help.