November 23, 2006 at 5:35 am
I want to load data from text files into the db.
I can't use bcp or BULK INSERT alone because I have to do some transformations on the data first.
So I bulk insert into a temp table, then read the temp table row by row...
Is there any other, faster way?
CREATE TABLE #textfile (line varchar(8000))
-- Read the text file into the temp table
BULK INSERT #textfile FROM 'c:\init_newsl.txt'
-- Now read it row by row
DECLARE @oneline varchar(8000)
DECLARE table_cursor CURSOR FOR SELECT line FROM #textfile
OPEN table_cursor
FETCH NEXT FROM table_cursor INTO @oneline
WHILE @@FETCH_STATUS = 0
BEGIN
-- ... transform @oneline and write to the real tables here ...
FETCH NEXT FROM table_cursor INTO @oneline
END
CLOSE table_cursor
DEALLOCATE table_cursor
November 23, 2006 at 6:56 am
DTS.
It will allow you to transform the data during the load.
Far away is close at hand in the images of elsewhere.
Anon.
November 23, 2006 at 7:07 am
Yes, but I don't want to use DTS or SSIS.
November 23, 2006 at 7:30 am
"yes but i don't want to use DTS or SSIS"
Then your only option is to:
1. BULK INSERT the data into a temp/staging table
2. Transform the data into the real table (use SET based UPDATEs, though)
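A minimal sketch of that two-step pattern. The target table [newsletter], its columns, and the fixed character offsets are all assumptions for illustration, not from the original post:

```sql
-- Step 1: bulk load the raw lines into a staging table
CREATE TABLE #staging (line varchar(8000))
BULK INSERT #staging FROM 'c:\init_newsl.txt'

-- Step 2: one set-based INSERT that transforms every row at once,
-- instead of a cursor touching rows one by one
INSERT INTO [newsletter] ([code], [name])
SELECT SUBSTRING(line, 1, 8),
       LTRIM(RTRIM(SUBSTRING(line, 9, 50)))
FROM #staging
```

The point is that the transformation (here, slicing fixed-width fields out of the raw line) happens in a single statement over the whole staging table.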
Far away is close at hand in the images of elsewhere.
Anon.
November 23, 2006 at 7:57 am
...which also is likely to be the most efficient method.
/Kenneth
November 23, 2006 at 8:03 am
That's good to know!
But then again (excuse my ignorance :blush )... what do you mean by SET based UPDATEs?
I'm probably doing it without knowing.
Thanks again!
November 23, 2006 at 8:15 am
"...what do you mean by SET based UPDATES..."
e.g.
UPDATE [tablename]
SET [column1] = [value1], [column2] = [value2]
WHERE [condition]
CURSORs have their place, but they are notoriously slow; avoid them like the plague.
There are not many problems that cannot be solved by SET based UPDATEs
(one way or another).
Far away is close at hand in the images of elsewhere.
Anon.
November 23, 2006 at 9:20 am
Well, yes, I get the point, but I have to use a cursor to scan each line in the temp table and then create or update records in the real tables...
so...
November 23, 2006 at 9:40 am
So join the tables and transform the data, e.g.
You have inserted your data into a temp table as one column, and you wish to update a real table using this data.
Therefore:
UPDATE r
SET r.[columntoupdate] = SUBSTRING(t.[data],34,20)
FROM [realtable] r
INNER JOIN [temp] t
ON r.[keycolumn] = SUBSTRING(t.[data],10,8)
This will update [columntoupdate] with characters 34-53 of the single column [data] in the temp table, wherever the real table's key column matches characters 10-17 of the [data] column.
We need the DDL of the imported temp table and the 'real' table, plus sample data (non-sensitive, of course) and an example of the expected result; then we can give you an accurate solution.
Far away is close at hand in the images of elsewhere.
Anon.
November 23, 2006 at 9:52 am
mmmm, not bad, teacher
That works if I want to update the records in my real table, but what if I have to add a new record when I don't find it already in my real table?
I'll still have to scan the temp table again... mmmm...
November 24, 2006 at 3:02 am
That's also 'normal' behaviour when loading data. You want to update the existing rows and insert the new ones.
No need for a cursor! Just a single UPDATE on all rows that qualify, then an INSERT for the rest = two statements.
There are a few different syntaxes to do this; which to use is mostly a matter of personal taste.
The most common is to decide whether to insert based on EXISTS checks, or to insert with a SELECT based on a LEFT JOIN between the destination and the source tables.
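A sketch of those two statements, reusing the [realtable]/[temp]/[data] names from the earlier post; [keycolumn] and the character offsets are hypothetical placeholders:

```sql
-- Statement 1: update the rows that already exist in the real table
UPDATE r
SET r.[columntoupdate] = SUBSTRING(t.[data],34,20)
FROM [realtable] r
INNER JOIN [temp] t
ON r.[keycolumn] = SUBSTRING(t.[data],10,8)

-- Statement 2: insert the rows that do not exist yet (LEFT JOIN variant)
INSERT INTO [realtable] ([keycolumn], [columntoupdate])
SELECT SUBSTRING(t.[data],10,8), SUBSTRING(t.[data],34,20)
FROM [temp] t
LEFT JOIN [realtable] r
ON r.[keycolumn] = SUBSTRING(t.[data],10,8)
WHERE r.[keycolumn] IS NULL
```

No cursor needed: the LEFT JOIN with the IS NULL filter picks out exactly the staging rows that have no match in the real table.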
/Kenneth