April 7, 2004 at 3:10 am
Wondering if anyone could help, I urgently need to import (not query) multiple .ini files with the format of
UsersName=david
TimeLoggedIn=12/12/01
Location=London
UsersName=david
TimeLoggedIn=12/12/01
Location=London
UsersName=david
TimeLoggedIn=12/12/01
Location=London
etc...
into a SQL Server table with a format of
User Time Location
--------------- -------------------- --------------
UsersName=david TimeLoggedIn=12/12/01 Location=London
How can I map the relevant line to the relevant column, i.e. all UsersName values in ColA, TimeLoggedIn in ColB and Location in ColC?
Thanks
April 7, 2004 at 6:06 am
How about the following? I'm afraid I haven't been able to test it, but any mistakes should be easy to spot and fix.
First create the following table and import your file data into the import_string field:
create table import_tab (
record_id int,
import_string varchar(100)
)
Then create your destination table:
create table logins (
record_id int,
username varchar(30),
logintime smalldatetime,
location varchar(30)
)
--Update the record_id field to an incrementing value starting at 1. So, the username records will have ids 1, 4, 7, 10 etc., the logintime records will have ids 2, 5, 8, 11 etc.
declare @record_id int
select @record_id = 1
update
import_tab
set
record_id = @record_id,
@record_id = @record_id + 1
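--optional: confirm the numbering came out as 1, 2, 3, ... in file order before going further
select record_id, import_string from import_tab order by record_id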
--retrieve all of the username records into your destination table
insert
logins (record_id, username)
select
record_id,
substring(import_string, 11,40)
from
import_tab
where
floor(convert(decimal(10,2), record_id + 2) / 3.0) = convert(decimal(10,2), record_id + 2) / 3.0
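--(this picks out record_ids 1, 4, 7, ...; an equivalent filter would be: where record_id % 3 = 1)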
--finally, update the logintime and location that relate to each username
update
a
set
a.logintime = convert(smalldatetime, substring(import_string, 14,8))
from
logins as a
join import_tab as b on b.record_id = a.record_id + 1
update
a
set
a.location = substring(import_string, 10, 30)
from
logins as a
join import_tab as b on b.record_id = a.record_id + 2
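--optional: a quick look at the finished table to check the values line up
select record_id, username, logintime, location from logins order by record_id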
Regards
Rob
April 7, 2004 at 6:43 am
Thanks Rob, will try and let you know the results
April 8, 2004 at 6:00 am
I would do it like this, to make sure the imported rows stay in the same order:
if object_id('tempdb..#xpcmd') is not null drop table #xpcmd
go
create table #xpcmd ( id int identity (1,1), output varchar(4000) )
go
insert #xpcmd exec master..xp_cmdshell 'TYPE "D:\temp\tst.txt"'
go
delete #xpcmd where isnull(rtrim(ltrim(output)),'') = ''
go
if object_id('dbo.tst') is not null drop table dbo.tst
create table dbo.tst ( UsersName varchar(30), TimeLoggedIn varchar(30),Location varchar(30))
go
insert dbo.tst
select U.output,T.output,L.output
from #xpcmd U, #xpcmd T, #xpcmd L
where U.id % 3 = 1
and T.id % 3 = 2
and L.id % 3 = 0
and U.id+1 = T.id
and U.id+2 = L.id
go
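A quick check of the result, using the table created above:
select * from dbo.tst
go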
April 15, 2004 at 5:57 am
Thanks to both of you, both ideas work fine except there seems to be a data issue: some of the files do not have all the lines, so values are inserted into the wrong column.
Majority of files:
------------------
UsersName=david
TimeLoggedIn=12/12/01
Location=London
UsersName=david
TimeLoggedIn=12/12/01
Location=London
UsersName=david
TimeLoggedIn=12/12/01
Location=London
etc...
Some of the files:
--------------------
UsersName=JOHN
UsersName=david
TimeLoggedIn=12/12/01
Location=London
which results in
User            Time logged in        Location
--------------- --------------------- ---------------------
JOHN            david                 12/12/01
Any ideas would be greatly appreciated
April 15, 2004 at 7:19 am
CREATE TABLE stagingtable(rowid int IDENTITY(1,1), input varchar(100))
DTS into stagingtable, then select as follows
SELECT u.input as 'User',ISNULL(t.input,'') as 'Time',ISNULL(l.input,'') as 'Location'
FROM stagingtable u
LEFT OUTER JOIN stagingtable t
ON t.rowid = u.rowid + 1
AND LEFT(t.input,12) = 'TimeLoggedIn'
LEFT OUTER JOIN stagingtable l
ON l.rowid = u.rowid + 2
AND LEFT(l.input,8) = 'Location'
WHERE LEFT(u.input,9) = 'UsersName'
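If DTS is awkward for multiple files, the xp_cmdshell trick from the earlier post should work just as well for filling stagingtable (the path below is only the example file used earlier):
insert stagingtable (input)
exec master..xp_cmdshell 'TYPE "D:\temp\tst.txt"'
delete stagingtable where isnull(rtrim(ltrim(input)),'') = ''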
Far away is close at hand in the images of elsewhere.
Anon.
April 15, 2004 at 8:48 am
Thanks for all your help, finally got it working
Thanks again