Import delimited file into SQL Server

  • USE [Clinical_Edu_2006]
    DROP TABLE Doc
    GO
    GO
    /****** Object: Table [dbo].[DOC] Script Date: 06/17/2009 13:58:14 ******/
    SET ANSI_NULLS ON
    GO
    SET QUOTED_IDENTIFIER ON
    GO
    SET ANSI_PADDING ON
    GO
    CREATE TABLE [dbo].[DOC](
    [PID] [char](17) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
    [SPECIALTY] [char](21) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
    [PNAME] [char](55) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
    [PLIC] [char](31) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
    [UPIN] [char](20) COLLATE SQL_Latin1_General_CP1_CI_AS NULL
    ) ON [PRIMARY]
    GO
    SET ANSI_PADDING OFF
    GO
    BULK INSERT doc
    FROM '\\dbase-svr\shared\Doc_master\doctestfile'
    WITH
    (
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = ''
    )

    Msg 4832, Level 16, State 1, Line 1
    Bulk Insert: Unexpected end-of-file (EOF) encountered in data file.
    OLE DB error trace [OLE/DB Provider 'STREAM' IRowset::GetNextRows returned 0x80004005: The provider did not give any information about the error.].
    Msg 7399, Level 16, State 1, Line 1
    OLE DB provider 'STREAM' reported an error. The provider did not give any information about the error.
    The statement has been terminated.

    The file is:

    PID,SPECIALTY,PNAME,PLIC,UPIN
    "010199","MED","SER KENNETH","MD024E","40690"
    "010496","MED","MGUY","MD02236E","C9257"

  • You're specifying an empty string for the row terminator, but it looks like your file has a normal CRLF, so just take out the ROWTERMINATOR clause.
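
    Applied to the statement above, that would look something like this sketch (same path and table as in the original post) - either drop the ROWTERMINATOR clause entirely or spell the terminator out:

    -- Sketch: explicit CR/LF row terminator instead of an empty string
    BULK INSERT dbo.DOC
    FROM '\\dbase-svr\shared\Doc_master\doctestfile'
    WITH
    (
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\r\n'   -- or leave this clause out to use the default
    )

    Note that the header row and the double quotes around each value aren't dealt with here; BULK INSERT will load the quotes into the char columns as-is.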

    Wayne
    Microsoft Certified Master: SQL Server 2008
    Author - SQL Server T-SQL Recipes


    If you can't explain to another person how the code that you're copying from the internet works, then DON'T USE IT on a production system! After all, you will be the one supporting it!

  • USE [Clinical_Edu_2006]
    DROP TABLE Doc
    GO
    GO
    /****** Object: Table [dbo].[DOC] Script Date: 06/17/2009 13:58:14 ******/
    SET ANSI_NULLS ON
    GO
    SET QUOTED_IDENTIFIER ON
    GO
    SET ANSI_PADDING ON
    GO
    CREATE TABLE [dbo].[DOC](
    [PID] [char](17) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
    [SPECIALTY] [char](21) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
    [PNAME] [char](55) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
    [PLIC] [char](31) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
    [UPIN] [char](20) COLLATE SQL_Latin1_General_CP1_CI_AS NULL
    ) ON [PRIMARY]
    GO
    SET ANSI_PADDING OFF
    GO
    BULK INSERT doc
    FROM '\\dbase-svr\shared\Doc_master\doctestfile'
    WITH
    (
    FIELDTERMINATOR = ','
    )

    Msg 4832, Level 16, State 1, Line 1
    Bulk Insert: Unexpected end-of-file (EOF) encountered in data file.
    OLE DB error trace [OLE/DB Provider 'STREAM' IRowset::GetNextRows returned 0x80004005: The provider did not give any information about the error.].
    Msg 7399, Level 16, State 1, Line 1
    OLE DB provider 'STREAM' reported an error. The provider did not give any information about the error.
    The statement has been terminated.

    See the attached file. I took off the header.
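
    (Side note: rather than deleting the header line from the file by hand, BULK INSERT's FIRSTROW option can skip it - a minimal sketch, assuming the header were still in the file:)

    BULK INSERT dbo.DOC
    FROM '\\dbase-svr\shared\Doc_master\doctestfile'
    WITH
    (
    FIELDTERMINATOR = ',',
    FIRSTROW = 2   -- skip the PID,SPECIALTY,... header line
    )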

  • If test.txt is where you're reading data from, that isn't delimited - it's fixed-length. If it's a one-off job, or even if it isn't, DTS is more capable: in Enterprise Manager, right-click your table, choose "Import data", and proceed from there. You can save a "package" to repeat the job with the same source and destination tables (something I usually work around), or open the package and edit those settings - it gets tricky, though.

    Otherwise - well, I'm not sure whether you can use a BCP format file, for instance, to indicate fixed field lengths. Look at the options for generating a format file from the table you've already created in SQL Server. Format files aren't pretty to design from scratch, and the system is very fussy about getting them right.
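
    For what it's worth, a rough sketch of that route. A starting non-XML format file can be generated from the existing table at a command prompt with something like "bcp Clinical_Edu_2006.dbo.DOC format nul -c -f doc.fmt -T -S YourServer" (YourServer is a placeholder), then edited by hand for fixed-length fields: prefix length 0, length = the field's width in the file, empty terminators, and CR/LF only on the last field. With widths that are only placeholders copied from the column definitions (they must match the actual file), and 9.0 on the first line assuming SQL Server 2005 (8.0 for 2000), doc.fmt might look like:

    9.0
    5
    1  SQLCHAR  0  17  ""      1  PID        SQL_Latin1_General_CP1_CI_AS
    2  SQLCHAR  0  21  ""      2  SPECIALTY  SQL_Latin1_General_CP1_CI_AS
    3  SQLCHAR  0  55  ""      3  PNAME      SQL_Latin1_General_CP1_CI_AS
    4  SQLCHAR  0  31  ""      4  PLIC       SQL_Latin1_General_CP1_CI_AS
    5  SQLCHAR  0  20  "\r\n"  5  UPIN       SQL_Latin1_General_CP1_CI_AS

    and then:

    BULK INSERT dbo.DOC
    FROM '\\dbase-svr\shared\Doc_master\doctestfile'
    WITH (FORMATFILE = '\\dbase-svr\shared\Doc_master\doc.fmt')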

    Or just import the fixed-length data as one long field with BULK INSERT, then copy the relevant parts of that field into the destination columns with appropriate SUBSTRING() expressions.
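
    A rough sketch of that approach - the staging table and the SUBSTRING start positions/lengths below are made up and would have to come from the actual file layout:

    -- Load each fixed-length line into a single wide column
    -- (assumes the file contains no tabs, the default field terminator)
    CREATE TABLE dbo.DocRaw (RawLine varchar(200) NULL)
    GO
    BULK INSERT dbo.DocRaw
    FROM '\\dbase-svr\shared\Doc_master\doctestfile'
    WITH (ROWTERMINATOR = '\n')
    GO
    -- Split each line into the real columns; offsets are examples only
    INSERT INTO dbo.DOC (PID, SPECIALTY, PNAME, PLIC, UPIN)
    SELECT SUBSTRING(RawLine, 1, 17),
           SUBSTRING(RawLine, 18, 21),
           SUBSTRING(RawLine, 39, 55),
           SUBSTRING(RawLine, 94, 31),
           SUBSTRING(RawLine, 125, 20)
    FROM dbo.DocRaw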

    My own latest bad experience with BCP was a file format that is delimited, but with a different number of fields in different rows - really multiple different tables stacked in one data file. (It's a United Kingdom local government geographical format called DTF 6.3.) DTS takes it, but a crazy manager said not to use DTS this time. BCP just didn't get it until I took the data and edited extra commas onto every line. That meant I also had unnecessary commas after the data, so I added a table column to receive the waste commas. That worked. It also took days instead of minutes to get working, which is good, because maybe the crazy manager will think twice before giving unhelpful instructions.
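
    For what that padding workaround looks like in practice (the table, column and file names here are made up): every line gets padded with commas up to the same field count, and the staging table gets one extra trailing column so the surplus empty field has somewhere to land.

    -- Made-up staging table: the real fields plus a throwaway Filler column
    -- that receives the empty field created by the padding commas
    CREATE TABLE dbo.DtfStaging (
    RecordType varchar(10) NULL,
    Field1 varchar(100) NULL,
    Field2 varchar(100) NULL,
    Field3 varchar(100) NULL,
    Field4 varchar(100) NULL,
    Field5 varchar(100) NULL,
    Filler varchar(10) NULL
    )
    GO
    BULK INSERT dbo.DtfStaging
    FROM '\\server\share\dtf63_padded.txt'
    WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n')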
