Export SQL data

  • Is there any way to export SQL data into a pipe-delimited text file?

    Currently we have scripts (joining multiple tables) that pull the reports into an Excel file; now the user is asking for the same report in pipe-delimited format as well.

    I know that to export a single table's data we can use the Import/Export Wizard and select a text file (comma, pipe, or tab delimited) as the destination, but when the data comes from joining multiple tables, how can we export it to a pipe-delimited text file? We have not configured SSRS.

  • BCP supports changing the default field terminator (the default is tab) to anything else.

    Here's a simple example; note the delimiter in the double quotes after the -t switch (-c exports in character mode, -T uses a trusted connection):

    EXECUTE master.dbo.xp_cmdshell 'bcp "SELECT object_name(object_id) As TbLName,name as ColName FROM SandBox.sys.columns ORDER BY object_name(object_id), column_id" queryout C:\Data\Objects.txt -t"|" -c -T '

    Lowell


    --Help us help you! If you post a question, make sure you include a CREATE TABLE... statement and an INSERT INTO... statement to give the volunteers here representative data. With your description of the problem, we can provide a tested, verifiable solution to your question! Asking the question the right way gets you a tested answer the fastest way possible!

  • Thanks Lowell!

    But my script is below. Presently I am exporting this data to Excel (manual copy and paste). Now I need this report in pipe-delimited format.

    DECLARE @accountID INT

    DECLARE @FromDate DATETIME

    DECLARE @ToDate DATETIME

    SET @accountID= 2

    SET @FromDate= '2010/01/01'-- From Date (YYYY/MM/DD)

    SET @ToDate= '2010/11/30'-- To Date (YYYY/MM/DD)

    CREATE TABLE #temp_patients

    (

    PatientID BIGINT,

    PatientDisplayID nvarchar(100),

    StatusName nvarchar(100),

    LastLoginDateTime datetime,

    SugarEntryCount BIGINT,

    CarbsEntryCount BIGINT,

    MedsEntryCount BIGINT

    )

    INSERT INTO #temp_patients(PatientID, PatientDisplayID, StatusName, LastLoginDateTime)

    (

    SELECT T1.LoginID, T2.AccountPatientDisplayID, T4.StatusName, T1.LastLoginDateTime

    FROM [Login] T1

    INNER JOIN [Patient] T2 ON T1.LoginID = T2.PatientID AND T1.StatusID IN (1,2) -- Active, Idle

    INNER JOIN [status] T4 ON T1.StatusID= T4.StatusID

    INNER JOIN [PatientAccount] T3 ON T2.PatientID = T3.PatientID AND T3.ACCOUNTID = @accountID

    )

    UPDATE #temp_patients

    SET SugarEntryCount = (

    Select count(ElogID) from

    (

    SELECT ELOGBGEntry.ElogID, dbo.ELogBGEntry.PatientID , dbo.ELogBGEntry.BGEntryTime, ROW_NUMBER() OVER

    (PARTITION BY dbo.Elog.ElogID, dbo.ELog.PatientID ORDER BY dbo.ELogBGEntry.BGEntryTime DESC) AS "RANK" from ELOG

    inner JOIN ELOGBGEntry ON ELOG.ELOGID = ELOGBGEntry.ELOGID

    AND dbo.ELog.PatientID = dbo.ELogBGEntry.PatientID

    AND ELOG.RetestIndicator =0 AND [ELogBGEntry].PatientID = #temp_patients.PatientID

    AND (CAST(CONVERT(CHAR(10), BGEntryTime, 126) AS SMALLDATETIME) BETWEEN @FromDate AND @ToDate)

    ) BGCount Where Rank = 1

    GROUP BY PatientID

    )

    UPDATE #temp_patients

    SET CarbsEntryCount = (

    select count(ElogID) from [ELogCarbsEntry]

    WHERE [ELogCarbsEntry].PatientID = #temp_patients.PatientID AND

    (CAST(CONVERT(CHAR(10), CarbsEntryTime, 126) AS SMALLDATETIME) BETWEEN @FromDate AND @ToDate)

    Group By PatientID)

    UPDATE #temp_patients

    SET MedsEntryCount = (

    select count(ElogID) from ELogMedEntry

    WHERE ELogMedEntry.PatientID = #temp_patients.PatientID AND

    (CAST(CONVERT(CHAR(10), MedEntryTime, 126) AS SMALLDATETIME) BETWEEN @FromDate AND @ToDate)

    Group By PatientID

    )

    Select PatientDisplayID as 'Client ID', ISNULL(SugarEntryCount,0) as 'BG Count',

    ISNULL(CarbsEntryCount,0) as 'Carb Count', ISNULL(MedsEntryCount,0) as 'Med Count'

    FROM #temp_patients

    DROP TABLE #temp_patients

  • OK: because of the complexity, what you have to do is export the results to a global temp table, then BCP the results.

    Adding this to the very end of your statement should work:

    --get the results into our global table

    Select PatientDisplayID as 'Client ID', ISNULL(SugarEntryCount,0) as 'BG Count',

    ISNULL(CarbsEntryCount,0) as 'Carb Count', ISNULL(MedsEntryCount,0) as 'Med Count'

    INTO ##PIPE_REPORT

    FROM #temp_patients

    --build our pipe delimited file

    EXECUTE master.dbo.xp_cmdshell 'bcp "SELECT * FROM ##PIPE_REPORT " queryout C:\Data\Objects.txt -t"|" -c -T '

    --cleanup

    DROP TABLE #temp_patients

    DROP TABLE ##PIPE_REPORT
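
    One caveat worth noting: bcp opens its own connection to the server, so it cannot see session-scoped #temp tables; that is why the results go into a global (##) temp table. Also, xp_cmdshell is disabled by default on many servers; a sysadmin can enable it with the standard sp_configure steps (a sketch):

    ```sql
    -- enable xp_cmdshell (requires sysadmin); it is off by default
    EXEC sp_configure 'show advanced options', 1;
    RECONFIGURE;
    EXEC sp_configure 'xp_cmdshell', 1;
    RECONFIGURE;
    ```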

    Lowell



  • Thanks Lowell.

    I got two more requests from the user.

    In the pipe-delimited file, the first line should contain the column names. Do I need to write them manually, or is there a way to do it from the script?

    Also, in the pipe-delimited file, values containing special characters should be enclosed in quotes.

    Right now I have only letters and numbers in the pipe-delimited text file, but I'm not sure whether any special characters will appear in the future.

    Thanks for your help.

  • This is how I did it, although there's probably a better way.

    - Put the column names into a SQL table as one long pipe-delimited string.

    - BCP the columns into a text file

    - BCP the data into another text file

    - Merge the 2 text files into a new text file

    DECLARE @ObjectID int -- system object ID

    DECLARE @#ofColumns int, @String varchar(8000), @Cmd varchar(4000)

    -- the file variables below were used but never declared in the original post;
    -- the values here are examples only, adjust them to your environment

    DECLARE @Folder varchar(255), @FileHdrName varchar(100), @FileDataName varchar(100), @FileName varchar(100), @Extension varchar(10)

    SET @Folder = 'C:\Data\'

    SET @FileHdrName = 'ReportHeader'

    SET @FileDataName = 'ReportData'

    SET @FileName = 'Report'

    SET @Extension = '.txt'

    -- Get Object_ID #, which allows getting column names

    set @ObjectID = (select object_id from sys.objects where name = 'PostBack_Output')

    select name, column_id into #Columns

    from sys.columns

    where object_id = @ObjectID

    order by column_id

    set @#ofColumns = @@rowcount -- get the number of columns

    -- Build @String to have column names separated by '|'

    select Top(@#ofColumns) @String = COALESCE(@String + '|','') + name

    from #Columns

    order by column_id -- this ORDER BY is what actually guarantees the column order

    -- remove the table left over from a previous run, if any

    IF OBJECT_ID('dbo.postback_output_Columns') IS NOT NULL

    drop table postback_output_Columns

    -- Put @String into a table

    select @String as 'ColumnsNames'

    into postback_output_Columns

    /*** Generate Column Names TXT File ***/

    SET @Cmd = 'BCP "SELECT * FROM ' + DB_NAME() + '.dbo.postback_output_Columns" queryout "' + @Folder + @FileHdrName + @Extension + '" /c /T '

    PRINT ' Executing xp_cmdshell: ' + CONVERT(varchar(1000), @Cmd)

    EXEC master..xp_cmdshell @Cmd --, NO_OUTPUT

    /*** Generate Data TXT File ***/

    SET @Cmd = 'BCP "SELECT * FROM ' + DB_NAME() + '.dbo.PostBack_Output" queryout "' + @Folder + @FileDataName + @Extension + '" /c /T /t "|"' -- pipe delimited

    PRINT ' Executing xp_cmdshell: ' + CONVERT(varchar(1000), @Cmd)

    EXEC master..xp_cmdshell @Cmd --, NO_OUTPUT

    -- Append header & data into 1 new file with the final name, e.g.: copy test1.txt + test2.txt test3.txt /b ('/b' prevents an end-of-file marker)

    set @CMD = 'copy ' + @Folder + @FileHdrName + @Extension + ' + ' + @Folder + @FileDataName + @Extension + ' ' + @Folder + @FileName + @Extension + ' /b'

    EXEC master..xp_cmdshell @Cmd

    select @cmd
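
    An alternative that avoids the two-file merge is to build the header row into the query itself with UNION ALL, so a single bcp call produces the finished file. This is a sketch based on the report earlier in the thread; the varchar widths and output path are assumptions:

    ```sql
    -- one varchar column that already contains the '|' delimiters,
    -- plus a sort key so the header row comes out first
    SELECT 'Client ID|BG Count|Carb Count|Med Count' AS Line, 0 AS SortKey
    INTO ##PIPE_REPORT
    UNION ALL
    SELECT CAST(PatientDisplayID AS varchar(100)) + '|'
         + CAST(ISNULL(SugarEntryCount, 0) AS varchar(20)) + '|'
         + CAST(ISNULL(CarbsEntryCount, 0) AS varchar(20)) + '|'
         + CAST(ISNULL(MedsEntryCount, 0) AS varchar(20)), 1
    FROM #temp_patients

    -- no -t switch needed: the delimiters are already inside the single column
    EXECUTE master.dbo.xp_cmdshell 'bcp "SELECT Line FROM ##PIPE_REPORT ORDER BY SortKey" queryout C:\Data\Report.txt -c -T'

    DROP TABLE ##PIPE_REPORT
    ```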

  • homebrew01's method is exactly how I've done it in the past.

    Now, for "special characters", what does that mean? High ASCII (so you have to export the results as NVARCHAR)? HTML entities?
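
    If CSV-style quoting is the goal, one common approach is to wrap each text column in quotes inside the SELECT itself, doubling any embedded quote characters. A sketch using a column from the report above:

    ```sql
    -- wrap the value in double quotes and escape embedded quotes by doubling them
    SELECT '"' + REPLACE(PatientDisplayID, '"', '""') + '"' AS [Client ID]
    FROM #temp_patients
    ```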

    Lowell



  • Hi Lowell,

    Somehow I am not able to use homebrew01's script.

    Can you please provide the steps based on my SELECT query? I'm kind of new to this task. Appreciate your help.
