Custom Header Row in Query Result

  • Don't laugh too hard.

    For 36 months I've been sending a monthly file to a vendor that requires a custom header on Row 1 of the flat text file. I've been opening the text file in Notepad++ and manually setting the header. I want to level up and stop doing this. There are only three elements in the Custom Header they want:

    CustomerID# | File Date | Number of Records (Not counting Row 1)

    To get what I copy and paste into Notepad++ I run this query:

    Select distinct
    '9999'
    ,(select distinct FILE_DT from MyTable)
    ,(select count(*) from MyTable)
    from MyTable

    Note that "FILE_DT" is a column in the table that has the exact same value in every record. I don't know why that is, but that's how the vendor wants it.

    I thought about doing a new Row 1 with just these items and then filling in the following columns with empty data and then a Union All, but then when I export it I'd have a whole bunch of column delimiter characters I'd still have to remove via a Text editor.


    -Mark
    MSSQL 2019 Standard, Azure Hosted. Techie/Sysadmin by trade; Three years as a "DBA" now.

  • usererror wrote:

    I thought about doing a new Row 1 with just these items and then filling in the following columns with empty data and then a Union All, but then when I export it I'd have a whole bunch of column delimiter characters I'd still have to remove via a Text editor. 

    Is the main export data output such that multiple columns are delimited (as part of the OUTPUT), and is that what is stopping you from doing a UNION ALL?

    If so then maybe this (using "#" as an indication of a Delimiter Character)

    Select CONCAT('9999', '#', (select TOP 1 FILE_DT from MyTable), '#', (select count(*) from MyTable) )
    UNION ALL
    SELECT CONCAT(Col1, '#', Col2, '#', CONVERT(varchar(99), MyDateCol, 999), '#', ..., Col999)
    FROM TableToExport

    Untested !

  • If I am understanding your question correctly:

    The main export data output is 85 columns. They must be delimited by the bar character per the recipient's requirements.  The end result should look something like this:

    9999|DD-MMM-YYYY|500000
    9999|1|Data1|Data2|Data3|Data4|Data5|...+80 more columns
    9999|2|Data1|Data2|Data3|Data4|Data5|...+80 more columns
    9999|3|Data1|Data2|Data3|Data4|Data5|...+80 more columns
    9999|4|Data1|Data2|Data3|Data4|Data5|...+80 more columns
    9999|5|Data1|Data2|Data3|Data4|Data5|...+80 more columns
    ...
    9999|500000|Data1|Data2|Data3|Data4|Data5|...+80 more columns


    -Mark
    MSSQL 2019 Standard, Azure Hosted. Techie/Sysadmin by trade; Three years as a "DBA" now.

    What process are you using to create this file?  Is it an SSIS package - BCP output - PowerShell - something else?  It really depends on how you are building the file - and what options are available in that tool.

    For example, using SSIS it is possible to create 2 connection managers that use the *same* file with different specifications.  The first one would have the specification for the header - the second for the detail.  In the process you would create a new file using the first connection and write the header data - then use the second connection and append the data.

    It is also possible to use PowerShell to create the file - same concept: open the file and write out the header, then write out each detail record, appending to the existing (new) file.

    Using old DOS commands you can combine 2 files - so one file is the header, the second file is the detail, and you combine the files into a 3rd (final) file.
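
    As a rough, untested sketch of that "write the header, then append the detail" idea in PowerShell (the server, database, view, and file names here are all made-up placeholders):

    # Write the header row first (this creates or overwrites the file).
    Set-Content -Path .\VendorExtract.txt -Value '9999|01-JAN-2021|500000'

    # Then append one line per detail record to the same file.
    Invoke-Sqlcmd -ServerInstance 'YourServer' -Database 'YourDb' `
        -Query 'SELECT RecordData FROM dbo.YourExportView' |
        Select-Object -ExpandProperty RecordData |
        Add-Content -Path .\VendorExtract.txt

    # The old-DOS equivalent glues two files together in one go:
    #   copy header.txt + detail.txt final.txt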

    Jeffrey Williams
    “We are all faced with a series of great opportunities brilliantly disguised as impossible situations.”

    ― Charles R. Swindoll

    How to post questions to get better answers faster
    Managing Transaction Logs

  • usererror wrote:

    If I am understanding your question correctly:

    The main export data output is 85 columns. They must be delimited by the bar character per the recipient's requirements.  The end result should look something like this:

    9999|DD-MMM-YYYY|500000
    9999|1|Data1|Data2|Data3|Data4|Data5|...+80 more columns
    9999|2|Data1|Data2|Data3|Data4|Data5|...+80 more columns
    9999|3|Data1|Data2|Data3|Data4|Data5|...+80 more columns
    9999|4|Data1|Data2|Data3|Data4|Data5|...+80 more columns
    9999|5|Data1|Data2|Data3|Data4|Data5|...+80 more columns
    ...
    9999|500000|Data1|Data2|Data3|Data4|Data5|...+80 more columns

    How are you putting together the rest of the data above and then saving it to a file?  I ask because I've done this many times before and I might be able to help you automate it to the point where it's "push-button" easy and can even be done as an auto-magic scheduled task.


    --Jeff Moden


    RBAR is pronounced "ree-bar" and is a "Modenism" for Row-By-Agonizing-Row.
    First step towards the paradigm shift of writing Set Based code:
    ________Stop thinking about what you want to do to a ROW... think, instead, of what you want to do to a COLUMN.

    Change is inevitable... Change for the better is not.


    Helpful Links:
    How to post code problems
    How to Post Performance Problems
    Create a Tally Function (fnTally)

  • Jeff and Jeff -

    Currently, all the detail data is in one table that gets refreshed each month. I'm not terribly good with SSIS. I am just now figuring out how to fix old packages in Visual Studio 2019 that I created in the past two to three years using the Import/Export Wizard and the "Save to File" option at the end of the wizard. But I wondered if SSIS would be the best way to do this, and maybe the easiest.

    I've used the DOS prompt to 'cheat' in combining CSV files into one big CSV file in the past. I had not thought about using that method to combine one text file containing the custom header with another file containing all the data.

    Basically, what I'm gathering is that there is not going to be a simple set of SELECT statements that I could throw into the Export Data wizard to create the file. Which is fine; that makes me feel less dumb for struggling to figure it out.


    -Mark
    MSSQL 2019 Standard, Azure Hosted. Techie/Sysadmin by trade; Three years as a "DBA" now.

    That is why I asked how you are doing this now... it seems you are using the Import/Export Wizard, or you used that and saved the package.

    @JeffModen probably has a set of code that can do this all in SQL Server, which is likely going to be the better option for you.  I would use SSIS myself, since I already have the infrastructure in place to support that method.

    For something like PowerShell - you have to build the string with the delimiters and then output that string to the file.
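
    Something along these lines, for example (untested; the server, table, and column names are invented):

    # Pull the rows, build one pipe-delimited string per row, write them out.
    $rows = Invoke-Sqlcmd -ServerInstance 'YourServer' -Database 'YourDb' `
        -Query 'SELECT Col1, Col2, Col3 FROM dbo.TableToExport'

    $lines = foreach ($r in $rows) {
        ($r.Col1, $r.Col2, $r.Col3) -join '|'
    }
    $lines | Out-File -FilePath .\VendorExtract.txt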

    Jeffrey Williams
    “We are all faced with a series of great opportunities brilliantly disguised as impossible situations.”

    ― Charles R. Swindoll

    How to post questions to get better answers faster
    Managing Transaction Logs

  • Thanks for your tips. I am going to see if I can get something working with SSIS tomorrow or Friday.

    -Mark
    MSSQL 2019 Standard, Azure Hosted. Techie/Sysadmin by trade; Three years as a "DBA" now.

  • usererror wrote:

    Currently, all the detail data is in one table that gets refreshed each month.

    Could you post the CREATE TABLE statement for that one table and also identify the column that makes each row unique?  Thanks.

    p.s. Also, please verify that you're actually using SQL Server 2019... we need at least SQL Server 2017 for the "simple" method, and something only slightly more complicated (but nasty fast) for 2016 and earlier.


    --Jeff Moden


    RBAR is pronounced "ree-bar" and is a "Modenism" for Row-By-Agonizing-Row.
    First step towards the paradigm shift of writing Set Based code:
    ________Stop thinking about what you want to do to a ROW... think, instead, of what you want to do to a COLUMN.

    Change is inevitable... Change for the better is not.


    Helpful Links:
    How to post code problems
    How to Post Performance Problems
    Create a Tally Function (fnTally)

  • usererror wrote:

    The main export data output is 85 columns. They must be delimited by the bar character per the recipient's requirements.

    If you just want to output the data "raw", e.g. from a command-line SQL tool, and then output/capture that to a text file (rather than using SSIS or similar), then this sort of query would do it:

    SELECT CONCAT('9999'
    , '|', (SELECT TOP 1 CONVERT(char(11), FILE_DT, 113) FROM MyTable)
    , '|', (SELECT COUNT(*) FROM MyTable)
    )
    UNION ALL
    SELECT CONCAT('9999'
    , '|', StringCol1
    , '|', IntegerCol2
    , '|', CONVERT(char(11), MyDateCol, 113)
    , '|', ...
    , Col85
    )
    FROM TableToExport

    CONCAT() will concatenate all the normal data types, converting any NULL values to an empty string. However, the formatting of dates and floating-point numbers etc. might need you to use CAST / CONVERT / ROUND etc. to control them. Depending on how much of that sort of "formatting" you have, this approach can become tedious, and tools such as SSIS, which will do all that stuff for you, are better.

    I'm personally not a big fan of SSIS, although in fairness I haven't used it in anger for ages. I can't even remember what it was called way-back-then, but I always thought it daft that clients who were using it (back then), and who had large numbers of such unattended processes, spent the first X hours of each morning sorting out whatever had gone wrong overnight.  Those issues came down to a lack of "control" of the automated process when it encountered unexpected data and so on. I'm sure those tools could have been made bullet-proof, even then, but my view was that because the process so abstracted the data, and disconnected the various parts of the process (into sexy "click this for this bit" buttons), it was hard to visualise the overall process and also hard to build in methods for detecting / massaging "unexpected data". Actually their data was shiite too, which never helps 🙂 Not in the part I was doing for them, I hasten to add!

    My view may be jaundiced because I am a dinosaur... but if you are not already using an earlier equivalent of SSIS, then moving to it now, for just this single job, may expose you to those issues and that learning curve. If your current approach is just a "SQL query and export", then IMHO you would be better off sticking to that approach (and in that situation CONCAT would get you the UNION ALL combination of Header / Items as a single resultset, saving you the subsequent manual merge of the header).

    Either way, you have a working solution now, so you can build the new solution in parallel, generate output from both, and just DIFF them to see if there is any difference, as part of commissioning the shiny new replacement 🙂
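
    The DIFF itself can be a one-liner, e.g. in PowerShell (file names invented):

    # Compare the two outputs line by line; no output means they are identical.
    Compare-Object -ReferenceObject (Get-Content .\old_extract.txt) `
                   -DifferenceObject (Get-Content .\new_extract.txt)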

  • Jeffrey Williams wrote:

    For something like PowerShell - you have to build the string with the delimiters and then output that string to the file.

    Not necessarily. Use two selects in the query, build a dataset from the query, and use Export-Csv, adding -Append on the second export.

    I do this as my primary method of producing extract files.
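
    One way this might look, as a minimal untested sketch (the server name, queries, and file name are all placeholders; both selects must expose the same column name for -Append to line up):

    $headerQuery = "SELECT CONCAT('9999','|',CONVERT(char(11),GETDATE(),113),'|',500000) AS Line"
    $detailQuery = "SELECT CONCAT(Col1,'|',Col2,'|',Col3) AS Line FROM dbo.TableToExport"

    # The first export creates the file; the second appends to it.
    $hdr = Invoke-Sqlcmd -ServerInstance 'YourServer' -Query $headerQuery
    $dtl = Invoke-Sqlcmd -ServerInstance 'YourServer' -Query $detailQuery

    $hdr | Export-Csv -Path .\Extract.txt -NoTypeInformation
    $dtl | Export-Csv -Path .\Extract.txt -NoTypeInformation -Append

    # Caveat: Windows PowerShell quotes every field on export; PowerShell 7's
    # -UseQuotes AsNeeded (or Never) switch can suppress the quotes.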

    Far away is close at hand in the images of elsewhere.
    Anon.

    I would use SSIS. It will give you the flexibility to add your custom text as a top row, along with the total count. Also, the SSIS package can be wrapped up in a SQL Agent job if you are using SQL Server, and scheduled to run as needed.

    Just select 'csv' as the destination. There are many posts and videos available showing how to do it.

  • Jeff Moden wrote:

    usererror wrote:

    Currently, all the detail data is in one table that gets refreshed each month.

    Could you post the CREATE TABLE statement for that one table and also identify the column that makes each row unique?  Thanks.

    p.s. Also, please verify that you're actually using SQL Server 2019... we need at least SQL Server 2017 for the "simple" method, and something only slightly more complicated (but nasty fast) for 2016 and earlier.

    I can confirm I'm using SQL 2019 Standard Edition (version 15.0.4083.2). I just moved from SQL 2016 in December 2020. 🙂

    Here is the CREATE TABLE statement for the "Data" part of the table, below.

    As a test this morning I walked through the Export Data wizard and created a basic SSIS package to just write the custom header row to a text file via the query shown above in my other post. When I run the package multiple times it overwrites the first row instead of appending to the file each time. I'm going to keep trying between other tasks today.


    Create Table [MonthlyDataOutput] (
    CLIENT_ID varchar(50) null --This will always be '9999' for every record.
    ,ROW_ID int --This is what makes each row "unique"
    ,CONTRACT_NUM varchar(50) null
    ,MBR_FIRST_NAME varchar(50) null
    ,MBR_LAST_NAME varchar(50) null
    ,GENDER varchar(50) null
    ,BIRTH_DT varchar(50) null
    ,DEATH_DT varchar(50) null
    ,DataElement1 varchar(1)
    ,DataElement2 varchar(50) null
    ,DataElement3 varchar(50) null
    ,DataElement4 varchar(50) null
    ,DataElement5 varchar(50) null
    ,DataElement6 varchar(50) null
    ,DataElement7 varchar(50) null
    ,DataElement8 varchar(50) null
    ,DataElement9 varchar(50) null
    ,DataElement10 varchar(50) null
    ,DataElement11 varchar(50) null
    ,DataElement12 varchar(50) null
    ,DataElement13 varchar(50) null
    ,DataElement14 varchar(50) null
    ,DataElement15 varchar(50) null
    ,DataElement16 varchar(50) null
    ,DataElement17 varchar(50) null
    ,DataElement18 varchar(50) null
    ,DataElement19 varchar(50) null
    ,DataElement20 varchar(50) null
    ,DataElement21 varchar(50) null
    ,DataElement22 varchar(50) null
    ,DataElement23 varchar(50) null
    ,DataElement24 varchar(50) null
    ,DataElement25 varchar(50) null
    ,DataElement26 varchar(50) null
    ,DataElement27 varchar(50) null
    ,DataElement28 varchar(50) null
    ,DataElement29 varchar(50) null
    ,DataElement30 varchar(50) null
    ,DataElement31 varchar(50) null
    ,DataElement32 varchar(50) null
    ,DataElement33 varchar(50) null
    ,DataElement34 varchar(50) null
    ,DataElement35 varchar(50) null
    ,DataElement36 varchar(50) null
    ,DataElement37 varchar(50) null
    ,DataElement38 varchar(50) null
    ,DataElement39 varchar(50) null
    ,DataElement40 varchar(50) null
    ,DataElement41 varchar(50) null
    ,DataElement42 varchar(50) null
    ,DataElement43 varchar(50) null
    ,DataElement44 varchar(50) null
    ,DataElement45 varchar(50) null
    ,DataElement46 varchar(50) null
    ,DataElement47 varchar(50) null
    ,DataElement48 varchar(50) null
    ,DataElement49 varchar(50) null
    ,DataElement50 varchar(50) null
    ,DataElement51 varchar(50) null
    ,DataElement52 varchar(50) null
    ,DataElement53 varchar(50) null
    ,DataElement54 varchar(50) null
    ,DataElement55 varchar(50) null
    ,DataElement56 varchar(50) null
    ,DataElement57 varchar(50) null
    ,DataElement58 varchar(50) null
    ,DataElement59 varchar(50) null
    ,DataElement60 varchar(50) null
    ,DataElement61 varchar(50) null
    ,DataElement62 varchar(50) null
    ,DataElement63 varchar(50) null
    ,DataElement64 varchar(50) null
    ,DataElement65 varchar(50) null
    ,DataElement66 varchar(50) null
    ,DataElement67 varchar(50) null
    ,DataElement68 varchar(50) null
    ,DataElement69 varchar(50) null
    ,DataElement70 varchar(50) null
    ,DataElement71 varchar(50) null
    ,DataElement72 varchar(50) null
    ,DataElement73 varchar(50) null
    ,DataElement74 varchar(50) null
    ,DataElement75 varchar(50) null
    ,DataElement76 varchar(50) null
    ,DataElement77 varchar(50) null
    )


    -Mark
    MSSQL 2019 Standard, Azure Hosted. Techie/Sysadmin by trade; Three years as a "DBA" now.

  • Kristen-173977 wrote:

    usererror wrote:

    I thought about doing a new Row 1 with just these items and then filling in the following columns with empty data and then a Union All, but then when I export it I'd have a whole bunch of column delimiter characters I'd still have to remove via a Text editor. 

    Is the main export data output such that multiple columns are delimited (as part of the OUTPUT), and is that what is stopping you from doing a UNION ALL?

    If so then maybe this (using "#" as an indication of a Delimiter Character)

    Select CONCAT('9999', '#', (select TOP 1 FILE_DT from MyTable), '#', (select count(*) from MyTable) )
    UNION ALL
    SELECT CONCAT(Col1, '#', Col2, '#', CONVERT(varchar(99), MyDateCol, 999), '#', ..., Col999)
    FROM TableToExport

    Untested !

    I just gave this a test drive and it works! I'm waiting on the final dataset to be ready, but I'll be using this in production today. I would still like to learn how to do this with another method, but at least I have this one working. Thank you for your help! Also, I don't know why I never thought of using SELECT TOP 1 for the FILE_DT. I'm sure that method saves resources.


    -Mark
    MSSQL 2019 Standard, Azure Hosted. Techie/Sysadmin by trade; Three years as a "DBA" now.

  • usererror wrote:

    Kristen-173977 wrote:

    usererror wrote:

    I thought about doing a new Row 1 with just these items and then filling in the following columns with empty data and then a Union All, but then when I export it I'd have a whole bunch of column delimiter characters I'd still have to remove via a Text editor. 

    Is the main export data output such that multiple columns are delimited (as part of the OUTPUT), and is that what is stopping you from doing a UNION ALL?

    If so then maybe this (using "#" as an indication of a Delimiter Character)

    Select CONCAT('9999', '#', (select TOP 1 FILE_DT from MyTable), '#', (select count(*) from MyTable) )
    UNION ALL
    SELECT CONCAT(Col1, '#', Col2, '#', CONVERT(varchar(99), MyDateCol, 999), '#', ..., Col999)
    FROM TableToExport

    Untested !

    I just gave this a test drive and it works! I'm waiting on the final dataset to be ready, but I'll be using this in production today. I would still like to learn how to do this with another method, but at least I have this one working. Thank you for your help! Also, I don't know why I never thought of using SELECT TOP 1 for the FILE_DT. I'm sure that method saves resources.

    This has a *potential* problem - SQL does not guarantee the order of the data unless there is an ORDER BY.  This may work just fine - until it doesn't...

    Once you have that... then using SSIS or PowerShell or BCP, it should simply be an export of that data directly to a file.

    For example - in PowerShell:

    Invoke-Sqlcmd -ServerInstance {your server here} -Query {Your Query/Procedure Here} | Out-File {your file name here};

    If you have something that is a bit more complex - for example, you need to sort the data by multiple columns...

    Select RecordType = 0    -- Header Record
    , RecordDetail = 0       -- Detail ID
    , RecordData = concat('0', '|', SomeHeaderData, '|', SomeOtherData)
    From YourTable
    Where ...
    Group By
    SomeGroup

    Union All

    Select RecordType = 1 -- Detail Record
    , SomeDetailID
    , RecordData = concat('1', '|', SomeDetailID, '|', OtherData, '|', ...)
    From YourTable
    Where ...

    Order By
    RecordType
    , RecordDetail;

    Now - in PowerShell you can do this:

    $recordData = Invoke-Sqlcmd -ServerInstance {your server here} -Query {Your Query/Procedure Here};
    $recordData | Select -ExpandProperty RecordData | Out-File {your file name here};


    Jeffrey Williams
    “We are all faced with a series of great opportunities brilliantly disguised as impossible situations.”

    ― Charles R. Swindoll

    How to post questions to get better answers faster
    Managing Transaction Logs
