Query to combine two fields into one output

  • Hello,

    I have to combine two fields into one output field using the following rules:

    ID1: 1234
    ID2: 76541

    Output must be:

    - first 7 digits: ID1 with leading 0s.

    - then the next 7 digits: ID2 with leading 0s.

    Result: 00012340076541

    Thanks,

    Brano

  • SELECT

    RIGHT('0000000' + CAST(1234 AS VARCHAR(7)), 7) +

    RIGHT('0000000' + CAST(76541 AS VARCHAR(7)), 7)

  • Hmmm

  • Hi,

    Try this T-SQL.

    DECLARE @ID1 VARCHAR(7)

    DECLARE @ID2 VARCHAR(7)

    SET @ID1 = '1234'

    SET @ID2 = '76541'

    SELECT RIGHT('0000000'+@ID1,7)+RIGHT('0000000'+@ID2,7)
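    The RIGHT('0000000' + @ID, 7) trick above can be sanity-checked outside SQL Server. A minimal Python sketch of the same fixed-width padding, using the sample values from the thread:

    ```python
    def pad7(value):
        # Emulate T-SQL RIGHT('0000000' + CAST(value AS VARCHAR(7)), 7):
        # prepend seven zeros, then keep only the last 7 characters.
        return ('0000000' + str(value))[-7:]

    print(pad7(1234) + pad7(76541))  # 00012340076541
    ```

    The result matches the 14-character output the original poster asked for.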

    Shatrughna

  • Thank you guys,

    It works!

    Thank you again,

    Brano

    For a variable-size code, you can do something like this:

    DECLARE @Size INT = 7;

    SELECT

    REPLICATE('0', @Size-LEN(CAST(1234 AS VARCHAR(MAX)))) + CAST(1234 AS VARCHAR(MAX)) +

    REPLICATE('0', @Size-LEN(CAST(76541 AS VARCHAR(MAX)))) + CAST(76541 AS VARCHAR(MAX));
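    The REPLICATE approach above can be mirrored in Python with str.zfill, which pads to a variable width the same way (one difference worth noting: in T-SQL, REPLICATE with a negative length returns NULL when the value is already longer than @Size, whereas zfill simply leaves the string unpadded):

    ```python
    def pad(value, size=7):
        # Emulate REPLICATE('0', @Size - LEN(v)) + v for a variable width.
        return str(value).zfill(size)

    print(pad(1234) + pad(76541))        # 00012340076541
    print(pad(1234, 5) + pad(76541, 5))  # 0123476541
    ```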

    Best regards,


    Andre Guerreiro Neto

    Database Analyst
    http://www.softplan.com.br
    MCITPx1/MCTSx2/MCSE/MCSA

  • codebyo (9/29/2011)


    For a variable size code, you can do something like this:

    DECLARE @Size INT = 7;

    SELECT

    REPLICATE('0', @Size-LEN(CAST(1234 AS VARCHAR(MAX)))) + CAST(1234 AS VARCHAR(MAX)) +

    REPLICATE('0', @Size-LEN(CAST(76541 AS VARCHAR(MAX)))) + CAST(76541 AS VARCHAR(MAX));

    Best regards,

    That will work, and I realize you implied "for any size", but if fixed sizes are all that's required, fixed sizes will be a fair bit faster than calculating the number of zeros needed for every row. The same goes for conversions to VARCHAR(MAX), which is serious overkill for 7-digit conversions. In many cases, just switching from VARCHAR(n) to VARCHAR(MAX) will slow performance down nearly 2 to 1.
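    A rough way to see the difference yourself is to time the same padding over a large row set with each type. This is only a sketch, not a tuned benchmark; the cross join of sys.all_columns is just an assumed cheap row source:

    ```sql
    -- Sketch: time the same padding over ~1,000,000 rows, VARCHAR(7) vs VARCHAR(MAX).
    SET STATISTICS TIME ON;

    SELECT MAX(RIGHT('0000000' + CAST(t.n AS VARCHAR(7)), 7))
    FROM (SELECT TOP (1000000) ROW_NUMBER() OVER (ORDER BY (SELECT NULL)) AS n
          FROM sys.all_columns ac1 CROSS JOIN sys.all_columns ac2) t;

    SELECT MAX(RIGHT('0000000' + CAST(t.n AS VARCHAR(MAX)), 7))
    FROM (SELECT TOP (1000000) ROW_NUMBER() OVER (ORDER BY (SELECT NULL)) AS n
          FROM sys.all_columns ac1 CROSS JOIN sys.all_columns ac2) t;

    SET STATISTICS TIME OFF;
    ```

    Compare the elapsed times reported in the Messages tab for the two queries.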

    --Jeff Moden


    RBAR is pronounced "ree-bar" and is a "Modenism" for Row-By-Agonizing-Row.
    First step towards the paradigm shift of writing Set Based code:
    ________Stop thinking about what you want to do to a ROW... think, instead, of what you want to do to a COLUMN.

    Change is inevitable... Change for the better is not.


    Helpful Links:
    How to post code problems
    How to Post Performance Problems
    Create a Tally Function (fnTally)

    Jeff Moden (10/2/2011)

    The same goes for conversions to VARCHAR(MAX), which is serious overkill for 7-digit conversions. In many cases, just switching from VARCHAR(n) to VARCHAR(MAX) will slow performance down nearly 2 to 1.

    Is there an article or an example that I could read about that?

    Our clients have just upgraded SQL Server from 2000 to 2005 SP3, so I'm starting to use VARCHAR(MAX) and NVARCHAR(MAX) more widely for new variables in my code whenever I need them, and also for new table columns that will contain a lot of text.

    I thought that SQL Server would be more flexible in choosing the right size for a variable-length string when using MAX.

    Are there any recommendations about using VARCHAR(n) instead of VARCHAR(MAX)? I would appreciate any help with this because we're starting to use that feature.

    Thank you for your attention and help.

    Best regards,


    Andre Guerreiro Neto


    I don't believe anyone has written about the performance penalty of using VARCHAR(MAX), and I haven't made a specific test for it. My statement comes from when I was testing the new delimited splitter I wrote... as soon as I changed the input parameter from VARCHAR(8000) to VARCHAR(MAX), and made sure to account for the implicit conversions CHARINDEX would introduce (its output changes from INT to BIGINT), the duration of the code doubled even though the data was exactly the same as when I was using VARCHAR(8000).

    Although a lot of folks will tell you that "pre-optimization" is a waste of time, I'll tell you that correct datatyping can really save your code. A lot of folks don't understand that many SQL functions, such as CHARINDEX, will change the datatype of their output without alerting you to the change. That can make for some seriously slow code because of all the implicit conversions the code may have to make when comparing data in a column or constant to the output of such functions.

    A great example of such a hidden problem actually goes the "other way" for scale on datatypes...

    SELECT whatever

    FROM sometable

    WHERE someBITcolumn = 1

    Can you see what's wrong with the code above? A lot of folks can't. Think "implicit conversion" for the answer.

    I suppose I should write a million row test for a couple of such scenarios. I've just gotta find some time to do that.
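    One way to sidestep the implicit conversion in the example above is to type the constant explicitly. A sketch, reusing the placeholder names from the example:

    ```sql
    -- Explicitly typing the constant keeps both sides of the comparison BIT,
    -- so no CONVERT_IMPLICIT is needed on the column.
    SELECT whatever
    FROM sometable
    WHERE someBITcolumn = CAST(1 AS BIT);
    ```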

    --Jeff Moden



  • Never thought about bit fields and constants before.

    If I do the code above and the bit field is non-indexed, I get an index scan with the predicate:

    CONVERT_IMPLICIT(tinyint, [SomeTable].[SomeBITField],0)=(1)

    so the bit field is being implicitly converted to a tinyint to compare with the constant, which is bad because it makes the clause non-SARGable, right?

    But... if I create an index on the bit field, I then actually get an Index Seek (not a scan)

    with the search predicate of:

    [SomeTable].[SomeBITField]=ScalarOperator(@1)

    which suggests that when an index is involved, the optimizer is smart enough to convert the constant into a bit and use the index anyway ???
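    A quick repro of what you describe could look like this. This is only a sketch; the names are invented, and the plan shapes you actually get depend on version, statistics, and row counts:

    ```sql
    -- Hypothetical repro: compare the predicates in the actual execution plan
    -- with and without the index on the BIT column.
    CREATE TABLE dbo.SomeTable
    (
        ID           INT IDENTITY PRIMARY KEY,
        SomeBITField BIT NOT NULL
    );

    CREATE INDEX IX_SomeTable_SomeBITField ON dbo.SomeTable (SomeBITField);

    -- With the index, the optimizer can seek: the constant is converted
    -- to BIT instead of the column being converted to TINYINT.
    SELECT ID
    FROM dbo.SomeTable
    WHERE SomeBITField = 1;
    ```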

    For the business logic needed, you could use the query below, which should be the fastest:

    select convert (varchar (20),'000')+ convert (varchar (20),id1)+convert (varchar (20),'000')+convert (varchar (20),id2)

    Performance Guard

    Just ask me to get 0 sec for all queries

    For that business case, you could also easily use the query below:

    select convert (varchar (20),'000')+ convert (varchar (20),@id1)+convert (varchar (20),'000')+convert (varchar (20),@id2)

    Regarding VARCHAR(MAX) and NVARCHAR(MAX), there are really two serious considerations that require high caution:

    1- Performance, starting from the basic fact that they cannot be indexed at all; they can only be kept in INCLUDE columns.

    2- SQL injection, since MAX types present a higher opportunity for vulnerability because they allow LOBs of more than 8000 characters, even more than 1 GB, which might subject a production cluster to a catastrophic point of failure on storage capacity.

    Performance Guard

  • Your logic is incorrect, the OP's original request was to pad out the numbers to 7 places, not add 3 zeroes on the front.

    That could be just:

    select convert (varchar (20),'000')+ convert (varchar (20),id1)+convert (varchar (20),'00')+convert (varchar (20),id2)

    since I added an extra 0 by mistake, and eventually we reach the same result required by the business logic.

    What I meant to say is that RIGHT, LEFT, or SUBSTRING can be dispensed with; just be sure to use convert(varchar(n)) for the needed variables.

    Performance Guard
