Transpose Values from Rows to Columns based on Name and Shipment Type

  • Hi,

    I am working with SQL data that has a list of product names, shipment types, and the count of shipments. The values are stored as rows in the database, in the format shown below.

    I want to transpose only the shipment type and the corresponding count for each product name into the format shown below.

    I tried to do this, but I was not able to get the correct format.

    Any help is highly appreciated. I have attached the data for reference.
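
    (The attachment itself is not reproduced here; judging from the sample data quoted in the replies, the source rows look like this:

    Product_Name  ShipType  Freq
    P1            Air       88
    P1            Cargo     25
    P2            Air       17
    P2            Cargo     45
    P3            Air       19
    P3            Cargo     4

    and the desired output is one row per product, with a column per shipment type:

    Product_Name  Air  Cargo
    P1            88   25
    P2            17   45
    P3            19   4
    )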

    Regards,

    Karthik

  • This works, but I'm sure someone will correct it so it works better...

    You would use your table name where I have "x" (just aliasing the select statement that contains the data).

    SELECT Product_Name
         , SUM(Air) AS Air
         , SUM(Cargo) AS Cargo
    FROM (
        SELECT Product_Name
             , CASE WHEN ShipType = 'Air' THEN Freq ELSE 0 END AS Air
             , CASE WHEN ShipType = 'Cargo' THEN Freq ELSE 0 END AS Cargo
        FROM (SELECT 'P1' AS Product_Name, 'Air' AS ShipType, 88 AS Freq
              UNION ALL SELECT 'P1', 'Cargo', 25
              UNION ALL SELECT 'P2', 'Air', 17
              UNION ALL SELECT 'P2', 'Cargo', 45
              UNION ALL SELECT 'P3', 'Air', 19
              UNION ALL SELECT 'P3', 'Cargo', 4) x
    ) y
    GROUP BY Product_Name
    ORDER BY Product_Name;
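
    Against a real table, you would swap the inline data for the table itself. A sketch, assuming a hypothetical table dbo.YourShipments with columns Product_Name, ShipType and Freq (substitute your actual names):

    SELECT Product_Name
         , SUM(Air) AS Air
         , SUM(Cargo) AS Cargo
    FROM (
        SELECT Product_Name
             , CASE WHEN ShipType = 'Air' THEN Freq ELSE 0 END AS Air
             , CASE WHEN ShipType = 'Cargo' THEN Freq ELSE 0 END AS Cargo
        FROM dbo.YourShipments -- hypothetical table name; use your own here
    ) y
    GROUP BY Product_Name
    ORDER BY Product_Name;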

  • Another slightly simpler version (thanks pietlinden for the data subquery):

    with data as (
        select 'P1' as Product_Name, 'Air' as ShipType, 88 as Freq
        union all select 'P1', 'Cargo', 25
        union all select 'P2', 'Air', 17
        union all select 'P2', 'Cargo', 45
        union all select 'P3', 'Air', 19
        union all select 'P3', 'Cargo', 4
    )
    select d.Product_Name
          ,Air = sum(case when d.ShipType = 'Air' then d.Freq else 0 end)
          ,Cargo = sum(case when d.ShipType = 'Cargo' then d.Freq else 0 end)
    from data d
    group by d.Product_Name;

    The absence of evidence is not evidence of absence
    - Martin Rees
    The absence of consumable DDL, sample data and desired results is, however, evidence of the absence of my response
    - Phil Parkin

  • Hi,

    The values come from the database, so I will not be able to hardcore them.

    Is there any way to get the desired output?

  • karthik82.vk (11/13/2014)


    Hi,

    The values come from the database, so I will not be able to hardcore them.

    Is there any way to get the desired output?

    Of course. Just remove the CTE and change the FROM clause to select from your table.

    As you did not provide DDL and sample data in your original post (follow the link in my signature for details on how to do that), we could not give you anything better than a CTE-based solution.
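
    In other words, something like this (dbo.YourShipments and the column names are guesses; substitute your real ones):

    select d.Product_Name
          ,Air = sum(case when d.ShipType = 'Air' then d.Freq else 0 end)
          ,Cargo = sum(case when d.ShipType = 'Cargo' then d.Freq else 0 end)
    from dbo.YourShipments d -- placeholder table name
    group by d.Product_Name;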


  • Phil Parkin (11/13/2014)


    karthik82.vk (11/13/2014)


    The values come from the database, so I will not be able to hardcore them.

    By the way, I think you meant HARD-CODE. If not, this forum may not be the right place for you :hehe:


  • Quick solution using dynamic cross-tab

    😎

    USE tempdb;
    GO
    SET NOCOUNT ON;

    -- Sample table and data
    IF OBJECT_ID('dbo.TBL_SAMPLE_PRODUCTS') IS NOT NULL DROP TABLE dbo.TBL_SAMPLE_PRODUCTS;

    CREATE TABLE dbo.TBL_SAMPLE_PRODUCTS
    (
        [Product Name]  VARCHAR(50) NOT NULL
       ,[Shippent Type] VARCHAR(10) NOT NULL
       ,[Count]         INT         NOT NULL
    );

    INSERT INTO dbo.TBL_SAMPLE_PRODUCTS ([Product Name], [Shippent Type], [Count])
    VALUES
        ('Product 01', 'Air',     88)
       ,('Product 01', 'Cargo',   25)
       ,('Product 02', 'Air',     17)
       ,('Product 02', 'Cargo',   46)
       ,('Product 02', 'Train',   46)
       ,('Product 03', 'Air',     19)
       ,('Product 03', 'Cargo',    4)
       ,('Product 04', 'Air',     76)
       ,('Product 04', 'Cargo',   22)
       ,('Product 05', 'Air',     31)
       ,('Product 05', 'Courier', 31)
       ,('Product 05', 'Cargo',   14)
       ,('Product 06', 'Air',     49)
       ,('Product 06', 'Cargo',   19)
       ,('Product 07', 'Air',     15)
       ,('Product 07', 'Train',   15)
       ,('Product 07', 'DBH',      2)
       ,('Product 07', 'Cargo',   44)
       ,('Product 08', 'Air',     17)
       ,('Product 08', 'Cargo',    2)
       ,('Product 09', 'Air',      1)
       ,('Product 09', 'Cargo',   12)
       ,('Product 10', 'Air',     19)
       ,('Product 10', 'Cargo',    3);

    -- Build the cross-tab statement: one ISNULL(MAX(CASE ...)) column per
    -- distinct [Shippent Type] value, then execute the generated SQL.
    DECLARE @SQL_STR NVARCHAR(MAX) = N'SELECT
        SP.[Product Name]
    ' + (
        SELECT
            N',ISNULL(MAX(CASE WHEN SP.[Shippent Type] = ' + NCHAR(39)
            + X.[Shippent Type] + NCHAR(39) + N' THEN SP.[Count] END),0) AS '
            + QUOTENAME(X.[Shippent Type])
        FROM
        (
            SELECT DISTINCT SP.[Shippent Type]
            FROM dbo.TBL_SAMPLE_PRODUCTS SP
        ) AS X
        FOR XML PATH(''), TYPE
    ).value('.[1]','NVARCHAR(MAX)') +
    N'
    FROM dbo.TBL_SAMPLE_PRODUCTS SP
    GROUP BY SP.[Product Name];';

    EXEC (@SQL_STR);

    Results:

    Product Name  Air  Cargo  Courier  DBH  Train
    ------------  ---  -----  -------  ---  -----
    Product 01     88     25        0    0      0
    Product 02     17     46        0    0     46
    Product 03     19      4        0    0      0
    Product 04     76     22        0    0      0
    Product 05     31     14       31    0      0
    Product 06     49     19        0    0      0
    Product 07     15     44        0    2     15
    Product 08     17      2        0    0      0
    Product 09      1     12        0    0      0
    Product 10     19      3        0    0      0
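
    With this sample data, the dynamic statement that gets built and executed expands to something like this (column list derived from the distinct [Shippent Type] values):

    SELECT
        SP.[Product Name]
        ,ISNULL(MAX(CASE WHEN SP.[Shippent Type] = 'Air' THEN SP.[Count] END),0) AS [Air]
        ,ISNULL(MAX(CASE WHEN SP.[Shippent Type] = 'Cargo' THEN SP.[Count] END),0) AS [Cargo]
        ,ISNULL(MAX(CASE WHEN SP.[Shippent Type] = 'Courier' THEN SP.[Count] END),0) AS [Courier]
        ,ISNULL(MAX(CASE WHEN SP.[Shippent Type] = 'DBH' THEN SP.[Count] END),0) AS [DBH]
        ,ISNULL(MAX(CASE WHEN SP.[Shippent Type] = 'Train' THEN SP.[Count] END),0) AS [Train]
    FROM dbo.TBL_SAMPLE_PRODUCTS SP
    GROUP BY SP.[Product Name];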
