How to preserve leading zeros with insert into

  • Below is the CREATE TABLE script and the INSERT INTO statement. Both work fine in SSMS. However, the INSERT INTO drops my leading zeros. These are important because I have a query that I want to test out that clearly specifies certain character counts and values. I am hoping not to have to rewrite a very lengthy and complex query. Is there a way to do this using the INSERT INTO statement without having to import using the Import Wizard?

     

    Insert into cestest (stfips,areatype,area,periodyear,periodtype,period,seriescode,adjusted,benchmark,prelim,empces,empprodwrk,empfemale,hours,earnings,hourearn,supprecord,supphe,supppw,suppfem,hoursallwrkr,earningsallwrkr,hourearnallwrkr,suppheallwrkr)
    Values (32,01,000000,2019,03,11,00000000,0,2018,0,1453600,0,0,0.0,0.00,0.00,0,1,1,1,0.0,0.00,0.00,1),
    (32,01,000000,2019,03,11,00000000,1,2018,0,1443600,0,0,0.0,0.00,0.00,0,1,1,1,0.0,0.00,0.00,1),
    (32,01,000000,2019,03,12,00000000,0,2018,1,1446400,0,0,0.0,0.00,0.00,0,1,1,1,0.0,0.00,0.00,1),
    (32,01,000000,2019,03,12,00000000,1,2018,1,1440100,0,0,0.0,0.00,0.00,0,1,1,1,0.0,0.00,0.00,1);

    CREATE TABLE cestest
    (
    stfips char(2) NOT NULL,
    areatype char(2) NOT NULL,
    area char(6) NOT NULL,
    periodyear char(4) NOT NULL,
    periodtype char(2) NOT NULL,
    period char(2) NOT NULL,
    seriescode char(8) NOT NULL,
    adjusted char(1) NOT NULL,
    benchmark char(4) NULL,
    prelim char(1) NULL,
    empces numeric(9) NULL,
    empprodwrk numeric(9) NULL,
    empfemale numeric(9) NULL,
    hours numeric(3,1) NULL,
    earnings numeric(8,2) NULL,
    hourearn numeric(6,2) NULL,
    supprecord char(1) NULL,
    supphe char(1) NULL,
    supppw char(1) NULL,
    suppfem char(1) NULL,
    hoursallwrkr numeric(3,1) NULL,
    earningsallwrkr numeric(8,2) NULL,
    hourearnallwrkr numeric(6,2) NULL,
    suppheallwrkr char(1) NULL
    );

  • Not sure why it did not work the first time, but I put single quotes around each item in the VALUES list and it works. Previously, the values all turned red in SSMS and I got an incorrect syntax error.

  • Nevada, as you mentioned, the single quotes around each item being inserted into the corresponding "char" columns will fix the issue. The values turning red seems to indicate there was an additional or missing quote somewhere. Just be sure that the single quotes are added to each row within the list of VALUES.

    Data Type Precedence will cause implicit conversion as numeric or decimal (or integer in this scenario) data types have higher precedence than the char data type. For example, if you were to do the following:

    DECLARE @Test table
    (
    CharCol char (2) NOT NULL,
    NumericCol numeric (13, 2) NOT NULL
    );

    INSERT INTO @Test (CharCol, NumericCol)
    VALUES ('00', 5.43),
    (00, 5.43)

    SELECT *
    FROM @Test AS T;

    You'll notice that although the first row in the list of values contained single quotes, the insert still succeeded and the leading zeroes were removed from both rows, because the unquoted 00 in the second row makes the whole column of the VALUES constructor numeric.
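
    For comparison, here is a minimal sketch that reuses the same @Test table, with every row quoted. Since the constructor then contains only character literals, the column stays character-typed and the leading zeros survive:

    -- Every CharCol value is a quoted string, so the derived column of the
    -- VALUES constructor is character-typed and '00' is stored as-is.
    INSERT INTO @Test (CharCol, NumericCol)
    VALUES ('00', 5.43),
           ('00', 5.43);

    Quoting every value in the cestest insert works the same way, since the columns that need to keep their zeros are all char.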

  • Not directly related to the question, but surely SMALLINT would have been a better datatype for PeriodYear?

    The absence of evidence is not evidence of absence
    - Martin Rees
    The absence of consumable DDL, sample data and desired results is, however, evidence of the absence of my response
    - Phil Parkin

  • @Phil Parkin,

    You are probably correct. However, this schema/structure was not decided by me.

  • Maybe I should create a new question, but using that CREATE TABLE script and INSERT INTO statement, I am trying to run this query and getting no results. Any thoughts as to why?

    WITH c1 AS
    (SELECT e.stfips, e.areatype, e.area, e.periodyear, e.period, e.seriescode, e.empces
     FROM cestest as e
     WHERE e.periodtype='03'
     And e.supprecord='0'
     and e.periodyear=
     (Select Max(periodyear)
      From cestest)
     and e.period=
     (Select Top(1) period
      From cestest
      Order by periodyear desc, period desc)
     and e.stfips='32'
     and e.adjusted='1'
     and e.areatype='01'
    ),
    C2 AS
    (Select Distinct c1.periodyear, c1.period
     From c1
    ),
    C3 As
    (Select
     (Case When c2.period='01' Then (c2.period + 11) Else (c2.period-1) END) As 'month',
     (Case When c2.period='01' Then (c2.periodyear -1) Else (c2.periodyear) END) As 'year'
     From C2
    ),
    C4 AS
    (Select c.stfips, c.areatype, c.area, c.periodyear, c.period, c.seriescode, c.empces
     From cestest as c, c2
     Where c.period = c2.period
     And c.periodyear = (c2.periodyear-1)
     And c.supprecord='0'
     and c.stfips='32'
     and c.adjusted='1'
     and c.areatype='01'
    ),
    C5 AS
    (Select s.stfips, s.areatype, s.area, s.periodyear, s.period, s.seriescode, s.empces
     From cestest as s, C1, C3
     Where s.supprecord='0'
     And s.period = c3.month
     And s.periodyear = c3.year
     and s.stfips='32'
     and s.adjusted='1'
     and s.areatype='01'
    ),
    C6 As
    (Select d.seriescode, d.seriesttls, d.stfips
     From cescode d, C1
     Where d.stfips=C1.stfips
     And d.seriescode=C1.seriescode
     And (d.serieslvl ='0' Or d.serieslvl='2' Or d.serieslvl='7')
    )
    Select Distinct C6.seriesttls as 'Super Sector', c6.seriescode As 'Code', C1.empces As 'Employment(p)', C5.empces As 'Last Month', C4.empces As 'Last Year',
    c1.empces-c5.empces As 'Net Change Mnth',
    (Cast(Cast(((c1.empces /(c5.empces)-1)*100 ) AS decimal (6,2)) as varchar) + '%') '% Chg Mnth',
    c1.empces-c4.empces As 'Net Change Year',
    (Cast(Cast(((c1.empces /(c4.empces)-1)*100 ) AS decimal (6,2)) as varchar) + '%') '% Chg Year'
    From C1, C4, C5, C6
    Where (c6.seriescode = c1.seriescode and c6.seriescode = c4.seriescode and c6.seriescode = c5.seriescode)
    order by c6.seriescode
    ;

     

  • >> Below is the CREATE TABLE script [sic: statement] and the INSERT INTO statement. Both work fine in SSMS. However, the INSERT INTO drops my leading zeros. These are important because I have a query that I want to test out that clearly specifies certain character counts and values. <<

    Your problem is that you don't understand how SQL and RDBMS work. The concept of leading zeros makes no sense whatsoever. That's a display function, and the data in SQL is in some unspecified internal format (no, we didn't even say they were binary when we were writing the standards). The idea is that there is a display layer or tier in the RDBMS that receives the unspecified abstract data and formats it. Now, looking at what you did post, which was almost usable, you're better than 70 to 80% of the other noobs on these forums.

    You don't understand that by definition, not as an option, a table must have a key.

    You have no concept whatsoever of ISO 11179 naming rules or a valid data model. You don't seem to understand the difference between an attribute, an attribute property, and a value. For example, there is no such thing as a generic "period". When you had your logic courses in college, do you remember the law of identity? (To be is to be something in particular; to be nothing in particular or anything in general, is to be nothing at all.) So, by the most fundamental law of Western logic, this has to be "<something, in particular>_period", and then it can have attribute properties. In the ISO model, data elements are defined as "<attribute>_<attribute property>" when you declare your tables.

    Virtually every one of your columns has some fundamental conceptual error. When you mix things like "area" preceded by "area_type", this implies you have a denormalized schema that mixes multiple tables together. For example, why is there not a reference to a "Something_Periods" table, whose columns include "period_year CHAR(4) NOT NULL CHECK (period_year LIKE '[12][0-9][0-9][0-9]')" and "period_type CHAR(2) NOT NULL CHECK(period_type IN (...))"?
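
    To make that concrete, here is a minimal sketch of the kind of constrained periods table being described. The table name, the key, and the allowed period_type values are placeholders, not anything taken from the posted schema:

    -- Placeholder periods table: character keys with CHECK constraints,
    -- so only valid, zero-padded values can ever be stored.
    CREATE TABLE Report_Periods
    (
    period_year CHAR(4) NOT NULL
        CHECK (period_year LIKE '[12][0-9][0-9][0-9]'),
    period_type CHAR(2) NOT NULL
        CHECK (period_type IN ('01', '03')),  -- placeholder value list
    period_month CHAR(2) NOT NULL
        CHECK (period_month LIKE '[0-9][0-9]' AND period_month BETWEEN '01' AND '12'),
    PRIMARY KEY (period_year, period_type, period_month)
    );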

    I also hope, in spite of what I've seen in your posting, that you are not using bit flags in SQL. We did that in assembly language programming and gave it up decades ago because we knew how to design a schema.

    Please post DDL and follow ANSI/ISO standards when asking for help. 

  • I have no idea what bit flags are, so no, I am not using them. Second, the schema was not designed by me. I am simply problem-solving after having been assigned a task that seems like it should work but, as of yet, has not. The query above was not written by me either. If I am being honest, I did not take any college classes in computer science, information systems, or programming, nothing of that sort. My major was economics.

  • Could you share the (appropriately quoted) insert statement from your original question?  You've asked a second question without having shared the solution to the original question.  It's frustrating because we would first have to repeat what you've apparently already done.  Also, my undergrad major was Economics too so please don't let it be your excuse!

    Aus dem Paradies, das Cantor uns geschaffen, soll uns niemand vertreiben können

  • @scdecade,

    The solution to the original was that I put single quotes around each item listed. The first time I did this, it did not work, so I took the quotes off. From there, I discovered that the leading zeros were getting left off. For reasons I do not know, I put the quotes back on and the data imported fine.

    The purpose of creating this table and inserting the data (with leading zeros) is to test the query above in a "smaller setting" and find a way to modify/rewrite the C3 section. However, first things first... I am getting no results for the query above.
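
    One possible shape for that C3 rewrite would be to go through a real date value instead of CASE arithmetic on character columns. This is only a sketch, assuming SQL Server 2012+ for DATEFROMPARTS and that periodyear and period always hold valid year/month digits:

    -- Sample values stand in for C2's periodyear/period columns.
    DECLARE @periodyear char(4) = '2019', @period char(2) = '01';

    -- Build a date from the character columns, step back one month,
    -- then split it into a zero-padded month and a four-character year.
    SELECT
        RIGHT('0' + CAST(MONTH(DATEADD(MONTH, -1,
            DATEFROMPARTS(CAST(@periodyear AS int), CAST(@period AS int), 1))) AS varchar(2)), 2) AS [month],
        CAST(YEAR(DATEADD(MONTH, -1,
            DATEFROMPARTS(CAST(@periodyear AS int), CAST(@period AS int), 1))) AS char(4)) AS [year];
    -- With these sample values this returns month = '12', year = '2018'.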

  • Could you please share the (appropriately quoted) insert statement from your original question?

    Aus dem Paradies, das Cantor uns geschaffen, soll uns niemand vertreiben können

  • @scdecade

    The correct values are listed below. Upon looking at it more closely, I noticed that I left out cescode, so its create table script and values are below as well.

    Insert into cestest (stfips,areatype,area,periodyear,periodtype,period,seriescode,adjusted,benchmark,prelim,empces,empprodwrk,empfemale,hours,earnings,hourearn,supprecord,supphe,supppw,suppfem,hoursallwrkr,earningsallwrkr,hourearnallwrkr,suppheallwrkr)
    Values ('32','01','000000','2019','03','11','00000000','0','2018','0','1453600','0','0','0.0','0.00','0.00','0','1','1','1','0.0','0.00','0.00','1'),
    ('32','01','000000','2019','03','11','00000000','1','2018','0','1443600','0','0','0.0','0.00','0.00','0','1','1','1','0.0','0.00','0.00','1'),
    ('32','01','000000','2019','03','12','00000000','0','2018','1','1446400','0','0','0.0','0.00','0.00','0','1','1','1','0.0','0.00','0.00','1'),
    ('32','01','000000','2019','03','12','00000000','1','2018','1','1440100','0','0','0.0','0.00','0.00','0','1','1','1','0.0','0.00','0.00','1');

    CREATE TABLE cescodetest
    (
    stfips char(2) NOT NULL,
    seriescode char(8) NOT NULL,
    seriesdesc varchar(MAX) NULL,
    seriesttls varchar(60) NULL,
    seriesttll varchar(120) NULL,
    serieslvl char(1)
    );

    insert into cescodetest (stfips,seriescode,seriesdesc,seriesttls,seriesttll,serieslvl)
    values ('32','00000000','Total nonfarm','Total nonfarm','Total nonfarm','0'),
    ('00','00000000','Total nonfarm','Total nonfarm','Total nonfarm','0')
  • nevada19785 wrote:

    using that create table script and insert into statement, trying to run this query and getting no results. Any thoughts as to why? 

    I think the reason people want to see the INSERT statement you actually ran is so they can troubleshoot the query against that data, rather than against what we believe the correct INSERT to be.

    Looking at the CTE queries, I believe the problem is in C4.  You only have 4 records in your sample data here, which are all periodyear=2019, but C4 would be looking for periodyear=2018.
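
    For example, here is a sketch of one extra sample row (the values are placeholders modeled on your 2019 rows, including a made-up empces figure) that would give C4 a 2018 match for period 12:

    -- Placeholder 2018 row so C4's "periodyear = c2.periodyear - 1" lookup has something to find.
    Insert into cestest (stfips,areatype,area,periodyear,periodtype,period,seriescode,adjusted,benchmark,prelim,empces,empprodwrk,empfemale,hours,earnings,hourearn,supprecord,supphe,supppw,suppfem,hoursallwrkr,earningsallwrkr,hourearnallwrkr,suppheallwrkr)
    Values ('32','01','000000','2018','03','12','00000000','1','2018','0','1400000','0','0','0.0','0.00','0.00','0','1','1','1','0.0','0.00','0.00','1');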

  • C4 returns no rows

    Aus dem Paradies, das Cantor uns geschaffen, soll uns niemand vertreiben können
