Can't extract values from JSON Array

  • ZZartin wrote:

    Well, the flip side of that question is: if you're passing the information on to a system that wants to read JSON, do you store the sqlified data in your database and then convert it back to JSON every time it's requested, or just store the JSON and pass that along?  Especially given the somewhat volatile nature of JSON/XML/etc., if your initial parse of the input was incomplete for whatever reason, that information is now lost.

    JSON has not changed data modeling at all as far as I'm concerned.  From a design and admin point of view, how many NVARCHAR(MAX) columns belong in a table?  IMO, not more than one.  Whenever possible I've normalized NVARCHAR(MAX) columns into their own table with a surrogate key and a foreign key reference back to the previously containing table.  As Jeff wrote, though, it depends.
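
    A minimal sketch of that normalization, assuming hypothetical names (dbo.Document and dbo.DocumentBody are illustrative, not from this thread):

        -- Narrow "hot" table keeps its pages dense
        CREATE TABLE dbo.Document
        (
            DocumentId INT IDENTITY(1,1) NOT NULL
                CONSTRAINT PK_Document PRIMARY KEY CLUSTERED,
            Title      NVARCHAR(200) NOT NULL
        );

        -- The lone NVARCHAR(MAX) column lives in its own table
        CREATE TABLE dbo.DocumentBody
        (
            DocumentBodyId INT IDENTITY(1,1) NOT NULL    -- surrogate key
                CONSTRAINT PK_DocumentBody PRIMARY KEY CLUSTERED,
            DocumentId     INT NOT NULL                  -- FK back to the containing table
                CONSTRAINT FK_DocumentBody_Document
                REFERENCES dbo.Document (DocumentId),
            Body           NVARCHAR(MAX) NOT NULL
        );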

    No one shall be able to drive us out of the paradise that Cantor has created for us.

  • Bump.  Maybe the prior posting was never visible.

    No one shall be able to drive us out of the paradise that Cantor has created for us.

  • Steve Collins wrote:

    Bump.  Maybe the prior posting was never visible.

    This site has had the problem of not showing the first post on subsequent pages until an additional post is made, and it's been that way for a very long time now.  I do wish they'd fix that, along with the broken "connect to a given post" links and the incorrect page counts caused by the SPAM removals, and a whole lot more.  Steve recently said they were going to work on some of these soon.

    --Jeff Moden


    RBAR is pronounced "ree-bar" and is a "Modenism" for Row-By-Agonizing-Row.
    First step towards the paradigm shift of writing Set Based code:
        Stop thinking about what you want to do to a ROW... think, instead, of what you want to do to a COLUMN.

    Change is inevitable... Change for the better is not.


    Helpful Links:
    How to post code problems
    How to Post Performance Problems
    Create a Tally Function (fnTally)

  • Steve Collins wrote:

    ZZartin wrote:

    Well, the flip side of that question is: if you're passing the information on to a system that wants to read JSON, do you store the sqlified data in your database and then convert it back to JSON every time it's requested, or just store the JSON and pass that along?  Especially given the somewhat volatile nature of JSON/XML/etc., if your initial parse of the input was incomplete for whatever reason, that information is now lost.

    JSON has not changed data modeling at all as far as I'm concerned.  From a design and admin point of view, how many NVARCHAR(MAX) columns belong in a table?  IMO, not more than one.  Whenever possible I've normalized NVARCHAR(MAX) columns into their own table with a surrogate key and a foreign key reference back to the previously containing table.  As Jeff wrote, though, it depends.

    I actually set the "large value types out of row" table option to have large data types (LOBs) stored "out of row" (to prevent "trapped short rows" and other page-density issues), which is effectively the same thing you've done, except I take it a step further.  I've also recently taken to using the TEXTIMAGE_ON option when creating a table so the large objects get stored on a different filegroup, which helps prevent a wealth of issues.

    Of course, TEXTIMAGE_ON doesn't work on existing tables, and so I'm also doing the swap'n'drop thing on some of my larger tables in some of my databases.
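
    A minimal sketch of both options, assuming hypothetical names (dbo.BigTable and the FG_LOB filegroup are illustrative, and the filegroup would need to already exist):

        -- New table only: TEXTIMAGE_ON places the LOB data on its own filegroup
        CREATE TABLE dbo.BigTable
        (
            BigTableId INT IDENTITY(1,1) NOT NULL PRIMARY KEY CLUSTERED,
            Payload    NVARCHAR(MAX) NULL
        )
        ON [PRIMARY]
        TEXTIMAGE_ON [FG_LOB];  -- hypothetical, pre-created filegroup

        -- Existing table: push LOB values off the data pages
        -- (only rows inserted or updated afterwards are affected)
        EXEC sys.sp_tableoption
             @TableNamePattern = N'dbo.BigTable',
             @OptionName       = 'large value types out of row',
             @OptionValue      = '1';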

    --Jeff Moden


    RBAR is pronounced "ree-bar" and is a "Modenism" for Row-By-Agonizing-Row.
    First step towards the paradigm shift of writing Set Based code:
        Stop thinking about what you want to do to a ROW... think, instead, of what you want to do to a COLUMN.

    Change is inevitable... Change for the better is not.


    Helpful Links:
    How to post code problems
    How to Post Performance Problems
    Create a Tally Function (fnTally)

  • Jeff Moden wrote:

    I actually set the "large value types out of row" table option to have large data types (LOBs) stored "out of row" (to prevent "trapped short rows" and other page-density issues), which is effectively the same thing you've done, except I take it a step further.  I've also recently taken to using the TEXTIMAGE_ON option when creating a table so the large objects get stored on a different filegroup, which helps prevent a wealth of issues.

    Of course, TEXTIMAGE_ON doesn't work on existing tables, and so I'm also doing the swap'n'drop thing on some of my larger tables in some of my databases.

    Both of those table options are unfortunately unavailable in Azure SQL, where the databases I manage reside.  Azure SQL doesn't have FILESTREAM either.  In most cases we try not to store the JSON at all; in the best case, JSON is a convenient, disposable medium of exchange.  Instead of writing custom C# methods specific to the parameters of the stored procedure(s) being referenced, if you pass the procedure parameters as JSON and the data as JSON, the whole data-access "ORM" type-mismatch problem can be eliminated in the best possible way.  From a .NET API, if you map HTTP headers, URL parameters, and request content bodies to NVARCHAR(MAX) input parameters of stored procedures, you can handle ANY request without needing custom code, using a common set of .NET methods that can be rigorously tested and given really good exception handling and error handling.
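
    A minimal sketch of that pattern, with hypothetical names (dbo.SaveOrder and the JSON shapes are illustrative, not an actual procedure from this discussion):

        CREATE OR ALTER PROCEDURE dbo.SaveOrder
            @Params NVARCHAR(MAX),  -- e.g. N'{"customerId": 42}'
            @Data   NVARCHAR(MAX)   -- e.g. N'[{"sku":"A1","qty":2},{"sku":"B7","qty":1}]'
        AS
        BEGIN
            SET NOCOUNT ON;

            DECLARE @CustomerId INT = JSON_VALUE(@Params, '$.customerId');

            -- A real procedure would INSERT these rows; SELECT keeps the sketch short
            SELECT @CustomerId AS CustomerId, d.sku, d.qty
            FROM OPENJSON(@Data)
                 WITH (sku VARCHAR(20) '$.sku',
                       qty INT         '$.qty') AS d;
        END;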

    In the other topic, with the looping through hierarchical C# objects to create non-hierarchical JSON (to convert back into hierarchically structured data in SQL), the .NET developer is likely complaining to his colleagues about how working with SQL Server data types is obtuse, and I don't think that's necessarily the case anymore.
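
    For what it's worth, nested JSON can also be shredded directly in T-SQL without flattening it first; a minimal sketch (the JSON shape and names are hypothetical):

        DECLARE @j NVARCHAR(MAX) = N'[{"orderId":1,
                                       "lines":[{"sku":"A1","qty":2},
                                                {"sku":"B7","qty":1}]}]';

        SELECT o.orderId, l.sku, l.qty
        FROM OPENJSON(@j)
             WITH (orderId INT           '$.orderId',
                   lines   NVARCHAR(MAX) '$.lines' AS JSON) AS o  -- keep child array as JSON
        CROSS APPLY OPENJSON(o.lines)
             WITH (sku VARCHAR(20) '$.sku',
                   qty INT         '$.qty') AS l;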

    No one shall be able to drive us out of the paradise that Cantor has created for us.

  • Heh... lordy.  OK... Two more reasons why I'm glad I don't have to deal with Azure.  "Trapped Short Rows" can cause relatively large decreases in performance.  Now I see why you used a "sister table".

    Heh... and SQL data types are "obtuse".  That's not the first time I've heard front-end developers make that claim and then resort to the likes of JSON or XML.  If they just made regular TSV data, that would work a treat compared to JSON.
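
    A rough illustration of that comparison, not anyone's production code (STRING_SPLIT is available in SQL Server 2016+ and Azure SQL; it doesn't guarantee output order unless the ordinal argument, available in newer versions, is used, which is fine for this sketch):

        DECLARE @tsv NVARCHAR(MAX) = N'A1' + CHAR(9) + N'2';  -- one tab-separated record

        -- Split on the tab character; no quoting or escaping rules to negotiate
        SELECT value
        FROM STRING_SPLIT(@tsv, CHAR(9));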

    --Jeff Moden


    RBAR is pronounced "ree-bar" and is a "Modenism" for Row-By-Agonizing-Row.
    First step towards the paradigm shift of writing Set Based code:
        Stop thinking about what you want to do to a ROW... think, instead, of what you want to do to a COLUMN.

    Change is inevitable... Change for the better is not.


    Helpful Links:
    How to post code problems
    How to Post Performance Problems
    Create a Tally Function (fnTally)

  • Jeff Moden wrote:

    If they just made ...

    Yes, the current situation of set-based development is quite conditional/permissional.  We're not at the final end state, though.

    No one shall be able to drive us out of the paradise that Cantor has created for us.
