January 11, 2023 at 2:25 pm
Hi there
I am trying to extract values from a JSON array object but I'm struggling. My code is as follows:
DECLARE @JSON NVARCHAR(MAX) =
'{
"ChannelIDs": [11,14,15,16],
"SerialNumber": "939029",
"ReadingStartDate": "2022-05-20T13:49:13",
"ReadingEndDate": "2023-01-09T17:44:05",
"Amount": 50,
"SortOrder": "Ascending",
"IncludeFirstDate": 1
}'
SELECT
device.ChannelIDs
FROM
OPENJSON(@JSON)
WITH
(
ChannelIDs NVARCHAR(MAX) '$.ChannelIDs' AS JSON
) AS device
OUTER APPLY
OPENJSON(device.ChannelIDs)
WITH
(
ChannelId INT '$.ChannelIDs'
) AS Readings
This produces the following results:
ChannelIDs
[11,14,15,16]
[11,14,15,16]
[11,14,15,16]
[11,14,15,16]
But what I want is the following:
SerialNumber ChannelID
939029 11
939029 12
939029 14
939029 15
939029 16
How can I get this output, please?
January 11, 2023 at 3:30 pm
Not pure JSON, but this will do it:
SELECT
SerialNumber,
channelIDs.value AS ChannelID
FROM
OPENJSON(@JSON)
WITH
(
SerialNumber INT '$.SerialNumber',
ChannelIDs NVARCHAR(MAX) '$.ChannelIDs' AS JSON
) AS device
CROSS APPLY STRING_SPLIT(REPLACE(REPLACE(ChannelIDs,'[',''),']',''),',') channelIDs
Note: You're not going to get a ChannelID of 12 with that input data.
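For what it's worth, assuming SQL Server 2016 or later (both OPENJSON and STRING_SPLIT need it), that should come back as:
SerialNumber ChannelID
939029       11
939029       14
939029       15
939029       16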
January 11, 2023 at 3:33 pm
Hi Ratbak
Thank you very much for that. That works perfectly!
January 11, 2023 at 6:38 pm
It's valid JSON. You could try either with two OPENJSONs or one OPENJSON and two JSON functions:
/* two OPENJSON's */
select SerialNumber, ojc.[value] as ChannelID
from openjson(@json) with (SerialNumber int '$.SerialNumber',
ChannelIDs nvarchar(max) '$.ChannelIDs' as json) oj
cross apply openjson(oj.ChannelIDs) ojc;
/* one OPENJSON and two functions */
select json_value(@json, '$.SerialNumber') SerialNumber,
oj.[value] as ChannelID
from openjson(json_query(@json, '$.ChannelIDs')) oj;
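In both queries [value] comes from OPENJSON's default schema: called without a WITH clause it returns one row per array element with key, value and type columns, e.g.
select [key], [value], [type]
from openjson(N'[11,14,15,16]');
/* key  value  type
   0    11     2
   1    14     2
   2    15     2
   3    16     2     -- type 2 = number */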
No one shall expel us from the paradise that Cantor has created for us.
January 12, 2023 at 5:23 pm
Steve wrote: It's valid JSON. You could try either with two OPENJSONs or one OPENJSON and two JSON functions:
/* two OPENJSON's */
select SerialNumber, ojc.[value] as ChannelID
from openjson(@json) with (SerialNumber int '$.SerialNumber',
ChannelIDs nvarchar(max) '$.ChannelIDs' as json) oj
cross apply openjson(oj.ChannelIDs) ojc;
/* one OPENJSON and two functions */
select json_value(@json, '$.SerialNumber') SerialNumber,
oj.[value] as ChannelID
from openjson(json_query(@json, '$.ChannelIDs')) oj;
I don't work with JSON but I learned some new things today. Thanks, Steve.
--Jeff Moden
Change is inevitable... Change for the better is not.
January 12, 2023 at 7:45 pm
Steve wrote: It's valid JSON. You could try either with two OPENJSONs or one OPENJSON and two JSON functions:
/* two OPENJSON's */
select SerialNumber, ojc.[value] as ChannelID
from openjson(@json) with (SerialNumber int '$.SerialNumber',
ChannelIDs nvarchar(max) '$.ChannelIDs' as json) oj
cross apply openjson(oj.ChannelIDs) ojc;
/* one OPENJSON and two functions */
select json_value(@json, '$.SerialNumber') SerialNumber,
oj.[value] as ChannelID
from openjson(json_query(@json, '$.ChannelIDs')) oj;
I like option one best, because that'll also work if you process an array of objects, like this:
DECLARE @json NVARCHAR(MAX) =
'[
{
"ChannelIDs": [11,14,15,16],
"SerialNumber": "939029",
"ReadingStartDate": "2022-05-20T13:49:13",
"ReadingEndDate": "2023-01-09T17:44:05",
"Amount": 50,
"SortOrder": "Ascending",
"IncludeFirstDate": 1
},
{
"ChannelIDs": [11,14,15,16],
"SerialNumber": "939028",
"ReadingStartDate": "2022-05-20T13:49:13",
"ReadingEndDate": "2023-01-09T17:44:05",
"Amount": 40,
"SortOrder": "Ascending",
"IncludeFirstDate": 1
}
]'
But I would probably have used a WITH specification on both OpenJSONs, in order to assign an explicit data type to ChannelID (the array of unnamed values):
/* two OPENJSON's */
select
SerialNumber,
ojc.ChannelID
from openjson(@json) with (SerialNumber int '$.SerialNumber',
ChannelIDs nvarchar(max) '$.ChannelIDs' as json
) oj
cross apply openjson(oj.ChannelIDs) with (ChannelID int '$') ojc;
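If I've read the WITH clauses right, run against the two-object array above that should return eight rows, something like:
SerialNumber ChannelID
939029       11
939029       14
939029       15
939029       16
939028       11
939028       14
939028       15
939028       16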
January 12, 2023 at 7:59 pm
TBH, I have an extreme dislike for all of the solutions on this thread, not because of how they were coded, but because such code was required to begin with. I'll never understand why people think JSON, XML, EDI, or any of the other stuff is in any way effective for data transmission. 😀
--Jeff Moden
Change is inevitable... Change for the better is not.
January 13, 2023 at 3:00 am
Jeff Moden wrote: TBH, I have an extreme dislike for all of the solutions on this thread, not because of how they were coded, but because such code was required to begin with. I'll never understand why people think JSON, XML, EDI, or any of the other stuff is in any way effective for data transmission. 😀
From an API perspective, in many cases JSON is the only format application data clients are prepared to receive. For most web/JavaScript UI components it's all JSON, and for iPhone/Android every request and response gets sent as JSON. IMO it's a good, simple model, and it solves inconvenient issues like escape characters and delimiters. The alternative involves custom coding/recoding to and from SQL Server data types, no? JSON helps level the playing field by making SQL Server more easily integrated with how external systems work.
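For example, an embedded delimiter needs no special handling; a quick sketch of what OPENJSON hands back for a value containing a comma:
select [value]
from openjson(N'["a,b","c"]');
/* value
   a,b       -- the embedded comma survives intact
   c         -- a naive split on commas would have broken "a,b" apart */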
No one shall expel us from the paradise that Cantor has created for us.
January 13, 2023 at 3:13 am
And there's nothing wrong with ratbak's approach; it could be an efficient way. The downside is having to do a double REPLACE to strip the array brackets.
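Without them the brackets travel with the first and last elements, e.g.:
select [value]
from string_split(N'[11,14,15,16]', ',');
/* value
   [11
   14
   15
   16]  */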
No one shall expel us from the paradise that Cantor has created for us.
January 13, 2023 at 6:31 am
Steve wrote: The alternative involves custom coding/recoding to and from SQL Server data types, no?
You just wrote custom code on the database side of the house by including INT and NVARCHAR(MAX) in the code, no? 😛
--Jeff Moden
Change is inevitable... Change for the better is not.
January 13, 2023 at 3:59 pm
Jeff Moden wrote: You just wrote custom code on the database side of the house by including INT and NVARCHAR(MAX) in the code, no? 😛
Of course, the differentiated database code is custom. But why should all of the code outside of SQL be custom too? For my part, I redrew the system lines to include delivering/receiving the data to/from data clients, and then refactored the whole integration several times, across generations of systems. In the past it has always seemed like non-relational programmers have a set of tools to assist them with application development, but when it comes to SQL everything is custom. My original goal was to make my own situation sane and to build and deploy data models. Explaining these issues by analogy or from first principles has been a struggle so far. In the end, would even a well-crafted explanation be enough to convince other developers? Meh, probably not. About as far as Ming's books go. What gets people's attention? Applications: what they can see and interact with! So I finished Forum API and built a demonstration web application. JSON is a huge enabler of all of this.
No one shall expel us from the paradise that Cantor has created for us.
January 13, 2023 at 4:46 pm
Totally understood on that... but do you suppose that they'll actually use your good code to "normalize the data" and store that normalized data in a database or store the JSON and parse it every time they need to look at it?
My original point was meant to be that people actually use JSON (and other things like XML) to transmit large data in batches, and the tag bloat and human-readable numbers (instead of passing the 4-byte binaries, for example) are a bit crazy... especially when they do it from one SQL Server to another.
--Jeff Moden
Change is inevitable... Change for the better is not.
January 13, 2023 at 8:03 pm
Jeff Moden wrote: but do you suppose that they'll actually use your good code to "normalize the data" and store that normalized data in a database or store the JSON and parse it every time they need to look at it?
No supposing needed: the data is stored in a normalized data model and accessed using ORM-free API endpoints which reference stored procedures 🙂 The demonstration website is a template for clients' further development (as well as for my own two sites). As a product, the forum website is an upgrade on the base package, which is just a "users" table, authentication/authorization, and a pre-configured API.
No one shall expel us from the paradise that Cantor has created for us.
January 13, 2023 at 8:50 pm
Jeff Moden wrote: Totally understood on that... but do you suppose that they'll actually use your good code to "normalize the data" and store that normalized data in a database or store the JSON and parse it every time they need to look at it?
Well, the flip side of that question is: if you're passing the information on to a system that wants to read JSON, do you store the SQL-ified data in your database and then convert it to JSON every time it's requested, or just store the JSON and pass that along? Especially given the somewhat volatile nature of JSON/XML/etc., since if your initial parse of the input was incomplete for whatever reason, that information is now lost.
January 13, 2023 at 9:12 pm
Jeff Moden wrote: Totally understood on that... but do you suppose that they'll actually use your good code to "normalize the data" and store that normalized data in a database or store the JSON and parse it every time they need to look at it?
Well, the flip side of that question is: if you're passing the information on to a system that wants to read JSON, do you store the SQL-ified data in your database and then convert it to JSON every time it's requested, or just store the JSON and pass that along? Especially given the somewhat volatile nature of JSON/XML/etc., since if your initial parse of the input was incomplete for whatever reason, that information is now lost.
"It Depends"... can they handle loading 2-6 times the amount of data caused by the tags? 😀 You would also have to consider the change rate and size of the data on the sender side, etc, etc.
--Jeff Moden
Change is inevitable... Change for the better is not.