February 22, 2018 at 5:08 am
Hi,
I am looking for any guidance you may be able to offer on grouping nested data.
Currently I have four tables, which are essentially locations, location sources, depths, and sample results. I would like all the sample results for the same depth to appear within the same JSON object.
My current code is below. It creates the correct number of levels, but it doesn't group the sample results for the same depth.

SELECT TOP (100)
    LOCATIONS.well_id AS 'locations.location_id',
    LOCATIONS.name AS 'locations.name',
    REL_LOCATIONS_ABSTRACT.USER_ABSTRACT_ID AS 'locations.source.abstract_id',
    SAMPLE_DEPTH.DEPTH_TOP AS 'locations.source.sample_depth.depth_top',
    SAMPLE_DEPTH.DEPTH_BASE AS 'locations.source.sample_depth.depth_base',
    ANALYTE_DATA.VALUE_ANALYTE_ID AS 'locations.source.sample_depth.analyte_data.analyte',
    ANALYTE_DATA.VALUE_USABLE AS 'locations.source.sample_depth.analyte_data.data'
FROM ANALYTE_DATA
INNER JOIN SAMPLE_DEPTH
    ON ANALYTE_DATA.GC_SAMPLE_DEPTH_ID = SAMPLE_DEPTH.GC_SAMPLE_DEPTH_ID
INNER JOIN REL_LOCATIONS_ABSTRACT
    ON SAMPLE_DEPTH.REL_GCHEADER_ABSTRACT_ID = REL_LOCATIONS_ABSTRACT.REL_WELLS_ABSTRACT_ID
INNER JOIN LOCATIONS
    ON LOCATIONS.well_id = REL_LOCATIONS_ABSTRACT.WELL_ID
WHERE LOCATIONS.name = 'GEBEL_MAGHARA'
  AND SAMPLE_DEPTH.DEPTH_BASE = 3
FOR JSON PATH
The output is below; as you can see, all of this content comes from the same sample depth, yet each result ends up in its own top-level object. I am sure I need to use CROSS APPLY somewhere, but I don't really know where or how.
Any help would be very much appreciated.
Many Thanks
Oliver
[
{
"locations":{
"location_id":9935,
"name":"GEBEL_MAGHARA",
"source":{
"abstract_id":"NABIB8013",
"sample_depth":{
"depth_base":3.00,
"analyte_data":{
"analyte":1,
"data":5.330000000000000e+000
}
}
}
}
},
{
"locations":{
"location_id":9935,
"name":"GEBEL_MAGHARA",
"source":{
"abstract_id":"NABIB8013",
"sample_depth":{
"depth_base":3.00,
"analyte_data":{
"analyte":3,
"data":3.100000000000000e+001
}
}
}
}
},
{
"locations":{
"location_id":9935,
"name":"GEBEL_MAGHARA",
"source":{
"abstract_id":"NABIB8013",
"sample_depth":{
"depth_base":3.00,
"analyte_data":{
"analyte":8,
"data":4.260000000000000e+002
}
}
}
}
},
{
"locations":{
"location_id":9935,
"name":"GEBEL_MAGHARA",
"source":{
"abstract_id":"NABIB8013",
"sample_depth":{
"depth_base":3.00,
"analyte_data":{
"analyte":4,
"data":9.700000000000000e-001
}
}
}
}
},
{
"locations":{
"location_id":9935,
"name":"GEBEL_MAGHARA",
"source":{
"abstract_id":"NABIB8013",
"sample_depth":{
"depth_base":3.00,
"analyte_data":{
"analyte":5,
"data":1.069000000000000e+001
}
}
}
}
},
{
"locations":{
"location_id":9935,
"name":"GEBEL_MAGHARA",
"source":{
"abstract_id":"NABIB8007",
"sample_depth":{
"depth_base":3.00,
"analyte_data":{
"analyte":7,
"data":1.400000000000000e-001
}
}
}
}
},
{
"locations":{
"location_id":9935,
"name":"GEBEL_MAGHARA",
"source":{
"abstract_id":"NABIB8007",
"sample_depth":{
"depth_base":3.00,
"analyte_data":{
"analyte":14,
"data":0.000000000000000e+000
}
}
}
}
}
]
February 22, 2018 at 5:27 am
February 22, 2018 at 5:45 am
Thank you for the response, and sorry for the confusion: I am trying to write to JSON rather than read from it, so I don't understand how this will help me.
Thanks
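For reference, one way to get that grouping without CROSS APPLY is to build each nested level as a correlated subquery that itself ends in FOR JSON PATH; when the outer query also ends in FOR JSON, SQL Server embeds each inner FOR JSON result as a nested JSON array instead of repeating the parent values once per row. Below is a minimal, untested sketch reusing the table names and join keys from the query above; the aliases L, R, SD, and AD are shorthand introduced here, and the filters mirror the original WHERE clause.

SELECT TOP (100)
    L.well_id AS [location_id],
    L.name AS [name],
    -- one array element per source linked to this location
    (SELECT
         R.USER_ABSTRACT_ID AS [abstract_id],
         -- one array element per depth belonging to this source
         (SELECT
              SD.DEPTH_TOP AS [depth_top],
              SD.DEPTH_BASE AS [depth_base],
              -- one array element per analyte result at this depth
              (SELECT
                   AD.VALUE_ANALYTE_ID AS [analyte],
                   AD.VALUE_USABLE AS [data]
               FROM ANALYTE_DATA AS AD
               WHERE AD.GC_SAMPLE_DEPTH_ID = SD.GC_SAMPLE_DEPTH_ID
               FOR JSON PATH) AS [analyte_data]
          FROM SAMPLE_DEPTH AS SD
          WHERE SD.REL_GCHEADER_ABSTRACT_ID = R.REL_WELLS_ABSTRACT_ID
            AND SD.DEPTH_BASE = 3
          FOR JSON PATH) AS [sample_depth]
     FROM REL_LOCATIONS_ABSTRACT AS R
     WHERE R.WELL_ID = L.well_id
     FOR JSON PATH) AS [source]
FROM LOCATIONS AS L
WHERE L.name = 'GEBEL_MAGHARA'
FOR JSON PATH;

With this shape, source, sample_depth, and analyte_data each come back as JSON arrays rather than single objects, so every analyte for the same depth sits inside one analyte_data array. If a level is known to hold exactly one row and should be a plain object instead, the inner FOR JSON PATH can take the WITHOUT_ARRAY_WRAPPER option, wrapped in JSON_QUERY() so the result is embedded as JSON rather than escaped as a string.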