February 3, 2020 at 11:43 am
Hi
I am after some advice on working with large JSON extract files produced from SQL Server 2017 stored procedures, for use in other NoSQL systems.
Currently I have a JSON file of around 1GB and I really need it split into smaller chunks for processing into those NoSQL systems. Is this better done within the export to JSON in SQL Server, or by post-processing the JSON output with Python or another tool? If I were to do it in SQL, what would be the most performant method of splitting into chunks of, say, 75k records per file?
Thanks
Mark
February 3, 2020 at 12:24 pm
At a rough guess you might write a loop (I'm assuming you have a sequential numeric id for each row), something like:
declare @rc int = 1
declare @lastrow int = 0
while @rc > 0
begin
    select top (75000) *
    from mytable
    where id > @lastrow
    order by id
    for json auto                     -- FOR JSON goes at the end of the query
    set @rc = @@rowcount              -- after FOR JSON this counts output chunks, not records, so only use it to detect an empty batch
    set @lastrow = @lastrow + 75000   -- step by the batch size; assumes ids are sequential with no gaps
end
Not sure how you are shredding the files, but this will give you multiple result sets... if that doesn't work for you then you'll have to do something similar in application code.
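One caveat: if the id column has gaps (deleted rows, identity jumps), stepping by a fixed 75000 will skip records. A rough variant that tracks the actual max id of each batch instead is sketched below; the table and column names (mytable, id) are placeholders for whatever you actually have:
declare @json nvarchar(max)
declare @lastrow int = 0
declare @maxid int
while 1 = 1
begin
    -- find the top id of the next 75k batch
    select @maxid = max(id)
    from (select top (75000) id
          from mytable
          where id > @lastrow
          order by id) as batch
    if @maxid is null break   -- nothing left to export
    -- build the JSON for just that id range
    set @json = (select *
                 from mytable
                 where id > @lastrow and id <= @maxid
                 order by id
                 for json auto)
    -- hand @json to whatever writes each file (bcp queryout, SSIS, app code, etc.)
    set @lastrow = @maxid
end
Either way, actually writing each chunk to its own file is probably easier from the calling side (bcp queryout, PowerShell, or a small script) than from inside the procedure.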
MVDBA