None of the solutions mentioned here worked for me, because the issue was in the assignment operation on the JSON content loaded into memory.
I found those solutions inconsistent: '-Depth 100' only worked in some instances, and the '-AsArray' parameter wraps the entire in-memory JSON document in a JSON array.
I have many JSON files with single-element arrays that kept being collapsed from arrays into plain objects.
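To make the cause concrete, here is a minimal repro sketch (my own illustration, not from my production code; it assumes Windows PowerShell 5.1, where piping a single-element array unrolls it to a scalar):

$obj = '{ "Items": [ "one" ] }' | ConvertFrom-Json
$obj.Items = $obj.Items | Sort-Object        # the pipeline unrolls the single element, so Items is now a plain string
$obj | ConvertTo-Json                        # Items is serialized as "one", not as an array; the array was lost in the assignment

$obj = '{ "Items": [ "one" ] }' | ConvertFrom-Json
$obj.Items = @($obj.Items | Sort-Object)     # @() keeps the result an array regardless of element count
$obj | ConvertTo-Json                        # Items is serialized as [ "one" ]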
I rewrote the JSON array assignment to fix this issue. A snippet of my code is below:
hidden [bool] RunSort([string] $redacted_data_key)
{
    # The key is a dotted path, e.g. 'node1.node2.node3', pointing at the parent of the Redacteds array
    $jsonNodes = $redacted_data_key -split '\.'
    if ($this.content.$($jsonNodes[0]).$($jsonNodes[1]).$($jsonNodes[2]).Redacteds)
    {
        # Wrap the sorted pipeline output in @() so a single-element result stays an array
        $this.content.$($jsonNodes[0]).$($jsonNodes[1]).$($jsonNodes[2]).Redacteds = @($this.RedactedData[$redacted_data_key] | Sort-Object)
        $this.SortData["RedactedsSorted"].Add($redacted_data_key)
        return $true
    }
    else
    {
        return $false
    }
}
...
$this.content | ConvertTo-Json -depth 100 | Format-Json | Out-File $this.SortData["FilePath"] -NoNewline -Encoding $this.jsonFileEncodingType
In conclusion: check your assignment operations. The problem occurs there and is not caused by the PowerShell 'ConvertTo-Json' cmdlet. I wrapped my assignment in the @() array subexpression operator, and that solved the issue for me.
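If you want to sanity-check that an assignment preserved the array before serializing, a quick illustrative check (not part of my original code; $node stands for whatever property you just assigned) is:

$node = @($node | Sort-Object)
$node -is [array]        # $true means ConvertTo-Json will serialize this value as a JSON array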
,@("one") | ConvertTo-Json
:[ "one" ]
– Photojournalism