Perl JSON to treat all numbers as strings

In order to create an API that is consistent for strictly typed languages, I need all JSON output to return quoted strings in place of integers, without going through the underlying data one by one and modifying it.
This is how JSON is generated now:
my $json = JSON->new->allow_nonref->allow_unknown->allow_blessed->utf8;
$output = $json->encode($hash);
What would be a good way to say, "And quote every scalar within that $hash"?

Both of JSON's backends (JSON::PP and JSON::XS) base the output type on the internal storage of the value. The solution is to stringify the non-reference scalars in your data structure.
sub recursive_inplace_stringification {
    my $reftype = ref($_[0]);
    if (!length($reftype)) {
        # Plain scalar: force string storage so the encoder emits it quoted.
        $_[0] = "$_[0]" if defined($_[0]);
    }
    elsif ($reftype eq 'ARRAY') {
        recursive_inplace_stringification($_) for @{ $_[0] };
    }
    elsif ($reftype eq 'HASH') {
        recursive_inplace_stringification($_) for values %{ $_[0] };
    }
    else {
        die("Unsupported reference to $reftype\n");
    }
}
# Convert numbers to strings.
recursive_inplace_stringification($hash);
# Convert to JSON.
my $json = JSON->new->allow_nonref->utf8->encode($hash);
If you actually need the functionality provided by allow_unknown and allow_blessed, you will need to reimplement it inside recursive_inplace_stringification (perhaps by copying it from JSON::PP, if licensing allows), or you could run the following before calling recursive_inplace_stringification:
# Convert objects to strings.
$hash = JSON->new->allow_nonref->decode(
    JSON->new->allow_nonref->allow_unknown->allow_blessed->encode(
        $hash));
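To illustrate the effect, here is a small sketch (with made-up data, not from the original question) that compares the output before and after running the recursive_inplace_stringification sub above:
use JSON;

my $hash = { count => 42, tags => [ 1, 2 ] };

# Numbers are stored internally as numbers, so they encode unquoted:
# {"count":42,"tags":[1,2]}
print JSON->new->canonical->encode($hash), "\n";

recursive_inplace_stringification($hash);

# After stringification every scalar encodes as a quoted string:
# {"count":"42","tags":["1","2"]}
print JSON->new->canonical->encode($hash), "\n";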

Related

Parsing JSON in PowerShell

I have an automation runbook in Azure that receives a JSON string in a variable, like so:
["value1","value2"]
The problem is that I can't figure out how to handle this variable and parse it.
$array = "'$jsonInput'"
$Services = $array | ConvertFrom-Json
switch($Services)
{
    value1  { $requested = $true ; break }
    Default { $requested = $false }
}
Using only single quotes does not resolve my input variable. Using "'$jsonInput'" does nothing and the variable $requested returns false. I believe this is because of the use of double quotes in the input variable. What is the correct way to parse this?
If I manually enter this when testing, it works.
$array = '["value1","value2"]'
But then again, I'm not sure how to take this JSON string without quotes, append the quotes, and resolve the variable.

Perl JSON::XS non-OO interface

All of the documentation and examples I have seen for the Perl JSON::XS module use an OO interface, e.g.
print JSON::XS->new->ascii()->pretty()->canonical()->encode($in);
But I don't necessarily want all those options every time, I'd prefer to send them in a hash like you can with the basic JSON module, e.g.
print to_json($in, { canonical => 1, pretty => 1, ascii => 1 } );
but sending that to encode_json yields
Too many arguments for JSON::XS::encode_json
Is there any way to do that?
JSON's to_json uses JSON::XS if it's installed, so if you want a version of to_json that uses JSON::XS, simply use the one from JSON.
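For instance, a minimal sketch of that approach (assuming the JSON module is installed and has picked up JSON::XS as its backend):
use JSON qw(to_json);

# JSON delegates to JSON::XS when it is available, so this keeps the
# option-hash interface while still using the XS encoder underneath.
print to_json($in, { canonical => 1, pretty => 1, ascii => 1 });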
Or, you could recreate to_json.
sub to_json {
    my $encoder = JSON::XS->new();
    if (@_ > 1) {
        my $opts = $_[1];
        for my $method (keys(%$opts)) {
            # Each option name is called as a setter method on the encoder.
            $encoder->$method($opts->{$method});
        }
    }
    return $encoder->encode($_[0]);
}
But that doesn't save you from passing in the options every time. If you're encoding multiple data structures, it's best to create a single object and reuse it.
my $encoder = JSON::XS->new->ascii->pretty->canonical;
print $encoder->encode($in);

How to create a JavaScript array from a Tcl array

Please assist me with this.
I have a Tcl array called all_tags, but I need to convert it into a JavaScript array in my page, and I am weak when it comes to JavaScript.
Please advise me whether the below is correct, and if not, what is the right way?
<script>
var mytags = new Array();
<%
foreach tag $all_tags {
    ns_puts [subst {
        mytags.push('$tag');
    }]
}
%>
</script>
And afterwards, is it possible to use my JavaScript array in a Tcl proc?
To turn data in Tcl into JSON, you want the json::write package from Tcllib. You'd use it like this to make a JSON object from a Tcl array (and a similar approach works for Tcl dictionaries):
package require json::write

set accumulate {}
foreach {key value} [array get yourArray] {
    lappend accumulate $key [json::write string $value]
}
set theJsonObject [json::write object {*}$accumulate]
To turn a Tcl list into a JSON array:
package require json::write

set accumulate {}
foreach item $yourList {
    lappend accumulate [json::write string $item]
}
set theJsonArray [json::write array {*}$accumulate]
Note in these two cases I've assumed that the values are all to be represented as JSON strings. If the values to embed are numbers (or true or false) you don't need to do anything special; the values as Tcl sees them work just fine as JSON literals. Embedding lists/arrays/dicts takes “recursive” use of json::write and a bit more planning — it's not automatic as Tcl and JSON have really very different concepts of types.

YAML::Tiny does not support JSON::XS::Boolean

When reading some JSON data structures, and then trying to Dump them using YAML::Tiny, I sometimes get the error
YAML::Tiny does not support JSON::XS::Boolean
I understand why this is the case (in particular YAML::Tiny does not support booleans, which JSON is keen to clearly distinguish from other scalars), but is there a quick hack to turn those JSON::XS::Boolean objects into plain 0's and 1's just for quick dump-to-the-screen purposes?
YAML::Tiny doesn't support objects. Unfortunately, it doesn't even have an option to just stringify all objects, which would handle JSON::XS::Boolean.
You can do that fairly easily with a recursive function, though:
use strict;
use warnings;
use 5.010; # for say
use JSON::XS qw(decode_json);
use Scalar::Util qw(blessed reftype);
use YAML::Tiny qw(Dump);

my $hash = decode_json('{ "foo": { "bar": true }, "baz": false }');

# Stringify all objects in $hash:
sub stringify_objects {
    for my $val (@_) {
        next unless my $ref = reftype $val;
        if    (blessed $val)    { $val = "$val" }
        elsif ($ref eq 'ARRAY') { stringify_objects(@$val) }
        elsif ($ref eq 'HASH')  { stringify_objects(values %$val) }
    }
}

stringify_objects($hash);
say Dump $hash;
This function doesn't bother processing scalar references, because JSON won't produce them. It also doesn't check whether an object actually has overloaded stringification.
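If you did want to restrict the conversion to classes that actually overload stringification, here is a sketch of the idea (my own addition, not part of the original answer; maybe_stringify is a hypothetical helper built on the core overload module):
use Scalar::Util qw(blessed);
use overload ();

# Stringify a value only when its class participates in overloading;
# JSON::XS::Boolean qualifies (its numeric overload plus fallback makes
# "$val" yield 1 or 0), while objects without overloading are left alone.
sub maybe_stringify {
    my ($val) = @_;
    return $val unless blessed($val) && overload::Overloaded($val);
    return "$val";
}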
Data::Rmap doesn't work well for this because it will only visit a particular object once, no matter how many times it appears. Since the JSON::XS::Boolean objects are singletons, that means it will only find the first true and the first false. It's possible to work around that, but it requires delving into the source code to determine how keys are generated in its seen hash:
use Data::Rmap qw(rmap_ref);
use Scalar::Util qw(blessed refaddr);

# Stringify all objects in $hash:
rmap_ref {
    if (blessed $_) {
        delete $_[0]->seen->{refaddr $_};
        $_ = "$_";
    }
} $hash;
I think the recursive function is clearer, and it's not vulnerable to changes in Data::Rmap.

Data type conversion issues in PowerShell

I'm writing a PowerShell script that will extract a set of data files from a ZIP file and then attach them to a server. I've written a function that takes care of the unzip, and since I need to know which files were extracted so that I know what I'm attaching, I return that list from the function:
function Unzip-Files
{
    param([string]$zip_path, [string]$zip_filename, [string]$target_path, [string]$filename_pattern)

    # Append a \ if the path doesn't already end with one
    if (!$zip_path.EndsWith("\")) {$zip_path = $zip_path + "\"}
    if (!$target_path.EndsWith("\")) {$target_path = $target_path + "\"}

    # We'll need a string collection to return the files that were extracted
    $extracted_file_names = New-Object System.Collections.Specialized.StringCollection

    # We'll need a Shell Application for some file movement
    $shell_application = New-Object -com shell.Application

    # Get a handle for the target folder
    $target_folder = $shell_application.NameSpace($target_path)

    $zip_full_path = $zip_path + $zip_filename

    if (Test-Path($zip_full_path))
    {
        $target_folder = $shell_application.NameSpace($target_path)
        $zip_folder = $shell_application.NameSpace($zip_full_path)

        foreach ($zipped_file in $zip_folder.Items() | Where {$_.Name -like $filename_pattern})
        {
            $extracted_file_names.Add($zipped_file.Name) | Out-Null
            $target_folder.CopyHere($zipped_file, 16)
        }
    }

    $extracted_file_names
}
I then call another function to actually attach the database (I've removed some code that checks for existence of the database, but that shouldn't affect things here):
function Attach-Database
{
    param([object]$server, [string]$database_name, [object]$datafile_names)

    $database = $server.Databases[$database_name]
    $server.AttachDatabase($database_name, $datafile_names)
    $database = $server.Databases[$database_name]

    Return $database
}
I keep getting an error though, "Cannot convert argument "1", with value: "System.Object[]", for "AttachDatabase" to type "System.Collections.Specialized.StringCollection"".
I've tried declaring the data types explicitly at various points, but that just changes the location where I get the error (or one similar to it). I've also changed the parameter declaration to use the string collection instead of object with no luck.
I'm starting with a string collection and ultimately want to consume a string collection. I just don't seem to be able to get PowerShell to stop trying to convert it to a generic Object at some point.
Any suggestions?
Thanks!
It looks like you should return the names using the comma operator:
...
, $extracted_file_names
}
to avoid "unrolling" the collection to its items and to preserve the original collection object.
There have been several questions like this; here are just a couple:
Strange behavior in PowerShell function returning DataSet/DataTable
Loading a serialized DataTable in PowerShell - Gives back array of DataRows not a DataTable
UPDATE:
This similar code works:
Add-Type @'
using System;
using System.Collections.Specialized;
public static class TestClass
{
    public static string TestMethod(StringCollection data)
    {
        string result = "";
        foreach (string s in data)
            result += s;
        return result;
    }
}
'@
function Unzip-Files
{
    $extracted_file_names = New-Object System.Collections.Specialized.StringCollection
    foreach ($zipped_file in 'xxx', 'yyy', 'zzz')
    {
        $extracted_file_names.Add($zipped_file) | Out-Null
    }
    , $extracted_file_names
}
function Attach-Database
{
    param([object]$datafile_names)
    # write the result
    Write-Host ([TestClass]::TestMethod($datafile_names))
}

# get the collection
$names = Unzip-Files
# write its type
Write-Host $names.GetType()
# pass it in the function
Attach-Database $names
As expected, its output is:
System.Collections.Specialized.StringCollection
xxxyyyzzz
If I remove the suggested comma, then we get:
System.Object[]
Cannot convert argument "0", with value: "System.Object[]",
for "TestMethod" to type "System.Collections.Specialized.StringCollection"...
The symptoms look the same, so the solution should presumably work too, provided there are no other unwanted conversions or unrolling in the omitted code between the Unzip-Files and Attach-Database calls.