Convert csv tree structure to json

I have a large Excel document (2000+ rows) that uses the cell positions to specify a tree structure, and I would like to parse it into a .json file. The Excel-exported .csv file is formatted as follows, where each comma corresponds to an empty cell in the Excel file:
Layer1category1,,,,,
,Layer2category,,,,
...
,,,,Layer5category1,
,,,,,item1
,,,,,item2
,,,,Layer5category2,
,,,,,item1
,,,Layer4category2,,
...
Layer1category2,,,,,
...
Layer1category8,,,,, // this is the last category in the uppermost layer
In summary, a layer-n category is prefaced by n-1 commas and followed by 6-n commas, and rows prefaced by 5 commas are the final layer; each of those is a string that carries many fields besides its name.
I would like this to be converted to a .json file similar to the following. I use "name" because, aside from a name, each entry is also tied to a lot of statistics that need to go into the JSON file as well.
{"name" : "Layer1category1",
"children": [
{"name" : "Layer2category1",
"children" : [
{"name" : "Layer3category1"
"children" : [
...
{"name" : "Layer5category1",
"children" : [{"name" : "item1"}, {"name" : "item2"}],}
{"name" : "Layer5category2",
"children" : [{"name" : "item1"}],}
{"name" : "Layer4category2",
"children" : [
...
]}
"name" : "Layer1category2",
"children" : [ ... ]
}
Does anyone have any suggestions for how I can approach this? The CSV-to-JSON converters I have found do not support multi-layered structures. Thanks!

I faced the same issue and wrote a simple PHP script:
Input
Level I,Level II,Level III,Level IV,Level V,Level VI,Level VII,Level VIII,,,,,,,,,,,,,,,,,,
Role Profile,,,,,,,,,,,,,,,,,,,,,,,,,
,Development,,,,,,,,,,,,,,,,,,,,,,,,
,,Security,,,,,,,,,,,,,,,,,,,,,,,
,,Cloud,,,,,,,,,,,,,,,,,,,,,,,
,,,Cloud technologies,,,,,,,,,,,,,,,,,,,,,,
,,,,IaaS,,,,,,,,,,,,,,,,,,,,,
,,,,,Amazon Web Service (AWS),,,,,,,,,,,,,,,,,,,,
,,,,,Microsoft Azure,,,,,,,,,,,,,,,,,,,,
,,,,,Google Compute Engine (GCE),,,,,,,,,,,,,,,,,,,,
,,,,,OpenStack,,,,,,,,,,,,,,,,,,,,
Output
{
    "Role Profile": {
        "Development": {
            "Security": {},
            "Cloud": {
                "Cloud technologies": {
                    "IaaS": {
                        "Amazon Web Service (AWS)": {},
                        "Microsoft Azure": {},
                        "Google Compute Engine (GCE)": {},
                        "OpenStack": {}
                    }
                }
            }
        }
    }
}
Code
<?php
$fn = "skills.csv";  // input file name
$hasHeader = true;   // input file has a header, we will skip the first line

// Walk down the most recently added branch until the requested level,
// then append the item as a new (empty) child node.
function appendItem( &$r, $lvl, $item ) {
    if ( $lvl ) {
        $lvl--;
        end( $r );
        appendItem( $r[key($r)], $lvl, $item );
    } else {
        $r[$item] = array();
    }
}

$r = array();
if ( ( $handle = fopen( $fn, "r" ) ) !== FALSE ) {
    $header = true;
    while ( ( $data = fgetcsv( $handle, 1000, "," ) ) !== FALSE ) {
        if ( $header and $hasHeader ) {
            $header = false;
        } else {
            // The number of leading empty cells is the nesting level.
            $lvl = 0;
            foreach ( $data as $dv ) {
                $v = trim( $dv );
                if ( $v ) {
                    appendItem( $r, $lvl, $v );
                    break;
                }
                $lvl++;
            }
        }
    }
}
echo json_encode( $r );
?>
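If you need the name/children shape from the original question rather than nested keys, the $r array built above could be post-processed with a small recursive helper. This is only a sketch: the helper name is made up, it assumes one tree per top-level category, and it ignores the extra statistics fields.
// Sketch: convert the nested ["A" => ["B" => [...]]] array into
// {"name": ..., "children": [...]} nodes; leaves get no "children" key.
function toNameChildren( $name, $children ) {
    $node = array( "name" => $name );
    if ( !empty( $children ) ) {
        foreach ( $children as $childName => $grandChildren ) {
            $node["children"][] = toNameChildren( $childName, $grandChildren );
        }
    }
    return $node;
}

// Uses the $r built by the script above; one tree per top-level category.
$trees = array();
foreach ( $r as $rootName => $children ) {
    $trees[] = toNameChildren( $rootName, $children );
}
echo json_encode( $trees );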

Related

Powershell 7 Merge complex Json => Convert a PSCustomObject with 'path' information to json

I converted two JSON files with the Get-LeafProperty function from this post:
Powershell Selecting NoteProperty Type Objects From Object
This made it easy to merge two JSON files that include different general.apps[] entries; at least, it was easy to merge the psobjects created by Get-LeafProperty.
Now I want to go back from the merged psobjects to the original JSON format.
I have a psobject like this, which I would like to convert to JSON.
[path] : [value]
general.apps[0].name : admin
general.apps[0].storageAccount.sku.name : Standard_LRS
general.apps[0].storageAccount.sku.tier : Standard
general.apps[0].hostingPlan.sku.name : Y1
general.apps[0].hostingPlan.sku.tier : Dynamic
general.apps[0].appconfig.AzureWebJobsDisableHomepage : True
general.apps[0].appconfig.AzureWebJobsStorage__accountName : ${Prefix}${tier}admin
general.apps[0].appconfig.CosmosDbConnectionStringOrManagedIdentity : AccountEndpoint=https://${Prefix}${tier}cosmosdb.documents.azure.com:443
general.apps[0].appconfig.cpo-blackbox-authorization-token :
general.apps[0].appconfig.DatabaseCacheRefreshMinutes :
general.apps[0].appconfig.HttpCallMaxSeconds :
general.apps[0].appconfig.EvgCpoOcpiUrl : https://${Prefix}${tier}${staging}cpoocpi.azurewebsites.net/api/
general.apps[0].appconfig.EvgCtrlWpsUrl : https://${Prefix}${tier}${staging}ctrlwps.azurewebsites.net/api/
general.apps[0].appconfig.EvgPingUrl : https://${Prefix}${tier}${staging}ping.azurewebsites.net/api/
general.apps[0].appconfig.Logging___DebugAsInformation :
general.apps[0].appconfig.Logging___TraceAsInformation :
general.apps[0].appconfig.NegotiatePostfix : /webpubsub/v100/negotiate
general.apps[0].appconfig.OcpiAuthenticationFailureDelayBaseMilliseconds :
general.apps[0].appconfig.OcpiEvgCountryCode :
general.apps[0].appconfig.OcpiLocationSuppressEvents : True
general.apps[0].appconfig.OcpiEvgPartyId :
general.apps[0].appconfig.OcpiSessionTokenMaxAgeSeconds :
general.apps[0].appconfig.OCPIv211___GetPagingLimit :
general.apps[0].appconfig.OCPPv16_OCPIv211___Cdr___Disconnected___TimerCheckConnection :
general.apps[0].appconfig.OCPPv16_OCPIv211___Cdr___Pump___TimerCheckConnection :
general.apps[0].appconfig.OCPPv16_OCPIv211___Cdr___Pump___TimerResponse :
general.apps[0].appconfig.OCPPv16_OCPIv211___LocationEvse___Disconnected___TimerCheckConnection :
general.apps[0].appconfig.OCPPv16_OCPIv211___LocationEvse___Pump___TimerCheckConnection :
general.apps[0].appconfig.OCPPv16_OCPIv211___LocationEvse___Pump___TimerResponse :
general.apps[0].appconfig.OCPPv16_OCPIv211___SessionRecovery___TimerDelayRecoveryRepeatSec :
general.apps[0].appconfig.OCPPv16_OCPIv211___SessionRecovery___TimerDelayRecoveryShotSec :
general.apps[0].appconfig.OCPPv16_OCPIv211___Statemachine___Default___TimerExitSec :
general.apps[0].appconfig.OCPPv16_OCPIv211___Statemachine___Default___TimerRecoverySec :
general.apps[0].appconfig.OCPPv16_OCPIv211___Statemachine___StartSessionState___TimerExitSec :
general.apps[0].appconfig.OCPPv16_OCPIv211___Statemachine___StartSessionRecoveryState___TimerExitSec :
general.apps[0].appconfig.OCPPv16_OCPIv211___Statemachine___WaitPatchSessionState___TimerExitSec :
general.apps[0].appconfig.OCPPv16_OCPIv211___Statemachine___WaitPatchSessionState___TimerRecoverySec :
general.apps[0].appconfig.PingUrls :
general.apps[0].appconfig.SCALE_CONTROLLER_LOGGING_ENABLED :
general.apps[0].appconfig.ServiceBusConnectionString__fullyQualifiedNamespace :
general.apps[0].appconfig.ServiceBusQueueList : cpoocpi:ctrlwps
general.apps[0].appconfig.ServiceBusName :
general.apps[0].appconfig.StatemachineCheckAgeCronSchedule :
general.apps[0].appconfig.StatemachineCheckAgeMinAgeDays :
general.apps[0].appconfig.StatemachineEngineDatabase :
general.apps[0].appconfig.StatemachineEngineContainerData :
general.apps[0].appconfig.StatemachineEngineContainerOperations :
general.apps[0].appconfig.StatemachineEngineContainerOperationsLeases :
general.apps[0].appconfig.StatemachineEngineQueueTimers :
general.apps[0].appconfig.StatemachineEngineServiceBusQueueOperations :
general.apps[0].appconfig.TimerKeepAliveCronSchedule : 0 * * * * *
general.apps[0].appconfig.TimerOcpiSessionTokenCleanupCronSchedule :
general.apps[0].appconfig.TimerOcpiV211GetCdrsCronSchedule :
general.apps[0].appconfig.TimerOcpiV211GetLocationsCronSchedule :
general.apps[0].appconfig.TimerTriggerCronSchedule :
general.apps[0].appconfig.WebPubSubEndpoint :
general.apps[0].appconfig.WebPubSubIdentityObjectId :
general.apps[0].appconfig.WebPubSubConnectionString :
general.apps[0].appconfig.WebPubSubHub :
general.apps[0].appconfig.WebPubSubHubLogging :
general.apps[0].appconfig.WsHostname : ${Prefix}${tier}admin.azurewebsites.net
general.apps[1].name : blackboxtestapi
general.apps[1].storageAccount.sku.name : Standard_LRS
general.apps[1].storageAccount.sku.tier : Standard
general.apps[1].hostingPlan.sku.name : Y1
general.apps[1].hostingPlan.sku.tier : Dynamic
general.apps[1].appconfig.AzureWebJobsDisableHomepage : True
general.apps[1].appconfig.AzureWebJobsStorage__accountName : ${Prefix}${tier}blackboxtestapi
general.apps[1].appconfig.CosmosDbConnectionStringOrManagedIdentity :
general.apps[1].appconfig.cpo-blackbox-authorization-token :
general.apps[1].appconfig.DatabaseCacheRefreshMinutes :
general.apps[1].appconfig.HttpCallMaxSeconds :
general.apps[1].appconfig.EvgCpoOcpiUrl :
general.apps[1].appconfig.EvgCtrlWpsUrl :
general.apps[1].appconfig.EvgPingUrl :
general.apps[1].appconfig.Logging___DebugAsInformation :
general.apps[1].appconfig.Logging___TraceAsInformation :
general.apps[1].appconfig.NegotiatePostfix :
general.apps[1].appconfig.OcpiAuthenticationFailureDelayBaseMilliseconds :
general.apps[1].appconfig.OcpiEvgCountryCode :
general.apps[1].appconfig.OcpiEvgPartyId :
general.apps[1].appconfig.OcpiLocationSuppressEvents : True
general.apps[1].appconfig.OcpiSessionTokenMaxAgeSeconds :
general.apps[1].appconfig.OCPIv211___GetPagingLimit :
general.apps[1].appconfig.OCPPv16_OCPIv211___Cdr___Disconnected___TimerCheckConnection :
general.apps[1].appconfig.OCPPv16_OCPIv211___Cdr___Pump___TimerCheckConnection :
general.apps[1].appconfig.OCPPv16_OCPIv211___Cdr___Pump___TimerResponse :
general.apps[1].appconfig.OCPPv16_OCPIv211___LocationEvse___Disconnected___TimerCheckConnection :
general.apps[1].appconfig.OCPPv16_OCPIv211___LocationEvse___Pump___TimerCheckConnection :
general.apps[1].appconfig.OCPPv16_OCPIv211___LocationEvse___Pump___TimerResponse :
general.apps[1].appconfig.OCPPv16_OCPIv211___SessionRecovery___TimerDelayRecoveryRepeatSec :
general.apps[1].appconfig.OCPPv16_OCPIv211___SessionRecovery___TimerDelayRecoveryShotSec :
general.apps[1].appconfig.OCPPv16_OCPIv211___Statemachine___Default___TimerExitSec :
general.apps[1].appconfig.OCPPv16_OCPIv211___Statemachine___Default___TimerRecoverySec :
general.apps[1].appconfig.OCPPv16_OCPIv211___Statemachine___StartSessionState___TimerExitSec :
general.apps[1].appconfig.OCPPv16_OCPIv211___Statemachine___StartSessionRecoveryState___TimerExitSec :
general.apps[1].appconfig.OCPPv16_OCPIv211___Statemachine___WaitPatchSessionState___TimerExitSec :
general.apps[1].appconfig.OCPPv16_OCPIv211___Statemachine___WaitPatchSessionState___TimerRecoverySec :
general.apps[1].appconfig.PingUrls :
general.apps[1].appconfig.SCALE_CONTROLLER_LOGGING_ENABLED :
general.apps[1].appconfig.ServiceBusConnectionString__fullyQualifiedNamespace :
general.apps[1].appconfig.ServiceBusQueueList :
general.apps[1].appconfig.ServiceBusName :
general.apps[1].appconfig.StatemachineCheckAgeCronSchedule :
general.apps[1].appconfig.StatemachineCheckAgeMinAgeDays :
general.apps[1].appconfig.StatemachineEngineDatabase :
general.apps[1].appconfig.StatemachineEngineContainerData :
general.apps[1].appconfig.StatemachineEngineContainerOperations :
general.apps[1].appconfig.StatemachineEngineContainerOperationsLeases :
general.apps[1].appconfig.StatemachineEngineQueueTimers :
general.apps[1].appconfig.StatemachineEngineServiceBusQueueOperations :
general.apps[1].appconfig.TimerKeepAliveCronSchedule : 0 * * * * *
general.apps[1].appconfig.TimerOcpiSessionTokenCleanupCronSchedule :
general.apps[1].appconfig.TimerOcpiV211GetLocationsCronSchedule :
general.apps[1].appconfig.TimerOcpiV211GetCdrsCronSchedule :
general.apps[1].appconfig.TimerTriggerCronSchedule :
general.apps[1].appconfig.WebPubSubConnectionString :
general.apps[1].appconfig.WebPubSubEndpoint :
general.apps[1].appconfig.WebPubSubIdentityObjectId :
general.apps[1].appconfig.WebPubSubHub : blackboxtest
general.apps[1].appconfig.WebPubSubHubLogging : logging
I want to convert this object to json.
ConvertTo-Json gives me this:
{
    "general.apps[0].name": "admin",
    "general.apps[0].storageAccount.sku.name": "Standard_LRS",
    "general.apps[0].storageAccount.sku.tier": "Standard",
    "general.apps[0].hostingPlan.sku.name": "Y1",
    "general.apps[0].hostingPlan.sku.tier": "Dynamic",
<snip>
But I want to get something like this.
{
    "general": {
        "apps": [
            {
                "name": "admin",
                "storageAccount": {
                    "sku": {
                        "name": "Standard_LRS",
                        "tier": "Standard"
                    },
                    "queues": []
                },
                "hostingPlan": {
                    "sku": {
                        "name": "Y1",
                        "tier": "Dynamic"
                    }
                },
                "insight": {},
                "appconfig": {
                    "AzureWebJobsDisableHomepage": true,
                    "AzureWebJobsStorage__accountName": "${Prefix}${tier}admin",
<snip>
I could run a loop over the path variable, split it on '.', and try to rebuild the structure from those paths again, but I wonder whether there is an easier solution.
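For reference, that split-and-rebuild loop could look roughly like the sketch below. It assumes the leaf rows have first been collected into a hashtable mapping path strings to values; the function name and the [n] index handling are assumptions, and it does not cope with property names that themselves contain dots.
function ConvertFrom-LeafPaths {
    param([hashtable] $leaves)   # path string -> value

    $root = [ordered]@{}
    foreach ($path in $leaves.Keys) {
        $parts  = $path -split '\.'
        $cursor = $root
        for ($i = 0; $i -lt $parts.Count; $i++) {
            $isLast = ($i -eq $parts.Count - 1)
            if ($parts[$i] -match '^(.+)\[(\d+)\]$') {
                # Array segment such as "apps[0]"
                $name = $Matches[1]; $idx = [int]$Matches[2]
                if (-not $cursor.Contains($name)) {
                    $cursor[$name] = [System.Collections.ArrayList]::new()
                }
                while ($cursor[$name].Count -le $idx) {
                    [void]$cursor[$name].Add([ordered]@{})
                }
                if ($isLast) { $cursor[$name][$idx] = $leaves[$path] }
                else         { $cursor = $cursor[$name][$idx] }
            }
            else {
                if ($isLast) { $cursor[$parts[$i]] = $leaves[$path] }
                else {
                    if (-not $cursor.Contains($parts[$i])) { $cursor[$parts[$i]] = [ordered]@{} }
                    $cursor = $cursor[$parts[$i]]
                }
            }
        }
    }
    $root
}

# ConvertFrom-LeafPaths $leaves | ConvertTo-Json -Depth 20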
I solved it differently. Instead of merging via Get-LeafProperty, I use a script that merges the JSON directly. The following function merges complex JSON just fine:
function Merge-Json( $source, $extend ) {
    if( $source -is [PSCustomObject] -and $extend -is [PSCustomObject] ) {
        # Ordered hashtable for collecting properties
        $merged = [ordered] @{}

        # Copy $source properties or overwrite by $extend properties recursively
        foreach( $Property in $source.PSObject.Properties ) {
            if( $null -eq $extend.$($Property.Name) ) {
                $merged[ $Property.Name ] = $Property.Value
            }
            else {
                $merged[ $Property.Name ] = Merge-Json $Property.Value $extend.$($Property.Name)
            }
        }

        # Add $extend properties that don't exist in $source
        foreach( $Property in $extend.PSObject.Properties ) {
            if( $null -eq $source.$($Property.Name) ) {
                $merged[ $Property.Name ] = $Property.Value
            }
        }

        # Convert hashtable into PSCustomObject and output
        [PSCustomObject] $merged
    }
    elseif( $source -is [Collections.IList] -and $extend -is [Collections.IList] ) {
        $maxCount = [Math]::Max( $source.Count, $extend.Count )

        [array] $merged = for( $i = 0; $i -lt $maxCount; ++$i ) {
            if( $i -ge $source.Count ) {
                # extend array is bigger than source array
                $extend[ $i ]
            }
            elseif( $i -ge $extend.Count ) {
                # source array is bigger than extend array
                $source[ $i ]
            }
            else {
                # Merge the elements recursively
                Merge-Json $source[$i] $extend[$i]
            }
        }

        # Output merged array, using comma operator to prevent enumeration
        , $merged
    }
    else {
        # Output extend object (scalar or different types)
        $extend
    }
}
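A minimal usage sketch, assuming the two inputs live in files named a.json and b.json (placeholder names):
$a = Get-Content -Raw a.json | ConvertFrom-Json
$b = Get-Content -Raw b.json | ConvertFrom-Json
Merge-Json $a $b | ConvertTo-Json -Depth 20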

how to insert json to mysql database with codeigniter

I have an HTML table, and I have converted its values to a JSON object with the jQuery plugin tableToJSON, like this:
[{
harga_jual : "47025",
id_buku : "1",
judul_buku : "perempuan dam hak warisnya",
jumlah : "1",
subtotal : "47025"
},
{
harga_jual : "49500",
id_buku : "2",
judul_buku : "Keajaiban Operasi Plastik Korea Selatan",
jumlah : "2",
subtotal : "99000"
}]
When I click the checkout button, I want it to insert the JSON data into MySQL with CodeIgniter. How do I write the code in my model and controller?
Here is my table structure:
id_buku: int
jumlah: double
subtotal: double
Big thanks.
Send an object containing your data (as JSON) to your controller, e.g. with jQuery:
$.post('your_action_url', { sendData: JSON.stringify(yourData) }, function(res) {
    console.log(res);
}, "json");
Then, in your controller, you can get the data with this CI method:
$data = json_decode($this->input->post('sendData'));
If $data is an array of objects and you want to filter it, you can loop over $data and call the save method in your model:
$this->db->trans_begin();

foreach ($data as $row) {
    $filter_data = array(
        "id_buku"  => $row->id_buku,
        "jumlah"   => $row->jumlah,
        "subtotal" => $row->subtotal
    );
    // Call the save method
    $this->your_model_alias->save_as_new($filter_data);
}

if ($this->db->trans_status() === FALSE) {
    $this->db->trans_rollback();
    echo json_encode("Failed to Save Data");
} else {
    $this->db->trans_commit();
    echo json_encode("Success!");
}
Consider using a transaction when storing a lot of data at once; it keeps the database consistent by rolling everything back if one of the inserts fails.
Your model's save method should look like this:
public function save_as_new($data) {
    $this->db->insert('your_table_name', $data);
}
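Alternatively, if you don't need per-row logic, CodeIgniter's query builder can insert all rows in a single statement. A sketch, using the same placeholder table name as above:
$rows = array();
foreach ($data as $row) {
    $rows[] = array(
        "id_buku"  => $row->id_buku,
        "jumlah"   => $row->jumlah,
        "subtotal" => $row->subtotal
    );
}
$this->db->insert_batch('your_table_name', $rows);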

File path into JSON data structure

I'm doing a disk space report that uses File::Find to collect cumulative sizing in a directory tree.
What I get (easily) from File::Find is the directory name.
e.g.:
/path/to/user/username/subdir/anothersubdir/etc
I'm running File::Find to collect sizes beneath:
/path/to/user/username
And build a cumulative size report of the directory and each of the subdirectories.
What I've currently got is:
while ( $dir_tree ) {
    $results{$dir_tree} += $blocks * $block_size;
    my @path_arr = split ( "/", $dir_tree );
    pop ( @path_arr );
    $dir_tree = join ( "/", @path_arr );
}
(And yes, I know that's not very nice.)
The purpose of doing this is that when I stat each file, I add its size to the current node and to each parent node in the tree.
This is sufficient to generate:
username,300M
username/documents,150M
username/documents/excel,50M
username/documents/word,40M
username/work,70M
username/fish,50M,
username/some_other_stuff,30M
But I'd like to now turn that in to JSON more like this:
{
    "name" : "username",
    "size" : "307200",
    "children" : [
        {
            "name" : "documents",
            "size" : "153750",
            "children" : [
                {
                    "name" : "excel",
                    "size" : "51200"
                },
                {
                    "name" : "word",
                    "size" : "81920"
                }
            ]
        }
    ]
}
That's because I'm intending to do a D3 visualisation of this structure - loosely based on D3 Zoomable Circle Pack
So my question is this: what is the neatest way to collate my data so that I have cumulative (and ideally non-cumulative) sizing information, while populating the hash hierarchically?
I was thinking in terms of a 'cursor' approach (and using File::Spec this time):
use File::Spec;

my $data;
my $cursor = \$data;
foreach my $element ( File::Spec->splitdir( $File::Find::dir ) ) {
    $cursor->{size} += $blocks * $block_size;
    $cursor = $cursor->{$element};
}
Although... that's not quite creating the data structure I'm looking for, not least because we basically have to search by hash key to do the 'rolling up' part of the process.
Is there a better way of accomplishing this?
Edit - more complete example of what I have already:
#!/usr/bin/env perl
use strict;
use warnings;

use File::Find;
use Data::Dumper;

my $block_size = 1024;

sub collate_sizes {
    my ( $results_ref, $starting_path ) = @_;
    $starting_path =~ s,/\w+$,/,;

    if ( -f $File::Find::name ) {
        print "$File::Find::name is a file\n";
        my ($dev, $ino, $mode, $nlink, $uid,
            $gid, $rdev, $size, $atime, $mtime,
            $ctime, $blksize, $blocks
        ) = stat($File::Find::name);

        my $dir_tree = $File::Find::dir;
        $dir_tree =~ s|^$starting_path||g;
        while ($dir_tree) {
            print "Updating $dir_tree\n";
            $$results_ref{$dir_tree} += $blocks * $block_size;
            my @path_arr = split( "/", $dir_tree );
            pop(@path_arr);
            $dir_tree = join( "/", @path_arr );
        }
    }
}

my @users = qw ( user1 user2 );
foreach my $user (@users) {
    my $path = "/home/$user";
    print $path;
    my %results;

    File::Find::find(
        {   wanted   => sub { \&collate_sizes( \%results, $path ) },
            no_chdir => 1
        },
        $path
    );

    print Dumper \%results;

    # would print this to a file in the homedir - to STDOUT for convenience
    foreach my $key ( sort { $results{$b} <=> $results{$a} } keys %results ) {
        print "$key => $results{$key}\n";
    }
}
And yes - I know this isn't portable, and does a few somewhat nasty things. Part of what I'm doing here is trying to improve on that. (But currently it's a Unix based homedir structure, so that's fine).
If you do your own dir scanning instead of using File::Find, you naturally get the right structure.
sub _scan {
    my ($qfn, $fn) = @_;

    my $node = { name => $fn };

    lstat($qfn)
        or die $!;

    my $size   = -s _;
    my $is_dir = -d _;

    if ($is_dir) {
        my @child_fns = do {
            opendir(my $dh, $qfn)
                or die $!;
            grep !/^\.\.?\z/, readdir($dh);
        };

        my @children;
        for my $child_fn (@child_fns) {
            my $child_node = _scan("$qfn/$child_fn", $child_fn);
            $size += $child_node->{size};
            push @children, $child_node;
        }
        $node->{children} = \@children;
    }

    $node->{size} = $size;
    return $node;
}
Rest of the code:
#!/usr/bin/perl
use strict;
use warnings;
no warnings 'recursion';
use File::Basename qw( basename );
use JSON qw( encode_json );
...
sub scan { _scan($_[0], basename($_[0])) }
print(encode_json(scan($ARGV[0] // '.')));
In the end, I have done it like this:
In the File::Find wanted sub collate_sizes:
my $cursor = $data;
foreach my $element (
    File::Spec->splitdir( $File::Find::dir =~ s/^$starting_path//r ) )
{
    $cursor->{$element}->{name} = $element;
    $cursor->{$element}->{size} += $blocks * $block_size;
    $cursor = $cursor->{$element}->{children} //= {};
}
This generates a hash of nested directory names. (The name subelement is probably redundant, but whatever.) I then post-process it with (using JSON):
my $json_structure = {
    'name'     => $user,
    'size'     => $data->{$user}->{size},
    'children' => [],
};

process_data_to_json( $json_structure, $data->{$user}->{children} );

open( my $json_out, '>', "homedir.json" ) or die $!;
print {$json_out} to_json( $json_structure, { pretty => 1 } );
close($json_out);

sub process_data_to_json {
    my ( $json_cursor, $data_cursor ) = @_;
    if ( ref $data_cursor eq "HASH" ) {
        foreach my $key ( sort keys %$data_cursor ) {
            print "Traversing $key\n";
            my $newelt = {
                'name' => $key,
                'size' => $data_cursor->{$key}->{size},
            };
            push( @{ $json_cursor->{children} }, $newelt );
            process_data_to_json( $newelt, $data_cursor->{$key}->{children} );
        }
    }
}

encoding json with no quote around numerical values

I have a Perl code snippet:
use JSON::XS;
$a = {"john" => "123", "mary" => "456"};
print encode_json($a),"\n";
The output is
{"john":"123","mary":"456"}
I wonder whether there is an option to make the encode_json function (from the JSON::XS module) encode it so that the values (123, 456) are not surrounded by double quotes, i.e. like this:
{"john":123,"mary":456}
Unfortunately I can't change the hash in $a because it's passed to me from another function. I wonder whether there is any trick with encode_json().
Thanks!
You probably need to preprocess the data yourself, prior to JSON serialization.
This solution uses Data::Leaf::Walker to traverse an arbitrary structure, converting strings to numbers.
use JSON;
use Data::Leaf::Walker;
use Scalar::Util qw();

my $a = {
    "john"   => "123",
    "mary"   => [ "456", "aa" ],
    "fred"   => "bb",
    "nested" => { "Q" => undef, "A" => 42 },
};

my $walker = Data::Leaf::Walker->new( $a );
while ( my ( $key_path, $value ) = $walker->each ) {
    $walker->store( $key_path, $value + 0 )
        if Scalar::Util::looks_like_number $value;
}
print to_json($a);
Output: {"john":123,"nested":{"A":42,"Q":null},"mary":[456,"aa"],"fred":"bb"}
You shouldn't use JSON::XS directly; plain JSON will already load JSON::XS if it is available.
A scalar in Perl is tagged as either a string or a number, and here you're providing strings. Remove the quotes from your numbers and they should show up unquoted, as JSON already does that automatically.
If you're reading strings (from say a database) then you can coerce the strings to numbers like this:
{ john => 0+$john, mary => 0+$mary }
Update, here's a recursive replacement:
#!/usr/bin/env perl
use JSON;
use Modern::Perl;
use Scalar::Util qw( looks_like_number );

my $structure = {
    john   => "123",
    mary   => 456,
    deeper => {
        lucy  => "35zz",
        names => [
            "john",
            "123",
            456,
        ],
    },
};

sub make_numbers_recursively {
    my ( $data ) = shift;
    if ( ref $data eq 'HASH' ) {
        # Replace hash values with recursively updated values
        map { $data->{ $_ } = make_numbers_recursively( $data->{ $_ } ) } keys %$data;
    } elsif ( ref $data eq 'ARRAY' ) {
        # Replace each array value with the recursively processed result
        map { $_ = make_numbers_recursively( $_ ) } @$data;
    } else {
        $data += 0 if looks_like_number( $data );
    }
    return $data;
}
my $json = JSON->new->pretty;
say $json->encode( $structure );
make_numbers_recursively( $structure );
say $json->encode( $structure );
This outputs:
{
   "mary" : 456,
   "deeper" : {
      "names" : [
         "john",
         "123",
         456
      ],
      "lucy" : "35zz"
   },
   "john" : "123"
}
{
   "mary" : 456,
   "deeper" : {
      "names" : [
         "john",
         123,
         456
      ],
      "lucy" : "35zz"
   },
   "john" : 123
}
Beware that it modifies the structure in-place, so if you need the original data for anything you might want to Clone or Data::Clone it first.
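For example, a minimal sketch using the Clone module (assuming it is installed):
use Clone qw( clone );

my $copy = clone($structure);    # deep copy; the original stays untouched
make_numbers_recursively($copy);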

JSON formatting in Perl

I am trying to create a JSON object that lists the maps associated with a particular user, but I haven't ever worked with nested JSON objects. This is what I want:
{
    "success":"list of users maps",
    "maps":[
        {
            "id":"1",
            "name":"Home to LE",
            "date_created":"1366559121"
        },
        {
            "id":"2",
            "name":"Test 1",
            "date_created":"1366735066"
        }
    ]
}
with this perl code:
my $maps = [];
for (my $x = 0; $x < $sth->rows; $x++) {
    my ($id, $name, $date) = $sth->fetchrow_array();
    my $map = qq{{"id":"$id","name":"$name","date_created":"$date"}};
    push $maps, $map;
}

my $j = JSON::XS->new->utf8;
my $output = $j->encode({
    "success" => "list of users maps",
    "maps"    => $maps
});
But the output I am getting is:
{
    "success":"list of users maps",
    "maps":[
        "{\"id\":\"1\",\"name\":\"Home to LE\",\"date_created\":\"1366559121\"}",
        "{\"id\":\"2\",\"name\":\"Test 1\",\"date_created\":\"1366735066\"}"
    ]
}
So when I process it in my Javascript, the data.maps[x].id is undefined. I am pretty sure that the JSON being output is incorrectly formatted.
Can anyone help me fix it?
It's undefined because what you have at data.maps[x] is not an object, but a string. Since a string has no property called id, you're getting undefined. I'd probably do something like this (if I couldn't change the perl script):
var mapData = JSON.parse(data.maps[x]);
//do stuff with mapData.id
But the better thing to do, is to make sure that it doesn't encode it as a string, but as proper JSON.
This part in your perl script:
my $map = qq{{"id":"$id","name":"$name","date_created":"$date"}};
Is simply making a quoted string out of all that data. Instead, what you want is an actual perl hash that can be translated into a JSON map/associative-array. So try this:
my $map = {
    "id"           => "$id",
    "name"         => "$name",
    "date_created" => "$date"
};
push $maps, $map;
This way you actually have a perl hash (instead of just a string) that will get translated into proper JSON.
As an example, I wrote some test code:
use strict;
use JSON::XS;

my $maps = [];
push $maps, { id => 1, blah => 2 };
push $maps, { id => 3, blah => 2 };

my $j = JSON::XS->new->utf8->pretty(1);
my $output = $j->encode({
    success => "list of blah",
    maps    => $maps
});
print $output;
When you run this, you get:
{
   "success" : "list of blah",
   "maps" : [
      {
         "blah" : 2,
         "id" : 1
      },
      {
         "blah" : 2,
         "id" : 3
      }
   ]
}