Printing JSON like console.log prints it in Node.js

I'm wondering whether it's possible to print JSON the way Node.js prints it, and whether there is a standard module or something that does it.
Node's output has spacing between keys, wraps onto a new line when the value is too long, and so on. Coloring would be a bonus.
JSON.stringify(object, null, 2) pretty-prints the JSON; I'm wondering whether there is something more hidden in there, or any standard way, to do it like Node.js does. Thank you.

This seems to be achievable by using util.inspect():
const util = require('util');

const testJson = `{
  "id": "0001",
  "type": "donut",
  "name": "Cake",
  "ppu": 0.55,
  "batters": {
    "batter": [
      { "id": "1001", "type": "Regular" },
      { "id": "1002", "type": "Chocolate" },
      { "id": "1003", "type": "Blueberry" },
      { "id": "1004", "type": "Devil's Food" }
    ]
  },
  "topping": [
    { "id": "5001", "type": "None" },
    { "id": "5002", "type": "Glazed" },
    { "id": "5005", "type": "Sugar" },
    { "id": "5007", "type": "Powdered Sugar" },
    { "id": "5006", "type": "Chocolate with Sprinkles" },
    { "id": "5003", "type": "Chocolate" },
    { "id": "5004", "type": "Maple" }
  ]
}`;

const x = JSON.parse(testJson);
x.x = x; // add a circular reference; util.inspect prints it as [Circular], whereas JSON.stringify would throw
console.log(util.inspect(x, { colors: true }));
The options object also takes a depth property that determines how many levels util.inspect recurses into; the default value is 2. If I increase it to 3:
console.log(util.inspect(x, { colors: true, depth: 3 }));
the deeper levels are expanded instead of being collapsed to [Object]. To make it recurse indefinitely, pass depth: null. The default also seems to be 2 in the Node REPL.
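For completeness, a minimal sketch of the unlimited-depth call; console.dir forwards its options object to util.inspect, so it can be used the same way:
console.log(util.inspect(x, { colors: true, depth: null }));
// Equivalent via console.dir, which passes its options through to util.inspect:
console.dir(x, { colors: true, depth: null });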

Related

Filtering on nested JSON array: Rest Assured

I am trying to filter my GET response (JSON) based on a value in a nested JSON array. For example, in the following JSON I want to filter the array and print the names of the cakes that use Chocolate as a batter.
{
  "id": "0001",
  "type": "donut",
  "name": "Choco Blueberry Cake",
  "ppu": 0.55,
  "batter": [
    { "id": "1001", "type": "Regular" },
    { "id": "1002", "type": "Chocolate" },
    { "id": "1003", "type": "Blueberry" }
  ]
}
I have tried something like:
List<String> chocolateCakeList = jsonPath.getList("findAll{it.batter.type=='chocolate'}.name");
and
List<String> chocolateCakeList = jsonPath.getList("findAll{it.batter.it.type=='chocolate'}.name");
Both return empty Lists.
Problem
First, to use findAll, the object being extracted must be an array. Your JSON is an object {}, not an array []; only the nested batter field is an array [].
Second, the keyword to search for is Chocolate, not chocolate; the match is case-sensitive.
Solution
- If your whole response is exactly what you posted, then the path to extract is:
String name = jsonPath.getString("name");
- If your response is structured like this:
[
  {
    "id": "0001",
    "type": "donut",
    "name": "Choco Blueberry Cake",
    "ppu": 0.55,
    "batter": [
      { "id": "1001", "type": "Regular" },
      { "id": "1002", "type": "Chocolate" },
      { "id": "1003", "type": "Blueberry" }
    ]
  },
  {
    "id": "0002",
    "type": "donut",
    "name": "Choco Blueberry Cake",
    "ppu": 0.55,
    "batter": [
      { "id": "1001", "type": "Regular" },
      { "id": "1002", "type": "Chocolate 2" },
      { "id": "1003", "type": "Blueberry" }
    ]
  }
]
Then the extraction is
List<String> list = jsonPath.getList("findAll {it.batter.findAll {it.type == 'Chocolate'}}.name");
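For context, a minimal sketch of how the jsonPath instance could be obtained and queried with Rest Assured; the /donuts endpoint and surrounding setup are placeholders, not taken from the original post:
import io.restassured.path.json.JsonPath;
import io.restassured.response.Response;
import java.util.List;
import static io.restassured.RestAssured.given;

// Hypothetical GET request; substitute the real request from your test.
Response response = given().get("/donuts");
JsonPath jsonPath = response.jsonPath();

// Names of donuts that have at least one batter entry with type 'Chocolate' (case-sensitive).
List<String> chocolateCakeList =
        jsonPath.getList("findAll { it.batter.findAll { it.type == 'Chocolate' } }.name");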

Integromat - Dynamically render spec of a collection from an rpc

I am trying to dynamically render the spec (specification) of a collection from an RPC, but I can't get it to work. I have attached the code of both the module's mappable parameters and the remote procedure's communication below.
module -> mappable parameters
[
  {
    "name": "birdId",
    "type": "select",
    "label": "Bird Name",
    "required": true,
    "options": {
      "store": "rpc://selectbird",
      "nested": [
        {
          "name": "variables",
          "type": "collection",
          "label": "Bird Variables",
          "spec": [
            "rpc://birdVariables"
          ]
        }
      ]
    }
  }
]
remote procedure -> communication
{
  "url": "/bird/get-variables",
  "method": "POST",
  "body": {
    "birdId": "{{parameters.birdId}}"
  },
  "headers": {
    "Authorization": "Apikey {{connection.apikey}}"
  },
  "response": {
    "iterate": {
      "container": "{{body.data}}"
    },
    "output": {
      "name": "{{item.name}}",
      "label": "{{item.label}}",
      "type": "{{item.type}}"
    }
  }
}
Thanks in advance.
I just tried the following and it worked. According to Integromat's docs, you can use the wrapper directive for the RPC like so:
{
  "url": "/bird/get-variables",
  "method": "POST",
  "body": {
    "birdId": "{{parameters.birdId}}"
  },
  "headers": {
    "Authorization": "Apikey {{connection.apikey}}"
  },
  "response": {
    "iterate": "{{body.data}}",
    "output": {
      "name": "{{item.name}}",
      "label": "{{item.label}}",
      "type": "{{item.type}}"
    },
    "wrapper": [{
      "name": "variables",
      "type": "collection",
      "label": "Bird Variables",
      "spec": "{{output}}"
    }]
  }
}
Your mappable parameters would then look like:
[
  {
    "name": "birdId",
    "type": "select",
    "label": "Bird Name",
    "required": true,
    "options": {
      "store": "rpc://selectbird",
      "nested": "rpc://birdVariables"
    }
  }
]
I need this myself. I'm pulling in custom fields that have different types, but I would like them all to show so the user can update custom fields, or be able to set them when creating a contact. I'm not sure whether it's best to show them all, or to have a select drop-down and let the user use the map for more than one.
Here is my response from a GET for custom fields. Could you show how my code should look? I got a little confused, as I usually look to add a value in the output, and do you need two separate RPCs in Integromat? I noticed your store and nested values were different.
{
  "customFields": [
    {
      "id": "5sCdYXDx5QBau2m2BxXC",
      "name": "Your Experience",
      "fieldKey": "contact.your_experience",
      "dataType": "LARGE_TEXT",
      "position": 0
    },
    {
      "id": "RdrFtK2hIzJLmuwgBtAr",
      "name": "Assisted by",
      "fieldKey": "contact.assisted_by",
      "dataType": "MULTIPLE_OPTIONS",
      "position": 0,
      "picklistOptions": [
        "Tom",
        "Jill",
        "Rick"
      ]
    },
    {
      "id": "uyjmfZwo0PCDJKg2uqrt",
      "name": "Is contacted",
      "fieldKey": "contact.is_contacted",
      "dataType": "CHECKBOX",
      "position": 0,
      "picklistOptions": [
        "I would like to be contacted"
      ]
    }
  ]
}
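Not an authoritative answer, but following the wrapper pattern above, a sketch of what the custom-fields RPC communication might look like; the /custom-fields URL is a placeholder, and each dataType would still need to be translated into a parameter type Integromat understands (hard-coded to "text" here):
{
  "url": "/custom-fields",
  "method": "GET",
  "headers": {
    "Authorization": "Apikey {{connection.apikey}}"
  },
  "response": {
    "iterate": "{{body.customFields}}",
    "output": {
      "name": "{{item.fieldKey}}",
      "label": "{{item.name}}",
      "type": "text"
    },
    "wrapper": [{
      "name": "customFields",
      "type": "collection",
      "label": "Custom Fields",
      "spec": "{{output}}"
    }]
  }
}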

Extracting lines from nested json file using Ubuntu

I have a nested JSON file (each field can have one or more children) with many lines, something like this:
{
  "id": "0001",
  "type": "donut",
  "name": "Cake",
  "ppu": 0.55,
  "batters": {
    "batter": [
      { "id": "1001", "type": "Regular" },
      { "id": "1002", "type": "Chocolate" },
      { "id": "1003", "type": "Blueberry" },
      { "id": "1004", "type": "Devil's Food" }
    ]
  },
  "topping": [
    { "id": "5001", "type": "None" },
    { "id": "5002", "type": "Glazed" },
    { "id": "5005", "type": "Sugar" },
    { "id": "5007", "type": "Powdered Sugar" },
    { "id": "5006", "type": "Chocolate with Sprinkles" },
    { "id": "5003", "type": "Chocolate" },
    { "id": "5004", "type": "Maple" }
  ]
}
Using Ubuntu (or Unix), I would like to find the first 10 lines whose "batters" field is not empty (i.e. lines in which one or more of its children contain a value that is not null or "").
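One possible sketch uses jq, assuming jq is installed and the file holds one JSON object per line (JSON Lines); input.json is a placeholder name:
# Keep only records whose "batters" field is present and non-empty, then take the first 10.
jq -c 'select(.batters != null and .batters != "" and (.batters | length) > 0)' input.json | head -n 10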

How to flatten two JSON arrays using Apache Drill

{
  "id": "0001",
  "type": "donut",
  "name": "Cake",
  "ppu": 0.55,
  "batters": {
    "batter": [
      { "id": "1001", "type": "Regular" },
      { "id": "1002", "type": "Chocolate" },
      { "id": "1003", "type": "Blueberry" },
      { "id": "1004", "type": "Devil's Food" }
    ]
  },
  "topping": [
    { "id": "5001", "type": "None" },
    { "id": "5002", "type": "Glazed" },
    { "id": "5005", "type": "Sugar" },
    { "id": "5007", "type": "Powdered Sugar" },
    { "id": "5006", "type": "Chocolate with Sprinkles" },
    { "id": "5003", "type": "Chocolate" },
    { "id": "5004", "type": "Maple" }
  ]
}
I have the above JSON and I want to flatten both the batter and topping arrays, so I tried:
SELECT flatten(topping) as toping,flatten(batters.batter) as bat FROM json.jsonfiles.`batter.json`;
which gives me
org.apache.drill.common.exceptions.UserRemoteException: VALIDATION ERROR: From line 1, column 43 to line 1, column 49: Table 'batters' not found SQL Query null [Error Id: 33cf80f2-f283-4401-90ce-c262474e0778 on acer:31010]
How can I solve this? Can we flatten two arrays in a single query?
You need to add a table alias and refer to it in the columns. The query below works for me with the sample data you provided:
SELECT flatten(a.topping) as toping,flatten(a.batters.batter) as bat FROM dfs.tmp.`batter.json` a;

Azure Data Factory Copy Activity

I have been working on this for a couple of days and cannot get past this error. I have two activities in this pipeline. The first activity copies data from an ODBC connection to an Azure database, which is successful. The second activity transfers the data from one Azure table to another Azure table and keeps failing.
The error message is:
Copy activity met invalid parameters: 'UnknownParameterName', Detailed message: An item with the same key has already been added..
I do not see any invalid parameters or unknown parameter names. I have rewritten this multiple times, both with their add-activity code template and by hand, but I do not receive any errors when deploying, only when it is running. Below is the JSON pipeline code.
Only the second activity is receiving an error.
Thanks.
Source dataset
{
  "name": "AnalyticsDB-SHIPUPS_06shp-01src_AZ-915PM",
  "properties": {
    "structure": [
      {
        "name": "UPSD_BOL",
        "type": "String"
      },
      {
        "name": "UPSD_ORDN",
        "type": "String"
      }
    ],
    "published": false,
    "type": "AzureSqlTable",
    "linkedServiceName": "Source-SQLAzure",
    "typeProperties": {},
    "availability": {
      "frequency": "Day",
      "interval": 1,
      "offset": "04:15:00"
    },
    "external": true,
    "policy": {}
  }
}
Destination dataset
{
  "name": "AnalyticsDB-SHIPUPS_06shp-02dst_AZ-915PM",
  "properties": {
    "structure": [
      {
        "name": "SHIP_SYS_TRACK_NUM",
        "type": "String"
      },
      {
        "name": "SHIP_TRACK_NUM",
        "type": "String"
      }
    ],
    "published": false,
    "type": "AzureSqlTable",
    "linkedServiceName": "Destination-Azure-AnalyticsDB",
    "typeProperties": {
      "tableName": "[olcm].[SHIP_Tracking]"
    },
    "availability": {
      "frequency": "Day",
      "interval": 1,
      "offset": "04:15:00"
    },
    "external": false,
    "policy": {}
  }
}
Pipeline
{
  "name": "SHIPUPS_FC_COPY-915PM",
  "properties": {
    "description": "copy shipments ",
    "activities": [
      {
        "type": "Copy",
        "typeProperties": {
          "source": {
            "type": "RelationalSource",
            "query": "$$Text.Format('SELECT COMPANY, UPSD_ORDN, UPSD_BOL FROM \"orupsd - UPS interface Dtl\" WHERE COMPANY = \\'01\\'', WindowStart, WindowEnd)"
          },
          "sink": {
            "type": "SqlSink",
            "sqlWriterCleanupScript": "$$Text.Format('delete imp_fc.SHIP_UPS_IntDtl_Tracking', WindowStart, WindowEnd)",
            "writeBatchSize": 0,
            "writeBatchTimeout": "00:00:00"
          },
          "translator": {
            "type": "TabularTranslator",
            "columnMappings": "COMPANY:COMPANY, UPSD_ORDN:UPSD_ORDN, UPSD_BOL:UPSD_BOL"
          }
        },
        "inputs": [
          {
            "name": "AnalyticsDB-SHIPUPS_03shp-01src_FC-915PM"
          }
        ],
        "outputs": [
          {
            "name": "AnalyticsDB-SHIPUPS_03shp-02dst_AZ-915PM"
          }
        ],
        "policy": {
          "timeout": "1.00:00:00",
          "concurrency": 1,
          "executionPriorityOrder": "NewestFirst",
          "style": "StartOfInterval",
          "retry": 3,
          "longRetry": 0,
          "longRetryInterval": "00:00:00"
        },
        "scheduler": {
          "frequency": "Day",
          "interval": 1,
          "offset": "04:15:00"
        },
        "name": "915PM-SHIPUPS-fc-copy->[imp_fc]_[SHIP_UPS_IntDtl_Tracking]"
      },
      {
        "type": "Copy",
        "typeProperties": {
          "source": {
            "type": "SqlSource",
            "sqlReaderQuery": "$$Text.Format('select distinct ups.UPSD_BOL, ups.UPSD_BOL from imp_fc.SHIP_UPS_IntDtl_Tracking ups LEFT JOIN olcm.SHIP_Tracking st ON ups.UPSD_BOL = st.SHIP_SYS_TRACK_NUM WHERE st.SHIP_SYS_TRACK_NUM IS NULL', WindowStart, WindowEnd)"
          },
          "sink": {
            "type": "SqlSink",
            "writeBatchSize": 0,
            "writeBatchTimeout": "00:00:00"
          },
          "translator": {
            "type": "TabularTranslator",
            "columnMappings": "UPSD_BOL:SHIP_SYS_TRACK_NUM, UPSD_BOL:SHIP_TRACK_NUM"
          }
        },
        "inputs": [
          {
            "name": "AnalyticsDB-SHIPUPS_06shp-01src_AZ-915PM"
          }
        ],
        "outputs": [
          {
            "name": "AnalyticsDB-SHIPUPS_06shp-02dst_AZ-915PM"
          }
        ],
        "policy": {
          "timeout": "1.00:00:00",
          "concurrency": 1,
          "executionPriorityOrder": "NewestFirst",
          "style": "StartOfInterval",
          "retry": 3,
          "longRetryInterval": "00:00:00"
        },
        "scheduler": {
          "frequency": "Day",
          "interval": 1,
          "offset": "04:15:00"
        },
        "name": "915PM-SHIPUPS-AZ-update->[olcm]_[SHIP_Tracking]"
      }
    ],
    "start": "2017-08-22T03:00:00Z",
    "end": "2099-12-31T08:00:00Z",
    "isPaused": false,
    "hubName": "adf-tm-prod-01_hub",
    "pipelineMode": "Scheduled"
  }
}
Have you seen this link?
They get the same error message and suggest using AzureTableSink instead of SqlSink
"sink": {
"type": "AzureTableSink",
"writeBatchSize": 0,
"writeBatchTimeout": "00:00:00"
}
It would make sense for you too, since your second copy activity is Azure to Azure.
It could be a red herring, but I'm pretty sure "tableName" is a required entry in typeProperties for a SQL source dataset, and yours is missing it for the input dataset. I appreciate you have a join in the sqlReaderQuery, so it's probably best to put a dummy (but real) table name in there, as sketched below.
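As a sketch (the table name is simply borrowed from the reader query, not confirmed), the input dataset's typeProperties would then look something like:
"typeProperties": {
  "tableName": "[imp_fc].[SHIP_UPS_IntDtl_Tracking]"
},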
By the way, it's not clear why you are using $$Text.Format and WindowStart/WindowEnd in your queries if you're not substituting those values into the query; you could just put the query between double quotes.
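For example (a sketch only, reusing the query already in the pipeline), the second activity's source could simply be:
"source": {
  "type": "SqlSource",
  "sqlReaderQuery": "select distinct ups.UPSD_BOL, ups.UPSD_BOL from imp_fc.SHIP_UPS_IntDtl_Tracking ups LEFT JOIN olcm.SHIP_Tracking st ON ups.UPSD_BOL = st.SHIP_SYS_TRACK_NUM WHERE st.SHIP_SYS_TRACK_NUM IS NULL"
}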