How can I convert a JSON object to a JSON array in Karate? - json

I want to convert a JSON object to a JSON array in Karate so that I can use the 'match each' function.
I am getting a ('match each' failed, not a json array) error when I use 'match each' with a JSON object.
Here is my JSON object:
{
  {
    "a": "q",
    "b": "w",
    "c": "t"
  },
  {
    "a": "x",
    "b": "y",
    "c": "z"
  }
}
And here is what I need:
[
  {
    "a": "q",
    "b": "w",
    "c": "t"
  },
  {
    "a": "x",
    "b": "y",
    "c": "z"
  }
]

Try this approach, using embedded expressions: https://github.com/intuit/karate#embedded-expressions
* def foo = { a: 1 }
* def list = [ '#(foo)' ]
* match each list == foo
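For reference, the underlying transformation — collecting the values of a JSON object into a JSON array — can be sketched in plain Python (this is an illustration, not Karate syntax; the "first"/"second" keys are added only so the object is valid JSON):

```python
import json

# An object whose values are the objects from the question.
obj = {
    "first":  {"a": "q", "b": "w", "c": "t"},
    "second": {"a": "x", "b": "y", "c": "z"},
}

# Collect the values into a list -- the JSON-array shape that 'match each' expects.
arr = list(obj.values())
print(json.dumps(arr))
```

In Karate the same idea applies: build a list from the object's values before applying 'match each'.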

Related

How do I collect values by keys from different levels in JQ?

Let's suppose I have a JSON like this:
[
{
"a": 1,
"l": [
{"b": "z"},
{"b": "x"}
]
},
{
"a": 2,
"l": [
{"b": "c"}
]
}
]
I want to collect the data from all embedded arrays and get an array of all objects with "a" and "b" values. For the JSON above, the result should be:
[
{"a": 1, "b": "z"},
{"a": 1, "b": "x"},
{"a": 2, "b": "c"}
]
What jq expression do I need to solve this?
You can use .l[] within the expression to expand each element of the nested "l" array, merging it with the {a} object picked from the parent. So, use this:
map({a} + .l[])
Demo
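As a cross-check of what map({a} + .l[]) does, the same flattening can be sketched in Python:

```python
data = [
    {"a": 1, "l": [{"b": "z"}, {"b": "x"}]},
    {"a": 2, "l": [{"b": "c"}]},
]

# For each outer object, merge {"a": ...} with every element of its "l" array,
# mirroring jq's map({a} + .l[]).
result = [{"a": item["a"], **entry} for item in data for entry in item["l"]]
print(result)
# [{'a': 1, 'b': 'z'}, {'a': 1, 'b': 'x'}, {'a': 2, 'b': 'c'}]
```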

Groovy: escape lower level JSON

I need to keep the first-level JSON keys and convert the values to escaped strings, but only when the values are themselves JSON objects. How can this be done in Groovy?
Input sample:
{
"a": "1",
"b": {
"c": "2",
"d": {
"e": "3"
}
},
"f": "4"
}
Desired result:
{
"a": "1",
"b": "{ \"c\": \"2\", \"d\": { \"e\": \"3\"} }",
"f": "4"
}
If you use JsonSlurper to parse the input JSON, any nested JSON object will be represented as a LazyMap. You can use this to collect all entries from the parsed JSON object (which is also a map) and convert any map value to its JSON string representation using the groovy.json.JsonOutput.toJson(object) method.
import groovy.json.JsonOutput
import groovy.json.JsonSlurper

def input = '''{
    "a": "1",
    "b": {
        "c": "2",
        "d": {
            "e": "3"
        }
    },
    "f": "4"
}'''

def json = new JsonSlurper().parseText(input)
def escaped = json.collectEntries { k, v ->
    [(k): v instanceof Map ? JsonOutput.toJson(v) : v]
}
def output = JsonOutput.prettyPrint(JsonOutput.toJson(escaped))
println output
Output:
{
"a": "1",
"b": "{\"c\":\"2\",\"d\":{\"e\":\"3\"}}",
"f": "4"
}
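For comparison, the same first-level escaping can be sketched in Python, where json.dumps plays the role of JsonOutput.toJson:

```python
import json

input_str = '''{
    "a": "1",
    "b": { "c": "2", "d": { "e": "3" } },
    "f": "4"
}'''

parsed = json.loads(input_str)
# Re-serialize only the values that are themselves objects (dicts).
escaped = {k: json.dumps(v) if isinstance(v, dict) else v
           for k, v in parsed.items()}
print(json.dumps(escaped, indent=2))
```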

KarateException Missing Property in path - JSON

I was trying to match a particular field from the response as below, but I am getting an error: KarateException: Missing property in path $['Odata']. My question is: how can I modify this so that I don't get the error?
Feature:
And match response.#odata.context.a.b contains '<b>'
Examples:
|b|
|b1 |
|b2 |
Response is
{
  "#odata.context": "$metadata#Accounts",
  "a": [
    {
      "c": 145729,
      "b": "b1",
      "d": "ON"
    },
    {
      "c": 145729,
      "b": "b2",
      "d": "ON"
    }
  ]
}
I think you are confused by the structure of your JSON. Also note that when a JSON key contains special characters, you need to use bracket notation in path expressions. Try pasting the below into a new Scenario and see it work:
* def response =
  """
  {
    "#odata.context": "$metadata#Accounts",
    "a": [
      {
        "c": 145729,
        "b": "b1",
        "d": "ON"
      },
      {
        "c": 145729,
        "b": "b2",
        "d": "ON"
      }
    ]
  }
  """
* match response['#odata.context'] == '$metadata#Accounts'
* match response.a[0].b == 'b1'
* match response.a[1].b == 'b2'

Snakemake : multi-level json parsing

I have a JSON configuration file that looks like:
{
    "projet_name": "Project 1",
    "samples": [
        {
            "sample_name": "Sample_A",
            "files": [
                {
                    "a": "file_A_a1.txt",
                    "b": "file_A_a2.txt",
                    "x": "x1"
                },
                {
                    "a": "file_A_b1.txt",
                    "b": "file_A_b2.txt",
                    "x": "x1"
                },
                {
                    "a": "file_A_c1.txt",
                    "b": "file_A_c2.txt",
                    "x": "x2"
                }
            ]
        },
        {
            "sample_name": "Sample_B",
            "files": [
                {
                    "a": "file_B_a1.txt",
                    "b": "file_B_a2.txt",
                    "x": "x1"
                },
                {
                    "a": "file_B_b1.txt",
                    "b": "file_B_b2.txt",
                    "x": "x1"
                }
            ]
        }
    ]
}
I'm currently writing a Snakemake file to process such a JSON file. The idea is, for each sample (e.g. Sample_A, Sample_B), to concatenate the files that have the same "x" entry. For example, in Sample_A I would like to concatenate the "a" files file_A_a1.txt and file_A_b1.txt, as they have the same "x" entry; likewise the "b" files file_A_a2.txt and file_A_b2.txt. file_A_c1.txt and file_A_c2.txt will not be concatenated with other files, as they have a unique "x". At the end I would like a structure like this:
merged_files/Sample_A_a_x1.txt
merged_files/Sample_A_b_x1.txt
merged_files/Sample_A_a_x2.txt
merged_files/Sample_A_b_x2.txt
merged_files/Sample_B_a_x1.txt
merged_files/Sample_B_b_x1.txt
My issue is grouping the files that share the same "sample_name" and the same "x". Any suggestions?
Thank you
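The grouping itself can be sketched in plain Python, which a Snakefile can reuse directly since Snakemake rules are Python-based. The group_files helper name is illustrative; the target pattern follows the merged_files/{sample}_{kind}_{x}.txt layout from the question:

```python
from collections import defaultdict

# Config structure from the question (abbreviated to the relevant keys).
config = {
    "samples": [
        {"sample_name": "Sample_A", "files": [
            {"a": "file_A_a1.txt", "b": "file_A_a2.txt", "x": "x1"},
            {"a": "file_A_b1.txt", "b": "file_A_b2.txt", "x": "x1"},
            {"a": "file_A_c1.txt", "b": "file_A_c2.txt", "x": "x2"},
        ]},
        {"sample_name": "Sample_B", "files": [
            {"a": "file_B_a1.txt", "b": "file_B_a2.txt", "x": "x1"},
            {"a": "file_B_b1.txt", "b": "file_B_b2.txt", "x": "x1"},
        ]},
    ]
}

def group_files(config):
    """Map (sample_name, file_kind, x) -> list of files to concatenate."""
    groups = defaultdict(list)
    for sample in config["samples"]:
        for entry in sample["files"]:
            for kind in ("a", "b"):
                groups[(sample["sample_name"], kind, entry["x"])].append(entry[kind])
    return groups

groups = group_files(config)
# One merged output file per (sample, kind, x) group.
targets = [f"merged_files/{s}_{k}_{x}.txt" for (s, k, x) in groups]
```

In a Snakefile, an input function for the concatenation rule could then look up groups[(wildcards.sample, wildcards.kind, wildcards.x)] for each wildcard combination, with targets serving as the all-rule's inputs.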

Julia | DataFrame conversion to JSON

I have a DataFrame in Julia, like df = DataFrame(A = 1:4, B = ["M", "F", "F", "M"]). I have to convert it into JSON like:
{
"nodes": [
{
"A": "1",
"B": "M"
},
{
"A": "2",
"B": "F"
},
{
"A": "3",
"B": "F"
},
{
"A": "4",
"B": "M"
}
]
}
Please help me in this.
There isn't a method in DataFrames.jl to do this directly. The following snippet, using JSON.jl, was offered in a GitHub issue as a way to write JSON:
using JSON
using DataFrames

function df2json(df::DataFrame)
    len = length(df[:, 1])
    indices = names(df)
    jsonarray = [Dict([string(index) => (isna(df[index][i]) ? nothing : df[index][i])
                       for index in indices])
                 for i in 1:len]
    return JSON.json(jsonarray)
end

function writejson(path::String, df::DataFrame)
    open(path, "w") do f
        write(f, df2json(df))
    end
end
The JSONTables package provides JSON conversion to/from Tables.jl-compatible sources such as DataFrame:
using DataFrames
using JSONTables
df = DataFrame(A = 1:4, B = ["M", "F", "F", "M"])
jsonstr = objecttable(df)
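For comparison, the row-oriented "nodes" shape from the question can be sketched in plain Python (pandas users would reach for to_json(orient="records"); here the standard library suffices), with columns mirroring the DataFrame above:

```python
import json

# Columns mirroring df = DataFrame(A = 1:4, B = ["M", "F", "F", "M"]).
columns = {"A": [1, 2, 3, 4], "B": ["M", "F", "F", "M"]}

# Zip the columns into row dicts and wrap them under "nodes".
rows = [dict(zip(columns, values)) for values in zip(*columns.values())]
print(json.dumps({"nodes": rows}, indent=2))
```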