Tcl to JSON: Writing JSON output using huddle in a single line

Let us consider the following Tcl data:
set arr {a {{c 1} {d {2 2 2} e 3}} b {{f 4 g 5}}}
It is converted into JSON format using the huddle module:
set json_arr [huddle jsondump [huddle compile {dict * {list {dict d list}}} $arr]]
puts $json_arr
JSON formatted array:
{
  "a": [
    {"c": 1},
    {
      "d": [
        2,
        2,
        2
      ],
      "e": 3
    }
  ],
  "b": [{
      "f": 4,
      "g": 5
    }]
}
Writing in a single line:
set json_arr [huddle jsondump [huddle compile {dict * {list {dict d list}}} $arr] {} {}]
puts $json_arr
Updated JSON formatted array:
{"a":[{"c":1},{"d":[2,2,2],"e":3}],"b":[{"f":4,"g":5}]}
What is the meaning of {} {} here?
Can I use the same approach to get single-line output from the json and json::write modules?

The last three (optional) arguments to jsondump are offset, newline, and begin_offset. You can use them to specify the strings used to format the output. If you don't specify them, default strings will be used.
If you do specify them, you need to follow the protocol for optional arguments, i.e. if you want to specify begin_offset, you need to specify offset and newline too, etc. In this case, offset and newline are specified to be empty strings, and begin_offset uses its default value.
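For instance, to keep real newlines but indent with tabs, it is enough to pass the first two optional arguments (a small sketch reusing $arr from above):
set h [huddle compile {dict * {list {dict d list}}} $arr]
puts [huddle jsondump $h "\t" "\n"]
# Same layout as the default output, but indented with tab characters.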
Try invoking jsondump with dummy values to get an idea of how they are used:
% huddle jsondump [huddle compile {dict * {list {dict d list}}} $arr] <A> <B> <C>
{<B><C><A>"a": [<B><C><A><A>{"c": 1},<B><C><A><A>{<B><C><A><A><A>"d": [<B><C><A><A><A><A>2,<B><C><A><A><A><A>2,<B><C><A><A><A><A>2<B><C><A><A><A>],<B><C><A><A><A>"e": 3<B><C><A><A>}<B><C><A>],<B><C><A>"b": [{<B><C><A><A><A>"f": 4,<B><C><A><A><A>"g": 5<B><C><A><A>}]<B><C>}
A newline and a begin_offset string are inserted around each component, and one or more offset strings are inserted before a component to reflect the indentation level.
The json::write package, by contrast, uses its indented and aligned subcommands to customize formatting.
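For example, a minimal sketch assuming the Tcllib json::write package (the data below is made up just for illustration):
package require json::write

# Disable alignment first, then indentation, to get single-line output.
json::write aligned 0
json::write indented 0

puts [json::write object \
    a [json::write array [json::write object c 1]] \
    b [json::write array [json::write object f 4 g 5]]]
# Prints the whole object on one line, e.g. {"a":[{"c":1}],"b":[{"f":4,"g":5}]}
# (exact spacing may differ slightly).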

Related

Search and replace based on a dictionary

I have a JSON file containing a list of objects, each of which has a field called url.
[
  { ...,
    ...,
    "url": "us.test.com"
  },
  ...
]
In a different file I have a list of mappings that I need to replace the affected url fields with, formatted like this:
us.test.com test.com
hello.com/se hello.com
...
So the end result should be:
[
  { ...,
    ...,
    "url": "test.com"
  },
  ...
]
Is there a way to do this in Vim or do I need to do it programmatically?
Well, I'd do this programmatically in Vim ;-) As you'll see it's quite similar to Python and many other scripting languages.
Let's suppose we have the JSON file open. Then
:let foo = json_decode(join(getline(1, '$')))
will load the JSON into a VimScript variable. So :echo foo will show [{'url': 'us.test.com'}, {'url': 'hello.com/se'}].
Now let's switch to the "mapping" file. We're going to split all lines and make a Dictionary like this:
:let bar = {}
:for line in getline(1, '$') | let field = split(line) | let bar[field[0]] = field[1] | endfor
Now :echo bar shows {'hello.com/se': 'hello.com', 'us.test.com': 'test.com'} as expected.
To perform a substitution we do simply:
:for field in foo | let field.url = bar->get(field.url, field.url) | endfor
And now foo contains [{'url': 'test.com'}, {'url': 'hello.com'}] which is what we want. The remaining step is to write the new value into a buffer with
:put =json_encode(foo)
You could…
turn those lines in your mappings file (/tmp/mappings for illustration purposes):
us.test.com test.com
hello.com/se hello.com
...
into:
g/"url"/s#us.test.com#test.com#g
g/"url"/s#hello.com/se#hello.com#g
...
with:
:%normal Ig/"url"/s#
:%s/ /#
:%normal A#g
The idea is to turn the file into a script that will perform all those substitutions on all lines matching "url".
If you are confident that those strings are only in "url" lines, you can just do:
:%normal I%s#
:%s/ /#
:%normal A#g
to obtain:
%s#us.test.com#test.com#g
%s#hello.com/se#hello.com#g
...
write the file:
:w
and source it from your JSON file:
:source /tmp/mappings
See :help :g, :help :s, :help :normal, :help :range, :help :source, and :help pattern-delimiter.

Iterate over all possible combinations of input variables in Mathematica

I have a self-defined function in Mathematica, which has the following syntax:
outputval = myfunc[r, sigma, S, K, T, lambda, eta1, eta2, p]
When the function is called as above with numeric input values, it outputs a single output value.
For each input variable I have 5 different values. I want to feed every combination of the 5 values of the 9 input variables into my function and export a CSV file containing the 9 input values and the corresponding output value in the 10th column.
I am very new to Mathematica and I have no clue how to do so. Any help is appreciated:)
A small example will illustrate how to get what you want:
xvals = {1, 2}
yvals = {3, 4}
{Sequence @@ #, f @@ #} & /@ Tuples[{xvals, yvals}]
Warning: 5^9 == 1953125, so you may wish to use a Do loop and write directly to a file instead of creating these lists in memory. To illustrate:
fmt = StringTemplate["``,``,``"];
Do[Print[fmt[x, y, f[x, y]]], {x, xvals}, {y, yvals}]
You'll want to replace Print with WriteLine.
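For instance, a rough sketch of that idea using the toy xvals/yvals example above (the file name results.csv is just a placeholder):
stream = OpenWrite["results.csv"];
Do[WriteLine[stream, StringRiffle[ToString /@ {x, y, f[x, y]}, ","]],
  {x, xvals}, {y, yvals}]
Close[stream];
(* results.csv now holds one comma-separated line per combination of x and y *)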

Ruby output numbers to 2 decimal places

I'm having trouble serializing my Ruby object to JSON, more specifically with the format of the numbers.
I have written an RSpec test to illustrate my issue more precisely.
expected = '{ "foo": 1.00, "bar": 4.50, "abc": 0.00, "xyz": 1.23 }'
it 'serializes as expected' do
  my_hash = { "foo": 1, "bar": 4.5, "abc": 0, "xyz": 1.23 }
  expect(my_to_json_method(my_hash)).to eq expected
end
This is the case that I am having trouble with. I know I can use sprintf, but how do I get the string output shown in the example above?
First of all, you should not use floats to represent monetary values. So instead, let's use a more appropriate type (there's also the Ruby Money gem):
require 'bigdecimal'
require 'json' # for Hash#to_json below

my_hash = {
  foo: BigDecimal('1.00'),
  bar: BigDecimal('4.50'),
  abc: BigDecimal('0.00'),
  xyz: BigDecimal('1.23')
}
There are several options to represent monetary values. All of the following JSON strings are valid according to the JSON specification and all require special treatment upon parsing. It's up to you to choose the most appropriate.
Note: I'm implementing a custom to_json method to convert the BigDecimal instances to JSON using Ruby's default JSON library. This is just for demonstration purposes; you should generally not patch core (or stdlib) classes.
1. Numbers with fixed precision
This is what you asked for. Note that many JSON libraries will parse these numbers as floating point values by default.
class BigDecimal
  def to_json(*)
    '%.2f' % self
  end
end
puts my_hash.to_json
Output:
{"foo":1.00,"bar":4.50,"abc":0.00,"xyz":1.23}
2. Numbers as strings
This will work across all JSON libraries, but storing numbers as strings doesn't look quite right to me.
class BigDecimal
  def to_json(*)
    '"%.2f"' % self
  end
end
puts my_hash.to_json
Output:
{"foo":"1.00","bar":"4.50","abc":"0.00","xyz":"1.23"}
3. Numbers as integers
Instead of representing monetary values as fractional numbers, you simply output the cents as whole numbers. This is what I usually do.
class BigDecimal
  def to_json(*)
    (self * 100).to_i.to_s
  end
end
puts my_hash.to_json
Output:
{"foo":100,"bar":450,"abc":0,"xyz":123}
Use sprintf
sprintf('%.2f', 5.5)
and simply interpolate the result into your JSON, e.g. via an ERB template.
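For example, a rough sketch of that ERB idea (the template string and variable names here are made up for illustration):
require 'erb'

template = '{ "foo": <%= sprintf("%.2f", foo) %>, "bar": <%= sprintf("%.2f", bar) %> }'
foo, bar = 1, 4.5
puts ERB.new(template).result(binding)
# => { "foo": 1.00, "bar": 4.50 }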
You can use sprintf and get as many decimal places as you need by specifying %.(number)f.
E.g. for two decimals: %.2f
Here is a real implementation:
2.2.2 :019 > test = { "foo": (sprintf "%.2f","1.11"), "bar": (sprintf "%.2f","4.55"), "abc": (sprintf "%.2f","0.2") }
=> {:foo=>"1.11", :bar=>"4.55", :abc=>"0.20"}
puts '{' << my_hash.map { |k, v| %Q|"#{k}": #{"%.2f" % v}| }.join(', ') << '}'
#⇒ {"foo": 1.00, "bar": 4.50, "abc": 0.00, "xyz": 1.23}

Using RJSONIO and AsIs class

I am writing some helper functions to convert my R variables to JSON. I've come across this problem: I would like my values to be represented as JSON arrays; according to the RJSONIO documentation this can be done using the AsIs class.
x = "HELLO"
toJSON(list(x = I(x)), collapse="")
"{ \"x\": [ \"HELLO\" ] }"
But say we have a list
y = list(a = "HELLO", b = "WORLD")
toJSON(list(y = I(y)), collapse="")
"{ \"y\": {\n \"a\": \"HELLO\",\n\"b\": \"WORLD\" \n} }"
The value found in y -> a is NOT represented as an array. Ideally I would have
"{ \"y\": [{\n \"a\": \"HELLO\",\n\"b\": \"WORLD\" \n}] }"
Note the square brackets. Also I would like to get rid of all "\n"s, but collapse does not eliminate the line breaks in nested JSON. Any ideas?
Try writing it as
y = list(list(a = "HELLO", b = "WORLD"))
test<-toJSON(list(y = I(y)), collapse="")
When you write it to a file it appears as:
{ "y": [
{
"a": "HELLO",
"b": "WORLD"
}
] }
I guess you could remove the \n with
test<-gsub("\n","",test)
or use the rjson package:
> rjson::toJSON(list(y = I(y)))
[1] "{\"y\":[{\"a\":\"HELLO\",\"b\":\"WORLD\"}]}"
The reason:
> names(list(a = "HELLO", b = "WORLD"))
[1] "a" "b"
> names(list(list(a = "HELLO", b = "WORLD")))
NULL
Examining rjson::toJSON you will find this snippet of code:
if (!is.null(names(x)))
    return(toJSON(as.list(x)))
str = "["
So it would appear that an unnamed list is needed for the value to be treated as a JSON array. Maybe RJSONIO is similar.

Check whether a procedure returns a list or a list with sublists

I am facing a problem: how do I check whether the list returned by a procedure consists of a single flat list or may have sublists inside?
#simple list
set a { 1 2 3 4}
# list consisting of sub list
set a { {1 2 3 4} {5 6 7 7} }
As above, sometimes the variable a will hold a flat list and sometimes the proc will return a list consisting of sublists.
Update:
set a [mysqlsel $db "SELECT * FROM abc" -list]
I do not know whether the query will return a single list or a list consisting of sublists.
You should really rethink your approach: since Tcl is typeless, you can't really tell if {{1 2 3 4} {5 6 7 8}} is a list of two lists or a list of two strings or a literal string {1 2 3 4} {5 6 7 8}, because all these propositions are true depending on how you make Tcl interpret this value.
Another thing: even if you were to try something like catch {lindex $element 0} or string is list $element on each top-level element to see whether it can be interpreted as a list, the only strings that would qualify as non-lists are those that really can't be parsed as lists, like aaa { bbb. And the string foo is also a proper list (of length 1, containing "foo" as its sole element).
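A quick demonstration of how undiscriminating such a check is (a sketch; string is list needs Tcl 8.5 or later):
puts [string is list {1 2 3 4}]             ;# 1 -- flat list
puts [string is list {{1 2 3 4} {5 6 7 7}}] ;# 1 -- list of sublists
puts [string is list foo]                   ;# 1 -- one-element list
puts [string is list "aaa \{ bbb"]          ;# 0 -- genuinely not parsable as a list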
One approach you can consider is wrapping the returned value in another value that has some sort of "tag" attached to it, a trick routinely used in some other typeless languages such as Lisp and Erlang. That would look like this:
If you need to return 1 2 3 4, return {flat {1 2 3 4}} instead.
If you need to return {1 2 3 4} {5 6 7 8}, return {nested {{1 2 3 4} {5 6 7 8}}} instead.
Then, in the client code, switch on the "tag" element and unwrap the payload:
lassign [your_procedure ...] tag payload
switch -- $tag {
    flat {
        # do something with $payload
    }
    nested {
        # do something with $payload
    }
}
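For completeness, a toy producer and consumer following this convention might look like this (the procedure names are made up):
proc give_flat {}   { return {flat {1 2 3 4}} }
proc give_nested {} { return {nested {{1 2 3 4} {5 6 7 7}}} }

lassign [give_nested] tag payload
switch -- $tag {
    flat   { puts "flat list: $payload" }
    nested { puts "list of [llength $payload] sublists" }
}
# prints: list of 2 sublists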