I have an environment like this to start from:
(def field-name "blah")
(def obj (js* "{
  list: [1,2,3,4,5],
  blah: \"vtha\",
  o: { answer: 42 }
}"))
How do I get (in an idiomatic way) the blah field using the field-name var?
(aget obj field-name)
works, but the docs say it is intended for arrays.
You can use goog.object/get, and I think this is the idiomatic way to access the properties.
I would also recommend binaryage/cljs-oops, which addresses this very problem.
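For example, a minimal sketch, assuming a namespace that requires goog.object and, for the cljs-oops variant, oops.core (oget+ is the dynamic-selector form for keys only known at runtime):

(ns example.core
  (:require [goog.object :as gobj]
            [oops.core :refer [oget+]]))

;; goog.object/get takes the object, the property name as a string,
;; and an optional default value.
(gobj/get obj field-name)        ;; => "vtha"
(gobj/get obj "missing" :none)   ;; => :none

;; With cljs-oops, a runtime-determined key uses the + variant.
(oget+ obj field-name)           ;; => "vtha"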
I'm returning this kind of data to a variable:
{"numbers":[0.8832325122263557,0.9905363563950811, ...]}
How can I remove this "numbers": string and keep just the []?
Here's a script that
Uses Mix.install/2 to install Jason. In a Mix project you would add :jason to your deps in mix.exs instead.
Uses the ~S sigil to quote the JSON without needing to escape the " characters.
Uses Jason.decode!/2 to parse the JSON.
Uses Map.get/3 to get the value of the numbers key from the resulting map.
Inspects the value with IO.inspect/2, causing it to print out.
Uses |> pipes to pass the data between function calls succinctly.
Mix.install([:jason])
~S({"numbers":[0.8832325122263557,0.9905363563950811]})
|> Jason.decode!()
|> Map.get("numbers")
|> IO.inspect()
Running and output:
$ elixir example.exs
[0.8832325122263557, 0.9905363563950811]
Just to propose an alternative solution without a JSON parser, I would use Regex. The snippet below:
Captures everything inside the brackets into a named group called numbers
Splits the captured string by ,
Converts the resulting strings to floats
string = ~S({"numbers":[0.8832325122263557,0.9905363563950811]})
~r/(?:.+\[)(?'numbers'.+)(?:\].+)/
|> Regex.scan(string, capture: ["numbers"])
|> List.first()
|> List.first()
|> String.split(",")
|> Enum.map(&String.to_float/1)
This works well if your input is always expected to be like {"numbers":[0.8832325122263557,0.9905363563950811, ...]}.
I'm learning Elm and one thing that has puzzled me is 'Json.Decode.succeed'. According to the docs
succeed : a -> Decoder a
Ignore the JSON and produce a certain Elm value.
decodeString (succeed 42) "true" == Ok 42
decodeString (succeed 42) "[1,2,3]" == Ok 42
decodeString (succeed 42) "hello" == Err ...
I understand that (although, as a beginner, I don't yet see its use). But this function is also used in a decode pipeline, thus:
somethingDecoder : Maybe Wookie -> Decoder Something
somethingDecoder maybeWookie =
Json.Decode.succeed Something
|> required "caterpillar" Caterpillar.decoder
|> required "author" (Author.decoder maybeWookie)
What is going on here? That is, if 'succeed' ignores the JSON that's passed to it, how is it used to read JSON and turn it into Elm values? Any clues appreciated!
Just to start, the intuition for a decoder pipeline is that it acts like a curried function, where piping with required and optional applies arguments one by one. Except that everything, the function, its arguments and the return value, is wrapped in a Decoder.
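For concreteness, the sketches below assume Something is a record constructor taking an Int and a String; this exact definition is an assumption for illustration, not something from the question:

-- Hypothetical type; the field names are made up for illustration.
type alias Something =
    { answer : Int
    , label : String
    }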
So as an example:
succeed Something
|> required (succeed 42)
|> required (succeed "foo")
is equivalent to
succeed (Something 42 "foo")
and
decodeString (succeed (Something 42 "foo")) whatever
will return Ok (Something 42 "foo") as long as whatever is valid JSON.
When everything succeeds it's just a really convoluted function call. The more interesting aspect of decoders, and the reason we use them in the first place, is in the error path. But since 'succeed' is what's of interest here, we'll ignore that and save a lot of time, text and brain cells. Just know that without considering the error path this will all seem very contrived.
Anyway, let's try to recreate this to see how it works.
Decode.map2
The key to the pipelines, apart from the pipe operator, is the Decode.map2 function. You've probably already used it, or its siblings, if you've tried writing JSON decoders without using pipelines. We can implement our example above using map2 like this:
map2 Something
(succeed 42)
(succeed "foo")
This will work exactly like the example above. But the problem with this, from a user POV, is that if we need to add another argument we also have to change map2 to map3. And also Something isn't wrapped in a decoder, which is boring.
Calling functions wrapped in Decoders
The reason this is useful anyway is that it gives us access to several values at the same time, and the ability to combine them in whatever way we want. We can use this to call a function inside a Decoder with an argument inside a Decoder and have the result also wrapped in a Decoder:
map2 (\f x -> f x)
(succeed String.fromInt)
(succeed 42)
Currying and partial application
Unfortunately this still has the problem of needing to change the map function if we need more arguments. If only there was a way to apply arguments to a function one at a time... like if we had currying and partial application. Since we have a way to call functions wrapped in decoders now, what if we return a partially applied function instead and apply the remaining arguments later?
map2 (\f x -> f x)
(succeed Something)
(succeed 42)
will return a Decoder (String -> Something), so now we just have to rinse and repeat with this and the last argument:
map2 (\f x -> f x)
(map2 (\f x -> f x)
(succeed Something)
(succeed 42))
(succeed "")
Et voila, we have now recreated JSON decode pipelines! Although it might not look like it on the surface.
Ceci n'est pas une pipe
The final trick is to use map2 with the pipe operator. The pipe is essentially defined as \x f -> f x. See how similar this looks to the function we've been using? The only difference is that the arguments are swapped around, so we need to swap the order we pass arguments as well:
map2 (|>)
(succeed "")
(map2 (|>)
(succeed 42)
(succeed Something))
and then we can use the pipe operator again to reach the final form
succeed Something
|> map2 (|>)
(succeed 42)
|> map2 (|>)
(succeed "")
It should now be apparent that required is essentially just map2 (|>). (Strictly speaking, map2 (|>) is the library's custom function; required "field" decoder adds the field lookup on top, as custom (field "field" decoder).)
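If you want to check that against the real library, this is roughly how NoRedInk/elm-json-decode-pipeline defines the two (paraphrased, not verbatim, and assuming import Json.Decode as Decode exposing (Decoder)):

custom : Decoder a -> Decoder (a -> b) -> Decoder b
custom =
    Decode.map2 (|>)

required : String -> Decoder a -> Decoder (a -> b) -> Decoder b
required key valDecoder decoder =
    custom (Decode.field key valDecoder) decoder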
And that's all there is to it!
It's possible to invoke a ClojureScript function from JavaScript, for example:
cljs.core.keyword("foobar")
returns the :foobar keyword, and positional arguments work as you would expect. I'm trying to invoke js->clj with the :keywordize-keys argument, but so far I haven't been successful. I've tried:
cljs.core.js__GT_clj({'foo': 42}, {'keywordize-keys': true})
// and
var k = cljs.core.keyword('keywordize-keys')
cljs.core.js__GT_clj({'foo': 42}, {k: true})
but neither seems to work as I had hoped. In general, how do you specify keyword arguments when calling from JS into cljs?
In cljs you call the function like this:
(js->clj #js {"foo" 42} :keywordize-keys true)
And the corresponding JS code; keyword arguments are not a separate calling convention in ClojureScript (js->clj is declared as ([x] [x & opts])), so the keyword and the flag are simply passed as two more positional arguments:
var k = cljs.core.keyword('keywordize-keys')
cljs.core.js__GT_clj({'foo': 42}, k, true)
Currently reading / working my way through "Programming in Scala, First Edition", specifically Chapter 31: Combinator Parsing
The author is describing how to parse a JSON file and offers the following more advanced transformations:
def obj: Parser[Map[String, Any]] = // Can be improved
"{"~repsep(member, ",")~"}" ^^
{ case "{"~ms~"}" => Map() ++ ms }
later improved to:
def obj: Parser[Map[String, Any]] =
"{"~> repsep(member, ",") <~"}" ^^ (Map() ++ _)
However, when I enter such code into my IDE (IntelliJ IDEA 14.03), the compiler rejects it with:
Expression of type JSON.this.type#Parser[Iterable[Any]] doesn't
conform to expected type JSON.this.type#Parser[Map[String,Any]]
I can, of course, make this error go away by changing obj's type to Parser[Iterable[Any]], but this doesn't give the desired result.
What is the correct way to do this?
For whatever it is worth, I'm using JDK 1.7.0_71 and Scala SDK 2.11.5.
It depends on the parser for "member".
I guess you are using a parser like:
def member: Parser[Any]
As in the book's example, make the member parser return a (String, Any) pair; Map() ++ ms only yields a Map[String, Any] when ms is a list of key/value pairs, which is exactly the mismatch the compiler is reporting:
def member: Parser[(String, Any)] =
stringLiteral~":"~value ^^
{ case name~":"~value => (name, value) }
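Putting it together, a minimal trimmed-down sketch in the spirit of the book's parser (value here only handles nested objects, strings and numbers; the book's version also covers arrays, booleans and null):

import scala.util.parsing.combinator._

class JSON extends JavaTokenParsers {
  // Trimmed value parser; the book also handles arrays, "null", "true" and "false".
  def value: Parser[Any] =
    obj | stringLiteral | floatingPointNumber ^^ (_.toDouble)

  // member produces a (String, Any) pair, so repsep(member, ",") is a
  // Parser[List[(String, Any)]] and Map() ++ _ yields a Map[String, Any].
  def member: Parser[(String, Any)] =
    stringLiteral ~ ":" ~ value ^^ { case name ~ ":" ~ v => (name, v) }

  def obj: Parser[Map[String, Any]] =
    "{" ~> repsep(member, ",") <~ "}" ^^ (Map() ++ _)
}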
I have another question regarding decoding JSON in Common Lisp. I settled on ST-JSON as my tool. I am able to get a JSO object containing the JSON data and access all of its fields with st-json:getjso. I wanted to write a macro, similar in principle to destructuring-bind, that would provide local bindings for variables named after the JSON fields (since then I have started to doubt whether this is a good idea, but that's a different question). I came up with the following:
(defmacro destructure-jso (jso &body body)
  (let (keys values)
    (st-json:mapjso #'(lambda (key value)
                        (push key keys)
                        (push value values))
                    jso)
    `(destructuring-bind ,keys ,values
       ,@body)))
But when I try to use it on a JSO object, I get the error "The value PARAMS is not of the expected type STRUCTURE.", where PARAMS is the object. Can someone explain this to me?
Thanks.
Apparently, you are using destructure-jso like this:
(let ((params (st-json:read-json-from-string "{\"foo\":42,\"bar\":\"baz\"}")))
(destructure-jso params
(list foo bar)))
However, destructure-jso, being a macro, is handled at macro expansion time, long before the JSON even gets parsed. params is passed to your macro as a symbol, without being evaluated; and even if its evaluation were attempted, it would be unbound at that point.
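A minimal illustration of this, using a hypothetical helper macro (not part of ST-JSON): at expansion time a macro only sees the unevaluated form, here the symbol PARAMS, which is why st-json:mapjso then signals the type error.

;; Hypothetical helper: report what the macro receives at expansion time.
(defmacro show-what-the-macro-sees (jso)
  ;; TYPE-OF runs at macro expansion time, on the form itself.
  `',(type-of jso))

(let ((params (st-json:read-json-from-string "{\"foo\":42}")))
  (show-what-the-macro-sees params))
;; => SYMBOL, not an ST-JSON:JSO object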
So, if you want to write a destructure-jso, you will need the list of keys at macro expansion time. You could pass the list in a normal way:
> (defmacro destructure-jso-2 (vars json &body body)
    `(let ,(mapcar #'(lambda (var)
                       (list var `(getjso ,(string-downcase (symbol-name var)) ,json)))
                   vars)
       ,@body))
DESTRUCTURE-JSO-2
> (let ((params (st-json:read-json-from-string "{\"foo\":42,\"bar\":\"baz\"}")))
(destructure-jso-2 (foo bar)
params
(list foo bar)))
(42 "baz")
Or, if you like, use a "template" JSON for creating the mappings:
> (defmacro destructure-jso-3 (template json &body body)
    (let (bindings)
      (st-json:mapjso #'(lambda (key val)
                          (declare (ignore val))
                          (push (list (intern (string-upcase key)) `(getjso ,key ,json))
                                bindings))
                      (st-json:read-json-from-string template))
      `(let ,bindings
         ,@body)))
DESTRUCTURE-JSO-3
> (let ((params (st-json:read-json-from-string "{\"foo\":42,\"bar\":\"baz\"}")))
(destructure-jso-3 "{\"foo\":null,\"bar\":null}"
params
(list foo bar)))
(42 "baz")
Here, the variable bindings come from the first (template) JSON and the values from the second one. The template JSON is parsed at macro expansion time; the params JSON is parsed every time your code is executed.
Whether either of these is a useful approach for you or not, I do not know.