I have a series of ClojureScript objects, but only some of them implement a certain protocol. How can I detect whether the protocol is extended by a particular object?
You can use satisfies? to check if the object extends the protocol.
(defprotocol p
  (go [_]))

(deftype t []
  p
  (go [this] true))

(satisfies? p (t.)) ;=> true
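Conversely, satisfies? returns false when the type does not extend the protocol. A minimal sketch, reusing the protocol p from above; the type u here is made up for illustration:

```clojure
;; u deliberately does not implement p
(deftype u [])

(satisfies? p (u.)) ;=> false
```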
I have the following environment to start from:
(def field-name "blah")

(def obj (js* "{
  list: [1,2,3,4,5],
  blah: \"vtha\",
  o: { answer: 42 }
}"))
What is the idiomatic way to get the blah field using the field-name var?
(aget obj field-name)
works, but the docs say it is intended for arrays.
You can use goog.object/get, and I think this is the idiomatic way to access the properties.
I would also recommend binaryage/cljs-oops, which addresses this very problem.
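For example, a minimal sketch using goog.object, mirroring the obj and field-name vars from the question (the namespace name is made up):

```clojure
(ns example.core
  (:require [goog.object :as gobj]))

(def field-name "blah")

(def obj #js {:list #js [1 2 3 4 5]
              :blah "vtha"
              :o    #js {:answer 42}})

(gobj/get obj field-name)              ;=> "vtha"

;; goog.object/getValueByKeys walks nested keys:
(gobj/getValueByKeys obj "o" "answer") ;=> 42
```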
Is it possible to marshal an object into JSON using any built-in Scala API? For some reason, I can't use any library like Jackson, Play, etc.
I know Scala provides a JSON parser (scala.util.parsing.json.JSON), but I am interested in a marshaller.
A case class and Play's JSON reads and writes macros can do it for you. See Play's Doc.
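For instance, a minimal sketch of the macro-generated format, assuming Play JSON is on the classpath; the Person case class here is made up for illustration:

```scala
import play.api.libs.json._

case class Person(name: String, age: Int)

object Person {
  // Json.format derives both Reads and Writes from the case class fields
  implicit val personFormat: Format[Person] = Json.format[Person]
}

object Demo extends App {
  val json = Json.toJson(Person("Ada", 36))
  println(Json.stringify(json)) // e.g. {"name":"Ada","age":36}
}
```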
Not sure what you are looking for. But if you need JSON to respond to some REST API and you are using
import org.springframework.web.bind.annotation.{RequestMapping, RequestMethod, RestController}
then you can just return a Map like this
Map(
  "startDate" -> startDate.toDate,
  "endDate" -> endDate.toDate,
  "locations" -> myLocations.map { locationId =>
    Map(
      "location" -> locationId.name,
      "locationZone" -> locationId.timeZone
    )
  }
)
Otherwise, Play makes it easy along with case classes.
I have another question regarding decoding JSON in Common Lisp. I settled on ST-JSON as my tool. I am able to get a JSO object containing the JSON data, and access all fields with st-json:getjso. I wanted to write a macro similar in principle to destructuring-bind that would provide local bindings to variables named after JSON fields (since then I started doubting if this is a good idea, but that's a different question). I came up with the following:
(defmacro destructure-jso (jso &body body)
  (let (keys values)
    (st-json:mapjso #'(lambda (key value)
                        (push key keys)
                        (push value values))
                    jso)
    `(destructuring-bind ,keys ,values
       ,@body)))
But when I try to use it on a JSO object, I get the error The value PARAMS is not of the expected type STRUCTURE. where PARAMS is the object. Can someone explain this to me?
Thanks.
Apparently, you are using destructure-jso like this:
(let ((params (st-json:read-json-from-string "{\"foo\":42,\"bar\":\"baz\"}")))
(destructure-jso params
(list foo bar)))
However, destructure-jso, being a macro, is handled at macro-expansion time, long before the JSON even gets parsed. params is passed to your macro as a symbol, without being evaluated; and even if its evaluation were attempted, it would be unbound.
So, if you want to write a destructure-jso, you will need the list of keys at macro expansion time. You could pass the list in a normal way:
> (defmacro destructure-jso-2 (vars json &body body)
    `(let ,(mapcar #'(lambda (var)
                       (list var `(getjso ,(string-downcase (symbol-name var)) ,json)))
                   vars)
       ,@body))
DESTRUCTURE-JSO-2
> (let ((params (st-json:read-json-from-string "{\"foo\":42,\"bar\":\"baz\"}")))
    (destructure-jso-2 (foo bar) params
      (list foo bar)))
(42 "baz")
Or, if you like, use a "template" JSON for creating the mappings:
> (defmacro destructure-jso-3 (template json &body body)
    (let (bindings)
      (st-json:mapjso #'(lambda (key val)
                          (declare (ignore val))
                          (push (list (intern (string-upcase key)) `(getjso ,key ,json))
                                bindings))
                      (st-json:read-json-from-string template))
      `(let ,bindings
         ,@body)))
DESTRUCTURE-JSO-3
> (let ((params (st-json:read-json-from-string "{\"foo\":42,\"bar\":\"baz\"}")))
    (destructure-jso-3 "{\"foo\":null,\"bar\":null}" params
      (list foo bar)))
(42 "baz")
Here, the variable bindings come from the first (template) JSON, values from the second one. The template JSON is parsed at macroexpansion time, the params JSON every time your code is executed.
Whether either of these is a useful approach for you or not, I do not know.
I have a simple key-value Map[K,V], myDictionary, that is populated by my program, and at the end I want to write it out as a JSON-format string to a text file, as I will need to parse it later.
I was using this code earlier,
Some(new PrintWriter(outputDir+"/myDictionary.json")).foreach{p => p.write(compact(render(decompose(myDictionary)))); p.close}
I found it to be slower as the input size increased. Later, I used this:
var out = new PrintWriter(outputDir+"/myDictionary.json");
out.println(scala.util.parsing.json.JSONObject(myDictionary.toMap).toString())
This is proving to be a bit faster.
I have run this on sample input and found that it is faster than my earlier approach. I am assuming my input map size will reach at least a million (K,V) values (a >1GB text file), hence I want to make sure that I follow the fastest and most memory-efficient approach for the map serialization process. What other approaches would you recommend that I could look into to optimize this?
The JSON support in the standard Scala library is probably not the best choice. Unfortunately the situation with JSON libraries for Scala is a bit confusing; there are many alternatives (Lift JSON, Play JSON, Spray JSON, Twitter JSON, Argonaut, ...), basically one library for each day of the week... I suggest you have a look at these, at least to see if any of them is easier to use and more performant.
Here is an example using Play JSON which I have chosen for particular reasons (being able to generate formats with macros):
object JsonTest extends App {
  import play.api.libs.json._

  type MyDict = Map[String, Int]

  implicit object MyDictFormat extends Format[MyDict] {
    def reads(json: JsValue): JsResult[MyDict] = json match {
      case JsObject(fields) =>
        val b = Map.newBuilder[String, Int]
        fields.foreach {
          case (k, JsNumber(v)) => b += k -> v.toInt
          case other => return JsError(s"Not a (string, number) pair: $other")
        }
        JsSuccess(b.result())
      case _ => JsError(s"Not an object: $json")
    }

    def writes(m: MyDict): JsValue = {
      val fields: Seq[(String, JsValue)] = m.map {
        case (k, v) => k -> JsNumber(v)
      } (collection.breakOut)
      JsObject(fields)
    }
  }

  val m = Map("hallo" -> 12, "gallo" -> 34)
  val serial = Json.toJson(m)
  val text = Json.stringify(serial)
  println(text)

  val back = Json.fromJson[MyDict](serial)
  assert(back == JsSuccess(m), s"Failed: $back")
}
While you can construct and deconstruct JsValues directly, the main idea is to use a Format[A], where A is the type of your data structure. This puts more emphasis on type safety than the standard Scala-library JSON. It looks more verbose, but in the end I think it's the better approach.
There are utility methods Json.toJson and Json.fromJson which look for an implicit format of the type you want.
On the other hand, it does construct everything in memory and it does duplicate your data structure (because for each entry in your map you will have another tuple (String, JsValue)), so this isn't necessarily the most memory-efficient solution, given that you are operating at GB magnitude...
Jerkson is a Scala wrapper for the Java JSON library Jackson. The latter apparently has the feature to stream data. I found this project which says it adds streaming support. Play JSON in turn is based on Jerkson, so perhaps you can even figure out how to stream your object with that. See also this question.
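As a rough sketch of what Jackson's streaming API (jackson-core) looks like for this use case, writing one entry at a time instead of building the whole document in memory; myDictionary stands in for the map in the question, and the file name is just an example:

```scala
import com.fasterxml.jackson.core.{JsonEncoding, JsonFactory}
import java.io.File

object StreamOut extends App {
  val myDictionary = Map("hallo" -> 12, "gallo" -> 34)

  // The generator writes tokens directly to the file as they are produced,
  // so memory use stays flat regardless of map size.
  val gen = new JsonFactory().createGenerator(
    new File("myDictionary.json"), JsonEncoding.UTF8)
  gen.writeStartObject()
  for ((k, v) <- myDictionary) gen.writeNumberField(k, v)
  gen.writeEndObject()
  gen.close()
}
```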
I'm having some issues with clojure.data.xml: when parsing bad XML, the exception thrown is not caught. I've found some hints that this is perhaps due to run-time exception wrappers, but my attempts to unwrap them have been unsuccessful. Can anyone point out to me why this may be happening?
(defn parse-xml-from-string
  "takes in valid xml as a string and turns it into
  #clojure.data.xml data, if bad xml returns false"
  [xml]
  (try
    (do (parse (java.io.StringReader. xml)))
    (catch javax.xml.stream.XMLStreamException e false)
    (catch Exception ex
      (cond (isa? (class (.getCause ex)) javax.xml.stream.XMLStreamException) false))))
Method call:
(viva-api.helpers.validation/parse-xml-from-string "<?xml version=\"1.0\"encoding=\"UTF-8\"?><foo><bar><baz>The baz value</baz></bar></foos>")
Output:
#clojure.data.xml.Element{:tag :foo, :attrs {}, :content (user=> XMLStreamException ParseError at [row,col]:[1,84]
Message: The end-tag for element type "foo" must end with a '>' delimiter. com.sun.org.apache.xerces.internal.impl.XMLStreamReaderImpl.next (XMLStreamReaderImpl.java:598)
I think the problem you are seeing is related to the laziness of the value returned by parse. According to its docstring "Parses the source, which can be an InputStream or Reader, and returns a lazy tree of Element records. [...]".
(ns xml
  (:use clojure.data.xml))

(defn parse-xml-from-string
  "takes in valid xml as a string and turns it into
  #clojure.data.xml data, if bad xml returns false"
  [xml]
  (try
    (parse (java.io.StringReader. xml))
    (catch javax.xml.stream.XMLStreamException ex
      false)))
(parse-xml-from-string "<bla/>") ;= #clojure.data.xml.Element{:tag :bla, :attrs {}, :content ()}
(parse-xml-from-string "<bla") ;= false
(parse-xml-from-string "<bla>") ; throws XMLStreamException
(def x (parse-xml-from-string "<bla>")) ; doesn't throw an exception unless it's evaluated
x ; throws XMLStreamException
EDIT
The value returned from parse is a lazy tree built top-down from an Element record and based on a lazy sequence of Event objects, as mentioned in the docstring for the event-tree function. The laziness lies in the :content field of the record, which is realized when the field is accessed. One way I've found to force the realization of the whole tree is using the str function; this feels hacky and looks ugly, but anyone who has a better idea is welcome to provide a better solution.
(defn parse-xml-from-string
  "takes in valid xml as a string and turns it into
  #clojure.data.xml data, if bad xml returns false"
  [xml]
  (try
    (let [x (parse-str xml)]
      (str x)
      x)
    (catch javax.xml.stream.XMLStreamException ex
      false)))
This seems like going to great lengths to avoid laziness, which is, as I understand it, one of the main reasons to use clojure.data.xml. Since you seem to want your whole XML string parsed at once, maybe the clojure.xml/parse function is better suited to your needs.
(require '[clojure.xml :as xml]) ; clojure.xml, not clojure.data.xml

(defn my-parse-str
  [s]
  (try
    (xml/parse (java.io.ByteArrayInputStream. (.getBytes s)))
    (catch Exception e false)))