How do I apply a given function to a JavaScript JSON structure?
I would like to be able to do the following:
(update-in js-data [.-data] my-fn) ;; fails since .-data is not valid
update-in and similar functions only work on ClojureScript data structures.
In your specific example you could convert js-data to a ClojureScript data structure like this:
(update-in (js->clj js-data) ["data"] my-fn)
Note that this returns a new ClojureScript map; if you need a JavaScript object back, convert the result with clj->js.
If you cannot convert the JavaScript object to a plain map, you can always modify the original object in place using set!, applying the function to the current field value:
(set! (.-data js-data) (my-fn (.-data js-data)))
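For comparison, here is the same in-place field update written in plain JavaScript (jsData and myFn are illustrative stand-ins for the objects in the question):

```javascript
// Plain-JavaScript equivalent of applying a function to one field in place.
const jsData = { data: 2 };        // stand-in for the JS object in the question
const myFn = (x) => x * 10;        // hypothetical function
jsData.data = myFn(jsData.data);   // read the field, apply the function, write it back
console.log(jsData.data); // 20
```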
The 2htdp/batch-io library contains the useful read-csv-file procedure for reading a CSV file into a list. It takes a filename as its argument. Unfortunately, it does not take a string containing CSV as its argument. Suppose I have a CSV in a string variable and I want to use read-csv-file to parse it. Is there a way to avoid saving the CSV to a file just to be able to parse the CSV?
The documentation says:
reads the standard input device (until closed) or the content of file f and produces it as a list of lists of comma-separated values.
The standard input feature could probably be exploited to achieve this, but I don't know how to proceed with this idea.
The 2htdp libraries are meant for beginners and are thus less flexible than other CSV libraries. You have two options:
batch-io provides simulate-file, which is close to what you want, though not as clean as wrapping your string in a file-like object:
> (simulate-file read-csv-file "test,test,test")
(list (list "test" "test" "test"))
Use csv-reading (the package must be installed first, e.g. with raco pkg install csv-reading; in DrRacket you can also just (require csv-reading) and follow the install prompts):
#lang racket
(require csv-reading)
> (csv->list "test,test,test")
(list (list "test" "test" "test"))
If batch-io were more general, it would accept a string? or an input-port?, but it does not.
I found a way to make read-csv-file read from a string instead:
> (require 2htdp/batch-io)
> (define csv-string "one,one,one\ntwo,two,two")
> (parameterize ([current-input-port (open-input-string csv-string)])
(read-csv-file 'stdin))
'(("one" "one" "one") ("two" "two" "two"))
This works by changing the standard input to be the CSV string.
@Ryan Schaefer's answer about simulate-file is great, but I feel a bit uncomfortable using functionality that is still "under development" and not properly documented. As simulate-file's documentation says:
Note: this form is under development and will be documented in a precise manner after it is finalized and useful for a wide audience.
I would like to use a custom world map with Highmaps. I used mapshaper (great tool!) to reduce the number of vertices, and checked the JSON code with a JSON validator and imported it in QGIS, where it works fine. But I don't succeed in convincing Highmaps to use it:
I use:
series : [
{
data : data,
//mapData: Highcharts.maps['custom/world'],
mapData: 'etc/BNDA25_CTY.json',
...
}
I haven't found any example code of custom JSON maps so far. I guess it's pretty straightforward, but how does it work? Does the JSON have to be structured in a specific way?
Here is the JSON code.
You need to assign your JSON object to Highcharts.maps['name'], where name is the key you will then use to reference the map via mapData.
Highcharts has the option joinBy set to hc-key by default. I can't see this property in your JSON, so we need to change this value, let's say to:
joinBy: 'ISO3CD',
API: https://api.highcharts.com/highmaps/plotOptions.series.joinBy
We also need to set keys for the data:
keys: ['ISO3CD', 'value'],
API: https://api.highcharts.com/highmaps/series.map.keys
Final demo: https://jsfiddle.net/BlackLabel/q34tue8a/
It would be cleaner to wrap your JSON in a script that assigns it to Highcharts.maps and include it as a separate file, but I leave that to you ;)
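A minimal sketch of how the pieces fit together (the map name and data rows here are illustrative, and `geojson` stands for the parsed contents of your custom JSON file):

```
// Register the custom map under a name of your choosing
Highcharts.maps["custom/BNDA25_CTY"] = geojson;

Highcharts.mapChart("container", {
  series: [{
    mapData: Highcharts.maps["custom/BNDA25_CTY"],
    joinBy: "ISO3CD",           // match series data to the ISO3CD property in the JSON
    keys: ["ISO3CD", "value"],  // interpret each data row as [code, value]
    data: [["FRA", 10], ["DEU", 20]]  // hypothetical data rows
  }]
});
```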
I was wondering if there is a way to convert a Lua table into a JavaScript object without using any libraries, i.e. without require("json"). I haven't seen one yet, but if someone knows how, please answer.
If you want to know how to serialize Lua tables to JSON strings, take a look at the source code of any of the many JSON libraries available for Lua.
http://lua-users.org/wiki/JsonModules
For example:
https://github.com/rxi/json.lua/blob/master/json.lua
or
https://github.com/LuaDist/dkjson/blob/master/dkjson.lua
If you do not want to use any library and want to do it in pure Lua, the most convenient way for me is to use the table.concat function:
local result = {}
for key, value in pairs(tableWithData) do
    -- prepare JSON key-value pairs and save them in a separate table
    -- (note: string values would additionally need quoting and escaping)
    table.insert(result, string.format("\"%s\":%s", key, value))
end
-- get a simple JSON string
result = "{" .. table.concat(result, ",") .. "}"
If your table has nested tables you can do this recursively.
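On the JavaScript side, a string built this way can then be turned into an object with the built-in JSON.parse (the input below is a hypothetical output of the Lua serializer above):

```javascript
// Parse a JSON string (e.g. produced by the Lua snippet above) into a JS object
const jsonFromLua = '{"name":"widget","count":3}';  // hypothetical serializer output
const obj = JSON.parse(jsonFromLua);
console.log(obj.name, obj.count); // widget 3
```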
There are a lot of pure-Lua JSON libraries.
Even I have one.
How to include a pure-Lua module in your script without using require():
Download the Lua JSON module (for example, go to my json.lua, right-click on Raw and select Save Link as in context menu)
Delete the last line return json from this file
Insert the whole file at the beginning of your script
Now you can use local json_as_string = json.encode(your_Lua_table) in your script.
When I put an event into a stream using the AWS CLI, I can pass JSON in and get it back out, after decoding from base64. When I try to put an event using Amazonica, from Clojure, I am having a hard time formatting the event data parameter correctly though.
(kinesis/put-record "ad-stream" {:ad-id "some-id"} "parition-key")
creates an event with a base64 encoded data field of "TlBZCAAAABXwBhsAAAACagVhZC1pZGkHc29tZS1pZA==", which decodes to
NP�jad-idisome-id
If I JSON encode the data first:
(kinesis/put-record "ad-stream" (json/write-str {:ad-id "some-id-2"}) "parition-key")
then I get an event with less junk characters, but it still isn't quite perfect, not good enough to read in other apps without breaking something:
NPi{"ad-id":"some-id-2"}
What is the significance of that leading junk when converting Clojure maps to JSON? How do I pass a simple object to Kinesis?
The tests show a plain map being passed as put-record's data parameter; I don't understand yet why that didn't just work for me:
(let [data {:name "any data"
            :col #{"anything" "at" "all"}
            :date now}
      sn (:sequence-number (put-record my-stream data (str (UUID/randomUUID))))]
  (put-record my-stream data (str (UUID/randomUUID)) sn))
(Thread/sleep 3000)
(def shard (-> (describe-stream my-stream)
               :stream-description
               :shards
               last
               :shard-id))
Update:
I'm pretty sure that this is a bug in the library (or the serializer that it uses), so I'm continuing the investigation in a bug report at https://github.com/mcohen01/amazonica/issues/211.
Passing a ByteBuffer of a JSON string as the record data works for me.
(kinesis/put-record "ad-stream"
                    (-> {:ad-id "ad-stream"}
                        json/write-str .getBytes ByteBuffer/wrap)
                    "parition-key")
Record data: "eyJhZC1pZCI6ImFkLXN0cmVhbSJ9", which decodes to:
{"ad-id":"ad-stream"}
This works around any encoding issue in the library, because Amazonica skips its own serialization step when it is passed a ByteBuffer. (The "NPY" prefix in the undecoded payload from the question suggests that the default serializer is Nippy.)
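For reference, the Base64 record data can also be checked outside Clojure, e.g. in Node.js (the payload string is the one from above):

```javascript
// Decode the Base64 record data and parse it as JSON (Node.js)
const payload = "eyJhZC1pZCI6ImFkLXN0cmVhbSJ9";
const decoded = Buffer.from(payload, "base64").toString("utf8");
console.log(decoded); // {"ad-id":"ad-stream"}
const record = JSON.parse(decoded);
```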
I am trying to serialize some Clojure data structure into a persistent database, and I currently use Chesire for that purpose.
Let's say I have a map that contains namespaced keywords like the following:
{:cemerick.friend/identity {:current friend, :authentications {friend {:identity friend, :roles #{:clojure-cms.handler/user}}}}}
It gets serialized into JSON like this:
{"cemerick.friend/identity":{"current":"friend","authentications":{"friend":{"identity":"friend","roles":["clojure-cms.handler/user"]}}}}
When reading it back and parsing (with keywordization: (parse-string data true)), I get back the following:
{:cemerick.friend/identity {:current friend, :authentications {:friend {:identity friend, :roles [clojure-cms.handler/user]}}}}
How can I parse this JSON and get back the same data as the original?
Note : this question gives some context to what I am trying to achieve.
Looking at the tests in Cheshire, it's quite obvious that the optional keyword parameter to parse-string affects only the name attributes in a JSON object; value attributes, like the namespaced keyword in your example, are not affected. Actually, your problem is two-fold: the original set is also not converted back correctly.
For the set problem, what you could do is write a custom decoder as described in the Cheshire documentation.
For the original problem, there is probably no direct way other than to post-process the returned map: look up the value of :roles and turn it back into a set of keywords, like so (untested):
(defn postprocess-json [authmap]
  (update-in authmap
             [:cemerick.friend/identity :authentications :friend :roles]
             #(set (map keyword %))))