Erlang with JSON

I run the following command in Erlang:
os:cmd("curl -k -X GET http://10.210.12.154:10065/iot/get/task").
It gives a JSON output like this:
{"data":[
{"id":1,"task":"Turn on the bulb when the temperature in greater than 28","working_condition":1,"depending_value":"Temperature","action":"123"},
{"id":2,"task":"Trun on the second bulb when the temperature is greater than 30","working_condition":0,"depending_value":"Temperature","action":"124"}
]}
I want to categorize this data by id, task, depending_value, and action, as if putting them into a table. I want to easily find the depending_value, working_condition, and action for id=1. How can I do this?

It gives a JSON output like this.
{"data":[{"id":1,"t ...
Highly doubtful. The docs say that os:cmd() returns a string, which does not start with a {. Note also that a string is not even an Erlang data type; double quotes are just a shortcut for creating a list of integers, and a list of integers is not terribly useful in your case.
Here are two options:
Call list_to_binary() on the list of integers returned by os:cmd() to convert it to a binary.
Instead of os:cmd(), use an Erlang HTTP client, like hackney, which will return the JSON as a binary.
The reason you want a binary is that you can then use an Erlang JSON module, like jsx, to convert the binary into an Erlang map (which might be what you are after?).
Here's what that will look like:
3> Json = <<"{\"data\": [{\"x\": 1, \"y\": 2}, {\"a\": 3, \"b\": 4}] }">>.
<<"{\"data\": [{\"x\": 1, \"y\": 2}, {\"a\": 3, \"b\": 4}] }">>
4> Map = jsx:decode(Json, [return_maps]).
#{<<"data">> =>
[#{<<"x">> => 1,<<"y">> => 2},#{<<"a">> => 3,<<"b">> => 4}]}
5> Data = maps:get(<<"data">>, Map).
[#{<<"x">> => 1,<<"y">> => 2},#{<<"a">> => 3,<<"b">> => 4}]
6> InnerMap1 = hd(Data).
#{<<"x">> => 1,<<"y">> => 2}
7> maps:get(<<"x">>, InnerMap1).
1
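For completeness, here is what option 1 might look like wired together in a single function. This is just a sketch: it assumes jsx is on the code path, that the endpoint returns the JSON shown above, and that curl writes only the JSON body to stdout (get_tasks_via_curl/0 is a made-up name):
get_tasks_via_curl() ->
    %A "string" in Erlang is really a list of integers, so convert it to a binary first.
    Output = os:cmd("curl -k -X GET http://10.210.12.154:10065/iot/get/task"),
    Json = list_to_binary(Output),
    Map = jsx:decode(Json, [return_maps]),
    maps:get(<<"data">>, Map).   %The list of task maps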
...putting them into a table. I want to easily find the depending_value, working_condition, and action for id=1.
Erlang has various table implementations: ets, dets, and mnesia. Here is an ets example:
-module(my).
-compile(export_all).

get_tasks() ->
    Method = get,
    %See description of this awesome website below.
    URL = <<"https://my-json-server.typicode.com/7stud/json_server/db">>,
    Headers = [],
    Payload = <<>>,
    Options = [],
    {ok, 200, _RespHeaders, ClientRef} =
        hackney:request(Method, URL, Headers, Payload, Options),
    {ok, Body} = hackney:body(ClientRef),
    %Or, for testing, you can paste the json in a file (without the outer
    %quotes), and read_file() will return a binary:
    %{ok, Body} = file:read_file('json/json.txt'),
    Map = jsx:decode(Body, [return_maps]),
    _Tasks = maps:get(<<"data">>, Map).

create_table(TableName, Tuples) ->
    ets:new(TableName, [set, named_table]),
    insert(TableName, Tuples).

insert(_Table, []) ->
    ok;
insert(Table, [Tuple|Tuples]) ->
    #{<<"id">> := Id} = Tuple,
    ets:insert(Table, {Id, Tuple}),
    insert(Table, Tuples).

retrieve_task(TableName, Id) ->
    [{_Id, Task}] = ets:lookup(TableName, Id),
    Task.
By default, a set type ets table uses the first position of the inserted tuple as the unique key (or you can explicitly specify another position in the tuple as the key with the keypos option).
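For example, here is a throwaway table keyed on the second element of each inserted tuple rather than the first (a small sketch; demo_table and create_keyed_table/0 are made-up names):
create_keyed_table() ->
    %{keypos, 2} tells ets that the key is the 2nd element of each tuple.
    ets:new(demo_table, [set, named_table, {keypos, 2}]),
    ets:insert(demo_table, {<<"some payload">>, 1}),
    ets:lookup(demo_table, 1).   %Returns [{<<"some payload">>, 1}]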
** If you have a GitHub account: I discovered a really cool website that allows you to place a json file in a new repository on GitHub, and the website will serve up that file as JSON. Check it out at https://my-json-server.typicode.com:
How to:
Create a repository on GitHub (<your-username>/<your-repo>)
Create a db.json file [in the repository].
Visit https://my-json-server.typicode.com/<your-username>/<your-repo> to access your server
You can see the URL I'm using in the code; it can be obtained by clicking the link on the provided server page and copying the URL from your web browser's address bar.
In the shell:
.../myapp$ rebar3 shell
===> Verifying dependencies...
===> Compiling myapp
src/my.erl:2: Warning: export_all flag enabled - all functions will be exported
Erlang/OTP 20 [erts-9.3] [source] [64-bit] [smp:4:4] [ds:4:4:10] [async-threads:1] [hipe] [kernel-poll:false]
Eshell V9.3 (abort with ^G)
1> ===> The rebar3 shell is a development tool; to deploy applications in production, consider using releases (http://www.rebar3.org/docs/releases)
===> Booted unicode_util_compat
===> Booted idna
===> Booted mimerl
===> Booted certifi
===> Booted ssl_verify_fun
===> Booted metrics
===> Booted hackney
1> Tasks = my:get_tasks().
[#{<<"action">> => <<"123">>,
<<"depending_value">> => <<"Temperature">>,<<"id">> => 1,
<<"task">> =>
<<"Turn on the bulb when the temperature in greater than 28">>,
<<"working_condition">> => 1},
#{<<"action">> => <<"124">>,
<<"depending_value">> => <<"Temperature">>,<<"id">> => 2,
<<"task">> =>
<<"Trun on the second bulb when the temperature is greater than 30">>,
<<"working_condition">> => 0}]
2> my:create_table(tasks, Tasks).
ok
3> my:retrieve_task(tasks, 1).
#{<<"action">> => <<"123">>,
<<"depending_value">> => <<"Temperature">>,<<"id">> => 1,
<<"task">> =>
<<"Turn on the bulb when the temperature in greater than 28">>,
<<"working_condition">> => 1}
4> my:retrieve_task(tasks, 2).
#{<<"action">> => <<"124">>,
<<"depending_value">> => <<"Temperature">>,<<"id">> => 2,
<<"task">> =>
<<"Trun on the second bulb when the temperature is greater than 30">>,
<<"working_condition">> => 0}
5> my:retrieve_task(tasks, 3).
** exception error: no match of right hand side value []
in function my:retrieve_task/2 (/Users/7stud/erlang_programs/old/myapp/src/my.erl, line 58)
6>
Note that the id is over on the right, at the end of one of the lines. Also, if you get an error in the shell, the shell automatically restarts a new shell process and the ets table will be destroyed, so you have to create it anew.
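If you would rather get an error value than a crash for an unknown id, the lookup can be made tolerant (a sketch; retrieve_task_safe/2 is a made-up name):
retrieve_task_safe(TableName, Id) ->
    case ets:lookup(TableName, Id) of
        [{_Id, Task}] -> Task;               %Found: return the task map.
        []            -> {error, not_found}  %No row with this key.
    end.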
rebar.config:
{erl_opts, [debug_info]}.

{deps, [
    {jsx, "2.8.0"},
    {hackney, ".*", {git, "git://github.com/benoitc/hackney.git", {branch, "master"}}}
]}.

% This causes the shell to automatically start the listed apps. See
% https://stackoverflow.com/questions/40211752/how-to-get-an-erlang-app-to-run-at-starting-rebar3/45361175#comment95565011_45361175
{shell, [{apps, [hackney]}]}.
src/myapp.app.src:
{application, 'myapp',
 [{description, "An OTP application"},
  {vsn, "0.1.0"},
  {registered, []},
  {mod, {'myapp_app', []}},
  {applications,
   [kernel,
    stdlib
   ]},
  {env, []},
  {modules, []},
  {contributors, []},
  {licenses, []},
  {links, []}
 ]}.
But, according to the rebar3 dependencies docs:
You should add each dependency to your app or app.src files:
So, I guess src/myapp.app.src should look like this:
{application, 'myapp',
 [{description, "An OTP application"},
  {vsn, "0.1.0"},
  {registered, []},
  {mod, {'myapp_app', []}},
  {applications,
   [kernel,
    stdlib,
    jsx,
    hackney
   ]},
  {env, []},
  {modules, []},
  {contributors, []},
  {licenses, []},
  {links, []}
 ]}.

Related

Elixir - JasonHelpers - How can I send a keyword list to json_map?

I have a data structure that I want to convert to json and preserve the key order.
For example:
%{ x: 1, a: 5} should be converted to "{\"x\": 1, \"a\": 5}"
Poison does it without any problem. But when I upgrade to Jason, it changes to "{\"a\": 5, \"x\": 1}".
So I use JasonHelpers json_map to preserve the order like this:
Jason.Helpers.json_map([x: 1, a: 5])
It creates a fragment with correct order.
However, when I use a variable to do this:
list = [x: 1, a: 5]
Jason.Helpers.json_map(list)
I have an error:
** (Protocol.UndefinedError) protocol Enumerable not implemented for {:list, [line: 15], nil} of type Tuple.
....
QUESTION: How can I pass a pre-calculated list into Jason.Helpers.json_map ?
The calculation is complicated, so I don't want to repeat the code just to use json_map; I want to use the function that returns a list.
json_map/1 is a macro, from its docs:
Encodes a JSON map from a compile-time keyword.
It is designed for compiling JSON at compile-time, which is why it doesn't work with your runtime variable.
Support for encoding keyword lists was added to the Jason library a year ago, but it looks like it hasn't been pushed to Hex yet. I managed to get it to work by pulling the latest code from GitHub:
defp deps do
  [{:jason, git: "https://github.com/michalmuskala/jason.git"}]
end
Then by creating a struct that implements Jason.Encoder (adapted from this solution by the Jason author):
defmodule OrderedObject do
  defstruct [:value]

  def new(value), do: %__MODULE__{value: value}

  defimpl Jason.Encoder do
    def encode(%{value: value}, opts) do
      Jason.Encode.keyword(value, opts)
    end
  end
end
Now we can encode objects with ordered keys:
iex(1)> Jason.encode!(OrderedObject.new([x: 1, a: 5]))
"{\"x\":1,\"a\":5}"
I don't know if this is part of the public API or just an implementation detail, but it appears you have some control over the order when implementing the Jason.Encoder protocol for a struct.
Let's say you've defined an Ordered struct:
defmodule Ordered do
  @derive {Jason.Encoder, only: [:a, :x]}
  defstruct [:a, :x]
end
If you encode the struct, the "a" key will be before the "x" key:
iex> Jason.encode!(%Ordered{a: 5, x: 1})
"{\"a\":5,\"x\":1}"
Let's reorder the keys we pass in to the :only option:
defmodule Ordered do
  @derive {Jason.Encoder, only: [:x, :a]}
  defstruct [:a, :x]
end
If we now encode the struct, the "x" key will be before the "a" key:
iex> Jason.encode!(%Ordered{a: 5, x: 1})
"{\"x\":1,\"a\":5}"

How to define config file variables?

I have a configuration file with:
{path, "/mnt/test/"}.
{name, "Joe"}.
The path and the name can be changed by a user. As far as I know, there is a way to save those variables in a module by using file:consult/1 in
-define(VARIABLE, <parsing of the config file>).
Is there a better way to read a config file when the module starts working, without writing a parsing function in -define? (As far as I know, Erlang developers consider it bad practice to put complicated functions in -define.)
If you need to store config only when you start the application, you may use an application config file, which is defined in rebar.config:
{profiles, [
    {local, [
        {relx, [
            {dev_mode, false},
            {include_erts, true},
            {include_src, false},
            {vm_args, "config/local/vm.args"},
            {sys_config, "config/local/yourapplication.config"}
        ]}
    ]}
]}.
More info about this is here: rebar3 configuration.
The next step is to create yourapplication.config and store it in your application folder at /app/config/local/yourapplication.config.
This configuration should have a structure like this example:
[
  {yourapplicationname, [
    {path, "/mnt/test/"},
    {name, "Joe"}
  ]}
].
So when your application is started, you can get the whole config data with:
{ok, "/mnt/test/"} = application:get_env(yourapplicationname, path)
{ok, "Joe"} = application:get_env(yourapplicationname, name)
And now you may -define these variables like:
-define(VARIABLE,
    case application:get_env(yourapplicationname, path) of
        {ok, Data} -> Data;
        _ -> undefined
    end
).
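For the file:consult/1 approach mentioned in the question, reading the terms once at startup (instead of inside -define) could look like this. This is a minimal sketch; load_path/1 and the exact file path are made up:
load_path(ConfigFile) ->
    %file:consult/1 returns {ok, Terms}, where Terms is the list of
    %dot-terminated Erlang terms in the file, e.g.
    %[{path, "/mnt/test/"}, {name, "Joe"}].
    {ok, Terms} = file:consult(ConfigFile),
    proplists:get_value(path, Terms).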

ClojureScript - PersistentArrayMap → Object → PersistentArrayMap - transferring data to a web worker

I'm trying to pass a clojurescript map to a webworker.
Before I pass it, it is of type PersistentArrayMap.
cljs.core.PersistentArrayMap {meta: null, cnt: 3, arr: Array(6), __hash: null, cljs$lang$protocol_mask$partition0$: 16647951…}
However, when it gets to the worker, it's just a plain old Object
Object {meta: null, cnt: 3, arr: Array(6), __hash: null, cljs$lang$protocol_mask$partition0$: 16647951…}
with the data seemingly intact. At this point I'd like to turn it back into a PersistentArrayMap so that I can work with it in cljs again.
Using clj->js and js->clj doesn't really work because it doesn't distinguish between keywords and strings, so some data is lost.
What's the best way to handle this situation? It's possible that I'm going about this in the entirely wrong way.
The built-in solution is to serialize the data to EDN and read it back. clj->js is inherently lossy and should be avoided.
You start by turning the object into a string and send that over to the worker.
(pr-str {:a 1})
;; => "{:a 1}"
In the worker you read it back via cljs.reader/read-string
(cljs.reader/read-string "{:a 1}")
;; => {:a 1}
This will usually be good enough, but the transit-cljs library will be slightly faster. Depending on the amount of data you plan on sending, it may be worth the extra dependency.
Did you use keywordize-keys in the code?
(def json "{\"foo\": 1, \"bar\": 2, \"baz\": [1,2,3]}")
(def a (.parse js/JSON json))
;;=> #js {:foo 1, :bar 2, :baz #js [1 2 3]}
(js->clj a)
;;=> {"foo" 1, "bar" 2, "baz" [1 2 3]}
(js->clj a :keywordize-keys true)
;;=> {:foo 1, :bar 2, :baz [1 2 3]}
Full documentation is here.

Postgrex: how to define the JSON library

I'm just trying to use Postgrex without any kind of Ecto setup, so just the example from the documentation README.
Here is what my module looks like:
defmodule Receive do
  def start(_type, _args) do
    {:ok, pid} = Postgrex.start_link(
      hostname: "localhost",
      username: "john",
      # password: "",
      database: "property_actions",
      extensions: [{Postgrex.Extensions.JSON}]
    )

    Postgrex.query!(
      pid,
      "INSERT INTO actions (search_terms) VALUES ($1)",
      [
        %{foo: 'bar'}
      ]
    )
  end
end
When I run the code, I get:
** (RuntimeError) type `json` can not be handled by the types module Postgrex.DefaultTypes, it must define a `:json` library in its options to support JSON types
Is there something I'm not setting up correctly? From what I've gathered in the documentation, I shouldn't even need to have that extensions line because json is handled by default.
On Postgrex <= 0.13, you need to define your own types:
Postgrex.Types.define(MyApp.PostgrexTypes, [], json: Poison)
and then when starting Postgrex:
Postgrex.start_link(types: MyApp.PostgrexTypes)
On Postgrex >= 0.14 (currently master), it was made easier:
config :postgrex, :json_library, Poison

Why is the stream blocking in this iex session?

I'm trying to parse some CSVs using Elixir:
iex -S mix
Erlang/OTP 18 [erts-7.2] [source-e6dd627] [64-bit] [smp:4:4] [async-threads:10] [hipe] [kernel-poll:false]
Interactive Elixir (1.3.0-dev) - press Ctrl+C to exit (type h() ENTER for help)
iex(1)> a = File.stream!("test/lib/fit_notes/fit_notes_export.csv") |> CSV.decode
#Function<49.97003610/2 in Stream.transform/3>
iex(2)> Stream.take(a, 1)
#Stream<[enum: #Function<49.97003610/2 in Stream.transform/3>,
funs: [#Function<38.97003610/1 in Stream.take/2>]]>
iex(3)> Enum.take(a, 1)
[["Date", "Exercise", "Category", "Weight (kgs)", "Reps", "Distance",
"Distance Unit", "Time"]]
iex(4)> Enum.take(a, 2)
^ this just blocks
The first Enum.take that I issue works; the second blocks forever. Can you tell me what I am doing wrong? I'm using this library for CSV parsing.
If you follow the example from http://hexdocs.pm/csv/CSV.html, your code would end up something like this:
a = File.stream!("test/lib/fit_notes/fit_notes_export.csv") |> CSV.decode |>
Stream.take(1) |>
Stream.take(1) |>
Enum.take(2)
Note I changed your first Enum.take(1) into a Stream.take(1) so that the stream doesn't get prematurely terminated. Also note that two Stream.take(1) calls would be better written as a single Stream.take(2). Finally, note how the stream piping works by adding a |> to the end of each line until you reach an Enum call, which then fires the whole operation.
Added:
For streams with side effects (like logging), see https://github.com/elixir-lang/elixir/issues/1831, where they recommend Stream.each.