Hi, I found a tutorial on how to POST JSON in Lua.
Here is the code:
http = require("socket.http")
crypto = require("crypto")
ltn12 = require("ltn12")
url = require("socket.url")
local json = require("json")
local commands_json =
{
["message"] = "Hello",
}
print (commands_json)
local json = {}
json.api_key = "6_192116334"
json.ver = 1
json.commands_json = json.encode(commands_json)
json.commands_hash = crypto.digest(crypto.md5, json.commands_json .. 'hkjhkjhkjh')
local post = "api=" .. url.escape(Json.Encode(json))
local response = {}
local r, c, h = http.request {
url = "http://127.0.0.1/?page=api",
method = "POST",
headers = {
["content-length"] = #post,
["Content-Type"] = "application/x-www-form-urlencoded"
},
source = ltn12.source.string(post),
sink = ltn12.sink.table(response)
}
local path = system.pathForFile("r.txt", system.DocumentsDirectory)
local file = io.open (path, "w")
file:write (Json.Encode(json) .. "\n")
file:write (post .. "\n")
file:write (response[1] .. "\n")
io.close (file)
json = Json.Decode(table.concat(response,''))
native.showAlert("hey", json.commands[1].tot_nbr_rows)
Now I get this error:
Windows simulator build date: Dec 9 2011 # 14:01:29
Copyright (C) 2009-2011 Ansca, Inc.
Version: 2.0.0
Build: 2011.704
table: 0346D6D0
Runtime error
...nistrator\my documents\corona projects\json\main.lua:17: attempt to call field 'encode' (a nil value)
stack traceback:
[C]: in function 'encode'
...nistrator\my documents\corona projects\json\main.lua:17: in main chunk
Runtime error: ...nistrator\my documents\corona projects\json\main.lua:17: attempt to call field 'encode' (a nil value)
stack traceback:
[C]: in function 'encode'
...nistrator\my documents\corona projects\json\main.lua:17: in main chunk
I don't know why I get this error from encode.
Can anyone help me with this?
Thanks in advance.
This loads the JSON module provided externally, which likely provides an encode function:
local json = require("json")
This throws away your old json variable and replaces it with an empty table:
local json = {}
And this tries to call json.encode which is now undefined since you redefined json as an empty table above:
json.commands_json = json.encode(commands_json)
The solution is to pick a different variable name.
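For example, a minimal sketch of the fix (assuming Corona's bundled json library with a lowercase json.encode, and keeping the library in the json variable while the request payload goes into a separately named table):
local json = require("json")      -- the JSON library keeps this name
local crypto = require("crypto")
local url = require("socket.url")
local commands_json = { ["message"] = "Hello" }
-- put the request payload in a table with a different name
local request = {}
request.api_key = "6_192116334"
request.ver = 1
request.commands_json = json.encode(commands_json)
request.commands_hash = crypto.digest(crypto.md5, request.commands_json .. 'hkjhkjhkjh')
local post = "api=" .. url.escape(json.encode(request))
The rest of the request (http.request, the sink, the file logging) can stay the same, as long as every later reference to the payload table uses the new name.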
I would like to import variables defined in a JSON file (my_info.json) as attributes for Bazel rules.
I tried this (https://docs.bazel.build/versions/5.3.1/skylark/tutorial-sharing-variables.html) and it works, but I do not want to use a .bzl file; I want to import the variables directly as attributes in BUILD.bazel.
I want to use the variables imported from my_info.json as attributes in other BUILD.bazel files.
projects/python_web/BUILD.bazel
load("//projects/tools/parser:config.bzl", "MY_REPO","MY_IMAGE")
container_push(
name = "publish",
format = "Docker",
registry = "registry.hub.docker.com",
repository = MY_REPO,
image = MY_IMAGE,
tag = "1",
)
Asking something similar in the Bazel Slack, I was told that it is not possible to import variables directly into Bazel; you need to parse the JSON and write the variables into a .bzl file.
I also tried the following code, but nothing is written to the config.bzl file.
my_info.json
{
"MYREPO" : "registry.hub.docker.com",
"MYIMAGE" : "michael/monorepo-python-web"
}
WORKSPACE.bazel
load("//projects/tools/parser:jsonparser.bzl", "load_my_json")
load_my_json(
name = "myjson"
)
projects/tools/parser/jsonparser.bzl
def _load_my_json_impl(repository_ctx):
json_data = json.decode(repository_ctx.read(repository_ctx.path(Label(":my_info.json"))))
config_lines = ["%s = %s" % (key, repr(val)) for key, val in json_data.items()]
repository_ctx.file("config.bzl", "\n".join(config_lines))
load_my_json = repository_rule(
implementation = _load_my_json_impl,
attrs = {},
)
projects/tools/parser/BUILD.bazel
load("#aspect_bazel_lib//lib:yq.bzl", "yq")
load(":config.bzl", "MYREPO", "MY_IMAGE")
yq(
name = "convert",
srcs = ["my_info2.json"],
args = ["-P"],
outs = ["bar.yaml"],
)
Executing:
% bazel build projects/tools/parser:convert
ERROR: Traceback (most recent call last):
File "/Users/michael.taquia/Documents/Personal/Projects/bazel/bazel-projects/multi-language-bazel-monorepo/projects/tools/parser/BUILD.bazel", line 2, column 22, in <toplevel>
load(":config.bzl", "MYREPO", "MY_IMAGE")
Error: file ':config.bzl' does not contain symbol 'MYREPO'
While troubleshooting I can see that the execution loads jsonparser.bzl but never enters the _load_my_json_impl function (based on print statements) and never writes anything to config.bzl.
Notes: tested on macOS 12.6 (21G115), Darwin kernel version 21.6.0.
Is there a better way to do this? A code snippet would be very useful.
Hello, I'm trying to use the JSON from my washer with Lua, for visualizing the Samsung washer in Domoticz.
A part of the JSON I get from https://api.smartthings.com/v1/devices/abcd-1234-abcd is:
"main": {
"washerJobState": {
"value": "wash"
},
"mnhw": {
"value": "1.0"
},
"data": {
"value": "{
\"payload\":{
\"x.com.samsung.da.state\":\"Run\",\"x.com.samsung.da.delayEndTime\":\"00:00:00\",\"x.com.samsung.da.remainingTime\":\"01:34:00\",\"if\":[\"oic.if.baseline\",\"oic.if.a\"],\"x.com.samsung.da.progressPercentage\":\"2\",\"x.com.samsung.da.supportedProgress\":[\"None\",\"Wash\",\"Rinse\",\"Spin\",\"Finish\"],\"x.com.samsung.da.progress\":\"Wash\",\"rt\":[\"x.com.samsung.da.operation\"]}}"
},
"washerRinseCycles": {
"value": "3"
},
"switch": {
"value": "on"
},
If I use in my script
local switch = item.json.main.switch.value
I get the value on or off and I can use it to show the status of the washer.
I'm trying to find out how to get the "data" value in my script; there are more items with dots and backslashes:
local remainingTime = rt.data.value.payload['x.com.samsung.da.remainingTime']
or
local remainingTime = rt.data.value['\payload']['\x.com.samsung.da.remainingTime']
I tried some more options with ', //, and "" but always got a nil value.
Can someone explain to me how to get:
\"x.com.samsung.da.remainingTime\":\"01:34:00\"
\"x.com.samsung.da.progressPercentage\":\"2\",
All the " , \, x., ar confusing me
Below is my script to test where i only left the Json log (Dzvents Lua Based) i get an error:
dzVents/generated_scripts/Samsung_v3.lua:53: attempt to index a nil value (global 'json') i don't heave any idea how te use/adjust my code for decode the string.
local json = require"json" -- the JSON library
local outer = json.decode(your_JSON_string)
local rt = outer.main
local inner = json.decode(rt.data.value)
local remainingTime = inner.payload['x.com.samsung.da.remainingTime']
local API = 'API'
local Device = 'Device'
local LOGGING = true
--Define dz Switches
local WM_STATUS = 'WM Status' --Domoitcz virtual switch ON/Off state Washer
return
{
on =
{
timer =
{
'every 1 minutes', -- just an example to trigger the request
},
httpResponses =
{
'trigger', -- must match with the callback passed to the openURL command
},
},
logging =
{
level = domoticz.LOG_DEBUG ,
},
execute = function(dz, item)
local wm_status = dz.devices(WM_STATUS)
if item.isTimer then
dz.openURL({
url = 'https://api.smartthings.com/v1/devices/'.. Device .. '/states',
headers = { ['Authorization'] = 'Bearer '.. API },
method = 'GET',
callback = 'trigger', -- see httpResponses above.
})
end
if (item.isHTTPResponse) then
if item.ok then
if (item.isJSON) then
rt = item.json.main
-- outer = json.decode'{"payload":{"x.com.samsung.da.state":"Run","x.com.samsung.da.delayEndTime":"00:00:00","x.com.samsung.da.remainingTime":"00:40:00","if":["oic.if.baseline","oic.if.a"],"x.com.samsung.da.progressPercentage":"81","x.com.samsung.da.supportedProgress":["None","Weightsensing","Wash","Rinse","Spin","Finish"],"x.com.samsung.da.progress":"Rinse","rt":["x.com.samsung.da.operation"]}}
inner = json.decode(rt.data.value)
-- local remainingTime = inner.payload['x.com.samsung.da.remainingTime']
dz.utils.dumpTable(rt) -- this will show how the table is structured
-- dz.utils.dumpTable(inner)
local washerSpinLevel = rt.washerSpinLevel.value
-- local remainingTime = inner.payload['x.com.samsung.da.remainingTime']
dz.log('Debuggg washerSpinLevel:' .. washerSpinLevel, dz.LOG_DEBUG)
dz.log('Debuggg remainingTime:' .. remainingTime, dz.LOG_DEBUG)
-- dz.log('Resterende tijd:' .. remainingTime, dz.LOG_INFO)
-- dz.log(dz.utils.fromJSON(item.data))
-- end
elseif LOGGING == true then
dz.log('There was a problem handling the request', dz.LOG_ERROR)
dz.log(item, dz.LOG_ERROR)
end
end
end
end
}
This is a weird construction: a serialized JSON string inside a normal JSON document.
This means you have to invoke deserialization twice:
local json = require"json" -- the JSON library
local outer = json.decode(your_JSON_string)
local rt = outer.main
local inner = json.decode(rt.data.value)
local remainingTime = inner.payload['x.com.samsung.da.remainingTime']
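In the dzVents script above, item.json is already the decoded outer table, so only the embedded string needs the second decode. A minimal sketch of how that could look inside the execute handler (assuming dzVents' bundled dz.utils.fromJSON as the decoder, as hinted at by the commented-out line in the script):
if item.isJSON then
    local rt = item.json.main                        -- outer JSON, already decoded by dzVents
    local inner = dz.utils.fromJSON(rt.data.value)   -- second decode of the embedded JSON string
    local remainingTime = inner.payload['x.com.samsung.da.remainingTime']
    local progressPercentage = inner.payload['x.com.samsung.da.progressPercentage']
    dz.log('remainingTime: ' .. remainingTime, dz.LOG_DEBUG)
    dz.log('progressPercentage: ' .. progressPercentage, dz.LOG_DEBUG)
end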
Is it possible to use a variable to index into a decoded JSON table? Like this?
json = require('json')
base={magali={pass='melancia'}}
--~ local oo = json.decode( readAll("../json/base.json") )
local oo = base
user = 'magali'
print("oo[" .. oo.magali.pass .. "]") -- work
print("oo[" .. oo.user.pass .. "]") -- does not work! How do this?
Error:
lua53: example.lua:34: attempt to index a nil value (field 'user')
stack traceback:
example.lua:34: in main chunk
[C]: in ?
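In Lua, oo.user is just syntactic sugar for oo["user"], which looks up the literal key "user" (nil here); to index with the value held in a variable you need bracket notation. A minimal sketch based on the table above:
local base = { magali = { pass = 'melancia' } }
local oo = base
local user = 'magali'
print(oo.magali.pass)   -- melancia (literal key "magali")
print(oo[user].pass)    -- melancia (oo[user] is oo["magali"], looked up through the variable)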
I encountered this bug in Lua cjson when I was using a script in Redis 3.2 to set a particular value in a JSON object.
Currently, the Lua in Redis does not differentiate between an empty JSON array and an empty JSON object, which causes serious problems when serialising JSON objects that have arrays within them.
eval "local json_str = '{\"items\":[],\"properties\":{}}' return cjson.encode(cjson.decode(json_str))" 0
Result:
"{\"items\":{},\"properties\":{}}"
I found this solution, https://github.com/mpx/lua-cjson/issues/11, but I wasn't able to implement it in a Redis script.
This is an unsuccessful attempt:
eval
"function cjson.mark_as_array(t)
local mt = getmetatable(t) or {}
mt.__is_cjson_array = true
return setmetatable(t, mt)
end
function cjson.is_marked_as_array(t)
local mt = getmetatable(t)
return mt and mt.__is_cjson_array end
local json_str = '{\"items\":[],\"properties\":{}}'
return cjson.encode(cjson.decode(json_str))"
0
Any help or pointers appreciated.
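The root cause is that cjson.decode turns both [] and {} into an empty Lua table, so by the time cjson.encode runs there is nothing left to distinguish them, and an empty table is encoded as an object by default. A small sketch in standalone Lua with the lua-cjson module (outside Redis, purely to illustrate the behaviour):
local cjson = require("cjson")
local arr = cjson.decode("[]")   -- an empty Lua table
local obj = cjson.decode("{}")   -- also an empty Lua table
print(cjson.encode(arr))  -- {}
print(cjson.encode(obj))  -- {}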
There are two approaches.
Modify the lua-cjson source code and recompile Redis; see the repository linked in the code comment below for details.
Or fix it in code:
local now = redis.call("time")
-- local timestamp = tonumber(now[1]) * 1000 + math.floor(now[2]/1000)
math.randomseed(now[2])
local emptyFlag = "empty_" .. now[1] .. "_" .. now[2] .. "_" .. math.random(10000)
local emptyArrays = {}
local function emptyArray()
if cjson.as_array then
-- cjson fixed: https://github.com/xiyuan-fengyu/redis-lua-cjson-empty-table-fix
local arr = {}
setmetatable(arr, cjson.as_array)
return arr
else
-- plan 2
local arr = {}
table.insert(emptyArrays, arr)
return arr
end
end
local function toJsonStr(obj)
if #emptyArrays > 0 then
-- plan 2
for i, item in ipairs(emptyArrays) do
if #item == 0 then
-- empty array, insert a special mark
table.insert(item, 1, emptyFlag)
end
end
local jsonStr = cjson.encode(obj)
-- replace empty array
jsonStr = (string.gsub(jsonStr, '%["' .. emptyFlag .. '"]', "[]"))
for i, item in ipairs(emptyArrays) do
if item[1] == emptyFlag then
table.remove(item, 1)
end
end
return jsonStr
else
return cjson.encode(obj)
end
end
-- example
local arr = emptyArray()
local str = toJsonStr(arr)
print(str) -- "[]"
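A nested object containing an empty array would round-trip the same way (a sketch of how the helpers above could be used; in a real Redis script you would return the string rather than print it, and key order in the output may vary):
local obj = {
    items = emptyArray(),            -- would otherwise be encoded as {}
    properties = { enabled = true },
}
local str = toJsonStr(obj)
print(str)  -- {"items":[],"properties":{"enabled":true}}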
I've succeeded in triggering a simple HTTP request in Elm and decoding the JSON response into an Elm value: https://stackoverflow.com/questions/43139316/decode-nested-variable-length-json-in-elm
The problem I'm facing now:
How to chain (concurrency preferred) two HTTP requests and merge the JSON into my new (updated) model. Note: please see the updated Commands.elm.
Package used to access remote data - krisajenkins/remotedata http://package.elm-lang.org/packages/krisajenkins/remotedata/4.3.0/RemoteData
Github repo of my code - https://github.com/areai51/my-india-elm
Previous Working Code -
Models.elm
type alias Model =
{ leaders : WebData (List Leader)
}
initialModel : Model
initialModel =
{ leaders = RemoteData.Loading
}
Main.elm
init : ( Model, Cmd Msg )
init =
( initialModel, fetchLeaders )
Commands.elm
fetchLeaders : Cmd Msg
fetchLeaders =
Http.get fetchLeadersUrl leadersDecoder
|> RemoteData.sendRequest
|> Cmd.map Msgs.OnFetchLeaders
fetchLeadersUrl : String
fetchLeadersUrl =
"https://data.gov.in/node/85987/datastore/export/json"
Msgs.elm
type Msg
= OnFetchLeaders (WebData (List Leader))
Update.elm
update msg model =
case msg of
Msgs.OnFetchLeaders response ->
( { model | leaders = response }, Cmd.none )
Updated Code - (need help with Commands.elm)
Models.elm
type alias Model =
{ lsLeaders : WebData (List Leader)
, rsLeaders : WebData (List Leader) <------------- Updated Model
}
initialModel : Model
initialModel =
{ lsLeaders = RemoteData.Loading
, rsLeaders = RemoteData.Loading
}
Main.elm
init : ( Model, Cmd Msg )
init =
( initialModel, fetchLeaders )
Commands.elm
fetchLeaders : Cmd Msg
fetchLeaders = <-------- How do I call both requests here ? And fire separate msgs
Http.get fetchLSLeadersUrl lsLeadersDecoder <----- There will be a different decoder named rsLeadersDecoder
|> RemoteData.sendRequest
|> Cmd.map Msgs.OnFetchLSLeaders
fetchLSLeadersUrl : String
fetchLSLeadersUrl =
"https://data.gov.in/node/85987/datastore/export/json"
fetchRSLeadersUrl : String <------------------ New data source
fetchRSLeadersUrl =
"https://data.gov.in/node/982241/datastore/export/json"
Msgs.elm
type Msg
= OnFetchLSLeaders (WebData (List Leader))
| OnFetchRSLeaders (WebData (List Leader)) <-------- New message
Update.elm
update msg model =
case msg of
Msgs.OnFetchLSLeaders response ->
( { model | lsLeaders = response }, Cmd.none )
Msgs.OnFetchRSLeaders response -> <--------- New handler
( { model | rsLeaders = response }, Cmd.none )
The way to fire off two concurrent requests is to use Cmd.batch:
init : ( Model, Cmd Msg )
init =
( initialModel, Cmd.batch [ fetchLSLeaders, fetchRSLeaders ] )
There is no guarantee on which request will return first and there is no guarantee that they will both be successful. One could fail while the other succeeds, for example.
You mention that you want to merge the results, but you didn't say how the merge should work, so I'll just assume you want to append the lists of leaders together into one list, and that it will be useful to your application to deal with a single RemoteData value rather than multiple.
You can merge multiple RemoteData values together with a custom function using map and andMap.
mergeLeaders : WebData (List Leader) -> WebData (List Leader) -> WebData (List Leader)
mergeLeaders a b =
RemoteData.map List.append a
|> RemoteData.andMap b
Notice that I'm using List.append there. That can really be any function that takes two lists and merges them however you please.
If you prefer an applicative style of programming, the above could be translated to the following infix version:
import RemoteData.Infix exposing (..)
mergeLeaders2 : WebData (List Leader) -> WebData (List Leader) -> WebData (List Leader)
mergeLeaders2 a b =
List.append <$> a <*> b
According to the documentation on andMap (which uses a result tuple rather than an appended list in its example):
The final tuple succeeds only if all its children succeeded. It is still Loading if any of its children are still Loading. And if any child fails, the error is the leftmost Failure value.