I am using the Yason library in Common Lisp. I want to parse a JSON string, but would like the parser to keep one of its nodes unparsed.
Typically, with an example like this:
{
"metadata1" : "mydata1",
"metadata2" : "mydata2",
"payload" : {...my long payload object},
"otherNodesToParse" : {...}
}
How can I set the Yason parser to parse my JSON but skip the payload node, keeping it as a string in JSON format?
Use case: let's say I just want the envelope data (everything that's not the payload) and want to forward the payload as-is (as a JSON string) to another system.
If I parse the whole JSON (including the payload) and then re-encode the payload to JSON, that is inefficient. The payload could also be pretty big.
How do you know where the end of the payload object is in the stream? Only by parsing the stream: if you don't parse it, you simply can't know where the end of the object is. That's the nature of JSON's syntax (as it is the nature of CL's default syntax). For instance, the only way you can know where to continue after
{x:1}
and after
{x:1.2}
is by parsing the two things.
So you must necessarily parse the whole thing.
The answer to your question is therefore: you can't do this.
You could (though not, I think, with YASON) decide that you did not want to build an object as the result of the parse. And if the stream you are parsing corresponds to something with random access, like a string or a file, you could note the start and end positions in the stream and later extract from it a string corresponding to the unparsed data (or perhaps build it up as you go).
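To sketch the position-noting idea (this is not YASON; skip-json-value is just a made-up helper, and it assumes the input is a well-formed object or array held in a string), a minimal depth-tracking scan would look something like this:

;; Scan past the JSON object/array value starting at START and return the
;; index just past it.  This is still parsing, but it builds no objects.
(defun skip-json-value (string start)
  (let ((depth 0) (in-string nil) (escaped nil) (i start))
    (loop
      (let ((c (char string i)))
        (cond (escaped (setf escaped nil))
              (in-string (case c
                           (#\\ (setf escaped t))
                           (#\" (setf in-string nil))))
              ((char= c #\") (setf in-string t))
              ((find c "{[") (incf depth))
              ((find c "}]") (decf depth)
               (when (zerop depth) (return (1+ i))))))
      (incf i))))

Given the index where the payload value begins, (subseq string start (skip-json-value string start)) would then be the raw payload text.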
It looks as if some or all of this might be possible with CL-JSON, but you'd have to work at it.
Unless the objects you are reading are vast, the benefit of this seems questionable to none. If you really do want to do something like this efficiently, you need a serialisation scheme which tells you how long things are.
Using the Play framework with Anorm, I would like to write a Controller method that simply returns the results of an SQL query as JSON. I don't want to bother converting to objects. Also, ideally this code should stream out the JSON as the SQL ResultSet is processed rather than processing the entire SQL result before returning anything to the client.
select colA, colB from mytable
JSON Response
[{"colA": "ValueA", "colB": 33}, {"colA": "ValueA2", "colB": 34}, ...]
I would like to express this in Scala code as elegantly and concisely as possible, but the examples I'm finding seem to have a lot of boilerplate (redundant column-name definitions). I'm surprised there isn't already some kind of SqlResult-to-JsValue conversion in Play or Anorm.
I realize you may need to define Writes[] or an Enumeratee implementation to achieve this, but once the conversion code is defined, I'd like the code for each method to be nearly as simple as this:
val columns = List("colA", "colB")
db.withConnection { implicit c =>
Ok(Json.toJson(SQL"select #$columns from mytable"))
}
I'm not clear on the best way to define column names just once and pass it to the SQL query as well as JSON conversion code. Maybe if I create some kind of implicit ColumnNames type, the JSON conversion code could access it in the previous example?
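To make the conversion I have in mind concrete, here is roughly the per-row helper I imagine (nothing here is existing Play or Anorm API; I'm assuming each row has already been pulled into a Map[String, Any], and rowToJson is a name I made up):

import play.api.libs.json._

// Sketch: convert one row's column/value pairs into a JsObject,
// handling the common SQL value types and falling back to toString.
def rowToJson(row: Map[String, Any]): JsObject =
  JsObject(row.toSeq.map { case (col, value) =>
    col -> (value match {
      case null          => JsNull
      case s: String     => JsString(s)
      case n: Int        => JsNumber(n)
      case n: Long       => JsNumber(n)
      case d: BigDecimal => JsNumber(d)
      case b: Boolean    => JsBoolean(b)
      case other         => JsString(other.toString)
    })
  })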
Or maybe define my own kind of SqlToJsonAction to achieve even simpler code and have common control over JSON responses?
def getJson = SqlToJsonAction(List("colA", "colB")) { columns =>
SQL"select #$columns from mytable"
}
The only related StackOverflow question I found was: Convert from MySQL query result or LIST to JSON, which didn't have helpful answers.
I'm a Java developer just learning Scala, so I still have a lot to learn about the language, but I've read through the Anorm, Iteratee/Enumeratee, and Writes docs and numerous blogs on Anorm, and am having trouble figuring out how to set up the necessary helper code so that I can compose my JSON methods this way.
Also, I'm unclear on which approaches allow streaming out the response, and which will iterate the entire SQL ResultSet before responding with anything to the client. According to Anorm's "Streaming results" documentation, only methods such as fold/foldWhile/withResult and Iteratees stream. Are these the techniques I should use?
Bonus:
In some cases, I'll probably want to map a SQL column name to a different JSON field name. Is there a slick way to do this as well?
Something like this (no idea if this Scala syntax is possible):
def getJson = SqlToJsonAction("colA" -> "jsonColA", "colB", "colC" -> "jsonColC")) { columns =>
SQL"select #$columns from mytable"
}
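The renaming itself seems straightforward if each row is already a Map; a rough sketch of what I mean (all names made up):

// Sketch: rename SQL columns to JSON field names via an alias map
// before the row is converted to JSON.
val aliases = Map("colA" -> "jsonColA", "colC" -> "jsonColC")

def renameColumns(row: Map[String, Any]): Map[String, Any] =
  row.map { case (col, value) => aliases.getOrElse(col, col) -> value }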
I am trying to access the steamid data in a json response returned by an API, specifically the Steam API.
The responses look like this:
I've made it return JSON, but why do I see "array" all over the place?
How would I access the steamid data? I'm getting a bit confused as I thought this would be json.
I'm using Guzzle to get the data and converting it using Guzzle's json() method:
Any help would be appreciated.
Thanks!
The API is indeed using JSON to send and receive data. However, JSON is just a string, so in order to use that data PHP must parse it. Guzzle handles this automatically: as soon as you get the data back, it has already decoded it into a usable format for you.
It does this using the json_encode() and json_decode() functions.
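In other words, the conversion is roughly equivalent to calling json_decode() yourself with the associative flag, which is also why you see "array" everywhere instead of objects:

// Roughly what Guzzle's json() method does for you:
$data = json_decode((string) $response->getBody(), true); // true => associative arrays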
You'd be able to access the steamid with the following.
// Assuming $data is your response from the API.
$players = array_get($data, 'response.players', []);
foreach($players as $player)
{
$steamId = array_get($player, 'steamid', null);
}
Using the Laravel helper array_get() function is a great way of ensuring you return a sane default if the data doesn't exist, as well as eliminating the need to keep doing things like isset() to avoid errors about undefined indexes, etc. See http://laravel.com/docs/5.1/helpers
Alternatively, without the Laravel helpers, you could use something like the code below, although I'd advise adding checks to avoid the aforementioned problems (a guarded version is sketched after the loop).
foreach($data['response']['players'] as $player)
{
$steamId = $player['steamid'];
}
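For example, a guarded version of that loop might look like this (just a sketch; pick whatever defaults make sense for you):

$players = isset($data['response']['players']) ? $data['response']['players'] : [];

foreach($players as $player)
{
    $steamId = isset($player['steamid']) ? $player['steamid'] : null;
}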
If you didn't want Guzzle to automatically decode the API's JSON, I believe you should just be able to call the getBody() method to return the JSON string.
$json = $response->getBody();
I want to create an application with front-end HTML + JavaScript and back-end Progress4GL.
I found this documentation: http://communities.progress.com/pcom/docs/DOC-106147 (see Introducing AJAX and Introducing JSON). In the example described, the GET method is used when requesting data:
xmlhttp.open("GET", "http://localhost/cgi-bin/cgiip.exe/WService=wsbroker1/getcustomersJSON_param.p?piCustNum="+ custval, true);
xmlhttp.send();
and in the Progress 4GL procedure the parameter is read with get-value("piCustNum").
In my application I want to use POST method. So the request will be, for example:
xmlhttp.open("POST","http://localhost/cgi-bin/cgiip.exe/WService=wsbroker1/getcustomersJSON_param.p",true);
xmlhttp.send("piCustNum=" + custval);
But I don't know how to get the sent parameter on the Progress side. Actually, I want to send stringified JSON.
Can anyone help me with this? Thanks!
If you want to POST JSON data to a WebSpeed program, check out WEB-CONTEXT:FORM-INPUT, or, if you post more than 32K, check out WEB-CONTEXT:FORM-LONG-INPUT.
Now... regarding reading the JSON data, it depends on your OpenEdge version. In 10.2B Progress started supporting JSON; however, it is very limited, especially if you have little control over how the JSON gets created. Since you are the one creating the JSON data, it may work for you. Version 11.1 has much better JSON support, including a SAX streaming implementation.
We were on version 10.2, so I had to resort to using this C library to convert the JSON into a CSV file. If you have access to Python on your server, it is very easy to convert to a CSV file.
For the front-end I'd recommend using a library (like jQuery) to handle the AJAX request for you, instead of dealing with the complexity of working with different browsers, etc. You can use jQuery's functions like $.ajax, $.get or $.post to make your requests.
A post to a webspeed page could easily be done like this:
var data = {
tt_param: [ { id: 1, des: 'Description 1' } ]
}
var params = { data: JSON.stringify(data) }
$.post(
'http://<domain>/scripts/cgiip.exe/WService=<service>/ajax.p',
params,
function (data) {
alert('returned:' + data);
},
'text'
);
And the back-end would receive the JSON string using get-value('data'):
{src/web2/wrap-cgi.i}
def temp-table tt_param no-undo
field id as int
field des as char.
def var lc_param as longchar no-undo.
procedure output-header:
output-content-type("text/text").
end.
run output-header.
assign lc_param = get-value('data').
temp-table tt_param:read-json('longchar', lc_param).
find first tt_param no-error.
{&OUT} 'Cod: ' tt_param.id ', Des: ' tt_param.des.
It's a good place to start, hope it helps.
Cheers,
There is a library for Node for calling Progress business logic dynamically. I hope this helps:
node4progress
Part of a website's JSON response had this (... added for context):
{..., now:function(){return(new Date).getTime()}, ...}
Is adding anonymous functions to JSON valid? I would expect each time you access 'now' to return a different value.
No.
JSON is purely meant to be a data description language. As noted on http://www.json.org, it is a "lightweight data-interchange format." - not a programming language.
Per http://en.wikipedia.org/wiki/JSON, the "basic types" supported are:
Number (integer, real, or floating point)
String (double-quoted Unicode with backslash escaping)
Boolean (true and false)
Array (an ordered sequence of values, comma-separated and enclosed in square brackets)
Object (collection of key:value pairs, comma-separated and enclosed in curly braces)
null
The problem is that JSON as a data-definition language evolved out of JSON as a JavaScript object notation. Since JavaScript supports eval on JSON, it is legitimate to put JavaScript code inside JSON (in that use-case). If you're using JSON to pass data remotely, then I would say it is bad practice to put methods in the JSON, because you may not have modeled your client-server interaction well. Furthermore, when wishing to use JSON as a data description language, you could get yourself into trouble by embedding methods, because some JSON parsers were written with only data description in mind and may not support method definitions in the structure.
The Wikipedia JSON entry makes a good case for not including methods in JSON, citing security concerns:
Unless you absolutely trust the source of the text, and you have a need to parse and accept text that is not strictly JSON compliant, you should avoid eval() and use JSON.parse() or another JSON specific parser instead. A JSON parser will recognize only JSON text and will reject other text, which could contain malevolent JavaScript. In browsers that provide native JSON support, JSON parsers are also much faster than eval. It is expected that native JSON support will be included in the next ECMAScript standard.
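For instance, a strict JSON parser will reject such text outright:

JSON.parse('{"now": "2015-01-01"}');            // fine, plain data
JSON.parse('{"now": function(){return 1;}}');   // throws SyntaxError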
Let's quote one of the specs: https://www.rfc-editor.org/rfc/rfc7159#section-12
The "JavaScript Object Notation (JSON) Data Interchange Format" specification states:
JSON is a subset of JavaScript but excludes assignment and invocation.
Since JSON's syntax is borrowed from JavaScript, it is possible to use that language's "eval()" function to parse JSON texts. This generally constitutes an unacceptable security risk, since the text could contain executable code along with data declarations. The same consideration applies to the use of eval()-like functions in any other programming language in which JSON texts conform to that language's syntax.
So all answers which state that functions are not part of the JSON standard are correct.
The official answer is: No, it is not valid to define functions in JSON results!
The answer could be yes, because "code is data" and "data is code".
Even though JSON is used as a language-independent data serialization format, tunneling "code" through other types will work.
A JSON string might be used to pass a JS function to the client-side browser for execution.
[{"data":[["1","2"],["3","4"]],"aFunction":"function(){return \"foo bar\";}"}]
This leads to questions like: "How to execute JavaScript code stored as a string" (https://stackoverflow.com/questions/939326/execute-javascript-code-stored-as-a-string).
Be prepared to raise your "eval() is evil" flag and stick your "do not tunnel functions through JSON" flag next to it.
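For completeness, consuming such a tunneled function on the client might look like this (jsonText standing in for the string above, with all the eval caveats just raised):

var parsed = JSON.parse(jsonText);
var fn = eval("(" + parsed[0].aFunction + ")"); // parentheses force expression parsing
fn();                                           // "foo bar"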
It is not standard as far as I know. A quick look at http://json.org/ confirms this.
Nope, definitely not.
If you use a decent JSON serializer, it won't let you serialize a function like that. It's a valid OBJECT, but not valid JSON. Whatever that website's intent, it's not sending valid JSON.
JSON explicitly excludes functions because it isn't meant to be a JavaScript-only data structure (despite the JS in the name).
A short answer is NO...
JSON is a text format that is completely language independent but uses conventions that are familiar to programmers of the C-family of languages, including C, C++, C#, Java, JavaScript, Perl, Python, and many others. These properties make JSON an ideal data-interchange language.
Look at the reason why:
When exchanging data between a browser and a server, the data can only be text.
JSON is text, and we can convert any JavaScript object into JSON, and send JSON to the server. We can also convert any JSON received from the server into JavaScript objects.
This way we can work with the data as JavaScript objects, with no complicated parsing and translations.
But wait...
There are still ways to store your function; it's widely not recommended, but still possible:
We said you can save a string... so how about converting your function to a string?
const data = {func: '()=>"a FUNC"'};
Then you can stringify data using JSON.stringify(data) and then use JSON.parse to parse it (if that step is needed)...
And use eval to execute the string function (before doing that, just to let you know: using eval is widely not recommended):
eval(data.func)(); //return "a FUNC"
Using Node.js (CommonJS syntax), I was able to get this type of functionality working. I originally had just a JSON structure inside some external JS file, but I wanted that structure to be more of a class, with methods that could be decided at run time.
The declaration of 'Executor' in myJSON is not required.
var myJSON = {
"Hello": "World",
"Executor": ""
}
module.exports = {
init: () => { return { ...myJSON, "Executor": (first, last) => { return first + last } } }
}
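A quick usage sketch of the module above (the require path is an assumption):

const obj = require('./executor').init();
console.log(obj.Hello);                   // "World"
console.log(obj.Executor('foo', 'bar'));  // "foobar"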
Function expressions in JSON are completely possible; just do not forget to wrap them in double quotes. Here is an example taken from NoSQL database design:
{
"_id": "_design/testdb",
"views": {
"byName": {
"map": "function(doc){if(doc.name){emit(doc.name,doc.code)}}"
}
}
}
Although eval is not recommended, this works:
<!DOCTYPE html>
<html>
<body>
<h2>Convert a string written in JSON format, into a JavaScript function.</h2>
<p id="demo"></p>
<script>
function test(val){return val + " it's OK";}
var someVar = "yup";
var myObj = { "func": "test(someVar);" };
document.getElementById("demo").innerHTML = eval(myObj.func);
</script>
</body>
</html>
Leave the quotes off...
var a = {"b":function(){alert('hello world');} };
a.b();