JSON is a nice way to pass complex data from my server-side code to client-side JavaScript. For example, in PHP I can write:
<script type="text/javascript">
var MyComplexVariable = <?= $BigFancyObjectGraph->GetJSON() ?>;
DoMagic(MyComplexVariable);
</script>
This is pretty cool, but sometimes you want to pass more than basic data, like dates or even function definitions. There is a simple and straightforward way of doing that too:
<script type="text/javascript">
var MyComplexVariable = {
    'SimpleProperty' : 42,
    'FunctionProperty' : function()
    {
        return 6*7;
    },
    'DateProperty' : new Date(989539200000),
    'ArbitraryProperty' : GetTheMeaningOfLifeUniverseAndEverything()
};
DoMagic(MyComplexVariable);
</script>
And this works like a charm on all browsers I've seen so far. But according to JSON.org, such syntax is invalid. On the other hand, I've seen this syntax being used in many places, including some popular JavaScript frameworks. So...
Can I expect any problems if I use "unsupported" JSON features like the above? Why is it wrong or not?
Added clarification: If I expected my JSON to be consumed by some unknown 3rd party software, or even a known parser which was not a browser, then such exotics would indeed most likely not work and I would not attempt to embed them. But I'm interested in the case where the JSON code is written directly inside a JavaScript code block that is executed by an Internet browser. Like the examples above.
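To make the distinction concrete, here is a quick sketch (assuming the browser provides a native JSON object): the object literal above is perfectly legal when evaluated as JavaScript, but the same text is rejected by a strict JSON parser.
// Evaluated as JavaScript: legal, the function property works
var literal = { 'FunctionProperty': function() { return 6 * 7; } };
literal.FunctionProperty(); // 42
// Parsed as strict JSON: throws a SyntaxError, because functions are not JSON values
JSON.parse('{ "FunctionProperty": function() { return 6*7; } }');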
According to JSON.org, a JSON object only supports the following types as the value of an object member: string, number, object, array, true, false, or null (source: json.org).
Since none of these is a function, I would suggest not using it; as you said, it is not officially supported by the spec.
Besides, what happens when a non-JavaScript client (such as a Python program) tries to consume your JSON? How is it going to run your JavaScript code?
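If the same data might ever be consumed outside a browser, the safer route is to keep the payload strict JSON and rebuild the rich values on the client. A minimal sketch (the DateAsMillis field name is just an illustration, not part of the question's API):
// Strict JSON from the server: only strings, numbers, objects, arrays, booleans, null
var data = JSON.parse('{"SimpleProperty": 42, "DateAsMillis": 989539200000}');
// Revive the date on the client side
var when = new Date(data.DateAsMillis);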
Related
As far as I have seen, the API allows me to add certain parameters to my calls, such as filter_by and sort_by.
This works well for me, but I would like to know how to use multiple parameters at the same time, in particular together with the filter_by parameter.
Currently, I am working with the Silex Boilerplate, which offers me this function:
options('{"sort_by":"name:asc","is_startpage":false}')
I have tried to pass this JSON as the options parameter:
'{"filter_by":"{"component":"reference"}", "sort_by":"name:asc"}'
But it doesn't seem to work. Are there any suggestions about what the JSON should look like?
Thanks in advance!
As I can see, you're already using the options Twig helper, which is the right way to go:
options('{"sort_by":"name:asc","is_startpage":false}')
You can use the filter_by parameter directly using this syntax:
options('{"sort_by":"name:asc","filter_by[component]":"reference"}')
This syntax would also be possible:
getStories('starts_with', 1, 10, 'name:ASC', options('{"filter_by":{"component":"reference"}}'))
This will be mapped directly to the API call by our PHP Client Library. Those requests are also cached in your Silex Boilerplate without any extra effort.
I wrote bindings to an API and put everything into an R package, including tests, vignettes, etc., but the API keeps changing constantly. This brings up some issues:
updating my package is error-prone: maybe I miss a new function or forget to mark an old one as deprecated
submitting the package to CRAN is not a good idea, since it changes frequently and packages are reviewed by hand
I have a hard time keeping this software up to date, since the API changes irregularly and therefore I may miss changes
I came up with the idea of generating the bindings automatically. The API itself provides everything required for that via an online JSON documentation. These docs always reflect the current definition of the API.
Writing some code which converts the JSON docs into R functions is not the problem. But if I do so, I still need to update the package on CRAN. The best solution would be to create a package that (on load) looks up the API definition and creates the required functions. Ideally these functions should be unit tested.
I am thankful for any hints on this.
Best
Edit: The API is the firebrowse API with an example of what the input would be.
This is really challenging and thus there's no obvious way to do it. The whole idea behind WSDL was to be able to do this easily using a standardized XML description. That was never really implemented in R, and it never really took off more broadly (because of the emergence of RESTful services and JSON).
You can definitely generate functions dynamically by creating so-called "function factories" (Hadley discussed these a bit here). In short, you write a function that takes JSON as input and returns a function that does whatever is described in the JSON. (Creating such a factory that dynamically does this whenever the package is loaded seems risky but I suppose it's possible. I'd probably just keep the factory to myself and use it to create and update the package.)
I'm not going to attempt to deal with your API specifically, but to see how this would work:
# create factory with arguments to control returned function
factory <- function(action, endpoint, content = TRUE, parsed = FALSE) {
    if (content) {
        if (parsed) {
            out <- function() httr::content(httr::VERB(action, endpoint))
        } else {
            out <- function() httr::content(httr::VERB(action, endpoint), "text")
        }
    } else {
        out <- function() httr::VERB(action, endpoint)
    }
    return(out)
}
# use factory to create different functions
(a <- factory("GET", "http://example.com", content = TRUE, parsed = FALSE))
## function() httr::content(httr::VERB(action, endpoint), "text")
(b <- factory("GET", "http://example.com", content = TRUE, parsed = TRUE))
## function() httr::content(httr::VERB(action, endpoint))
(c <- factory("GET", "http://example.com", content = FALSE))
## function() httr::VERB(action, endpoint)
# evaluate each function
a() # returns a character string
b() # returns parsed HTML
c() # returns an httr response object
The best solution would be to create a package that (on load) looks up the API definition and creates the required functions. Ideally these functions should be unit tested.
This is a very well known problem. Reacting to server changes without breaking the clients is a pain, not just in your situation but also for mobile applications (which need to be resubmitted every time the API changes).
While your approach may work (generating the client on the fly), the best result can be reached if the server collaborates as well.
You have to decouple the client from the API implementation. How? By using REST (for real), thus introducing the concept of state and transitions.
This is not the right place to explain how it works, but a great introduction can be found in this presentation by Glenn Block; continue reading from there.
This won't solve your particular problem, but it is, in my opinion, the right way to approach the problem.
You may want to have a look at this video as well, starting at 15:24.
Given an ajax call such as:
$.ajax({
    url: "MyWebService.blah",
    data: { "data": "awesome" },
    success: function(responseText) {
        var myJsonObj = $.parseJSON(responseText);
        //do stuff with myJsonObj
    }
});
This was working fine. I updated jQuery to 1.9 today (I was on 1.6 for a while) as a possible fix for Safari all of a sudden not supporting various toggle functionality (something about eventLayer.X no longer being supported), and now all my ajax calls are throwing the following JavaScript error:
Uncaught Syntax Error: Unexpected token o
After a little research and some testing, I discovered that "responseText" in my code above is now a JSON object, not a string. So the error makes sense, but I'm trying to wrap my head around this. Did jQuery really change the default return type? I checked the documentation:
http://api.jquery.com/jQuery.ajax/
and dataType defaults to "Intelligent Guess". I can see how that might be convenient, but I also don't like it.
So here are my questions:
Is this a new(ish) change in jQuery?
Was it version 1.9 that did this, or has it been this way before and I'm a fossil having been using 1.6?
What are some suggestions to handle this and sort of "future-proof" my code?
This is a pretty fundamental change that affects a lot of code. I will go through my code and remove any instance of parsing my returned data to JSON, but this whole thing is a little unnerving. Was I mistaken in not specifying a dataType? I suppose it is a good practice to enforce a dataType instead of relying on default, but... wow. Am I alone on this, or was that a tad presumptuous of a change on the part of jQuery?
jQuery automatically detects what the dataType is based on what was returned, if no dataType was set. Most likely 1.9 just improved that detection and now properly detects what you are returning as JSON. It's best to always give a dataType to ensure you get consistent results.
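As a sketch of what that could look like with the original call (assuming the server really does return JSON): declaring the expected type explicitly means jQuery hands success an already parsed object, so the manual $.parseJSON step goes away.
$.ajax({
    url: "MyWebService.blah",
    data: { "data": "awesome" },
    dataType: "json", // be explicit instead of relying on the intelligent guess
    success: function(responseData) {
        // responseData is already a parsed object in both jQuery 1.6 and 1.9
        //do stuff with responseData
    }
});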
I have a problem in Joomla 1.5.18. I'm trying to get text from an element using, for instance:
var divContent = $$('#myDiv').get('text');
But each time I get an error. In Chrome: Uncaught TypeError: Object #<HTMLDivElement> has no method 'get'; in Firefox: divContent.get is not a function. Why am I getting this error?
Even following the samples in the MooTools docs I get the same.
I know how to do it for each object in a collection; I got it working with $$('.') and the "each" method:
$$('p.classname').each(function (el) {
    el.addEvent('click', function() {
        var txt = el.get('text');
        ...
    });
});
And obviously I add the function inside domready. I don't use jQuery because MooTools and jQuery interfere with each other's event handling (I tried once and what I needed didn't work), and I want to use all the resources Joomla already ships with, including MooTools.
Checking the version in mootools.js, it says 1.13 (?)
Not sure which version of MooTools comes with Joomla 1.5.18; it may be 1.2.5. If so, .get should work, but not as you expect it to.
You are probably a jQuery user, used to $("#myid"), and find that the only way to get similar results with the # in there in MooTools is via document.getElements, aka $$.
The problem is, to get a single element by id in MooTools you actually do document.id("mydiv") or even $("mydiv"). $$("#mydiv") will actually return a COLLECTION of elements with a single member, i.e. [obj], so the real element is $$("#mydiv")[0].
If you apply a .get method to a COLLECTION, the getter normally iterates via .each through all the members and performs the get individually. It will return a new array member for each member of the collection, i.e. ["innertext"]. Even though there should be a method on the collection, make sure that the element is there, that your code runs in domready / onload, and that the id is unique.
Still, I'd switch to $("mydiv").get("text"); it ought to be fine. This is an all too common assumption by jQuery users who don't read the manual, in my experience. It results in bad, unperformant code due to all the .each iterations MooTools has to silently do to work with the collection for you. Just saying.
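A minimal sketch of the three ways to get at the text (assuming MooTools 1.2+, an element with id="myDiv", and code running inside domready):
window.addEvent('domready', function() {
    var viaCollection = $$('#myDiv')[0].get('text'); // collection of one, pick the element
    var viaDocumentId = document.id('myDiv').get('text'); // single element lookup
    var viaDollar = $('myDiv').get('text'); // note: no '#' when using $ in MooTools
});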
You can also (and should) upgrade your Joomla to the latest version (security fixes, etc.). I believe it was around version 1.5.20 that they included a newer version of MooTools right out of the box (there is also a plugin for a MooTools upgrade you can enable). I believe the version shipped with 1.5.20 is 1.2.5 or thereabouts...
That may help!
I've created a simple REST service that serves data as XML. I've managed to enable the XML, JS and RSS formats but I cannot find a way to enable the JSON format. Is JS == JSON? Guess not :).
How can I enable this in version 1.2/1.3?
Thx!!
Router::parseExtensions('json');
If you have PHP 5.2 or higher, it ships with JSON encode/decode support. Check the docs here.
You'll probably need to do the encoding/output by hand, but it should be trivial to code.
Bonus points would be to build it as a behavior :)
Edit:
Check out the $javascript->object() method here, it may do what you want.
A quick Google search indicates that there is a JSON component for CakePHP. Link to an article discussing its use in CakePHP 1.2: http://www.pagebakers.nl/2007/06/05/using-json-in-cakephp-12/
Just add these lines of code in your controller or AppController:
var $components = array('RequestHandler');

function beforeFilter() {
    $this->RequestHandler->setContent('json', 'text/x-json');
}
and run it in Internet Explorer.