JSON vs HTML Ajax response

Which is faster: returning the Ajax response as JSON and then processing that JSON on the client to render the HTML, or having the Ajax response return the raw HTML as a bunch of <li></li> elements?

Depends. In both cases, the server is simply returning a response with text. If the JSON version of the response requires more characters than the HTML version, that response will take longer to be transmitted back to the client, and vice versa.
But of course there is also the server-side script which must do its work. Perhaps in your case generating JSON is faster than HTML from your server-side script. No way for me to know.
And then there is the client-side processing. You'd have to parse the response to turn it into a true object, and then you'd need to iterate over the resulting object in order to generate the HTML. This will definitely take longer than just taking an HTML response and injecting it into the DOM.
However, I doubt that the performance difference will be noticeable, meaning that your decision about providing a JSON response vs. HTML response should be based on other factors.
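To make the two client-side paths concrete, here is a rough TypeScript sketch; the /items endpoint, its format parameter, and the item fields are invented for illustration:

// HTML response: inject the markup the server already built
async function renderFromHtml(list: HTMLUListElement): Promise<void> {
  const res = await fetch('/items?format=html');   // hypothetical endpoint
  list.innerHTML = await res.text();
}

// JSON response: parse it, then build the markup on the client
async function renderFromJson(list: HTMLUListElement): Promise<void> {
  const res = await fetch('/items?format=json');
  const items: { id: number; name: string }[] = await res.json();
  list.innerHTML = items.map(i => `<li data-id="${i.id}">${i.name}</li>`).join('');
}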

As already mentioned, that depends. From a server-side point of view it makes a lot of sense to let the client generate the HTML, because just serializing JSON is faster and takes a lot of strain off the server: it doesn't have to deal with the HTML generation at all. An additional benefit is that by returning JSON you offer an API that can be used for more than just outputting HTML.
If you want to take the work off the client, it makes sense to generate the HTML on the server side.
In the end the speed depends a lot on the technologies used. Both ways can perform extremely well, but when done wrong either one will be slow.
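To illustrate the server-side trade-off, here is a minimal sketch of the two kinds of handlers, assuming a Node/Express server; the routes and sample data are made up:

import express from 'express';

const app = express();
const items = [{ id: 1, name: 'Apple' }, { id: 2, name: 'Microsoft' }];

// JSON: just serialize the data and let the client build the markup
app.get('/items.json', (_req, res) => {
  res.json(items);
});

// HTML: the server does the string building itself
app.get('/items.html', (_req, res) => {
  res.send(items.map(i => `<li data-id="${i.id}">${i.name}</li>`).join(''));
});

app.listen(3000);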

Here I produced the same response twice, once as HTML and once as JSON. The JSON response weighs roughly half as many kilobytes as the HTML response, which means a faster server response. But in that case you have to rebuild the HTML from the JSON on the client, so the JSON-rebuild time has to be added to the comparison.
In my measurements the HTML version took longer to come back from the server.
Now let's look at appending it to the HTML document:
again, the HTML version took longer to process than the JSON version.
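If you want to repeat that comparison yourself, a rough client-side timing sketch (the sample data and element id are invented) could look like this:

const list = document.getElementById('list') as HTMLUListElement;

// Case 1: the server already sent HTML
const htmlResponse = '<li>Apple</li><li>Microsoft</li><li>Activision</li>';
let t0 = performance.now();
list.innerHTML = htmlResponse;
console.log('append HTML:', performance.now() - t0, 'ms');

// Case 2: the server sent JSON and the client rebuilds the markup
const jsonResponse = '["Apple","Microsoft","Activision"]';
t0 = performance.now();
list.innerHTML = (JSON.parse(jsonResponse) as string[]).map(n => `<li>${n}</li>`).join('');
console.log('parse JSON + build HTML:', performance.now() - t0, 'ms');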


Is using the multipart/form-data Content-Type for a RESTful POST API a good practice?

I have a situation where I have to write an API to create a resource, and among the data fields I need to accept is a string that is basically the contents of an HTML file. As I see it, I have a choice between structuring the entire thing as a JSON object, where this field is a string field containing the URL-encoded HTML, and using multipart/form-data as the Content-Type, where each of the fields and the HTML string (UTF-8 encoded) is a part in the message.
Not using JSON makes me uncomfortable, as I feel I would be violating REST principles by not structuring the content of the entity I am about to create; there is a loss of information for consumers, since they can't tell immediately from my API definition what data to feed it. But practically, multipart/form-data handles things like HTML file content better and more efficiently, as I will not have to URL-encode it and can also control the character encoding.
What would be the better approach in this context while upholding RESTful principles? Are there other trade-offs I should be aware of? And what about parsing a JSON document with a huge string field (~200 KB) embedded in it?
EDIT: I was reading some similar questions on SO, and one approach that stood out was the two-step approach: make a first call with the metadata to create the entity, then upload the file as an update to the created entity using multipart/form-data. In that context, I guess what I am asking is how sound an approach is where I send both the metadata and the file in a single API call as multipart data, where each metadata field is a part in the multipart message, as is the file.
The canonical way to upload files to a REST API is to use multipart/form-data. As the W3C recommendation says:
The content type "multipart/form-data" should be used for submitting forms that contain files, non-ASCII data, and binary data.
Using multipart/form-data has advantages over Base64 for representing binary data: it sticks to the REST/HTTP philosophy and simplifies the development of API clients.
Returning values from Forms: multipart/form-data
W3 Recommendation guide
The good practice is to use multipart/form-data whenever files are uploaded to the server along with database fields. Do not send a Base64-encoded string inside JSON as the request to your REST API, as it might corrupt the file or degrade the performance of your application.
As far as documenting a multipart/form-data REST API for your consumers is concerned, you have to require your API consumers to use the same form fields that you have predefined in your web service.
Returning Values from Forms: multipart/form-data
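For the single-call approach the question asks about (metadata fields plus the file in one multipart message), a browser-side sketch could look like the following; the endpoint and field names are only examples:

async function createResource(htmlFile: File): Promise<Response> {
  const form = new FormData();
  form.append('title', 'My resource');   // ordinary metadata fields, one part each
  form.append('author', 'jdoe');
  form.append('content', htmlFile);      // the HTML file as its own part, no URL-encoding needed
  // The browser sets Content-Type: multipart/form-data with the boundary automatically.
  return fetch('/api/resources', { method: 'POST', body: form });
}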
I started using FormData objects everywhere on the client-side, in lieu of regular form input fields, for dynamic REST posts. FormData is presented in a positive light in various tutorials, so I went with it.
However, down the line this caused me problems when decoding the form data into my Go structs. FormData objects are sent as "multipart/form-data" (regardless of whether files are being sent), and I believe my decoder in Go didn't convert the raw data back to string form. Eventually my SQL queries were throwing panics, as hex data was being sent in where strings should have been.
So with some adjustment I could have kept using FormData, but I've decided to revert to the simple universal recommendation: use "multipart/form-data" only for special cases such as sending files. Otherwise, just use regular "application/x-www-form-urlencoded".
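The distinction drawn above can be shown in a few lines; both requests are sketches with invented endpoints and field names:

// Plain fields only: application/x-www-form-urlencoded is enough
const params = new URLSearchParams({ name: 'Jane', email: 'jane@example.com' });
fetch('/contact', { method: 'POST', body: params });   // sent as name=Jane&email=jane%40example.com

// A file is involved: fall back to multipart/form-data via FormData
const fileInput = document.querySelector('input[type="file"]') as HTMLInputElement;
const form = new FormData();
form.append('name', 'Jane');
form.append('avatar', fileInput.files![0]);
fetch('/profile', { method: 'POST', body: form });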

Why use a JSON object to pass data with POST versus a Query String in Perl?

I'm looking to run a script using Stripe.pm, basically to do credit card processing. The credit card number is not being passed at all. All the examples I see use a JSON object passed in a POST call, but I have a lot of experience using query strings, i.e.
http://www.example.com/cgi-bin/processingscript.pl?param1=XXXX&param2=YYYYY&param3=ZZZZZ
Is this a security risk? What is the advantage or disadvantage of posting using JSON versus a query string like I'm used to using?
From a purely technical point of view, there is no difference between POST and GET if you pass a reasonably short parameter. You can also just pass JSON as a GET parameter, no problem:
GET foo.pl?json={"foo":"bar"}
It would make sense to URL-encode the data in this case. You can also send the same request using POST.
If you do not want to use query params at all, you need to use POST and put your JSON into the request body. Depending on which option you choose, there are differences in how to deal with it in Perl. Let's say you are using the CGI module: it makes no distinction between POST and GET params.
For the query string GET or POST, you need to do:
use CGI;
my $cgi = CGI->new;
my $json = $cgi->param('json');
If you put the payload directly into the request body, you will instead need to do:
use CGI;
my $cgi = CGI->new;
my $json = $cgi->param('POSTDATA');
This is documented in CGI under "handling non url-encoded ...".
For JSON, there is also of course the time it takes to parse it, but that should be negligible.
The advantage JSON has over query strings (without JSON inside them) is that you can encode arbitrarily complex data structures in JSON, while plain query strings are just one level deep.
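For example (the field names are invented), the same nested structure as JSON and as a flattened query string:
{"customer":{"name":"Jane","address":{"city":"Berlin","zip":"10115"}}}
customer_name=Jane&customer_address_city=Berlin&customer_address_zip=10115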
From a security point of view, pretty much everything has been said. I'll recap my own ideas:
- use SSL
- do not put sensitive stuff into log files
- if you are dealing with CC data (even if it is not the number itself), take extra care; read up on PCI DSS and encrypt the data during transmission
- NEVER store a CVC!
If you want to learn more about this topic, there is a Stack Exchange site called Information Security.
Security
- Making sure no one other than the user and the server can read the data: the choice of GET vs. POST has no influence here. Use SSL to ensure this.
- Stopping the user from passing "interesting" arguments to your script: POST has a certain amount of security through obscurity, in that most users won't see the parameters, but it is no real security. You should validate the parameters in your application regardless.
Generally POST with a nice data package (e.g. JSON) makes for a much more flexible and maintainable application interface, and has the advantage that you don't need to worry about encoding and length of parameters the same way you do when using GET.
POST:
- Generally safer for important data
- Parameters are not shown in the URL
- You can pass longer strings/structures (e.g. JSON) than with GET, which has a practical limit on the length of the parameters you send

What HTTP verb to use to send JSON in the body?

I have a UI in my webapp that allows users to build fairly complex charts where they can specify chart types, chart axes, ranges, and also specify multiple filters (which can be quite complex themselves). I've written the javascript (actually CoffeeScript) in a very OO style, so that the whole state of the chart configuration can be serialized to a very neat and tidy JSON document. When the user wants to render the graph, a request is made to the server with that JSON document in the request body, and the server then responds with another JSON document containing the actual data for the chart.
My question is, what HTTP verb should I be using for this request? I'm currently using POST, as a GET request with a body feels wrong, but POST doesn't really seem to fit either. Any ideas would be helpful!
What makes you feel POST does not fit?
Since GET puts parameters in the query string as key/value pairs, adding the JSON document as a single query-string value would not feel right to me, whereas POST feels like absolutely the right choice.
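A minimal sketch of that request from the browser (the URL and payload shape are placeholders, not taken from the question):

async function renderChart(config: object): Promise<unknown> {
  const res = await fetch('/charts/render', {
    method: 'POST',                                   // the JSON document travels in the request body
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(config),
  });
  return res.json();                                  // the server answers with the chart data, also as JSON
}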

Normal form submission vs. JSON

I see the advantages of getting a JSON response and formatting it on the client side, but are there any advantages to using JSON for form submission compared to a normal submission?
One thing that comes to mind when dealing with POST data is useless repetition:
For example, in POST we have this:
partners[]=Apple&partners[]=Microsoft&partners[]=Activision
We can actually see, that there is a lot of repetition here, where as when we are sending JSON:
{"partners":["Apple","Microsoft","Activision"]}
That's 59 characters versus 47. In this small sample it looks minuscule, but these savings add up, and even several bytes will save you some data. Of course, there is the parsing of the data on the server side, which could even out the difference, but I have still seen this help when dealing with slower connections (looking at you, 3G and EDGE).
I didn't see any mention of file submission so I thought I'd chime in. FormData seems to be the standard way of submitting files to a server over AJAX. I didn't come across any solutions using JSON but there might be a way to serialize/deserialize files ...
It probably depends on your server-side application. You are usually posting data to servers using POST, so how you format the underlying data depends on how you want your server to process it. A form POST gives you a key->value protocol, while in JSON you can express more than that. You can also transfer JSON using GET by placing it in the URL.
You should look at JSON as a way of writing the data, while a normal POST submission just gives you a way of transporting data (of course you can use its key->value feature to organize your data).
There exist conventions on top of HTTP that can help you define the interface to your web application. One good example is REST: http://en.wikipedia.org/wiki/Representational_state_transfer
Specifically for submitting forms, I don't see any advantages; POST was designed for this in the first place. There are cases where you want to transmit not only the form data but also some metadata; in that case JSON might help you by encoding the form data (together with the metadata) in some JSON format, but in the end you are still abusing POST to transfer that JSON data.
Hope I answered your question.
I don't see any apparent advantages for basic form submission. But when it comes to handling complex structures you'll start to realize the advantage of organizing your data.
So if you have a simple contact form (name, email, message), stick with normal form POSTing. But think about submitting a user's complete CV, for example; it's very annoying to handle the massive number of variables in your server-side script.
Here's an example for using JSON with PHP
// Here is the submitted JSON (sent as the 'json_form' POST field)
{
    "personalInformation": {
        "name": "hey",
        "age": "20"
    },
    "education": {
        "entry1": {
            "type": "College",
            "year": "2012"
        },
        "entry2": {
            "type": "Highschool",
            "year": "2010"
        }
    }
}

// Decode the submitted JSON into an associative array
$CV_Data = json_decode($_POST['json_form'], true);

echo $CV_Data['personalInformation']['name'];
echo $CV_Data['personalInformation']['age'];

// Or you can loop over the education entries
foreach ($CV_Data['education'] as $entry) {
    echo $entry['type'];
    echo $entry['year'];
}
As you can see, using JSON here makes it a lot easier for you to work with your data.
I am actually wrestling with the same problem. My use case requires that a potentially complex tree be posted to the server. Some frameworks are able to decode two-dimensional arrays from form attributes (Spring WebMVC is one I know of). Even so, this only helps in the specific case where you are sending a nested array. The inherent nature of name-value pairs makes them unsuitable for transmitting a tree more than one level deep. In the past, I have used hacks like sending URL-encoded JSON as an attribute value:
val0=%7B%22name%22%3A%22value%22%7D&val1=something&val2=something+else
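For reference, this is roughly what building and decoding that hack looks like (val0/val1/val2 are the parameter names from the example above); a TypeScript sketch:

// Building the hack: a JSON object squeezed into one form field
const val0 = encodeURIComponent(JSON.stringify({ name: 'value' }));
const body = `val0=${val0}&val1=something&val2=${encodeURIComponent('something else')}`;

// Recovering it on the other side
const params = new URLSearchParams(body);
const nested = JSON.parse(params.get('val0')!);   // { name: 'value' }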
However, this approach gets messy and difficult to debug as the object becomes more complex. In addition, many frameworks provide tools that automagically map a JSON post body to objects (e.g. Jackson for Java), so it seems obtuse not to take advantage of them.
So ultimately, the choice rests on the complexity of the object you are sending. If the object is limited to one level deep, use straight name-value pairs; if the object is complex and involves a deeply nested tree, use JSON.

Send PDF as byte[] / JSON problem

I am trying to send a generated PDF file (Apache FOP) to the client. I know this can be done by writing the array to the response stream and by setting the correct content type, length and so on in the servlet. My problem is that the whole app was built based on the idea that it will only receive/send JSON. In the servlet's service() method, I have this:
response.setContentType("application/json");
reqBroker.process(request, response);
RequestBroker is the class that processes the JSON (Jackson processor); everything is generic and I cannot change it. On top of this, I have to receive the JSON from the request correctly to access the data and generate my PDF, so those two lines are necessary. But when I send the response, I need to set a different content type so that the PDF is displayed correctly in the browser.
So far, I am able to send the byte array as part of the JSON, but then I don't know how to display that array as a PDF on the client (if something like this is even possible).
I would like some suggestions on how I can send my PDF and set the right header without messing with the JSON. Thanks.
JSON and byte arrays don't mix.
Instead, you should create an <iframe> and point it to a URL that returns a raw PDF.
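A sketch of that approach; the PDF URL is invented, and the assumption is that some servlet streams the raw bytes with Content-Type application/pdf:

// Point an <iframe> (or a new tab) at a URL that serves the raw PDF
const frame = document.createElement('iframe');
frame.src = '/reports/invoice.pdf?id=42';   // hypothetical servlet that writes the PDF bytes directly
document.body.appendChild(frame);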
Take a look here: How to send pdf in json. It lists a couple of approaches you can consider. The easiest way is to convert the binary data into a string using Base64 encoding (in C#, a call to Convert.ToBase64String; Convert.FromBase64String turns it back into bytes). However, this has a space overhead, as Base64 encoding makes the data roughly 33% larger. If you can get away with it, this is the least complicated solution. In case the additional size is an issue, you can think about zipping the data up first.
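If the PDF really has to travel inside the JSON, a client-side sketch for turning a Base64 string back into a viewable document (the endpoint and the pdfBase64 field name are made up):

async function showPdf(): Promise<void> {
  const res = await fetch('/report', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: '{}',
  });
  const { pdfBase64 } = await res.json();   // hypothetical JSON field holding the Base64-encoded PDF
  const bytes = Uint8Array.from(atob(pdfBase64), c => c.charCodeAt(0));
  const url = URL.createObjectURL(new Blob([bytes], { type: 'application/pdf' }));
  window.open(url);                         // or set it as the src of an <iframe>
}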