I've got compression properly configured for my Azure web role. Both .aspx pages and static files like *.css are being compressed correctly.
<urlCompression doStaticCompression="true" doDynamicCompression="true" dynamicCompressionBeforeCache="true" />
I've got several different [System.Web.Services.WebMethod]s, though, that are not returning gzipped data. Each response is around 350KB, so I'm thinking it should be quite a bit faster if I can get compression to work.
Within my WebMethod, I create a list of objects and return it, and I assume some built-in serializer turns this into JSON?
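For reference, the service is shaped roughly like this (a simplified sketch, not my actual code; the type and helper names are placeholders):

// Sketch only; requires System.Web.Services and System.Web.Script.Services
[ScriptService]
public class DataService : WebService
{
    [WebMethod]
    [ScriptMethod(ResponseFormat = ResponseFormat.Json)] // JSON is the default for [ScriptService] anyway
    public List<MyItem> GetItems() // MyItem is a placeholder type
    {
        // Build and return the list; ASP.NET AJAX serializes it to JSON on the way out
        return BuildItemList(); // placeholder helper
    }
}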
Is there any way to force this content to be compressed?
Thanks so much!
I've seen people have issues with the built-in compression for numerous reasons.
The simplest way is to use a third party component such as Telerik's RadCompression to enforce compression on the response to AJAX calls.
Alternatively, you can override the application's BeginRequest method or write your own handler
to pack up the responses on the fly. A basic VB version of how to do this is below:
' Requires Imports System.IO.Compression
Sub Application_BeginRequest(ByVal sender As Object, ByVal e As EventArgs)
    If Request.RawUrl.Contains(".aspx") And _
       Not Request.Headers("Accept-Encoding") Is Nothing Then
        If Request.Headers("Accept-Encoding").ToLower().Contains("gzip") Then
            Response.Filter = New GZipStream(Response.Filter, CompressionMode.Compress, True)
            Response.AppendHeader("Content-Encoding", "gzip")
            ' Else...attempt deflate if GZip is not allowed
        End If
    End If
End Sub
I've done a method with the handler as well (and that's what I believe Telerik's RadCompression uses), but it is a good bit more complicated as you have to modify the response size, etc.
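A stripped-down C# module version of that idea looks roughly like this (a minimal sketch, not Telerik's actual code; the class name is mine, and a real implementation would also handle response size and already-compressed content):

using System;
using System.IO.Compression;
using System.Web;

public class SimpleCompressionModule : IHttpModule
{
    public void Init(HttpApplication app)
    {
        app.BeginRequest += delegate(object sender, EventArgs e)
        {
            HttpApplication application = (HttpApplication)sender;
            string acceptEncoding = application.Request.Headers["Accept-Encoding"];

            // Only compress when the client advertises gzip support
            if (!string.IsNullOrEmpty(acceptEncoding) && acceptEncoding.ToLower().Contains("gzip"))
            {
                application.Response.Filter = new GZipStream(application.Response.Filter, CompressionMode.Compress);
                application.Response.AppendHeader("Content-Encoding", "gzip");
            }
        };
    }

    public void Dispose() { }
}

You then register the module under <httpModules> (classic pipeline) or <system.webServer><modules> (integrated pipeline) in web.config.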
Here's what I ended up with, a variation of Yak's answer.
// Requires: using System.Globalization; using System.IO.Compression; using System.Web;
void Application_BeginRequest(object sender, EventArgs e)
{
    HttpApplication app = (HttpApplication)sender;
    HttpRequest request = app.Request;
    HttpResponse response = app.Response;

    // An AJAX web service request always starts with application/json
    if (request.ContentType.ToLower(CultureInfo.InvariantCulture).StartsWith("application/json"))
    {
        // The user may be on an older version of IE which does not support compression, so skip those
        if (!(request.Browser.IsBrowser("IE") && request.Browser.MajorVersion <= 6))
        {
            string acceptEncoding = request.Headers["Accept-Encoding"];
            if (!string.IsNullOrEmpty(acceptEncoding))
            {
                acceptEncoding = acceptEncoding.ToLower(CultureInfo.InvariantCulture);
                if (acceptEncoding.Contains("gzip"))
                {
                    response.Filter = new GZipStream(response.Filter, CompressionMode.Compress);
                    response.AddHeader("Content-Encoding", "gzip");
                }
                else if (acceptEncoding.Contains("deflate"))
                {
                    response.Filter = new DeflateStream(response.Filter, CompressionMode.Compress);
                    response.AddHeader("Content-Encoding", "deflate");
                }
            }
        }
    }
}
Per the Box example, there is an easy way to get a user's root folder using the code below:
http://opensource.box.com/box-java-sdk/
BoxAPIConnection api = new BoxAPIConnection("your-developer-token");
BoxFolder rootFolder = BoxFolder.getRootFolder(api);
for (BoxItem.Info itemInfo : rootFolder) {
System.out.format("[%s] %s\n", itemInfo.getID(), itemInfo.getName());
}
But if I need to access someone else's info using As-User, I'm unable to use the Box SDK classes (BoxFolder, BoxFile, BoxUser, ...) and need to get the data directly from the JSON, like below.
If I do so, I'm losing the latest features added in the new SDK. Is this the best way? How is the performance? Is there an alternative?
URL url = new URL("https://api.box.com/2.0/folders/0");
BoxAPIRequest request = new BoxAPIRequest(api,url,"GET");
request.addHeader("As-User", "12345678");
BoxJSONResponse response = (BoxJSONResponse) request.send();
JsonObject responseJSON = JsonObject.readFrom(response.getJSON());
Later I get the folder properties using JsonObject / JsonArray. If I need the folder items, I have to loop over a JsonArray like below:
JsonArray entries = responseJSON.get("entries").asArray();
for (JsonValue entry : entries)
{ ... }
Unfortunately, the new Java SDK beta doesn't have built-in support for "As-User" functionality yet, which makes this kind of tricky. One workaround is to use a RequestInterceptor with your BoxAPIConnection to manually add the "As-User" header to every request.
api.setRequestInterceptor(new RequestInterceptor() {
    @Override
    public BoxAPIResponse onRequest(BoxAPIRequest request) {
        request.addHeader("As-User", "user-id");
        // Returning null means the request will be sent along with our new header.
        return null;
    }
});
This should let you use the rest of the SDK normally and not have to worry about doing the API requests manually. I also created an issue for adding "As-User" support.
I'm using justinrainbow/json-schema class to validate data against a schema.
However I'm receiving this error:
Media type application/schema+json expected
I could try to change the Content-Type in nginx for all my JSON files, but that doesn't make sense.
Another way would be to change the constant inside the library to 'application/json' (which is what my server delivers for JSON files). Again, it's not OK to change the source.
Is there a way to pass this as a parameter to justinrainbow/json-schema class?
https://github.com/justinrainbow/json-schema
I couldn't find a solution for this, because application/schema+json is rarely served as a content type on the web.
Just replace SCHEMA_MEDIA_TYPE in justinrainbow/json-schema/src/JsonSchema/Validator.php with 'application/json'.
You can also serve the file by local path, not by URL.
Now the library additionally supports "application/json", but it throws an error for other content types.
To avoid this, you can extend the default "JsonSchema\Uri\UriRetriever" and override "confirmMediaType()":
class MyUriRetriever extends JsonSchema\Uri\UriRetriever {
    public function confirmMediaType($uriRetriever, $uri) {
        return;
    }
}
$retriever = new \MyUriRetriever();
$refResolver = new JsonSchema\SchemaStorage($retriever);
$schema = $refResolver->resolveRef($schema);
$validator = new JsonSchema\Validator(new JsonSchema\Constraints\Factory($refResolver));
$validator->check($data, $schema);
$data: the JSON-decoded response from the API
$schema: the URL of the schema
I have had the same issue many times when testing another party's API against their schema. Often they do not send the correct Content-Type header for their schemas, and it can take a long time for them to change it.
Update: Ability to exclude endpoints from validation
You can use UriRetriever::addInvalidContentTypeEndpoint():
$retriever = new UriRetriever();
$retriever->addInvalidContentTypeEndpoint('http://example.com/car/list');
I'm using Ember-Data 1.0.0.Beta-9 and Ember 1.7 to consume a REST API via DreamFactory's REST Platform. (http://www.dreamfactory.com).
I've had to extend the RESTAdapter in order to use DF and I've been able to implement GET and POST requests with no problems. I am now trying to implement model.save() (PUT) requests and am having a serious hiccup.
Calling model.save() sends the PUT request with the correct data to my API endpoint, and I get a 200 OK response with a JSON body of { "id": "1" }, which is what is supposed to happen. However, when I try to access the updated record, all of the properties are empty except for ID, and the record on the server is not updated. I can take the same JSON string passed in the request, paste it into the DreamFactory Swagger API docs, and it works with no problem: the response is good and the record is updated in the DB.
I've created a JSBin to show all of the code at http://emberjs.jsbin.com/nagoga/1/edit
Unfortunately I can't have a live example as the servers in question are locked down to only accept requests from our company's public IP range.
DreamFactory provides a live demo of the API in question at
https://dsp-sandman1.cloud.dreamfactory.com/swagger/#!/db/replaceRecordsByIds
OK, in the end I discovered that you can customize the DreamFactory response by adding a ?fields=* param to the end of the PUT request URL. I monkey-patched that into my updateRecord method as follows:
updateRecord: function(store, type, record) {
    var data = {};
    var serializer = store.serializerFor(type.typeKey);
    serializer.serializeIntoHash(data, type, record);
    var adapter = this;

    return new Ember.RSVP.Promise(function(resolve, reject) {
        // hack to make DSP send back the full object
        adapter.ajax(adapter.buildURL(type.typeKey) + '?fields=*', "PUT", { data: data }).then(function(json) {
            // if the request is a success we'll return the same data we passed in
            resolve(json);
        }, function(reason) {
            reject(reason.responseJSON);
        });
    });
}
And poof we haz updates!
DreamFactory has support for tacking several params onto the end of the requests to fully customize the response - at some point I will look to implement this correctly but for the time being I can move forward with my project. Yay!
Ember Data is interpreting the response from the server as an empty object with an id of "1" and no other properties in it. You need to return the entire new object back from the server with the changes reflected.
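For example, instead of just { "id": "1" }, the PUT response body should echo the full record, along these lines (the root key and fields here are hypothetical and depend on your model and serializer):

{
  "record": {
    "id": "1",
    "name": "Updated name",
    "description": "Updated description"
  }
}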
I have a simple Backbone.js/Bootstrap front end in HTML5 with a Node.js/Restify backend. I am setting cookies in a header response from the server as below:
res.setHeader("Set-Cookie", ["token=ninja", "language=javascript"]);
On the client side, I am making a REST call as
var response = this.model.fetch().success(function(data){
//success
}).error(function(data){
//error
}).complete(function(data){
//complete
});
which calls back into the model's parse method.
How can I read the cookie value in the model?
Include Cookie.js.
You can then reference individual cookies like this:
var token = Cookie.get('token');
// token == 'ninja'
Here is what I figured out. My application has two components: the HTML/JS served from one domain, which talks to a REST service on another domain (and is therefore cross-domain). Because the cookie is set by the REST service, it appears it is not readable across domains, so the web page will not store the cookie even though the server is sending it. One alternative is to use local cookies, or the technique illustrated by http://backbonetutorials.com/cross-domain-sessions/.
Assuming you are using jQuery with Backbone, you can get the headers by defining the parse function in your model by calling getAllResponseHeaders or getResponseHeader:
var model = Backbone.Model.extend({
    // the rest of your model

    parse: function(resp, xhr) {
        var allHeaders = xhr.getAllResponseHeaders();
        var cookieHeader = xhr.getResponseHeader("Set-Cookie");
        // do something with the headers
        return resp;
    }
});
Started testing my jQuery applications with IE9. Looks like I may be in for some trouble here.
I noticed that when I return JSON data to my JavaScript methods, I always get a prompt that says "Do you want to open or save this file?" with three buttons: Open, Save, and Cancel. My JavaScript takes actions based on the values in the JSON object, but since IE9 doesn't pass it over to the script, I cannot execute the follow-up action from there.
Anyone else facing this issue?
If anyone is using ASP.NET MVC and trying to fix this issue, I used the following built-in methods in the MVC framework. Simply update the content type and encoding on the JsonResult.
public ActionResult Index(int id)
{
// Fetch some data
var someData = GetSomeData();
// Return and update content type and encoding
return Json(someData, "text/html", System.Text.Encoding.UTF8,
JsonRequestBehavior.AllowGet);
}
This fixed the issue for me!
If using MVC, one way of handling this is to implement a base controller in which you override (hide) the Json(object) method as follows:
public class ExtendedController : Controller
{
protected new JsonResult Json(object data)
{
if (!Request.AcceptTypes.Contains("application/json"))
return base.Json(data, "text/plain");
else
return base.Json(data);
}
}
Now, your controllers can all inherit ExtendedController and simply call return Json(model); ...
without modifying the response content type for those browsers which play nicely (not <= IE9!)
without having to remember to use Json(data, "text/plain") in your various Ajax action methods
This works with JSON requests which would otherwise display the "Open or Save" message in IE8 and IE9, such as those made by jQuery File Upload.
I also faced this problem yesterday with Web API, which returned a list of URLs (of asynchronously uploaded files).
Just set the content type to "text/html" instead of Web API's default "application/json; charset=UTF-8". I got the response as a JSON string and then used $.parseJSON to convert it to a JSON object.
public async Task<HttpResponseMessage> Upload()
{
// ...
var response = Request.CreateResponse(HttpStatusCode.OK, files);
response.Content.Headers.ContentType = new MediaTypeHeaderValue("text/html");
return response;
}
// 'result' is the body content of the iframe that received the response.
$.each($.parseJSON(result.html()), function (i, item)
{
console.log(item.Url);
});
In my case, when the Content-Type in the response header is "application/json; charset=UTF-8", IE9 shows that prompt. When it is changed to "text/html", the prompt does not show, although all other browsers are fine with "application/json; charset=UTF-8".
Actually, you were right @EricLaw. After setting the content type in the Json result, it worked.
I had to add the following lines:
result.ContentEncoding = System.Text.Encoding.UTF8;
result.ContentType = "application/json; charset=UTF-8";
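For context, those lines sit inside the action, roughly like this (a sketch; the action name and GetSomeData are placeholders):

public ActionResult GetData()
{
    // Build the JsonResult first, then override its encoding and content type
    JsonResult result = Json(GetSomeData(), JsonRequestBehavior.AllowGet);
    result.ContentEncoding = System.Text.Encoding.UTF8;
    result.ContentType = "application/json; charset=UTF-8";
    return result;
}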