Square's Retrofit response parsing logic: streaming?

Could you please explain Square's Retrofit response parsing logic?
I'm interested in the case where we receive and parse a large JSON payload (>100 KB): will Retrofit wait until all of the content has been received from the server and only then parse it, or will it start parsing immediately while the data is still streaming in?
My goal is to speed up response processing.
Are there any configuration options available for this?

As soon as the HTTP client parses the headers, the InputStream will be handed back to Retrofit, which will then hand it directly to the Converter. This means that as the underlying converter mechanism (say, Gson) is pulling bytes, they are being read (and potentially blocking) directly from the network.
Note: this is only true if logging is off (as it should be in production / release builds). When logging is turned on beyond the HEADERS level, the response body has to be read in its entirety into a byte[] in order to both log the data and hand it to the converter.
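
To illustrate the consequence, here is a minimal sketch using Gson's streaming JsonReader, which is what a Gson-backed converter ultimately drives (written in Scala; the countElements helper and the assumption that in is the network InputStream handed to the converter are made up for this example). Each token is consumed as soon as its bytes arrive, so parsing overlaps with the download instead of waiting for the full body:

import com.google.gson.stream.JsonReader
import java.io.{InputStream, InputStreamReader}

// Counts the elements of a top-level JSON array without materializing it.
def countElements(in: InputStream): Int = {
  val reader = new JsonReader(new InputStreamReader(in, "UTF-8"))
  var count = 0
  reader.beginArray()            // consumes '[' as soon as it arrives
  while (reader.hasNext()) {     // blocks only until the next element's bytes show up
    reader.skipValue()           // streams past one element
    count += 1
  }
  reader.endArray()
  reader.close()
  count
}

Since there is no whole-body buffering step to configure away, the main knob you control is keeping the log level at or below HEADERS so this streaming path stays intact.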

Related

Utf8JsonReader from a Stream

I'm trying to read a sequence of JSON objects from a network stream. This involves finding complete JSON objects and returning them one by one. As soon as a complete JSON object has been received, I need to have it. Anything that follows that JSON object belongs to the next object and must only be used once the next complete object has been received.
I would have thought that the Utf8JsonReader class could do that, but apparently it cannot accept a Stream of any kind. It even seems that providing that possibility is deliberately avoided.
Now I'm wondering whether it's possible at all to use .NET's shiny new JSON parser to read from a stream when I don't know when data arrives or how much of it. Do I need to split the JSON object messages manually, or can the existing reader just stop when it has something complete and continue when the next piece is available? I mean, if it can do that on a predefined array of bytes, it could surely also do it with some waiting in between until more data is available. That just doesn't seem to be exposed in the public API. Or did I miss something?
The JsonDocument.ParseAsync(Stream) method cannot be used because it would read the stream to the end. That doesn't make sense for a network stream that stays open for a long time and only delivers some data from time to time.

How to force a DataSnap REST TStream method to respond with a content stream instead of a JSON array

I have a DataSnap REST method returning files as TStream.
The client side is Objective-C code on iOS, JavaScript code, and Delphi code.
I compiled the server side with Delphi Sydney 10.4.1, upgrading from Delphi XE3.
Testing the new version, I realized that the stream response format is now a JSON array, and no longer the raw content stream of the response.
I can get a content stream with a client-side parameter:
http://host:port/datasnap/rest/[Class]/[Method]/?json=false
But this requires updating the client software too, and I want to postpone that update and distribute updates progressively.
Is there a way to force the DataSnap REST server to apply the "?json=false" behaviour for a specific method call? Or for any method returning a TStream response?
The TWebModule1.DSHTTPWebDispatcher1FormatResult event lets me manipulate the result value, but I don't need to change the JSON response; I need a content stream instead.

Handling large data through a REST API

I have a RESTful server that is supposed to return a large JSON object, more specifically an array of objects, to browsers. For example, 30,000 points come to a size of about 6.5 MB.
But I get a content mismatch error in the browser when the connection is slow. I suspect that sending large data through a REST API breaks something. Even in Postman the response sometimes fails to render, even though I can see that 6.5 MB of data was received.
My server is in Node.js, and the returned Content-Type header is application/json.
My question is:
Would it make more sense to return a .json file? Will the browser be able to handle it? If yes, then I will download the file and make the front-end changes.
Old URL - http://my-rest-server/data
Proposed Url - http://my-rest-server/data.json
What would the Content-Type be for the proposed URL?
Your client can't realistically expect all of the data at once and still expect it to arrive fast...
...but you might want to look into sending the data in chunks and streams:
https://medium.freecodecamp.org/node-js-streams-everything-you-need-to-know-c9141306be93
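
To make the chunked approach concrete, here is a hedged sketch of streaming a large array as one chunked HTTP response, written in Scala with the JDK's built-in HttpServer standing in for the Node.js server (the port, the /data endpoint, and the {"i":...} records are invented for this example). The array is written one element at a time, so the server never holds all 6.5 MB in memory:

import com.sun.net.httpserver.{HttpExchange, HttpHandler, HttpServer}
import java.net.InetSocketAddress

object ChunkedJsonServer extends App {
  val server = HttpServer.create(new InetSocketAddress(8080), 0)
  server.createContext("/data", new HttpHandler {
    override def handle(exchange: HttpExchange): Unit = {
      exchange.getResponseHeaders.add("Content-Type", "application/json")
      exchange.sendResponseHeaders(200, 0) // a length of 0 selects chunked transfer encoding
      val out = exchange.getResponseBody
      out.write('['.toInt)
      for (i <- 0 until 30000) {
        if (i > 0) out.write(','.toInt)
        out.write(s"""{"i":$i}""".getBytes("UTF-8")) // one point at a time
      }
      out.write(']'.toInt)
      out.close() // terminates the chunked response
    }
  })
  server.start()
}

The linked article covers the equivalent Node.js stream primitives for the same idea: write to the response stream element by element instead of JSON.stringify-ing the whole array.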

How can I use json.Decoder to decode a single JSON message and switch the connection to a different protocol going forward?

I am working on a TCP-based proxy that must first do a REQ/REPLY handshake in JSON on a given connection. Because JSON is self-delimiting, I reach for Go's json.Decoder to pull off this work, and it does the job nicely.
Here are the steps I take:
Dial a connection to a remote server
Write a single JSON request to the remote server (REQ)
Read a single JSON reply from the same remote server (completing the proxy handshake REPLY)
Upon a valid JSON handshake, pass the client connection on to another part of the code, which will switch to a text-based protocol from this point on.
The problem is that when json.Decoder reads data into its internal buffer, it can potentially read more data than it needs, in which case the json.Decoder offers a Buffered() method that gives back an io.Reader over the remainder of the data.
This data (available through the Buffered() method) is the start of the text-based protocol, which needs to be read from the connection after the JSON handshake has done its work. But if I pass the connection forward as-is, without accounting for the leftover buffer, the connection gets into a locked state, because the next stage waits to read data that never comes. The code that deals with the text-based protocol expects a net.Conn, and once I pass the connection forward (after the JSON handshake has been made) that code speaks the text-based protocol on its own from that point, so there should be a clear boundary of work.
My question is: what is the ideal way to solve this issue so that I can still take advantage of json.Decoder, but ensure that when I pass the connection to a different part of my proxy, the start of the text-based protocol data is still readable? I somehow need to take the remaining data from json.Decoder's Buffered() method and put it back in front of the connection so it can be read properly going forward.
Any insight is much appreciated.
You can try wrapping the connection in a type that satisfies net.Conn and reads the decoder's leftover buffer before the socket:

import (
	"io"
	"net"
)

// ConnWithBuffIncluded implements net.Conn, so it can be passed through the pipeline.
type ConnWithBuffIncluded struct {
	net.Conn
	reader io.Reader // io.MultiReader(dec.Buffered(), conn), built once (see below)
}

func (x ConnWithBuffIncluded) Read(p []byte) (n int, err error) { // drains the buffer, then the socket
	return x.reader.Read(p)
}
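
For instance (the decoder setup, HandshakeReply, and textProtocolStage are hypothetical names), the wrapper should be built exactly once, right after the handshake, because dec.Buffered() returns a fresh reader over the same leftover bytes on every call:

dec := json.NewDecoder(conn)
var reply HandshakeReply // hypothetical REPLY message type
if err := dec.Decode(&reply); err != nil {
	return err
}
// Put the decoder's unread remainder back in front of the socket, once.
next := ConnWithBuffIncluded{Conn: conn, reader: io.MultiReader(dec.Buffered(), conn)}
textProtocolStage(next) // the next stage sees the leftover bytes first, then live data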

How to do POST upload chunking (vs. download chunking)

So, in Play Framework, I can stream any response back: when I get a JSON request, I can do HTTP chunking and stream the response back for really, really large responses. I was wondering if the same can be done on POST calls. If a client has a very large POST body, can they stream me the request? Is this possible with HTML?
That said, if I can't do that, I need an API that curl or some other non-browser client will use to upload a file (the JSON request, or a CSV, etc.). How do I create such an API?
I should note that I canNOT receive the whole request at once or I will run out of memory. I need to receive pieces and, as I receive them, put them into the backend datastore a piece at a time.
Also, what would be the curl syntax to make sure it is streaming the file rather than sending it in one huge request that would break the server? How do I force the client to stream the file in?
thanks,
Dean
You can get full control over HTTP request processing by using an EssentialAction. An EssentialAction processes the request body and returns a Result.
Normal Play Actions are a special case of EssentialAction. Actions process request bodies and return Results too, but they always perform their processing in two steps. Actions first parse the request body, then pass the parsed value to a function to get a Result. For you, having a separate parsing step is a problem because it means that the parsed value needs to be stored in memory.
If you use an EssentialAction then you can avoid storing a parsed value in memory, because you can just process the request body as it arrives.
What you need to do is add a method to your controller that returns an EssentialAction.
The signature of EssentialAction is:
trait EssentialAction extends (RequestHeader) ⇒ Iteratee[Array[Byte], SimpleResult]
The EssentialAction needs to accept the request header and then return an iteratee to process the request body. The iteratee will incrementally process the request body as it arrives. You can use your iteratee to put each piece into your data store as it arrives. When you're done processing all the pieces, you can return a Result.
More info here: http://www.playframework.com/documentation/2.2.x/HttpApi.
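
As an illustration, here is a minimal sketch against Play 2.2's Scala iteratee API (upload and persistChunk are hypothetical names; persistChunk stands in for your datastore write):

import play.api.libs.concurrent.Execution.Implicits.defaultContext
import play.api.libs.iteratee.Iteratee
import play.api.mvc.{EssentialAction, Results}

def persistChunk(chunk: Array[Byte]): Unit = ??? // hypothetical: write one piece to the datastore

def upload = EssentialAction { requestHeader =>
  // Fold over the body as it arrives; only one chunk is ever held in memory.
  Iteratee.fold[Array[Byte], Long](0L) { (bytesSoFar, chunk) =>
    persistChunk(chunk)
    bytesSoFar + chunk.length
  }.map(total => Results.Ok(s"stored $total bytes"))
}

For the curl part of the question, curl -T bigfile.json http://host/upload streams the upload from disk (note that -T issues a PUT) rather than buffering the whole file in memory first.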