I'd like to post one file and a JSON object to my server. Posting as a parameter works fine, so this works:
request.AddFile ("MyPic", organisation.Pic, "Pic.png", "image/png");
request.AddParameter ("name", "test name");
But if my JSON object is something with a nested structure like:
{
  "organisation": {
    "name": "test",
    "address": {
      "line1": "foo",
      "line2": "foo2"
    }
  }
}
How do I post this while maintaining the JSON structure?
If I set it in the content body, i.e.:
request.AddBody (organisation);
the content is posted, but the file is not posted.
Is it possible to post both a JSON body and a file?
So, I figured out a workaround which suits my purposes. What I was looking for was a single POST containing both the file and its associated metadata. In this instance I serialised the object to a JSON string, set that as my parameter, and deserialised it on the server. This works in my case, but I accept that it may not be ideal for others. Thought it might help someone.
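In code, the workaround looks roughly like this (a sketch only; the parameter name is arbitrary and I'm assuming Newtonsoft.Json for the serialisation, but any JSON serialiser will do):
// serialise the metadata object to a plain JSON string (Newtonsoft.Json assumed)
string json = JsonConvert.SerializeObject(organisation);
// send the JSON as an ordinary form field, alongside the file, in one multipart request
request.AddParameter("organisation", json);
request.AddFile("MyPic", organisation.Pic, "Pic.png", "image/png");
// on the server, read the "organisation" field and deserialise it back into the object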
No.
When you use AddFile you are actually adding a body to your request. You can't send a request with two bodies.
Actually this will be possible very soon. There's a pull request on the RestSharp library that addresses this. I've started maintaining the project and will be merging it into master in the next day or so. Once there's enough stuff, I'll release a new NuGet package as well.
It took me a while to understand this, even though it turned out to be a little obvious. I will answer it myself so others can benefit from the answer and, of course, so I can see if there's a better way to do this. The problem involved Axios/Yii2, but I guess it applies equally to other frontend libraries/frameworks sending data to Yii2.
I needed to post data from a small form built with Vue.js, sending the request with Axios to an action/controller in Yii2. The data is sent in a simple POST request and the request reaches the controller, but I was not able to receive the data in the action: both $_POST and $post arrive empty (checked using Xdebug).
As far as I remember, this had something to do with security, but I had already tried disabling CSRF validation, so that was not the problem:
public $enableCsrfValidation = false;
But no matter what, data was not being added to the request/post data inside Yii2.
The following images illustrate the problem:
The Axios method that sends the post with test data.
The Yii2 action, stopped at the point where I should be able to see the data.
The capture of the Xdebug variables and data for the request.
The capture of Chrome where you can check that the payload is sent.
The answer is, as I said, "kind of obvious", but I could not see it, and I am sure some other devs will probably run into this.
After searching like crazy and asking everyone, I tried sending the request using the Postman app, yup, the best tool I know for testing APIs.
Don't forget to add the Xdebug cookie to be able to debug your PHP endpoint.
There I found the first clue, «the obvious part»: I was not sending the data as form-data. Axios and other libraries send the data as a raw (application/json) payload.
This means that Yii2 will not be able to find the data inside the POST request. Yes, it's there, but the Yii2 magic will not work, and you will not find this data inside $GLOBALS or $_POST.
So, reading the Yii2 documentation, I found that the request object has a method that recovers the raw body. To do this, use the following line:
$raw_data = Yii::$app->request->getRawBody();
Now, that data reaches you as a plain, raw JSON string, so use PHP to parse it into an object:
$object = json_decode($raw_data);
And finally, use the data by accessing the properties you are looking for, as sent in the payload:
JSON payload:
{
"msg":"This is my payload",
"id":"11"
}
To use it:
echo $object->{'msg'}; // prints: This is my payload
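Putting it all together, a minimal sketch of what the action could look like (the action name and the response handling are just placeholders, not part of my original code):
public function actionSave()
{
    // raw body, because Axios sent application/json instead of form-data
    $raw_data = Yii::$app->request->getRawBody();
    $object = json_decode($raw_data);

    // $object->msg is "This is my payload", $object->id is "11"
    return $this->asJson(['received' => $object->msg]);
}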
So that's the way to handle it. Now I would like some other points of view, to see if there's a better or cleaner way to do this. Hope it helps.
I am new to WSO2 API Manager and am trying to set it up to expose my plain HTTP POST back-end calls as a REST API. I am sure this is not a new pattern that I am trying to achieve here. The requirement is to convert the incoming JSON data (which has array structures) into HTTP URL query string parameters.
After some research through the documentation and other posts on this forum, I decided to go with the script mediator, which would parse the JSON data and convert it to a string that can be appended to the endpoint URL. Somehow I am not able to achieve this.
I have followed the post below, which seems very straightforward. Like the original poster, I am also not able to use the getPayloadJSON() method, and I even have trouble using the workaround suggested there because JSON.parse() does not work.
Link
Also, this approach of editing the service bus configuration from the source view does not sound like the correct option. Is there a more elegant solution to achieve this? Thanks in advance for the help.
I was able to get both methods working by using an external script instead of inline JavaScript. Thanks.
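For anyone hitting the same wall, this is roughly the shape of what worked; the registry path, function name, property name and payload fields below are only illustrative, not my exact configuration:
<!-- in the API's in-sequence: reference an external script stored in the registry -->
<script language="js" key="gov:scripts/buildQueryString.js" function="buildQueryString"/>

// gov:scripts/buildQueryString.js
function buildQueryString(mc) {
    var payload = mc.getPayloadJSON();          // worked from the external script in my case
    var qs = "ids=" + payload.ids.join(",");    // flatten the array however your backend expects
    mc.setProperty("uri.var.query", qs);        // reference as {uri.var.query} in the endpoint URI template
}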
I'm brand new to Pentaho and I'm trying to do the following workflow:
read a bunch of lines out of a DB
do some transformations
POST them to a REST web service in JSON
I've got the first two figured out using an input step and the Json Output step.
However I have two problems doing the final step:
1) I can't get the JSON formatted how I want. It insists on doing {""=[{...}]} when I just want {...}. This isn't a big deal - I can work around this since I have control over the web service and I could relax the input requirements a bit. (Note: this page http://wiki.pentaho.com/display/EAI/JSON+output gives an example for the output I want by setting no. rows in a block=1 and an empty JSON block name, but it doesn't work as advertised.)
2) This is the critical one. I can't get the data to POST as JSON. It posts as key=value, where the key is the name I specify in the HTTP Post field name (on the 'Fields' tab) and the value is the encoded JSON. I just want to post the JSON as the request body. I've tried googling on this but can't find anyone else doing it, leading me to believe that I'm just approaching this wrong. Any pointers in the right direction?
Edit: I'm comfortable scripting (in JavaScript or another language), but when I tried to use XMLHttpRequest in a custom JavaScript snippet I got an error that XMLHttpRequest is not defined.
Thanks!
This was trivial... I just needed to use the REST Client step (http://wiki.pentaho.com/display/EAI/Rest+Client) instead of the HTTP Post step. Somehow all my googling didn't turn that up, so I'll leave this answer here in case someone else has the same problem as me.
You need to parse the JSON using a Modified JavaScript step. For example, if the Output Value from the JSON Output step is called result and its contents are {"data":[{...}]}, you should call var plainJSON = JSON.stringify(JSON.parse(result).data[0]) to get the plain JSON.
In the HTTP Post step, the Request entity field should be plainJSON. Also, don't forget to add a header for Content-Type as application/json (you might have to add that as a constant).
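As a sketch, the Modified JavaScript step ends up containing something like this (assuming the incoming field from the JSON Output step is named result):
// 'result' holds {"data":[{...}]} coming from the JSON Output step
var plainJSON = JSON.stringify(JSON.parse(result).data[0]);
// declare plainJSON as an output field of this step, then point the HTTP Post
// step's "Request entity field" at it and add a Content-Type = application/json header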
New Relic currently URL-encodes the JSON payload of our webhooks. We correctly include the necessary header; however, I'm wondering if this is best practice, or if the JSON payload should be sent unencoded (or even with some different encoding)?
It's more common to send JSON directly in the request body with a Content-Type header value of application/json. It's not entirely uncommon to do it the way New Relic does, however. GitHub post-receive hook payloads are also encoded this way.
In some languages/frameworks it's easier to extract the JSON data from a parameter than to read the entire request body, since form parameters are automatically parsed out and easily referenced, whereas reading a request body may involve more elaborate stream parsing.
An example of that is PHP's syntax for extracting the entire raw request body:
$data = file_get_contents('php://input');
var_dump(json_decode($data));
Versus extracting from a form parameter:
$data = $_POST["payload"];
var_dump(json_decode($data));
For frameworks that easily expose the request body, this isn't as much an issue. Here's the example given from the GitHub docs for handling the hook with Sinatra:
post '/' do
push = JSON.parse(params[:payload])
"I got some JSON: #{push.inspect}"
end
If the JSON body were passed directly, this is what it would look like:
post '/' do
push = JSON.parse(request.body.read)
"I got some JSON: #{push.inspect}"
end
It's mostly just an aesthetic preference; there's no technical advantage to either. With the parameter, you do get the flexibility of sending multiple JSON payloads in a single request or the option to send additional metadata outside the JSON. I would suggest sending a list of JSON objects and including all the metadata in a top-level wrapper if that's important to you. Something like this:
{
"notification_id": "akjfd8",
"events": [ ... ]
}
I personally prefer raw JSON bodies, but that's mostly because it makes debugging far easier on sites like RequestBin. Your screenshot does a great job of illustrating how unreadable JSON encoded into a form parameter is.
<rant>
Payloads should be the content type you say they are. If they are sent as application/x-www-form-urlencoded, then that is what you should expect to find. The culture of using a Russian-doll model of nested encodings is beyond stupid, in my opinion.
If you want to send JSON send it as application/json. Is that really so hard?
</rant>
I have a working Restlet application that accepts JSON and returns a JSON entity as the response.
I'm trying to understand how I can compress the JSON entity that is returned in the response.
I did not find any clear example on how to achieve it.
I think I have to put the Encoder/EncoderService classes somewhere in the router chain, but I really don't understand where and how to use them.
Could anybody help me?
After some testing, I got the answer.
Creating a new filter like this
Filter encoder = new Encoder(getContext(), false, true, new EncoderService(true));
inside the createInboundRoot() method of my own Application class did the trick; the client requests already contained the gzip header needed.
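For completeness, a minimal sketch of how that filter can be wired in (the Router and the attached resource are only placeholders to show where the Encoder sits; Encoder and EncoderService come from org.restlet.engine.application and org.restlet.service):
@Override
public Restlet createInboundRoot() {
    Router router = new Router(getContext());
    router.attach("/items", ItemsResource.class); // placeholder resource

    // false = leave incoming requests alone, true = encode (gzip/deflate) outgoing responses
    Filter encoder = new Encoder(getContext(), false, true, new EncoderService(true));
    encoder.setNext(router); // the encoder wraps the router and compresses the returned entities
    return encoder;
}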