I am sending a raw JSON request using Postman to an API service, which feeds it to another web service and finally a database. I want to attach a file to the raw JSON request.
I am attaching the request I am currently sending below. Is it the right way? The first name and other information go through, but the attachment does not. Any suggestions?
{
"Prefix": "",
"FirstName": "test-resume-dlyon",
"LastName": "test-dlyon-resume",
"AddressLine1": "test2",
"AddressLine2": "",
"City": "Invalid Zipcode",
"State": "GA",
"Zip": "99999",
"Phone": "9999999999",
"Email": "testresumedlyon#gmail.com",
"Source": "V",
"WritingNumber": "",
"AgeVerified": true,
"AdditionalSource": "",
"EnableInternetSource": true,
"InternetSource": "",
"ExternalResult": "",
"PartnerID": "",
"SubscriberID": "15584",
"Languages": [
"English",
"Spanish"
],
"fileName": "resume",
"fileExtension": "docx",
"fileData": "UELDMxE76DDKlagmIF5caEVHmJYFv2qF6DpmMSkVPxVdtJxgRYV"
}
There is no "correct" format for attaching a file to JSON.
JSON is not multipart/form-data (which is designed to include files).
JSON is a text-based data format with a variety of data types (such as strings, arrays, and booleans) but nothing specific for files.
This means that to attach a file, you have to get creative.
For example, you could encode a file in text format (e.g. using base64), but it wouldn't be very efficient, and any Word document would result in you getting a much longer string than "UELDMxE76DDKlagmIF5caEVHmJYFv2qF6DpmMSkVPxVdtJxgRYV".
Of course, the method you use to encode the file has to be the method that whatever is reading the JSON expects you to use. Since there is no standard for this, and you have said nothing about the system which is consuming the JSON you are sending, we have no idea what that method is.
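For illustration, here is a minimal Python sketch of that base64 approach, reusing the field names from the request above. Whether the receiving service actually decodes fileData as base64 is an assumption you would need to confirm with whoever owns it.
import base64
import json

# Sketch only: base64-encode the document so it fits in a JSON string field.
# The field names match the request above; the decoding side must agree on them.
with open("resume.docx", "rb") as f:
    encoded = base64.b64encode(f.read()).decode("ascii")

payload = {
    "FirstName": "test-resume-dlyon",
    "fileName": "resume",
    "fileExtension": "docx",
    "fileData": encoded,  # a real .docx yields a far longer string than the one above
}
print(json.dumps(payload)[:200])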
First of all, I'd recommend reading the Postman API docs. They have some extremely useful information there on using the API. Two articles in particular that might be of interest here are these:
Looking at your JSON and running it through a validator like this one shows that there are no syntax errors, so it must be something to do with the JSON parameters the API is expecting.
Here's something you can try:
In Postman, set the method type to POST.
Then select Body -> form-data -> enter your parameter name (file, according to your code),
and on the right side, next to the value column, there will be a dropdown ("Text", "File"); select File, choose your file, and post it.
For the rest of the text-based parameters, you can post them as you normally would with Postman: just enter the parameter name, select "Text" from that right-side dropdown menu, enter a value for it, and hit the Send button. Your controller method should get called.
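If you need to reproduce those Postman steps in code, a multipart/form-data request looks roughly like the sketch below (Python with the requests library). The URL and the "file" field name are placeholders; use whatever parameter name the API actually expects.
import requests

url = "https://example.com/api/candidates"  # placeholder endpoint
form_fields = {"FirstName": "test-resume-dlyon", "LastName": "test-dlyon-resume"}
# "file" is an assumed field name; Postman's File dropdown does the same thing.
files = {"file": ("resume.docx", open("resume.docx", "rb"), "application/octet-stream")}

response = requests.post(url, data=form_fields, files=files)
print(response.status_code, response.text)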
Related
How can I tell my customer's provider to send the response in a valid format, or have I misunderstood something?
The response I get is:
{
"Code": "202",
"Message": "BUIxxxxxxxxxxxxxxxxxxxxxLUE",
"Status": "SUCCESS",
"Data": "{\"PRxxxxxxX\":\"2712.0000\",\"TRANSACTION_ID\":\"xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx\",\"REMARKS\":\"SUCCESS\"}"
}
My problem is that I cannot map the Data section back this way, so I have to trim out the values with Excel (e.g. =LEFT(D8,FIND("TRANSACTION_ID",D8,18)-4)), which is not a good solution in the long term.
When I told him this he replied the following:
"we are serializing the object hence slash is added automatically (below is the screen shot)
apiResult.Data = JsonConvert.SerializedObject(
**The slash will not appear in the actual output"
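Since Data is itself a serialized JSON string (which is what the provider's screenshot shows), it can in principle be parsed a second time rather than trimmed in Excel. A minimal Python sketch with illustrative values:
import json

raw = '{"Code": "202", "Status": "SUCCESS", "Data": "{\\"TRANSACTION_ID\\":\\"abc123\\",\\"REMARKS\\":\\"SUCCESS\\"}"}'

envelope = json.loads(raw)            # outer response object
data = json.loads(envelope["Data"])   # Data is a JSON string, so decode it again
print(data["TRANSACTION_ID"])         # -> abc123 (illustrative value)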
Situation
I'm using an Azure Logic App to do the following steps:
Load/read a JSON file from Blob Storage. This file is created and written in R, using the toJSON function from the jsonlite package
Get the file content from this JSON file
Convert it to a JSON format (using the expression json())
Insert the converted output into Azure Table Storage
Please see the screenshot of this workflow:
[Screenshot: azure workflow]
I've created a JSON file in R (using the jsonlite package). When printed in the R console, it looks fine:
{"PartitionKey":"pdf","RowKey":"robject)test","nr":"24330087","status":true}
This is the 'formatting' I want. When it is in this format, it inserts smoothly into Azure Table Storage with the Logic App.
Problem
But when I write the JSON file above to Azure Blob Storage and use this file as input in a Logic App workflow, the escape slashes cause problems.
The file from blob storage isn't interpreted by the Logic App in the desired format (without escape slashes); it includes the escape slashes (\), and I think this is what causes the problems in the Logic App. I pasted a hardcoded version without these slashes, and that worked. But hardcoding isn't an option here.
Below is the format the Logic App json() expression makes of it. It includes the dreaded escapes.
[
"{\"PartitionKey\":\"pdf\",\"RowKey\":\"coolblue_factuur_1744212700.pdf\",\"kvknr\":\"24330087\",\"active_status\":true}"
]
And then this Error occurs:
{
"odata.error": {
"code": "InvalidInput",
"message": {
"lang": "en-US",
"value": "An error occurred while processing this request.\nRequestId:xxxx\nTime:2019-11-20T09:02:46.6051827Z"
}
}
}
After some online research, it looks like Logic App is having difficulty with the escape slashes (\) and the double quotes they escape (").
Question
How do I deal with the \" (escaped quotes)? All I want is for the Logic App to correctly read the JSON file from blob storage, convert it, and insert the data into the table storage.
So ideally: convert the JSON file from blob storage into a format without escape slashes. I tried to preprocess this in R, but that did not work.
Extra info
Below you can find the steps I took in logic app:
The JSON file uploaded (and fetched) from blob storage
{
"headers": {
"Pragma": "no-cache",
"Transfer-Encoding": "chunked",
"Retry-After": "15",
"Vary": "Accept-Encoding",
"x-ms-request-id": "xxxx",
"Strict-Transport-Security": "max-age=31536000; includeSubDomains",
"X-Content-Type-Options": "nosniff",
"X-Frame-Options": "DENY",
"Timing-Allow-Origin": "*",
"x-ms-apihub-cached-response": "true",
"Cache-Control": "no-store, no-cache",
"Date": "Wed, 20 Nov 2019 09:09:52 GMT",
"Location": "https://[location url]",
"Set-Cookie": "[cookieset]",
"Content-Type": "application/json",
"Expires": "-1",
"Content-Length": "452"
},
"body": {
"Id": "xyz",
"Name": "robjecttest_parameters_db.json",
"DisplayName": "robjecttest_parameters_db.json",
"Path": "/path/robjecttest_parameters_db.json",
"LastModified": "2019-11-20T09:09:39Z",
"Size": 95,
"MediaType": "application/octet-stream",
"IsFolder": false,
"ETag": "\"[etag]\"",
"FileLocator": "[filelocations]",
"LastModifiedBy": null
}
}
From the headers, the content type is JSON, but for the body (which is the data I want) it is octet-stream.
After a 'get filecontent' action:
{
"$content-type": "application/octet-stream",
"$content": "[content]"
}
The next step is to convert/cast the body data to a JSON format, using the expression & dynamic content from the Logic App:
json(body('Get_blob_content'))
Use this 'output' as an entity to insert into the table storage.
You just need to replace the \" with " using the expression below:
replace(string(body('Get_blob_content')), '\"', '"')
And then we can use json() to convert it.
Update:
In your JSON data to insert into Table Storage, the nested data (shown in the red box in the screenshot) causes the problem.
You can't insert an entity that has a second level of data.
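For what it's worth, Azure Table Storage entities only accept flat, scalar properties, so any second-level object or array has to be flattened or stored as a string before the insert. A rough Python sketch of that idea (the nested "details" property is hypothetical):
import json

entity = {
    "PartitionKey": "pdf",
    "RowKey": "coolblue_factuur_1744212700.pdf",
    "kvknr": "24330087",
    "active_status": True,
    "details": {"pages": 3, "language": "nl"},  # hypothetical nested value
}

# Collapse nested objects/arrays into JSON strings so every property is scalar.
flat_entity = {
    key: (json.dumps(value) if isinstance(value, (dict, list)) else value)
    for key, value in entity.items()
}
print(flat_entity)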
We use POST /v1/url/bulk/:branch_key for batch deep link generation for some of our items.
The response returns an array of URLs alone. The links are working fine, but they are not returned in the order of the items sent in the request.
Is there any way to identify which Branch link belongs to which item?
At least if the response had the item's ID or some other custom data returned with it, we could identify the link correctly.
Any hope? Thanks.
At the most basic level, this information is available to you via the Links tab on the Branch dashboard's Liveview & Export page. You can see the last 100 links created on this tab. To see more, you can use the "Export Links" button that appears in the upper right hand corner of the page.
If you need this for more information than can be retrieved via "Export Links," you can have the app whitelisted for the Data Export API (see: https://dev.branch.io/methods-endpoints/data-export-api/guide/). This provides access to a daily collection of .csv files that include the links created and their metadata. To whitelist the app for the Data Export API, send a request to integrations@branch.io. Be sure to include the app's key and to send the request from an email address on the Team tab (https://dashboard.branch.io/settings/team).
You can also query links. For a single link, append "?debug=true" to it and enter the resulting URL into the address bar of your browser.
You can also script the lookup of link data using the HTTP API: https://github.com/BranchMetrics/branch-deep-linking-public-api#viewing-state-of-existing-deep-linking-urls
The Branch API also allows you to specify a custom alias (the URL slug), so if you simply want an easy way to tie specific bulk-created URLs to the data inside without querying a second time, you could use this as a workaround. Details here
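A hedged sketch of that alias workaround (Python, using the HTTP API linked above; the exact parameter name for the custom slug should be checked against the Branch docs):
import requests

payload = {
    "branch_key": "key_live_xxxxxxxxxxx",  # placeholder
    "alias": "item-123",                   # assumed name of the custom-slug parameter
    "data": {"item_id": "123"},            # hypothetical custom data for matching
}
resp = requests.post("https://api2.branch.io/v1/url", json=payload)
print(resp.json())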
The bulk link creation API returns the links in that specific order.
You can test this by creating 3 links and using a particular parameter to differentiate them.
E.g.:
curl -XPOST https://api2.branch.io/v1/url/bulk/key_live_xxxxxxxxxxx -H "Content-Type: application/json" \
-d '[
{
"channel": "facebook",
"feature": "onboarding",
"campaign": "new product",
"stage": "new user",
"tags": ["one", "two", "three"],
"data": {
"$canonical_identifier": "content/123",
"$og_title": "Title1",
"$og_description": "Description from Deep Link",
"$og_image_url": "http://www.lorempixel.com/400/400/",
"$desktop_url": "http://www.example.com",
"custom_boolean": true,
"custom_integer": 1243,
"custom_string": "everything",
"custom_array": [1,2,3,4,5,6],
"custom_object": { "random": "dictionary" }
}
},
{
"channel": "facebook",
"feature": "onboarding",
"campaign": "new product",
"stage": "new user",
"tags": ["one", "two", "three"],
"data": {
"$canonical_identifier": "content/123",
"$og_title": "Title2",
"$og_description": "Description from Deep Link",
"$og_image_url": "http://www.lorempixel.com/400/400/",
"$desktop_url": "http://www.example.com"
}
},
{
"channel": "facebook",
"feature": "onboarding",
"campaign": "new product",
"stage": "new user",
"tags": ["one", "two", "three"],
"data": {
"$canonical_identifier": "content/123",
"$og_title": "Title3",
"$og_description": "Description from Deep Link",
"$og_image_url": "http://www.lorempixel.com/400/400/",
"$desktop_url": "http://www.example.com"
}
}
]'
As you can see, we have used $og_title as a unique parameter, and the links created for your app will be returned in the same order.
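If that ordering guarantee holds, pairing each returned URL with the item that produced it is just a matter of zipping the two lists. A minimal Python sketch with hypothetical item IDs:
import requests

branch_key = "key_live_xxxxxxxxxxx"  # placeholder
items = [
    {"id": "content/123", "title": "Title1"},
    {"id": "content/456", "title": "Title2"},
]
payload = [
    {"channel": "facebook",
     "data": {"$canonical_identifier": item["id"], "$og_title": item["title"]}}
    for item in items
]

resp = requests.post(f"https://api2.branch.io/v1/url/bulk/{branch_key}", json=payload)
links = resp.json()  # array of created links, assumed to be in request order
for item, link in zip(items, links):
    print(item["id"], "->", link)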
Yes, you can identify which link belongs to which item by using the data of the branch.io link; you can pass branch.io config parameters as well as your own custom parameters.
Every Branch link includes a dictionary of key: value pairs that you specify at the time the link is created. Branch's SDKs make this data available within your app whenever the app is opened via a Branch link click.
I am trying to build a POST request for a CRM system by building a URL. I have the documentation which shows the methods for this CRM. In my case, I need to add a contact, and they have an example of this method with the required parameters in JSON:
fields:{
"NAME": "Mark",
"LAST_NAME": "Jonson",
"STATUS_ID": "NEW",
"ASSIGNED_BY_ID": 1,
"CURRENCY_ID": "USD",
"OPPORTUNITY": 12500,
"PHONE": [ { "VALUE": "555888", "VALUE_TYPE": "WORK" } ]
}
So I can successfully add a contact in my system using this: https://myportal.mycrm.com/rest/crm.contact.add?auth=xxxxxxxxx&fields[NAME]=A&fields[LAST_NAME]=B
But I have a huge problem adding a PHONE, because of its multiple parameters. I have tried it in a lot of ways, for example fields[PHONE[0[VALUE]]]=345678&fields[PHONE[0[VALUE_TYPE]]]=WORK, but none of them worked. Maybe someone can help me with it?
P.S. I need to do it only in the URL, so using JSON parsing, PHP, an HTTP request builder, etc. is not an option for me.
Well, you are trying to make a POST request, but what you've really done is a GET request. Please take a look.
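To illustrate the difference: putting everything in the query string is effectively a GET, while a POST carries the fields, including the nested PHONE list, in the request body. A hedged Python sketch, assuming the CRM's REST endpoint also accepts a JSON body (field names taken from the documentation excerpt above):
import requests

base = "https://myportal.mycrm.com/rest/crm.contact.add"

# What the URL-only approach amounts to: a GET with query-string parameters.
requests.get(base, params={"auth": "xxxxxxxxx",
                           "fields[NAME]": "A",
                           "fields[LAST_NAME]": "B"})

# An actual POST: nested structures such as PHONE travel naturally in the body.
requests.post(f"{base}?auth=xxxxxxxxx", json={
    "fields": {
        "NAME": "Mark",
        "LAST_NAME": "Jonson",
        "PHONE": [{"VALUE": "555888", "VALUE_TYPE": "WORK"}],
    }
})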
Our goal is to develop an API where you can POST to /data/save/ with some JSON data like the example below. The main requirement is that the JSON should contain exactly one of the following attributes:
"attribute1", "attribute2", "attribute3". That is, when one attribute is present, the others should not be.
{
"name": "test name",
"attribute1": [
"test1", "test2"
]
or
"attribute2": [
"test3", "test4"
]
or
"attribute3": true
}
The question is how to correctly design such an API so that it is easy to use and not confusing from the client side.
It would be good to know some best practices in this direction.
I would return a
400 Bad Request
The request could not be understood by the server due to malformed
syntax. The client SHOULD NOT repeat the request without
modifications.
and a phrase explaining that multiple attributes are not supported.
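A rough, framework-agnostic sketch of that check in Python: exactly one of the three attributes must be present, otherwise the web layer would answer with 400 Bad Request and an explanatory message.
EXCLUSIVE_ATTRIBUTES = ("attribute1", "attribute2", "attribute3")

def validate(payload: dict):
    present = [name for name in EXCLUSIVE_ATTRIBUTES if name in payload]
    if len(present) != 1:
        # The web layer would translate this into a 400 Bad Request response.
        return 400, {"error": f"exactly one of {EXCLUSIVE_ATTRIBUTES} is required, got {present or 'none'}"}
    return 200, {"accepted": present[0]}

print(validate({"name": "test name", "attribute1": ["test1", "test2"]}))     # (200, ...)
print(validate({"name": "test name", "attribute1": [], "attribute3": True})) # (400, ...)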
I agree such an API is confusing for the client side.
What about creating different endpoints:
POST /data/save/attribute1 json_1
POST /data/save/attribute2 json_2
A custom media type should clarify how to use your API. It should specify what to include in your request.
Another solution might be building the request like this:
{
"name": "test name",
"attr-key": "my-attribute1",
"values": ["test1", "test2"]
}
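With that shape the server no longer needs a mutual-exclusion rule at all; it simply dispatches on attr-key. A small sketch with hypothetical handler names:
def save(payload: dict):
    # Hypothetical handlers keyed by the single attr-key field.
    handlers = {
        "my-attribute1": lambda values: f"saving attribute1 values {values}",
        "my-attribute2": lambda values: f"saving attribute2 values {values}",
        "my-attribute3": lambda values: f"saving attribute3 flag {values}",
    }
    handler = handlers.get(payload.get("attr-key"))
    if handler is None:
        return 400, {"error": "unknown or missing attr-key"}
    return 200, {"result": handler(payload.get("values"))}

print(save({"name": "test name", "attr-key": "my-attribute1", "values": ["test1", "test2"]}))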