I am wondering if there is a way to send a DELETE request using the .csv format instead of the .json format.
API Reference: http://dev.socrata.com/publishers/direct-row-manipulation.html
HTTP/1.1 200 OK
Content-Type: application/json; charset=utf-8
Date: Thu, 27 Mar 2014 00:48:42 GMT
[
{
"typ": "delete",
"id": "row-evac~sxbs~gm8t"
}
]
I tried something along the lines of:
typ, id
delete, row-evac~sxbs~gm8t
to no avail.
You can delete using a CSV payload by adding a ":deleted" column and passing a value of true. You can also omit all other columns except the ID. Example:
Earthquake ID,:deleted
15215753, true
Make sure you use a Content-Type of "text/csv" as well.
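For example, here is a minimal sketch with Python's requests library, assuming the CSV is POSTed to the dataset's row-manipulation endpoint; the URL, app token, and credentials are placeholders you would need to swap for your own:

import requests

# Hypothetical dataset endpoint and credentials -- replace with your own.
url = "https://soda.demo.socrata.com/resource/abcd-1234.json"
csv_payload = "Earthquake ID,:deleted\n15215753,true\n"

resp = requests.post(
    url,
    data=csv_payload,
    headers={
        "Content-Type": "text/csv",        # tells the API the payload is CSV
        "X-App-Token": "YOUR_APP_TOKEN",   # placeholder app token
    },
    auth=("user@example.com", "password"), # publishing requires authentication
)
print(resp.status_code, resp.text)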
I need to execute a POST request with the following precondition: the JSON has two parent concepts with the same concept name but different properties, like
{
  "dictionary": {
    "concept": {
      "c1": {
        "logicalName": "c1",
        "isNull": true
      },
      "c1": {
        "logicalName": "c1",
        "isNull": false
      }
    }
  }
}
But after the POST request I observed that the first concept, the one with "isNull": true, was removed. I expected the system to fail with a proper error code and show a proper validation message.
I double-checked with regular cURL and the same input JSON from a file, and everything looks fine there.
Why does Karate remove one block from the JSON input?
Could you please advise?
Thank you
This is invalid JSON. If you want Karate to not parse the JSON, I guess for a "negative test", please use text:
* url 'https://httpbin.org/anything'
* text body =
"""
{
"dictionary":{
"concept":{
"c1":{
"logicalName":"c1",
"isNull":true
},
"c1":{
"logicalName":"c1",
"isNull":false
}
}
}
}
"""
* header Content-Type = 'application/json'
* request body
* method post
Even though the request log may show "valid" JSON, you can run the request above and see the response "echo" the actual request body, which will be exactly what you wanted to send.
14:47:03.100 [main] DEBUG com.intuit.karate - response time in milliseconds: 1219
1 < 200
1 < Date: Mon, 12 Jul 2021 09:17:03 GMT
1 < Content-Type: application/json
1 < Content-Length: 871
1 < Connection: keep-alive
1 < Server: gunicorn/19.9.0
1 < Access-Control-Allow-Origin: *
1 < Access-Control-Allow-Credentials: true
{"args":{},"data":"{\n \"dictionary\":{\n \"concept\":{\n \"c1\":{\n \"logicalName\":\"c1\",\n \"isNull\":true\n },\n \"c1\":{\n \"logicalName\":\"c1\",\n \"isNull\":false\n }\n }\n }\n}", "more": "..."}
When I make the following call:
/beta/me/messages/{id}?$select=internetMessageHeaders
I get the following output:
{
"#odata.context": "https://graph.microsoft.com/beta/$metadata#users('...')/messages(internetMessageHeaders)/$entity",
"#odata.etag": "...",
"id": "AAMkAGY1Mz...",
"internetMessageHeaders": [
{
"name": "Received",
"value": "from CY1PR16MB0549.namprd16.prod.outlook.com (2603:10b6:903:13d::13) by DM3PR16MB0553.namprd16.prod.outlook.com with HTTPS via CY4PR06CA0051.NAMPRD06.PROD.OUTLOOK.COM; Fri, 16 Feb 2018 22:14:45 +0000"
},
...
]
}
And nowhere do I find 'To' or 'From' fields in the response. Why? Is there a way to retrieve this information?
From the documentation, this property holds:
A key-value pair that represents an Internet message header, as defined by RFC5322, that provides details of the network path taken by a message from the sender to the recipient.
Based on that description, your result looks correct to me:
from CY1PR16MB0549.namprd16.prod.outlook.com (2603:10b6:903:13d::13)
by DM3PR16MB0553.namprd16.prod.outlook.com
with HTTPS
via CY4PR06CA0051.NAMPRD06.PROD.OUTLOOK.COM;
Fri, 16 Feb 2018 22:14:45 +0000
For the To and From addresses, you need to add toRecipients and from to your $select clause.
/beta/me/messages/{id}?$select=toRecipients,from,internetMessageHeaders
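If you are calling the endpoint directly over HTTP, a minimal sketch with Python's requests library looks like this; the access token and message id are placeholders:

import requests

# Placeholder values -- substitute a real OAuth access token and message id.
ACCESS_TOKEN = "YOUR_ACCESS_TOKEN"
MESSAGE_ID = "AAMkAGY1Mz..."

resp = requests.get(
    f"https://graph.microsoft.com/beta/me/messages/{MESSAGE_ID}",
    params={"$select": "toRecipients,from,internetMessageHeaders"},
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
)
msg = resp.json()
print(msg["from"]["emailAddress"]["address"])                       # sender
print([r["emailAddress"]["address"] for r in msg["toRecipients"]])  # recipients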
I'm performing a request to a third-party API using LuaSec and LuaSocket, and converting the data, a JSON string, to a Lua table with cjson. I've read the docs of those modules and unfortunately none of it helped with my problem. (I can't link more than two websites with my current account, sorry.)
Summary: I get the response, and the appropriate string, using the request function posted below. But when I turn that string into a Lua table via cjson.decode, the output table isn't the desired one: it's a copy of my response headers, which is not what I intended.
The following code is how I do my request:
local function request (req_t)
  local res_t = {}
  resp = https.request {
    url = const.API_URL .. req_t.url,
    method = req_t.method,
    headers = req_t.headers,
    sink = ltn12.sink.table(res_t)
  }
  return table.concat(res_t), resp.headers, resp.code
end
Using the following call
local res, headers = request({ ... })
I receive the proper response as a string, but my goal is to do data manipulation with it, so I turn that response string into a Lua table with
local resJson = cjson.decode(res)
This does not produce the correct output: it produces a table which is exactly the same as my response headers. Here is the output from my terminal, alongside the code that produced it.
When out of function type is: string
Desired response in string:
{"total_photos":221926,"photo_downloads":"186029632.0"}
When out of function type is: string
Desired response in string:
{"total_photos":221926,"photo_downloads":"186029632.0"}
After decode, type is: table
server Cowboy
strict-transport-security max-age=31536000
access-control-allow-headers *
x-ratelimit-limit 50
x-ratelimit-remaining 46
x-cache-hits 0, 0
accept-ranges bytes
access-control-request-method *
x-request-id ee5a74fd-2b10-4f46-9c25-5cfc53aeac6c
access-control-expose-headers Link,X-Total,X-Per-Page,X-RateLimit-Limit,X-RateLimit-Remaining
content-type application/json
connection close
content-length 55
fastly-debug-digest f62d52c08b1ef74db89a66a0069f0a35c49e52230567905240dacf08c9ea1813
vary Origin
cache-control no-cache, no-store, must-revalidate
x-timer S1496524765.369880,VS0,VE111
x-cache MISS, MISS
x-served-by cache-iad2123-IAD, cache-mad9429-MAD
via 1.1 vegur, 1.1 varnish, 1.1 varnish
date Sat, 03 Jun 2017 21:19:25 GMT
age 0
access-control-allow-origin *
x-runtime 0.011667
Printing header
server Cowboy
strict-transport-security max-age=31536000
access-control-allow-headers *
x-ratelimit-limit 50
x-ratelimit-remaining 46
x-cache-hits 0, 0
accept-ranges bytes
access-control-request-method *
x-request-id ee5a74fd-2b10-4f46-9c25-5cfc53aeac6c
access-control-expose-headers Link,X-Total,X-Per-Page,X-RateLimit-Limit,X-RateLimit-Remaining
content-type application/json
connection close
content-length 55
fastly-debug-digest f62d52c08b1ef74db89a66a0069f0a35c49e52230567905240dacf08c9ea1813
vary Origin
cache-control no-cache, no-store, must-revalidate
x-timer S1496524765.369880,VS0,VE111
x-cache MISS, MISS
x-served-by cache-iad2123-IAD, cache-mad9429-MAD
via 1.1 vegur, 1.1 varnish, 1.1 varnish
date Sat, 03 Jun 2017 21:19:25 GMT
age 0
access-control-allow-origin *
x-runtime 0.011667
The code that produces the log above:
local res, headers = request({ ... })
print('When out of function type is: ' ..type(res) .. '\n')
print('Desired response in string:')
print(res .. '\n')
resJson = cjson.decode(res)
print('\nAfter decode, type is: ' .. type(resJson) .. '\n')
pTable(resJson)
print('\nPrinting header\n')
pTable(headers)
pTable is just a function to output a table to stdout.
Thanks in advance
The posted function and routines are correct. The problem was in my table-printing function, into which I had somehow hardcoded my headers.
I built a primitive JSON body for FCM:
Body = mochijson2:encode([ {<<"operation">>, <<"create">>},{<<"notification_key_name">>, <<"console group">>},{<<"registration_ids">>, [<<"02aa6XXXX3c9b6d">>,<<"APA91bGtaXXXXXXXXXXXXoi4UH8vIdZk1X67A_9izpSFSHV3BXxdIwG">>]}]).
And sent a POST request to create a group, according to the documentation:
httpc:request(post, {Url, [{"Authorization", KeyApi}, {"project_id", ProjectId}], "application/json", Body},[{timeout, 5000}], []).
But I got the error BadJsonFormat:
{ok,{{"HTTP/1.1",400,"Bad Request"},
[{"cache-control","private, max-age=0"},
{"date","Fri, 10 Mar 2017 16:19:37 GMT"},
{"accept-ranges","none"},
{"server","GSE"},
{"vary","Accept-Encoding"},
{"content-length","25"},
{"content-type","application/json; charset=UTF-8"},
{"expires","Fri, 10 Mar 2017 16:19:37 GMT"},
{"x-content-type-options","nosniff"},
{"x-frame-options","SAMEORIGIN"},
{"x-xss-protection","1; mode=block"},
{"alt-svc","quic=\":443\"; ma=2592000; v=\"36,35,34\""}],
"{\"error\":\"BadJsonFormat\"}"}}
But mochijson2:decode(Body) works fine, and it looks like properly formed JSON, yet I get the error BadJsonFormat anyway.
What was wrong? How can I fix this?
The function mochijson2:encode doesn't return a string or a binary, but an iolist:
1> Body = mochijson2:encode([ {<<"operation">>, <<"create">>},{<<"notification_key_name">>, <<"console group">>},{<<"registration_ids">>, [<<"02aa6XXXX3c9b6d">>,<<"APA91bGtaXXXXXXXXXXXXoi4UH8vIdZk1X67A_9izpSFSHV3BXxdIwG">>]}]).
[123,
[34,<<"operation">>,34],
58,
[34,<<"create">>,34],
44,
[34,<<"notification_key_name">>,34],
58,
[34,<<"console group">>,34],
44,
[34,<<"registration_ids">>,34],
58,
[91,
[34,<<"02aa6XXXX3c9b6d">>,34],
44,
[34,<<"APA91bGtaXXXXXXXXXXXXoi4UH8vIdZk1X67A_9izpSF"...>>,
34],
93],
125]
There is nothing wrong with that by itself. Using iolists instead of strings or binaries means that you don't have to build an expensive flat data structure that you would just write to a file or a socket and then throw away. Functions like file:write_file and gen_tcp:send handle iolists just as well as strings or binaries.
However, httpc:request doesn't!
Let's test that by starting a server on port 1111 with netcat in a shell:
$ nc -l 1111
And then make a request from the Erlang shell:
3> httpc:request(post, {"http://127.0.0.1:1111", [], "application/json", Body},[{timeout, 5000}], []).
The netcat server shows this output:
POST / HTTP/1.1
content-type: application/json
content-length: 13
te:
host: 127.0.0.1:1111
connection: keep-alive
{"operation":"create",....
Note that the content-length is 13 instead of 159! httpc:request is able to send the iolist, but it uses the function length instead of iolist_size to generate the content-length header, and as a result the server only considers the first 13 bytes of the JSON object, which is not valid JSON by itself.
The solution is to pass iolist_to_binary(Body) to httpc:request instead of just Body.
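As a rough analogy only (this thread is Erlang; the snippet below is Python with a made-up flat_size helper), the difference between counting top-level elements and counting bytes looks like this:

# A nested structure standing in for an iolist: 5 top-level elements, 22 bytes flat.
iolist = [b"{", [b'"', b"operation", b'"'], b":", [b'"', b"create", b'"'], b"}"]

def flat_size(x):
    # Total byte count, analogous to Erlang's iolist_size/1.
    return len(x) if isinstance(x, bytes) else sum(flat_size(e) for e in x)

print(len(iolist))        # 5  -- element count, like length/1
print(flat_size(iolist))  # 22 -- the number a Content-Length header actually needs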
I am trying to implement a simple GET/POST API via Django REST Framework.
views.py
class cuser(APIView):
    def post(self, request):
        stream = BytesIO(request.DATA)
        json = JSONParser().parse(stream)
        return Response()
urls.py
from django.conf.urls import patterns, url
from app import views
urlpatterns = patterns('',
    url(r'^challenges/', views.getall.as_view()),
    url(r'^cuser/', views.cuser.as_view()),
)
I am trying to POST some JSON to /api/cuser/ (api is the namespace in my project's urls.py).
The JSON:
{
"username" : "abhishek",
"email" : "john#doe.com",
"password" : "secretpass"
}
I tried from both the Browsable API page and HTTPie (a Python tool similar to cURL).
httpie command
http --json POST http://localhost:58601/api/cuser/ username=abhishek email=john@doe.com password=secretpass
but I am getting a JSON parse error:
JSON parse error - Expecting property name enclosed in double quotes: line 1 column 2 (char 1)
The whole debug message, using --verbose --debug:
POST /api/cuser/ HTTP/1.1
Content-Length: 75
Accept-Encoding: gzip, deflate
Host: localhost:55392
Accept: application/json
User-Agent: HTTPie/0.8.0
Connection: keep-alive
Content-Type: application/json; charset=utf-8
{"username": "abhishek", "email": "john#doe.com", "password": "aaezaakmi1"}
HTTP/1.0 400 BAD REQUEST
Date: Sat, 24 Jan 2015 09:40:03 GMT
Server: WSGIServer/0.1 Python/2.7.9
Vary: Accept, Cookie
Content-Type: application/json
Allow: POST, OPTIONS
{"detail":"JSON parse error - Expecting property name enclosed in double quotes: line 1 column 2 (char 1)"}
The problem that you are running into is that your request is already being parsed, and you are trying to parse it a second time.
From "How the parser is determined"
The set of valid parsers for a view is always defined as a list of classes. When request.data is accessed, REST framework will examine the Content-Type header on the incoming request, and determine which parser to use to parse the request content.
In your code you are accessing request.DATA, which is the 2.4.x equivalent of request.data. So your request is being parsed as soon as you call that, and request.DATA is actually returning the dictionary that you were expecting to parse.
json = request.DATA
is really all you need to parse the incoming JSON data. You were effectively passing an already-parsed Python dictionary into the JSON parser a second time, which it cannot handle, and that is why you were getting your error.
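A minimal sketch of what that view can look like, using request.data (the DRF 3.x name for the same thing); the response payload here is only illustrative:

from rest_framework.views import APIView
from rest_framework.response import Response

class cuser(APIView):
    def post(self, request):
        # request.data is already a parsed dict -- no need to parse it again.
        data = request.data
        return Response({"username": data.get("username")})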
I arrived at this post via Google, searching for:
"detail": "JSON parse error - Expecting property name enclosed in double-quotes":
Turns out you CANNOT have a trailing comma in JSON.
So if you are getting this error you may need to change a post like this:
{
"username" : "abhishek",
"email" : "john#doe.com",
"password" : "secretpass",
}
to this:
{
"username" : "abhishek",
"email" : "john#doe.com",
"password" : "secretpass"
}
Note the removed comma after the last property in the JSON object.
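You can reproduce the same error with Python's json module; the trailing comma is the only difference between the two strings:

import json

bad = '{"username": "abhishek", "password": "secretpass",}'  # trailing comma
good = '{"username": "abhishek", "password": "secretpass"}'

try:
    json.loads(bad)
except json.JSONDecodeError as exc:
    print(exc)  # Expecting property name enclosed in double quotes: ...

print(json.loads(good))  # {'username': 'abhishek', 'password': 'secretpass'}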
Basically, when you make a POST request with the requests library, it also accepts a json argument, which is ignored when the data or files argument is passed. When the json argument is given JSON-serializable data, requests encodes it as JSON in the request body and sets the header Content-Type: application/json, so DRF in particular is able to parse it as JSON. With only the data argument, the payload is treated as form-encoded instead.
requests.post(url, json={"key1":"value1"})
You can find more in the requests documentation under "More complicated POST requests".
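For example, against a hypothetical local DRF endpoint (the URL is a placeholder):

import requests

payload = {"username": "abhishek", "email": "john@doe.com", "password": "secretpass"}

# json= serializes the dict and sets Content-Type: application/json,
# which is what DRF's JSON parser expects.
requests.post("http://localhost:8000/api/cuser/", json=payload)

# data= sends application/x-www-form-urlencoded instead,
# so DRF treats it as form data rather than JSON.
requests.post("http://localhost:8000/api/cuser/", data=payload)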