I'm trying to design a load test for some APIs, and I need to read the test data from a CSV file. Here is my CSV file:
amount,description
100,"100 Added"
-150,"-150 removed"
20, "20 added"
The amount is a number.
My CSV Data Set Config looks like this:
[image: CSV Data Set Config]
When I put the body data like this:
[image: request body]
I get this error:
{"timestamp":1529427563867,"status":400,"error":"Bad Request","message":"JSON parse error: Unrecognized token 'amount': was expecting ('true', 'false' or 'null'); nested exception is com.fasterxml.jackson.core.JsonParseException: Unrecognized token 'amount': was expecting ('true', 'false' or 'null')\n
And when I change it to this:
[image: modified request body]
I get this error:
"timestamp":1529427739395,"status":400,"error":"Bad Request","message":"JSON parse error: Cannot deserialize value of type `java.math.BigDecimal` from String \"amount\": not a valid representation; nested exception is com.fasterxml.jackson.databind.exc.InvalidFormatException: Cannot deserialize value of type `java.math.BigDecimal` from String \"amount\": not a valid representation\n at [Source: (PushbackInputStream); line: 2, column: 13]
How should I pass the parameters to be able to read from the CSV file?
P.S. It works fine when I do everything without the CSV.
In your config, you have "Ignore first line" set to False.
Options:
1: Set "Ignore first line" to True, since your source file includes a header row.
2: Leave the config setting as is, but remove the header row from the CSV source file.
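For reference, since the request body is only visible in the screenshots: with the header row out of the way, the body would normally reference the JMeter variables declared in the CSV Data Set Config. Assuming the variable names are amount and description (they are not visible here), a sketch of the body could be:
{"amount": ${amount}, "description": "${description}"}
Note that ${amount} is left unquoted so it is sent as a JSON number, while ${description} is quoted because it is a string.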
Related
I am trying to parse a JSON payload sent via a POST request to an NGINX/OpenResty location. To do so, I combined OpenResty's content_by_lua_block with its cjson module like this:
# other locations above
location /test {
    content_by_lua_block {
        ngx.req.read_body()
        local data_string = ngx.req.get_body_data()
        local cjson = require "cjson.safe"
        local json = cjson.decode(data_string)
        local endpoint_name = json['endpoint']['name']
        local payload = json['payload']
        local source_address = json['source_address']
        local submit_date = json['submit_date']
        ngx.say('Parsed')
    }
}
Parsing sample data containing all required fields works as expected. A correct JSON object could look like this:
{
    "payload": "the payload here",
    "submit_date": "2018-08-17 16:31:51",
    "endpoint": {
        "name": "name of the endpoint here"
    },
    "source_address": "source address here"
}
However, a user might POST a differently formatted JSON object to the location. Assume a simple JSON document like
{
"username": "JohnDoe",
"password": "password123"
}
not containing the desired fields/keys.
According to the cjson module docs, using cjson (without its safe mode) will raise an error if invalid data is encountered. To prevent any errors being raised, I decided to use its safe mode by importing cjson.safe. This should return nil for invalid data and provide the error message instead of raising the error:
The cjson module will throw an error during JSON conversion if any invalid data is encountered. [...]
The cjson.safe module behaves identically to the cjson module, except when errors are encountered during JSON conversion. On error, the cjson_safe.encode and cjson_safe.decode functions will return nil followed by the error message.
However, I do not encounter any different error handling behavior in my case and the following traceback is shown in Openresty's error.log file:
2021/04/30 20:33:16 [error] 6176#6176: *176 lua entry thread aborted: runtime error: content_by_lua(samplesite:50):16: attempt to index field 'endpoint' (a nil value)
Which in turn results in an Internal Server Error:
<html>
<head><title>500 Internal Server Error</title></head>
<body>
<center><h1>500 Internal Server Error</h1></center>
<hr><center>openresty</center>
</body>
</html>
I think a workaround might be writing a dedicated function for parsing the JSON data and calling it with pcall() to catch any errors. However, this would make the safe mode kind of useless. What am I missing here?
Your “simple JSON document” is a valid JSON document. The error you are facing is not related to cjson, it's a standard Lua error:
resty -e 'local t = {foo = 1}; print(t["foo"]); print(t["foo"]["bar"])'
1
ERROR: (command line -e):1: attempt to index field 'foo' (a number value)
stack traceback:
...
The “safeness” of cjson.safe is about the parsing of malformed documents:
The cjson module raises an error:
resty -e 'print(require("cjson").decode("[1, 2, 3"))'
ERROR: (command line -e):1: Expected comma or array end but found T_END at character 9
stack traceback:
...
cjson.safe returns nil and an error message:
resty -e 'print(require("cjson.safe").decode("[1, 2, 3"))'
nil    Expected comma or array end but found T_END at character 9
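cjson.safe therefore only protects you against malformed input; once the document parses, missing keys are an ordinary Lua problem and you have to check for them yourself. A minimal sketch of such a check inside your content_by_lua_block, reusing the field names from your example, could be:
ngx.req.read_body()
local body = ngx.req.get_body_data()  -- may be nil for an empty body or one buffered to disk
local cjson = require "cjson.safe"

local json, err
if body then
    json, err = cjson.decode(body)
end

if not json then
    -- malformed (or missing) JSON: cjson.safe returned nil plus an error message
    ngx.status = 400
    ngx.say("invalid JSON: ", err or "empty body")
    return
end

if type(json.endpoint) ~= "table" or type(json.endpoint.name) ~= "string" then
    -- well-formed JSON, but without the keys this location expects
    ngx.status = 400
    ngx.say("missing endpoint.name")
    return
end

ngx.say('Parsed')
No pcall() is needed for this; pcall() would only be required if you kept indexing fields without checking them first.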
I am trying to construct JSON from strings that contain "\n", like this:
ver_str= 'Package ID: version_1234\nBuild\nnumber: 154\nBuilt\n'
proj_ver_str = 'Version_123'
comb = '{"r_content": {0}, "s_version": {1}}'.format(ver_str,proj_ver_str)
json_content = json.loads()
d =json.dumps(json_content )
I am getting this error:
exec(compile(contents+"\n", file, 'exec'), glob, loc)
File "C:/Dev/python/new_tester/simple_main.py", line 18, in <module>
comb = '{"r_content": {0}, "s_version": {1}}'.format(ver_str,proj_ver_str)
KeyError: '"r_content"'
The error arises not because of newlines in your values, but because of { and } characters in your format string other than the placeholders {0} and {1}. If you want to have an actual { or a } character in your string, double them.
Try replacing the line
comb = '{"r_content": {0}, "s_version": {1}}'.format(ver_str,proj_ver_str)
with
comb = '{{"r_content": {0}, "s_version": {1}}}'.format(ver_str,proj_ver_str)
However, this will give you a different error on the next line, loads() missing 1 required positional argument: 's'. This is because you presumably forgot to pass comb to json.loads().
Replacing json.loads() with json.loads(comb) gives you another error: json.decoder.JSONDecodeError: Expecting value: line 1 column 15 (char 14). This tells you that you've given json.loads malformed JSON to parse. If you print out the value of comb, you see the following:
{"r_content": Package ID: version_1234
Build
number: 154
Built
, "s_version": Version_123}
This isn't valid JSON, because the string values aren't surrounded by quotes. So a JSON parsing error is to be expected.
At this point, let's take a look at what your code is doing and what you seem to want it to do. It seems you want to construct a JSON string from your data, but your code puts together a JSON string from your data, parses it to a dict and then formats it back as a JSON string.
If you want to create a JSON string from your data, it's far simpler to create a dict with your values and use json.dumps on that:
d = json.dumps({"r_content": ver_str, "s_version": proj_ver_str})
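To see why this works even though ver_str contains real newline characters, print the result; json.dumps escapes them for you (a sketch using the two variables from the question, with the expected output shown as a comment):
import json

ver_str = 'Package ID: version_1234\nBuild\nnumber: 154\nBuilt\n'
proj_ver_str = 'Version_123'

d = json.dumps({"r_content": ver_str, "s_version": proj_ver_str})
print(d)
# {"r_content": "Package ID: version_1234\nBuild\nnumber: 154\nBuilt\n", "s_version": "Version_123"}
json.loads(d) will give you the dict back if you need to round-trip it.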
I am trying to retrieve an XML file from my MongoDB with mongofiles, and I get a JSON parsing error. Here is an excerpt from my terminal:
$ mongofiles -d anhalytics get_id 'ObjectId("5e7f56d30800611b17fc66b1")'
2020-09-15T16:55:33.205+0200 connected to: mongodb://localhost/
2020-09-15T16:55:33.205+0200 Failed: error parsing id as Extended JSON: invalid JSON number. Position: 18
I am using MongoDB server version 4.2.9.
Here is the record of the target file:
{
"_id" : ObjectId("5e7f56d30800611b17fc66b1"),
"filename" : "5e7f56d30800611b17fc66b0.tei.xml",
"aliases" : null,
"chunkSize" : NumberLong(261120),
"uploadDate" : ISODate("2020-03-28T13:53:23.708Z"),
"length" : NumberLong(35405),
"contentType" : null,
"md5" : "eeafae907c44b207071ccb6036148808"
}
Any idea why I am getting this error? Thanks!
The message error parsing id as Extended JSON indicates that the mongofiles tool had trouble parsing the id string that was provided on the command line.
That is done in parseOrCreateId function here: https://github.com/mongodb/mongo-tools/blob/master/mongofiles/mongofiles.go#L330
That function wraps the value from the command line in another string like {"_id":"%s"}, so the value actually passed to the bson.UnmarshalExtJSON function would have been
"{\"_id\":\"ObjectId(\"5e7f56d30800611b17fc66b1\")\"}"
Position 18 of that string, as called out in the error message, is the quotation mark immediately preceding the hex string.
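If that is what is happening in your case, one thing worth trying is to pass the id as canonical Extended JSON rather than the shell-style ObjectId(...) syntax, along these lines (the {"$oid": ...} form is the standard Extended JSON representation of an ObjectId):
mongofiles -d anhalytics get_id '{"$oid": "5e7f56d30800611b17fc66b1"}'
Alternatively, mongofiles get with the filename shown in the record avoids parsing the id entirely.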
I am trying to integrate Spring REST Docs with a Grails 3.2.9 application, using REST Assured. During integration testing it produces a PayloadHandlingException.
Full stack trace:
org.springframework.restdocs.payload.PayloadHandlingException: com.fasterxml.jackson.core.JsonParseException: Unexpected character ('<' (code 60)): expected a valid value (number, String, array, object, 'true', 'false' or 'null')
at [Source: [B#793cd42d; line: 1, column: 2]
at org.springframework.restdocs.payload.JsonContentHandler.readContent(JsonContentHandler.java:84)
at org.springframework.restdocs.payload.JsonContentHandler.findMissingFields(JsonContentHandler.java:50)
at org.springframework.restdocs.payload.AbstractFieldsSnippet.validateFieldDocumentation(AbstractFieldsSnippet.java:113)
at org.springframework.restdocs.payload.AbstractFieldsSnippet.createModel(AbstractFieldsSnippet.java:74)
at org.springframework.restdocs.snippet.TemplatedSnippet.document(TemplatedSnippet.java:64)
at org.springframework.restdocs.generate.RestDocumentationGenerator.handle(RestDocumentationGenerator.java:192)
at org.springframework.restdocs.restassured.RestDocumentationFilter.filter(RestDocumentationFilter.java:63)
at com.jayway.restassured.internal.filter.FilterContextImpl.next(FilterContextImpl.groovy:73)
at org.springframework.restdocs.restassured.RestAssuredRestDocumentationConfigurer.filter(RestAssuredRestDocumentationConfigurer.java:65)
at com.jayway.restassured.internal.filter.FilterContextImpl.next(FilterContextImpl.groovy:73)
at com.jayway.restassured.internal.RequestSpecificationImpl.applyPathParamsAndSendRequest(RequestSpecificationImpl.groovy:1574)
at com.jayway.restassured.internal.RequestSpecificationImpl.get(RequestSpecificationImpl.groovy:159)
at so.con.BaseRestDocTestSetup.$tt__createDoc(BaseRestDocTestSetup.groovy:130)
at so.con.BaseRestDocTestSetup.createDoc_closure4(BaseRestDocTestSetup.groovy)
at groovy.lang.Closure.call(Closure.java:414)
at groovy.lang.Closure.call(Closure.java:430)
at grails.transaction.GrailsTransactionTemplate$1.doInTransaction(GrailsTransactionTemplate.groovy:70)
at org.springframework.transaction.support.TransactionTemplate.execute(TransactionTemplate.java:133)
at grails.transaction.GrailsTransactionTemplate.executeAndRollback(GrailsTransactionTemplate.groovy:67)
at so.con.v1.docs.ConfApiDocSpec.test and document get request for /conf getting conf-list as Response(ConfApiDocSpec.groovy:63)
Caused by: com.fasterxml.jackson.core.JsonParseException: Unexpected character ('<' (code 60)): expected a valid value (number, String, array, object, 'true', 'false' or 'null')
at [Source: [B#793cd42d; line: 1, column: 2]
at com.fasterxml.jackson.core.JsonParser._constructError(JsonParser.java:1702)
at com.fasterxml.jackson.core.base.ParserMinimalBase._reportError(ParserMinimalBase.java:558)
at com.fasterxml.jackson.core.base.ParserMinimalBase._reportUnexpectedChar(ParserMinimalBase.java:456)
at com.fasterxml.jackson.core.json.UTF8StreamJsonParser._handleUnexpectedValue(UTF8StreamJsonParser.java:2689)
at com.fasterxml.jackson.core.json.UTF8StreamJsonParser._nextTokenNotInObject(UTF8StreamJsonParser.java:878)
at com.fasterxml.jackson.core.json.UTF8StreamJsonParser.nextToken(UTF8StreamJsonParser.java:772)
at com.fasterxml.jackson.databind.ObjectMapper._initForReading(ObjectMapper.java:3834)
at com.fasterxml.jackson.databind.ObjectMapper._readMapAndClose(ObjectMapper.java:3783)
at com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2929)
at org.springframework.restdocs.payload.JsonContentHandler.readContent(JsonContentHandler.java:81)
... 19 more
Please help me resolve this.
You are trying to parse XML (or HTML) with a JSON decoder. Set the Accept header on your request to tell the server which format you expect (for example application/json).
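For example, with REST Assured the request in the documentation test could declare the expected format roughly like this (a sketch only; the exact specification and filters depend on your BaseRestDocTestSetup):
given()
    .header("Accept", "application/json")   // ask the server for JSON instead of XML/HTML
.when()
    .get("/conf")
.then()
    .statusCode(200)
If the endpoint cannot produce JSON at all, the REST Docs payload snippets will keep failing, so verify the controller's response format as well.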
I am running:
select *
from table
and I am getting this error:
Failed with exception java.io.IOException:org.apache.hadoop.hive.serde2.SerDeException: org.codehaus.jackson.JsonParseException: Unexpected character ('O' (code 79)): expected a valid value (number, String, array, object, 'true', 'false' or 'null')
at [Source: java.io.StringReader#570e45; line: 1, column: 2]
The tweets are also in a language that I don't know. Is it necessary for the tweet files to have a .json extension?
You need to change the Flume configuration file. I faced the same issue and resolved it by adding the properties below to twitter.conf.
TwitterAgent.sources.Twitter.maxBatchSize = 50000
TwitterAgent.sources.Twitter.maxBatchDurationMillis = 100000