Retrieving a file with mongofiles leads to a JSON error

I am trying to retrieve an XML file from my MongoDB with mongofiles, and I get a JSON parsing error. Here is an excerpt from my terminal:
$ mongofiles -d anhalytics get_id 'ObjectId("5e7f56d30800611b17fc66b1")'
2020-09-15T16:55:33.205+0200 connected to: mongodb://localhost/
2020-09-15T16:55:33.205+0200 Failed: error parsing id as Extended JSON: invalid JSON number. Position: 18
I am using MongoDB server version 4.2.9.
Here is the record for the target file:
{
    "_id" : ObjectId("5e7f56d30800611b17fc66b1"),
    "filename" : "5e7f56d30800611b17fc66b0.tei.xml",
    "aliases" : null,
    "chunkSize" : NumberLong(261120),
    "uploadDate" : ISODate("2020-03-28T13:53:23.708Z"),
    "length" : NumberLong(35405),
    "contentType" : null,
    "md5" : "eeafae907c44b207071ccb6036148808"
}
Any idea why I am getting this error? Thanks!

The message error parsing id as Extended JSON indicates that the mongofiles tool had trouble parsing the id string that was provided on the command line.
That is done in the parseOrCreateId function here: https://github.com/mongodb/mongo-tools/blob/master/mongofiles/mongofiles.go#L330
That function wraps the value from the command line in another string like {"_id":"%s"}, so the value actually passed to the bson.UnmarshalExtJSON function would have been
"{\"_id\":\"ObjectId(\"5e7f56d30800611b17fc66b1\")\"}"
Position 18 of that string, as called out in the error message, is the quotation mark immediately preceding the hex string. That quote terminates the JSON string value early, so the parser then tries to read 5e7f56d30800611b17fc66b1 as a bare value: 5e7 lexes as a number in scientific notation, but the f that follows does not, hence the "invalid JSON number" error.
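Assuming the id therefore has to be valid Extended JSON on its own, the canonical {"$oid": ...} form that the mongofiles documentation uses in its own get_id examples should parse cleanly (worth verifying against your tools version):
$ mongofiles -d anhalytics get_id '{"$oid": "5e7f56d30800611b17fc66b1"}'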

Related

csv2ofx: error: argument -r/--first-row: invalid int value

I need to convert a CSV file to an OFX file after installing the csv2ofx library in a virtual environment. This is some of my header data in the CSV:
Postingdate; valuedate; Textkey; Primanota; payee; payeeaccount; payeeIBAN; payeebank code; payeeBIC; transaction; customerreference; currency; turnover; debitcredit
30.12.2021;31.12.2021;31 Close;905;;0;;;;Debit statement from December 29, 2021;;EUR;20,15;S
I want to convert this into an OFX file, and specifically I want to modify the mapping on the command line using the header fields that I have in my CSV file. Here is the command line:
csv2ofx file_data_2021.csv file_data_2021.ofx -r Postingdate valuedate Textkey Primanota payee payeeaccount payeeIBAN payeebank code payeeBIC transaction customerreference currency turnover debitcredit
An error is displayed:
usage: csv2ofx [options] <source> <dest>
csv2ofx: error: argument -r/--first-row: invalid int value: 'Postingdate'
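The usage line shows that -r/--first-row expects a single integer (the row at which the data starts), not a list of column names. Remapping custom headers is done in csv2ofx through a mapping module instead; below is a rough Python sketch in the style of the mappings the project ships, where the mapping keys, the itemgetter fields, and the way the file is hooked up (e.g. via the -m option) are assumptions to verify against the csv2ofx README:
# custom_mapping.py - hypothetical mapping for the CSV above
from operator import itemgetter

mapping = {
    "has_header": True,
    "account": "checking",  # assumed static account name
    "date": itemgetter("Postingdate"),
    "amount": itemgetter("turnover"),
    "payee": itemgetter("payee"),
    "desc": itemgetter("transaction"),
}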

Promtail: how to trim the non-JSON part of a log

I have a multiline log that consists of a correct JSON part (one or more lines) followed by a stack trace.
Is it possible to parse the first part of the log as JSON, and to put all the lines after it into a new label ("stackTrace", for example)?
Unfortunately, the JSON part can contain a varying number of fields, so it is unlikely that a regex can parse it.
{ "timestamp" : "2022-03-28 14:33:00,000", "logger" : "appLog", "level" : "ERROR", "thread" : "ktor-8080", "url" : "/path","method" : "POST","httpStatusCode" : 400,"callId" : "f7a22bfb1466","errorMessage" : "Unexpected JSON token at offset 184: Encountered an unknown key 'a'. Use 'ignoreUnknownKeys = true' in 'Json {}' builder to ignore unknown keys. JSON input: { \"entityId\" : \"TGT-8c8d950036bf\", \"processCode\" : \"test\", \"tokenType\" : \"SSO_CCOM\", \"ttlMills\" : 600000, \"a\" : \"a\" }" }
com.example.info.core.WebApplicationException: Unexpected JSON token at offset 184: Encountered an unknown key 'a'.
Use 'ignoreUnknownKeys = true' in 'Json {}' builder to ignore unknown keys.
JSON input: {
"entityId" : "TGT-8c8d950036bf",
"processCode" : "test",
"tokenType" : "SSO_CCOM",
"ttlMills" : 600000,
"a" : "a"
}
at com.example.info.signtoken.SignTokenApi$signTokenModule$2$1$1.invokeSuspend(SignTokenApi.kt:94)
at com.example.info.signtoken.SignTokenApi$signTokenModule$2$1$1.invoke(SignTokenApi.kt)
at com.example.info.signtoken.SignTokenApi$signTokenModule$2$1$1.invoke(SignTokenApi.kt)
at io.ktor.util.pipeline.SuspendFunctionGun.loop(SuspendFunctionGun.kt:248)
at io.ktor.util.pipeline.SuspendFunctionGun.proceed(SuspendFunctionGun.kt:116)
at io.ktor.util.pipeline.SuspendFunctionGun.execute(SuspendFunctionGun.kt:136)
at io.ktor.util.pipeline.Pipeline.execute(Pipeline.kt:78)
at io.ktor.routing.Routing.executeResult(Routing.kt:155)
at io.ktor.routing.Routing.interceptor(Routing.kt:39)
at io.ktor.routing.Routing$Feature$install$1.invokeSuspend(Routing.kt:107)
at io.ktor.routing.Routing$Feature$install$1.invoke(Routing.kt)
at io.ktor.routing.Routing$Feature$install$1.invoke(Routing.kt)
UPDATE.
I've set up a Promtail pipeline like so:
scrape_configs:
  - job_name: Test_AppLog
    static_configs:
      - targets:
          - ${HOSTNAME}
        labels:
          job: INFO-Test_AppLog
          host: ${HOSTNAME}
          __path__: /home/adm_web/app.log
    pipeline_stages:
      - multiline:
          firstline: ^\{\s?\"timestamp\"
          max_lines: 128
          max_wait_time: 1s
      - match:
          selector: '{job="INFO-Test_AppLog"}'
          stages:
            - regex:
                expression: '(?P<log>^\{ ?\"timestamp\".*\}[\s])(?s)(?P<stacktrace>.*)'
            - labels:
                log:
                stacktrace:
            - json:
                expressions:
                  logger: logger
                  url: url
                  method: method
                  statusCode: httpStatusCode
                  sla: sla
                source: log
But the json block does not in fact work; the result in Grafana is only the two fields, log and stacktrace.
Any help would be appreciated.
If the format is always like this, maybe the easiest way is to analyze the whole log string, find the index of the "}" that closes the JSON part, and split the string at that index + 1; the JSON should end up in the first part of the output.
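A minimal Python sketch of that splitting idea. Since the JSON part of the example itself contains "}" inside string values (and the stack trace repeats the JSON input), searching for a brace directly is fragile; json.JSONDecoder().raw_decode parses exactly one JSON value from the start of the string and reports the offset where it ends, which gives a robust split point:
import json

def split_json_and_trace(entry):
    # raw_decode parses a single JSON value starting at offset 0 and
    # returns (value, end_offset); everything past end_offset is the
    # stack trace (possibly empty). It raises a ValueError if the
    # entry does not start with valid JSON.
    _, end = json.JSONDecoder().raw_decode(entry)
    return entry[:end], entry[end:].lstrip("\n")

entry = '{"level": "ERROR", "msg": "boom"}\ncom.example.Oops: boom\n\tat a.b(C.kt:1)'
json_part, stack_trace = split_json_and_trace(entry)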

Parse JSON with missing fields using cjson Lua module in Openresty

I am trying to parse a JSON payload sent via a POST request to an NGINX/OpenResty location. To do so, I combined OpenResty's content_by_lua_block with its cjson module like this:
# other locations above
location /test {
    content_by_lua_block {
        ngx.req.read_body()
        local data_string = ngx.req.get_body_data()
        local cjson = require "cjson.safe"
        local json = cjson.decode(data_string)
        local endpoint_name = json['endpoint']['name']
        local payload = json['payload']
        local source_address = json['source_address']
        local submit_date = json['submit_date']
        ngx.say('Parsed')
    }
}
Parsing sample data containing all required fields works as expected. A correct JSON object could look like this:
{
    "payload": "the payload here",
    "submit_date": "2018-08-17 16:31:51",
    "endpoint": {
        "name": "name of the endpoint here"
    },
    "source_address": "source address here"
}
However, a user might POST a differently formatted JSON object to the location. Assume a simple JSON document like
{
    "username": "JohnDoe",
    "password": "password123"
}
not containing the desired fields/keys.
According to the cjson module docs, using cjson (without its safe mode) will raise an error if invalid data is encountered. To prevent any errors being raised, I decided to use its safe mode by importing cjson.safe. This should return nil for invalid data and provide the error message instead of raising the error:
The cjson module will throw an error during JSON conversion if any invalid data is encountered. [...]
The cjson.safe module behaves identically to the cjson module, except when errors are encountered during JSON conversion. On error, the cjson_safe.encode and cjson_safe.decode functions will return nil followed by the error message.
However, I do not encounter any different error handling behavior in my case and the following traceback is shown in Openresty's error.log file:
2021/04/30 20:33:16 [error] 6176#6176: *176 lua entry thread aborted: runtime error: content_by_lua(samplesite:50):16: attempt to index field 'endpoint' (a nil value)
Which in turn results in an Internal Server Error:
<html>
<head><title>500 Internal Server Error</title></head>
<body>
<center><h1>500 Internal Server Error</h1></center>
<hr><center>openresty</center>
</body>
</html>
I think a workaround might be writing a dedicated function for parsing the JSON data and calling it with pcall() to catch any errors. However, this would make the safe mode kind of useless. What am I missing here?
Your “simple JSON document” is a valid JSON document. The error you are facing is not related to cjson, it's a standard Lua error:
resty -e 'local t = {foo = 1}; print(t["foo"]); print(t["foo"]["bar"])'
1
ERROR: (command line -e):1: attempt to index field 'foo' (a number value)
stack traceback:
...
The “safeness” of cjson.safe concerns the parsing of malformed documents:
The cjson module raises an error:
resty -e 'print(require("cjson").decode("[1, 2, 3"))'
ERROR: (command line -e):1: Expected comma or array end but found T_END at character 9
stack traceback:
...
cjson.safe returns nil and an error message:
resty -e 'print(require("cjson.safe").decode("[1, 2, 3"))'
nil    Expected comma or array end but found T_END at character 9
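So a successful decode still needs nil checks before chained indexing. In the same resty style, a sketch with a hypothetical payload that lacks the endpoint key:
resty -e 'local json = require("cjson.safe").decode("{\"username\":\"JohnDoe\"}"); print(json and json.endpoint and json.endpoint.name)'
nil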

JMeter: Read data (numbers) from a CSV file

I'm trying to design a load test for some APIs, and I need to read the data from a CSV file. Here is my CSV file:
amount,description
100,"100 Added"
-150,"-150 removed"
20, "20 added"
The amount is a number.
My CSV Data Set Config looks like this:
[screenshot of the CSV Data Set Config]
When I put the body data like this:
[screenshot of the request body]
I get this error:
{"timestamp":1529427563867,"status":400,"error":"Bad Request","message":"JSON parse error: Unrecognized token 'amount': was expecting ('true', 'false' or 'null'); nested exception is com.fasterxml.jackson.core.JsonParseException: Unrecognized token 'amount': was expecting ('true', 'false' or 'null')\n
and when I change it to this:
[screenshot of the modified request body]
I get this error:
"timestamp":1529427739395,"status":400,"error":"Bad Request","message":"JSON parse error: Cannot deserialize value of type `java.math.BigDecimal` from String \"amount\": not a valid representation; nested exception is com.fasterxml.jackson.databind.exc.InvalidFormatException: Cannot deserialize value of type `java.math.BigDecimal` from String \"amount\": not a valid representation\n at [Source: (PushbackInputStream); line: 2, column: 13]
How should I pass the parameters so that they are read from the CSV file?
P.S. It works fine when I do everything without the CSV.
In your config, you have "Ignore first line" set to False.
Options:
1: Set "Ignore First Line" to True, for source files with header data included.
2: Leave the config setting as is, but remove the headers from the CSV source file.
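With the header handled either way, the request body can then reference the CSV columns through the variables defined in the CSV Data Set Config; the variable names amount and description below are assumptions matching the header row above:
{"amount": ${amount}, "description": "${description}"}
The first error above is what happens when the header line itself is read as a data row: the literal text amount gets substituted into the JSON body, which Jackson rejects as an unrecognized token.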

Error in fromJSON(commandArgs(1)) : unexpected character '''

I am calling an R script from Scala using scala.sys.process. This script takes a command-line argument in JSON format and processes it. I am using rjson's fromJSON function to store the JSON in a list.
This works fine when I execute the R script from the command line:
$ ./dfChargerFlink.R '{"Id":"1","value":"ABC"}'
But when I call it from scala, I get the following error:
Error in fromJSON(commandArgs(1)) : unexpected character '''
Execution halted
This is the code I am using:
val shellCommand = "./dfChargerFlink.R '"+arg+"'"
return shellCommand !!
where arg is the JSON string.
Notice that I have wrapped the JSON string in single quotes on both sides; if I don't, I get this error:
Error in fromJSON(commandArgs(1)) : unclosed string
Execution halted
How can this be solved? Is it some bug?
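A hedged diagnosis, since the behavior is easy to verify: scala.sys.process does not pass a command String through a shell; it merely tokenizes it, so the surrounding single quotes are never stripped and reach R as literal characters, which matches the unexpected character ''' error. Building the command as a Seq hands the JSON to the script as one argument with no quoting needed; a minimal sketch:
import scala.sys.process._

val arg = """{"Id":"1","value":"ABC"}"""
// Each Seq element becomes one argv entry, so no shell-style quoting is required.
val output = Seq("./dfChargerFlink.R", arg).!!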