I'm trying to produce a Kafka message with the kcat tool. I have a message that follows the topic schema:
{
"meta": {
"correlationId": "244c7ed3-0472-496e-bed9-7071f8ddb921",
"payload": {
"id": "3ae843e9-0c96-4ee8-b1f0-fbdfef30e9dd",
"timestamp": 1658846522
}
}
}
I store it in the file /tmp/k-msg-test.json and run kcat:
kcat \
-b localhost:9092 \
-r localhost:8081 \
-t test-topic \
-T \
-P \
/tmp/k-msg-test.json
and I get an error:
% Failed to open : No such file or directory
/tmp/k-msg-test.json: line 2: meta:: command not found
/tmp/k-msg-test.json: line 3: correlationId:: command not found
/tmp/k-msg-test.json: line 4: payload:: command not found
/tmp/k-msg-test.json: line 5: id:: command not found
/tmp/k-msg-test.json: line 6: timestamp:: command not found
/tmp/k-msg-test.json: line 8: syntax error near unexpected token `}'
/tmp/k-msg-test.json: line 8: ` }'
So the question is – how can I produce a message?
Your file is apparently being interpreted as literal text by your shell rather than being read by kcat.
You need to use shell redirection or a pipe, since kcat reads messages from stdin rather than taking file arguments.
You'll also need to remove the newlines and produce the message as a single line. You can do that with jq:
jq -rc '.' /tmp/k-msg-test.json | kcat \
-b localhost:9092 \
-r localhost:8081 \
-t test-topic \
-T \
-P
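To see what the jq step does before involving kcat at all, here is a minimal sketch that recreates the message file from the question and compacts it (assumes jq is installed):

```shell
# Recreate the sample message file (same content as in the question)
cat > /tmp/k-msg-test.json <<'EOF'
{
  "meta": {
    "correlationId": "244c7ed3-0472-496e-bed9-7071f8ddb921",
    "payload": {
      "id": "3ae843e9-0c96-4ee8-b1f0-fbdfef30e9dd",
      "timestamp": 1658846522
    }
  }
}
EOF

# -c (--compact-output) collapses the pretty-printed document to a
# single line, which kcat will then read from stdin as one message
jq -c '.' /tmp/k-msg-test.json
```

The single-line output is what gets piped into kcat's producer mode.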
I extracted data with the below cbq command, which was successful:
cbq -u Administrator -p Administrator -e "http://localhost:8093" --script="SELECT * FROM \`sample\` WHERE customer.id='12345'" -q | jq '.results' > temp.json
However, when I try to import the same data in JSON format into the target cluster using the below command, I get an error:
cbimport json -c http://{target-cluster}:8091 -u Administrator -p Administrator -b sample -d file://C:\Users\{myusername}\Desktop\temp.json -f list -g %docId%
JSON import failed: 0 documents were imported, 0 documents failed to be imported
JSON import failed: input json is invalid: ReadArray: expect [ or , or ] or n, but found {, error found in #1 byte of ...|{
"requ|..., bigger context ...|{
"requestID": "2fc34542-4387-4643-8ae3-914e316|...],```
```{
"requestID": "6ef38b8a-8e70-4c3d-b3b4-b73518a09c62",
"signature": {
"*": "*"
},
"results": [
{
"{Bucket-name}":{my-data}
"status": "success",
"metrics": {
"elapsedTime": "4.517031ms",
"executionTime": "4.365976ms",
"resultCount": 1,
"resultSize": 24926
}
It looks like the file extracted by the cbq command has control fields such as requestID, metrics, status, etc., and the JSON is pretty-printed. If I manually remove all fields except {my-data}, put the result in a JSON file, and un-pretty-print it, then it works. But I want to automate this in a single run. Is there a way to do it with the cbq command?
I can't find any other utility, or a way to use a where condition with cbexport, to do this on Couchbase; documents exported with cbexport can be imported easily with cbimport.
For the cbq command, you can use the --quiet option to disable the startup connection messages and the --pretty=false to disable pretty-print. Then, to extract just the documents in cbimport json lines format, I used jq.
This worked for me -- selecting documents from travel-sample._default._default (for the jq filter, where I have _default, you would put the Bucket-name, based on your example):
cbq --quiet --pretty=false -u Administrator -p password --script='select * from `travel-sample`._default._default' | jq --compact-output '.results|.[]|._default' > docs.json
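To see what that jq filter produces, here is a sketch using a mocked-up cbq response (the file name and document contents are invented for illustration):

```shell
# Mocked-up cbq response: documents wrapped in the requestID/metrics envelope
cat > /tmp/cbq-out.json <<'EOF'
{
  "requestID": "6ef38b8a-8e70-4c3d-b3b4-b73518a09c62",
  "results": [
    { "_default": { "id": "doc1", "type": "airline" } },
    { "_default": { "id": "doc2", "type": "airport" } }
  ],
  "status": "success"
}
EOF

# Unwrap the envelope: one compact document per line, which is
# exactly the "lines" format that cbimport expects
jq --compact-output '.results|.[]|._default' /tmp/cbq-out.json > /tmp/docs.json
cat /tmp/docs.json
```

Each line of docs.json is a bare document, with requestID, status, and metrics stripped away.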
Then, importing into test-bucket1:
cbimport json -c localhost -u Administrator -p password -b test-bucket1 -d file://./docs.json -f lines -g %type%_%id%
cbq documentation: https://docs.couchbase.com/server/current/tools/cbq-shell.html
cbimport documentation: https://docs.couchbase.com/server/current/tools/cbimport-json.html
jq documentation: https://stedolan.github.io/jq/manual/#Basicfilters
$ curl -k -i -X POST -d '{ "path" : "/" }' https://abc#example.com:qwer##53096wrxgcg.ibmaspera.com:33001/files/workspaces/42796/all/12345:2
results in:
curl: (3) Port number ended with 'I'
Without #:
$ curl -k -i -X POST -d '{ "path" : "/" }' https://abcexample.com:qwer#53096wrxgcg.ibmaspera.com:33001/files/workspaces/42796/all/12345:2
results in:
curl: (7) Failed to connect to 53096wrxgcg.ibmaspera.com port 33001: Connection timed out
How to specify # in username and password for curl command?
If the auth username or password contains a reserved character such as # (or @), it must be percent-encoded: # becomes %23 and @ becomes %40. The userinfo also has to be separated from the host with a literal @.
Your curl command could look like either of these examples:
curl -k -i -X POST -d '{ "path" : "/" }' \
https://abc%23example.com:qwer%23@53096wrxgcg.ibmaspera.com:33001/files/workspaces/42796/all/12345:2
or
curl --user 'abc#example.com:qwer#' -k -i ...
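If you'd rather not encode by hand, jq's @uri filter applies the percent-encoding for you. A sketch, assuming jq is installed (the credentials are the made-up ones from the question):

```shell
# @uri percent-encodes reserved URI characters,
# e.g. '#' becomes %23 and '@' becomes %40
user=$(printf '%s' 'abc#example.com' | jq -sRr '@uri')
pass=$(printf '%s' 'qwer#' | jq -sRr '@uri')
echo "$user:$pass"

# then use the encoded values in the URL, e.g.:
# curl -k -i "https://$user:$pass@53096wrxgcg.ibmaspera.com:33001/..."
```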
I'm trying to push data to an API, for which I get the below error:
("Request body is not unjson()-able: %s" % body)
DirectException: Request body is not unjson()-able: '{"action":"EventsRouter","method":"add_event","data":[{"summary":"mes","device":"hos","message":"message","component":"sev","severity":"ap","evclasskey":"nxlog","evclass":"nxlog","monitor":"localhost"}],"type":"rpc","tid":2}'
current bash file:
#!/bin/bash
message=$1
hostname=$2
appname=$3
severity=$4
data='{"action":"EventsRouter","method":"add_event","data":[{"summary":"'$message'","device":"'$hostname'","message":"message","component":"'$appname'","severity":"'$severity'","evclasskey":"nxlog","evclass":"nxlog","monitor":"localhost"}],"type":"rpc","tid":2}'
echo "Total number of args : $#"
echo "message = $message"
echo "hostname = $hostname"
echo "appname = $appname"
echo "data = $data"
curl -u "Eve:eve#123" -k "https://myurl.com/zport/dmd/evconsole_router" -d "'$data'" -H "Content-Type:application/json"
First, what is the reason and how can I correct this? I am only sending simple values while testing:
sh tcp.sh mes hos sev ap
However, when I curl the data directly, it works:
curl -u Eve:eve#123 -k https://myurl.com/zport/dmd/evconsole_router -d '{"action":"EventsRouter", "method":"add_event","data":[{"summary":"test","device":"test","message":"msg","component":"testhost","severity":"5", "evclasskey":"nxlog", "evclass":"/nxlog/perf","monitor":"localhost"}],"type":"rpc","tid":2}' -H "Content-Type:application/json"
{"uuid": "1654584c-5f86-489e-a5e7-35d45e462066", "action": "EventsRouter", "result": {"msg": "Created event", "success": true},"tid": 2, "type": "rpc", "method": "add_event"}
Secondly, this is about to be automated to read multiple log messages. If a log message has special characters such as ', ", \, /, etc., will it affect my bash inputs?
If yes, how do I handle them?
Your -d "'$data'" wraps the JSON body in literal single quotes, which is why the server rejects it as not JSON-parseable. Can you please try this syntax and check the execution:
data=$(cat <<EOF
{"action":"EventsRouter","method":"add_event","data":[{"summary":"$message","device":"$hostname","message":"message","component":"$appname","severity":"$severity","evclasskey":"nxlog","evclass":"nxlog","monitor":"localhost"}],"type":"rpc","tid":2}
EOF
)
Then the curl command, passing $data without the extra quotes:
curl -u "Eve:eve#123" -k "https://myurl.com/zport/dmd/evconsole_router" -d "$data" -H "Content-Type:application/json"
What is the right format for the query argument of the mongoexport utility?
While running the following command in the command line:
mongoexport -h localhost:27000 -d dbName -c collName -q "{'time': { $gt: new Date('2014-01-28T12:00:00Z')}}" -o output.js
I'm getting the following error:
connected to: localhost:27000 assertion: 16619 code FailedToParse:
FailedToParse: Expecting '}' or ',': offset:37
Reading the Mongo Export query arg and JSONDocument docs hasn't helped me understand the expected format of the query argument.
Running the same query in mongo shell succeeds.
If:
>new Date ("2014-01-28T12:00:00Z").getTime()
1390910400000
You will have to construct your query as follows:
-q "{sendToServerTime: {\$gt: {\$date : 1390910400000}}}"
The problem is your new Date() command. This is not valid JSON. Try this:
mongoexport -h localhost:27000 -d DeploymentJan01 -c sensorsData -q '{sendToServerTime: { $gt: "2014-01-28T12:00:00Z"}}' -o output.js