Parse JSON to LLD from custom parameter (agent-active) - Zabbix

Hello, I have the following setup.
In the agent configuration I added:
UserParameter=sip.register,/usr/bin/tbstatus -D --lvl 0 -x "/sip_registration/domain/user/*" --json
and it returns pretty-printed JSON to the Zabbix server:
2022-04-28 13:25:24
{
  "/sip_registration/domain:SIP/user:jtlc31337/contact:1_jtlc31337_95.213.198.99_5060_1" : {
    "binding_state" : "Active",
    "state_time_left_sec" : 59,
    "contact_nap" : "JT_4033_TIP_LIVECALL",
    "expires_from_ua" : 120,
    "expires_to_registrar" : 3600,
    "expires_from_registrar" : 900,
    "expires_to_ua" : 59,
    "creation_time" : "2022/04/27 10:54:21",
    "last_registration_time" : "2022/04/28 13:24:13",
    "contact_remap" : "jtlc31337",
    "packet_source_struct" : {
      "source_ip" : "95.213.198.99",
      "source_port" : 5060,
      "destination_ip" : "8.7.6.2",
      "destination_port" : 5060,
      "transport" : "UDP"
    }
  }
}
I would like to have an LLD rule and a template with a discovery rule that exposes:
"binding_state"
"source_ip"
Any hints on how to create the discovery rules?
Thanks
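Not an authoritative answer, but the transformation a master-item preprocessing step would have to perform can be sketched outside Zabbix. The Python below is a minimal illustration only: the {#...} macro names are my own invention, and it assumes the item keys always follow the /sip_registration/domain:.../user:.../contact:... pattern seen above.

```python
import json

# Sample trimmed from the tbstatus output above.
raw = '''
{
  "/sip_registration/domain:SIP/user:jtlc31337/contact:1_jtlc31337_95.213.198.99_5060_1": {
    "binding_state": "Active",
    "packet_source_struct": { "source_ip": "95.213.198.99" }
  }
}
'''

def to_lld(doc):
    """Turn each registration entry into one LLD row (macro names invented here)."""
    rows = []
    for key, reg in doc.items():
        # "/sip_registration/domain:SIP/user:xxx/contact:yyy" -> {"domain": "SIP", ...}
        parts = dict(p.split(":", 1) for p in key.strip("/").split("/") if ":" in p)
        rows.append({
            "{#DOMAIN}": parts.get("domain", ""),
            "{#USER}": parts.get("user", ""),
            "{#CONTACT}": parts.get("contact", ""),
            "{#BINDING_STATE}": reg.get("binding_state", ""),
            "{#SOURCE_IP}": reg.get("packet_source_struct", {}).get("source_ip", ""),
        })
    return json.dumps(rows)

print(to_lld(json.loads(raw)))
```

In Zabbix itself the same logic would live in a JavaScript preprocessing step on a dependent discovery item, with item prototypes then referencing the macros.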

Related

Jmeter: Extracting JSON response with special/spaces characters

Hello, can someone help me extract the value of the user parameter, which is "testuser1"?
I tried the JSONPath expression $..data and was able to extract the entire response, but I was unable to extract the user parameter. Thanks in advance.
{
"data": "{ "took" : 13, "timed_out" : false, "_shards" : { "total" : 5, "successful" : 5, "skipped" : 0, "failed" : 0 }, "hits" : { "total" : 1, "max_score" : 1.0, "hits" : [ { "_index" : "bushidodb_history_network_eval_ea9656ef-0a9b-474b-8026-2f83e2eb9df1_2021-april-10", "_type" : "network", "_id" : "6e2e58be-0ccf-3fb4-8239-1d4f2af322e21618059082000", "_score" : 1.0, "_source" : { "misMatches" : [ "protocol", "state", "command" ], "instance" : "e3032804-4b6d-3735-ac22-c827950395b4|0.0.0.0|10.179.155.155|53|UDP", "protocol" : "UDP", "localAddress" : "0.0.0.0", "localPort" : "12345", "foreignAddress" : "10.179.155.155", "foreignPort" : "53", "command" : "ping yahoo.com ", "user" : "testuser1", "pid" : "10060", "state" : "OUTGOINGFQ", "rate" : 216.0, "originalLocalAddress" : "192.168.100.229", "exe" : "/bin/ping", "md5" : "f9ad63ce8592af407a7be43b7d5de075", "dir" : "", "agentId" : "abcd-dcd123", "year" : "2021", "month" : "APRIL", "day" : "10", "hour" : "12", "time" : "1618059082000", "isMerged" : false, "timestamp" : "Apr 10, 2021 12:51:22 PM", "metricKey" : "6e2e58be-0ccf-3fb4-8239-1d4f2af322e2", "isCompliant" : false }, "sort" : [ 1618059082000 ] } ] }, "aggregations" : { "count_over_time" : { "buckets" : [ { "key_as_string" : "2021-04-10T08:00:00.000-0400", "key" : 1618056000000, "doc_count" : 1 } ] } }}",
"success": true,
"message": {
"code": "S",
"message": "Get Eval results Count Success"
}
}
Actual response (posted as a screenshot):
What you posted doesn't look like valid JSON to me.
If in reality you're getting what's in your image, to wit:
{
"data": "{ \"took\" : 13, \"timed_out\" : false, \"_shards\" : { \"total\" : 5, \"successful\" : 5, \"skipped\" : 0, \"failed\" : 0 }, \"hits\" : { \"total\" : 1, \"max_score\" : 1.0, \"hits\" : [ { \"_index\" : \"bushidodb_history_network_eval_ea9656ef-0a9b-474b-8026-2f83e2eb9df1_2021-april-10\", \"_type\" : \"network\", \"_id\" : \"6e2e58be-0ccf-3fb4-8239-1d4f2af322e21618059082000\", \"_score\" : 1.0, \"_source\" : { \"misMatches\" : [ \"protocol\", \"state\", \"command\" ], \"instance\" : \"e3032804-4b6d-3735-ac22-c827950395b4|0.0.0.0|10.179.155.155|53|UDP\", \"protocol\" : \"UDP\", \"localAddress\" : \"0.0.0.0\", \"localPort\" : \"12345\", \"foreignAddress\" : \"10.179.155.155\", \"foreignPort\" : \"53\", \"command\" : \"pingyahoo.com\", \"user\" : \"testuser1\", \"pid\" : \"10060\", \"state\" : \"OUTGOINGFQ\", \"rate\" : 216.0, \"originalLocalAddress\" : \"192.168.100.229\", \"exe\" : \"/bin/ping\", \"md5\" : \"f9ad63ce8592af407a7be43b7d5de075\", \"dir\" : \"\", \"agentId\" : \"abcd-dcd123\", \"year\" : \"2021\", \"month\" : \"APRIL\", \"day\" : \"10\", \"hour\" : \"12\", \"time\" : \"1618059082000\", \"isMerged\" : false, \"timestamp\" : \"Apr10, 202112: 51: 22PM\", \"metricKey\" : \"6e2e58be-0ccf-3fb4-8239-1d4f2af322e2\", \"isCompliant\" : false }, \"sort\" : [ 1618059082000 ] } ] }, \"aggregations\" : { \"count_over_time\" : { \"buckets\" : [ { \"key_as_string\" : \"2021-04-10T08: 00: 00.000-0400\", \"key\" : 1618056000000, \"doc_count\" : 1 } ] } }}",
"success": true,
"message": {
"code": "S",
"message": "Get Eval results Count Success"
}
}
the easiest way is just to use two JSON Extractors:
1. Extract the data attribute value into a JMeter variable from the response.
2. Extract the user attribute value into a JMeter variable from the ${data} JMeter variable.
If the response looks exactly like what you posted, you won't be able to use JSON Extractors and will have to treat it as plain text, so your choice is limited to the Regular Expression Extractor. Example regular expression:
"user"\s*:\s*"(\w+)"
Add a Regular Expression Extractor to the corresponding request and extract the value with the expression below.
Expression: "user" : "(.*?)"
Ref: https://jmeter.apache.org/usermanual/regular_expressions.html
Regular Expression Extractor Sample
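For what it's worth, both suggested expressions can be checked outside JMeter. This small Python sketch runs them against a fragment of the response, treated as plain text (the fragment below is trimmed from the posted body):

```python
import re

# A fragment of the response body, treated as plain text (the inner quotes
# are not escaped, so it is not parseable as JSON anyway).
body = '... "foreignPort" : "53", "command" : "ping yahoo.com ", "user" : "testuser1", "pid" : "10060" ...'

first = re.search(r'"user"\s*:\s*"(\w+)"', body)   # tolerant of surrounding whitespace
second = re.search(r'"user" : "(.*?)"', body)       # matches the literal single-space form

print(first.group(1), second.group(1))
```

Both patterns capture testuser1 here; the `\s*` variant is the safer choice if the server ever changes its whitespace around the colon.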

JSON file to CSV file conversion when my JSON columns are dynamic

I found a solution for the JSON-to-CSV conversion. Below are the sample JSON and the solution.
{
  "took" : 111,
  "timed_out" : false,
  "_shards" : {
    "total" : 1,
    "successful" : 1,
    "skipped" : 0,
    "failed" : 0
  },
  "hits" : {
    "total" : {
      "value" : 2,
      "relation" : "eq"
    },
    "max_score" : 1.0,
    "hits" : [
      {
        "_index" : "alerts",
        "_type" : "_doc",
        "_id" : "1",
        "_score" : 1.0,
        "_source" : {
          "alertID" : "639387c3-0fbe-4c2b-9387-c30fbe7c2bc6",
          "alertCategory" : "Server Alert",
          "description" : "Successfully started.",
          "logId" : null
        }
      },
      {
        "_index" : "alerts",
        "_type" : "_doc",
        "_id" : "2",
        "_score" : 1.0,
        "_source" : {
          "alertID" : "2",
          "alertCategory" : "Server Alert",
          "description" : "Successfully stoped.",
          "logId" : null
        }
      }
    ]
  }
}
The solution:
jq -r '.hits.hits[]._source | [ "alertID", "alertCategory", "description", "logId" ], ([."alertID", ."alertCategory", ."description", ."logId" // "null"]) | @csv' < /root/events.json
The problem with this solution is that I have to hard-code the column names. What if my JSON gets a few additions under the _source tag later? I need a solution that can handle dynamic data under _source. I am open to any other shell tool or command.
Simply use keys_unsorted (or keys if you want them sorted). See e.g. Convert JSON array into CSV using jq or How to convert arbitrary simple JSON to CSV using jq? for two SO examples. There are many others too.
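If jq is not a hard requirement, the same dynamic-column idea is easy to sketch in Python. This is an illustration only: the field list is taken from the first _source object in document order, mirroring what keys_unsorted gives you in jq, and nulls are rendered as the string "null" as in the original command.

```python
import csv
import io
import json

doc = json.loads('''
{
  "hits": { "hits": [
    { "_source": { "alertID": "1", "alertCategory": "Server Alert",
                   "description": "Successfully started.", "logId": null } },
    { "_source": { "alertID": "2", "alertCategory": "Server Alert",
                   "description": "Successfully stopped.", "logId": null } }
  ] }
}
''')

sources = [hit["_source"] for hit in doc["hits"]["hits"]]
fields = list(sources[0])  # insertion order, like jq's keys_unsorted

out = io.StringIO()
writer = csv.DictWriter(out, fieldnames=fields)
writer.writeheader()
for src in sources:
    # Render JSON null as the literal string "null", as the jq command did.
    writer.writerow({k: ("null" if v is None else v) for k, v in src.items()})

print(out.getvalue())
```

Unlike the hard-coded jq command, new keys added under _source later show up as new columns automatically (as long as all hits share the same key set).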

remove escaped strings from json output in ansible

One of my Ansible tasks executes a Mongo command using the shell module, but the output comes back with escaped strings. I've tried some Ansible filters but haven't had any luck.
This is what the Ansible play and output look like:
- name: get ops
  shell:
    cmd: mongo --eval "printjsononeline(db.currentOp())"
  register: currentOp

- debug:
    msg: "{{ currentOp.stdout_lines[2] }}"
current output with escaped strings:
ok: [db1] => {
"msg": "{ \"inprog\" : [ { \"desc\" : \"conn24359\", \"threadId\" : \"139883322984192\", \"connectionId\" : 24359, \"client\" : \"127.0.0.1:51259\", \"active\" : true, \"opid\" : 959820, \"secs_running\" : 0, \"microsecs_running\" : NumberLong(21), \"op\" : \"command\", \"ns\" : \"admin.$cmd\", \"query\" : { \"currentOp\" : 1 }, \"numYields\" : 0, \"locks\" : { }, \"waitingForLock\" : false, \"lockStats\" : { } }, { \"desc\" : \"WT RecordStoreThread: local.oplog.rs\", \"threadId\" : \"139883394451200\", \"active\" : true, \"opid\" : 2, \"secs_running\" : 65833, \"microsecs_running\" : NumberLong(\"65833826945\"), \"op\" : \"none\", \"ns\" : \"local.oplog.rs\", \"query\" : { }, \"numYields\" : 0, \"locks\" : { }, \"waitingForLock\" : false, \"lockStats\" : { \"Global\" : { \"acquireCount\" : { \"r\" : NumberLong(1), \"w\" : NumberLong(1) }, \"acquireWaitCount\" : { \"w\" : NumberLong(1) }, \"timeAcquiringMicros\" : { \"w\" : NumberLong(1813778) } }, \"Database\" : { \"acquireCount\" : { \"w\" : NumberLong(1) } }, \"oplog\" : { \"acquireCount\" : { \"w\" : NumberLong(1) } } } } ], \"ok\" : 1 }"
}
I'd like the outcome to be actual JSON without escapes. Any pointers? I've tried the Ansible filters | to_json | from_json but had no luck.
The output of your shell command is not JSON. It's Mongo extended JSON, which contains functions that will produce errors when parsed by a regular JSON parser.
My experience with Mongo is just above nothing, so there may well be smarter ways to do this. But based on another answer I found, here is what I came up with and successfully tested against a Mongo docker container:
- hosts: mongo
  gather_facts: false
  tasks:
    - shell: mongo --quiet --eval 'print(JSON.stringify(db.currentOp()))'
      register: mocmd

    - debug:
        msg: "{{ mocmd.stdout | from_json }}"
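The difference between the two shell outputs can be illustrated outside Ansible. This small Python sketch (using hand-trimmed fragments of the output above) shows why from_json chokes on printjsononeline output but accepts JSON.stringify output:

```python
import json

# printjsononeline() emits Mongo extended JSON, with constructors such as
# NumberLong(...) that a strict JSON parser rejects.
extended = '{ "opid" : 2, "microsecs_running" : NumberLong("65833826945") }'
try:
    json.loads(extended)
    extended_ok = True
except json.JSONDecodeError:
    extended_ok = False

# JSON.stringify() inside the mongo shell serialises to plain JSON instead.
plain = '{ "opid" : 2, "microsecs_running" : "65833826945" }'
doc = json.loads(plain)

print(extended_ok, doc["opid"])
```

The trade-off is that JSON.stringify loses the extended type information (NumberLong values become plain strings or numbers), which is usually acceptable for reporting in a playbook.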

Extract JSON value using Jmeter

I have this JSON:
{
  "totalMemory" : 12206567424,
  "totalProcessors" : 4,
  "version" : "0.4.1",
  "agent" : {
    "reconnectRetrySec" : 5,
    "agentName" : "1001",
    "checkRecovery" : false,
    "backPressure" : 10000,
    "throttler" : 100
  },
  "logPath" : "/eq/equalum/eqagent-0.4.1.0-SNAPSHOT/logs",
  "startTime" : 1494837249902,
  "status" : {
    "current" : "active",
    "currentMessage" : null,
    "previous" : "pending",
    "previousMessage" : "Recovery:Starting pipelines"
  },
  "autoStart" : false,
  "recovery" : {
    "agentName" : "1001",
    "partitionInfo" : { },
    "topicToInitialCapturePosition" : { }
  },
  "sources" : [ {
    "dataSource" : "oracle",
    "name" : "oracle_source",
    "captureType" : "directOverApi",
    "streams" : [ ],
    "idlePollingFreqMs" : 100,
    "status" : {
      "current" : "active",
      "currentMessage" : null,
      "previous" : "pending",
      "previousMessage" : "Trying to init storage"
    },
    "host" : "192.168.191.5",
    "metricsType" : { },
    "bulkSize" : 10000,
    "user" : "STACK",
    "password" : "********",
    "port" : 1521,
    "service" : "equalum",
    "heartbeatPeriodInMillis" : 1000,
    "lagObjective" : 1,
    "dataSource" : "oracle"
  } ],
  "upTime" : "157 min, 0 sec",
  "build" : "0-SNAPSHOT",
  "target" : {
    "targetType" : "equalum",
    "agentID" : 1001,
    "engineServers" : "192.168.56.100:9000",
    "kafkaOptions" : null,
    "eventsServers" : "192.168.56.100:9999",
    "jaasConfigurationPath" : null,
    "securityProtocol" : "PLAINTEXT",
    "stateMonitorTopic" : "_state_change",
    "targetType" : "equalum",
    "status" : {
      "current" : "active",
      "currentMessage" : null,
      "previous" : "pending",
      "previousMessage" : "Recovery:Starting pipelines"
    },
    "serializationFormat" : "avroBinary"
  }
}
I am trying to use JMeter to extract the value of agentID. How can I do that in JMeter, and which would be better: a Regular Expression Extractor or a JSON Extractor?
What I am trying to do is extract the agentID value in order to use it in another HTTP request sampler, but first I have to extract it from this response.
Thanks!
I believe using JSON Extractor is the best way to get this agentID value, the relevant JsonPath query will be as simple as $..agentID
See the following reference material:
JsonPath - Getting Started - for initial information regarding JsonPath language, functions, operators, etc.
JMeter's JSON Path Extractor Plugin - Advanced Usage Scenarios - for more complex scenarios.
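To see what the deep-scan query $..agentID does, here is a rough Python equivalent. The standard library has no JsonPath engine, so the recursive walk below only mimics the deep-scan operator; it is an illustration, not the JSON Extractor's actual implementation.

```python
import json

def deep_find(node, wanted):
    """Collect every value stored under key `wanted`, at any depth ($..wanted)."""
    hits = []
    if isinstance(node, dict):
        for key, value in node.items():
            if key == wanted:
                hits.append(value)
            hits.extend(deep_find(value, wanted))
    elif isinstance(node, list):
        for item in node:
            hits.extend(deep_find(item, wanted))
    return hits

# Trimmed from the response above: agentID lives under "target".
doc = json.loads('{ "version": "0.4.1", "target": { "agentID": 1001, "targetType": "equalum" } }')
print(deep_find(doc, "agentID"))
```

Because agentID appears only once in the posted response, $..agentID yields a single match, which the JSON Extractor stores directly in the named variable.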

Error while adding members to replica set

While adding members to a replica set, I encountered this error:
"errmsg" : "exception: need most members up to reconfigure, not ok"
rs.status() gives
{
  "set" : "rs0",
  "date" : ISODate("2013-05-26T12:12:09Z"),
  "myState" : 1,
  "members" : [
    {
      "_id" : 0,
      "name" : "Bhavneet-PC:27017",
      "health" : 1,
      "state" : 1,
      "stateStr" : "PRIMARY",
      "uptime" : 9286,
      "optime" : {
        "t" : 1369561487,
        "i" : 1
      },
      "optimeDate" : ISODate("2013-05-26T09:44:47Z"),
      "self" : true
    }
  ],
  "ok" : 1
}
Check whether you have started all the mongod instances that you want to include as replicas with the --replSet option. I resolved this issue by restarting the instances with that option.