remove escaped strings from json output in ansible

One of my Ansible tasks executes a Mongo command using the shell module, but the output returns escaped strings. I've tried some Ansible filters without any luck.
This is what the Ansible play and output look like:
- name: get ops
  shell:
    cmd: mongo --eval "printjsononeline(db.currentOp())"
  register: currentOp

- debug:
    msg: "{{ currentOp.stdout_lines[2] }}"
current output with escaped strings:
ok: [db1] => {
"msg": "{ \"inprog\" : [ { \"desc\" : \"conn24359\", \"threadId\" : \"139883322984192\", \"connectionId\" : 24359, \"client\" : \"127.0.0.1:51259\", \"active\" : true, \"opid\" : 959820, \"secs_running\" : 0, \"microsecs_running\" : NumberLong(21), \"op\" : \"command\", \"ns\" : \"admin.$cmd\", \"query\" : { \"currentOp\" : 1 }, \"numYields\" : 0, \"locks\" : { }, \"waitingForLock\" : false, \"lockStats\" : { } }, { \"desc\" : \"WT RecordStoreThread: local.oplog.rs\", \"threadId\" : \"139883394451200\", \"active\" : true, \"opid\" : 2, \"secs_running\" : 65833, \"microsecs_running\" : NumberLong(\"65833826945\"), \"op\" : \"none\", \"ns\" : \"local.oplog.rs\", \"query\" : { }, \"numYields\" : 0, \"locks\" : { }, \"waitingForLock\" : false, \"lockStats\" : { \"Global\" : { \"acquireCount\" : { \"r\" : NumberLong(1), \"w\" : NumberLong(1) }, \"acquireWaitCount\" : { \"w\" : NumberLong(1) }, \"timeAcquiringMicros\" : { \"w\" : NumberLong(1813778) } }, \"Database\" : { \"acquireCount\" : { \"w\" : NumberLong(1) } }, \"oplog\" : { \"acquireCount\" : { \"w\" : NumberLong(1) } } } } ], \"ok\" : 1 }"
}
I'd like the outcome to be actual JSON without escapes. Any pointers? I've tried the Ansible filters | to_json | from_json, but no luck.

The output of your shell command is not JSON. It's MongoDB Extended JSON, which contains functions that will produce errors when parsed by a regular JSON parser.
My experience with mongo is just above nothing, so there may well be smarter ways to do this. But based on another answer I found, here is what I came up with and successfully tested against a mongo Docker container:
- hosts: mongo
  gather_facts: false
  tasks:
    - shell: mongo --quiet --eval 'print(JSON.stringify(db.currentOp()))'
      register: mocmd

    - debug:
        msg: "{{ mocmd.stdout | from_json }}"
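To see why plain to_json / from_json cannot help here, a small Python illustration (field names borrowed from the output above):

```python
import json

# Mongo Extended JSON contains constructors like NumberLong(...) that a
# standard JSON parser rejects, so no Ansible filter can parse it as-is:
extended = '{ "secs_running" : 0, "microsecs_running" : NumberLong(21) }'
try:
    json.loads(extended)
    parsed = True
except json.JSONDecodeError:
    parsed = False
print(parsed)  # False

# JSON.stringify() inside the mongo shell emits plain JSON instead,
# which from_json can then turn into a real data structure:
plain = '{ "secs_running" : 0, "microsecs_running" : 21 }'
doc = json.loads(plain)
print(doc["microsecs_running"])  # 21
```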

Related

parse json to lld from custom parameter agent-active

Hello, I have the following. On the agent I've added:
UserParameter=sip.register,/usr/bin/tbstatus -D --lvl 0 -x "/sip_registration/domain/user/*" --json
and it gives me pretty JSON on the zabbix-server:
2022-04-28 13:25:24
{
    "/sip_registration/domain:SIP/user:jtlc31337/contact:1_jtlc31337_95.213.198.99_5060_1" : {
        "binding_state" : "Active",
        "state_time_left_sec" : 59,
        "contact_nap" : "JT_4033_TIP_LIVECALL",
        "expires_from_ua" : 120,
        "expires_to_registrar" : 3600,
        "expires_from_registrar" : 900,
        "expires_to_ua" : 59,
        "creation_time" : "2022/04/27 10:54:21",
        "last_registration_time" : "2022/04/28 13:24:13",
        "contact_remap" : "jtlc31337",
        "packet_source_struct" : {
            "source_ip" : "95.213.198.99",
            "source_port" : 5060,
            "destination_ip" : "8.7.6.2",
            "destination_port" : 5060,
            "transport" : "UDP"
        }
    }
}
I would like to have an LLD and a template with a discovery rule that gives:
"binding_state"
"source_ip"
Any hints on how to create the discovery rules?
Thanks
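One possible preprocessing step, sketched in Python. Zabbix discovery rules expect LLD JSON of the form {"data": [...]} with {#MACRO} keys; the macro names ({#REGISTRATION}, {#BINDING_STATE}, {#SOURCE_IP}) below are placeholders, and the input is shortened from the output above:

```python
import json

# Hypothetical sketch: convert the tbstatus output into Zabbix LLD JSON.
# Pick whatever macro names your template actually needs.
tbstatus = {
    "/sip_registration/domain:SIP/user:jtlc31337": {
        "binding_state": "Active",
        "packet_source_struct": {"source_ip": "95.213.198.99"},
    }
}

lld = {"data": [
    {
        "{#REGISTRATION}": path,
        "{#BINDING_STATE}": item["binding_state"],
        "{#SOURCE_IP}": item["packet_source_struct"]["source_ip"],
    }
    for path, item in tbstatus.items()
]}
print(json.dumps(lld))
```

A script like this could run as its own UserParameter, or the same extraction could be done with JSONPath preprocessing steps on the item.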

ansible print key if inside value is defined

Can somebody please help me with parsing this JSON?
I have this JSON:
{
    "declaration": {
        "ACS-AS3": {
            "ACS": {
                "class": "Application",
                "vs_ubuntu_22": {
                    "virtualAddresses": ["10.11.205.167"]
                },
                "pool_ubuntu_22": {
                    "members": {
                        "addressDiscovery": "static",
                        "servicePort": 22
                    }
                },
                "vs_ubuntu_443": {
                    "virtualAddresses": ["10.11.205.167"],
                    "virtualPort": 443
                },
                "pool_ubuntu01_443": {
                    "members": [{
                        "addressDiscovery": "static",
                        "servicePort": 443,
                        "serverAddresses": [
                            "10.11.205.133",
                            "10.11.205.165"
                        ]
                    }]
                },
                "vs_ubuntu_80": {
                    "virtualAddresses": [
                        "10.11.205.167"
                    ],
                    "virtualPort": 80
                },
                "pool_ubuntu01_80": {
                    "members": [{
                        "addressDiscovery": "static",
                        "servicePort": 80,
                        "serverAddresses": [
                            "10.11.205.133",
                            "10.11.205.165"
                        ],
                        "shareNodes": true
                    }],
                    "monitors": [{
                        "bigip": "/Common/tcp"
                    }]
                }
            }
        }
    }
}
and I am trying this playbook:
tasks:
  - name: deploy json file AS3 to F5
    debug:
      msg: "{{ lookup('file', 'parse2.json') }}"
    register: atc_AS3_status
    no_log: true

  - name: Parse json 1
    debug:
      var: atc_AS3_status.msg.declaration | json_query(query_result) | list
    vars:
      query_result: "\"ACS-AS3\".ACS"
      # query_result1: "\"ACS-AS3\".ACS.*.virtualAddresses"
    register: atc_AS3_status1
I got this response
TASK [Parse json 1] ******************************************************************************************************************************************************************************************
ok: [avx-bigip01.dhl.com] => {
    "atc_AS3_status1": {
        "atc_AS3_status.msg.declaration | json_query(query_result) | list": [
            "class",
            "vs_ubuntu_22",
            "pool_ubuntu_22",
            "vs_ubuntu_443",
            "pool_ubuntu01_443",
            "vs_ubuntu_80",
            "pool_ubuntu01_80"
        ],
        "changed": false,
        "failed": false
    }
}
but I would like to print just the keys which contain the key virtualAddresses.
If "ACS-AS3".ACS.*.virtualAddresses is defined, then print the key. The result should be:
vs_ubuntu_22
vs_ubuntu_443
vs_ubuntu_80
One way to get the keys of a dict is to use the dict2items filter. This gives vs_ubuntu_22 etc. as "key" and their sub-dicts as "value". Using this, we can conditionally check whether virtualAddresses is defined in the values.
Also, parse2.json can be included as a vars file or with include_vars, rather than having a task debug it and register the result.
The task below, using vars_files in the playbook, should get you the intended keys from the JSON:
vars_files:
  - parse2.json
tasks:
  - name: show atc_status
    debug:
      var: item.key
    loop: "{{ declaration['ACS-AS3']['ACS'] | dict2items }}"
    when: item['value']['virtualAddresses'] is defined
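For reference, the dict2items + when combination is equivalent to this plain-Python filtering (data shortened from the question):

```python
# Keep only the keys whose value is a dict containing "virtualAddresses";
# this mirrors dict2items plus the `when` condition above.
acs = {
    "class": "Application",
    "vs_ubuntu_22": {"virtualAddresses": ["10.11.205.167"]},
    "pool_ubuntu_22": {"members": {"servicePort": 22}},
    "vs_ubuntu_443": {"virtualAddresses": ["10.11.205.167"], "virtualPort": 443},
}

keys = [k for k, v in acs.items()
        if isinstance(v, dict) and "virtualAddresses" in v]
print(keys)  # ['vs_ubuntu_22', 'vs_ubuntu_443']
```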

Ingest node Filebeat to Elasticsearch

We are sending logs directly from Filebeat to Elasticsearch, without Logstash.
Logs can contain JSON in different fields that also need to be parsed. I have created a pipeline to parse the logs, tested it in the developer console, and the output was as expected. I have set Filebeat to send logs to this pipeline by adding pipeline: application_pipeline to filebeat.yml. But in Index Management, I see only my docs.
How can I check whether Filebeat is sending these logs through the pipeline?
log example:
{"level":"info","message":"Webhook DeletePrice-{\"_headers\":{\"x-forwarded-proto\":[\"https\"],\"x-requested-with\":[\"\"],\"x-client-ip\":[\"93.84.120.32\"],\"user-agent\":[\"1C+Enterprise\\/8.3\"],\"accept\":[\"application\\/json\"],\"host\":[\"host.com\"],\"content-length\":[\"\"],\"content-type\":[\"\"]},\"company_id\":\"10248103\",\"service_id\":\"102.01.02S\",\"service_type\":\"clientApi\"}","service":"servicename","project":"someproject.com","event_id":"255A854BED569B8D4C21B5DE6D8E109C","payload":[],"date_server":"2020-07-24T11:45:48+00:00","date_unix":1595591148.966919}
{"level":"error","message":"NO service integration","service":"servicename","project":"someproject.com","event_id":"D3986456E5A42AF8574230C29D1D474D","payload":{"exception":{"class":"\\Ship\\Exceptions\\IntegrationException","message":"NO service integration","code":0,"file":"/var/www/builds/someproject.com/build.lab.service-public-api.2020_07_22_12_17_45/app/Containers/Price/UI/API/Controllers/Controller.php:406"}},"date_server":"2020-07-24T08:40:34+00:00","date_unix":1595580034.975073}
{"level":"info","message":"No photo in priceId-3696930","service":"service-private-api","project":"someproject.com","event_id":"FBEDA2C9600BFE11523592114B32BAEB","payload":[],"date_server":"2020-07-24T12:16:40+00:00","date_unix":1595593000.97212}
{"level":"error","message":"C404HttpException: 404 \u0421\u0442\u0440\u0430\u043d\u0438\u0446\u0430 \u043d\u0435 \u043d\u0430\u0439\u0434\u0435\u043d\u0430 in \/var\/www\/builds\/build.lab.classified-platform.2020_07_29_12_13_54\/htdocs\/protected\/modules\/personal\/controllers\/RobotsController.php:65\nStack trace:\n#0 \/var\/www\/builds\/build.artox-lab.classified-platform.2020_07_29_12_13_54\/htdocs\/protected\/vendor\/yiisoft\/yii\/framework\/yiilite.php(4226): RobotsController->actionIndex()\n#1 \/var\/www\/builds\/build.lab.classified-platform.2020_07_29_12_13_54\/htdocs\/protected\/vendor\/yiisoft\/yii\/framework\/yiilite.php(3739): CInlineAction->runWithParams(Array)\n#2 \/var\/www\/builds\/build.lab.classified-platform.2020_07_29_12_13_54\/htdocs\/protected\/vendor\/yiisoft\/yii\/framework\/yiilite.php(3724): CController->runAction(Object(CInlineAction))\n#3 \/var\/www\/builds\/build.lab.classified-platform.2020_07_29_12_13_54\/htdocs\/protected\/vendor\/yiisoft\/yii\/framework\/yiilite.php(3714): CController->runActionWithFilters(Object(CInlineAction), Array)\n#4 \/var\/www\/builds\/build.lab.classified-platform.2020_07_29_12_13_54\/htdocs\/protected\/vendor\/yiisoft\/yii\/framework\/yiilite.php(1799): CController->run('index')\n#5 \/var\/www\/builds\/build.lab.classified-platform.2020_07_29_12_13_54\/htdocs\/protected\/vendor\/yiisoft\/yii\/framework\/yiilite.php(1719): CWebApplication->runController('personal\/robots...')\n#6 \/var\/www\/builds\/build.lab.classified-platform.2020_07_29_12_13_54\/htdocs\/protected\/vendor\/yiisoft\/yii\/framework\/yiilite.php(1236): CWebApplication->processRequest()\n#7 \/var\/www\/builds\/build.lab.classified-platform.2020_07_29_12_13_54\/htdocs\/index.php(22): CApplication->run()\n#8 
{main}\nREQUEST_URI=\/robots.txt\n---","service":"artox-lab\/classified-platform","project":"someproject.com","event_id":"91a10782a3566a74d5abefa9589c926c","payload":"exception.C404HttpException.404","date_server":"2020-07-29T14:25:34+03:00","date_unix":1596021934.218448}
pipeline example:
PUT _ingest/pipeline/application_pipeline
{
  "description": "Pipeline for parsing application.log for services",
  "processors": [
    {
      "grok": {
        "field": "message",
        "patterns": [
          "%{JSON:json_message_payload}"
        ],
        "pattern_definitions": {
          "JSON": "{.*$"
        },
        "ignore_failure": true,
        "ignore_missing": true
      }
    },
    {
      "remove": {
        "field": "json_message_payload",
        "ignore_failure": true
      }
    }
  ]
}
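The custom JSON pattern ({.*$) simply captures everything from the first opening brace to the end of the message; a quick Python sketch of the same idea (the sample message is shortened from the logs above):

```python
import json
import re

# The grok pattern {.*$ grabs everything from the first "{" to the end
# of the line; the captured text is then a parseable JSON object.
message = 'Webhook DeletePrice-{"company_id": "10248103", "service_type": "clientApi"}'
match = re.search(r'\{.*$', message)
payload = json.loads(match.group(0))
print(payload["company_id"])  # 10248103
```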
output:
{
  "_index" : "application_index",
  "_type" : "_doc",
  "_id" : "6",
  "_version" : 1,
  "_seq_no" : 3,
  "_primary_term" : 1,
  "found" : true,
  "_source" : {
    "date_server" : "2020-07-29T15:16:17+03:00",
    "level" : "error",
    "project" : "103by",
    "message" : """
C404HttpException: 404 Страница не найдена in /var/www/builds/build.artox-lab.classified-platform.2020_07_29_12_13_54/htdocs/protected/modules/personal/components/PersonalController.php:140
Stack trace:
#0 /var/www/builds/build.artox-lab.classified-platform.2020_07_29_12_13_54/htdocs/protected/vendor/yiisoft/yii/framework/yiilite.php(3737): PersonalController->beforeAction(Object(ShowGalleryPhotoAction))
#1 /var/www/builds/build.artox-lab.classified-platform.2020_07_29_12_13_54/htdocs/protected/vendor/yiisoft/yii/framework/yiilite.php(3724): CController->runAction(Object(ShowGalleryPhotoAction))
#2 /var/www/builds/build.artox-lab.classified-platform.2020_07_29_12_13_54/htdocs/protected/vendor/yiisoft/yii/framework/yiilite.php(3714): CController->runActionWithFilters(Object(ShowGalleryPhotoAction), Array)
#3 /var/www/builds/build.artox-lab.classified-platform.2020_07_29_12_13_54/htdocs/protected/vendor/yiisoft/yii/framework/yiilite.php(1799): CController->run('showGalleryPhot...')
#4 /var/www/builds/build.artox-lab.classified-platform.2020_07_29_12_13_54/htdocs/protected/vendor/yiisoft/yii/framework/yiilite.php(1719): CWebApplication->runController('personal/galler...')
#5 /var/www/builds/build.artox-lab.classified-platform.2020_07_29_12_13_54/htdocs/protected/vendor/yiisoft/yii/framework/yiilite.php(1236): CWebApplication->processRequest()
#6 /var/www/builds/build.artox-lab.classified-platform.2020_07_29_12_13_54/htdocs/index.php(22): CApplication->run()
#7 {main}
REQUEST_URI=/gallery/23609/1439643/
HTTP_REFERER=http://rnpcomr.103.by/gallery/23609/1439643/
---
""",
    "date_unix" : 1.596024977817727E9,
    "event_id" : "b75c7a1ef2f8780986931b038d2f8599",
    "payload" : "exception.C404HttpException.404",
    "service" : "artox-lab/classified-platform"
  }
}
Filebeat config:
#-------------------------- Elasticsearch output ------------------------------
output.elasticsearch:
  # Array of hosts to connect to.
  hosts: ["elk.artoxlab.com:9200"]
  pipeline: application_pipeline
If you run GET _nodes/stats/ingest, you're going to see the usage statistics for your pipeline in nodes.xyz.ingest.pipelines.application_pipeline
Another thing worth noting is that you could also do the same thing in Filebeat itself without resorting to using an ingest pipeline simply by defining a decode_json_fields processor, like this:
processors:
  - decode_json_fields:
      fields: ["message"]
      process_array: true
      max_depth: 2
      target: ""
      overwrite_keys: true
      add_error_key: false
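Roughly, decode_json_fields parses the JSON string held in the listed fields and, with target: "", merges the result into the event root; a minimal Python sketch of that behavior:

```python
import json

# The "message" field holds a JSON string; decoding it and merging the
# result into the event root mimics target: "" with overwrite_keys: true.
event = {
    "level": "info",
    "message": '{"service": "servicename", "project": "someproject.com"}',
}
event.update(json.loads(event["message"]))
print(event["service"])  # servicename
```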
UPDATE: if you still don't see your data being indexed, I suggest building some failure handling into your pipeline. Change it to the following, so in case the indexing fails for some reason, you can see the document in the failed-xyz index together with the reason for the error.
PUT _ingest/pipeline/application_pipeline
{
  "description": "Pipeline for parsing application.log for services",
  "processors": [
    {
      "grok": {
        "field": "message",
        "patterns": [
          "%{JSON:json_message_payload}"
        ],
        "pattern_definitions": {
          "JSON": "{.*$"
        },
        "ignore_failure": true,
        "ignore_missing": true
      }
    },
    {
      "remove": {
        "field": "json_message_payload",
        "ignore_failure": true
      }
    }
  ],
  "on_failure": [
    {
      "append": {
        "field": "meta.errors",
        "value": "{{ _ingest.on_failure_message }}, {{ _ingest.on_failure_processor_type }}, {{ _ingest.on_failure_processor_tag }}"
      }
    },
    {
      "set": {
        "field": "_index",
        "value": "failed-{{ _index }}"
      }
    }
  ]
}

Spring Boot & MongoDB, Query nested map for key

I'm trying to query MongoDB for all documents that have a specific key inside a document's nested map
{
    "_id" : ObjectId("5a5cd9711736c32c45f11adf"),
    "name" : "Test A",
    "userSubscription" : {
        "map" : {
            "1234" : true,
            "999" : true
        }
    }
}
I'm querying as follows:
db.getCollection('myColl').find({ "userSubscription.map" : { "1234" : true } })
It works, but it returns only documents whose map holds exactly "1234". So if userSubscription.map contains both "1234" and "5555", the query shows no results.
I'm using Robo 3T to test queries and Spring Boot annotations to query this.
Can the answer be a Spring Boot annotation query?
Spring Boot query example:
@Query("{QUERY : ?0}")
List<Person> findByUserSubscription(String key);
(The "?0" refers to the first method parameter.)
Thanks!!!
UPDATE:
Now the document looks like this:
{
    "_id" : ObjectId("5a5cd9711736c32c45f11adf"),
    "name" : "Test A",
    "userSubscription" : {
        "1234" : true,
        "3333" : true
    }
}
Robo 3T query: db.getCollection('Person').find({ "userSubscription.1234": {$exists : true} })
Works PERFECT!
But the query in Spring Boot looks like:
@Query("{userSubscription.?0 : {$exists: true} }")
and it doesn't show any results...
What is the problem???
Model the map as an array of embedded documents, like:
{
    "map" : [
        { "k" : "1234", "v" : true },
        { "k" : "999", "v" : true }
    ]
}
and you can find by key:
db.myColl.find({ "userSubscription.map" : { $elemMatch: { k: "1234", v: true } } })
and
@Query("{'userSubscription.map':{$elemMatch: { k: ?0, v: true } } }")
List<Person> findByUserSubscription(String key);
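For clarity, here is the $elemMatch lookup expressed in plain Python against the remodeled document (a sketch of the matching semantics, not how the driver works internally):

```python
# With the map remodeled as a list of {k, v} documents, $elemMatch
# matches when any single array element satisfies both conditions.
doc = {
    "userSubscription": {
        "map": [
            {"k": "1234", "v": True},
            {"k": "999", "v": True},
        ]
    }
}

def has_key(document, key):
    return any(e["k"] == key and e["v"] is True
               for e in document["userSubscription"]["map"])

print(has_key(doc, "1234"))  # True
print(has_key(doc, "5555"))  # False
```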
There is a simpler solution using the Query and Criteria classes in Spring Data MongoDB, like below:
Query query = new Query();
query.addCriteria(Criteria.where("userSubscription."+key).exists(true));
return mongoTemplate.find(query, YourDocument.class);
You missed the single quotes in @Query:
@Query("{'userSubscription.?0' : {$exists: true} }")
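As a plain-Python sketch of what this $exists query matches (the sample documents are made up from the question):

```python
# The query matches any document whose userSubscription sub-document
# contains the given key, regardless of what other keys are present.
docs = [
    {"name": "Test A", "userSubscription": {"1234": True, "3333": True}},
    {"name": "Test B", "userSubscription": {"999": True}},
]

matches = [d["name"] for d in docs if "1234" in d["userSubscription"]]
print(matches)  # ['Test A']
```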

I get "Assertion failed: You must include an `id` in a hash passed to `push` " when querying API

I must have read every question on Stack Overflow regarding my issue and have not found a solution.
I am very new to Ember and Node so please bear with me.
The server responds with this format:
{
    "_id" : "53fddf59d72f9b4d3a3e164a",
    "about" : [
        { "from" : "foo" },
        { "text" : "bar" },
        ...
    ]
}
My model looks like this:
App.About = DS.Model.extend({
    from : DS.attr('string'),
    text : DS.attr('string'),
    ...
});
Adapter & serializer:
App.ApplicationAdapter = DS.RESTAdapter.extend({
    host: 'http://localhost:3000'
});
App.ApplicationSerializer = DS.JSONSerializer.extend({
    primaryKey: '_id'
});
And my route:
App.AboutRoute = Ember.Route.extend({
    model: function() {
        return this.store.find('about');
    }
});
Using handlebars:
<script type="text/x-handlebars" data-template-name="about">
    <p>{{from}}</p>
</script>
The route will render, but the console gives the error:
DEBUG: ------------------------------- ember.js:3285
DEBUG: Ember : 1.3.0 ember.js:3285
DEBUG: Ember Data : 1.0.0-beta.5 ember.js:3285
DEBUG: Handlebars : 1.3.0 ember.js:3285
DEBUG: jQuery : 1.10.2 ember.js:3285
DEBUG: ------------------------------- ember.js:3285
generated -> route:application Object {fullName: "route:application"} ember.js:3285
Assertion failed: You must include an id in a hash passed to push ember.js:3285
generated -> controller:about Object {fullName: "controller:about"} ember.js:3285
Transitioned into 'about'
What am I doing wrong?!
Update:
Have accepted Kingpin2k's answer even though this was not the exact format I ended up using. See the comments for details.
Your format should have the type pluralized and the id on the individual records:
{
    "abouts" : [
        {
            "from" : "foo",
            "text" : "bar",
            "_id" : "53fddf59d72f9b4d3a3e164a"
        },
        {
            "from" : "foo2",
            "text" : "bar2",
            "_id" : "53fddf59d72f9b4d3a3e164k"
        },
        ...
    ]
}
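For illustration, a Python sketch of reshaping the original response into this format, assuming the single-key hashes under "about" are meant to be fields of one record:

```python
# Merge the single-key fragments into one record, then wrap it under a
# pluralized root key with the id on the record itself.
raw = {
    "_id": "53fddf59d72f9b4d3a3e164a",
    "about": [{"from": "foo"}, {"text": "bar"}],
}

record = {"_id": raw["_id"]}
for fragment in raw["about"]:
    record.update(fragment)

payload = {"abouts": [record]}
print(payload)
```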