"forcePullImage" param gets set to 'false' - json

I am trying to deploy a Docker image with Marathon; however, when I use this configuration, the "forcePullImage" param gets set to 'false':
{
"id": "name",
"mem": 1024,
"cpus": 0.5,
"instances": 1,
"container": {
"type": "DOCKER",
"volumes": [
{
"containerPath": "/etc/localtime",
"hostPath": "/etc/localtime",
"mode": "RO"
}
],
"docker": {
"image": "dockerimage",
"network": "BRIDGE",
"portMappings": [ {
"containerPort": 8080,
"hostPort": 0,
"servicePort": [PORTNUMBER],
"protocol": "tcp",
"name": "name"
}],
"parameters": [{ "key": "name", "value": "name" }]
},
"forcePullImage": true
},
"healthChecks": [
{
"path": "~/check/",
"portIndex": 0,
"protocol": "HTTP",
"gracePeriodSeconds": 10,
"intervalSeconds": 2,
"timeoutSeconds": 10,
"maxConsecutiveFailures": 10
}],
"labels":{
"HAPROXY_GROUP":"external"
}
}
When it is finally deployed in the Marathon environment, the config gets set to this:
{
"id": "name",
"mem": 1024,
"cpus": 0.5,
"instances": 1,
"container": {
"type": "DOCKER",
"volumes": [
{
"containerPath": "/etc/localtime",
"hostPath": "/etc/localtime",
"mode": "RO"
}
],
"docker": {
"image": "dockerimage",
"network": "BRIDGE",
"portMappings": [ {
"containerPort": 8080,
"hostPort": 0,
"servicePort": [PORTNUMBER],
"protocol": "tcp",
"name": "name"
}],
"parameters": [{ "key": "name", "value": "name" }]
},
"forcePullImage": false
},
"healthChecks": [
{
"path": "~/check/",
"portIndex": 0,
"protocol": "HTTP",
"gracePeriodSeconds": 10,
"intervalSeconds": 2,
"timeoutSeconds": 10,
"maxConsecutiveFailures": 10
}],
"labels":{
"HAPROXY_GROUP":"external"
}
}
After it is deployed I have to manually change the 'false' param to 'true', and after that it actually works. But why is it getting set to false when adding it to Marathon, and how can I fix this problem?

The Marathon app spec you posted is in fact invalid. If you look at the schema, you will see that forcePullImage has to be a child of the docker field (not the container field, as in your example). The correct usage would be:
"docker": {
"image": "dockerimage",
"forcePullImage": false,
...
}
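For illustration, a corrected container block would be shaped like the sketch below (only the placement of forcePullImage changes; everything else stays as in your app definition). Assuming the full corrected definition is saved as app.json and Marathon listens on the usual port 8080, it can then be pushed with a plain PUT against the /v2/apps endpoint:
"container": {
    "type": "DOCKER",
    "docker": {
        "image": "dockerimage",
        "network": "BRIDGE",
        "forcePullImage": true
    }
}
curl -X PUT http://<marathon_host>:8080/v2/apps/name -H "Content-Type: application/json" -d @app.json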

Related

json jq request format

I need help extracting data from this JSON file using jq.
For the app "flv", I need to verify that the stream named "mystream" is active and then extract meta.video.height.
I tried a lot of queries without success; my jq knowledge is poor.
{
"port": 1935,
"server_index": 0,
"applications": [{
"name": "hls",
"live": {
"streams": [{
"name": "donbosco",
"time": 2380739,
"bw_in": 2112440,
"bytes_in": 541618713,
"bw_out": 0,
"bytes_out": 0,
"bw_audio": 35544,
"bw_video": 2076888,
"clients": [{
"id": 453,
"address": "127.0.0.1",
"time": 2380959,
"flashver": "FMLE/3.0 (compatible; Lavf57.83.100)",
"dropped": 0,
"avsync": 28,
"timestamp": 2382635,
"publishing": true,
"active": true
}],
"records": [],
"meta": {
"video": {
"width": 1168,
"height": 720,
"frame_rate": 25,
"codec": "H264",
"profile": "High",
"level": 3.1
},
"audio": {
"codec": "AAC",
"profile": "LC",
"channels": 2,
"sample_rate": 16000
}
},
"nclients": 1,
"publishing": true,
"active": true
}],
"nclients": 1
},
"recorders": {
"count": 0,
"lists": []
}
},
{
"name": "flv",
"live": {
"streams": [{
"name": "mystream",
"time": 2382811,
"bw_in": 2059096,
"bytes_in": 541841549,
"bw_out": 2059096,
"bytes_out": 543351459,
"bw_audio": 35472,
"bw_video": 2023624,
"clients": [{
"id": 452,
"address": "127.0.0.1",
"time": 2382727,
"flashver": "LNX 9,0,124,2",
"dropped": 0,
"avsync": -12,
"timestamp": 2384520,
"publishing": false,
"active": true
},
{
"id": 451,
"address": "127.0.0.1",
"time": 2383031,
"flashver": "FMLE/3.0 (compatible; Lavf58.74.100)",
"dropped": 0,
"avsync": -12,
"timestamp": 2384520,
"publishing": true,
"active": true
}
],
"records": [],
"meta": {
"video": {
"width": 1168,
"height": 720,
"frame_rate": 25,
"codec": "H264",
"profile": "High",
"level": 3.1
},
"audio": {
"codec": "AAC",
"profile": "LC",
"channels": 2,
"sample_rate": 16000
}
},
"nclients": 2,
"publishing": true,
"active": true
}],
"nclients": 2
},
"recorders": {
"count": 0,
"lists": []
}
}
]
}
Are you asking for help with the command-line JSON processor jq or the JavaScript library jQuery? They are not the same. For jq, try
jq '
.applications
| map(select(.name == "flv"))[].live.streams
| map(select(.name == "mystream" and .active))[].meta.video.height
'
720
Demo
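If the stats shown above are saved to a file, say stat.json (an assumed filename), the same filter can be run from the shell:
jq '
  .applications
  | map(select(.name == "flv"))[].live.streams
  | map(select(.name == "mystream" and .active))[].meta.video.height
' stat.json
# prints: 720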

Marathon Getting appId: Specify a path

I was deploying a Mesos/Marathon cluster on 1 master server and 2 slave servers. I set everything up fine with the load balancer, but after setting everything up I am getting this error from Marathon. I was trying to deploy a database named TiDB.
This is my Marathon JSON file to deploy (groups):
[{
"acceptedResourceRoles": [],
"container": {
"docker": {
"forcePullImage": true,
"image": "grafana/grafana:6.0.1",
"network": "BRIDGE",
"parameters": [
{
"key": "user",
"value": "0"
}
],
"portMappings": [
{
"containerPort": 3000,
"hostPort": 3000,
"protocol": "tcp"
}
]
},
"type": "DOCKER",
"volumes": [
{
"containerPath": "/etc/grafana",
"hostPath": "./config/grafana",
"mode": "RW"
},
{
"containerPath": "/tmp/dashboards",
"hostPath": "./config/dashboards",
"mode": "RW"
},
{
"containerPath": "/var/lib/grafana",
"hostPath": "./data/grafana",
"mode": "RW"
}
]
},
"cpus": 1.0,
"env": {
"GF_LOG_LEVEL": "error",
"GF_PATHS_CONFIG": "/etc/grafana/grafana.ini",
"GF_PATHS_PROVISIONING": "/etc/grafana/provisioning"
},
"fetch": [],
"healthChecks": [
{
"gracePeriodSeconds": 300,
"intervalSeconds": 60,
"maxConsecutiveFailures": 3,
"path": "/",
"portIndex": 0,
"protocol": "HTTP",
"timeoutSeconds": 20
}
],
"id": "grafana",
"instances": 1,
"mem": 128,
"requirePorts": true
},
{
"acceptedResourceRoles": [],
"args": [
"--name=pd0",
"--client-urls=http://0.0.0.0:2379",
"--peer-urls=http://0.0.0.0:2380",
"--advertise-client-urls=http://pd0:2379",
"--advertise-peer-urls=http://pd0:2380",
"--initial-cluster=pd0=http://pd0:2380,pd1=http://pd1:2380,pd2=http://pd2:2380",
"--data-dir=/data/pd0",
"--config=/pd.toml",
"--log-file=/logs/pd0.log"
],
"container": {
"docker": {
"forcePullImage": true,
"image": "pingcap/pd:latest",
"network": "BRIDGE",
"portMappings": [
{
"containerPort": 2379,
"hostPort": 0,
"protocol": "tcp"
}
]
},
"type": "DOCKER",
"volumes": [
{
"containerPath": "/pd.toml",
"hostPath": "./config/pd.toml",
"mode": "RO"
},
{
"containerPath": "/data",
"hostPath": "./data",
"mode": "RW"
},
{
"containerPath": "/logs",
"hostPath": "./logs",
"mode": "RW"
}
]
},
"cpus": 1.0,
"fetch": [],
"healthChecks": [
{
"gracePeriodSeconds": 300,
"intervalSeconds": 60,
"maxConsecutiveFailures": 3,
"path": "/",
"portIndex": 0,
"protocol": "HTTP",
"timeoutSeconds": 20
}
],
"id": "pd0",
"instances": 1,
"mem": 128,
"requirePorts": false
},
{
"acceptedResourceRoles": [],
"args": [
"--name=pd1",
"--client-urls=http://0.0.0.0:2379",
"--peer-urls=http://0.0.0.0:2380",
"--advertise-client-urls=http://pd1:2379",
"--advertise-peer-urls=http://pd1:2380",
"--initial-cluster=pd0=http://pd0:2380,pd1=http://pd1:2380,pd2=http://pd2:2380",
"--data-dir=/data/pd1",
"--config=/pd.toml",
"--log-file=/logs/pd1.log"
],
"container": {
"docker": {
"forcePullImage": true,
"image": "pingcap/pd:latest",
"network": "BRIDGE",
"portMappings": [
{
"containerPort": 2379,
"hostPort": 0,
"protocol": "tcp"
}
]
},
"type": "DOCKER",
"volumes": [
{
"containerPath": "/pd.toml",
"hostPath": "./config/pd.toml",
"mode": "RO"
},
{
"containerPath": "/data",
"hostPath": "./data",
"mode": "RW"
},
{
"containerPath": "/logs",
"hostPath": "./logs",
"mode": "RW"
}
]
},
"cpus": 1.0,
"fetch": [],
"healthChecks": [
{
"gracePeriodSeconds": 300,
"intervalSeconds": 60,
"maxConsecutiveFailures": 3,
"path": "/",
"portIndex": 0,
"protocol": "HTTP",
"timeoutSeconds": 20
}
],
"id": "pd1",
"instances": 1,
"mem": 128,
"requirePorts": false
},
{
"acceptedResourceRoles": [],
"args": [
"--name=pd2",
"--client-urls=http://0.0.0.0:2379",
"--peer-urls=http://0.0.0.0:2380",
"--advertise-client-urls=http://pd2:2379",
"--advertise-peer-urls=http://pd2:2380",
"--initial-cluster=pd0=http://pd0:2380,pd1=http://pd1:2380,pd2=http://pd2:2380",
"--data-dir=/data/pd2",
"--config=/pd.toml",
"--log-file=/logs/pd2.log"
],
"container": {
"docker": {
"forcePullImage": true,
"image": "pingcap/pd:latest",
"network": "BRIDGE",
"portMappings": [
{
"containerPort": 2379,
"hostPort": 0,
"protocol": "tcp"
}
]
},
"type": "DOCKER",
"volumes": [
{
"containerPath": "/pd.toml",
"hostPath": "./config/pd.toml",
"mode": "RO"
},
{
"containerPath": "/data",
"hostPath": "./data",
"mode": "RW"
},
{
"containerPath": "/logs",
"hostPath": "./logs",
"mode": "RW"
}
]
},
"cpus": 1.0,
"fetch": [],
"healthChecks": [
{
"gracePeriodSeconds": 300,
"intervalSeconds": 60,
"maxConsecutiveFailures": 3,
"path": "/",
"portIndex": 0,
"protocol": "HTTP",
"timeoutSeconds": 20
}
],
"id": "pd2",
"instances": 1,
"mem": 128,
"requirePorts": false
},
{
"acceptedResourceRoles": [],
"args": [
"--log.level=error",
"--storage.tsdb.path=/data/prometheus",
"--config.file=/etc/prometheus/prometheus.yml"
],
"container": {
"docker": {
"forcePullImage": true,
"image": "prom/prometheus:v2.2.1",
"network": "BRIDGE",
"parameters": [
{
"key": "user",
"value": "root"
}
],
"portMappings": [
{
"containerPort": 9090,
"hostPort": 9090,
"protocol": "tcp"
}
]
},
"type": "DOCKER",
"volumes": [
{
"containerPath": "/etc/prometheus/prometheus.yml",
"hostPath": "./config/prometheus.yml",
"mode": "RO"
},
{
"containerPath": "/etc/prometheus/pd.rules.yml",
"hostPath": "./config/pd.rules.yml",
"mode": "RO"
},
{
"containerPath": "/etc/prometheus/tikv.rules.yml",
"hostPath": "./config/tikv.rules.yml",
"mode": "RO"
},
{
"containerPath": "/etc/prometheus/tidb.rules.yml",
"hostPath": "./config/tidb.rules.yml",
"mode": "RO"
},
{
"containerPath": "/data",
"hostPath": "./data",
"mode": "RW"
}
]
},
"cpus": 1.0,
"fetch": [],
"healthChecks": [
{
"gracePeriodSeconds": 300,
"intervalSeconds": 60,
"maxConsecutiveFailures": 3,
"path": "/",
"portIndex": 0,
"protocol": "HTTP",
"timeoutSeconds": 20
}
],
"id": "prometheus",
"instances": 1,
"mem": 128,
"requirePorts": true
},
{
"acceptedResourceRoles": [],
"args": [
"--log.level=error"
],
"container": {
"docker": {
"forcePullImage": true,
"image": "prom/pushgateway:v0.3.1",
"network": "BRIDGE"
},
"type": "DOCKER"
},
"cpus": 1.0,
"fetch": [],
"id": "pushgateway",
"instances": 1,
"mem": 128
},
{
"acceptedResourceRoles": [],
"args": [
"--store=tikv",
"--path=pd0:2379,pd1:2379,pd2:2379",
"--config=/tidb.toml",
"--log-file=/logs/tidb.log",
"--advertise-address=tidb"
],
"container": {
"docker": {
"forcePullImage": true,
"image": "pingcap/tidb:latest",
"network": "BRIDGE",
"portMappings": [
{
"containerPort": 4000,
"hostPort": 4000,
"protocol": "tcp"
},
{
"containerPort": 10080,
"hostPort": 10080,
"protocol": "tcp"
}
]
},
"type": "DOCKER",
"volumes": [
{
"containerPath": "/tidb.toml",
"hostPath": "./config/tidb.toml",
"mode": "RO"
},
{
"containerPath": "/logs",
"hostPath": "./logs",
"mode": "RW"
}
]
},
"cpus": 1.0,
"fetch": [],
"healthChecks": [
{
"gracePeriodSeconds": 300,
"intervalSeconds": 60,
"maxConsecutiveFailures": 3,
"path": "/",
"portIndex": 0,
"protocol": "HTTP",
"timeoutSeconds": 20
}
],
"id": "tidb",
"instances": 1,
"mem": 128,
"requirePorts": true
},
{
"acceptedResourceRoles": [],
"container": {
"docker": {
"forcePullImage": true,
"image": "pingcap/tidb-vision:latest",
"network": "BRIDGE",
"portMappings": [
{
"containerPort": 8010,
"hostPort": 8010,
"protocol": "tcp"
}
]
},
"type": "DOCKER"
},
"cpus": 1.0,
"env": {
"PD_ENDPOINT": "pd0:2379"
},
"fetch": [],
"healthChecks": [
{
"gracePeriodSeconds": 300,
"intervalSeconds": 60,
"maxConsecutiveFailures": 3,
"path": "/",
"portIndex": 0,
"protocol": "HTTP",
"timeoutSeconds": 20
}
],
"id": "tidb-vision",
"instances": 1,
"mem": 128,
"requirePorts": true
},
{
"acceptedResourceRoles": [],
"args": [
"--addr=0.0.0.0:20160",
"--advertise-addr=tikv0:20160",
"--data-dir=/data/tikv0",
"--pd=pd0:2379,pd1:2379,pd2:2379",
"--config=/tikv.toml",
"--log-file=/logs/tikv0.log"
],
"container": {
"docker": {
"forcePullImage": true,
"image": "pingcap/tikv:latest",
"network": "BRIDGE"
},
"type": "DOCKER",
"volumes": [
{
"containerPath": "/tikv.toml",
"hostPath": "./config/tikv.toml",
"mode": "RO"
},
{
"containerPath": "/data",
"hostPath": "./data",
"mode": "RW"
},
{
"containerPath": "/logs",
"hostPath": "./logs",
"mode": "RW"
}
]
},
"cpus": 1.0,
"fetch": [],
"id": "tikv0",
"instances": 1,
"mem": 128
},
{
"acceptedResourceRoles": [],
"args": [
"--addr=0.0.0.0:20160",
"--advertise-addr=tikv1:20160",
"--data-dir=/data/tikv1",
"--pd=pd0:2379,pd1:2379,pd2:2379",
"--config=/tikv.toml",
"--log-file=/logs/tikv1.log"
],
"container": {
"docker": {
"forcePullImage": true,
"image": "pingcap/tikv:latest",
"network": "BRIDGE"
},
"type": "DOCKER",
"volumes": [
{
"containerPath": "/tikv.toml",
"hostPath": "./config/tikv.toml",
"mode": "RO"
},
{
"containerPath": "/data",
"hostPath": "./data",
"mode": "RW"
},
{
"containerPath": "/logs",
"hostPath": "./logs",
"mode": "RW"
}
]
},
"cpus": 1.0,
"fetch": [],
"id": "tikv1",
"instances": 1,
"mem": 128
},
{
"acceptedResourceRoles": [],
"args": [
"--addr=0.0.0.0:20160",
"--advertise-addr=tikv2:20160",
"--data-dir=/data/tikv2",
"--pd=pd0:2379,pd1:2379,pd2:2379",
"--config=/tikv.toml",
"--log-file=/logs/tikv2.log"
],
"container": {
"docker": {
"forcePullImage": true,
"image": "pingcap/tikv:latest",
"network": "BRIDGE"
},
"type": "DOCKER",
"volumes": [
{
"containerPath": "/tikv.toml",
"hostPath": "./config/tikv.toml",
"mode": "RO"
},
{
"containerPath": "/data",
"hostPath": "./data",
"mode": "RW"
},
{
"containerPath": "/logs",
"hostPath": "./logs",
"mode": "RW"
}
]
},
"cpus": 1.0,
"fetch": [],
"id": "tikv2",
"instances": 1,
"mem": 128
},
{
"acceptedResourceRoles": [],
"args": [
"/opt/spark/sbin/start-master.sh"
],
"container": {
"docker": {
"forcePullImage": true,
"image": "pingcap/tispark:latest",
"network": "BRIDGE",
"portMappings": [
{
"containerPort": 7077,
"hostPort": 7077,
"protocol": "tcp"
},
{
"containerPort": 8080,
"hostPort": 8080,
"protocol": "tcp"
}
]
},
"type": "DOCKER",
"volumes": [
{
"containerPath": "/opt/spark/conf/spark-defaults.conf",
"hostPath": "./config/spark-defaults.conf",
"mode": "RO"
}
]
},
"cpus": 1.0,
"env": {
"SPARK_MASTER_PORT": "7077",
"SPARK_MASTER_WEBUI_PORT": "8080"
},
"fetch": [],
"healthChecks": [
{
"gracePeriodSeconds": 300,
"intervalSeconds": 60,
"maxConsecutiveFailures": 3,
"path": "/",
"portIndex": 0,
"protocol": "HTTP",
"timeoutSeconds": 20
}
],
"id": "tispark-master",
"instances": 1,
"mem": 128,
"requirePorts": true
},
{
"acceptedResourceRoles": [],
"args": [
"/opt/spark/sbin/start-slave.sh",
"spark://tispark-master:7077"
],
"container": {
"docker": {
"forcePullImage": true,
"image": "pingcap/tispark:latest",
"network": "BRIDGE",
"portMappings": [
{
"containerPort": 38081,
"hostPort": 38081,
"protocol": "tcp"
}
]
},
"type": "DOCKER",
"volumes": [
{
"containerPath": "/opt/spark/conf/spark-defaults.conf",
"hostPath": "./config/spark-defaults.conf",
"mode": "RO"
}
]
},
"cpus": 1.0,
"env": {
"SPARK_WORKER_WEBUI_PORT": "38081"
},
"fetch": [],
"healthChecks": [
{
"gracePeriodSeconds": 300,
"intervalSeconds": 60,
"maxConsecutiveFailures": 3,
"path": "/",
"portIndex": 0,
"protocol": "HTTP",
"timeoutSeconds": 20
}
],
"id": "tispark-slave0",
"instances": 1,
"mem": 128,
"requirePorts": true
}]
I deploy this with:
curl -X POST http://<mesos_master_IP>:8080/v2/groups -d @/root/cluster.json -H "Content-Type: application/json"
But I am getting the error "appId: Specify a path".
Where is my mistake?
It looks like you are missing the group id. The part you posted is just part of a group definition and should be placed in the apps field of the groups request:
{
"id": "/",
"apps": [...]
}
https://mesosphere.github.io/marathon/api-console/index.html
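As a sketch, you could wrap the array you already have in a minimal group object and POST that instead (the group id "/tidb" is just an illustrative name; jq is only used here to do the wrapping):
# wrap the existing app array under "apps" with a group id
jq '{id: "/tidb", apps: .}' /root/cluster.json > /root/cluster-group.json
curl -X POST http://<mesos_master_IP>:8080/v2/groups -d @/root/cluster-group.json -H "Content-Type: application/json"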

How to get a json key that contains a specific string with jq?

I have JSON data like this:
{
"success": true,
"module": {
"endpoint": {
"mode": "pc",
"protocolVersion": "2.0"
},
"reload": true,
"data": {
"leftContainer_CL": {
"id": "CL",
"tag": "leftContainer",
"fields": {
"css": {
"floatPosition": "left",
"width": "788px"
},
"tag": "leftContainer"
},
"type": "container"
},
"container_C": {
"id": "C",
"tag": "container",
"fields": {
"css": {
"marginTop": "12px"
},
"tag": "container"
},
"type": "container"
},
"delivery_dfdaf8a8a": {
"id": "dfdaf8a8a",
"tag": "delivery",
"fields": {
"selectPos": "right",
"deliveryBy": {
"text": "Disediakan oleh",
"poster": "ALL ITEM STORE"
},
"options": [
{
"highlight": false,
"deliveryId": "STANDARD",
"bgColor": "#fafafa",
"price": "Rp18.900",
"disable": false,
"reachTime": "Dapatkan pada\n 3-4 Apr 2018",
"liveUp": false,
"selected": true
}
],
"style": "bar"
},
"type": "biz"
},
"rightContainer_CR": {
"id": "CR",
"tag": "rightContainer",
"fields": {
"css": {
"floatPosition": "right",
"width": "388px"
},
"tag": "rightContainer"
},
"type": "container"
},
"delivery_d43597338a": {
"id": "d43597338a",
"tag": "delivery",
"fields": {
"selectPos": "right",
"deliveryBy": {
"text": "Disediakan oleh",
"poster": "incredible accessories hp"
},
"options": [
{
"highlight": false,
"deliveryId": "STANDARD",
"bgColor": "#fafafa",
"price": "Rp37.800",
"disable": false,
"reachTime": "Dapatkan pada\n 3-4 Apr 2018",
"liveUp": false,
"selected": true
}
],
"style": "bar"
},
"type": "biz"
},
"orderSummary_6": {
"id": "6",
"tag": "orderSummary",
"fields": {
"isOpen": "false",
"summarys": [
{
"tail": "(3 barang)",
"title": "Subtotal",
"value": "Rp23.557"
},
{
"title": "Biaya pengiriman",
"value": "Rp56.700"
}
],
"title": "Ringkasan Pesanan\r\n"
},
"type": "biz"
},
"root_0": {
"id": "0",
"tag": "root",
"fields": {
"count": 3,
"title": "Troli belanja Saya"
},
"type": "root"
},
"item_i77997d6b": {
"id": "i77997d6b",
"tag": "item",
"fields": {
"img": "https://id-live.slatic.net/original/08c1396908dc240625751b09decb4211.jpg",
"quantity": {
"qtyPrefix": "Kuantitas",
"min": 1,
"autoOptions": false,
"quantity": 1,
"max": 5,
"editable": true,
"showIncrDecr": true,
"showOptions": false,
"step": 1
},
"sellerName": "ALL ITEM STORE",
"title": "Case Slim Black Matte Xiaomi Redmi 4A Softcase Black",
"stockTip": {},
"valid": true,
"itemId": "143800088",
"operations": [
"wishlist",
"delete"
],
"sellerId": "100124080",
"price": {
"price": 6000,
"currentPrice": "Rp6.000",
"originPrice": "Rp30.000",
"promotionRatio": "-80%"
},
"restriction": false,
"isGift": false,
"sku": {
"skuText": "Softcase, Hitam",
"productVariant": "SO908ELAAVYY4AANID-72544754",
"brandId": "17818",
"skuId": "157608391"
},
"itemUrl": "https://www.lazada.co.id/products/i143800088-s157608391.html?urlFlag=true&mp=1",
"cartItemId": 2006547819
},
"type": "biz"
},
"location_2": {
"id": "2",
"tag": "location",
"fields": {
"buttonText": "GANTI\r\n",
"editable": true,
"postCode": "",
"style": "casAddress",
"label": "Lokasi",
"title": "Jawa Tengah,Kab. Boyolali,Ampel",
"addressId": "R2388357-R80010396-R80015219"
},
"type": "biz"
},
"voucherInput_7": {
"id": "7",
"tag": "voucherInput",
"fields": {
"buttonText": "GUNAKAN",
"placeHolder": "Masukkan Kode Voucher",
"status": "default"
},
"type": "biz",
"validate": {
"value": [
{
"msg": "Maaf, voucher ini tidak dapat digunakan. Silahkan periksa jika ada kesalahan penulisan",
"regex": "^$|^[ ]{0,5}[A-Za-z0-9~!##%&*()_+?<>{}|-]{1,100}[ ]{0,5}$"
}
]
}
},
"shop_43597338a_s2c": {
"id": "43597338a_s2c",
"tag": "shop",
"fields": {
"badges": [],
"link": "//www.lazada.co.id/shop/incredible-accessories-hp",
"name": "incredible accessories hp"
},
"type": "biz"
},
"item_i7799f86e": {
"id": "i7799f86e",
"tag": "item",
"fields": {
"img": "http://id-live-02.slatic.net/p/2/case-anti-shock-anti-crack-elegant-softcase-for-xiaomi-redmi-5a-white-clear-free-tempered-glass-1273-94487227-9f8ddff53bde3f8de9eb514ba2172361-catalog.jpg",
"quantity": {
"qtyPrefix": "Kuantitas",
"min": 1,
"autoOptions": false,
"quantity": 1,
"max": 5,
"editable": true,
"showIncrDecr": true,
"showOptions": false,
"step": 1
},
"sellerName": "incredible accessories hp",
"title": "Case Anti Shock / Anti Crack Elegant Softcase for Xiaomi Redmi 5A - White Clear + Free Tempered Glass",
"stockTip": {},
"valid": true,
"itemId": "160714927",
"operations": [
"wishlist",
"delete"
],
"sellerId": "53631",
"price": {
"price": 13580,
"currentPrice": "Rp13.580",
"originPrice": "Rp25.000",
"promotionRatio": "-46%"
},
"restriction": false,
"isGift": false,
"sku": {
"skuText": "Softcase, Bening",
"productVariant": "SO908ELAB716EPANID-97510528",
"brandId": "17818",
"skuId": "183461134"
},
"itemUrl": "https://www.lazada.co.id/products/i160714927-s183461134.html?urlFlag=true&mp=1",
"cartItemId": 2006579310
},
"type": "biz"
},
"package_p43597338a": {
"id": "p43597338a",
"tag": "package",
"fields": {},
"type": "biz"
},
"listHeader_H": {
"id": "H",
"tag": "listHeader",
"fields": {
"middle": "HARGA",
"left": "3 barang",
"right": "KUANTITAS"
},
"type": "biz"
},
"delivery_3": {
"id": "3",
"tag": "delivery",
"fields": {
"selectPos": "left",
"options": [
{
"highlight": false,
"deliveryId": "STANDARD",
"bgColor": "#fafafa",
"price": "Rp56.700",
"disable": false,
"icon": "https://laz-img-cdn.alicdn.com/tfs/TB1UpyCpfDH8KJjy1XcXXcpdXXa-72-72.png",
"name": "Standar",
"reachTime": "Dapatkan pada\n 3-4 Apr 2018",
"liveUp": false,
"selected": true
}
],
"style": "card",
"title": "Pengiriman yang dipilih"
},
"type": "biz"
},
"orderTotal_8": {
"id": "8",
"tag": "orderTotal",
"fields": {
"button": {
"enable": true,
"text": "LANJUTKAN KE PEMBAYARAN",
"clicked": false
},
"payment": {
"taxTip": "Termasuk PPN, jika berlaku",
"pay": "Rp80.257",
"title": "Total"
}
},
"type": "biz"
},
"shop_fdaf8a8a_s23f9": {
"id": "fdaf8a8a_s23f9",
"tag": "shop",
"fields": {
"badges": [],
"link": "//www.lazada.co.id/shop/all-item-store",
"name": "ALL ITEM STORE"
},
"type": "biz"
},
"floatTips_4": {
"id": "4",
"tag": "floatTips",
"fields": {
"tips": []
},
"type": "biz"
},
"item_i7790e0f9": {
"id": "i7790e0f9",
"tag": "item",
"fields": {
"img": "http://id-live-02.slatic.net/p/2/case-anti-shock-anti-crack-elegant-softcase-for-xiaomi-redmi-note4x-white-clear-8431-85175402-0bec01e88741744ae5461c4b3a4ae160-catalog.jpg",
"quantity": {
"qtyPrefix": "Kuantitas",
"min": 1,
"autoOptions": false,
"quantity": 1,
"max": 5,
"editable": true,
"showIncrDecr": true,
"showOptions": false,
"step": 1
},
"sellerName": "incredible accessories hp",
"title": "Case Anti Shock / Anti Crack Elegant Softcase for Xiaomi Redmi Note 4x - White Clear",
"stockTip": {},
"valid": true,
"itemId": "108849535",
"operations": [
"wishlist",
"delete"
],
"sellerId": "53631",
"price": {
"price": 3977,
"currentPrice": "Rp3.977",
"originPrice": "Rp15.000",
"promotionRatio": "-73%"
},
"restriction": false,
"isGift": false,
"sku": {
"skuText": "Sarung, Bening",
"productVariant": "CA529ELAAC6GUEANID-27304198",
"brandId": "10464",
"skuId": "110628148"
},
"itemUrl": "https://www.lazada.co.id/products/i108849535-s110628148.html?urlFlag=true&mp=1",
"cartItemId": 2005983481
},
"type": "biz"
},
"package_pfdaf8a8a": {
"id": "pfdaf8a8a",
"tag": "package",
"fields": {},
"type": "biz"
}
},
"hierarchy": {
"component": [
"container",
"delivery",
"item",
"shop",
"package",
"listHeader",
"orderSummary",
"leftContainer",
"orderTotal",
"floatTips",
"root",
"location",
"rightContainer",
"voucherInput"
],
"root": "root_0",
"structure": {
"package_p43597338a": [
"delivery_d43597338a",
"shop_43597338a_s2c",
"item_i7799f86e",
"item_i7790e0f9"
],
"leftContainer_CL": [
"delivery_3",
"listHeader_H",
"package_pfdaf8a8a",
"package_p43597338a"
],
"container_C": [
"leftContainer_CL",
"rightContainer_CR"
],
"rightContainer_CR": [
"location_2",
"orderSummary_6"
],
"orderSummary_6": [
"voucherInput_7",
"orderTotal_8"
],
"root_0": [
"container_C",
"floatTips_4"
],
"package_pfdaf8a8a": [
"delivery_dfdaf8a8a",
"shop_fdaf8a8a_s23f9",
"item_i77997d6b"
]
}
},
"linkage": {
"input": [],
"request": [
"voucherInput_7",
"item_i7799f86e",
"delivery_3",
"orderTotal_8",
"item_i7790e0f9",
"location_2",
"item_i77997d6b"
],
"common": {
"compress": true,
"queryParams": "^^$$1afe141216814f45e9fc6dba84d4863d{$_$}H4sIAAAAAAAAAFWU247bRgyGXyXQ9caYGXJOe5fkJmmLot3de4Ez5OwKlSVBklMkQd69lJsNHBgwZIHfz9NPf+t2Wc/DRGN3/61bRtrbvJ67+26p3V230LPoc6V1777fdROd5XGh+vpu05DLJuu7bRuep0OgjrRtf2rYETGfTyN9JabTNfh0nlnG02Xc13k6LbTSeTv9fZH1y1/XZxVbZWJZ3w9ff7xRxXGutA/z9DTs46H6G/1Lb55keqaXu9+pnN68n7/MI43D3bvzIqOKEPMq2/aJNfrBQUrg49uHZIw1kMP/T97ZfPQ3b/sHrUoju+/aoJa8zJNM+/aoDdF+WeXa1TztNEyy9h800jUoDoPFWnMIVHwCMMiWCJ2zgVRX+xw+a2M9aHyMsbpqwcUChTgxeMZqbWzkJXl7G88IPkeARApKMCYQeJNYojCEVPTboMcCmOIvibgxtURXLrRaKGKrWdhHMslkE1ss2TjnCA+ujTPtT8Oy9XgkilVQsgEX2AXJgQ3UZCNb1yKEY1LDLud+iDEbMS0rYwAzx+KbE6JmcgCbs02YJShhyi2TI4eiTAHr0ROiZS0mOteYW9JChZwR+CVPbikce2Ev4HJiJmuab6TJIApIkZo4JFZmlKZb/LmiP66jqzVlEhe5GJEUs/XNk6csxYacDmrY9o9C6rf+oxJAHqAkJMc69RbZu+yBQKxFqTEcxA8r9u5wgagLRISDNuECYjAFk3NYUnU+Htczryr+eDmfSRcUlNEukSyVmqLXeNG9NnWF16RBV95emad5p7FPx5QNGnSUpHDUllsyGINQUKSp78L1Rus/eqb9cmsdgGp1vmpLcKjD1JsNTUGdqmu6olvuxjmlpaa7KV6wtkqU1UimOXA6gIjXk1mH55fbaT8olhJJMhFrrlwxRwuAmK00ELD6+8Dmee9Ndz9dxvGu217mpf9Zb7+5qirZ1YoYOXJlWw1r1Zz1w7kAxKvKlXutVzG4WlFvGUyKxkhB9RNL9ibmmoGTnkJFBT/Pl/oi66dpuex9VMa65LRQ0fRWnDjSZDogFZJkkYL+HfwHDyba3BsFAAA=",
"submitParams": "^^$$f7898231e0216a924bcaa64d213a61ba{$_$}H4sIAAAAAAAAADXMQQrDMAwEwL/oHPSA3PqBUsgLto4bDJJtJPvS4L/XCfS2uwx7UoumKUNoPakK2qeY0ko10EIVR5w5wBqNhTI0bhXhv/kk3aM93NORr4MgcH9OdomiLPhiB9+YtexRuEuzkrnCoM5bf2tqr7vQGD8eVpdVjwAAAA=="
},
"signature": "89dffd9ca23307e603556a4c896e4c56"
}
}
}
With jq I can do this:
[me#linux]$ cat /tmp/json | jq '.module | .data | keys'
[
"container_C",
"delivery_3",
"delivery_d43597338a",
"delivery_dfdaf8a8a",
"floatTips_4",
"item_i7790e0f9",
"item_i77997d6b",
"item_i7799f86e",
"leftContainer_CL",
"listHeader_H",
"location_2",
"orderSummary_6",
"orderTotal_8",
"package_p43597338a",
"package_pfdaf8a8a",
"rightContainer_CR",
"root_0",
"shop_43597338a_s2c",
"shop_fdaf8a8a_s23f9",
"voucherInput_7"
]
I need to get orderTotal_8, but note that the number 8 is always changing, so it can be orderTotal_10 or orderTotal_3, etc. How can I get this key with a pure jq command, if possible, without the help of grep/awk? The result should be orderTotal_8.
If you want to form a subarray of items satisfying some condition, just add map(select( CONDITION )) to the pipeline, e.g.
.module | .data | keys | map(select(test("^orderTotal_")))
If you just want a stream of the items matching the condition, then you could first form the stream, and then make the selection:
.module | .data | keys[] | select(test("^orderTotal_"))
For efficiency and other reasons, you might want to consider using keys_unsorted instead of keys.
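If you also need the object stored under that key, not just its name, a to_entries-based variant along the same lines works too (a sketch):
jq '.module.data | to_entries[] | select(.key | test("^orderTotal_")) | .value' /tmp/json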

Grafana status board

Not sure if this is possible, but we're trying to create an overall status dashboard in Grafana using the singlestat panel. We used templating to group our hosts into two sites and are using the packet-loss value from hostalive in Icinga2. We'd like the singlestat panel to show the percentage of hosts down, but sometimes we get null values. Here's the JSON from our panels:
{
"id": 2,
"title": "Host Group 2",
"span": 6,
"type": "singlestat",
"targets": [
{
"target": "icinga2.$group1.host.hostalive.perfdata.pl.value",
"refId": "A",
"hide": true
},
{
"target": "keepLastValue(averageSeries(#A))",
"refId": "B",
"textEditor": true,
"targetFull": "keepLastValue(averageSeries(icinga2.$group1.host.hostalive.perfdata.pl.value), 10000)"
}
],
"links": [],
"datasource": null,
"maxDataPoints": "",
"interval": null,
"cacheTimeout": null,
"format": "percent",
"prefix": "",
"postfix": "",
"nullText": null,
"valueMaps": [
{
"value": "null",
"op": "=",
"text": "N/A"
}
],
"mappingTypes": [
{
"name": "value to text",
"value": 1
},
{
"name": "range to text",
"value": 2
}
],
"rangeMaps": [
{
"from": "null",
"to": "null",
"text": "N/A"
}
],
"mappingType": 1,
"nullPointMode": "connected",
"valueName": "current",
"prefixFontSize": "50%",
"valueFontSize": "80%",
"postfixFontSize": "50%",
"thresholds": "50, 100",
"colorBackground": true,
"colorValue": false,
"colors": [
"rgba(50, 172, 45, 0.97)",
"rgba(237, 129, 40, 0.89)",
"rgba(245, 54, 54, 0.9)"
],
"sparkline": {
"show": false,
"full": false,
"lineColor": "rgb(31, 120, 193)",
"fillColor": "rgba(31, 118, 189, 0.18)"
},
"gauge": {
"show": false,
"minValue": 0,
"maxValue": 100,
"thresholdMarkers": true,
"thresholdLabels": false
}
}
The polling interval for the hosts is ten minutes; the Grafana board range is "today" and it is set to refresh every second.
Got it figured out with keepLastValue:
"targets": [
{
"hide": true,
"refId": "A",
"target": "exclude(keepLastValue(icinga2.$group1.host.hostalive.perfdata.pl.value, 144), '1-99')",
"textEditor": true
},
{
"refId": "B",
"target": "averageSeries(#A)",
"targetFull": "averageSeries(icinga2.$group1.host.hostalive.perfdata.pl.value)",
"textEditor": false
}
],
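If you would rather compute the "down" percentage purely from values instead of excluding series by name, newer Graphite versions (1.1+) have filterSeries, which can keep only the series whose last value reached 100 and count them. A sketch under that assumption (adjust the function names to your Graphite version):
asPercent(countSeries(filterSeries(keepLastValue(icinga2.$group1.host.hostalive.perfdata.pl.value, 144), 'last', '>=', 100)), countSeries(keepLastValue(icinga2.$group1.host.hostalive.perfdata.pl.value, 144)))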

Cannot filter results in Vimeo API GET call with fields param

I am using the Vimeo API. While making a GET call to access a user's videos I am doing:
https://api.vimeo.com/users/61402929/videos?access_token=token
This returns a JSON response:
{
"total": 1,
"page": 1,
"per_page": 25,
"paging": {
"next": null,
"previous": null,
"first": "/users/61402929/videos?access_token=365879aad6244864dab70902890fc1a1&page=1",
"last": "/users/61402929/videos?access_token=365879aad6244864dab70902890fc1a1&page=1"
},
"data": [
{
"uri": "/videos/200383630",
"name": "Bhuvan bam _ Bb ki vines _ playing piano _ Saagar jaisi aankhon waali _ bhuvan bam live (360p_30fps_H264-96kbit_AAC)",
"description": "BB Ki Vines",
"link": "https://vimeo.com/200383630",
"duration": 59,
"width": 320,
"language": null,
"height": 320,
"embed": {
"uri": null,
"html": "<iframe src=\"https://player.vimeo.com/video/200383630?badge=0&autopause=0&player_id=0\" width=\"320\" height=\"320\" frameborder=\"0\" title=\"Bhuvan bam _ Bb ki vines _ playing piano _ Saagar jaisi aankhon waali _ bhuvan bam live (360p_30fps_H264-96kbit_AAC)\" webkitallowfullscreen mozallowfullscreen allowfullscreen></iframe>",
"buttons": {
"like": true,
"watchlater": true,
"share": true,
"embed": true,
"hd": false,
"fullscreen": true,
"scaling": true
},
"logos": {
"vimeo": true,
"custom": {
"active": false,
"link": null,
"sticky": false
}
},
"title": {
"name": "user",
"owner": "user",
"portrait": "user"
},
"playbar": true,
"volume": true,
"color": "00adef"
},
"created_time": "2017-01-20T17:57:04+00:00",
"modified_time": "2017-01-20T17:58:41+00:00",
"release_time": "2017-01-20T17:57:04+00:00",
"content_rating": [
"unrated"
],
"license": null,
"privacy": {
"view": "anybody",
"embed": "public",
"download": true,
"add": true,
"comments": "anybody"
},
"pictures": {
"uri": "/videos/200383630/pictures/613872508",
"active": true,
"type": "custom",
"sizes": [
{
"width": 100,
"height": 75,
"link": "https://i.vimeocdn.com/video/613872508_100x75.webp?r=pad",
"link_with_play_button": "https://i.vimeocdn.com/filter/overlay?src0=https%3A%2F%2Fi.vimeocdn.com%2Fvideo%2F613872508_100x75.webp&src1=http%3A%2F%2Ff.vimeocdn.com%2Fp%2Fimages%2Fcrawler_play.png"
},
{
"width": 200,
"height": 150,
"link": "https://i.vimeocdn.com/video/613872508_200x150.webp?r=pad",
"link_with_play_button": "https://i.vimeocdn.com/filter/overlay?src0=https%3A%2F%2Fi.vimeocdn.com%2Fvideo%2F613872508_200x150.webp&src1=http%3A%2F%2Ff.vimeocdn.com%2Fp%2Fimages%2Fcrawler_play.png"
},
{
"width": 295,
"height": 166,
"link": "https://i.vimeocdn.com/video/613872508_295x166.webp?r=pad",
"link_with_play_button": "https://i.vimeocdn.com/filter/overlay?src0=https%3A%2F%2Fi.vimeocdn.com%2Fvideo%2F613872508_295x166.webp&src1=http%3A%2F%2Ff.vimeocdn.com%2Fp%2Fimages%2Fcrawler_play.png"
},
{
"width": 640,
"height": 640,
"link": "https://i.vimeocdn.com/video/613872508_640x640.webp?r=pad",
"link_with_play_button": "https://i.vimeocdn.com/filter/overlay?src0=https%3A%2F%2Fi.vimeocdn.com%2Fvideo%2F613872508_640x640.webp&src1=http%3A%2F%2Ff.vimeocdn.com%2Fp%2Fimages%2Fcrawler_play.png"
},
{
"width": 960,
"height": 960,
"link": "https://i.vimeocdn.com/video/613872508_960x960.webp?r=pad",
"link_with_play_button": "https://i.vimeocdn.com/filter/overlay?src0=https%3A%2F%2Fi.vimeocdn.com%2Fvideo%2F613872508_960x960.webp&src1=http%3A%2F%2Ff.vimeocdn.com%2Fp%2Fimages%2Fcrawler_play.png"
}
],
"resource_key": "fdb74e1e2dcaf7c929cfe14240765f45f2d2a302"
},
"tags": [],
"stats": {
"plays": 0
},
"metadata": {
"connections": {
"comments": {
"uri": "/videos/200383630/comments",
"options": [
"GET",
"POST"
],
"total": 0
},
"credits": {
"uri": "/videos/200383630/credits",
"options": [
"GET",
"POST"
],
"total": 1
},
"likes": {
"uri": "/videos/200383630/likes",
"options": [
"GET"
],
"total": 0
},
"pictures": {
"uri": "/videos/200383630/pictures",
"options": [
"GET",
"POST"
],
"total": 1
},
"texttracks": {
"uri": "/videos/200383630/texttracks",
"options": [
"GET",
"POST"
],
"total": 0
},
"related": null
},
"interactions": {
"watchlater": {
"added": false,
"added_time": null,
"uri": "/users/61402929/watchlater/200383630"
}
}
},
"user": {
"uri": "/users/61402929",
"name": "Rishabh Kumar",
"link": "https://vimeo.com/user61402929",
"location": null,
"bio": null,
"created_time": "2017-01-11T16:15:43+00:00",
"account": "basic",
"pictures": null,
"websites": [],
"metadata": {
"connections": {
"activities": {
"uri": "/users/61402929/activities",
"options": [
"GET"
]
},
"albums": {
"uri": "/users/61402929/albums",
"options": [
"GET"
],
"total": 0
},
"appearances": {
"uri": "/users/61402929/appearances",
"options": [
"GET"
],
"total": 0
},
"categories": {
"uri": "/users/61402929/categories",
"options": [
"GET"
],
"total": 0
},
"channels": {
"uri": "/users/61402929/channels",
"options": [
"GET"
],
"total": 0
},
"feed": {
"uri": "/users/61402929/feed",
"options": [
"GET"
]
},
"followers": {
"uri": "/users/61402929/followers",
"options": [
"GET"
],
"total": 0
},
"following": {
"uri": "/users/61402929/following",
"options": [
"GET"
],
"total": 0
},
"groups": {
"uri": "/users/61402929/groups",
"options": [
"GET"
],
"total": 0
},
"likes": {
"uri": "/users/61402929/likes",
"options": [
"GET"
],
"total": 0
},
"moderated_channels": {
"uri": "/users/61402929/channels?filter=moderated",
"options": [
"GET"
],
"total": 0
},
"portfolios": {
"uri": "/users/61402929/portfolios",
"options": [
"GET"
],
"total": 0
},
"videos": {
"uri": "/users/61402929/videos",
"options": [
"GET"
],
"total": 1
},
"watchlater": {
"uri": "/users/61402929/watchlater",
"options": [
"GET"
],
"total": 0
},
"shared": {
"uri": "/users/61402929/shared/videos",
"options": [
"GET"
],
"total": 0
},
"pictures": {
"uri": "/users/61402929/pictures",
"options": [
"GET",
"POST"
],
"total": 0
},
"watched_videos": {
"uri": "/me/watched/videos",
"options": [
"GET"
],
"total": 0
}
}
},
"preferences": {
"videos": {
"privacy": "anybody"
}
},
"content_filter": [
"language",
"drugs",
"violence",
"nudity",
"safe",
"unrated"
],
"resource_key": "6fe192b4cb782d1341fbf3fb3d0ba04a0295236d"
},
"app": null,
"status": "available",
"resource_key": "cad1f2b7d388491329363a4936f0219fa4dfd18b",
"embed_presets": null
}
]
}
However, I am only interested in the fields paging, total, and some fields of the data array, so I am using the fields filter as below:
https://api.vimeo.com/users/61402929/videos?access_token=token&fields=paging,data.name,data.description,data.link,data.pictures.sizes.link
But the response of the above call is:
{
"total": 1,
"page": 1,
"per_page": 25,
"paging": {
"next": null,
"previous": null,
"first": "/users/61402929/videos?access_token=365879********0902890fc1a1&fields=paging%2Cdata.name%2Cdata.description%2Cdata.link%2Cdata.pictures.sizes.link&page=1",
"last": "/users/61402929/videos?access_token=365879********0902890fc1a1&fields=paging%2Cdata.name%2Cdata.description%2Cdata.link%2Cdata.pictures.sizes.link&page=1"
},
"data": [
[]
]
}
Edit: Also the response for the call
https://api.vimeo.com/users/61402929/videos?access_token=token&fields=paging
is
{
"total": 1,
"page": 1,
"per_page": 25,
"paging": {
"next": null,
"previous": null,
"first": "/users/61402929/videos?access_token=365879*********70902890fc1a1&fields=paging&page=1",
"last": "/users/61402929/videos?access_token=365879*********70902890fc1a1&fields=paging&page=1"
},
"data": [
[]
]
}
I am unable to figure out why the request is not being processed correctly.
A couple of things: First, the access token should be passed in the auth header of the request, not in the request URI as you have it: https://developer.vimeo.com/api/authentication#making-requests
With regard to the fields filter, it only applies to the keys nested under data. The paging object is always returned for requests that return multiple items.
So your example request should look like this:
https://api.vimeo.com/users/61402929/videos?fields=name,description,link,pictures.sizes.link
More info here: https://developer.vimeo.com/api/spec#json-filter
Hope this helps!
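For completeness, a minimal curl sketch with the token moved into the Authorization header would be (YOUR_ACCESS_TOKEN is a placeholder):
curl -H "Authorization: Bearer YOUR_ACCESS_TOKEN" "https://api.vimeo.com/users/61402929/videos?fields=name,description,link,pictures.sizes.link"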
This is not a direct answer to this question, but in case anyone else ends up finding this page through search results: I wasted a couple of hours on a similar problem, wondering why the results returned by Vimeo's API were seemingly completely erratic, when calling the API from Laravel with this package like so:
$response = Vimeo::request('/me/albums/xxxxxxx/videos?fields=name,uri,duration,width,height,link', ['per_page' => 100], 'GET');
After a lot of trial and error I discovered that it always seemed to miss the last field listed in the query parameter.
Anyway, the solution was to always send the request with a trailing comma! i.e.
$response = Vimeo::request('/me/albums/xxxxxxx/videos?fields=name,uri,duration,width,height,link,', ['per_page' => 100], 'GET');
Is this an absolutely face-palmingly ridiculous requirement? Why, yes. Yes it is. But there you have it.