Automatic job creation on Jenkins using JSON

I want a new Jenkins job to be created when I execute a particular job on Jenkins.
I am using a JSON string to do so.
The following is what I use:
json="{\"parameter\": [{\"name\": \"task\", \"value\": \"$task\"}], \"\": \"\"}"
url=http://xx.xx.xx.xx:8080/job/$task/build
curl -X POST $url -d token=zorn --data-urlencode json="$json"
But when I execute this I get the following error:
+ json='{"parameter": [{"name": "task", "value": "test123"}], "": ""}'
+ url=http://xx.xx.xx.xx:8080/job/test123/build
+ curl -X POST http://xx.xx.xx.xx:8080/job/soma/build -d token=zorn --data-urlencode 'json={"parameter": [{"name": "task", "value": "test123"}], "": ""}'
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
0 0 0 0 0 138 0 286k --:--:-- --:--:-- --:--:-- 286k
200 263 131 263 0 138 158k 85132 --:--:-- --:--:-- --:--:-- 122k
<html><head><title>Error 404</title></head><body bgcolor="#ffffff"><h1>Status Code: 404</h1>Exception: <br>Stacktrace: <pre>(none)
</pre><br><hr size="1" width="90%"><i>Generated by Winstone Servlet Engine v0.9.10 at Mon Aug 26 12:36:10 IST 2013</i></body></html>Notifying upstream projects of job completion
Finished: SUCCESS
Could someone please guide me as to where I am going wrong?
The new job test123 does not get created.

Step1:
GET http://username:password@xx.xx.xx.xx:8080/crumbIssuer/api/json
Response:
{
"_class": "hudson.security.csrf.DefaultCrumbIssuer",
"crumb": "203a7c1e1d9c9b0accba64f41362801c",
"crumbRequestField": "Jenkins-Crumb"
}
Step2:
+ json='{"parameter": [{"name": "task", "value": "test123"}], "": ""}'
+ url=http://xx.xx.xx.xx:8080/job/test123/build
+ curl -X POST http://xx.xx.xx.xx:8080/job/soma/build -d \
token=zorn --data-urlencode \
'json={"parameter": [{"name": "task", "value": "test123"}], "": ""}' \
-H 'Jenkins-Crumb: 203a7c1e1d9c9b0accba64f41362801c'
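For what it's worth, POSTing to /job/NAME/build only triggers a job that already exists, which is why Jenkins answers 404 while test123 has not been created yet. A minimal sketch of creating the job first through Jenkins' createItem API (user:apitoken, template-job, and the local config.xml below are placeholders, not values from the question):
# fetch the config of an existing job to use as a template
curl -s -u user:apitoken http://xx.xx.xx.xx:8080/job/template-job/config.xml -o config.xml
# create the new job from that config (add the Jenkins-Crumb header if CSRF protection is enabled)
curl -X POST -u user:apitoken "http://xx.xx.xx.xx:8080/createItem?name=$task" \
--header "Content-Type: application/xml" --data-binary @config.xml
# only once the job exists can /job/$task/build (or buildWithParameters) trigger it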

curl: (3) [globbing] unmatched brace at pos 2

I keep receiving this error: curl: (3) [globbing] unmatched brace at pos 2
and I don't even understand where "pos 2" is.
Strangely, I don't have the same problem when I run the same thing from bash. I run this code in a Jenkins pipeline, which is why you see the sh ''' ... ''' wrapper:
sh '''
curl -u ${GIT_USERNAME}:${GIT_PASSWORD} -H "Content-Type: application/json" -X POST https://tools.company.my.com/bitbucket/rest/build-status/1.0/commits/$GIT_COMMIT --data-binary @- <<BODY \
{
"state": "SUCCESSFUL",
"key": "$JOB_BASE_NAME",
"name": "$BUILD_TAG",
"url": "$BUILD_URL",
"description": "change"
}
BODY
'''
Assuming there is no single quote in ${GIT_PASSWORD}:
sh -c "curl -u '${GIT_USERNAME}:${GIT_PASSWORD}' \
-H 'Content-Type: application/json' \
-X POST 'https://tools.company.my.com/bitbucket/rest/build-status/1.0/commits/$GIT_COMMIT' \
--data-binary @-" << BODY
{
"state": "SUCCESSFUL",
"key": "$JOB_BASE_NAME",
"name": "$BUILD_TAG",
"url": "$BUILD_URL",
"description": "change"
}
BODY
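As an aside (an addition, not part of the answer above): curl error 3 with "[globbing] unmatched brace" comes from curl's URL globbing, which treats { } and [ ] in its arguments as ranges. If braces from the JSON body still end up being globbed, the -g / --globoff switch turns that parsing off. A minimal sketch, assuming the body has first been written to a payload.json file (a hypothetical name):
# -g / --globoff disables curl's { } and [ ] globbing
curl -g -u "$GIT_USERNAME:$GIT_PASSWORD" \
-H "Content-Type: application/json" \
-X POST "https://tools.company.my.com/bitbucket/rest/build-status/1.0/commits/$GIT_COMMIT" \
--data-binary @payload.json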

Feathersjs verify user via verifySignupShort pin code

I am working on a proof of concept using FeathersJS for signing users up via phone number and verifying via SMS and a PIN: https://github.com/morenoh149/feathers-chat-phone-signup-sms
Currently I get the following error:
$ sh curls/user-verify.sh 118903
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
100 208 100 156 100 52 17067 5689 --:--:-- --:--:-- --:--:-- 17333
{
"name": "BadRequest",
"message": "Expected string value. (authManagement)",
"code": 400,
"className": "bad-request",
"data": {},
"errors": {
"$className": "badParams"
}
}
The API uses https://github.com/feathers-plus/feathers-authentication-management and I've done my best to adapt it to this use case. My curl looks like:
curl --header "Content-Type: application/json" \
--data '{ "action": "verifySignupShort", "value": 12345 }' \
http://localhost:3030/authmanagement
According to the docs, the value should be an object with a user and a token, and the token should be a string as well:
{
"action": "verifySignupShort",
"value": {
"token": "390494",
"user": {
"phoneNumber": "5005550006"
}
}
}
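For reference, the same payload sent with curl (a sketch reusing the sample token and phone number above and the question's localhost endpoint):
curl --header "Content-Type: application/json" \
--data '{ "action": "verifySignupShort", "value": { "token": "390494", "user": { "phoneNumber": "5005550006" } } }' \
http://localhost:3030/authmanagement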
Docs: https://github.com/feathers-plus/feathers-authentication-management/blob/master/docs.md

Filtering JSON curl results with jq selectors

I have the following multi-record JSON structure coming as a response from a web service:
[
{
"model": "analytics.request",
"pk": 89,
"fields": {
"intent": "GetUserAccountInformation",
"sub_intent": "",
"request_text": "what is my balance",
"response_text": "You have $111,111.11 and owe $111,111. Shall I break that down?",
"session_id": "1v69yiptamgse1niap51zr2o9x154169081480766b2d6b7-d3ce-4166-9292-c4669a3dfc14",
"user_id": "amzn1.ask.account.AHMFB2H7ZXMSCB5TLPYWCPBJGWGHFYQZ6OPPMEPQ7CT2LZXNPAFJMJELXXCCTEZ2JUE5PXRSBJHJTO3AVXT3C63AOEZLIEP3D3HIFBT5M23G4ORENFRT54AAK7I4X2HCORXJAB2UGQAHPE2TC75F2GWWZWSPO2CWEAZJJN7LJYFWEYHEJDLNQ6FSUD5LQWKBUC347K3IF32IV3I",
"device_id": "amzn1.ask.account.AHMFB2H7ZXMSCB5TLPYWCPBJGWGHFYQZ",
"device_type": "alexa",
"device_model": "NA",
"device_os": "alexa",
"elapsed_time": 23286,
"region": "NA",
"latitude": 0,
"longitude": 0,
"date": "2018-11-08T15:31:48.757+0000",
"ext_session_id": "",
"ext_user_id": "",
"score": 1,
"platform_type": "alexa",
"platform_user_id": "amzn1.ask.account.AHMFB2H7ZXMSCB5TLPYWCPBJGWGHFYQZ6OPPMEPQ7CT2LZXNPAFJMJELXXCCTEZ2JUE5PXRSBJHJTO3AVXT3C63AOEZLIEP3D3HIFBT5M23G4ORENFRT54AAK7I4X2HCORXJAB2UGQAHPE2TC75F2GWWZWSPO2CWEAZJJN7LJYFWEYHEJDLNQ6FSUD5LQWKBUC347K3IF32IV3I",
"platform_conversation_id": "amzn1.echo-api.session.9122211c-7fa8-43ae-b7e3-e56ebbdacc01",
"registered": false,
"segment_names": [
"Seg0715080610AM",
"Seg0715071857AM",
"Seg0715070437AM",
"Seg0715063307AM",
"Seg0715054215AM",
"Seg0715054027AM",
"Seg0715053845AM",
"Seg0715053646AM",
"Seg0715053156AM",
"Seg0715052407AM",
"Seg0715045748AM",
"Seg0714145246PM",
"Seg0714135150PM",
"Seg0714134041PM",
"Seg0714012505AM",
"Seg0714004152AM",
"Seg0714002843AM",
"Seg0713235444PM",
"Seg0713230241PM",
"Seg0713230030PM",
"Seg0713225825PM",
"Seg0713225548PM",
"Seg0713225028PM",
"Seg0713224159PM",
"Seg0713220356PM",
"Seg0711155346PM",
"Seg0711154018PM",
"Seg0711153432PM",
"Seg0711140748PM",
"Seg0711135636PM",
"Seg0711131412PM",
"Seg0711130857PM",
"Seg0711130157PM",
"Seg0711125338PM",
"Seg0711113158AM",
"customer",
"cmstest7",
"cmstest5",
"cmstest3",
"cmstest2",
"gffyrth",
"1testQR",
"1testgallery",
"121212"
]
}
},
{
"model": "analytics.request",
"pk": 90,
"fields": {
"intent": "GetUserAccountInformation",
"sub_intent": "",
"request_text": "what is my balance",
"response_text": "You have $111,111.11 and owe $111,111. Shall I break that down?",
"session_id": "1v69yiptamgse1niap51zr2o9x154169081480766b2d6b7-d3ce-4166-9292-c4669a3dfc14",
"user_id": "amzn1.ask.account.AHMFB2H7ZXMSCB5TLPYWCPBJGWGHFYQZ6OPPMEPQ7CT2LZXNPAFJMJELXXCCTEZ2JUE5PXRSBJHJTO3AVXT3C63AOEZLIEP3D3HIFBT5M23G4ORENFRT54AAK7I4X2HCORXJAB2UGQAHPE2TC75F2GWWZWSPO2CWEAZJJN7LJYFWEYHEJDLNQ6FSUD5LQWKBUC347K3IF32IV3I",
"device_id": "amzn1.ask.account.AHMFB2H7ZXMSCB5TLPYWCPBJGWGHFYQZ",
"device_type": "alexa",
"device_model": "NA",
"device_os": "alexa",
"elapsed_time": 2013,
"region": "NA",
"latitude": 0,
"longitude": 0,
"date": "2018-11-08T15:32:40.090+0000",
"ext_session_id": "",
"ext_user_id": "",
"score": 1,
"platform_type": "alexa",
"platform_user_id": "amzn1.ask.account.AHMFB2H7ZXMSCB5TLPYWCPBJGWGHFYQZ6OPPMEPQ7CT2LZXNPAFJMJELXXCCTEZ2JUE5PXRSBJHJTO3AVXT3C63AOEZLIEP3D3HIFBT5M23G4ORENFRT54AAK7I4X2HCORXJAB2UGQAHPE2TC75F2GWWZWSPO2CWEAZJJN7LJYFWEYHEJDLNQ6FSUD5LQWKBUC347K3IF32IV3I",
"platform_conversation_id": "amzn1.echo-api.session.8586bff8-21a5-4336-9c9f-1c261362b89d",
"registered": false,
"segment_names": [
"Test_Segment0731174425PM",
"Test_Segment0731172344PM",
"Test_Segment0731165438PM",
"Test_Segment0731164321PM",
"Test_Segment0731163349PM",
"Test_Segment0731161939PM",
"Test_Segment0731160424PM",
"Test_Segment0730160925PM",
"Test_Segment0730154627PM",
"Test_Segment0730152806PM",
"Test_Segment0730152328PM",
"Test_Segment0730150203PM",
"Test_Segment0730141720PM",
"Test_Segment0730141304PM",
"testseg",
"Tester",
"Test0730162816PM",
"Test0730162304PM",
"Test0730161031PM",
"Test0730160632PM",
"Test0730160502PM",
"Seg0730154521PM",
"Seg0730154322PM",
"Seg0730153811PM",
"Seg0730153303PM",
"Seg0716231700PM",
"Seg0716230741PM",
"Seg0715080610AM",
"Seg0715071857AM",
"Seg0715070437AM",
"Seg0715063307AM",
"Seg0715054215AM",
"Seg0715054027AM",
"Seg0715053845AM",
"Seg0715053646AM",
"Seg0715053156AM",
"Seg0715052407AM",
"Seg0715045748AM",
"Seg0714145246PM",
"Seg0714135150PM",
"Seg0714134041PM",
"Seg0714012505AM",
"Seg0714004152AM",
"Seg0714002843AM",
"Seg0713235444PM",
"Seg0713230241PM",
"Seg0713230030PM",
"Seg0713225825PM",
"Seg0713225548PM",
"Seg0713225028PM",
"Seg0713224159PM",
"Seg0713220356PM",
"Seg0711155346PM",
"Seg0711154018PM",
"Seg0711153432PM",
"Seg0711140748PM",
"Seg0711135636PM",
"Seg0711131412PM",
"Seg0711130857PM",
"Seg0711130157PM",
"Seg0711125338PM",
"Seg0711113158AM",
"customer",
"cmstest7",
"cmstest5",
"cmstest3",
"cmstest2",
"gffyrth",
"1testQR",
"1testgallery",
"121212"
]
}
},
{...},
{...},
{...}
]
and I would like to extract only certain objects from it with jq, using its selectors.
Say, return only the records where .fields.user_id=="abc"
Or, return only the records where .fields.session_id=="1e2d3f"
select seems to be the way to achieve this, but I'm not sure how to express the above with it, given the JSON structure shown.
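Spelled out against the structure above, I assume it would be something along these lines (a sketch with the placeholder values from the two examples; response.json stands in for the curl output):
# stream the array elements and keep only the matching ones
jq '.[] | select(.fields.user_id == "abc")' response.json
# or keep the result wrapped in an array
jq 'map(select(.fields.session_id == "1e2d3f"))' response.json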
The following produces an empty result, whereas it should return records for that session_id:
curl -X GET 'http://localhost:8090/xxx/api/v1/results?dr=last5days' -H 'Cache-Control: no-cache' -H 'Postman-Token: f6b819d0-f4f6-4def-bccb-3967366779c7' -H 'secret: 5a04bfef-39eb-435a-a0d0-b274592790bb' | jq '.[] | select(.fields.session_id=="1v69yiptamgse1niap51zr2o9x154169081480766b2d6b7-d3ce-4166-9292-c4669a3dfc14")'
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
100 21785 0 21785 0 0 272k 0 --:--:-- --:--:-- --:--:-- 272k
$
Trying different combinations with jq's entries functions which don't quite work:
curl -X GET 'http://localhost:8080/xxx/api/v1/results?dr=last5days' -H 'Cache-Control: no-cache' -H 'Postman-Token: f6b819d0-f4f6-4def-bccb-3967366779c7' -H 'secret: 5a04bf34ef-39eb-435a-a0d0-b278765790bb' | jq 'with_entries(select(.value.pk==98))'
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
100 31755 0 31755 0 0 127k 0 --:--:-- --:--:-- --:--:-- 127k
jq: error (at <stdin>:274): Cannot use number (9) as object key
Can someone please give me a hand creating a correct jq query? TIA.
===
Trying to implement a suggestion expressed in the comments:
This jq expression:
map(select(.fields.user_id=="amzn1.ask.account.AHMFB2H7ZXMSCB5TLPYWCPBJGWGHFYQZ6OPPMEPQ7CT2LZXNPAFJMJELXXCCTEZ2JUE5PXRSBJHJTO3AVXT3C63AOEZLIEP3D3HIFBT5M23G4ORENFRT54AAK7I4X2HCORXJAB2UGQAHPE2TC75F2GWWZWSPO2CWEAZJJN7LJYFWEYHEJDLNQ6FSUD5LQWKBUC347K3IF32IV3I")) | length
seems to work in the jq playground (it returns the correct count of 3), but when used on the command line it returns a wrong result (an empty array, and thus a length of 0):
$ curl -X GET 'http://localhost:8080/xxx/api/v1/results?dr=last5days' -H 'Cache-Control: no-cache' -H 'Postman-Token: f6b819d0-f4f6-4def-bccb-3967366779c7' -H 'secret: 5a04bfef-39eb-435a-a0d0-b271392790bb' | jq 'map(select(.fields.user_id=="amzn1.ask.account.AHMFB2H7ZXMSCB5TLPYWCPBJGWGHFYQZ6OPPMEPQ7CT2LZXNPAFJMJELXXCCTEZ2JUE5PXRSBJHJTO3AVXT3C63AOEZLIEP3D3HIFBT5M23G4ORENFRT54AAK7I4X2HCORXJAB2UGQAHPE2TC75F2GWWZWSPO2CWEAZJJN7LJYFWEYHEJDLNQ6FSUD5LQWKBUC347K3IF32IV3I")) | length'
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
100 21785 0 21785 0 0 95630 0 --:--:-- --:--:-- --:--:-- 95969
0
$
What am I still doing wrong?
======
As per the suggestions in the comments, the following query with debug returns JSON but still doesn't produce the correct counts:
$ curl -X GET 'http://localhost:8080/xxx/api/v1/results?dr=last5days' -H 'Cache-Control: no-cache' -H 'Postman-Token: f6b819d0-f4f6-4def-bccb-3967366779c7' -H 'secret: 5a04bfef-39eb-435a-a0d0-b271392790bb' | jq ' debug | map(select(.fields.session_id=="1v69yiptamgse1niap51zr2o9x154169081480766b2d6b7-d3ce-4166-9292-c4669a3dfc14")) | length'
Using jq versions 1.6, 1.5, and 1.4, I've tested your jq program using the JSON version of the data, and all is well (on a Mac).
Please read http://stackoverflow.com/help/mcve - if you follow those guidelines, it should make debugging, communication, and/or life in general much easier for all concerned.
I'd suggest putting your jq program into a file and using the -f command-line option, at least while you're debugging this.
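Concretely, that last suggestion could look like this (a sketch; filter.jq is just a scratch file name):
# filter.jq -- keeping the program in a file sidesteps shell-quoting issues with the long IDs
map(select(.fields.session_id == "1v69yiptamgse1niap51zr2o9x154169081480766b2d6b7-d3ce-4166-9292-c4669a3dfc14")) | length
and then run it with:
curl -s 'http://localhost:8080/xxx/api/v1/results?dr=last5days' -H 'Cache-Control: no-cache' -H 'secret: 5a04bfef-39eb-435a-a0d0-b271392790bb' | jq -f filter.jq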

Fiware - Context broker: Issue with NGSIv2 subscriptions

I'm working with an Orion Context Broker version 1.2.0. I have subscribed two different Cygnus instances (0.11 and 0.13) to it using NGSIv2, as follows:
(curl 172.21.0.23:1026/v2/subscriptions -s -S --header 'Fiware-Service: prueba_015_adapter' --header 'Fiware-ServicePath: /Prueba/Planta_3' --header 'Content-Type: application/json' -d @- ) <<EOF
{
"description": "Cygnus subscription",
"subject": {
"entities": [
{
"idPattern": ".*",
"type": "density_algorithm"
}
],
"condition": {
"attrs": []
}
},
"notification": {
"http": {
"url": "http://172.21.0.33:5050/notify"
},
"attrs": []
}
}
EOF
But when the Context Broker sends a notification to either of these Cygnus modules, the following error appears in the log:
15 jun 2016 12:46:48,641 INFO [1469152682@qtp-857344131-3153] (com.telefonica.iot.cygnus.handlers.OrionRestHandler.getEvents:150) - Starting transaction (1463998603-759-0001644173)
15 jun 2016 12:46:48,641 INFO [1469152682@qtp-857344131-3153] (com.telefonica.iot.cygnus.handlers.OrionRestHandler.getEvents:232) - Received data ({"subscriptionId":"57612ed9efa20b5b23e71bd5","data":[{"id":"C-A2","type":"density_algorithm","densityPlan":{"type":"string","value":"C-A2","metadata":{}},"devices":{"type":"string","value":"43","metadata":{}},"timestamp":{"type":"string","value":"2016-06-15T12:53:26.294+02:00","metadata":{}}}]})
15 jun 2016 12:46:48,641 INFO [1469152682@qtp-857344131-3153] (com.telefonica.iot.cygnus.handlers.OrionRestHandler.getEvents:255) - Event put in the channel (id=957931298, ttl=-1)
15 jun 2016 12:46:48,642 WARN [1469152682@qtp-857344131-3153] (com.telefonica.iot.cygnus.interceptors.GroupingInterceptor.intercept:289) - No context responses within the notified entity, nothing is done
15 jun 2016 12:46:48,642 WARN [1469152682@qtp-857344131-3153] (org.apache.flume.source.http.HTTPSource$FlumeHTTPServlet.doPost:203) - Error appending event to channel. Channel might be full. Consider increasing the channel capacity or make sure the sinks perform faster. org.apache.flume.ChannelException: Unable to put batch on required channel: org.apache.flume.channel.MemoryChannel{name: mongo-channel}
at org.apache.flume.channel.ChannelProcessor.processEventBatch(ChannelProcessor.java:200)
at org.apache.flume.source.http.HTTPSource$FlumeHTTPServlet.doPost(HTTPSource.java:201)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:725)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:814)
at org.mortbay.jetty.servlet.ServletHolder.handle(ServletHolder.java:511)
at org.mortbay.jetty.servlet.ServletHandler.handle(ServletHandler.java:401)
at org.mortbay.jetty.servlet.SessionHandler.handle(SessionHandler.java:182)
at org.mortbay.jetty.handler.ContextHandler.handle(ContextHandler.java:766)
at org.mortbay.jetty.handler.HandlerWrapper.handle(HandlerWrapper.java:152)
at org.mortbay.jetty.Server.handle(Server.java:326)
at org.mortbay.jetty.HttpConnection.handleRequest(HttpConnection.java:542)
at org.mortbay.jetty.HttpConnection$RequestHandler.content(HttpConnection.java:945)
at org.mortbay.jetty.HttpParser.parseNext(HttpParser.java:756)
at org.mortbay.jetty.HttpParser.parseAvailable(HttpParser.java:218)
at org.mortbay.jetty.HttpConnection.handle(HttpConnection.java:404)
at org.mortbay.jetty.bio.SocketConnector$Connection.run(SocketConnector.java:228)
at org.mortbay.thread.QueuedThreadPool$PoolThread.run(QueuedThreadPool.java:582) Caused by: java.lang.IllegalArgumentException: put() called with null event!
at com.google.common.base.Preconditions.checkArgument(Preconditions.java:88)
at org.apache.flume.channel.BasicTransactionSemantics.put(BasicTransactionSemantics.java:89)
at org.apache.flume.channel.BasicChannelSemantics.put(BasicChannelSemantics.java:80)
at org.apache.flume.channel.ChannelProcessor.processEventBatch(ChannelProcessor.java:189)
... 16 more
If I use NGSIv1 instead to register both subscriptions, everything goes fine: no log error is shown and the data is persisted into both Cygnus modules.
(curl 172.21.0.23:1026/v1/subscribeContext -s -S --header 'Fiware-Service: prueba_015_adapter' --header 'Fiware-ServicePath: /Prueba/Planta_3' --header 'Content-Type: application/json' --header 'Accept: application/json' -d @- ) <<EOF
{
"entities": [
{
"type": "density_algorithm",
"isPattern": "true",
"id": ".*"
}
],
"attributes": [],
"reference": "http://172.21.0.33:5050/notify",
"duration": "P1M",
"notifyConditions": [
{
"type": "ONCHANGE",
"condValues": []
}
]
}
EOF
I'm sending the entities to the context broker using NGSIv1. Can the problem be due to an incompatibility between NGSIv1 and NGSIv2?
Thanks in advance
For the time being, NGSIv2 notifications are not supported in Cygnus. It is expected to be implemented, but it has not been scheduled yet.
However, you can set attrsFormat (inside the notification field) to legacy so that Orion sends the notification in NGSIv1 format (have a look at the Orion documentation for more detailed information). The NGSIv1 notification format is fully supported by Cygnus, so that should work.
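As a sketch (assuming your Orion version accepts attrsFormat inside the notification block), the NGSIv2 subscription above would only need one extra field:
(curl 172.21.0.23:1026/v2/subscriptions -s -S --header 'Fiware-Service: prueba_015_adapter' --header 'Fiware-ServicePath: /Prueba/Planta_3' --header 'Content-Type: application/json' -d @- ) <<EOF
{
"description": "Cygnus subscription (legacy notification format)",
"subject": {
"entities": [ { "idPattern": ".*", "type": "density_algorithm" } ],
"condition": { "attrs": [] }
},
"notification": {
"http": { "url": "http://172.21.0.33:5050/notify" },
"attrs": [],
"attrsFormat": "legacy"
}
}
EOF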

Why does this JSON POST post an empty file?

I am trying to post the following JSON to a URL using cURL in Terminal:
[
{
"token": "ABCDEF",
"templateId": "{1234-5678-9}",
"senders": "null",
"viewers": "null",
"peoples": "null",
"fields": {
"Matter Name": "My test matter name",
"Matter Number": "ABC123"
}
}
]
This is how I POST it in Terminal:
curl -v -k -X POST -H "Content-Type: application/json" -d docfile=@test.json https://myWebsite.com/extension/extension/extension
The Terminal output clearly says that something was posted, but the part that confuses me is this excerpt from the output: upload completely sent off: 18 out of 18 bytes
Only 18 bytes were sent? My file is 218 bytes... Why is this file not being POSTed? What is being POSTed?
The problem is with your -d switch. From the documentation:
-d
Sends the specified data in a POST request to the HTTP server... If you start the data with the letter @, the rest should be a file name to read the data from.
What you are passing to the -d switch does not begin with "@", so it is being interpreted as actual data. You'll notice docfile=@test.json IS actually 18 bytes.
You need to change it from -d docfile=@test.json to -d @test.json.
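Put together, the corrected call could look like this (a sketch; --data-binary is used instead of -d so the file is sent byte for byte, without -d's newline stripping):
curl -v -k -X POST -H "Content-Type: application/json" \
--data-binary @test.json \
https://myWebsite.com/extension/extension/extension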