Pass data with SimpleHttpOperator to trigger Cloud Function 2nd gen - google-cloud-functions

I have the following task:
this_is_a_task = SimpleHttpOperator(
    task_id='task_id',
    method='POST',
    http_conn_id='conn_id',
    endpoint='/?test=foo',
    # data={"test": "foo"},
    headers={"Content-Type": "application/json"}
)
On the Cloud Functions side, I'm trying to catch the parameters in the following two ways:
# catching data
# test_data = request.get_json().get('test')
# print('test: {}'.format(test_data))
# catching endpoint
test_endpoint = request.args.get('test')
print('test: {}'.format(test_endpoint))
The second option works (request.args.get('test')); however, when trying the first option (request.get_json().get('test')) I get a 400 request error.
So if I'm not using the endpoint variable of my SimpleHttpOperator, how can I catch the JSON object passed in the data parameter?

I've tried to replicate your issue; based on this documentation, you need to wrap the payload in json.dumps when you POST JSON data, then provide authentication credentials as a Google-generated ID token in an Authorization header.
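To see why json.dumps matters: if you pass a plain dict to data=, the underlying HTTP hook form-encodes it rather than serializing it as JSON, so request.get_json() on the function side has nothing to parse. A quick illustration of the two encodings:

```python
import json
from urllib.parse import urlencode

payload = {"test": "foo"}

form_encoded = urlencode(payload)   # what a plain dict turns into
json_encoded = json.dumps(payload)  # what request.get_json() can parse

print(form_encoded)  # test=foo
print(json_encoded)  # {"test": "foo"}
```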
See below sample code:
import datetime
import json
from airflow import models
from airflow.operators import bash
from airflow.providers.http.operators.http import SimpleHttpOperator
YESTERDAY = datetime.datetime.now() - datetime.timedelta(days=1)
default_args = {
    'owner': 'Composer Example',
    'depends_on_past': False,
    'email': [''],
    'email_on_failure': False,
    'email_on_retry': False,
    'retries': 1,
    'retry_delay': datetime.timedelta(minutes=5),
    'start_date': YESTERDAY,
}

with models.DAG(
        'composer_quickstart',
        catchup=False,
        default_args=default_args,
        schedule_interval=datetime.timedelta(days=1)) as dag:

    # Generate an identity token to authenticate the function call
    gen_auth = bash.BashOperator(
        task_id='gen_auth', bash_command='gcloud auth print-identity-token'
    )
    auth_token = "{{ task_instance.xcom_pull(task_ids='gen_auth') }}"

    this_is_a_task = SimpleHttpOperator(
        task_id='task_id',
        method='POST',
        http_conn_id='cf_conn1',
        data=json.dumps({"test": "foo"}),
        headers={"Content-Type": "application/json",
                 "Authorization": "Bearer " + auth_token}
    )

    gen_auth >> this_is_a_task
On the Cloud Functions side, I used the sample code below:
test_data = request.get_json().get('test')
print(test_data)
return test_data
You can also test your function with this curl command:
curl -i -X POST -H "Content-Type: application/json" -H "Authorization: Bearer $(gcloud auth print-identity-token)" -d '{"test": "foo"}' https://function-5-k6ssrsqwma-uc.a.run.app

Related

Send multiple Tasks to Celery worker JSON

I am quite new at this. I have a Celery + FastAPI setup that takes JSON tasks and does some simple calculations with them.
@app.post("/ex1")
def run_task(data=Body(...)):
    a = int(data["a"])
    b = data["b"]
    add = data["add"]
    sub = data["sub"]
    mul = data["mul"]
    div = data["div"]
    task = addieren_task.delay(a, b, add, sub, mul, div)
    return JSONResponse({"Result": task.get()})
This is what the conversion of the JSON into a task looks like.
I send the tasks from bash with this command:
curl http://localhost:8000/ex1 -H "Content-Type: application/json" --data '{"a": 4, "b":19, "add": 0, "sub": 1, "mul": 0, "div": 1}'
Can someone help me send multiple JSON tasks, so I can better monitor the whole setup with Flower?
Thanks in advance
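The single-payload curl above generalizes to many tasks: build a list of payloads and POST each one. A sketch (the /ex1 URL and field names come from the question; the actual HTTP call is commented out so the snippet runs without the server):

```python
import json

# Hypothetical batch: several payloads varying only in "a".
payloads = [
    {"a": a, "b": 19, "add": 1, "sub": 0, "mul": 0, "div": 0}
    for a in (4, 7, 12)
]

for p in payloads:
    body = json.dumps(p)
    print(body)
    # requests.post("http://localhost:8000/ex1",
    #               headers={"Content-Type": "application/json"},
    #               data=body)
```

Each POST enqueues one Celery task, so Flower should then show one entry per payload.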

Make a controller in Odoo to handle JSON

I am new to Odoo and I have created a module with the scaffold command as follows:
"C:\Program Files (x86)\Odoo 11.0\python\python.exe" "C:\Program Files (x86)\Odoo 11.0\server\odoo-bin" scaffold api4 "C:\Users\Carlos\Desktop\custom_addons"
When I create this base redirect controller, it works fine:
# -*- coding: utf-8 -*-
from odoo import http
from odoo.http import request
import json

class Api4(http.Controller):
    @http.route('/api4/api4/', auth='public', website=True)
    def index(self):
        return request.redirect('/web/')
But when I add another @http.route to receive JSON and process its data, it doesn't work, and the one I created previously stops working:
@http.route('/api/json_get_request', auth='public', type='json', csrf=False)
def jsontest(self, **kw):
    return {'attribute': 'test'}
The code is basic, but I wanted to see whether sending any JSON would return {'attribute': 'test'}; instead it returned this:
{
    "jsonrpc": "2.0",
    "id": null,
    "error": {
        "code": 404,
        "message": "404: Not Found",
        "data": {
            "name": "werkzeug.exceptions.NotFound",
            "debug": "Traceback (most recent call last): \ n File \" C: \\ Program Files (x86) \\ Odoo 11.0 \\ server \\ odoo \\ http.py \ ", line 653, in _handle_exception \ n return super (JsonRequest, self) ._ handle_exception (exception) \ n File \ "C: \\ Program Files (x86) \\ Odoo 11.0 \\ server \\ odoo \\ http.py \", line 312, in _handle_exception \ n raise pycompat.reraise (type (exception), exception, sys.exc_info () [2]) \ n File \ "C: \\ Program Files (x86) \\ Odoo 11.0 \\ server \\ odoo \\ tools \\ pycompat.py \ ", line 86, in reraise \ n raise value.with_traceback (tb) \ nwerkzeug.exceptions.NotFound: 404 Not Found: The requested URL was not found on the server. If you entered the URL manually please check your spelling and try again. \ n ",
            "message": "404 Not Found: The requested URL was not found on the server. If you entered the URL manually please check your spelling and try again.",
            "arguments": [],
            "exception_type": "internal_error"
        },
        "http_status": 404
    }
}
(screenshot of the error in Postman)
Add -d or --db-filter to your odoo-bin command to single out only one database, e.g. python3 odoo-bin --addons-path addons,mymodules -d newdatabase. As far as I know, an API route with auth='public' raises this kind of error when there are multiple Odoo databases.
An alternative solution is to use the endpoint with auth='user'. You will need to get a login cookie first, though. More on this: How to connect to Odoo database from an android application
Hello Carlos Alberto Florio Luis,
1) Clear all the cache and history in your browser.
2) Keep only one database in use and remove the other databases,
or
1) Use --db-filter database-name to load a single database.
Also make sure the route you define contains no whitespace ('/api/json_get_request').
Thanks

Requests for Instagram Access token

I am working with the Instagram API in Django (Python).
I am getting the code from
'https://api.instagram.com/oauth/authorize/?client_id=%s&response_type=code&redirect_uri=%s' % (INSTAGRAM_APP_CONSUMER_KEY, redirect_uri)
but when I exchange the code for an access token, the code fails:
# All arguments are valid
def execute(self, code, redirect_uri, app_id, app_secret):
    exchange_url = 'https://api.instagram.com/oauth/access_token'
    try:
        # Update: changed the request from GET to POST
        r = requests.post(exchange_url, params={
            'client_id': app_id,
            'redirect_uri': redirect_uri,
            'client_secret': app_secret,
            'code': code,
            'grant_type': 'authorization_code'
        })
        # print(r)
        # print(json.loads(r))
        print(r.json())
        return r.json()
    except Exception as e:
        print(e)
r.json() gives simplejson.scanner.JSONDecodeError: Expecting value: line 1 column 1.
Update 1: r.json() works after changing the request from GET to POST, but now the error message 'You must provide a client_id' comes back.
Please let me know what I am doing wrong.
I think it requires the data in the POST body, not in the query params.
Try this:
your_post_data = {'client_id': '', ... }
r = requests.post('your_url', data=your_post_data)
Ref: Python Requests Docs
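The distinction matters because params= puts the values into the query string, while data= form-encodes them into the request body, which is where the token endpoint looks for client_id. A sketch with placeholder values (everything here is illustrative, not real credentials):

```python
from urllib.parse import urlencode

# Placeholder values -- substitute your real app credentials.
post_data = {
    "client_id": "APP_ID",
    "client_secret": "APP_SECRET",
    "redirect_uri": "https://example.com/callback",
    "code": "CODE_FROM_REDIRECT",
    "grant_type": "authorization_code",
}

# requests.post(url, data=post_data) sends this form-encoded string as the body:
body = urlencode(post_data)
print(body.split("&")[0])  # client_id=APP_ID
```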

mosquitto - disable subscribing without authorization

I am using mosquitto version 1.4.10 with tls-certificate. I am using this plugin https://github.com/mbachry/mosquitto_pyauth to authorize users, and it works well for mosquitto_pub (when someone tries to publish, the publish is authorized by the module first).
However, it seems that mosquitto_sub is able to subscribe to anything without authorization. How do I enforce security when someone is just trying to access a topic in read-only mode?
I went through the mosquitto.conf file and can't seem to find anything related to this.
for example, I am able to subscribe like this:
mosquitto_sub --cafile /etc/mosquitto/ca.crt --cert /etc/mosquitto/client.crt --key /etc/mosquitto/client.key -h ubuntu -p 1883 -t c/# -d
and able to see messages coming from some publisher like this:
mosquitto_pub --cafile /etc/mosquitto/ca.crt --cert /etc/mosquitto/client.crt --key /etc/mosquitto/client.key -h ubuntu -p 1883 -t c/2/b/3/p/3/rt/13/r/123 -m 32 -q 1
What I am trying to do is prevent mosquitto_sub reading all messages at root level without authorization .
The Python code that does the authorization looks like this (the auth data is stored in a Cassandra DB):
import sys
import mosquitto_auth
from cassandra.cluster import Cluster
from cassandra import ConsistencyLevel

## program entry point from mosquitto...
def plugin_init(opts):
    global cluster, session, select_device_query
    conf = dict(opts)
    cluster = Cluster(['192.168.56.102'])
    session = cluster.connect('hub')
    select_device_query = session.prepare('SELECT * from devices where uid=?')
    select_device_query.consistency_level = ConsistencyLevel.QUORUM
    print 'Cassandra cluster initialized'

def acl_check(clientid, username, topic, access):
    device_data = session.execute(select_device_query, [username])
    if len(device_data.current_rows) > 0:
        device_data = device_data[0]
        # sample device data looks like this:
        # Row(uid=u'08:00:27:aa:8f:91', brand=3, company=2, device=15617, property=3, room=490, room_number=u'3511', room_type=13, stamp=datetime.datetime(2016, 12, 12, 6, 29, 54, 723000))
        subscribable_topic = 'c/' + str(device_data.company) \
            + '/b/' + str(device_data.brand) \
            + '/p/' + str(device_data.property) \
            + '/rt/' + str(device_data.room_type) \
            + '/r/' + str(device_data.room) \
            + '/#'
        matches = mosquitto_auth.topic_matches_sub(subscribable_topic, topic)
        print 'ACL: user=%s topic=%s, matches = %s' % (username, topic, matches)
        return matches
    return False
The function acl_check seems to always be called when mosquitto_pub connects, but never when mosquitto_sub connects.
the C code behind this python module is here: https://github.com/mbachry/mosquitto_pyauth/blob/master/auth_plugin_pyauth.c
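For reference, mosquitto_auth.topic_matches_sub applies standard MQTT wildcard matching. A minimal pure-Python approximation of its semantics (this is an illustration, not the plugin's actual implementation):

```python
def topic_matches_sub(sub, topic):
    """Rough MQTT wildcard match: '+' matches one level, '#' matches the rest."""
    sub_parts = sub.split('/')
    topic_parts = topic.split('/')
    for i, sp in enumerate(sub_parts):
        if sp == '#':
            return True
        if i >= len(topic_parts):
            return False
        if sp != '+' and sp != topic_parts[i]:
            return False
    return len(sub_parts) == len(topic_parts)

print(topic_matches_sub('c/2/b/3/#', 'c/2/b/3/p/3/rt/13'))  # True
print(topic_matches_sub('c/2/b/3/#', 'c/9/x'))              # False
```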
Add the following to your mosquitto.conf:
...
allow_anonymous false
...
This will stop users without credentials from logging on to the broker.
You can also add an acl rule for the anonymous user if there are certain topics you would want unauthenticated clients to be able to see.
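If you do want certain topics visible to unauthenticated clients, a minimal sketch of such a configuration (file paths and topic names are assumptions, not from the question):

```
# mosquitto.conf (fragment)
allow_anonymous true
acl_file /etc/mosquitto/aclfile
```

```
# /etc/mosquitto/aclfile
# Rules before any "user" or "pattern" line apply to anonymous clients:
topic read public/status/#

# Pattern rules apply per authenticated client (%u = username):
pattern readwrite c/%u/#
```

Note that when an auth plugin is also loaded, how the ACL file and the plugin interact depends on the broker version, so test both paths.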

How to schedule a downtime in icinga2 by using icinga-api with groovy?

I'm searching for a way to schedule a downtime in icinga2 with a Groovy script.
I already tried creating a small Groovy script using the examples from the Icinga documentation:
curl -u root:icinga -k -s 'https://localhost:5665/v1/actions/schedule-downtime?type=Host&filter=host.vars.os==%22Linux%22' -d '{ "author" : "michi", "comment": "Maintenance.", "start_time": 1441136260, "end_time": 1441137260, "duration": 1000 }' -X POST | python -m json.tool
but adapting this to my script didn't work. The double quotes around each attribute name are very important, I noted.
The solution was as follows, using wslite as the webservice client. This is a minimal example.
I connect to my server with the API enabled. The certificate is self-signed, which is why "sslTrustAllCerts" was needed.
I select all services from my host "testserver" and set the downtime (duration in seconds).
@Grab('com.github.groovy-wslite:groovy-wslite:1.1.2')
import wslite.rest.*
import wslite.http.auth.*

def client = new RESTClient("https://myicinga2server:5665/")
client.authorization = new HTTPBasicAuthorization("root", "secret")

def timeFrom = System.currentTimeMillis() / 1000L
def timeDurationSec = 600
def timeTo = timeFrom + timeDurationSec

try {
    def response = client.post(
        path: '/v1/actions/schedule-downtime?type=Service&filter=host.name==%22testserver%22',
        headers: ["Accept": "application/json"],
        sslTrustAllCerts: true) {
        json "author": "mstein", "comment": "Test-Downtime", "start_time": timeFrom, "end_time": timeTo, "duration": timeDurationSec, "fixed": true
    }
    assert 200 == response.statusCode
    print response.text
} catch (Exception exc) {
    println "Error: " + exc.getClass().toString()
    println "Message: " + exc.getMessage()
    println "Response: " + exc.getResponse()
    System.exit(1)
}
That worked for me!