Sinatra: Access-Control-Allow-Origin for the Sinatra public folder

How do I set up Sinatra so that static files in the public folder
are returned with the response header Access-Control-Allow-Origin: * ?

Have a look at this question here: Sinatra OPTIONS HTTP Verb. It's implemented in Sinatra now, so you don't have to hack around it.
If that doesn't help, take a look at this blog post: Cross Origin Resource Sharing with Sinatra, and its repo at GitHub: sinatra-cross_origin
The simplest way, though, should be to add
response['Access-Control-Allow-Origin'] = 'http://whatever.org'
before the return value in your route.

get '/foo' do
  headers 'Access-Control-Allow-Origin' => 'http://example.com'
  'hello world'
end
There's also a nice extension for cross origin sharing:
https://github.com/britg/sinatra-cross_origin
require 'sinatra'
require 'sinatra/cross_origin'

# To enable cross-origin requests for all routes:
configure do
  enable :cross_origin
end

# To only enable cross-origin requests for certain routes:
get '/cross_origin' do
  cross_origin
  "This is available to cross-origin javascripts"
end

I did this on the server side; my file was called server.rb:
before do
  content_type :json
  headers 'Access-Control-Allow-Origin' => '*',
          'Access-Control-Allow-Methods' => 'OPTIONS, GET, POST'
end

This setup works for me:
Gemfile:
# Gemfile
gem 'sinatra'
gem 'sinatra-cross_origin'
Sinatra App:
# app.rb
require 'sinatra'
require 'sinatra/cross_origin'
class MyApp < Sinatra::Base
  set :bind, '0.0.0.0'

  configure do
    # This enables cross-origin support on the server
    enable :cross_origin
  end

  # This before block gets invoked on every request. The asterisk (*) tells
  # the server to share the resource with anyone; to share it only with
  # specific domains, list those domains instead of the asterisk.
  before do
    response.headers['Access-Control-Allow-Origin'] = '*'
  end

  # routes...

  options "*" do
    response.headers["Allow"] = "GET, PUT, POST, DELETE, OPTIONS"
    response.headers["Access-Control-Allow-Headers"] = "Authorization, Content-Type, Accept, X-User-Email, X-Auth-Token"
    response.headers["Access-Control-Allow-Origin"] = "*"
    200
  end
end
The options block above answers the browser's preflight request with a 200. The browser then makes the actual CORS request, and the server includes Access-Control-Allow-Origin: * in the response headers.
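For illustration, the preflight exchange looks roughly like this against a local Sinatra app (hypothetical localhost:4567, Sinatra's default port):
# Hypothetical preflight the browser sends before the real request:
curl -i -X OPTIONS http://localhost:4567/foo \
  -H "Origin: http://example.com" \
  -H "Access-Control-Request-Method: GET"
# The options "*" route above answers with 200 and the Access-Control-Allow-*
# headers; only then does the browser issue the actual GET /foo request.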
If we want only a specific domain to access the resources:
before do
  response.headers['Access-Control-Allow-Origin'] = 'http://example.com'
end

This solution works for me and is based on an answer to a similar question: How to add "Access-Control-Allow-Origin" headers to API Response in Ruby
get '/' do
  response['Access-Control-Allow-Origin'] = '*'
  "asdf" # return "asdf"
end
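Note that route handlers and before filters only run for routes, while Sinatra typically serves files from public/ before filters and routes run, so the snippets above may not add the header to static assets, which is what the original question asks about. One way to cover them is to serve public/ through Rack::Static with header rules. This is a minimal sketch, not part of the answers above; the asset folder names are assumptions:
# config.ru -- sketch: serve public/ through Rack::Static so static assets
# also get the CORS header; the route-level answers above stay unchanged.
require 'sinatra'
require_relative 'app'   # hypothetical file defining MyApp from the answer above

use Rack::Static,
    urls: ['/css', '/js', '/images'],                   # assumed asset folders under public/
    root: 'public',
    header_rules: [
      [:all, { 'Access-Control-Allow-Origin' => '*' }]  # add the CORS header to every matched file
    ]

run MyApp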

Related

Trying to make a POST request, works with cURL, get a 403 when using Python requests

I'm trying to get some JSON data from this API - https://ped.uspto.gov/api/queries
This cURL request works fine and returns what is expected:
curl -X POST "https://ped.uspto.gov/api/queries" -H "accept: application/json" -H "Content-Type: application/json" -d "{ \"searchText\":\"*:*\", \"fq\":[ \"totalPtoDays:[1 TO 99999]\", \"appFilingDate:[2005-01-01T00:00:00Z TO 2005-12-31T23:59:59Z]\" ], \"fl\":\"*\", \"mm\":\"100%\", \"df\":\"patentTitle\", \"facet\":\"true\", \"sort\":\"applId asc\", \"start\":\"0\"}"
I have this Python script to do the same thing:
import requests
from requests.structures import CaseInsensitiveDict
import json

url = "https://ped.uspto.gov/api/queries"

headers = CaseInsensitiveDict()
headers["accept"] = "application/json"
headers["Content-Type"] = "application/json"

data = json.dumps({
    "searchText": "*:*",
    "fq": [
        "totalPtoDays:[1 TO 99999]",
        "appFilingDate:[2005-01-01T00:00:00Z TO 2005-12-31T23:59:59Z]"
    ],
    "fl": "*",
    "mm": "100%",
    "df": "patentTitle",
    "facet": "true",
    "sort": "applId asc",
    "start": "0"
})

resp = requests.post(url, headers=headers, data=data)
print(resp.status_code)
but it returns a 403 error code and the following response headers:
"Date":"Mon, 24 Oct 2022 16:13:58 GMT",
"Content-Type":"text/html",
"Content-Length":"919",
"Connection":"keep-alive",
"X-Cache":"Error from cloudfront",
"Via":"1.1 d387fec28536c5aa92926c56363afe9a.cloudfront.net (CloudFront)",
"X-Amz-Cf-Pop":"LHR50-P8",
"X-Amz-Cf-Id":"RMd69prehvXNAl97mo0qyFtuBIiY8r9liIxcQEmbdoBV1zwXLhirXA=="
I'm at quite a loss as to what to do, because I really don't understand what my Python is missing to replicate the cURL request.
Thanks very much.
I was interested in this. I got an account with uspto.gov and acquired an access key. Their other APIs work well. But the PEDS API? I kept getting a 503 gateway timeout error. While I was on their website I looked into the PEDS API, and I could not load any link to a https://ped.uspto.gov page.
I called them and they gave me an email address. I got this reply:
The PEDS API was taken down because repeated data mining was bringing the entire PEDS system down.
The PEDS Team is working on a solution to fix the PEDS API so that it can be re-enabled.
I tried it using PHP.
Cloudflare has been causing a lot of problems for curl.
I got a timeout.
I may have gotten past the 403 Forbidden, but I did not have credentials, so the server dropped the connection.
An HTTP 504 status code (Gateway Timeout) indicates that when CloudFront forwarded a request to the origin (because the requested object wasn't in the edge cache), one of the following happened: the origin returned an HTTP 504 status code to CloudFront, or the origin didn't respond before the request expired.
AWS Cloudflare Curl Issues
bypassing CloudFlare 403
How to Fix Error 403 Forbidden on Cloudflare
403 Forbidden cloudflare
This is a conversion from your curl.
The Content-Type: application/json header is added by default when you send the data via the json= argument.
I am not sure about your json.dumps call or the parentheses you are wrapping the JSON in.
import requests

headers = {
    'accept': 'application/json',
}

json_data = {
    'searchText': '*:*',
    'fq': [
        'totalPtoDays:[1 TO 99999]',
        'appFilingDate:[2005-01-01T00:00:00Z TO 2005-12-31T23:59:59Z]',
    ],
    'fl': '*',
    'mm': '100%',
    'df': 'patentTitle',
    'facet': 'true',
    'sort': 'applId asc',
    'start': '0',
}

response = requests.post('https://ped.uspto.gov/api/queries', headers=headers, json=json_data)
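For a quick comparison with the curl call, it can help to print the status and the start of the body (a minimal check; an HTML body, as in the question's Content-Type: text/html response, usually means the request was rejected before reaching the API):
print(response.status_code)
print(response.text[:500])   # first part of the body; HTML here points to a CDN error page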

I am trying to call a DAG (written in Python) using a Cloud Function (Python 3.7) and getting the error "405 Method Not Found". Could someone help here?

I have used the code below, available in the Google Cloud docs:
from google.auth.transport.requests import Request
from google.oauth2 import id_token
import requests

IAM_SCOPE = 'https://www.googleapis.com/auth/iam'
OAUTH_TOKEN_URI = 'https://www.googleapis.com/oauth2/v4/token'

def trigger_dag(data, context=None):
    """Makes a POST request to the Composer DAG Trigger API
    When called via Google Cloud Functions (GCF),
    data and context are Background function parameters.
    For more info, refer to
    https://cloud.google.com/functions/docs/writing/background#functions_background_parameters-python
    To call this function from a Python script, omit the ``context`` argument
    and pass in a non-null value for the ``data`` argument.
    """
    # Fill in with your Composer info here
    # Navigate to your webserver's login page and get this from the URL
    # Or use the script found at
    # https://github.com/GoogleCloudPlatform/python-docs-samples/blob/master/composer/rest/get_client_id.py
    client_id = '87431184677-jitlhi9o0u9sin3uvdebqrvqokl538aj.apps.googleusercontent.com'
    # This should be part of your webserver's URL:
    # {tenant-project-id}.appspot.com
    webserver_id = 'b368a47a354ddf2f6p-tp'
    # The name of the DAG you wish to trigger
    dag_name = 'composer_sample_trigger_response_dag'
    webserver_url = (
        #'https://'
        webserver_id
        + '.appspot.com/admin/airflow/tree?dag_id='
        + dag_name
        #+ '/dag_runs'
    )
    # Make a POST request to IAP which then Triggers the DAG
    make_iap_request(
        webserver_url, client_id, method='POST', json={"conf": data, "replace_microseconds": 'false'})

# This code is copied from
# https://github.com/GoogleCloudPlatform/python-docs-samples/blob/master/iap/make_iap_request.py
# START COPIED IAP CODE
def make_iap_request(url, client_id, method='GET', **kwargs):
    """Makes a request to an application protected by Identity-Aware Proxy.
    Args:
        url: The Identity-Aware Proxy-protected URL to fetch.
        client_id: The client ID used by Identity-Aware Proxy.
        method: The request method to use
                ('GET', 'OPTIONS', 'HEAD', 'POST', 'PUT', 'PATCH', 'DELETE')
        **kwargs: Any of the parameters defined for the request function:
                  https://github.com/requests/requests/blob/master/requests/api.py
                  If no timeout is provided, it is set to 90 by default.
    Returns:
        The page body, or raises an exception if the page couldn't be retrieved.
    """
    # Set the default timeout, if missing
    if 'timeout' not in kwargs:
        kwargs['timeout'] = 90
    # Obtain an OpenID Connect (OIDC) token from metadata server or using service
    # account.
    google_open_id_connect_token = id_token.fetch_id_token(Request(), client_id)
    # Fetch the Identity-Aware Proxy-protected URL, including an
    # Authorization header containing "Bearer " followed by a
    # Google-issued OpenID Connect token for the service account.
    resp = requests.request(
        method, url,
        headers={'Authorization': 'Bearer {}'.format(
            google_open_id_connect_token)}, **kwargs)
    if resp.status_code == 403:
        raise Exception('Service account does not have permission to '
                        'access the IAP-protected application.')
    elif resp.status_code != 200:
        raise Exception(
            'Bad response from application: {!r} / {!r} / {!r}'.format(
                resp.status_code, resp.headers, resp.text))
    else:
        return resp.text
# END COPIED IAP CODE
You encounter the error "405 method not found" because you are trying to send a request to your_webserver_id.appspot.com/admin/airflow/tree?dag_id=composer_sample_trigger_response_dag, which is the URL of the "Tree view" in the Airflow webserver.
To properly send requests to the Airflow API, you need to construct the webserver_url just like in the documentation Trigger DAGs in Cloud Functions, where it is built against the "trigger a new DAG run" endpoint. So if you'd like to trigger the DAG, you can stick with the code below.
Airflow run DAG endpoint:
POST https://airflow.apache.org/api/v1/dags/{dag_id}/dagRuns
webserver_url = (
    'https://'
    + webserver_id
    + '.appspot.com/api/experimental/dags/'
    + dag_name
    + '/dag_runs'
)
Moving forward, if you would like to perform different operations using the Airflow API, you can check the Airflow REST API reference.
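Putting the answer together with the asker's code, the relevant part of trigger_dag would then look roughly like this (a sketch that simply reuses the asker's placeholder webserver_id, dag_name and client_id):
# Sketch: corrected URL construction inside trigger_dag, then the same IAP call
webserver_url = (
    'https://'
    + webserver_id
    + '.appspot.com/api/experimental/dags/'
    + dag_name
    + '/dag_runs'
)
make_iap_request(
    webserver_url, client_id, method='POST',
    json={"conf": data, "replace_microseconds": 'false'})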

axios POST method seems to be 'transformed' to OPTIONS method on the fly

I use this sample code:
axios({
  method: 'post',
  headers: { 'content-type': 'application/json' },
  url: 'http://somePlace:8040/someWSendpoint',
  data: {
    firstName: 'Fred',
    lastName: 'Flintstone'
  }
});
But that request never reaches its destination, apparently because the POST is transformed into an OPTIONS request and rejected by the endpoint URL.
This is what the 'Network' window shows in the Chrome inspector:
Request Method: OPTIONS
Status Code: 405 Method Not Allowed
This is what the 'Console' window shows in the Chrome inspector:
Access to XMLHttpRequest at 'http://somePlace:8040/someWSendpoint' from origin 'http://localhost:8080' has been blocked by CORS policy: Response to preflight request doesn't pass access control check: No 'Access-Control-Allow-Origin' header is present on the requested resource.
xhr.js?b50d:160 POST http://somePlace:8040/someWSendpoint net::ERR_FAILED
dispatchXhrRequest # xhr.js?b50d:160
xhrAdapter # xhr.js?b50d:11
dispatchRequest # dispatchRequest.js?5270:59
Promise.then (async)
request # Axios.js?0a06:51
wrap # bind.js?1d2b:9
send_firma # AltaFirma.vue?c240:504
click # AltaFirma.vue?3ed8:1812
invokeWithErrorHandling # vue.runtime.esm.js?2b0e:1854
invoker # vue.runtime.esm.js?2b0e:2179
original._wrapper # vue.runtime.esm.js?2b0e:6917
createError.js?2d83:16 Uncaught (in promise) Error: Network Error
at createError (createError.js?2d83:16)
at XMLHttpRequest.handleError (xhr.js?b50d:69)
I am wondering if the people "at the other side" need to configure something related to that CORS thing on their server?
Any help will be much appreciated.
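This is the same server-side requirement covered in the Sinatra answers at the top of this page: the browser turns the cross-origin POST into an OPTIONS preflight, and the endpoint has to answer it with Access-Control-Allow-* headers. What that configuration looks like depends entirely on the stack behind http://somePlace:8040; as a rough illustration only, mirroring the Sinatra-style options route shown earlier (which may not match the actual server):
# Illustration only -- the real endpoint may use a completely different stack.
options "*" do
  response.headers["Access-Control-Allow-Origin"]  = "http://localhost:8080"
  response.headers["Access-Control-Allow-Methods"] = "GET, POST, OPTIONS"
  response.headers["Access-Control-Allow-Headers"] = "Content-Type"
  200
end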

Bad URI in Ruby on Rails: is it because it's too long?

So I have a website that's fully working, with some URI-encoded parameters in the URL.
However, when I try to pass this URL to my Chrome browser:
http://somewhere:3000/find/someOne?utf8=%E2%9C%93&search=someThing&choicen=no&querys={%22peopleName%22%3A%22%22%2C%22peopleGroup%22%3A%22%22%2C%22place%22%3A%22%22%2C%22pip%22%3A%22%22%2C%22hw%22%3A%22%22%2C%22somerock%22%3A%22%22%2C%22rocksomerock%22%3A%22%22%2C%22diedAt%22%3A%222016-01-01%20-%202016-12-31%22%2C%22borndAt%22%3A%22%22%2C%22taxRate%22%3A%22%22}
it throws an error in the browser:
Bad Request
bad URI `/find/someOne?utf8=%E2%9C%93&search=someThing&choicen=no&querys={%22peopleName%22%3A%22%22%2C%22peopleGroup%22%3A%22%22%2C%22place%22%3A%22%22%2C%22pip%22%3A%22%22%2C%22hw%22%3A%22%22%2C%22somerock%22%3A%22%22%2C%22rocksomerock%22%3A%22%22%2C%22diedAt%22%3A%222016-01-01%20-%202016-12-31%22%2C%22borndAt%22%3A%22%22%2C%22taxRate%22%3A%22%22}'.
WEBrick/1.3.1 (Ruby/1.9.3/2014-11-13) at somewhere.com:3000
It also shows [2016-07-04 18:11:31] ERROR bad URI in the Rails console.
Versions:
Rails 3
Ruby 1.9.3
Any idea how to get it working? Is it because of the { and } in the URI, or because it is too long?
Parse the path in the controller upon incoming request, using Rack::Utils#parse_nested_query, see: http://www.rubydoc.info/github/rack/rack/master/Rack/Utils.parse_nested_query
# config/routes.rb
get '/find/someOne/*str' => 'find#someOne'

# app/controllers/find_controller.rb
class FindController < ApplicationController
  def someOne
    custom_params = Rack::Utils.parse_nested_query(request.env['ORIGINAL_FULLPATH'])
    querys_hash = JSON.parse(custom_params["querys"])
  end
end
Example via console:
$ bundle exec rails c
Running via Spring preloader in process 31944
Loading development environment (Rails 5.0.0)
irb(main):001:0> custom_params = Rack::Utils.parse_nested_query "utf8=%E2%9C%93&search=someThing&choicen=no&querys={%22peopleName%22%3A%22%22%2C%22peopleGroup%22%3A%22%22%2C%22place%22%3A%22%22%2C%22pip%22%3A%22%22%2C%22hw%22%3A%22%22%2C%22somerock%22%3A%22%22%2C%22rocksomerock%22%3A%22%22%2C%22diedAt%22%3A%222016-01-01%20-%202016-12-31%22%2C%22borndAt%22%3A%22%22%2C%22taxRate%22%3A%22%22}"
=> {"utf8"=>"✓", "search"=>"someThing", "choicen"=>"no", "querys"=>"{\"peopleName\":\"\",\"peopleGroup\":\"\",\"place\":\"\",\"pip\":\"\",\"hw\":\"\",\"somerock\":\"\",\"rocksomerock\":\"\",\"diedAt\":\"2016-01-01 - 2016-12-31\",\"borndAt\":\"\",\"taxRate\":\"\"}"}
irb(main):002:0> querys_hash = JSON.parse custom_params["querys"]
=> {"peopleName"=>"", "peopleGroup"=>"", "place"=>"", "pip"=>"", "hw"=>"", "somerock"=>"", "rocksomerock"=>"", "diedAt"=>"2016-01-01 - 2016-12-31", "borndAt"=>"", "taxRate"=>""}

Create entity in a service using IDAS and ContextBroker

So I'm having some problems connecting virtual devices to the ContextBroker, and I think it's because of the Fiware-Service. I don't want to use the OpenIoT service (even though that didn't work for me either). I didn't manage to find any documentation about service creation, and maybe I'm creating it wrong.
I ran python CreateService bus_auto 4jggokgpepnvsb2uv4s40d59ov and I'm not sure it returned 201. I updated the config.ini file to work on MY service, but when I send the observations it doesn't change the value of the entity in the ContextBroker.
I'm now running it in
My config.ini file:
[user]
# Please, configure here your username at FIWARE Cloud and a valid Oauth2.0 TOKEN for your user (you can use get_token.py to obtain a valid TOKEN).
username=
token=NULL
[contextbroker]
host=127.0.0.1
port=1026
OAuth=no
# Here you need to specify the ContextBroker database you are querying.
# Leave it blank if you want the general database or the IDAS service if you are looking for IoT devices connected by you.
fiware_service=bus_auto
[idas]
host=130.206.80.40
adminport=5371
ul20port=5371
OAuth=no
# Here you need to configure the IDAS service your devices will be sending data to.
# By default the OpenIoT service is provided.
fiware-service=bus_auto
fiware-service-path=/
apikey=4jggokgpepnvsb2uv4s40d59ov
[local]
#Choose here your System type. Examples: RaspberryPI, MACOSX, Linux, ...
host_type=CentOS
# Here please add a unique identifier for you. Suggestion: the 3 lower hexa bytes of your Ethernet MAC. E.g. 79:ed:af
# Also you may use your e-mail address.
host_id=db:00:ff
I'm using the python script GetEntity.py:
python2.7 GetEntity.py bus_auto_2
I also tried using a Python script that I created:
import json
import urllib
import urllib2

BASE_URL = 'http://127.0.0.1:1026'
QUERY_URL = BASE_URL + '/v1/queryContext'

HEADERS = {
    'Content-Type': 'application/json',
    'Accept': 'application/json'
}

QUERY_EXAMPLE = {
    "entities": [
        {
            "type": "bus_auto_2",
            "isPattern": "false",
            "id": "Room1"
        }
    ]
}

def post(url, data):
    req = urllib2.Request(url, data, HEADERS)
    f = urllib2.urlopen(req)
    result = json.loads(f.read())
    f.close()
    return result

if __name__ == "__main__":
    print post(UPDATE_URL, json.dumps(UPDATE_EXAMPLE))
    print post(QUERY_URL, json.dumps(QUERY_EXAMPLE))
I see the service is well created, and I actually see one device defined within it.
I have even successfully sent an observation (t|23) to the bus_auto_2 device.
Later, I checked this entity in the ContextBroker: "thing:bus_auto_2", and I see the latest observation I sent.
Did you update the FIWARE_SERVICE in the config.ini file in both the ContextBroker and IDAS sections?
Cheers,
Looking at your script, it seems you are not including the Fiware-Service header in your queryContext request. Thus, the query is resolved in the "default" service and not in the bus_auto service.
Changing the HEADERS map in the following way would probably solve the issue:
HEADERS = {
    'Content-Type': 'application/json',
    'Accept': 'application/json',
    'Fiware-Service': 'bus_auto'
}
EDIT: In addition to the above change, note that BASE_URL is pointing to a local Orion instance, not the one connected with IDAS (which runs on the same machine as IDAS). Thus, I think you also need to modify BASE_URL in the following way:
BASE_URL = 'http://130.206.80.40:1026'
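Combining both corrections, the query part of the asker's script would look roughly like this (a sketch; it reuses the post helper and QUERY_EXAMPLE from the script above, and the host and service values from this answer):
# Sketch combining both fixes: Orion on the IDAS machine, plus the
# Fiware-Service header so the query resolves in the bus_auto service.
BASE_URL = 'http://130.206.80.40:1026'
QUERY_URL = BASE_URL + '/v1/queryContext'

HEADERS = {
    'Content-Type': 'application/json',
    'Accept': 'application/json',
    'Fiware-Service': 'bus_auto'
}

print post(QUERY_URL, json.dumps(QUERY_EXAMPLE))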