urllib & python3: HTTP Error 405: Method Not Allowed - json

I am trying to do a simple authentication using Python3 and urllib on an API that should return account balances.
The code I have is the following:
import urllib
import urllib.request
import json

id = "nkkhuz6" # fake
secret = "s9MeR0J9yxtndLBPVA" # fake
auth_str = id + ":" + secret

def getBalances():
    values = {'u' : auth_str}
    data = urllib.parse.urlencode(values)
    data = data.encode('utf-8') # data should be bytes
    request = urllib.request.Request(url = "https://api.com", data = data)
    with urllib.request.urlopen(request) as f:
        print(json.loads(f.read().decode('utf-8')))
However, when I run getBalances() I get the following errors:
Adriaans-MacBook-Pro:Documents adriaanjoubert$ python3 main.py
Traceback (most recent call last):
File "main.py", line 96, in <module>
getBalances()
File "main.py", line 19, in getBalances
with urllib.request.urlopen(request) as f:
File "/Library/Frameworks/Python.framework/Versions/3.4/lib/python3.4/urllib/request.py", line 161, in urlopen
return opener.open(url, data, timeout)
File "/Library/Frameworks/Python.framework/Versions/3.4/lib/python3.4/urllib/request.py", line 469, in open
response = meth(req, response)
File "/Library/Frameworks/Python.framework/Versions/3.4/lib/python3.4/urllib/request.py", line 579, in http_response
'http', request, response, code, msg, hdrs)
File "/Library/Frameworks/Python.framework/Versions/3.4/lib/python3.4/urllib/request.py", line 507, in error
return self._call_chain(*args)
File "/Library/Frameworks/Python.framework/Versions/3.4/lib/python3.4/urllib/request.py", line 441, in _call_chain
result = func(*args)
File "/Library/Frameworks/Python.framework/Versions/3.4/lib/python3.4/urllib/request.py", line 587, in http_error_default
raise HTTPError(req.full_url, code, msg, hdrs, fp)
urllib.error.HTTPError: HTTP Error 405: Method Not Allowed
I am sure the URL is correct, and if I append a trailing / I get the error urllib.error.HTTPError: HTTP Error 404: Not Found.
When I run the following code I do get my account balances:
cmd = """curl -u """ + auth_str + """ https://api.com/"""
os.system(cmd)
What am I doing wrong? I would like to use urllib so that I can store the stuff I get back from the API in a variable.
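For comparison, curl -u sends the credentials in an Authorization: Basic header and issues a GET, whereas passing data= to urllib.request.Request turns the request into a POST, which the endpoint may reject with 405. Below is a minimal sketch of the GET-with-Basic-auth equivalent in urllib; https://api.com is the question's placeholder URL and the credentials are the fake ones from the question:

import base64
import json
import urllib.request

auth_str = "nkkhuz6:s9MeR0J9yxtndLBPVA"  # fake credentials from the question
url = "https://api.com/"                 # placeholder endpoint from the question

# Build a GET request and attach the same Basic auth header that curl -u sends.
request = urllib.request.Request(url)
request.add_header(
    "Authorization",
    "Basic " + base64.b64encode(auth_str.encode("utf-8")).decode("ascii"),
)

with urllib.request.urlopen(request) as f:
    balances = json.loads(f.read().decode("utf-8"))
print(balances)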

Related

How to get https://stocks.exchange/api2/ticker

import json, requests

def tick():
    r = requests.get('https://stocks.exchange/api2/ticker')
    return r.json()

print tick()
This code outputs:
Traceback (most recent call last):
File "C:\Users\Steven\Desktop\Auto\tradeogre\stocksexchange.py", line 6, in
<module>
print tick()
File "C:\Users\Steven\Desktop\Auto\tradeogre\stocksexchange.py", line 4, in
tick
r = requests.get('https://stocks.exchange/api2/ticker')
File "C:\Python27\lib\site-packages\requests\api.py", line 72, in get
return request('get', url, params=params, **kwargs)
File "C:\Python27\lib\site-packages\requests\api.py", line 58, in request
return session.request(method=method, url=url, **kwargs)
File "C:\Python27\lib\site-packages\requests\sessions.py", line 508, in
request
resp = self.send(prep, **send_kwargs)
File "C:\Python27\lib\site-packages\requests\sessions.py", line 618, in send
r = adapter.send(request, **kwargs)
File "C:\Python27\lib\site-packages\requests\adapters.py", line 506, in send
raise SSLError(e, request=request)
SSLError: HTTPSConnectionPool(host='stocks.exchange', port=443): Max retries
exceeded with url: /api2/ticker (Caused by SSLError(SSLEOFError(8, u'EOF
occurred in violation of protocol (_ssl.c:661)'),))
How would I do this without violating the protocol? I have used urllib and urllib2, and get similar responses. If I enter the URL in the browser, it displays the data I need. Any help would be greatly appreciated.

Python 3.6: get JSON from aiohttp request

I have some application which uses aiohttp.
I send a POST request to the appropriate endpoint, e.g.:
POST mysite.com/someendpoind/
with data similar to:
{"param1": "value1", "param2": "value2", ..., "paramn": None}
Then on the backend side, I want to add an additional condition to this request:
data = await request.json()
data["additional_conditional"] = True
But request.json() fails with an error:
[ERROR] Error handling request
Traceback (most recent call last):
File "/usr/local/lib/python3.5/dist-packages/aiohttp/web_protocol.py", line 422, in start
resp = yield from self._request_handler(request)
File "/usr/local/lib/python3.5/dist-packages/aiohttp/web.py", line 306, in _handle
resp = yield from handler(request)
File "/usr/local/lib/python3.5/dist-packages/aiohttp_session/__init__.py", line 129, in middleware
response = yield from handler(request)
File "/opt/bikeamp/auth/__init__.py", line 57, in wrapped
return (yield from f(request, user))
File "<my_module>.py", line 185, in <my_func>
data_json = await request.json()
File "/usr/local/lib/python3.5/dist-packages/aiohttp/web_request.py", line 469, in json
return loads(body)
File "/usr/lib/python3.5/json/__init__.py", line 319, in loads
return _default_decoder.decode(s)
File "/usr/lib/python3.5/json/decoder.py", line 339, in decode
obj, end = self.raw_decode(s, idx=_w(s, 0).end())
File "/usr/lib/python3.5/json/decoder.py", line 357, in raw_decode
raise JSONDecodeError("Expecting value", s, err.value) from None
json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0)
Then I decided to check what the content of my request actually is:
await request.read()
b'field1=value1&field2=value2&field3=value3&field4=&field5=&field6='
So, I'm not sure, but the problem may be with empty parameters.
Also, I was trying to get this data via:
data = await request.post()
data["additional_condition"] = True
But this returns a MultiDictProxy, and Python can't pickle these objects.
Are there any known solutions?
I had the same issue. If the post was something like {"email": "some@email.com"}, check it with:
@router('/', methods=['POST', ])
async def post_request(request):
    post = await request.post()
    email = post.get('email')  # because it's a MultiDict
    logging.warning(post)   # see post details
    logging.warning(email)  # shows value "some@email.com"
    json = await request.text()
    logging.warning(json)   # shows json if it was an ajax post request
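Building on the answer above: since request.read() shows a form-encoded body rather than JSON, a minimal sketch (assuming the client keeps sending form data) is to parse it with request.post() and wrap the result in dict() to get a plain, mutable, picklable mapping:

from aiohttp import web

async def handler(request):
    # request.post() parses application/x-www-form-urlencoded bodies;
    # dict() turns the read-only MultiDictProxy into an ordinary dict.
    data = dict(await request.post())
    data["additional_conditional"] = True
    return web.json_response(data)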

urllib2.URLError: <urlopen error [Errno 8]

import urllib2
import urllib
import json
url = "http://ajax/googleapis.com/ajax/services/search/web?v=1.0&"
query = raw_input ("What do you want to search for ? >> ")
query = urllib.urlencode({'q': query})
response = urllib2.urlopen (url + query).read()
data = json.loads (response)
results = data ['responseData'] ['results']
for result in results:
    title = result['title']
    url = result['url']
    print (title + ';' + url)
ERROR
/System/Library/Frameworks/Python.framework/Versions/2.6/bin/python2.6 /Users/dragonleo/PycharmProjects/untitled2/googleapi
What do you want to search for ? >> apple
Traceback (most recent call last):
File "/Users/dragonleo/PycharmProjects/untitled2/googleapi", line 8, in
response = urllib2.urlopen (url + query).read()
File "/System/Library/Frameworks/Python.framework/Versions/2.6/lib/python2.6/urllib2.py", line 126, in urlopen
return _opener.open(url, data, timeout)
File "/System/Library/Frameworks/Python.framework/Versions/2.6/lib/python2.6/urllib2.py", line 391, in open
response = self._open(req, data)
File "/System/Library/Frameworks/Python.framework/Versions/2.6/lib/python2.6/urllib2.py", line 409, in _open
'_open', req)
File "/System/Library/Frameworks/Python.framework/Versions/2.6/lib/python2.6/urllib2.py", line 369, in _call_chain
result = func(*args)
File "/System/Library/Frameworks/Python.framework/Versions/2.6/lib/python2.6/urllib2.py", line 1181, in http_open
return self.do_open(httplib.HTTPConnection, req)
File "/System/Library/Frameworks/Python.framework/Versions/2.6/lib/python2.6/urllib2.py", line 1156, in do_open
raise URLError(err)
urllib2.URLError:
I'd appreciate it if an expert could explain why I am getting this error.
Two problems stand out immediately:
There are multiple typos in the code above. Specifically, there should be no spaces between names and the brackets or parentheses that follow them. Also, the URL should be ajax.googleapis.com, not ajax/googleapis.com.
The Google Web Search API is no longer available. You should migrate to the Google Custom Search API.
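A minimal sketch of a Custom Search request, assuming you have already created an API key and a search engine ID (cx); both values below are hypothetical placeholders, and the field names follow the Custom Search JSON API:

import json
import urllib.parse
import urllib.request

params = urllib.parse.urlencode({
    'key': 'YOUR_API_KEY',          # hypothetical placeholder
    'cx': 'YOUR_SEARCH_ENGINE_ID',  # hypothetical placeholder
    'q': 'apple',
})
url = 'https://www.googleapis.com/customsearch/v1?' + params
data = json.loads(urllib.request.urlopen(url).read().decode('utf-8'))
for item in data.get('items', []):
    print(item['title'] + ';' + item['link'])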

Python 3 urllib HTTP Error 412: Precondition Failed

I'm trying to parse HTML data of a website. I wrote this code:
import urllib.request

def parse(url):
    response = urllib.request.urlopen(url)
    html = response.read()
    strHTML = html.decode()
    return strHTML

website = "http://www.manarat.ac.bd/"
string = parse(website)
but it is showing this error:
Traceback (most recent call last):
File "C:\Users\pupewekate\Videos\RAW\2.py", line 11, in <module>
string = parse(website)
File "C:\Users\pupewekate\Videos\RAW\2.py", line 5, in parse
response = urllib.request.urlopen(url)
File "C:\Users\pupewekate\AppData\Local\Programs\Python\Python36-32\lib\urllib\request.py", line 223, in urlopen
return opener.open(url, data, timeout)
File "C:\Users\pupewekate\AppData\Local\Programs\Python\Python36-32\lib\urllib\request.py", line 532, in open
response = meth(req, response)
File "C:\Users\pupewekate\AppData\Local\Programs\Python\Python36-32\lib\urllib\request.py", line 642, in http_response
'http', request, response, code, msg, hdrs)
File "C:\Users\pupewekate\AppData\Local\Programs\Python\Python36-32\lib\urllib\request.py", line 570, in error
return self._call_chain(*args)
File "C:\Users\pupewekate\AppData\Local\Programs\Python\Python36-32\lib\urllib\request.py", line 504, in _call_chain
result = func(*args)
File "C:\Users\pupewekate\AppData\Local\Programs\Python\Python36-32\lib\urllib\request.py", line 650, in http_error_default
raise HTTPError(req.full_url, code, msg, hdrs, fp)
urllib.error.HTTPError: HTTP Error 412: Precondition Failed
Any solution?
This website checks the user agent header. If it doesn't recognize its value it returns status code 412:
import requests
print(requests.get('http://www.manarat.ac.bd/'))
# <Response [412]>
print(requests.get('http://www.manarat.ac.bd/', headers={'User-Agent': 'Chrome'}))
# <Response [200]>
See this answer for how to set the user agent in urllib.
You could use the requests module as it is easier to work with; otherwise, if you are determined to use urllib, you can use this:
import urllib.request

def parse(url):
    headers = {'User-Agent': 'Mozilla/5.0 (Windows NT 6.3;Win64;x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/59.0.3071.115 Safari/537.36'}
    # urlopen() itself does not accept a headers argument, so wrap the URL in a Request
    request = urllib.request.Request(url, headers=headers)
    response = urllib.request.urlopen(request)
    return response.read().decode()

website = "http://www.manarat.ac.bd/"
string = parse(website)

How to pass urls from CSV list into a python GET request

I have a CSV file, which contains a list of Google extension IDs.
I'm writing a code that will read the extension IDs, add the webstore url, then perform a basic get request:
import csv
import requests

with open('small.csv', 'rb') as f:
    reader = csv.reader(f)
    for row in reader:
        urls = "https://chrome.google.com/webstore/detail/" + row[0]
        print urls
        r = requests.get([urls])
Running this code results in the following Traceback:
Traceback (most recent call last):
File "C:\Users\tom\Dropbox\Python\panya\test.py", line 9, in <module>
r = requests.get([urls])
File "C:\Python27\lib\site-packages\requests\api.py", line 69, in get
return request('get', url, params=params, **kwargs)
File "C:\Python27\lib\site-packages\requests\api.py", line 50, in request
response = session.request(method=method, url=url, **kwargs)
File "C:\Python27\lib\site-packages\requests\sessions.py", line 465, in request
resp = self.send(prep, **send_kwargs)
File "C:\Python27\lib\site-packages\requests\sessions.py", line 567, in send
adapter = self.get_adapter(url=request.url)
File "C:\Python27\lib\site-packages\requests\sessions.py", line 641, in get_adapter
raise InvalidSchema("No connection adapters were found for '%s'" % url)
InvalidSchema: No connection adapters were found for '['https://chrome.google.com/webstore/detail/blpcfgokakmgnkcojhhkbfbldkacnbeo']'
How can I revise the code so that it accepts the URLs from the list and makes the GET requests?
requests.get expects a string, but you're creating and passing a list, [urls]:
r = requests.get([urls])
Change it to just
r = requests.get(urls)
and it should work.
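Putting it together, a minimal sketch of the corrected loop, keeping the question's Python 2 style and the same small.csv input:

import csv
import requests

with open('small.csv', 'rb') as f:
    reader = csv.reader(f)
    for row in reader:
        url = "https://chrome.google.com/webstore/detail/" + row[0]
        r = requests.get(url)  # pass the URL string itself, not a list
        print url, r.status_code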