How to call a REST API using Python with a JSON request/response - Crucible Development

I am a newbie to Python and am trying to create a script to log in to Crucible and pass the resulting token to other services.
1) I am able to make an XML request and get a response, but as soon as I pass the headers to my conn.request it fails with HTTP Error 415, Unsupported Media Type.
I have done quite a bit of research on this topic and found that the REST API might not support JSON requests, but Crucible says that their API supports JSON, so it seems to be some other issue.
2) When trying to pass the args generated using FEAUTH, the auth token is not getting used; for now I have appended it to the URL and it works.
Please help me with the same; below is my script:
import httplib
import urllib
import json
from xml.etree.ElementTree import XML
import xml.dom.minidom
conn = httplib.HTTPSConnection("fisheye")
args=urllib.urlencode({'userName':'UNAME', 'password':'PWD'})
headers={'content-type':'application/json', 'accept':'application/json'}
#headers={'Authorization' : 'Basic %s' % base64.b64encode("username:password")}
r1 = conn.request("post", "/rest-service/auth-v1/login", args)
#status = r1[u'headers']['status']
#conn.connect()
r2 = conn.getresponse()
print r1,r2.status,r2.reason,r2
r3=r2.read()
print(r3)
r4=str(r3)
print r4
data = XML(r4).find("token").text
print data
# data1=urllib.quote_plus(data, safe=":")
# print data1
args=urllib.urlencode({'FEAUTH':data}).replace("%3A", ":")
print "args is", args
#args={}
req = conn.request("get","/rest-service/reviews-v1")
r3 = conn.getresponse()
status = r3.status
print "the url is"#, r3.getheader('Location')
url=r3.getheader('location', '')
print url
url1=r3.msg#.dict['location']
print url1
#print req.url
#print req.get_method()
print dir(req) # list lots of other stuff in Request
print "after sending open review request"
print r3
print req,r3.status,r3.reason,r3
r4=r3.read()
print(r4)
r5=str(r4)
print r5
# json_ob=json.loads(r3.read())
# print json_ob

I was able to resolve the issue by:
1) removing the Content-Type from the headers and changing accept to Accept (sentence-cased).
2) The login request was a GET request, and hence it supports passing data appended to the URL; it is only for a POST request that we can pass the arguments in the request body.

In the header of the request, try to specify the media type:
headers = { 'Content-Type' : 'application/json' }
req = urllib2.Request(url, headers=headers)
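For reference, the same flow in Python 3 (where httplib became http.client) might look like the sketch below. The endpoint paths and the FEAUTH parameter come from the question; the host name and credentials are placeholders, and the GET-with-query-string login mirrors what the asker reports working.

```python
# Sketch of the login + review-list flow in Python 3; paths and the FEAUTH
# parameter are taken from the question, host and credentials are placeholders.
import http.client
import json
import urllib.parse

def build_login_path(username, password):
    # Login worked as a GET with the credentials appended to the URL,
    # sent with only a capitalised Accept header (no Content-Type).
    query = urllib.parse.urlencode({"userName": username, "password": password})
    return "/rest-service/auth-v1/login?" + query

def build_reviews_path(token):
    # Appending the FEAUTH token to the URL is what the asker found to work.
    return "/rest-service/reviews-v1?" + urllib.parse.urlencode({"FEAUTH": token})

def fetch_reviews(host="fisheye"):
    conn = http.client.HTTPSConnection(host)
    conn.request("GET", build_login_path("UNAME", "PWD"),
                 headers={"Accept": "application/json"})
    token = json.loads(conn.getresponse().read().decode())["token"]
    conn.request("GET", build_reviews_path(token),
                 headers={"Accept": "application/json"})
    return json.loads(conn.getresponse().read().decode())
```

With Accept: application/json, both responses come back as JSON, so no XML parsing is needed.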

Related

Need to create a Python script to authenticate via token and fetch data through a REST API from multiple URLs

I need to create a Python script to fetch data from multiple URLs and populate the same into a MySQL database. The authentication method I need to use is a bearer token. So once I get a token from all three URLs of an application, I need to process it to fetch data from another 3 URLs (to get the configuration item data).
Attached:
import json
# to send requests to the server
import requests
import urllib3

# ignore the SSL verification warning
urllib3.disable_warnings(urllib3.exceptions.InsecureRequestWarning)

from Get_Token import Generate_TK
# the Generate_TK class generates the token
obj = Generate_TK()
# Token
tk = obj.token
resp = 0

def get_CIs_data():
    url1 = "https://abc"
    # pass the token in the header
    header = {
        "accept": "application/json",
        "Authorization": "Bearer " + str(tk),
    }
    response = requests.get(url1, headers=header, verify=False)
    return response

resp = get_CIs_data()
if resp.status_code == 401:  # 401: token expired
    # the token has expired: generate a new one and retry
    tk = obj.create_token()
    resp = get_CIs_data()
if resp.status_code != 200:
    print('error: ' + str(resp.status_code))
else:
    print('connected successfully')
    # convert the response object into JSON
    data = resp.json()
    # data.items() returns the data as (key, value) pairs
    for key, value in data.items():
        print(key, value)
This is the script I am currently using; it only handles a single authentication.
I am expecting results from not only 1 URL: I require a script which can fetch data from 3 different URLs and send the data to a MySQL database.
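One way to extend the single-URL script above is to keep the header-building logic in one place and loop over a list of URLs. This is a minimal sketch, not the poster's code: the URLs are placeholders, and the HTTP call is injected as a parameter so the loop logic can be shown (and tested) without a network; in real use you would pass `lambda u, h: requests.get(u, headers=h, verify=False)`.

```python
# Sketch: fetch several URLs with the same bearer token and collect the
# JSON bodies. URLs and the `get` callable are placeholders/assumptions.
def bearer_headers(token):
    # Same header shape the single-URL script above uses
    return {
        "accept": "application/json",
        "Authorization": "Bearer " + str(token),
    }

def fetch_all(urls, token, get):
    """Fetch every URL with one bearer token; keep only 200 responses."""
    results = {}
    headers = bearer_headers(token)
    for url in urls:
        response = get(url, headers)
        if response.status_code == 200:
            results[url] = response.json()
    return results
```

The collected dict could then be written to MySQL (e.g. with mysql-connector-python) in a separate step, one INSERT per URL's payload.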

Using the results of multiple for loops to post a single JSON response

Okay, so this is a loaded question, and I'm sure there's an easy method to use here, but I'm stuck.
Long story short, I am tasked with creating a function in Python (to be run as an AWS Lambda) which can perform acceptance tests on a series of URLs using python-requests. These requests will be used to assert the HTTP response codes and a custom HTTP header identifying whether an HAProxy backend is correct.
The URLs themselves will be maintained in a YAML document, which will be converted to a dict in Python and passed to a for loop, which will use python-requests to HTTP GET the response code and header of each URL.
The issue I am having is getting a single body object to return the results of both for loops. I have tried to find similar use cases but cannot.
import requests
import json
import yaml

def acc_tests():
    with open("test.yaml", 'r') as stream:
        testurls = yaml.safe_load(stream)
    results = {}
    # endpoint/path 1
    for url in testurls["health endpoints"]:
        r = requests.get(url, params="none")
        stat = r.status_code
        result = json.dumps(print(url, stat))
        results = json.dumps(result)
    # endpoint path with headers
    for url in testurls["xtvapi"]:
        headers = {'H': 'xtvapi.cloudtv.comcast.net'}
        r = requests.get(url, headers=headers, params="none")
        stat = r.status_code
        head = r.headers["X-FINITY-TANGO-BACKEND"]
        result = json.dumps((url, stat, head))
        results = json.dumps(result)
    return {
        'statusCode': 200,
        'body': json.dumps(results)
    }

acc_tests()
YAML file:
health endpoints:
  - https://xfinityapi-tango-production-aws-us-east-1-active.r53.aae.comcast.net/tango-health/
  - https://xfinityapi-tango-production-aws-us-east-1-active.r53.aae.comcast.net/
  - https://xfinityapi-tango-production-aws-us-east-2-active.r53.aae.comcast.net/tango-health/
  - https://xfinityapi-tango-production-aws-us-east-2-active.r53.aae.comcast.net/
  - https://xfinityapi-tango-production-aws-us-west-2-active.r53.aae.comcast.net/tango-health/
  - https://xfinityapi-tango-production-aws-us-west-2-active.r53.aae.comcast.net/
xtvapi:
  - https://xfinityapi-tango-production-aws-us-east-1-active.r53.aae.comcast.net/
  - https://xfinityapi-tango-production-aws-us-east-2-active.r53.aae.comcast.net/
  - https://xfinityapi-tango-production-aws-us-west-2-active.r53.aae.comcast.net/
What I think is happening is that both for loops run one after another, but the value of results ends up empty, and I'm not sure what to do in order to update/append the results dict with the output of each loop.
Thanks folks. I ended up solving this by creating a dict with a fixed key for each test type and then using append to add the results to a nested list within the dict.
Here is the "working" code as it is in the AWS Lambda function:
from botocore.vendored import requests
import json
import yaml

def acc_tests(event, context):
    with open("test.yaml", 'r') as stream:
        testurls = yaml.safe_load(stream)
    results = {'tango-health': [], 'xtvapi': []}
    # Tango Health
    for url in testurls["health endpoints"]:
        r = requests.get(url, params="none")
        result = url, r.status_code
        assert r.status_code == 200
        results["tango-health"].append(result)
    # xtvapi default/cloudtv
    for url in testurls["xtvapi"]:
        headers = {'H': 'xtvapi.cloudtv.comcast.net'}
        r = requests.get(url, headers=headers, params="none")
        result = url, r.status_code, r.headers["X-FINITY-TANGO-BACKEND"]
        assert r.status_code == 200
        assert r.headers["X-FINITY-TANGO-BACKEND"] == "tango-big"
        results["xtvapi"].append(result)
    resbody = json.dumps(results)
    return {
        'statusCode': 200,
        'body': resbody
    }
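The essence of the fix, stripped of the HTTP calls (the loop data below is made up for illustration): accumulate plain Python objects in a dict of lists across both loops, and serialize with json.dumps exactly once at the end, instead of overwriting results with an already-dumped string on every pass.

```python
# Accumulate-then-serialize-once pattern; the tuples below stand in for
# the (url, status[, backend]) results that requests calls would produce.
import json

checks = {"health": [], "headers": []}

for url, status in [("https://a/health", 200), ("https://b/health", 200)]:
    checks["health"].append((url, status))

for url, status, backend in [("https://a/", 200, "tango-big")]:
    checks["headers"].append((url, status, backend))

body = json.dumps(checks)  # serialize once, after all loops have run
print(body)
```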

How to pass a soup object in a JSON payload prepared for the requests.put method in Python?

I'm trying to read a web page using the requests module in Python, scrape the data with BeautifulSoup, modify the data in the soup object, and write it back to the same web page using the requests.put() method.
Is there anything that I need to do additionally to the soup object to get it assigned to the JSON payload?
Modules used for this purpose:
1. json
2. requests
3. BeautifulSoup
If the JSON payload that I'm preparing is built without the soup object, then requests.put() writes the data properly.
import json
import requests
from bs4 import BeautifulSoup

response = requests.get(confluenceServerURL + getPageURL, verify=False, headers=headers)
# 200 OK is the response if the web page is accessible
if response.status_code == 200:
    jsonD = response.json()
    pageId = jsonD["results"][0]['id']
    versionNum = jsonD["results"][0]['version']['number']
    body = jsonD["results"][0]['body']['storage']['value']
    soup = BeautifulSoup(response.content, 'html.parser')
    input_str = "test string"
    testMan = "found string"
    for row in soup.find_all('tr'):
        for node in row.find_all('td'):
            if node.text == testMan:
                node.string = input_str
    newBody = soup
else:  # response.status_code
    logger.error("Error while obtaining page with given page title %s ", response.status_code)
    exit(1)

## Space to write confluence page
logger.info('--------------Updating page--------------')
updatePageURL = "/rest/api/content/" + pageId
jsonPayload = {"id": pageId, "type": "page", "title": pageTitle, "space": {"key": spaceKey},
               "body": {"storage": {"value": "{0}".format(newBody), "representation": "storage"}},
               "version": {"number": int(versionNum) + 1}}
#print(jsonPayload)
# update page
response = requests.put(confluenceServerURL + updatePageURL, verify=False, data=json.dumps(jsonPayload), headers=headers)
logger.debug('URL for updating page is %s', response.url)
if response.status_code == 200:
    logger.info("Page with id %s updated successfully", pageId)
else:
    logger.error("Unable to update page with id %s", pageId)
    logger.error(response.text)
I get a 200 response code for requests.put(), but the page remains empty when BeautifulSoup is used. If the body is passed directly in JSON format, the page is updated successfully.
INFO:conf_rest_get_page:--------------Getting page -------------------
DEBUG:conf_rest_get_page:Getting SMSESSION from URL
Accessible
DEBUG:conf_rest_get_page:URL for getting confluence page is URL
INFO:conf_rest_get_page:--------------Updating page ----------------
INFO:conf_rest_get_page:Page with id 261042277 updated successfully
2019-06-09 10.26.58 ===== table procedure Ended, Status : True =====
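One thing worth checking, sketched below with a made-up storage fragment: the code above passes response.content (the raw JSON bytes of the whole REST response) to BeautifulSoup, rather than the extracted body['storage']['value'] fragment. Parsing the fragment itself and converting the soup back to a plain string with str(soup) before embedding it in the payload may be what was intended; the payload shape follows the question, and the id/title/key values here are placeholders.

```python
# Sketch: edit only the storage-format fragment, then embed str(soup) in the
# payload. The fragment and the payload field values are placeholders.
import json
from bs4 import BeautifulSoup

storage_value = "<table><tr><td>found string</td></tr></table>"  # body['storage']['value']
soup = BeautifulSoup(storage_value, "html.parser")
for node in soup.find_all("td"):
    if node.text == "found string":
        node.string = "test string"

# str(soup) yields the edited markup as a plain string, safe to put in JSON
new_body = str(soup)
payload = {
    "id": "261042277", "type": "page", "title": "pageTitle",
    "space": {"key": "spaceKey"},
    "body": {"storage": {"value": new_body, "representation": "storage"}},
    "version": {"number": 2},
}
print(json.dumps(payload)[:80])
```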

Python 3.6 asyncio send json message error

I'm trying to set up a TCP echo client and server that can exchange messages using the JSON format.
I took the code from the documentation and modified it as follows (edit: now includes the fix, and has both server and client send JSON-style messages):
import asyncio
# https://docs.python.org/3/library/asyncio-stream.html
import json

async def handle_echo(reader, writer):
    data = await reader.read(100)
    message = json.loads(data.decode())
    addr = writer.get_extra_info('peername')
    print("Received %r from %r" % (message, addr))
    print("Send: %r" % json.dumps(message))
    json_mess_en = json.dumps(message).encode()
    writer.write(json_mess_en)
    #writer.write(json_mess)  # not working
    #writer.write(json.dumps(json_mess))  # not working
    # Yielding from drain() gives the loop the opportunity to schedule the
    # write operation and flush the buffer. It should especially be used when
    # a possibly large amount of data is written to the transport, and the
    # coroutine does not yield between calls to write().
    #await writer.drain()
    #print("Close the client socket")
    writer.close()

loop = asyncio.get_event_loop()
coro = asyncio.start_server(handle_echo, '0.0.0.0', 9090, loop=loop)
server = loop.run_until_complete(coro)

# Serve requests until Ctrl+C is pressed
print('Serving on {}'.format(server.sockets[0].getsockname()))
try:
    loop.run_forever()
except KeyboardInterrupt:
    pass

# Close the server
server.close()
loop.run_until_complete(server.wait_closed())
loop.close()
and the client code:
import asyncio
import json

async def tcp_echo_client(message, loop):
    reader, writer = await asyncio.open_connection('0.0.0.0', 9090, loop=loop)
    print('Send: %r' % message)
    writer.write(json.dumps(message).encode())
    data = await reader.read(100)
    data_json = json.loads(data.decode())
    print('Received: %r' % data_json)
    print(data_json['welcome'])
    print('Close the socket')
    writer.close()

message = {'welcome': 'Hello World!'}
loop = asyncio.get_event_loop()
loop.run_until_complete(tcp_echo_client(message, loop))
loop.close()
Error
TypeError: data argument must be a bytes-like object, not 'str'
Should I use another function than writer.write to encode for JSON? Or any suggestions?
Found the solution; replace:
writer.write(json.dumps(json_mess))
with:
# encode as UTF-8
json_mess_en = json.dumps(json_mess).encode()
writer.write(json_mess_en)
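The core of the fix in isolation: StreamWriter.write() needs a bytes-like object, so a JSON message must be serialized with json.dumps() and then encoded to UTF-8 before writing, and decoded again after reading on the other end.

```python
# json.dumps produces a str; .encode() turns it into the bytes that
# writer.write() requires, and .decode() reverses it on receipt.
import json

message = {'welcome': 'Hello World!'}

wire_bytes = json.dumps(message).encode()        # str -> bytes, ok for write()
assert isinstance(wire_bytes, bytes)

round_tripped = json.loads(wire_bytes.decode())  # bytes -> str -> dict
print(round_tripped['welcome'])                  # prints: Hello World!
```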

Error trying to load a JSON string in Python

I am new to Python, and reading about this it seems to be very easy, but for some reason I am unable to debug the error. I am guessing it is something very simple...
2 Functions
def get_json():
    return json.load(open('environment.json', 'r'))

def curlopia(j_son=get_json()):
    sf_url = j_son['sf_sandbox_url']['url']
    grant_type = j_son['oauth_parms']['grant_type']
    client_id = j_son['oauth_parms']['client_id']
    client_secret = j_son['oauth_parms']['client_secret']
    username = j_son['oauth_parms']['username']
    password = j_son['oauth_parms']['password']
    param = '-d'
I have a curl statement in a subprocess.call which returns a JSON string:
x = subprocess.call(["curl", sf_url, param, "grant_type=%s" % (grant_type), param, "client_id=%s" % (client_id), param, "client_secret=%s" % (client_secret), param, "username=%s" % (username), param, "password=%s" % (password)])
or
x = os.system('curl {0} -d "grant_type={1}" -d "client_id={2}" -d "client_secret={3}" -d "username={4}" -d "password={5}" -H "X-PrettyPrint:1"'.format(sf_url, grant_type, client_id, client_secret, username, password))
When I print x, the result has a trailing zero at the end:
{"id":"https://blah#blah.blah/","issued_at":"xxxxxxxxxxx","token_type":"Bearer","instance_url":"xxxxxxxxxx","signature":"xxxxxxxxx","access_token":"xxxxxxxxxxxxx"}0
Unsure why.
When I do
json.loads(x)
it gives me the error below. I have also tried various combinations.
File "/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/json/__init__.py", line 326, in loads
    return _default_decoder.decode(s)
File "/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/json/decoder.py", line 360, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
I am trying to understand why there is a trailing zero and whether the error is related to it. If that is the case, can someone suggest a way around it, and perhaps the correct method of doing this?
Thanks
You are trying to load an invalid JSON document.
From your reference to curl, I guess you need to get this document via an HTTP request.
Try fetching it with the requests library:
import requests
url = "http://example.com/api"
req = requests.get(url)
assert req.ok
data = req.json()
print data
Your real case might require a different url, method (POST...), and possibly headers, but these you already know from your existing curl statement.
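As an aside on the trailing zero: subprocess.call() returns the process's exit status (0 on success) while the command's stdout goes straight to the terminal, so printing x prints that 0 after curl's JSON output. To capture the JSON itself, use subprocess.check_output() and parse the captured bytes. The sketch below uses a stand-in command in place of the curl call.

```python
# subprocess.call -> exit status (the mystery "0"); check_output -> stdout.
import json
import subprocess
import sys

# stand-in for the curl command: a process that prints a JSON document
cmd = [sys.executable, "-c", """print('{"token_type": "Bearer"}')"""]

x = subprocess.call(cmd)            # an int exit status, not the JSON
out = subprocess.check_output(cmd)  # the actual stdout bytes

data = json.loads(out.decode())
print(x, data["token_type"])        # prints: 0 Bearer
```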