I am trying to create a Python function in which I can substitute a ticker into an API GET request. Below is a simple code example of what I am intending to do.
def tickr_search(tickr):
    print(requests.get(f'https://financialmodelingprep.com/api/v3/income-statement/**tickr**?limit=120&apikey=MYAPIKEY').json())
    return

tickr_search(AAPL)
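For what it's worth, the substitution works by putting the function parameter inside the f-string braces and passing the symbol as a string. A minimal sketch, assuming MYAPIKEY is replaced with a real key:

import requests

def tickr_search(tickr):
    # The braces substitute the function argument into the URL at request time
    url = f'https://financialmodelingprep.com/api/v3/income-statement/{tickr}?limit=120&apikey=MYAPIKEY'
    return requests.get(url).json()

print(tickr_search('AAPL'))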
I have found various methods of invoking a JSON/REST API call from Maximo to an external system on the web, but none of them have matched exactly what I'm looking for, and all of them seem to use different methods, which is causing me a lot of confusion because I am EXTREMELY rusty at Jython coding. So, hopefully, you all can help me out. Please be as detailed as possible: script language, script type (for integration, object launch point, publish channel process/user exit class, etc.).
My Maximo Environment is 7.6.1.1. I have created the following...
Maximo Object Structure (MXSRLEAK): with 3 objects (SR, TKSERVICEADDRESS, and WORKLOG)
Maximo Publish Channel (MXSRLEAK-PC): uses the MXSRLEAK OS and contains my processing rules to skip records if they don't meet the criteria (SITEID = 'TWOO', TEMPLATEID IN ('LEAK','LEAKH','LEAKW'), HISTORYFLAG = 0)
Maximo End Point (LEAKTIX): HTTP Handler, HEADERS ("Content-Type:application/json"), HTTPMETHOD ("POST"), URL (https:///api/ticket/?ticketid=), USERNAME ("Maximo"), and PASSWORD (). The Allow Override is checked for HEADERS, HTTPMETHOD, and URL.
At this point, I need an automation script to:
Limit the Maximo attributes that I'm sending. This will vary depending on what happens on the Maximo side:
If an externally created (SOURCE = LEAKREP, EXTERNALRECID IS NOT NULL) service request ticket gets cancelled, I need to send the last worklog with logtype = "CANCOMM" (both summary/description and details/description_longdescription) as well as the USERID that changed the status.
If an externally created SR ticket gets closed, I need to send the last worklog with logtype <> "CANCOMM". If the externally created SR ticket was a duplicate, I also need to include a custom field called "DUPLICATE" (which uses a table domain to show all open SRs with similar TEMPLATEIDs in the UI).
If a "LEAK" SR ticket originated in Maximo (it doesn't have a SOURCE or EXTERNALRECID), then I need to send data from the SR (e.g. DESCRIPTION, REPORTDATE, REPORTEDBY), TKSERVICEADDRESS (FORMATTEDADDRESS, etc.), and WORKLOG (DESCRIPTION, LONGDESCRIPTION if they exist) objects to the external system and parse the response to update SOURCE and EXTERNALRECID.
Update Maximo End Point values for API call: HTTPMETHOD to "POST" or "PATCH", Add HEADERS (Authorization: Basic Base64Userid/Password), etc.
Below is my latest attempt with an automation script, which doesn't work because "mbo" is not defined (I'm sure there are more problems with it, but it fails early in the script). The script was created for integration, with a publish channel (MXSRLEAK-PC), using the External Exit option in Jython. I was trying to start with just one scenario, where the Maximo SR ticket was originally created via an API call from the external system into Maximo and was actually a duplicate of another Maximo SR ticket. My thought was that if I got this part correct, I could update the script to include the other scenarios, such as when the SR ticket originated in Maximo and needs to POST a new record to the external system.
My final question is: is it better (easier for future eyes to understand) to have one Object Structure, Publish Channel, End Point, and Automation Script to handle all scenarios, or to create separate ones for each scenario?
from com.ibm.json.java import JSONObject
from java.io import BufferedReader, IOException, InputStreamReader
from java.lang import System, Class, String, StringBuffer
from java.nio.charset import Charset
from java.util import Date, Properties, List, ArrayList, HashMap
from org.apache.commons.codec.binary import Base64
from org.apache.http import HttpEntity, HttpHeaders, HttpResponse, HttpVersion
from org.apache.http.client import ClientProtocolException, HttpClient
from org.apache.http.client.entity import UrlEncodedFormEntity
from org.apache.http.client.methods import HttpPost
from org.apache.http.entity import StringEntity
from org.apache.http.impl.client import DefaultHttpClient
from org.apache.http.message import BasicNameValuePair
from org.apache.http.params import BasicHttpParams, HttpParams, HttpProtocolParamBean
from psdi.mbo import Mbo, MboRemote, MboSet, MboSetRemote
from psdi.security import UserInfo
from psdi.server import MXServer
from psdi.iface.router import Router
from sys import *
leakid = mbo.getString("EXTERNALRECID")
#Attempting to pull current SR worklog using object relationship and attribute
maxlog = mbo.getString("DUPWORKLOG.description")
maxloglong = mbo.getString("DUPWORKLOG.description_longdescription")
clientEndpoint = Router.getHandler("LEAKTIX")
cEmap = HashMap()
host = cEmap.get("URL")+leakid
method = cEmap.get("HTTPMETHOD")
currhead = cEmap.get("HEADERS")
tixuser = cEmap.get("USERNAME")
tixpass = cEmap.get("PASSWORD")
auth = tixuser + ":" + tixpass
authHeader = String(Base64.encodeBase64(String.getBytes(auth, 'ISO-8859-1')),"UTF-8")
def createJSONstring():
    jsonStr = ""
    obj = JSONObject()
    obj.put("status_code", "1")
    obj.put("solution", "DUPLICATE TICKET")
    obj.put("solution_notes", maxlog+" "+maxloglong)
    jsonStr = obj.serialize(True)
    return jsonStr

def httpPost(path, jsonstring):
    params = BasicHttpParams()
    paramsBean = HttpProtocolParamBean(params)
    paramsBean.setVersion(HttpVersion.HTTP_1_1)
    paramsBean.setContentCharset("UTF-8")
    paramsBean.setUseExpectContinue(True)
    entity = StringEntity(jsonstring, "UTF-8")
    client = DefaultHttpClient()
    request = HttpPost(host)
    request.setParams(params)
    #request.addHeader(currhead)
    request.addHeader(HttpHeaders.CONTENT_TYPE, "application/json")
    request.addHeader(HttpHeaders.AUTHORIZATION, "Basic "+authHeader)
    request.setEntity(entity)
    response = client.execute(request)
    status = response.getStatusLine().getStatusCode()
    obj = JSONObject.parse(response.getEntity().getContent())
    System.out.println(str(status)+": "+str(obj))
Sorry for the late response. An external exit script does not have mbo available as an implicit variable. Instead, it receives irData, and you work with the structure data after breaking it apart; any further manipulation comes after that.
What I understand is that you need to post the payload dynamically based on some conditions in Maximo. For that, I think you can write a custom handler that will be called during the post.
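Just to illustrate the idea, here is a rough, hypothetical sketch of how the conditional payload build could look, reusing the JSONObject class your script already imports. The variable names (srStatus, isDuplicate, worklogSummary, worklogDetails) and the status codes are placeholders for values you would pull out of irData or the SR record:

def buildPayload(srStatus, isDuplicate, worklogSummary, worklogDetails):
    # Build a different JSON body depending on what happened to the SR
    obj = JSONObject()
    if srStatus == "CLOSED" and isDuplicate:
        obj.put("status_code", "1")          # placeholder status code
        obj.put("solution", "DUPLICATE TICKET")
        obj.put("solution_notes", worklogSummary + " " + worklogDetails)
    elif srStatus == "CANCELLED":
        obj.put("status_code", "2")          # placeholder status code
        obj.put("solution_notes", worklogSummary + " " + worklogDetails)
    return obj.serialize(True)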
I'm using curl to submit form data to my website.
curl -F some_file=@file.txt -F name=test_01 https://localhost:8000
It's not an API but I have a requirement for a single endpoint that behaves as an API. I'm a little out of my depth here, so I'm hoping someone can help me.
I've got the model set up and working and the CreateView, as well:
class CreateFile(CreateView):
    model = SomeFile
    fields = ['name', 'some_file', ...]
When I send a POST request with curl as above to the specified URL (/file/request), the object is created in the DB and I get a response (e.g. /thanks, which is an HTTP response from a template view). But since a non-browser will be sending this request, I was hoping to respond with some JSON instead, maybe with the object's name, status, etc.
I've tried a few things with mixed results... For example, if I use View instead of CreateView, I can return JSON but I really like the ease and convenience of the CreateView CBV, so I'm hoping I can do what I want this way.
How can I do this? I found a SO question that gave some clues: How do I return JSON response in Class based views, instead of HTTP response
But this deals with the typical form/view model in the browser. If I have to override the post method, what's the best way to get the form data so I can create the object? Do I need a form class even though I'm not processing a rendered form?
I ended up going with something from the Django docs:
from django.http import JsonResponse

class JSONResponseMixin:
    """
    A mixin that can be used to render a JSON response
    """
    def render_to_json_response(self, context, **response_kwargs):
        return JsonResponse(self.get_data(context), **response_kwargs)

    def get_data(self, context):
        return context
Then I used this in a DetailView, overriding both get and post methods.
class FileRequest(JSONResponseMixin, DetailView):
    def get(self, request, *args, **kwargs):
        . . .
        return self.render_to_response(response_data)

    def post(self, request, *args, **kwargs):
        . . .
        return self.render_to_response(response_data)

    def render_to_response(self, context, **response_kwargs):
        return self.render_to_json_response(context, **response_kwargs)
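For what it's worth, if you would rather keep CreateView itself, a hedged alternative sketch is to override form_valid() and form_invalid() so the view answers with JSON instead of redirecting (field names here are illustrative; a non-browser client may also need CSRF handling, e.g. csrf_exempt):

from django.http import JsonResponse
from django.views.generic.edit import CreateView

class CreateFile(CreateView):
    model = SomeFile
    fields = ['name', 'some_file']

    def form_valid(self, form):
        # Save the object exactly as CreateView would, then answer with JSON
        self.object = form.save()
        return JsonResponse({'name': self.object.name, 'status': 'created'}, status=201)

    def form_invalid(self, form):
        # get_json_data() gives a plain dict that JsonResponse can serialize (Django 2.0+)
        return JsonResponse({'errors': form.errors.get_json_data()}, status=400)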
I'm trying to create a view to import a csv using drf and django-import-export.
My example (I'm doing baby steps and debugging to learn):
class ImportMyExampleView(APIView):
    parser_classes = (FileUploadParser, )

    def post(self, request, filename, format=None):
        person_resource = PersonResource()
        dataset = Dataset()
        new_persons = request.data['file']
        imported_data = dataset.load(new_persons.read())
        return Response("Ok - Babysteps")
But I get this error (using postman):
Tablib has no format 'None' or it is not registered.
Changing to imported_data = Dataset().load(new_persons.read().decode(), format='csv', headers=False) I get this new error:
InvalidDimensions at /v1/myupload/test_import.csv
No exception message supplied
Does anyone have any tips or can indicate a reference? I'm following this site, but I'm having to "translate" to drf.
Starting with baby steps is a great idea. I would suggest getting a standalone script working first so that you can check that the file can be read and imported.
If you can set breakpoints and step into the django-import-export source, this will save you a lot of time in understanding what's going on.
A sample test function (based on the example app):
def test_import():
    with open('./books-sample.csv', 'r') as fh:
        dataset = Dataset().load(fh)
        book_resource = BookResource()
        result = book_resource.import_data(dataset, raise_errors=True)
        print(result.totals)
You can adapt this so that you import your own data. Once this works OK then you can integrate it with your post() function.
I recommend getting the example app running because it will demonstrate how imports work.
InvalidDimensions means that the dataset you're trying to load doesn't match the format expected by Dataset. Try removing the headers=False arg or explicitly declare the headers (headers=['h1', 'h2', 'h3'] - swap in the correct names for your headers).
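Once the standalone test works, the integration into the DRF view might look roughly like this; it is only a sketch, assuming a CSV upload with a header row, with PersonResource being the resource class from your question:

from rest_framework.parsers import FileUploadParser
from rest_framework.response import Response
from rest_framework.views import APIView
from tablib import Dataset

class ImportMyExampleView(APIView):
    parser_classes = (FileUploadParser, )

    def post(self, request, filename, format=None):
        person_resource = PersonResource()
        uploaded_file = request.data['file']
        # Decode the bytes and tell tablib the format explicitly, keeping the header row
        dataset = Dataset().load(uploaded_file.read().decode('utf-8'), format='csv')
        # Dry run first so nothing is written if there are errors
        result = person_resource.import_data(dataset, dry_run=True, raise_errors=True)
        if not result.has_errors():
            person_resource.import_data(dataset, dry_run=False)
            return Response({'totals': result.totals})
        return Response({'detail': 'import failed'}, status=400)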
I've been looking for info about this for hours without any result. I am rendering a page using React, and I would like it to display a list of Django models. I am trying to use ajax to fetch the list of models but without any success.
I am not sure I understand the concept behind JSon, because when I use the following code in my view:
data = list(my_query_set.values_list('categories', 'content'))
return JsonResponse(json.dumps(data, cls=DjangoJSONEncoder), safe=False)
It seems to only return a string that I cannot map (React says that map is not a function when I call it on the returned object). I thought map was meant to iterate over a JSON object, and that json.dumps was supposed to create one...
Returned JSON "object" (which I believe is just a string):
For the time being I have only one test model with no category and the content "At least one note "
[[null, "At least one note "]]
React code:
$.ajax({
    type: "POST",
    url: "",
    data: data,
    success: function (xhr, ajaxOptions, thrownError) {
        var mapped = xhr.map(function(note){
            return(
                <p>
                    {note.categories}
                    {note.content}
                </p>
            )
        })
        _this.setState({notes: mapped})
    },
    error: function (xhr, ajaxOptions, thrownError) {
        alert("failed");
    }
});
Can someone please point me to the best way to send Models from Django to React, so I can use the data from this model in my front end?
I recommend using the Django REST Framework to connect Django to your React front-end. The usage pattern for DRF is:
Define serializers for your models. These define what fields are included in the JSONified objects you will send to the front-end. In your case you might specify the fields 'categories' and 'content.'
Create an API endpoint. This is the URL you will issue requests to from React to access objects/models.
From React, issue a GET request to retrieve a (possibly filtered) set of objects. You can also set up other endpoints to modify or create objects when receiving POST requests from your React front-end.
In the success function of your GET request, you will receive an Array of Objects with the fields you set in your serializer. In your example case, you would receive an Array of length 1 containing an object with fields 'categories' and 'content.' So xhr[0].content would have the value "At least one note ".
In your code, the call to json.dumps within the JsonResponse function is redundant. Check out the docs for an example of serializing a list using JsonResponse. If you are serializing the object manually (which I don't recommend), I'd use a dictionary rather than a list -- something like {'categories': <value>, 'content' : <value>}. DRF will serialize objects for you like this, so the fields are easier to access and interpret on the front-end.
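As a rough sketch of that pattern (assuming a model named Note with 'categories' and 'content' fields; the names are placeholders):

from rest_framework import serializers, generics

class NoteSerializer(serializers.ModelSerializer):
    class Meta:
        model = Note
        fields = ['categories', 'content']

class NoteListView(generics.ListAPIView):
    # GET on this view returns a JSON array of {'categories': ..., 'content': ...}
    queryset = Note.objects.all()
    serializer_class = NoteSerializer

Wire NoteListView into urls.py and point the React GET request at that URL; the success callback then receives a real array it can map over.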
I am creating a controller in the OpenERP framework. Below is my code, where I set http.route with type="http":
import openerp.http as http
from openerp.http import request

class MyController(http.Controller):

    @http.route('demo_html', type="http")
    def some_html(self):
        return "<h1>This is a test</h1>"
The above code works perfectly: once I log into OpenERP and browse to http://localhost:8069/demo_html, it returns the result "This is a test" as an h1 heading.
But when I try type="json" in the same way, add the following JSON code, and call the URL http://localhost:8069/demo_json, it does not work properly and shows me the error "Internal Server Error".
import openerp.http as http
from openerp.http import request

class MyController(http.Controller):

    @http.route('demo_html', type="http")   # Works perfectly when I call this URL
    def some_html(self):
        return "<h1>This is a test</h1>"

    @http.route('demo_json', type="json")   # Not working when I call this URL
    def some_json(self):
        return {"sample_dictionary": "This is a sample JSON dictionary"}
So my question is: how do I route JSON? Any help would be appreciated. Thank you.
This is because there is a difference between type="json" and type="http".
type="json":
The route is exposed over JSON-RPC, so only JSON data can be passed to it; the method will only accept a JSON object as its argument.
type="http":
Compared to JSON, an http route passes the HTTP request arguments to http.route(), not JSON data.
So I think you need to do some extra work when using type="json": you have to trigger the method using a JSON-RPC call from JavaScript, like:
$(document).ready(function () {
    openerp.jsonRpc("demo_json", 'call', {})
        .then(function (data) {
            $('body').append(data[0]);
        });
    return;
})
And yes, do not forget to return your dictionary inside a list, like:
@http.route('demo_json', type="json")
def some_json(self):
    return [{"sample_dictionary": "This is a sample JSON dictionary"}]