Maximo Integration With External JSON API

I have found various methods of invoking a JSON/REST API call from Maximo to an external system on the web, but none of them match exactly what I'm looking for, and they all seem to use different approaches, which is causing me a lot of confusion because I am EXTREMELY rusty at Jython coding. So, hopefully, you all can help me out. Please be as detailed as possible: script language, script type (integration, object launch point, publish channel process/user exit class, etc.).
My Maximo Environment is 7.6.1.1. I have created the following...
Maximo Object Structure (MXSRLEAK): with 3 objects (SR, TKSERVICEADDRESS, and WORKLOG)
Maximo Publish Channel (MXSRLEAK-PC): uses the MXSRLEAK OS and contains my processing rules to skip records if they don't meet the criteria (SITEID = 'TWOO', TEMPLATEID IN ('LEAK','LEAKH','LEAKW'), HISTORYFLAG = 0)
Maximo End Point (LEAKTIX): HTTP Handler, HEADERS ("Content-Type:application/json"), HTTPMETHOD ("POST"), URL (https:///api/ticket/?ticketid=), USERNAME ("Maximo"), and PASSWORD (). Allow Override is checked for HEADERS, HTTPMETHOD, and URL.
At this point, I need an automation script to:
Limit the Maximo attributes that I'm sending. This will vary depending on what happens on the Maximo side:
- If an externally created service request ticket (SOURCE = LEAKREP, EXTERNALRECID IS NOT NULL) gets cancelled, I need to send the last worklog with LOGTYPE = 'CANCOMM' (both summary/DESCRIPTION and details/DESCRIPTION_LONGDESCRIPTION) as well as the USERID that changed status.
- If an externally created SR ticket gets closed, I need to send the last worklog with LOGTYPE <> 'CANCOMM'.
- If the externally created SR ticket was a duplicate, I also need to include a custom field called "DUPLICATE" (which uses a table domain to show all open SRs with similar TEMPLATEIDs in the UI).
- If a "LEAK" SR ticket originated in Maximo (it has no SOURCE or EXTERNALRECID), I need to send data from the SR (e.g. DESCRIPTION, REPORTDATE, REPORTEDBY), TKSERVICEADDRESS (FORMATTEDADDRESS, etc.), and WORKLOG (DESCRIPTION, LONGDESCRIPTION if they exist) objects to the external system and parse the response to update SOURCE and EXTERNALRECID.
Update the Maximo End Point values for the API call: set HTTPMETHOD to "POST" or "PATCH", add HEADERS (Authorization: Basic <Base64-encoded userid:password>), etc.
Below is my latest attempt at an automation script, which doesn't work because "mbo is not defined" (I'm sure there are more problems with it, but it fails early in the script). The script was created for integration, with a publish channel (MXSRLEAK-PC), using the External Exit option in Jython. I was trying to start with just one scenario, where the Maximo SR ticket was originally created via an API call from the external system into Maximo and was actually a duplicate of another Maximo SR ticket. My thought was that if I got this part correct, I could update the script to include the other scenarios, such as an SR ticket that originated in Maximo and needs to POST a new record to the external system.
My final question is: is it better (easier for future eyes to understand) to have one Object Structure, Publish Channel, End Point, and Automation Script handle all scenarios, or to create separate ones for each scenario?
from com.ibm.json.java import JSONObject
from java.io import BufferedReader, IOException, InputStreamReader
from java.lang import System, Class, String, StringBuffer
from java.nio.charset import Charset
from java.util import Date, Properties, List, ArrayList, HashMap
from org.apache.commons.codec.binary import Base64
from org.apache.http import HttpEntity, HttpHeaders, HttpResponse, HttpVersion
from org.apache.http.client import ClientProtocolException, HttpClient
from org.apache.http.client.entity import UrlEncodedFormEntity
from org.apache.http.client.methods import HttpPost
from org.apache.http.entity import StringEntity
from org.apache.http.impl.client import DefaultHttpClient
from org.apache.http.message import BasicNameValuePair
from org.apache.http.params import BasicHttpParams, HttpParams, HttpProtocolParamBean
from psdi.mbo import Mbo, MboRemote, MboSet, MboSetRemote
from psdi.security import UserInfo
from psdi.server import MXServer
from psdi.iface.router import Router
from sys import *
leakid = mbo.getString("EXTERNALRECID")
#Attempting to pull current SR worklog using object relationship and attribute
maxlog = mbo.getString("DUPWORKLOG.description")
maxloglong = mbo.getString("DUPWORKLOG.description_longdescription")
clientEndpoint = Router.getHandler("LEAKTIX")
cEmap = HashMap()
host = cEmap.get("URL")+leakid
method = cEmap.get("HTTPMETHOD")
currhead = cEmap.get("HEADERS")
tixuser = cEmap.get("USERNAME")
tixpass = cEmap.get("PASSWORD")
auth = tixuser + ":" + tixpass
authHeader = String(Base64.encodeBase64(String.getBytes(auth, 'ISO-8859-1')),"UTF-8")
def createJSONstring():
    obj = JSONObject()
    obj.put("status_code", "1")
    obj.put("solution", "DUPLICATE TICKET")
    obj.put("solution_notes", maxlog + " " + maxloglong)
    jsonStr = obj.serialize(True)
    return jsonStr

def httpPost(path, jsonstring):
    params = BasicHttpParams()
    paramsBean = HttpProtocolParamBean(params)
    paramsBean.setVersion(HttpVersion.HTTP_1_1)
    paramsBean.setContentCharset("UTF-8")
    paramsBean.setUseExpectContinue(True)
    entity = StringEntity(jsonstring, "UTF-8")
    client = DefaultHttpClient()
    request = HttpPost(host)
    request.setParams(params)
    #request.addHeader(currhead)
    request.addHeader(HttpHeaders.CONTENT_TYPE, "application/json")
    request.addHeader(HttpHeaders.AUTHORIZATION, "Basic " + authHeader)
    request.setEntity(entity)
    response = client.execute(request)
    status = response.getStatusLine().getStatusCode()
    obj = JSONObject.parse(response.getEntity().getContent())
    System.out.println(str(status) + ": " + str(obj))

Sorry for the late response. An external exit script doesn't get mbo as an implicit variable; that's why it fails. Instead, it works with irData, breaking down the structure data first, and any further manipulation comes after that.
What I understood is that you need to post the payload dynamically based on some conditions in Maximo. For that, I think you can write a custom handler which will be called during the post, as sketched below.
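To make that concrete, here is a minimal, untested sketch in Jython. Rather than hand-rolling a DefaultHttpClient, it reuses the LEAKTIX endpoint: Router.getHandler() returns the endpoint's handler, and the Map passed to invoke() overrides endpoint properties that have Allow Override checked (URL, HTTPMETHOD, and HEADERS in your setup). The leakid, maxlog, maxloglong, and authHeader values are assumed to have been extracted earlier in the script (from the parsed irData, not from mbo):

from com.ibm.json.java import JSONObject
from java.lang import String
from java.util import HashMap
from psdi.iface.router import Router

# Build the payload for the duplicate-ticket scenario
obj = JSONObject()
obj.put("status_code", "1")
obj.put("solution", "DUPLICATE TICKET")
obj.put("solution_notes", maxlog + " " + maxloglong)
payload = obj.serialize(True)

# Per-call overrides: keys are endpoint property names; only properties
# with Allow Override checked are actually replaced at runtime
overrides = HashMap()
overrides.put("URL", "https:///api/ticket/?ticketid=" + leakid)  # host elided as in the question
overrides.put("HTTPMETHOD", "PATCH")
overrides.put("HEADERS", "Content-Type:application/json,Authorization:Basic " + authHeader)

# Let the endpoint's HTTP handler make the call and hand back the raw response
handler = Router.getHandler("LEAKTIX")
responseBytes = handler.invoke(overrides, String(payload).getBytes("UTF-8"))
responseObj = JSONObject.parse(String(responseBytes, "UTF-8"))

From there you can branch on your scenarios (cancelled, closed, duplicate, Maximo-originated) before building the payload, and parse responseObj to update SOURCE and EXTERNALRECID in the Maximo-originated case.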

Related

Azure Message Routing: JSON message in wrong format

I'm working with a Raspberry Pi Zero and Python to send and receive sensor data with Azure IoT. I've already created an endpoint and message routing to the storage container. But when I check the JSON files in the container, I have two problems:
The files include various general data which I don't need
My message body is in Base64 format
My message looks like this:
{"EnqueuedTimeUtc":"2021-06-25T13:03:25.7110000Z","Properties":{},"SystemProperties":{"connectionDeviceId":"RaspberryPi","connectionAuthMethod":"{"scope":"device","type":"sas","issuer":"iothub","acceptingIpFilterRule":null}","connectionDeviceGenerationId":"637555519600003402","enqueuedTime":"2021-06-25T13:03:25.7110000Z"},"Body":"eyJ0ZW1wZXJhdHVyZSI6IDI4Ljk1LCAicHJlc3N1cmUiOiA5ODEuMDg2Njk1NDU5MzMyNiwgImh1bWlkaXR5IjogNDYuMjE0ODE3NjkyOTEyODgsICJ0aW1lIjogIjIwMjEtMDYtMjUgMTQ6MDM6MjUuNjMxNzk1In0="}
The body contains my sensor data in Base64 format. I've already read about contentType = application/json and contentEncoding = UTF-8 so that Azure can work with correct JSON files. But where do I apply these settings? When I apply them to the routing query, I get the following error:
Routing Query Error (The server didn't understand your query. Check your query syntax and try again)
I just want to get the body of the message in correct JSON format.
Thank you all for any kind of help! Since it's my first experience with this kind of stuff, I'm a little helpless.
Zero clue if this helps, but here is my code for sending data from a Raspberry Pi (Python) to an AWS Parse Server using base64/JSON. The only reason I use base64 is to send pictures; you should only need plain JSON to send your other data.
import requests
import random, time
import math
import json
import Adafruit_DHT
import base64
from Adafruit_CCS811 import Adafruit_CCS811
from picamera import PiCamera
from time import sleep
DHT_SENSOR = Adafruit_DHT.DHT22
DHT_PIN =4
ccs = Adafruit_CCS811()
camera = PiCamera()
while True:
    time.sleep(5)
    camera.start_preview()
    sleep(5)
    camera.capture('/home/pi/Desktop/image.jpg')
    camera.stop_preview()
    with open('/home/pi/Desktop/image.jpg', 'rb') as binary_file:
        binary_file_data = binary_file.read()
        base64_encoded_data = base64.b64encode(binary_file_data)
        base64_message = base64_encoded_data.decode('utf-8')
    humidity, temperature = Adafruit_DHT.read_retry(DHT_SENSOR, DHT_PIN)
    ccs.readData()
    parseServer = {
        "temp": temperature,
        "humid": humidity,
        "co2": ccs.geteCO2(),
        "pic": base64_message
    }
    resultJSON = json.dumps(parseServer)
    headers = {
        'X-Parse-Application-Id': 'myappID',
        'Content-Type': 'application/json',
    }
    data = resultJSON
    response = requests.put('http://1.11.111.1111/parse/classes/Gamefuck/TIuRnws3Ag',
                            headers=headers, data=data)
    print(data)
If you're using the Python SDK for Azure IoT, sending the message as UTF-8 encoded JSON is as easy as setting two properties on your message object. There is a good example here
msg.content_encoding = "utf-8"
msg.content_type = "application/json"
Furthermore, you don't need to change anything in IoT Hub for this. This message setting is a prerequisite to be able to do message routing based on the body of the message.
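To make that concrete, here is a minimal sketch using the azure-iot-device SDK. The connection string and sensor values are placeholders, not taken from your setup:

import json
from azure.iot.device import IoTHubDeviceClient, Message

# Placeholder connection string - substitute your device's own
CONN_STR = "HostName=<your-hub>.azure-devices.net;DeviceId=RaspberryPi;SharedAccessKey=<key>"

client = IoTHubDeviceClient.create_from_connection_string(CONN_STR)

payload = {"temperature": 28.95, "pressure": 981.09, "humidity": 46.21}
msg = Message(json.dumps(payload))
msg.content_encoding = "utf-8"         # tell IoT Hub the body is UTF-8 text
msg.content_type = "application/json"  # marks the body as JSON for routing

client.send_message(msg)
client.shutdown()

With those two properties set, IoT Hub can interpret the message body, which is the prerequisite mentioned above for routing on the body instead of storing it as an opaque Base64 string.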

Import csv file in drf

I'm trying to create a view to import a csv using drf and django-import-export.
My example (I'm doing baby steps and debugging to learn):
class ImportMyExampleView(APIView):
    parser_classes = (FileUploadParser, )

    def post(self, request, filename, format=None):
        person_resource = PersonResource()
        dataset = Dataset()
        new_persons = request.data['file']
        imported_data = dataset.load(new_persons.read())
        return Response("Ok - Babysteps")
But I get this error (using postman):
Tablib has no format 'None' or it is not registered.
Changing it to imported_data = Dataset().load(new_persons.read().decode(), format='csv', headers=False), I get this new error:
InvalidDimensions at /v1/myupload/test_import.csv
No exception message supplied
Does anyone have any tips or can indicate a reference? I'm following this site, but I'm having to "translate" to drf.
Starting with baby steps is a great idea. I would suggest getting a standalone script working first, so that you can check that the file can be read and imported.
If you can set breakpoints and step into the django-import-export source, this will save you a lot of time in understanding what's going on.
A sample test function (based on the example app):
def test_import():
    with open('./books-sample.csv', 'r') as fh:
        dataset = Dataset().load(fh)
        book_resource = BookResource()
        result = book_resource.import_data(dataset, raise_errors=True)
        print(result.totals)
You can adapt this so that you import your own data. Once this works OK then you can integrate it with your post() function.
I recommend getting the example app running because it will demonstrate how imports work.
InvalidDimensions means that the dataset you're trying to load doesn't match the format expected by Dataset. Try removing the headers=False arg or explicitly declare the headers (headers=['h1', 'h2', 'h3'] - swap in the correct names for your headers).
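Once the standalone test passes, the pieces can go back into the view. A rough, untested sketch (it assumes a UTF-8 CSV upload with a header row whose columns match PersonResource):

from rest_framework.views import APIView
from rest_framework.parsers import FileUploadParser
from rest_framework.response import Response
from tablib import Dataset

class ImportMyExampleView(APIView):
    parser_classes = (FileUploadParser, )

    def post(self, request, filename, format=None):
        person_resource = PersonResource()
        # Declare the format explicitly; tablib can't guess it from a raw string
        dataset = Dataset().load(request.data['file'].read().decode('utf-8'), format='csv')
        # dry_run=True validates the rows without touching the database
        result = person_resource.import_data(dataset, dry_run=True)
        if result.has_errors():
            return Response("Import failed", status=400)
        person_resource.import_data(dataset, dry_run=False)
        return Response("Ok - imported")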

HTTP-GET via SSIS

I have an Ethernet device which collects data, and it's possible to download the data via its export interface: an HTTP GET query returns the data as [Content-Type: text/plain Charset: utf-8].
I saw this: How to make an HTTP request from SSIS? - but it doesn't really work for me (C# is all Greek to me), and my question is how to fetch this data into an SSIS variable.
In your SSIS package add a C# Script Task
Edit the Script Task
At the top, with the other using statements, add using System.Net; and using System.IO; (needed for Stream and StreamReader)
In Main, use the following code snippet to make a GET request (Note: change "https://somewhere.com/contacts/get" to your actual endpoint):
HttpWebRequest request = (HttpWebRequest)WebRequest.Create("https://somewhere.com/contacts/get");
request.AutomaticDecompression = DecompressionMethods.GZip | DecompressionMethods.Deflate;

using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
using (Stream stream = response.GetResponseStream())
using (StreamReader reader = new StreamReader(stream))
{
    // Main() returns void, so store the body in an SSIS variable instead of returning it
    // ("User::ResponseText" is an example name - add it to the task's ReadWriteVariables first)
    Dts.Variables["User::ResponseText"].Value = reader.ReadToEnd();
}

How to pull data from Toggl API with Power Query?

First-timer here when it comes to connecting to an API. I'm trying to pull data from Toggl using my API token, but I can't get the credentials working. I tried to replicate the method by Chris Webb (https://blog.crossjoin.co.uk/2014/03/26/working-with-web-services-in-power-query/) but I can't get it working. Here's my M code:
let
Source = Web.Contents(
"https://toggl.com/reports/api/v2/details?workspace_id=xxxxx&client=xxxxxx6&billable=yes&user_agent=xxxxxxx",
[
Query=[ #"filter"="", #"orderBy"=""],
ApiKeyName="api-token"
])
in
Source
After that I'm entering my API token into the Web API method in the Access Web Content window, but I get an error that the credentials could not be authenticated. Here's the Toggl API specification:
https://github.com/toggl/toggl_api_docs/blob/master/reports.md
The Web.Contents function receives two parameters: a URL plus an options record.
Inside options you define the headers, the API key, and other queryable properties, such as:
let
    baseUrl = "https://toggl.com/",
    // the token part can vary depending on the requisites of the API
    accessToken = "Bearer " & "insert api token here",
    options = [
        Headers = [Authorization = accessToken, #"Content-Type" = "application/json"],
        RelativePath = "reports/api/v2/details",
        Query = [workspace_id = "xxxxx", client = "xxxxxx6", billable = "yes", user_agent = "xxxxxxx"]
    ],
    Source = Web.Contents(baseUrl, options),
    // since Web.Contents() doesn't parse the binaries it fetches, you must use another
    // function to parse the data it retrieved, based on the datatype of the data
    parsedData = Json.Document(Source)
in
    parsedData
The baseUrl is the smallest url that works and never changes;
The RelativePath is the next part of the url before the first "?".
The Query record is where you define all the attributes to query as a record.
This is usually the format, but check the documentation of the API you're querying to see if it is similar.

PUT requests with Custom Ember-Data REST Adapter

I'm using Ember-Data 1.0.0.Beta-9 and Ember 1.7 to consume a REST API via DreamFactory's REST Platform (http://www.dreamfactory.com).
I've had to extend the RESTAdapter in order to use DreamFactory, and I've been able to implement GET and POST requests with no problems. I am now trying to implement model.save() (PUT) requests and am having a serious hiccup.
Calling model.save() sends the PUT request with the correct data to my API endpoint, and I get a 200 OK response with a JSON body of { "id": "1" }, which is what is supposed to happen. However, when I try to access the updated record, all of the properties are empty except for ID, and the record on the server is not updated. I can take the same JSON string passed in the request, paste it into the DreamFactory Swagger API docs, and it works with no problem - the response is good and the record is updated in the DB.
I've created a JSBin to show all of the code at http://emberjs.jsbin.com/nagoga/1/edit
Unfortunately I can't have a live example as the servers in question are locked down to only accept requests from our company's public IP range.
DreamFactory provides a live demo of the API in question at
https://dsp-sandman1.cloud.dreamfactory.com/swagger/#!/db/replaceRecordsByIds
OK in the end I discovered that you can customize the DreamFactory response by adding a ?fields=* param to the end of the PUT request. I monkey-patched that into my updateRecord method using the following:
updateRecord: function(store, type, record) {
  var data = {};
  var serializer = store.serializerFor(type.typeKey);
  serializer.serializeIntoHash(data, type, record);
  var adapter = this;

  return new Ember.RSVP.Promise(function(resolve, reject) {
    // hack to make DSP send back the full object
    adapter.ajax(adapter.buildURL(type.typeKey) + '?fields=*', "PUT", { data: data }).then(function(json) {
      // if the request is a success we'll return the same data we passed in
      resolve(json);
    }, function(reason) {
      reject(reason.responseJSON);
    });
  });
}
And poof we haz updates!
DreamFactory has support for tacking several params onto the end of the requests to fully customize the response - at some point I will look to implement this correctly but for the time being I can move forward with my project. Yay!
Ember Data is interpreting the response from the server as an empty object with an id of "1" and no other properties in it. You need to return the entire updated record from the server with the changes reflected - every attribute, not just { "id": "1" }.