IBM Maximo - Querying an API for data with very slow response time (JSON)

I have been looking everywhere for a solution to this problem.
At my work, we are trying to integrate Maximo with another system via the other system's REST API (which returns JSON responses). I am able to make this integration work on a small scale, but the API takes upwards of 5 seconds to respond per request. Currently, I have defined the external system as a JSON Resource, and I copy daily "snapshots" of the non-persistent data to a persistent attribute using an automation script. The requests all run in sequence - which is tolerable for 5 assets in testing, but will definitely not scale to thousands of calls a day.
Assume that the API of the external system cannot be modified in any way... Is there a way to query this API in a non-blocking fashion? I'd imagine that if I could send one request, then the next, and so on, without needing to wait for a reply before proceeding, that would solve the problem.
I looked into Invocation Channels and Publish Channels, and also Enterprise Services, and it seems like Enterprise Services along with JMS queues might be what I need. However, the documentation says these only support queuing incoming data... and I can't see how that solves my problem.
Any help? I am completely stuck on this.
Thank you!

I had to do something that sounds similar, once. I tried JSON Resources, but they didn't work for me. I ended up using the examples in Maximo 7.6 Scripting Features to do it. The first code sample in that document is a library script for making HTTP/S calls using out-of-the-Maximo-box libraries, and other examples in that document use IBM's JSONObject and JSONArray classes (also available out of the Maximo box) to parse responses.
To get things going concurrently / multithreaded, you could configure a cron task to call your automation script, then configure multiple instances of it on various schedules calling that same script, using the instance args or some other mechanism to keep the instances from colliding.
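As a rough, hypothetical sketch of that pattern (Maximo automation scripts can be written in Jython, i.e. Python on the JVM): each cron-task instance is given an arg like "0:4" and only touches its own slice of the work. The endpoint URL, instance_arg, and asset_numbers below are placeholders for whatever your cron task actually passes in, not real Maximo implicit variables.

```python
# Hypothetical Jython automation-script sketch: several cron-task instances
# run this same script concurrently, each with a different arg such as
# "0:4", "1:4", "2:4", "3:4", so their slow HTTP calls overlap in time.
from java.net import URL
from java.io import BufferedReader, InputStreamReader
from com.ibm.json.java import JSONObject  # JSON parser shipped with Maximo

def fetch_snapshot(asset_num):
    # Placeholder endpoint on the slow external system
    conn = URL("https://external.example.com/api/assets/" + asset_num).openConnection()
    conn.setRequestProperty("Accept", "application/json")
    conn.setConnectTimeout(10000)
    reader = BufferedReader(InputStreamReader(conn.getInputStream(), "UTF-8"))
    body = ""
    line = reader.readLine()
    while line is not None:
        body += line
        line = reader.readLine()
    reader.close()
    return JSONObject.parse(body)

# instance_arg is a placeholder for this instance's cron-task argument, e.g. "1:4"
offset, step = [int(x) for x in instance_arg.split(":")]
for i, asset_num in enumerate(asset_numbers):  # asset_numbers: this run's work list
    if i % step == offset:  # only process this instance's slice - no collisions
        snapshot = fetch_snapshot(asset_num)
        # ... copy fields from snapshot into the persistent attribute ...
```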

How do I know which NetSuite integration option to choose (SuiteTalk, Suitelet or RESTlet) for integrating NetSuite with our third-party application?

I am trying to integrate our third-party application with NetSuite. I want to be able to import sales invoice details generated from our third-party system (which exposes a REST API) into the NetSuite invoice form.
The frequency of import is not too crucial - an immediate import would be ideal, but sending data once a day is fine as well.
I want to know what I have to use to do this API integration - SuiteTalk, RESTlet or Suitelet.
I am completely new to this topic, and after a few days of research I learned that there are 3 options for an API integration with NetSuite (Suitelets, RESTlets, and SuiteTalk, which comprises REST- and SOAP-based web services). I also learned that there are scheduled scripts and user events, but I'm not too clear on those concepts.
I need some help identifying which integration option I should choose.
Any and all information about NetSuite API integration is appreciated!
I would avoid REST/SOAP. SOAP is outdated, and REST is incomplete and difficult to use.
Suitelets are for when you want to present your own custom UI to front-end users, like a special new kind of custom form not tied to any particular record. Probably not what you want.
What you probably want is to design a RESTlet. A RESTlet is a way for you to set up your own custom URL inside NetSuite that your program can talk to from outside NetSuite, like a web page. You can pass data in to the RESTlet either in the URL or in the body of an HTTP request (e.g. as a JSON object), and you can get data back out from the body of the HTTP response.
A RESTlet is a part of SuiteTalk. The method of authenticating a RESTlet is the same as the method of authenticating a request to the REST API, so learning about SuiteTalk is helpful. The code you use to write the RESTlet, SuiteScript, is the same kind of code used to write Suitelets and other kinds of scripts.
So you will want to learn about SuiteTalk, and then, in particular, SuiteTalk restlets.
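For a sense of what the external program's side of a RESTlet call looks like, here is a minimal, hypothetical Python sketch using token-based authentication (the account ID, script/deploy IDs, and credentials are placeholders; requests and requests-oauthlib are assumed to be installed):

```python
# Hypothetical Python client for a NetSuite RESTlet using token-based auth (TBA).
# Requires: pip install requests requests-oauthlib
import requests
from requests_oauthlib import OAuth1

ACCOUNT = "1234567"  # placeholder NetSuite account ID
url = ("https://%s.restlets.api.netsuite.com/app/site/hosting/restlet.nl"
       "?script=100&deploy=1" % ACCOUNT)  # script/deploy IDs are placeholders

# Consumer key/secret come from the integration record; token ID/secret
# from the access token - all placeholders here.
auth = OAuth1(
    client_key="CONSUMER_KEY",
    client_secret="CONSUMER_SECRET",
    resource_owner_key="TOKEN_ID",
    resource_owner_secret="TOKEN_SECRET",
    signature_method="HMAC-SHA256",
    realm=ACCOUNT,
)

# POST a JSON body; the RESTlet's post() handler receives it as an object.
resp = requests.post(url, json={"invoiceId": "INV-0001"}, auth=auth)
resp.raise_for_status()
print(resp.json())
```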
This is a really subjective issue.
It used to be that SOAP/SuiteTalk was a little easier in terms of infrastructure, and since NetSuite's offerings are ever-changing, REST/SuiteTalk might fill this space in the future.
Since NetSuite deprecated the Full Access role, setting up integrations almost always involves the integrator having to provide a permissions spec. The easiest way to do that is via a Bundle. For token-based authentication (TBA) there also needs to be an integration record, from which you need the Consumer ID and Secret tokens.
So as of this writing, the setup for SOAP/SuiteTalk and RESTlets is roughly the same. The easiest way to communicate these is with a bundle, so if you are a NetSuite dev with a dev account, you can set these up in a bundle and have your customer import it.
So equal so far, but here are the differences:
SOAP/SuiteTalk is slow. IMO it is not suitable for an interactive interface.
With SOAP/SuiteTalk, the code is all in your external app, so changes to the code don't require any changes in the target account.
RESTlets can be pretty speedy. I've used these for client interactions.
RESTlet updates require re-loading your bundle or overwriting your bundle files in the target account (with the resulting havoc if an admin refreshes the bundle).
RESTlets give you access to the features of the account they run on, so your code can run the appropriate chunks. For instance, features such as matrix items, multi-location inventory, OneWorld, pick/pack/ship, volume pricing, and multi-currency all change the data model of the account your code is running against. RESTlets can detect which features are enabled; SOAP/SuiteTalk cannot.
So really, the only advantage I see for SOAP/SuiteTalk at this point is that code updates don't require access to the target account.
Who is making the changes? If it is your NetSuite developers, then your options are SUITELET or RESTLET.
If it's your third-party application team, they own the code and the process and do all their work sitting outside of NetSuite - your option is SUITETALK/SOAP. Of course, they need to know something about NetSuite, but your business analyst would be sufficient to support them. As of 2020.1+, there is also support for native REST APIs in addition to SOAP, in case you still want to use REST but not write your own RESTLETS.
As the above comments mention, SuiteTalk does perform a little slower than calling RESTLETS, so that may be one of the deciding factors.
You may consider SUITELETs for integration only if you want to bypass all authentication schemes, by setting the suitelet as public. Highly inadvisable though.
If the third-party application supports REST APIs, you could call them directly from within NetSuite - either from user events or from scheduled scripts.
You can also consider iPaaS platforms like Dell Boomi, Celigo, Jitterbit, etc. These are general-purpose integration platforms and make connecting one platform to another easy, with minimal coding. If your company is already invested in one of these iPaaS platforms for other enterprise applications, then the choice is that much simpler.

Use R as a RESTful service

Currently I have a Shiny web app that can do some calculations on a 3GB data.frame loaded in memory.
Now, instead of implementing this feature in the Shiny web app, I need to make it a RESTful service that pipelines its calculations to another app in JSON format, so that people can use it by sending an HTTP request to a URL like http://my-app.com/function
I'm trying OpenCPU right now, but I don't quite understand how I can load and keep the large data in memory, so that I can use the OpenCPU API to call functions in a package just to do calculations, rather than loading the large data from disk every time I send an HTTP request.
One workaround might be to use HBase as an in-memory database and use rhbase to load the data. But before I invest time in learning it, I want to know whether it is a reasonable choice for a 3GB data.frame, since the serialization and other overhead it adds might offset its speed benefit.
What would be a better way to implement this functionality? Solutions using packages other than OpenCPU are also welcome, and free ones are preferred.
You may want to look at Plumber. You decorate your R functions with special comments, and it makes them available via a REST API; the plumber file can also include code that loads your data once at startup.
You should put your data in a package and add that package to the preload list in the OpenCPU server config.

Connect Sproutcore App to MySQL Database

I'm trying to build my first SproutCore app and I'm struggling to connect it to a MySQL database or any data source other than fixtures. I can't seem to find ANY tutorial except this one from 2009, which is marked as deprecated: http://wiki.sproutcore.com/w/page/12413058/Todos%2007-Hooking%20Up%20to%20the%20Backend .
Do people usually not connect SproutCore apps to a database? If they do, how do they find out how? Or does the above-mentioned tutorial still work? A lot of the gem commands in the introduction already differ from the official SproutCore getting-started guide.
SproutCore apps, as client-side "in-browser" apps, cannot connect directly to MySQL or any other non-browser database. The application itself runs only within the user's browser (it's just HTML, CSS & JavaScript once built and deployed) and typically accesses any external data via XHR requests to an API or APIs. Therefore, you will need to create a service wrapper around your MySQL database in order for your client-side app to be able to load and update data.
There are two things worth mentioning. The first is that since the SproutCore app contains all of your user interface and a great deal of business logic, your API can be quite simple and only needs to return raw data (such as JSON). The second is that the client-server design, while more tedious to implement, is absolutely necessary in practice, because you can never trust the client-side code, which is in the hands of a possibly nefarious user. Therefore, your API should also act as the final gatekeeper, validating all requests from the client.
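To make the "service wrapper" idea concrete, here is a minimal, hypothetical sketch of such a wrapper (the table, fields, and endpoint are invented; sqlite3 stands in for MySQL so the sketch stays self-contained - with MySQL you would use a driver such as mysqlclient). It exposes a single read-only JSON endpoint that the client-side app could fetch via XHR:

```python
# Hypothetical JSON API wrapper over a SQL database (illustration only).
import json
import sqlite3
from http.server import BaseHTTPRequestHandler, HTTPServer

DB_PATH = "app.db"  # placeholder database file

class ApiHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path != "/todos":  # invented endpoint
            self.send_error(404)
            return
        conn = sqlite3.connect(DB_PATH)
        rows = conn.execute("SELECT id, title, done FROM todos").fetchall()
        conn.close()
        body = json.dumps(
            [{"id": r[0], "title": r[1], "done": bool(r[2])} for r in rows]
        ).encode("utf-8")
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("", 8000), ApiHandler).serve_forever()
```

A real wrapper would also handle POST/PUT for updates and, as noted above, validate every request, since the browser-side code can't be trusted.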
This tutorial I found helped me a lot. It's very brief and demonstrates how to implement a very simple login app: how to send POST requests (triggered by the login-button action) to the backend server, and how to asynchronously process the response inside the SproutCore app:
http://hawkins.io/2011/04/sproutcore_login_tutorial/

http push django comet

I want to make a Django server that refreshes content as it reaches the database. The idea is to first show the user the current contents of the database, and then, as new content arrives, have it appear above the previous content without reloading the page. In another part of the site, the current content should instead be replaced with the new content as it gets to the database.
EvServer seems the clearest choice to me, but I really don't know how to do this, or what would be the simplest and most efficient approach.
I think you should avoid HTTP Polling. Here's why:
The frequency of the setInterval, combined with the number of users on your web app, is going to lead to a big resource drain. If you go through slides 9 to 19 in this presentation, you'll see some quite dramatic figures for using Push (note: that example uses a hosted service, but hosting your own realtime server and using Push has similar benefits).
Between setInterval calls, the data displayed in your app is potentially out of date. Using a Push technology means that the instant new data is available, it can be pushed to and displayed in your app. You don't want users looking at an app and thinking they are seeing correct information when they are not.
You should take a look at the following StackOverflow questions:
Django / Comet (Push): Least of all evils?
Need help understanding Comet in Python (with Django)
For Python/Comet see:
Python Comet Server
The latest recommendation for Comet in Python?
I'd recommend you also start considering "WebSockets" as well as "Comet". Most Comet servers now prefer to use a WebSocket connection when possible.
If you'd prefer to avoid installing and managing your own Comet/WebSocket solution, then you could use a realtime hosted service, which will let you push data through them using a REST API; your clients can receive events by embedding a JavaScript library and writing a small amount of code to subscribe and receive the events.
The steps are quite straightforward:
Write a model to store data in the DB.
Write a view that will generate JSON-serialized data upon a POST request.
Write a template that contains JavaScript with setInterval() that will make AJAX requests to the view and render the received data. (I'd suggest using jQuery, as it's well documented and widespread.)
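As a minimal, hypothetical sketch of the first two steps (the model and view names are invented; this assumes a standard Django project):

```python
# models.py - a minimal model to store the content (step 1)
from django.db import models

class Entry(models.Model):
    text = models.TextField()
    created = models.DateTimeField(auto_now_add=True)

# views.py - returns JSON entries newer than the client's last-seen id (step 2)
# (in a real project, views.py would do: from .models import Entry)
import json
from django.http import HttpResponse
from django.views.decorators.csrf import csrf_exempt

@csrf_exempt  # simplification for this sketch; handle CSRF properly in practice
def latest_entries(request):
    if request.method != "POST":
        return HttpResponse(status=405)
    since = int(request.POST.get("since", 0))  # highest id the client has seen
    entries = Entry.objects.filter(id__gt=since).order_by("id")
    payload = [{"id": e.id, "text": e.text} for e in entries]
    return HttpResponse(json.dumps(payload), content_type="application/json")
```

The template's setInterval() callback would then POST the highest id it has received so far and prepend whatever entries come back.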

Does there exist any "RSS hosting" with an API for creating feeds?

I am creating a desktop app that will create some reports. I want to export these reports as RSS or Atom feeds. I can easily create the feeds with the Rome lib for Java, but I have no idea how to distribute them. I thought about embedding httpd into my app, but that's a bad idea, because the computer can be behind NAT or turned off.
I need some kind of "proxy" server where I can push my feeds, and from which clients will be able to pull the content.
I can probably write a server-side app for this, but first I'd like to find out whether some dedicated solution is available for problems like this.
I was also thinking about using some blogging platform and its API. What do you think about this approach?
One more thing I have to consider when choosing a platform is the ability to handle a lot of updates. Sometimes the desktop app will be shut down, but while it is running it generates quite a lot of updates.
Check out Google's FeedBurner.
EDIT
Here's a better link for their help / FAQ. You'll still need to use some service to generate your feed, but it won't have to handle a heavy load. FeedBurner will poll your feed every 30 minutes, and their servers will act as a proxy for your feed. As for how to publish the feed for FeedBurner to read, I would recommend writing a service to handle this, especially since you are getting the data for the feeds from a number of desktop applications; it will probably be easier to write a custom service that interfaces with them, stores your data in a DB, and publishes the feeds than it would be to try to modify a blogging service for this purpose.
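As a minimal, hypothetical sketch of such a service (the endpoint paths and JSON fields are invented for illustration; a real version would persist entries to a database and authenticate the desktop apps), the apps POST entries to it, and the /feed URL is what you would hand to FeedBurner:

```python
# Hypothetical feed-publishing service: desktop apps POST JSON entries,
# aggregators (e.g. FeedBurner) poll GET /feed for the RSS XML.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer
from xml.sax.saxutils import escape

entries = []  # in-memory store; a real service would use a database

class FeedHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        if self.path != "/entries":
            self.send_error(404)
            return
        length = int(self.headers.get("Content-Length", 0))
        entry = json.loads(self.rfile.read(length))  # expects {"title": ..., "link": ...}
        entries.insert(0, entry)  # newest first
        self.send_response(201)
        self.end_headers()

    def do_GET(self):
        if self.path != "/feed":
            self.send_error(404)
            return
        items = "".join(
            "<item><title>%s</title><link>%s</link></item>"
            % (escape(e["title"]), escape(e["link"]))
            for e in entries[:50]  # cap the feed size
        )
        rss = ('<?xml version="1.0"?><rss version="2.0"><channel>'
               "<title>Reports</title><link>http://example.com/</link>"
               "<description>Desktop app reports</description>%s"
               "</channel></rss>" % items).encode("utf-8")
        self.send_response(200)
        self.send_header("Content-Type", "application/rss+xml")
        self.end_headers()
        self.wfile.write(rss)

if __name__ == "__main__":
    HTTPServer(("", 8080), FeedHandler).serve_forever()
```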
I don't know why I didn't think of this when I first answered your question, but Yahoo has a service called Yahoo Pipes which you could use to generate feeds from various kinds of inputs. I'm not sure how well it would scale, but it might work for you.