How to use a Scripted REST API in ServiceNow to import data from an external tool - integration

I need to know how I can use a Scripted REST API to import data from an external tool into a ServiceNow table, and whether that counts as an inbound or outbound integration.
Thanks in advance.

Let's start with inbound vs outbound.
Inbound means traveling into a place. Example: inbound flights = flights coming to an airport.
Outbound means traveling away from a place. Example: outbound flights = flights leaving an airport.
In the examples above, replace "place" with ServiceNow and you have your answer.
To implement the import of data from an external tool into a ServiceNow table you'd use a REST integration inside an IntegrationHub Custom Spoke or Flow Action. Check out this Exercise: Create an IntegrationHub Spoke.
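If you'd rather have the external tool push records in directly, a Scripted REST API works too: you define a POST resource whose script inserts into the target table. A minimal sketch of such a resource script (the table u_external_data and its fields are assumptions; adjust to your own import table):

(function process(/*RESTAPIRequest*/ request, /*RESTAPIResponse*/ response) {
    var body = request.body.data; // parsed JSON payload sent by the external tool

    var gr = new GlideRecord('u_external_data'); // hypothetical custom table
    gr.initialize();
    gr.setValue('u_name', body.name);
    gr.setValue('u_value', body.value);
    var sysId = gr.insert();

    response.setStatus(201); // created
    return { sys_id: sysId }; // serialized as the response body
})(request, response);

The external tool would then POST its JSON to the endpoint URL shown on the Scripted REST resource record, authenticating with basic auth or OAuth. Since the data travels into ServiceNow, this is still an inbound integration.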

Related

How to test WebHooks without an on-premise external system?

I'm trying to teach myself about integrating systems via WebHooks.
In a free/hosted GIS system, I can create a WebHook that would, in theory, POST a JSON object to an external system.
The problem is, I don't have an external system that's available right now for receiving the POST.
I think I need some sort of publicly available sample server that would:
Receive the POST requests
Do something with the requests (i.e. create some sort of record)
...so that I could determine if the WebHook worked correctly or not.
How can I test my WebHooks without having an on-premise external system?
I've poked around websites like Postman Echo and AWS Lambda. But to my untrained eye, it seems like they're not quite designed for what I need.
You could use any of these options depending on your requirements:
You could use webhook modules in services like Integromat or Zapier to receive webhook data and then apply transformations.
You could deploy a script on Heroku and point the webhook at the URL generated there (see the sketch after this list).
You could also use services like RequestBin, webhook.site, etc. if you just want to receive webhook data.
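If you go the Heroku route, the receiving script can be very small. A minimal sketch in Node.js with Express (the /webhook path is an assumption; name it whatever your GIS system lets you configure):

const express = require('express');
const app = express();

app.use(express.json()); // parse incoming JSON bodies

// Endpoint the webhook will POST its JSON object to
app.post('/webhook', (req, res) => {
    console.log('Received webhook payload:', req.body); // "do something": log/record it
    res.status(200).json({ received: true });
});

app.listen(process.env.PORT || 3000); // Heroku injects PORT

Checking the Heroku logs for the console output tells you whether the webhook fired correctly.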
Regards

How to get data from two or more Mongo schemas in a single API call in FeathersJS?

I have configured a basic Feathers setup with two services, users and messages (with Mongoose), and all API endpoints are working. But how do I create an extra endpoint in a service?
I am confused about how I would develop a complete web app using only the default Feathers REST endpoints. For example, I need an API that returns data from two or more Mongo schemas in a single call. If I handle this outside of Feathers and use Express routing, how would I set up Feathers authentication for these newly created Express routes? Please advise me on the best solution for my situation.
As explained in the basics guide, Feathers services do not have to be tied to a single collection. In fact, a service does not even have to make database calls at all. You can implement the service interface and make calls to as many collections/models as you want. The chat application guide also shows how to associate data using hooks and further information for that can also be found in this FAQ.
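For example, a custom service that pulls from both of your collections in one call could be sketched like this (the service name and queries are assumptions):

// Registered like any other Feathers service, so authentication hooks apply
app.use('/dashboard', {
  async find(params) {
    // Call the existing services rather than the Mongoose models directly,
    // so their hooks still run
    const [users, messages] = await Promise.all([
      app.service('users').find({ query: { $limit: 10 } }),
      app.service('messages').find({ query: { $limit: 10 } })
    ]);
    // Assumes pagination is enabled (otherwise each result is a plain array)
    return { users: users.data, messages: messages.data };
  }
});

Because this is a regular Feathers service, you can attach the same authentication hooks to it as to your other services, which avoids the Express-routing problem you mentioned.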
You can try looking into GraphQL to handle such queries. You can use it to create a query that fetches data from multiple Mongo schemas. Here's a feathers-graphql tutorial that can guide you: https://medium.com/@mattchewone/graphql-with-feathersjs-4cc67e785bd

How to identify the requests received in Azure API Management

We have a production issue where an order is submitted twice. Currently we have an API for orders, which we expose to clients using API Management, and in it we have policies that map the customer-facing URL to the actual one.
Now, our actual API got 2 requests, so we thought the customer submitted twice, but they have confirmed that they did not, so there may be an issue with API Management firing the request twice.
How can I identify the requests received by API Management?
Is there any chance that API Management will fire a request twice?
Appreciate any pointers.
The only ways to fire a request twice in APIM would be by means of the retry policy or manually using send-request. Otherwise it should be a client calling your API two times. Each request in APIM gets its own unique id, accessible in policies as context.RequestId; this is the main way to track and identify them. But these ids are produced inside APIM itself, so they are useful only if you're tracking a call from APIM into the backend.
Your best option now is to try to identify requests by client IP, method, URI, and time frame. APIM allows you to grab logs for certain periods of time (better if kept short) in JSON or CSV with the data I mentioned above. To do that, look into the byRequest report (https://learn.microsoft.com/en-us/rest/api/apimanagement/reports#ReportByRequest), grab the JSON/CSV, and try to identify the calls of interest (a sketch of calling this endpoint follows).
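To give an idea of what pulling that report looks like, here is a rough sketch in Node.js 18+ (the api-version, the placeholders, and the token acquisition are assumptions; check the linked docs for the exact contract):

// Assumes a bearer token for the Azure management API,
// e.g. obtained via `az account get-access-token`
const token = process.env.AZURE_TOKEN;
const filter = encodeURIComponent(
  "timestamp ge datetime'2019-01-01T00:00:00' and timestamp le datetime'2019-01-01T01:00:00'"
);
const url = 'https://management.azure.com/subscriptions/<subscription-id>' +
  '/resourceGroups/<resource-group>/providers/Microsoft.ApiManagement' +
  '/service/<service-name>/reports/byRequest' +
  `?$filter=${filter}&api-version=2019-12-01`;

fetch(url, { headers: { Authorization: `Bearer ${token}` } })
  .then(res => res.json())
  // each entry carries fields such as requestId, ipAddress, method, url, timestamp
  .then(report => console.log(report.value));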
For the future, you could look into onboarding your service to Azure Monitor (https://learn.microsoft.com/en-us/azure/api-management/api-management-howto-use-azure-monitor) or Log Analytics; those provide an easier way to traverse logs.

How to set up Azure API Management for a multi-tenant API

I have a multi-tenant application, which exposes an API for our customers to use. I would like to expose it using Azure API Management, mostly to provide the developer portal to our customers, which I find very useful, and maybe to use some other features.
If I understand correctly, our customers will set up their own subscription keys for authentication, which the API Management proxy will validate.
Question: how can I link a user/subscription to the tenant of my application, to ensure that only data from that tenant is returned?
One direction I can see to explore is delegated sign-up, which I guess will help me link a subscription to a tenant. But then the question is still how to get the user id in my backend API.
Any direction to documentation or samples is very appreciated.
You could create separate groups in APIM to represent your tenants and then put users into those groups using delegation hookups. Within APIM policy expressions you can reference context.User.Groups to list the groups the calling user belongs to, and forward that information to the backend.
Alternatively, you could use the Note field to store the tenant name and access it as context.User.Note. Or, if you're willing to store the mapping on your side, just take the id from context.User.Id.
Any of the above could be passed as a header using the set-header policy, like:
<set-header name="userId" exists-action="override">
    <value>@(context.User.Id)</value>
</set-header>
All scenarios would require you to have delegation set up to fill in this information automatically for every new user created.
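On the backend side, picking up the tenant is then just a matter of reading that header. A minimal sketch (Express; the header name matches the policy above, and the mapping store is an assumption):

const express = require('express');
const app = express();

// Hypothetical mapping from APIM user id to your application's tenant
const userTenantMap = { /* '<apim-user-id>': '<tenant-id>' */ };

app.get('/api/data', (req, res) => {
  const userId = req.headers['userid']; // header names arrive lower-cased in Node
  const tenantId = userTenantMap[userId];
  if (!tenantId) return res.status(403).json({ error: 'unknown tenant' });
  // ...query only this tenant's data here...
  res.json({ tenantId, items: [] });
});

app.listen(3000);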

Implementing IoT PowerBI table schema

I'm currently implementing an IoT solution that has a bunch of sensors sending information in JSON format through a gateway.
I was reading about doing this on Azure but couldn't quite figure out how the JSON schema and Event Hubs work together to display the info in PowerBI.
Can I create a schema and upload it to PowerBI then connect it to my device?
There are multiple sides to this. To start with, IoT ingestion in Azure is done through Event Hubs, as you've mentioned. If your gateway is able to make a RESTful call to the Event Hubs entry point, Event Hubs will receive the data and store it temporarily for the specified retention period. Then Stream Analytics will consume the data from Event Hubs, enabling you to do further processing and divert the data to different outputs. In your case, you can set one of the outputs to be a PowerBI dashboard, which you can authorize with an organizational account (more on that later), and the output will automatically be tied to PowerBI.
The data schema part is interesting: the JSON itself defines the data table schema to be used on the PowerBI side, and it propagates from Event Hubs to Stream Analytics to PowerBI with the first JSON package sent. Once the schema is there it is fixed, and the rest of the data being streamed in should be in the same format.
If you don't have an organizational account at hand to use with PowerBI, you can register your domain under Azure Active Directory and use that account since it is considered within your org.
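If your gateway can run Node.js, the ingestion call itself is small. A rough sketch using the @azure/event-hubs package (the connection string, hub name, and payload fields are assumptions; remember that the first JSON package sent fixes the PowerBI table schema):

const { EventHubProducerClient } = require('@azure/event-hubs');

async function sendReading() {
  const producer = new EventHubProducerClient(
    process.env.EVENTHUB_CONNECTION_STRING, // from the Event Hubs namespace
    'sensor-readings'                       // hypothetical hub name
  );
  const batch = await producer.createBatch();
  // This JSON shape becomes the PowerBI table schema on first send
  batch.tryAdd({ body: { deviceId: 'sensor-01', temperature: 21.4, ts: new Date().toISOString() } });
  await producer.sendBatch(batch);
  await producer.close();
}

sendReading().catch(console.error);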
There may be a way of altering the schema afterwards using the PowerBI REST API. Kindly find the links below; I haven't tried it myself, though.
https://msdn.microsoft.com/en-us/library/mt203557.aspx
Stream Analytics with PowerBI
Hope this helps, let me know if you need further info.
One way to achieve this is to send your data to Azure Event Hubs, read it, and send it to PowerBI with Stream Analytics. Listing all the steps here would be too long, so I suggest that you take a look at a series of blog posts I wrote describing how I built a demo similar to what you're trying to achieve. That should give you enough info to get started.
http://guyb.ca/IoTAzureDemo