I'm trying to teach myself about integrating systems via WebHooks.
In a free/hosted GIS system, I can create a WebHook that would, in theory, POST a JSON object to an external system.
The problem is, I don't have an external system available right now for receiving the POST.
I think I need some sort of publicly available sample server that would:
Receive the POST requests
Do something with the requests (i.e. create some sort of record)
...so that I could determine if the WebHook worked correctly or not.
How can I test my WebHooks without having an on-premise external system?
I've poked around websites like Postman Echo and AWS Lambda, but to my untrained eye they don't seem to be quite designed for what I need.
You could use any of these options depending on your requirements:
You could use the webhook modules in services like Integromat or Zapier to receive webhook data and then apply transformations.
You could deploy a script on Heroku and use the URL generated there as the target for the webhook calls.
You could also use services like RequestBin, webhook.site, etc. if you just want to receive webhook data.
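If none of those fit, a public test receiver can also be a few lines of Python running on any host reachable from the internet (or exposed through a tunnelling tool such as ngrok). A minimal sketch, where the path, port and log file name are arbitrary choices:

    # Minimal test receiver, assuming the GIS WebHook POSTs JSON to
    # http://<your-host>:8000/webhook (path and port are arbitrary).
    from http.server import BaseHTTPRequestHandler, HTTPServer

    class WebhookHandler(BaseHTTPRequestHandler):
        def do_POST(self):
            length = int(self.headers.get("Content-Length", 0))
            body = self.rfile.read(length).decode("utf-8")
            print("Received webhook on", self.path, ":", body)
            # "Do something" with the request: append it to a file as a simple record.
            with open("webhook_log.jsonl", "a") as log:
                log.write(body + "\n")
            self.send_response(200)
            self.end_headers()
            self.wfile.write(b"ok")

    if __name__ == "__main__":
        HTTPServer(("0.0.0.0", 8000), WebhookHandler).serve_forever()

Fire the webhook from the GIS system, then check the console output or the webhook_log.jsonl file to confirm the POST arrived and carried the JSON you expected.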
We have been working on the integration between Azure DevOps Services and ServiceNow. Our goal is to send Change Requests from ServiceNow to Azure DevOps, where they would become Features or User Stories. Whenever there is an update in Azure DevOps, that update should be sent to ServiceNow, and vice versa.
The idea is to work with REST API.
From our investigation, we have found that it is possible to send updates to other applications through Web Hooks. We are still not sure whether this will suit our needs and whether we can work with it. The problem is that the web hooks only support the HTTP method POST, while ServiceNow requires PATCH to update on its side. Is this correct? Is there any way of creating web hooks that use the PATCH method?
Another way we could integrate is to create some software that sends the needed response. However, we cannot seem to find a way to automate this response. As I understand it, it would generate the response only when the script runs, not when a work item is updated. Is there any way to trigger the sending of a JSON file with all the information in the work item whenever the work item state is updated?
As a workaround, you can try to create a custom service hook. Here is the documentation you can refer to.
The Marketplace provides an extension (Azure DevOps Service Hooks DSL). This extension framework is designed to ease the development of your own REST web hook site to do this type of integration. It does this by providing an MVC WebAPI endpoint and a collection of helper methods, implemented as an extensible Domain Specific Language (DSL), for common processing steps and API operations, such as calling back to the TFS/VSTS server that called the endpoint or accessing SMTP services.
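To give an idea of the shape of such a custom hook, here is a rough Python sketch (not the extension itself): it accepts the service hook's POST and turns it into a PATCH against the ServiceNow Table API. The instance URL, credentials, sys_id lookup and payload paths are placeholders and assumptions to adapt to your own hook configuration.

    # Sketch of a relay endpoint: an Azure DevOps service hook POSTs here, and the
    # script forwards the change as a PATCH to the ServiceNow Table API.
    import base64
    import json
    import urllib.request
    from http.server import BaseHTTPRequestHandler, HTTPServer

    SNOW_INSTANCE = "https://example.service-now.com"          # placeholder instance
    SNOW_AUTH = base64.b64encode(b"user:password").decode()    # placeholder credentials

    def lookup_sys_id(work_item_id):
        # Placeholder: in practice you would store this mapping when the CR is created.
        return "abcdef1234567890"

    def patch_change_request(sys_id, fields):
        # PATCH the matching Change Request record in ServiceNow.
        req = urllib.request.Request(
            f"{SNOW_INSTANCE}/api/now/table/change_request/{sys_id}",
            data=json.dumps(fields).encode(),
            method="PATCH",
            headers={"Content-Type": "application/json",
                     "Authorization": "Basic " + SNOW_AUTH},
        )
        return urllib.request.urlopen(req).read()

    class HookHandler(BaseHTTPRequestHandler):
        def do_POST(self):
            body = self.rfile.read(int(self.headers.get("Content-Length", 0)))
            event = json.loads(body)
            # Assumed payload shape for a "work item updated" event; adjust as needed.
            new_state = event["resource"]["fields"]["System.State"]["newValue"]
            work_item_id = event["resource"]["workItemId"]
            patch_change_request(lookup_sys_id(work_item_id), {"state": new_state})
            self.send_response(200)
            self.end_headers()

    if __name__ == "__main__":
        HTTPServer(("0.0.0.0", 8080), HookHandler).serve_forever()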
"Is there any way to trigger the sending of a JSON file with all the information in the work item whenever the work item state is updated?"
I am not sure if it is possible to trigger that.
But there is a ServiceNow DevOps extension for the integration between Azure DevOps and ServiceNow. You may use that.
I'm trying to integrate SonarCloud (not SonarQube) with a Slack channel. I want the same behaviour in Slack that we have with the GitHub or Travis integrations: a push notification on a channel.
Slack offers an incoming webhook, but it's limited because it only accepts one input format:
    {
      "text": "message"
    }
On the other side, SonarCloud can send a POST message to a webhook, but there is no way to choose the format of the message because it is predefined. Does anyone have an idea about how to connect these two services?
I have thought about using an AWS Lambda as a bridge to adapt the message, but I'm looking for simpler ideas that don't require more infrastructure.
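For context, the kind of Lambda bridge I have in mind would only need a few lines, roughly like the sketch below; the Slack URL is a placeholder and the SonarCloud payload keys are assumptions to be checked against the real payload.

    # AWS Lambda sketch: receive SonarCloud's webhook POST and forward a plain-text
    # message to a Slack incoming webhook.
    import json
    import urllib.request

    SLACK_WEBHOOK_URL = "https://hooks.slack.com/services/XXX/YYY/ZZZ"  # placeholder

    def lambda_handler(event, context):
        # Assumes an API Gateway / function URL proxy integration passing the raw body.
        payload = json.loads(event["body"])
        project = payload.get("project", {}).get("name", "unknown project")
        status = payload.get("qualityGate", {}).get("status", "UNKNOWN")
        message = {"text": f"SonarCloud analysis for {project}: quality gate {status}"}
        req = urllib.request.Request(
            SLACK_WEBHOOK_URL,
            data=json.dumps(message).encode(),
            headers={"Content-Type": "application/json"},
        )
        urllib.request.urlopen(req)
        return {"statusCode": 200, "body": "forwarded"}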
I've used email notifications from SonarCloud and added a Zapier "Gmail-Slack" integration for emails with a specific filter. A bit hacky, but it works well.
A little late, but for people who might be looking for the answer on this one: I didn't integrate SonarCloud with Slack (yet), but I've had success integrating both CircleCI and SonarCloud with Geckoboard using Zapier, which is an online service. It can accept a webhook and then allows you to connect it to a different service (i.e. Geckoboard or Slack) by selecting and modifying values in that webhook before sending it on in the correct format. Quite easy to do as well; no programming and no servers to maintain. Hope this helps.
I have been doing some research about the IBM Bluemix Cloud Integration Service and found the following links:
ftp://public.dhe.ibm.com/cloud/bluemix/cloudintegration/Cloud_Integration_for_Bluemix_User_Guide.pdf
https://www.ng.bluemix.net/docs/services/CloudIntegration/index.html
From what I have read, I have not been able to understand whether it is able to run some kind of "protocol transformation" or if it just publishes a REST or SOAP API.
I mean, imagine for example that I have a full backend publishing everything as SOAP services, but for some reason my apps can only get information through REST APIs. Does the basic connector, or maybe the standard one, perform that kind of integration? Or do I need to put a third-party product (or maybe even DataPower) in place to do that transformation?
Using the Cloud Integration service you can also create a REST API that links to an existing on-premises API (either SOAP or REST). Please take a look here: Creating a REST API that links to an existing on-premises API. You can upload a file that defines the on-premises API (a WSDL or Swagger definition).
Please note that currently Cloud Integration cannot retrieve automatically that definition from your on-premises system. It has to be uploaded manually by the user.
I'm currently implementing an IoT solution that has a bunch of sensors sending information in JSON format through a gateway.
I was reading about doing this on Azure, but couldn't quite figure out how the JSON schema and Event Hubs work together to display the info in PowerBI.
Can I create a schema and upload it to PowerBI then connect it to my device?
There are multiple sides to this. To start with, IoT ingestion in Azure is done through Event Hubs, as you've mentioned. If your gateway is able to make a RESTful call to the Event Hubs endpoint, Event Hubs will receive the data and store it temporarily for the retention period specified. Stream Analytics then consumes the data from Event Hubs, lets you do further processing, and diverts the data to different outputs. In your case, you can set one of the outputs to be a PowerBI dashboard, which you authorize with an organizational account (more on that later), and the output will automatically be tied to PowerBI.

The data schema part is interesting: the JSON itself defines the data table schema used on the PowerBI side, and it propagates from Event Hubs to Stream Analytics to PowerBI with the first JSON package sent. Once the schema is there it is fixed, and the rest of the data being streamed in should be in the same format.
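To make the first hop concrete, the gateway's RESTful call is just an HTTPS POST of the JSON event to the Event Hub, authorized with a SAS token. A rough sketch, where the namespace, hub, key and the sample event fields are placeholders:

    # Sketch: send one JSON event to Event Hubs over its REST endpoint with a SAS token.
    import base64
    import hashlib
    import hmac
    import json
    import time
    import urllib.parse
    import urllib.request

    NAMESPACE = "mynamespace"   # placeholder Event Hubs namespace
    HUB = "sensors"             # placeholder event hub name
    KEY_NAME = "send"           # placeholder shared access policy name
    KEY = "your-policy-key"     # placeholder key

    def sas_token(uri, key_name, key, ttl_seconds=3600):
        # Standard SAS construction: sign "<encoded uri>\n<expiry>" with HMAC-SHA256.
        expiry = str(int(time.time()) + ttl_seconds)
        encoded_uri = urllib.parse.quote_plus(uri)
        to_sign = (encoded_uri + "\n" + expiry).encode()
        signature = base64.b64encode(hmac.new(key.encode(), to_sign, hashlib.sha256).digest())
        return ("SharedAccessSignature sr={}&sig={}&se={}&skn={}"
                .format(encoded_uri, urllib.parse.quote_plus(signature), expiry, key_name))

    def send_event(payload):
        uri = f"https://{NAMESPACE}.servicebus.windows.net/{HUB}"
        req = urllib.request.Request(
            uri + "/messages",
            data=json.dumps(payload).encode(),
            headers={"Authorization": sas_token(uri, KEY_NAME, KEY),
                     # content type the Event Hubs REST send call expects
                     "Content-Type": "application/atom+xml;type=entry;charset=utf-8"},
        )
        urllib.request.urlopen(req)

    # Keep every event in the same JSON shape, since the first message fixes the schema.
    send_event({"deviceId": "sensor-01", "temperature": 21.4, "timestamp": time.time()})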
If you don't have an organizational account at hand to use with PowerBI, you can register your domain under Azure Active Directory and use that account since it is considered within your org.
There may be a way of altering the schema afterwards using the PowerBI REST API. You can find the links below; I haven't tried it myself, though.
https://msdn.microsoft.com/en-us/library/mt203557.aspx
Stream Analytics with PowerBI
Hope this helps, let me know if you need further info.
One way to achieve this is to send your data to Azure Event Hubs, read it, and send it to PowerBI with Stream Analytics. Listing all the steps here would be too long. I suggest that you take a look at a series of blog posts I wrote describing how I built a demo similar to what you're trying to achieve. That should give you enough info to get started.
http://guyb.ca/IoTAzureDemo
I'm trying to develop a test framework for some ActionScript code we're developing (Flex 3.5). What's happening is this:
As part of a Web Analytics function we are calling a track method in a class, providing the relevant information as part of the call. This method is provided in a library (SWC), and we have no access to the code.
Ultimately the track method sends an outgoing http request to the tracking server. We can see this quite happily in HttpFox.
I was hoping to be able to capture this outgoing request and interrogate it in my test class, allowing us to a) run tests in a more standalone fashion, and b) programmatically determine that the correct information is being tracked.
No problem: just run this developer tool, which displays all requests leaving your machine.
http://www.charlesproxy.com/
Unless you're going to use a sniffing tool, which would probably be hard to use for programmatic evaluation, I would recommend using a proxy to channel your request. You could let the track method send the request to a PHP script on the proxy server, have it evaluate the request content, and then forward it to the actual tracking server. I suppose with a tracking system you won't need to worry about the response, so it shouldn't be too hard to implement.
You could run a web server on localhost (or anywhere, really) and just make sure the DNS entry the code is trying to reach points to the server you are running.
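As a sketch of that idea (shown in Python here, though any web stack would do): a tiny local server records each incoming request so the test code can assert on the tracked values afterwards. The port is arbitrary.

    # Fake tracking server: records every request so a test can inspect it.
    # Point the tracking hostname at 127.0.0.1 (hosts file / DNS) and start this first.
    import threading
    from http.server import BaseHTTPRequestHandler, HTTPServer
    from urllib.parse import urlparse, parse_qs

    captured = []  # each entry: (path, query parameters, body)

    class CaptureHandler(BaseHTTPRequestHandler):
        def _record(self):
            parsed = urlparse(self.path)
            length = int(self.headers.get("Content-Length", 0))
            body = self.rfile.read(length).decode("utf-8", errors="replace")
            captured.append((parsed.path, parse_qs(parsed.query), body))
            self.send_response(200)
            self.end_headers()

        do_GET = _record
        do_POST = _record

    def start_server(port=8081):
        server = HTTPServer(("127.0.0.1", port), CaptureHandler)
        threading.Thread(target=server.serve_forever, daemon=True).start()
        return server

    # In a test: call start_server(), trigger the track() call, then assert on
    # `captured`, e.g. that the last request contains the expected parameter values.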