How to create a ServiceNow Change Request as a step in TFS 2018 Release - azure-pipelines-release-pipeline

I am trying to create a ServiceNow Change Request as one of the steps in my Release. I was trying to use an Agentless Phase step (Invoke REST API: POST).
I found an article online that suggested creating a Generic service endpoint for ServiceNow. I tried that, but the step failed; I'm sure I don't have it set up correctly.
2019-11-12T12:55:28.8833838Z POST https://xyzhelpdesk.service-now.com/api/now/table/change_request
Response Code: 0
Response: An error was encountered while processing request.
Exception: {"error":{"message":"Exception while reading request","detail":"Cannot decode: java.io.StringReader#90f857"},"status":"failure"}
Exception Message: The remote server returned an error: (400) Bad Request. (type WebException)
The endpoint has a username and password defined, but I think in the step's setup I may need more information in the Headers section.
I can create the CR via a PowerShell script, so I guess I could just use that, but I'm not sure which is the correct way to go.
Basically I want to create a ServiceNow CR as part of my deployment process. Then a gated step (a TFS plugin) will check the status of the CR, and when it's approved the process moves forward.
Does anyone have examples?
Thanks

Actually, there is a ServiceNow Change Management extension provided by Microsoft that almost fits your needs.
It includes:
A release gate to hold the pipeline until the change management process signals that a change request can be implemented. You can create a new change request for every deployment or use an existing change request.
An agentless task to update a change request during the deployment process. It is typically used as the last task in the stage.
However, this extension works only with Azure DevOps Services and Azure DevOps Server 2019 Update 1 onwards; it is not available for TFS 2018. You could consider upgrading your TFS to the latest Azure DevOps version.
With TFS 2018, I suggest using a PowerShell script to handle this, calling the ServiceNow and Azure DevOps REST APIs (a minimal sketch of the ServiceNow call follows the links below). You could also take a look at these great blog articles (the approach is similar for TFS):
Integrating VSTS Release Management with ServiceNow using Deployment Gate for Change Management
Implement an Azure DevOps Release Gate to ServiceNow
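For reference, here is a minimal sketch of the ServiceNow Table API call such a script would make, written in Python with the requests library as an assumption (the same call ports directly to PowerShell's Invoke-RestMethod, or to the Invoke REST API task's headers and body). The instance URL, credentials, and field values are placeholders. Note the explicit JSON Content-Type/Accept headers and JSON body; a missing or malformed body is a common cause of the 400 "Cannot decode" error shown above.

import requests

# Placeholders: replace with your instance, credentials, and change request fields.
instance = "https://xyzhelpdesk.service-now.com"
auth = ("api_user", "api_password")

payload = {
    "short_description": "Deploy release to production",
    "description": "Change request created automatically by the release pipeline.",
}

response = requests.post(
    f"{instance}/api/now/table/change_request",
    auth=auth,
    headers={"Content-Type": "application/json", "Accept": "application/json"},
    json=payload,
)
response.raise_for_status()
result = response.json()["result"]
print(result["number"], result["sys_id"])  # the new CR number and sys_id

The returned number/sys_id can then be stored as a release variable so the gated step knows which CR to poll for approval.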

Related

Permission denied when running a scheduled Vertex Pipeline

I want to schedule a Vertex Pipeline and deploy it from my local machine for now.
I have defined my pipeline, which runs well when I deploy it once using create_run_from_job_spec on AIPlatformClient.
When I try to schedule it with create_schedule_from_job_spec, the Cloud Scheduler object is created correctly, with an HTTP endpoint pointing to a Cloud Function. But when the scheduler runs, it fails with a Permission denied error. I have used several service accounts with owner permissions on the project.
Do you know what could have gone wrong?
Since AIPlatformClient from Kubeflow Pipelines raises a deprecation warning, I would also like to use PipelineJob from google.cloud.aiplatform, but I can't see any direct way to schedule the pipeline execution.
I've spent about 3 hours banging my head on this too. In my case, what seemed to fix it was either:
disabling and re-enabling the Cloud Scheduler API. Why did I do this? There is supposed to be a service account called service-[project-number]@gcp-sa-cloudscheduler.iam.gserviceaccount.com. If it is missing, then re-enabling the API might fix it
for older projects there is an additional step: https://cloud.google.com/scheduler/docs/http-target-auth#add
Simpler explanations include not doing some of the following steps:
creating a service account for the scheduler job, granting it the Cloud Functions Invoker role during creation
using this service account (see create_schedule_from_job_spec below)
finding the (sneaky) Cloud Function that was created for you (it will be called something like 'templated_http_request-v1') and adding your service account as a Cloud Functions Invoker on it
# Deprecated Kubeflow Pipelines client referenced in the question.
from kfp.v2.google.client import AIPlatformClient

client = AIPlatformClient(project_id="<project-id>", region="<region>")

# pipeline_spec is the path to your compiled pipeline spec file.
response = client.create_schedule_from_job_spec(
    job_spec_path=pipeline_spec,
    schedule="*/15 * * * *",
    time_zone="Europe/London",
    parameter_values={},
    cloud_scheduler_service_account="<your-service-account>@<project_id>.iam.gserviceaccount.com",
)
If you are still stuck, it is also useful to run gcloud scheduler jobs describe <pipeline-name>, as it really helps to understand what the scheduler is doing. You'll see the Cloud Function URL and the POST payload (which is base64-encoded and contains the pipeline YAML), and you'll see that it is using OIDC / a service account for security. It is also useful to view the code of the 'templated_http_request-v1' Cloud Function (sneakily created!). I was able to invoke the Cloud Function from Postman using the payload obtained from the scheduler job.

Trigger for status update to send a PATCH request in Azure DevOps

We have been working on the integration between Azure DevOps Services and ServiceNow. Our goal is to send Change Requests from ServiceNow to Azure DevOps, where they would become Features or User Stories. Whenever there is an update on Azure DevOps, that update should be sent to ServiceNow, and vice versa.
The idea is to work with the REST APIs.
From our investigation, we have found that it is possible to send updates to other applications through webhooks. We are still not sure if this will suit our needs and whether we can work with it. The problem is that webhooks only support the HTTP POST method, while ServiceNow requires PATCH to update on its side. Is this correct, and is there any way to create webhooks that use the PATCH method?
Another way we could integrate is to create some software that sends the required request. However, we cannot seem to find a way to automate this; as I understand it, the request is only sent when the script runs, not when a work item is updated. Is there any way to trigger the sending of a JSON payload with all the information in the work item whenever the work item state is updated?
As a workaround, you can try to create a custom service hook. Here is the document you can refer to.
The Marketplace provides an extension (Azure DevOps Service Hooks DSL). This extension framework is designed to ease the development of your own REST web hook web site to do this type of integration. It does this by providing an MVC WebAPI endpoint and a collection of helper methods, implemented as an extensible Domain Specific Language (DSL), for common processing steps and API operations, such as calling back to the TFS/VSTS server that called the endpoint or accessing SMTP services.
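To illustrate the custom service hook idea, here is a minimal sketch, assuming Python with Flask and requests (any web framework, including the DSL extension above, would do the same job): an endpoint that receives the Azure DevOps "work item updated" POST and forwards a PATCH to the ServiceNow Table API. The custom field holding the ServiceNow sys_id and the state mapping are hypothetical and would need to match your own process.

from flask import Flask, request
import requests

app = Flask(__name__)

# Placeholders - adapt to your instance and credential handling.
SNOW_INSTANCE = "https://yourinstance.service-now.com"
SNOW_AUTH = ("api_user", "api_password")

@app.route("/workitem-updated", methods=["POST"])
def workitem_updated():
    event = request.get_json()
    # For a "workitem.updated" service hook, the current field values are under
    # resource -> revision -> fields.
    fields = event["resource"]["revision"]["fields"]
    state = fields.get("System.State")
    # Hypothetical custom field that stores the sys_id of the linked ServiceNow record.
    snow_sys_id = fields.get("Custom.ServiceNowSysId")

    if state and snow_sys_id:
        # The webhook arrives as POST; we translate it into the PATCH ServiceNow expects.
        requests.patch(
            f"{SNOW_INSTANCE}/api/now/table/change_request/{snow_sys_id}",
            auth=SNOW_AUTH,
            headers={"Accept": "application/json"},
            json={"state": state},  # map Azure DevOps states to ServiceNow values as needed
        )
    return ("", 204)

if __name__ == "__main__":
    app.run(port=8080)

Registering this endpoint as a Web Hooks subscription on the "work item updated" event gives you the trigger-on-update behaviour asked about above.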
Is there any way to trigger the sending of a json file with all information within the work item whenever the work item state is updated?
I am not sure if it is possible to trigger that.
But there is a ServiceNow DevOps extension for the integration between Azure DevOps and ServiceNow. You may use that.

How to test WebHooks without an on-premise external system?

I'm trying to teach myself about integrating systems via WebHooks.
In a free/hosted GIS system, I can create a WebHook that would, in theory, POST a JSON object to an external system.
The problem is, I don't have an external system that's available right now for receiving the POST.
I think I need some sort of publicly available sample server that would:
Receive the POST requests
Do something with the requests (i.e., create some sort of record)
...so that I could determine if the WebHook worked correctly or not.
How can I test my WebHooks without having an on-premise external system?
I've poked around websites like Postman Echo and Amazon Lambda. But to my untrained eye, it seems like they're not quite designed for what I need.
You could use any of these options depending on your requirements:
You could use the webhook modules in services like Integromat or Zapier to receive webhook data and then apply transformations.
You could deploy a script on Heroku and use the URL generated there as the target for the webhook calls (a minimal receiver sketch follows this list).
You could also use services like requestbin or webhook.site if you just want to receive and inspect webhook data.
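If you prefer to run the receiver yourself (the Heroku option above), a minimal sketch, assuming Python with Flask, could look like the following; it simply records every POST so you can confirm the webhook fired and inspect what it sent.

from flask import Flask, request, jsonify

app = Flask(__name__)
received = []  # in-memory record of every delivery - enough for testing

@app.route("/webhook", methods=["POST"])
def webhook():
    # "Do something with the request": store headers and body for later inspection.
    received.append({
        "headers": dict(request.headers),
        "body": request.get_json(silent=True),
    })
    return ("", 204)

@app.route("/webhook/log", methods=["GET"])
def log():
    # Browse here to see everything that has been posted so far.
    return jsonify(received)

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)

Point the GIS WebHook at the /webhook URL of wherever you host this, then check /webhook/log to verify it worked.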
Regards

Azure - API Management - Management API SAS token expiry

We have an API Management instance in Azure, and we have also enabled the Management API. There we generated a SAS token, which is used in the application. We have to change the SAS token every 30 days; if the token expires, it results in an application outage. Is there any way to get notified (in advance) via email or any other means about the token expiry? I did some research on this, but unfortunately could not find anything useful.
I don't believe there is built-in support for this.
You could instead have a logic app that takes in a SAS Token as input and schedules emails based on the expiry of the token.
Tip: apart from an email, you could build more interesting workflows, like sending actionable messages using the Office 365 Outlook or Microsoft Teams connectors (https://learn.microsoft.com/en-us/connectors/teams/#post-an-adaptive-card-to-a-teams-channel-and-wait-for-a-response), or even creating a work item in Azure DevOps or an issue on GitHub, depending on what you use.
Alternatively, you could create an Azure AD application with a client ID and client secret, give it permissions to do what you need done in APIM, and use its credentials to perform all the same operations via ARM.
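As an illustration of that alternative, here is a minimal sketch, assuming Python with the requests library: acquire a token with the Azure AD client credentials flow and call the APIM resource through ARM, so no SAS token (and no 30-day rotation) is involved. The tenant, subscription, resource group, service name, and api-version values are placeholders to adjust.

import requests

# Placeholders - fill in your tenant, app registration, and APIM details.
TENANT_ID = "<tenant-id>"
CLIENT_ID = "<client-id>"
CLIENT_SECRET = "<client-secret>"
SUBSCRIPTION_ID = "<subscription-id>"
RESOURCE_GROUP = "<resource-group>"
APIM_NAME = "<apim-service-name>"

# OAuth2 client credentials flow against Azure AD; the token is valid for ARM calls.
token_response = requests.post(
    f"https://login.microsoftonline.com/{TENANT_ID}/oauth2/v2.0/token",
    data={
        "grant_type": "client_credentials",
        "client_id": CLIENT_ID,
        "client_secret": CLIENT_SECRET,
        "scope": "https://management.azure.com/.default",
    },
)
access_token = token_response.json()["access_token"]

# The same kind of operation as the direct Management API, but through ARM - no SAS token.
apis = requests.get(
    f"https://management.azure.com/subscriptions/{SUBSCRIPTION_ID}"
    f"/resourceGroups/{RESOURCE_GROUP}/providers/Microsoft.ApiManagement"
    f"/service/{APIM_NAME}/apis?api-version=2021-08-01",
    headers={"Authorization": f"Bearer {access_token}"},
)
print(apis.json())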

Transform service types when using Bluemix Cloud Integration Service

I have been doing some research about the IBM Bluemix Cloud Integration Service and found the following links:
ftp://public.dhe.ibm.com/cloud/bluemix/cloudintegration/Cloud_Integration_for_Bluemix_User_Guide.pdf
https://www.ng.bluemix.net/docs/services/CloudIntegration/index.html
From what I have read, I have not been able to understand whether it is able to run some kind of "protocol transformation" or if it just publishes a REST or SOAP API.
I mean, imagine for example that I have a backend publishing everything as SOAP services, but for some reason my apps can only get information through REST APIs. Does the Basic connector, or maybe the Standard one, handle that kind of integration? Or do I need a third-party product (or maybe even DataPower) to do that transformation?
Using the Cloud Integration service you can also create a REST API that links to an existing on-premises API (both SOAP and REST). Please take a look here: Creating a REST API that links to an existing on-premises API. You can upload a file that defines the on-premises API (WSDL or Swagger definition).
Please note that currently Cloud Integration cannot retrieve automatically that definition from your on-premises system. It has to be uploaded manually by the user.