I'm trying to create a service that writes the input and output pipeline (XML data) every time one of the installed services is invoked. How should my service know when another service is invoked? Is there any built-in tool to capture the in/out pipeline and save it to a file or some other destination?
You can use one of webMethods' built-in services in the WmPublic package:
pub.flow:savePipelineToFile
Note that it's not recommended to use "savePipelineToFile" in production, for obvious performance/resource reasons. And of course, to load the pipeline from the file, use:
pub.flow:restorePipelineFromFile
The usual workflow for debugging in webMethods is:
Add the "savePipelineToFile" step as the first instruction in
your flow service.
Execute flow service or execute from some client that will trigger the flow service.
Disable the "savePipelineToFile" step in the flow service.
Add the "restorePipelineFromFile" step at the very top in your flow service.
Run webMethods Designer in Debug mode and step through. Once it
goes over the "restorePipelineFromFile" step, you should see the
entire content of the pipeline including the inputs as passed by the
client. If you continue stepping through your flow service step by
step then you should see the what the final output is.
The services "savePipelineToFile" and "restorePipelineFromFile" are very useful to debug your services. The "pipeline" file will be located at:
IntegrationServer\pipeline
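If you ever need to trigger the same dump from Java code (for example, inside a Java service), here is a minimal sketch, assuming the standard Integration Server Java API (com.wm.app.b2b.server.Service, com.wm.data.*) and the optional "fileName" input of savePipelineToFile; the class name and file name below are made up:

import com.wm.app.b2b.server.Service;
import com.wm.data.IData;
import com.wm.data.IDataCursor;
import com.wm.data.IDataUtil;

public final class PipelineDump {
    // Call with the current pipeline of the running service.
    public static void dump(IData pipeline) throws Exception {
        IDataCursor cursor = pipeline.getCursor();
        // Optional: override the default file name under IntegrationServer\pipeline.
        IDataUtil.put(cursor, "fileName", "myService.xml");
        cursor.destroy();
        Service.doInvoke("pub.flow", "savePipelineToFile", pipeline);
    }
}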
If your requirements dictate that you need to produce XML flat files, then use the following:
To serialize documents to XML (WmPublic package):
pub.xml:documentToXMLString
To write the XML string to a file (WmPublic package):
pub.file:stringToFile
Hope it helps!
EDIT:
By default, the SOAP headers are hidden. Most of the time you don't need the SOAP headers, but you can force webMethods to add them to the pipeline by doing the following:
1) Click on the web service descriptor.
2) In the properties panel, set "Pipeline Headers Enabled" to true.
The SOAP headers won't appear in your pipeline at design time; they only exist at runtime. If you want to see the content of the SOAP header, you will need to use the "savePipelineToFile" / "restorePipelineFromFile" technique that I described previously.
To make the SOAP header appear at design time, you will have to map to the SOAP header fields implicitly, as if they were already there.
I have been asked to implement a centralized monitoring and logging system using DataDog that will receive information from various services and applications, some running as Windows Services on virtual machines and some running inside a Kubernetes cluster. In order to implement the logging aspect so that DataDog can correctly ingest the logs, I'm using Serilog to do the logging.
My plan is currently to write the logs to the console in json format and have the DataDog agent installed on each server or k8s node capture and ship them to DataDog. This works, at least for the k8s node where I've implemented it so far. (I'm trying to avoid using the custom Serilog sink for DataDog as that's discouraged in the DataDog documentation).
My problem is that I cannot get the logs ingested correctly on the DataDog side. DataDog expects the JSON to contain a property called Message, but Serilog names this property RenderedMessage (if I use JsonFormatter(renderMessage: true)) or @m (if I use RenderedCompactJsonFormatter()).
How can I get my logs shipped to DataDog and ingested correctly on the DataDog end?
Answering my own question.
The DataDog logging page has a Configuration section. On that page, the "Preprocessing for JSON logs" section allows you to specify alternate property names for a few of the major log message properties. If you add @m to the Message attributes section and @l to the Status attributes section, you will correctly ingest JSON messages from the RenderedCompactJsonFormatter formatter. If you add RenderedMessage and Level respectively, you will correctly ingest messages from JsonFormatter(renderMessage: true). You can specify multiple attributes in each section, so you can support both formats simultaneously.
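For reference, the same event rendered by the two formatters looks roughly like this (timestamps shortened and extra fields such as @i and MessageTemplate omitted): JsonFormatter(renderMessage: true) produces {"Timestamp":"2020-01-01T12:00:00Z","Level":"Warning","RenderedMessage":"Processed 42 items"}, while RenderedCompactJsonFormatter produces {"@t":"2020-01-01T12:00:00Z","@m":"Processed 42 items","@l":"Warning"}. That is why @m/@l and RenderedMessage/Level are the attribute names to declare in the respective preprocessing sections.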
If you use the Console sink, you can just apply a corresponding colored theme to it.
I have a basic NiFi flow with one GenerateFlowFile processor and one LookupRecord processor with a RestLookupService.
I need to do a rest lookup and enrich the incoming flow file with rest response.
I am getting errors saying that the lookup service is unable to look up the coordinates with the value I am extracting from the incoming flow file.
GenerateFlowFile is configured with a simple JSON payload.
LookupRecord is configured to extract the key from the JSON and pass it to the RestLookupService. Also, a JsonReader and a JsonRecordSetWriter are configured to read the incoming flow file and to write the response back to the flow file.
The RestLookupService itself fails with a JsonParseException about an unexpected character '<'.
The RestLookupService is configured against my API running in the cloud, in which I am trying to use the KEY extracted from the incoming flow file.
The most interesting part is that when I configure the URL to point, for example, to mocky.io, everything works correctly, so the issue is tied to the API URL I am using (http://iotosk.ddns.net:3006/devices/${key}/getParsingData). I have also tried removing ${key}, using the exact URL, and using different URLs.
Of course, the API works fine over Postman/curl and everything else. I have checked the logs on the container that the API is running on, and there are no requests in the logs, which means NiFi is failing even before reaching the API. At least at the application level...
I am absolutely out of options, without any clue how to solve this, and with NiFi, Google is not helpful either.
Does anybody see any issue in the configuration, or can anyone point me in some direction as to what could cause this issue?
After some additional digging, the issue turned out to be connected with authentication logic that intercepted the request even before the API received it; that crippled the request and returned XML, as Bryan Bende suggested in the comment.
But definitely, better logging in NiFi would help to solve this kind of problem way faster...
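For anyone hitting the same symptom: a quick way to see what the endpoint really returns, independently of NiFi, is a plain HTTP probe. Below is a minimal sketch using the Java 11+ HttpClient (the device id 123 is a placeholder for a real key); a body starting with '<' means something answered with XML/HTML where JSON was expected, which is exactly what makes the record parsing throw a JsonParseException:

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class LookupProbe {
    public static void main(String[] args) throws Exception {
        HttpClient client = HttpClient.newHttpClient();
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://iotosk.ddns.net:3006/devices/123/getParsingData"))
                .GET()
                .build();
        HttpResponse<String> response = client.send(request, HttpResponse.BodyHandlers.ofString());
        // Status, media type and the first bytes of the body tell you who actually answered.
        System.out.println("Status: " + response.statusCode());
        System.out.println("Content-Type: " + response.headers().firstValue("Content-Type").orElse("n/a"));
        String body = response.body();
        System.out.println("Body starts with: " + body.substring(0, Math.min(80, body.length())));
    }
}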
I am trying to do API endpoint testing with JMeter. I am able to create test cases using the JMeter GUI, but I am facing issues while integrating JMeter with a Maven project. Please help me.
There is no easy way to display the details of the failed requests; the JMeter Maven Plugin is only capable of generating an HTML Reporting Dashboard, whose failure-reporting capabilities are limited to:
The error table, providing a summary of all errors and their proportion of the total requests
The Top 5 Errors by Sampler table, providing the top 5 errors for every sampler (excluding Transaction Controllers by default)
So the options are:
Modify the response message and add the information you need to it, e.g. using a JSR223 Listener (see the sketch after this list)
Use the Flexible File Writer listener to store request and response data, plus whatever else you need, in a separate file
Amend the FreeMarker templates in the report-template folder according to your needs and configure JMeter to use the modified template folder for the dashboard generation
Any combination of the above
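For the first option, here is a minimal JSR223 Listener sketch (Groovy engine; plain Java syntax is accepted there). It appends the failure details to the response message so they show up in the dashboard's error tables; truncating the body to 256 characters is an arbitrary choice to keep the report readable:

// JSR223 Listener; "prev" is JMeter's binding for the just-finished SampleResult.
if (!prev.isSuccessful()) {
    String body = prev.getResponseDataAsString();
    // Keep only the beginning of the body so the error table stays readable.
    String snippet = body.length() > 256 ? body.substring(0, 256) : body;
    prev.setResponseMessage(prev.getResponseMessage()
            + " | code=" + prev.getResponseCode()
            + " | body=" + snippet);
}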
I have a WCF backend service which is a SOAP/XML service and I need to expose it to my consumers.
Importing the WSDL is not a problem, but I don't like the naming/URL of the operations, which is based on the SoapAction in the WSDL.
Manually, I can change the display name, URL and HTTP verb to make it more 'RESTful' on the outside, but is there a way to automate this?
I'd like to add this to my ARM template somehow.
You can't change how WSDL import works, but the import is only a batch operation of sorts. If you generate an ARM template after you have imported the WSDL, you will be able to use it to recreate your API without the source WSDL, and you can change that template however you see fit.
This might be a weird question, but I am open to all suggestions.
The background is that I want to use a script to automatically deploy/remove Docker containers on Jelastic, but unfortunately this part is not well documented in the official Jelastic API documentation. Jelastic provided me with a piece of sample code demonstrating how to use bash to create a new environment with a new Docker container, but it is not enough; I still don't know how to create/remove a Docker container by looking at the sample code.
Since Jelastic is using a standard JSON API, I am wondering whether there is any tool that can automatically retrieve/detect the parameters I can use with the Jelastic JSON API.
If you were me, how would you get past this if there is no documentation to use as a reference?
I am keen to use Jelastic, but this issue has stopped me from onboarding. Many thanks.
J.
All the parameters that can be used with the Jelastic JSON API are specified on the http://docs.jelastic.com/api/ page.
To use the JSON API without documentation, I suggest you check out the Postman API tool (https://www.getpostman.com/). This application lets you see all the sent/received data and pass JSON values without any documentation or any additional actions.
Simplest scenario for beginners: go to the API docs, section Users > Authorization; using the Signin method, you should obtain the session value, which is necessary for almost all further actions. Then you need to obtain information about the environment, section Environment > Environment: first execute the GetEnvs method, then, using the application identifier of the environment obtained from the previous command, execute the GetEnvInfo method. As a result of the described scenario, you will get all the parameters and values that can be used with the Jelastic JSON API for a certain type of environment.
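To make the scenario concrete, here is a minimal sketch using the Java 11+ HttpClient. The host, login, password and environment name are placeholders, and the REST paths follow the https://app.<platform>/1.0/<service>/<resource>/rest/<method> pattern from the API docs, so verify them for your platform version:

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class JelasticApiProbe {
    private static final String BASE = "https://app.jelastic.example.com/1.0";  // placeholder host

    static String get(HttpClient client, String pathAndQuery) throws Exception {
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create(BASE + pathAndQuery))
                .GET()
                .build();
        return client.send(request, HttpResponse.BodyHandlers.ofString()).body();
    }

    public static void main(String[] args) throws Exception {
        HttpClient client = HttpClient.newHttpClient();
        // 1) Signin: the JSON response contains the "session" value.
        System.out.println(get(client, "/users/authentication/rest/signin"
                + "?login=user%40example.com&password=secret"));
        String session = "PASTE_SESSION_HERE";  // extract "session" from the response above
        // 2) GetEnvs: lists your environments; note each environment's name/appid.
        System.out.println(get(client, "/environment/control/rest/getenvs?session=" + session));
        // 3) GetEnvInfo: the full parameter set for one environment.
        System.out.println(get(client, "/environment/control/rest/getenvinfo?session=" + session
                + "&envName=my-env"));
    }
}

Postman shows the same request/response data interactively; this is just the scripted equivalent.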