I'm currently using GCE Container VMs (not GKE) to run Docker containers that write their JSON-formatted logs to the console. The log output is automatically collected and stored in Stackdriver.
Problem: Stackdriver displays the data field of the jsonPayload as text, not as JSON. It looks like the quotes of the fields within the payload are escaped, so the payload is not recognized as a JSON structure.
I used both logback-classic (as explained here) and slf4j/log4j (with a JSON pattern layout) to generate JSON output (which looks fine), but the output is not parsed correctly.
I assume I have to configure somewhere that the output is JSON-structured rather than plain text, but so far I haven't found such an option for a Container VM.
What does your logger output to stdout?
You shouldn't create a jsonPayload field yourself in your log output. That field gets automatically created when your logs get parsed and meet certain criteria.
Basically, write your log message into a message field of your JSON output and any additional data as additional fields. Stackdriver strips all special fields from your JSON payload; if nothing is left, your message ends up as textPayload, otherwise you get a jsonPayload with your message and the other fields.
Full documentation is here:
https://cloud.google.com/logging/docs/structured-logging
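For illustration, here is a minimal sketch of the kind of single-line JSON record this parsing expects on stdout. The severity and message fields are special fields from the structured-logging documentation; orderId is a hypothetical extra field that would survive into jsonPayload.

```java
// A minimal sketch of emitting one structured log record per line to stdout.
// "severity" and "message" are special fields recognized by Stackdriver;
// "orderId" is a hypothetical extra field that would end up in jsonPayload.
public class StructuredLog {

    // Escaping is omitted for brevity; real code should JSON-escape the values
    // (or, better, keep using logback/log4j with a JSON layout).
    static String logLine(String severity, String message, String orderId) {
        return String.format(
                "{\"severity\":\"%s\",\"message\":\"%s\",\"orderId\":\"%s\"}",
                severity, message, orderId);
    }

    public static void main(String[] args) {
        System.out.println(logLine("INFO", "order processed", "12345"));
    }
}
```

The key point is that each record is a single line of JSON whose message field carries the human-readable text.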
Related
I am using JMeter to run API tests.
I have a login API that returns a token in its response. I use a JSON Extractor to save the token as a variable, and then use ${token} in the headers of other requests.
However, when I run 40-50 threads, ${token} in some threads is empty, which causes a high error rate.
Is there any way to solve this, and why does it happen?
Thanks very much.
Try saving the full response from the login API; most probably your server gets overloaded, cannot return the token, and returns an error message instead.
There are the following options:
If you're running JMeter in command-line non-GUI mode, you can amend JMeter's Results File Configuration to store the results in XML form and include the response data. Add the following lines to the user.properties file:
jmeter.save.saveservice.output_format=xml
jmeter.save.saveservice.response_data=true
and the next time you run your test, the .jtl results file will contain the response bodies for all requests.
Another option is using a listener like the Simple Data Writer, configured like this:
and when you run the test, the responses.xml file will contain the response data.
Both the .jtl results file and responses.xml can be inspected using the View Results Tree listener.
More information: How to Save Response Data in JMeter
Using
NotificationHubClient hub = NotificationHubClient.CreateClientFromConnectionString(notificationHubConnection, notificationHubName, enableTestSend);
NotificationOutcome outcome = await hub.SendDirectNotificationAsync(fcmNotification, deviceUri);
I am finally able to send and receive notifications via FCM through the Azure hub in a Xamarin.Android app. However, the payload is not present in the received RemoteMessage, even though the sent fcmNotification JSON payload looks good and passes validation. I am basically looking at the RemoteMessage.Data property but not finding the expected payload array. Looking at the RemoteMessage structure, I haven't found the payload array anywhere else either.
I know that the Azure hub manipulates the notification by adding the necessary headers, such as the content type ("application/json"). Are there any other settings needed to enable a "data"-only payload?
Additional settings are not necessary, but the entire notification content has to have this type of structure:
"{ \"data\":{ \"A\": \"aaa\",\"B\": \"bbb\",\"C\": \"ccc\",\"D\": \"ddd\",\"E\": \"eee\",\"F\": \"fff\"}}"
The number of data elements is up to you. The element names can be anything, as can the associated content, except that special characters need backslash escaping. Both the element name and the content can be inserted using variables, building a traditional composite string.
What is especially important are the inserted spaces, exactly as shown. Also note that conventionally formatted JSON was not accepted because of the need for those spaces.
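As a sketch only (the keys "A" and "B" and their values are placeholders, and the string layout simply mirrors the example above), composing that payload string in code might look like:

```java
// Sketch: build the data-only notification body with the exact spacing shown
// in the example above. The keys "A"/"B" and their values are placeholders.
public class DataPayload {

    static String buildPayload(String a, String b) {
        // Note the space after "{" and after each key's colon, as in the example.
        return "{ \"data\":{ \"A\": \"" + a + "\",\"B\": \"" + b + "\"}}";
    }

    public static void main(String[] args) {
        System.out.println(buildPayload("aaa", "bbb"));
    }
}
```

The resulting string is what would be passed as the notification body (the fcmNotification argument in the question's SendDirectNotificationAsync call).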
I want to create a small automation between Jira and Azure. To do this, I fire an HTTP trigger from Jira, which sends all request properties to an Azure Logic App. In the "When a HTTP request is received" step in the Logic App I can see the JSON schema correctly, with all the data I need. In the next steps I want, for example, to add a user to an Azure AD group. And here the problem starts.
For example, I want to initialize a variable and set it to a value from the JSON. I choose the property from the dynamic content menu, but after the flow executes, the value is always null (even though in the first step's raw output I see the whole schema with data). I have tried many things (Parse JSON, Compose, various conversions), always without luck: the value is null or "".
Expected behavior: when I initialize a variable using a property from the dynamic content, I want it to hold the value from the input JSON.
Output from Jira
Output with the same JSON sent from Postman
Thanks for any help !
Flow example
Flow result
If you send the JSON with the application/json content type, you can just select the property from the dynamic content. If not, you have to parse it into JSON format with the Parse JSON action first.
As for the schema, you can generate it from your JSON data with "Use sample payload to generate schema": just paste a sample JSON payload.
Then you will be able to select the properties. However, if a property is not offered as dynamic content, you have to write the expression yourself. The format looks like body('Parse_JSON')['test1'], and if your JSON has array data you need to point at the index, e.g. body('Parse_JSON')['test2'][0]['test3'].
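For example, a hypothetical sample payload matching those expressions (test1/test2/test3 are placeholder names) would look like:

```json
{
  "test1": "value1",
  "test2": [
    { "test3": "value3" }
  ]
}
```

With this payload, body('Parse_JSON')['test1'] evaluates to "value1", and body('Parse_JSON')['test2'][0]['test3'] evaluates to "value3".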
Below is my test; you could give it a try.
I'm working on a web app that produces large JSON objects. I log these via console.log, and I'm now looking for a way to extract them and store them in a text file.
The actual problem is the size of the JSON objects.
My approach so far was to store the JSON in a local variable and then call JSON.stringify(temp0). But Firefox will not print the whole string.
(I am German and my English is not great, but:)
If the JSON comes from a remote source (or from a local service), i.e. it is not generated inside the script, then don't look at the Console tab; look at the Network tab instead. Select the request (by URL), and in its Response tab you get everything you need.
Note that this only works for GET/POST/PUT etc. requests.
I am trying to use Spring XD to stream some JSON metrics data into an Oracle database.
I am using this example from here: SpringXD Example
HTTP call being made: EarthquakeJsonExample
My shell command:
stream create earthData --definition "trigger|usgs| jdbc --columns='mag,place,time,updated,tz,url,felt,cdi,mni,alert,tsunami,status,sig,net,code,ids,souces,types,nst,dmin,rms,gap,magnitude_type' --driverClassName=driver --username=username --password --url=url --tableName=Test_Table" --deploy
I would like to capture just the properties portion of this JSON response in the given table columns. I got it to the point where it doesn't give me an error on the hashing, but it just deposits a bunch of nulls into the columns.
I think my problem is the parsing of the JSON itself, since the properties object actually sits inside the features array. Can Spring XD handle this for me out of the box, or will I need to write a custom processor?
Here is a look at what the database looks like after a successful command.
Any advice? I'm new to parsing JSON in this fashion, and I'm not really sure where to find more documentation or examples for Spring XD itself.
Here is a reference to the documentation: SpringXD Doc
The transformer in the JDBC sink expects a simple document that can be converted to a map of keys/values. You would need to add a transformer upstream, perhaps in your usgs processor or even in a separate processor. You could use a #jsonPath expression to extract the properties key and make it the payload.
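For example (a sketch only; it assumes the standard USGS GeoJSON layout with a top-level features array and keeps the rest of your original stream definition), a transform processor between usgs and jdbc could pull out the properties of the first feature like this:

```
stream create earthData --definition "trigger | usgs | transform --expression='#jsonPath(payload, ''$.features[0].properties'')' | jdbc --columns='mag,place,time,...' ..." --deploy
```

Note the doubled single quotes around the JSON path, which is how quotes are escaped inside the stream DSL. If you need one row per feature rather than just the first, a splitter on $.features before the transform would be another option.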