I have recorded a script for a help desk ticket, and inside that ticket one field, "operation", is read-only and depends on other fields.
Now I want to fill in data in the "operation" field, but because of the read-only validation and the dependency on other fields, the script fails.
In addition, I already tried to pass values via the script (replacing the false value with a real value), but it failed. For example:
Search -> "prodid":false,
Replace all -> "prodid":123,
Kindly suggest how to pass values or IDs into the read-only selection field.
If you're talking about the HTML readonly attribute, JMeter doesn't care about it; JMeter acts on the HTTP protocol level.
If you can follow your test steps using a real browser without issues but cannot using JMeter, most probably your attempt to replay the script fails due to missing or improperly implemented correlation. Start by adding an HTTP Cookie Manager to your test plan and inspecting the request details for any dynamic values. If they are there, they are a subject for correlation: you will need to extract the values from the previous response using a suitable JMeter Post-Processor and replace the recorded hard-coded values with the JMeter variables produced by the Post-Processor.
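For example, assuming the server returns the new value in a JSON response and the recorded request sends a JSON body (the $.prodid path is illustrative):

JSON Extractor (added as a child of the request that returns the value):

Names of created variables: prodid
JSON Path expressions: $.prodid
Match No.: 1

Then in the recorded request body replace the hard-coded value:

"prodid":123,

with the variable reference:

"prodid":${prodid},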
Well, I will explain my case in full.
I'm trying to set up Azure alerts that send custom emails. To do so I need a Logic App that parses the info about the alerts.
The problem is that even though I enable the common alert schema and fill in the custom properties field (as you can see in the image), what the alert sends to my Logic App in the customProperties field is a null value, and I don't get why.
More than that, if I disable the common alert schema, the custom properties field is sent without problems.
I don't understand whether the common alert schema doesn't allow customProperties or whether I'm doing something wrong. I need help.
Thanks for reading, and please ask if anything in this post is badly explained.
I have just confirmed this issue with Microsoft support.
If I point an Activity Log Alert Rule to an Action Group Webhook with Common Schema enabled then the Custom Properties don't appear in the JSON payload. If I disable the Common Schema then the property does appear in the payload.
If I do the same for a Metric alert or Log Query alert, the Custom Properties do appear at the Webhook endpoint regardless of whether the Common Schema is enabled or not.
Microsoft pointed out that the schemas for each alert type are documented (there is no custom property on the activity log common schema) and that this is not a bug. Well... the alert rule form does allow configuring custom properties for each type of alert, so... ah well, never mind.
They also said "There are plans to align the behaviour on all alert types including activity logs, although there is no definite ETA though. For now, the best option for you to be able to customize the payloads of activity log alerts is by using logic app as an action group."
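For illustration, here is roughly what the webhook receives from an activity log alert with the common schema enabled (a trimmed sketch; the field values are invented, and note customProperties arriving as null):

{
  "schemaId": "azureMonitorCommonAlertSchema",
  "data": {
    "essentials": {
      "alertRule": "my-activity-log-alert",
      "monitoringService": "Activity Log - Administrative",
      "signalType": "Activity Log"
    },
    "alertContext": { ... },
    "customProperties": null
  }
}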
We are generating an OpenApi definition using Swagger/Swashbuckle. This definition is then imported into Azure API Management.
We have some query string parameters on GET requests that we have marked as required. Our validation ensures the query string parameters are present and valid; otherwise we return a 400 Bad Request with details of which parameters are invalid or missing. The relevant part of the OpenAPI definition is below: two query string parameters (marked as required) and one path parameter (marked as required).
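It looks roughly like this (a representative sketch; the real parameter names are different):

parameters:
  - name: customerId
    in: path
    required: true
    schema:
      type: string
  - name: fromDate
    in: query
    required: true
    schema:
      type: string
      format: date
  - name: toDate
    in: query
    required: true
    schema:
      type: string
      format: date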
My problem is the way the OpenAPI definition is converted into APIM operations.
The required query string parameters are added as template parameters and appended to the operation URL. This means that if they are not provided, APIM cannot match the request to an operation, and we return a 404 to the caller rather than the helpful 400 that the backend would return.
I can't easily add empty values into the query string. I can't do that in the inbound policy of the operation because the request doesn't match the operation. Doing it in the global inbound policy would mean I have to identify the operation myself (this is just one of many). Similarly, while I can return a 400 Bad Request in the on-error policy, I can't easily tell the caller what was wrong with the request.
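For reference, this is the kind of on-error handling I mean (a sketch only; the reason value is my assumption about how APIM reports an unmatched operation, and the generic body shows why I can't be specific about the failing parameter):

<on-error>
    <choose>
        <when condition="@(context.LastError.Reason == &quot;OperationNotFound&quot;)">
            <return-response>
                <set-status code="400" reason="Bad Request" />
                <set-header name="Content-Type" exists-action="override">
                    <value>application/json</value>
                </set-header>
                <set-body>{"error": "A required parameter is missing or invalid."}</set-body>
            </return-response>
        </when>
    </choose>
</on-error>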
I think it's built into the import process. When I changed the template parameters to query parameters in the portal and exported, the OpenAPI definition was practically identical. When I reimported the exported definition, the same thing happened. I also tried going via WADL, which looked more promising, but I couldn't reimport that definition.
Is there any way to turn these template parameters back into query string parameters? Any other options?
At the moment there is a bug in the Azure APIM API import, open since 2018. Link.
Its status is 'under review'. We tried to raise this directly with Microsoft, but no solution was provided from their side.
I need to store the values of devices' attributes with the right type in OrionCB's MongoDB.
As I was unable to achieve that, I dived into the code and found that IoTAgentUL (as well as IoTAgentJSON) uses OrionCB's API v1 instead of API v2.
As far as I can see, API v1's updateContext sends data to MongoDB without its type, so every measure is stored as text.
On the other hand, I found that API v2's update entity operation sends data to MongoDB with its type. That means I can store attribute values with their types, which benefits me when manipulating data (e.g. creating indexes, sorting, etc.).
My question is whether there is any workaround for this using the current implementations of the IoT Agents.
The only workaround I can imagine is, once the entities are automatically created by the IoT Agents, to update the types of such entities on your own. I mean, AFAIK, you can update both the value and the type of an entity's attribute.
In more detail, I can think of a script that subscribes to all entities of a certain type (those created by the agents). Then, when an entity is created, the script is notified and automatically updates the types of the entity's attributes, as sketched below.
Please observe that you only need to modify the attribute types once, just when the entities are created, not every time an entity's attribute is updated; thus, something like an array or cache of already-modified entities is needed in your script.
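A minimal sketch of that script (Python with Flask; the Orion endpoint, the attribute names, and the type mapping are assumptions, and the subscription pointing at /notify has to be created separately with POST /v2/subscriptions):

import requests
from flask import Flask, request

ORION = "http://localhost:1026"  # assumed Orion Context Broker endpoint
# Illustrative mapping of attribute name -> desired NGSIv2 type
FIXED_TYPES = {"temperature": "Number", "humidity": "Number"}
already_fixed = set()  # cache of entities whose attribute types were already rewritten

app = Flask(__name__)

@app.route("/notify", methods=["POST"])
def notify():
    # Orion POSTs NGSIv2 notifications here; each notification carries
    # the affected entities in the "data" array
    for entity in request.get_json()["data"]:
        if entity["id"] in already_fixed:
            continue  # types only need fixing once, when the entity is created
        for attr, new_type in FIXED_TYPES.items():
            if attr in entity:
                # PUT /v2/entities/{id}/attrs/{attr} replaces the attribute,
                # keeping its value but setting the desired type
                requests.put(
                    f"{ORION}/v2/entities/{entity['id']}/attrs/{attr}",
                    json={"value": float(entity[attr]["value"]), "type": new_type},
                )
        already_fixed.add(entity["id"])
    return "", 204

if __name__ == "__main__":
    app.run(port=5050)

Note that rewriting an attribute triggers another notification, but the cache of already-modified entities prevents the script from processing it again.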
I have created a JMeter script for load testing and added a CSV Data Set Config element to read the usernames and passwords of 300 users. I also followed the steps below:
Created the CSV file using Notepad and stored it in the same directory as the script.
Defined the variables and file path in the CSV Data Set Config element.
Used the variables as parameters of the HTTP request.
Increased the number of threads.
But it doesn't work as expected. How can I fix this?
Temporarily reduce the thread count to 1-2, add a View Results Tree listener, and inspect the request and response details.
If you don't see the proper username/password combination in the request body, your script is misconfigured. Check out the Using CSV DATA SET CONFIG guide for detailed setup instructions.
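For example, a minimal setup (file and variable names are illustrative):

users.csv (stored next to the .jmx file):

user1,password1
user2,password2

CSV Data Set Config:

Filename: users.csv
Variable Names: username,password
Recycle on EOF?: True
Stop thread on EOF?: False

HTTP Request parameters:

username = ${username}
password = ${password}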
If you see a valid username/password pair but the login is not successful, follow the checklist below:
Add an HTTP Cookie Manager to your test plan.
If point 1 doesn't help, check for dynamic request parameters. You'll probably need to extract some value from the first response (the GET request for the login page) with the Regular Expression Extractor, store it in a JMeter variable, and add it as a parameter to the request that performs the login.
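For example, if the login page contains a hidden CSRF token (the field name here is an assumption):

<input type="hidden" name="csrf_token" value="a1b2c3d4">

Regular Expression Extractor (added as a child of the GET request):

Name of created variable: csrfToken
Regular Expression: name="csrf_token" value="(.+?)"
Template: $1$
Match No.: 1

Then send ${csrfToken} as the csrf_token parameter of the login request.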
Here's my situation:
I want to do this:
I have a list of URLs in a MySQL database which I want to hit using an HTTP Request to see whether the response is an HTTP status code of 404 or not.
I have done this:
Added and configured a JDBC Config Element.
Added and configured a JDBC Request sampler. Basically a select statement that returns a table with 8 columns. I provided 8 comma-separated variable names in the 'Variable names' field so that the results of the JDBC request can be identified by these variable names.
Created an HTTP Request sampler that uses one of those variables, ${url}, in the 'Server Name or IP' field.
Though the JDBC request works flawlessly and returns a table with a bunch of rows, the problem is that the HTTP Request sampler never picks up the variable from the JDBC request result.
The HTTP Request looks like this in the 'View Results Tree':
GET http://${url}/
I have tried these solutions:
Added a 'Save Responses to a file' listener to the JDBC Request. This creates a file of type '.plain' and not a CSV. Had it been a CSV, I could have utilized it by creating a CSV Data Set Config. So this attempt failed.
I tried forcing the file name in the above attempt to always be 'C:\JMETERTest\data.csv', but it ends up creating a new file named 'C:\JMETERTest\data.csv1.plain'. This attempt failed too.
I tried referencing the URL column as ${url_1} in the HTTP Request's Server Name field. It worked, but now all the requests in the results tree go to the URL from the first row of the result set only. I see that this is because of the row number '_1' in ${url_1}. I could use this approach if someone can suggest a way to parameterize the '_1' into a variable that I can loop over (probably using a Counter element). I created a Counter config element with the reference name 'loopCounter' and used this in the Server Name field of the HTTP Request:
${url_("${loopCounter}")}
But now my HTTP Requests look lamer:
GET http://${url_("${loopCounter}")}/
This did not work either.
Solution 3 looks the most doable, if only I could resolve the parameterization of the row number.
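(Note: a nested reference like this needs JMeter's __V function. A sketch, assuming the Counter's reference name is loopCounter and the JDBC variables are url_1, url_2, and so on:

${__V(url_${loopCounter})}

This evaluates ${loopCounter} first, giving e.g. url_1, and then looks up that variable.)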
I am open to JMeter plugin suggestions too.
I will update with anything else that I try as we go on.
P.S. Please let me know if my question is unclear in any way.
Have you tried wrapping the HTTP sampler in a ForEach Controller (as its parent), where the input variable for the controller is the URL variable obtained from the JDBC sampler?
Also, the output variable of the ForEach is the variable you then use in the HTTP sampler.
That way it will iterate through each variable from the beginning of the index to the end and run the sampler once each time.
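A sketch of that layout (the output variable name is arbitrary):

ForEach Controller:
Input variable prefix: url
Output variable name: current_url
Use separator ("_" before number): checked
    HTTP Request:
    Server Name or IP: ${current_url}

With the JDBC sampler filling url_1, url_2, ... the controller runs the HTTP Request once per row, exposing each value as ${current_url}.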
In the 'Save Responses to a file' listener, tick the "Don't add number to prefix" and "Don't add content type suffix" checkboxes. Checking these two options ensures you get the exact file name.