Protractor - How can I hold an es-bdd-id in a json file, so I can use it later in the tests?

I'm trying to hold data object variables in a json file. Everything works fine with ids, but I can't get it to work for es-bdd-ids. The entry in the json file looks like this:
"businessNameInputField":"[es-bdd-id=\"rs-application-business-page-rs-business-form-es-panel-body-es-field-edit-input-businessName\"]"
And when I run the script I get this error:
WebDriverError: invalid argument: 'value' must be a string
How can I keep such variables in a json file? Any help appreciated.
Code:
businessName = getRandomString(9);
var businessNameInputField = element(by.css(webObjectVariables.businessNameInputField))
businessNameInputField.sendKeys(businessName)

bigquery error: "Could not parse '41.66666667' as INT64"

I am attempting to create a table using a .tsv file in BigQuery, but keep getting the following error:
"Failed to create table: Error while reading data, error message: Could not parse '41.66666667' as INT64 for field Team_Percentage (position 8) starting at location 14419658 with message 'Unable to parse'"
I am not sure what to do as I am completely new to this.
Here is a file with the first 100 lines of the full data:
https://wetransfer.com/downloads/25c18d56eb863bafcfdb5956a46449c920220502031838/f5ed2f
Here are the steps I am currently taking to create the table:
https://i.gyazo.com/07815cec446b5c0869d7c9323a7fdee4.mp4
Appreciate any help I can get!
As confirmed with the OP (@dan), the error is caused by selecting Auto detect when creating a table using a .tsv file as the source.
The fix is to manually create a schema and define the data type for each column properly. For more on using schemas in BQ, see the BigQuery schema documentation.
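As a sketch, a manually defined schema file for the bq CLI could look like the following. Only the column named in the error message (Team_Percentage) is known; the other column name and the modes are placeholders to be filled in from the actual .tsv:

```json
[
  {"name": "Team_Name", "type": "STRING", "mode": "NULLABLE"},
  {"name": "Team_Percentage", "type": "FLOAT", "mode": "NULLABLE"}
]
```

Declaring Team_Percentage as FLOAT (instead of the auto-detected INTEGER) lets values like 41.66666667 load without a parse error.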

Invalid property name: `errorType` on class `java.lang.String`. Validate that the correct setters is present

When I try to use a json-logger in Mule 4, I'm getting this error. I'm trying to log an error object, but it is not successful. Please find the error object below.
I sorted out the issue. The issue was that we cannot put JSON in the MESSAGE section of the json-logger. When I changed it to a string, it worked.
The MESSAGE section is meant to describe what you are going to log.
It looks like you are trying to use the function stringifyNonJSON() with a Mule error and treat it like a String. Without more details of the flow and the payload it is not possible to give more insight.
You could try to create a string from the payload manually first and use that as the parameter, since the function apparently cannot handle this case.

Error - cannot convert, not a json string: [type: INPUT_STREAM, value: java.io.BufferedInputStream@5f8890c2] in karate framework

In the karate framework, while executing one test case, I am getting this error:
java.lang.RuntimeException: cannot convert, not a json string: [type: INPUT_STREAM, value: java.io.BufferedInputStream@5f8890c2]
  at com.intuit.karate.Script.toJsonDoc(Script.java:619)
  at com.intuit.karate.Script.assign(Script.java:586)
  at com.intuit.karate.Script.assignJson(Script.java:543)
  at com.intuit.karate.StepDefs.castToJson(StepDefs.java:329)
  at ✽.* json vExpectedJSONObject = vExpectedJSONFileContent
Actually, in this framework we execute a sql query and the result of that query is stored in an abc.json file, but due to this error the result is not getting stored in that json file.
I have tried multiple options, like setting the file encoding to utf-8 and adding a plugin to pom.xml.
With json vExpectedJSONObject = vExpectedJSONFileContent, I am expecting the sql result to be stored in the json file.
Finally got the solution :). The issue was related to the framework setup: we call the Runtime.getRuntime().exec function to execute our sql query via a command at the cmd prompt, but due to some access privileges that command was not executing. After debugging, we put the mysql.exe file into the jre/bin folder and then it worked.

Converting evtx log to csv error

I am trying to convert an evtx log file to csv using Log Parser 2.2. I just want to copy all of the data into a csv.
LogParser "Select * INTO C:\Users\IBM_ADMIN\Desktop\sample.csv FROM C:\Users\IBM_ADMIN\Desktop\Event Logs\sample.evtx" -i:EVTX -o:csv
But I am getting the error below.
Error: Syntax Error: extra token(s) after query: 'Logs\sample.evtx'
Please assist in solving this error.
I know this has been a year but if you (or other people) still need it and for sake of reference, this is what I do:
LogParser "Select * INTO C:\Users\IBM_ADMIN\Desktop\sample.csv FROM 'C:\Users\IBM_ADMIN\Desktop\Event Logs\sample.evtx'" -i:evt -o:csv
Correct input type is evt, not evtx.
If there is space in the Event Logs folder, enclose with single quote.
The problem was due to the space in the folder name Event Logs. I changed the folder name to a single word and it worked.
You have to convert the .evtx file to .csv, then you can read from that .csv file, like this:
// Call PowerShell from Java code to export the .evtx file to .csv
// (backslashes must be escaped in a Java string literal)
String command = "powershell.exe Get-WinEvent -Path C:\\windows\\System32\\winevt\\Logs\\System.evtx | Export-Csv system.csv";
Process powerShellProcess = Runtime.getRuntime().exec(command);
// The exported data can then be read from system.csv
File csv = new File("system.csv");

MySQL JSON data type and JSON_ARRAY_APPEND does not work

I am struggling with JSON and MySQL. I am trying to append to a JSON object.
Object has this form:
[{},{},...]
SELECT JSON_ARRAY_APPEND(my_meta, '$[0]', '{"private_key": "adfadf"}')
FROM my_object where user_id='11111'
I get the below error using 5.7.8-rc
Error Code: 1305. FUNCTION JSON_ARRAY_APPEND does not exist
So how do I append, then, given that I am following the docs?
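For reference: MySQL 5.7.8 shipped this function under its original name, JSON_APPEND(); it was renamed to JSON_ARRAY_APPEND() in 5.7.9. So on 5.7.8-rc the equivalent query (table and column names taken from the question) would be:

```sql
SELECT JSON_APPEND(my_meta, '$[0]', '{"private_key": "adfadf"}')
FROM my_object WHERE user_id = '11111';
```

Note also that passing the new value as a quoted string appends a JSON string literal; to append an actual object, build it with JSON_OBJECT('private_key', 'adfadf') or wrap the string in CAST(... AS JSON). And the path '$' appends to the top-level array, while '$[0]' appends to its first element.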