Inno Setup parameter with quotes in [Run] section - configuration

I use the [Run] section to modify the merit value of some codecs with Commandmerit.exe, which supports command-line arguments.
The syntax is:
Commandmerit.exe "{E2B7DF46-38C5-11D5-91F6-00104BDB8FF9}" "0x800000"
{E2B7DF46-38C5-11D5-91F6-00104BDB8FF9} is the CLSID of the codec and
0x800000 is the new merit value. But when I put this line in the [Run] section:
Filename: "{app}\Commandmerit.exe"; Parameters: ""{F8FC6C1F-DE81-41A8-90FF-0316FDD439FD}" "0x10000000""; WorkingDir: "{app}"
The following error is displayed:
Mismatched or misplaced quotes on parameter.
If I put this line:
Filename: "{app}\Commandmerit.exe"; Parameters: """{F8FC6C1F-DE81-41A8-90FF-0316FDD439FD}" "0x10000000"""; WorkingDir: "{app}"
The following error is displayed:
Unknown constant ...... use two consecutive"{" if .....
If I put this line:
Filename: "{app}\Commandmerit.exe"; Parameters: """{{F8FC6C1F-DE81-41A8-90FF-0316FDD439FD}}" "0x10000000"""; WorkingDir: "{app}"
Then no error is displayed, but it seems that Commandmerit.exe doesn't understand the parameters, because after the installer finishes the merit is still unchanged.

To pass literal quotes inside a parameter, you must double up each quote, and then put quotes around the entire value.
Your second attempt was close, but you forgot to double the inner quotes.
Filename: "{app}\Commandmerit.exe"; Parameters: """{F8FC6C1F-DE81-41A8-90FF-0316FDD439FD}"" ""0x10000000"""; WorkingDir: "{app}"
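The doubling rules (quote doubling from this answer, plus the `{{` escape for the brace discussed in the second answer) can be sketched as a small Python helper. `inno_param` is hypothetical, not part of Inno Setup; it only illustrates the transformation:

```python
def inno_param(args):
    """Hypothetical helper: build an Inno Setup Parameters: value in which
    every argument appears quoted. Rule: quote each argument, double every
    literal quote, double every '{' (start of an Inno constant), then wrap
    the whole value in quotes."""
    inner = " ".join('"%s"' % a for a in args)  # quote each argument
    inner = inner.replace('"', '""')            # "" -> literal " in Inno Setup
    inner = inner.replace("{", "{{")            # {{ -> literal { in Inno Setup
    return '"%s"' % inner

value = inno_param(["{F8FC6C1F-DE81-41A8-90FF-0316FDD439FD}", "0x10000000"])
```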

I can see two different things in your problem.
First, { has a special meaning in Inno Setup, because it marks the start of a constant. You have to escape it by doubling it, i.e. {{. There is no need to escape the closing bracket, because } is treated as the end of a constant only when it follows a matching opening {.
Second, you're trying to pass " as part of the string, but that seems unnecessary here: the purpose of the " character in command-line parameters is to allow blank spaces inside a single parameter, and none of your parameters contain spaces.
All that said, try writing your command like this:
[Run]
Filename: "{app}\Commandmerit.exe"; Parameters: {{F8FC6C1F-DE81-41A8-90FF-0316FDD439FD} 0x10000000; WorkingDir: "{app}"


Inject Key Storage Password into Option's ValuesUrl

I'm trying to request a raw file from a Gitlab repository as a values JSON file for my job option. The only way so far that I managed to do it is by writing my secret token as plain text in the request URL:
https://mycompanygitlab.com/api/v4/projects/XXXX/repository/files/path%2Fto%2Fmy%2Ffile.json/raw?ref=main&private_token=MyV3ry53cr3Tt0k3n
I've tried using option cascading: I created a Secure password input option called gitlab_token, which points to a Key Storage password, and tried every possible notation (with or without .value, quoted or unquoted) in the valuesUrl field of the second option, instead of the plain token. But I keep receiving an error message pointing to an invalid character at the position of the dollar sign:
I've redacted sensitive info and edited the error print accordingly
I reproduced your issue. It works using a text option; you can use the value this way: ${option.mytoken.value}.
- defaultTab: nodes
  description: ''
  executionEnabled: true
  id: e4f114d5-b3af-44a5-936f-81d984797481
  loglevel: INFO
  name: ResultData
  nodeFilterEditable: false
  options:
  - name: mytoken
    value: deda66698444
  - name: apiendpoint
    valuesUrl: https://mocki.io/v1/xxxxxxxx-xxxx-xxxx-xxxx-${option.mytoken.value}
  plugins:
    ExecutionLifecycle: null
  scheduleEnabled: true
  sequence:
    commands:
    - exec: echo ${option.mytoken}
    - exec: echo ${option.apiendpoint}
    keepgoing: false
    strategy: node-first
  uuid: e4f114d5-b3af-44a5-936f-81d984797481
Another workaround (if you don't want to use a plain text option) could be to pass the secure option to an inline script and manage the logic from there.
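A sketch of that inline-script idea in Python: the script builds the request URL itself instead of interpolating the token into valuesUrl. `build_values_url` and the shortened path are hypothetical; only the token and host come from the question, and `urlencode()` also percent-encodes any special characters in the token:

```python
import urllib.parse

def build_values_url(base, token, ref="main"):
    # Append the ref and the secure token as percent-encoded query parameters
    return base + "?" + urllib.parse.urlencode({"ref": ref, "private_token": token})

url = build_values_url(
    "https://mycompanygitlab.com/api/v4/projects/XXXX/repository/files/raw",
    "MyV3ry53cr3Tt0k3n",
)
```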
Please open a new issue here.

Nifi: Delimiters in Nifi

I was working on a task and got an error that says "invalid char between encapsulated token and delimiter."
Here is a screenshot of the data.
For line 08 I was using the pipe as the delimiter, and the escape character was '^'. For line 12 I was using the comma as the delimiter. The highlighted part is the issue. If I remove the caret from line 08 and the single quote from line 12, it runs successfully.
The processor is ConvertRecord, and here is a screenshot of its configuration.
Actually, I am using two ConvertRecord processors. In one processor the field separator is a comma (,), whereas in the second processor the field separator is also a comma (,) but the escape character is the caret (^).
Assume that these are two different records.
Why is it throwing an error at that point, and how can I solve this issue?
Thanks in advance.
For the first sample data (line 08), configure CSVReader as:
Quote Character: "
Escape Character: \
Value Separator(delimiter): |
For the second sample data (line 12), configure CSVReader as:
Quote Character: "
Escape Character: \
Value Separator(delimiter): ,
The reason for the failure is that your data does not conform to delimited-data specifications, i.e. the data is invalid, so you need to add upstream cleanup logic.
For the line 08 data: you used ^ as the escape character, and the same character appears in the data itself. When CSVReader encountered ^" it escaped the ", so the opening double quote no longer had a corresponding closing double quote, which caused the exception. Setting the Escape Character: \ property resolves the issue; \ is a widely used escape character, so it is rare to find \ as part of the data.
For the line 12 data: it seems a single quote ' is used as a quote character and the corresponding closing ' is missing, which causes the exception. You need to devise logic that adds the missing closing quote wherever required. A workaround is to set Quote Character: " so that ' becomes part of the data, and then clean it up downstream, e.g. if you are loading the data into a table, update the column after ingestion to remove the '.
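The effect of taking ^ out of escape duty can be reproduced with Python's csv module: when the caret carries no special meaning, a field containing ^" is just data and the record round-trips cleanly (illustrative record, not the poster's actual data):

```python
import csv
import io

# A line-08-style record: pipe-delimited, one field contains a literal
# caret immediately before a quote.
row = ["08", 'size ^"large"', "ok"]

# Write it out: embedded quotes are doubled inside a quoted field, and
# '^' is treated as ordinary data because it is not the escape character.
buf = io.StringIO()
csv.writer(buf, delimiter="|", quotechar='"').writerow(row)
line = buf.getvalue()

# Read it back: the record parses without any escaping errors.
parsed = next(csv.reader(io.StringIO(line), delimiter="|", quotechar='"'))
```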

Snowflake how to escape all special characters in a string of an array of objects before we parse it as JSON?

We are loading data into Snowflake using a JavaScript procedure.
The script loops over an array of objects to load the data. These objects contain strings that may include special characters.
For example:
"Description": "This file contain "sensitive" information."
The double quotes around the word "sensitive" become:
"Description": "This file contain \"sensitive\" information."
which breaks the loading script.
The same issue happens when we use HTML tags within the description key:
"Description": "Please use <b>specific fonts</b> to update the file".
This is another example on the Snowflake community site.
Also, this post recommended setting FIELD_OPTIONALLY_ENCLOSED_BY to the special character, but I am handling a large data set that might contain all of the special characters.
How can we escape special characters automatically, without updating the script to loop over the whole array and anticipate and replace each special character?
EDIT
I tried using JSON_EXTRACT_PATH_TEXT:
select JSON_EXTRACT_PATH_TEXT(parse_json('{
"description": "Please use \"Custom\" fonts"
}'), 'description');
and got the following error:
Error parsing JSON: missing comma, line 2, pos 33.
I think the escape characters generated by the JS procedure are consumed when the string is passed to SQL functions:
'{"description": "Please use \"Custom\" fonts"}'
becomes
'{"description": "Please use "Custom" fonts"}'
Therefore parsing it as JSON (or fetching a field from it) fails. To avoid the error, the JavaScript procedure should generate double backslashes instead of single backslashes:
'{"description": "Please use \\"Custom\\" fonts"}'
I do not think there is a way to prevent this error without modifying the JavaScript procedure.
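The two-parser explanation above can be reproduced outside Snowflake. Below is a rough Python model; `sql_unescape` is a deliberate simplification of Snowflake's single-quoted string-literal rules (a backslash escapes the next character), not its actual implementation:

```python
import json

def sql_unescape(s):
    # Simplified model of a SQL string-literal parser: a backslash
    # escapes whatever character follows it.
    out, i = [], 0
    while i < len(s):
        if s[i] == "\\" and i + 1 < len(s):
            out.append(s[i + 1])
            i += 2
        else:
            out.append(s[i])
            i += 1
    return "".join(out)

single = r'{"description": "Please use \"Custom\" fonts"}'
double = r'{"description": "Please use \\"Custom\\" fonts"}'

# One backslash: the SQL layer consumes the escape, leaving bare inner
# quotes, so the JSON layer fails (the "missing comma" style error).
try:
    json.loads(sql_unescape(single))
    single_parses = True
except json.JSONDecodeError:
    single_parses = False

# Two backslashes: the SQL layer leaves \" behind for the JSON parser.
doc = json.loads(sql_unescape(double))
```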
I came across this today; Gokhan is right, you need the double backslashes to properly escape the quote.
Here are a couple links that explain it a little more:
https://community.snowflake.com/s/article/Escaping-new-line-character-in-JSON-to-avoid-data-loading-errors
https://community.snowflake.com/s/article/Unable-to-Insert-Data-Containing-Back-Slash-from-Stored-Procedure
In my case, I found I could address this by disabling the escaping and then manually replacing characters using the REPLACE function.
For your example the replace is not necessary:
select parse_json($${"description": "Please use \"Custom\" fonts"}$$);
select parse_json($${"description": "Please use \"Custom\" fonts"}$$):description;

How to type text into Vim's command-line without executing it?

My context:
I frequently take notes in Vim. I'd like a Vim function to type a standard header into the command line (specifically a timestamp, such as :sav 20180418_) without executing it; control would return with Vim still in command-line mode, so the user could append the rest of the filename and execute.
My fundamental difficulty: I cannot get a Vim function/macro to enter command-line mode, supply text to the command line, and then exit while staying in command-line mode without executing the supplied text.
Is this possible?
Thank you.
You can use the expand() function. For example, if you are currently editing the file 20180418_.txt, you can type:
:sav <c-r>=expand("%:r")<cr>
where <c-r>= should be typed as Ctrl+R followed by =, and <cr> is the Enter key. This will expand the text in the command line to:
:sav 20180418_
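Since the original goal was a date stamp rather than the current file's name, the same <c-r>= trick works with Vim's built-in strftime(). A sketch (the mapping key is arbitrary, pick your own):

```vim
" Hypothetical mapping: press <leader>n to get ':sav 20180418_' prefilled,
" with Vim left waiting in command-line mode for the rest of the filename.
" The <CR> only confirms the = expression; it does not execute :sav.
nnoremap <leader>n :sav <C-r>=strftime("%Y%m%d_")<CR>
```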

Ansible - passing JSON string in environment to shell module

I am trying to pass JSON string in environment.
- name: Start {{service_name}}
shell: "<<starting springboot jar>> --server.port={{service_port}}\""
environment:
- SPRING_APPLICATION_JSON: '{"test-host.1":"{{test_host_1}}","test-host.2":"{{test_host_2}}"}'
test_host_1 is 172.31.00.00
test_host_2 is 172.31.00.00
But in the Spring logs I get a JSON parse exception, which prints:
Caused by: com.fasterxml.jackson.core.JsonParseException: Unexpected character (''' (code 39)): was expecting double-quote to start field name
at [Source: {'test-host.1': '172.31.00.00', 'test-host.2': '172.31.00.00'}; line: 1, column: 3]
As seen, the double quotes are converted to single quotes!
I tried escaping the double quotes, but with no luck.
Any idea why this happens, or any workaround?
There is a quirk in the Ansible template engine: if a string looks like an object (i.e. starts with { or [), Ansible converts it into an object. See the code.
To prevent this, you may use one of the STRING_TYPE_FILTERS:
- SPRING_APPLICATION_JSON: "{{ {'test-host.1':test_host_1,'test-host.2':test_host_2} | to_json }}"
P.S. This is also why the space-character hack from techraf's answer works: Ansible misses the startswith("{") comparison and doesn't convert the string to an object.
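What the to_json filter hands to the environment is an ordinary JSON string with double quotes, which is exactly what Spring's parser expects. A rough Python model of the filter's output (the addresses are placeholders, not the originals):

```python
import json

test_host_1 = "172.31.0.10"  # placeholder value
test_host_2 = "172.31.0.11"  # placeholder value

# json.dumps mirrors what the to_json filter emits: a JSON-encoded string,
# so it survives as a string instead of being turned into a Python dict.
value = json.dumps({"test-host.1": test_host_1, "test-host.2": test_host_2})
```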
Quick hack: add a space to the variable definition (after the first single quote). A single leading space doesn't influence the actual variable value (the space will be ignored):
- name: Start {{service_name}}
shell: "<<starting springboot jar>> --server.port={{service_port}}\""
environment:
- SPRING_APPLICATION_JSON: ' {"test-host.1":"{{test_host_1}}","test-host.2":"{{test_host_2}}"}'
With the space, Ansible passes this to the shell (test1 and test2 are values I set):
SPRING_APPLICATION_JSON='"'"' {"test-host.1":"test1","test-host.2":"test2"}'"'"'
Without the space:
SPRING_APPLICATION_JSON='"'"'{'"'"'"'"'"'"'"'"'test-host.2'"'"'"'"'"'"'"'"': '"'"'"'"'"'"'"'"'test2'"'"'"'"'"'"'"'"', '"'"'"'"'"'"'"'"'test-host.1'"'"'"'"'"'"'"'"': '"'"'"'"'"'"'"'"'test1'"'"'"'"'"'"'"'"'}'"'"'
The order is reversed too. It seems that without the space Ansible interprets the JSON, while with the space it stays a string.
I don't really get why this happens...