Quoting a preprocessor string in Meson - double quotes

This is my meson build script:
project('conce', 'cpp', version: '1.0.0.0', default_options: 'cpp_std=c++11')
progname = meson.project_name()
progver = meson.project_version()
progdefs = ['-DUSE_MESON', '-DID=69', '-DVER=\"' + progver + '\"']
bin = executable(progname, 'main.cpp', cpp_args: progdefs)
run_target('run', command: bin)
I would like to define VER with the project version. This produces an error when compiling main.cpp: error: stray '\' in program. So my question is: how can I quote my string in Meson?

The simple answer is, you don't :-)
Meson automatically escapes any strings you pass to its APIs as needed, so you don't have to worry about escaping yourself. It makes sense that Meson does this for you, as there might be multiple levels that need escaping: for example, the backend that will actually build your targets, as well as any strings passed to a shell.
In other words, you can solve it by doing this:
progdefs = ['-DUSE_MESON', '-DID=69', '-DVER="' + progver + '"']
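For illustration: Meson handles whatever quoting the backend and shell require, so the compiler itself ends up seeing the quote characters. A hand-typed equivalent would look roughly like this (a sketch, assuming a GCC-style compiler; the exact invocation will differ):

# In a shell you must protect the inner quotes yourself, so that the
# compiler receives -DVER="1.0.0.0" and VER expands to a string literal:
g++ -DUSE_MESON -DID=69 -DVER='"1.0.0.0"' -c main.cpp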

Related

Python - iterate and replace (IPv4) with (IPv4 plus GeoIP)

I am looking for an elegant way to parse a text file (i.e. a log file containing source and destination IPs and lots of other data) keeping each line intact, and replacing all IPv4 addresses with the same IP followed by a comma and the GeoIP country code of that IP.
I have tried doing this in bash, sed, perl, and python. I tried a hundred perl one-liners and never quite got it, because a substitution like s/original/replacement/g doesn't want to execute a GeoIP lookup in the replacement field. For example:
perl -pe 's/([0-9]{1,3}\.[0-9]{1,3}\.[0-9]{1,3}\.[0-9]{1,3})/($1,system(geoiplookup $1))/g' < log.csv
results in:
"srcip=(110.110.110.110,system(geoiplookup 110.110.110.110))"
instead of executing geoiplookup.
I've tried this with backticks as well as exec, lots of different punctuation, with the same result.
In Python I tried some code that looks like:
rexp_ip = r"(\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3})"
repl = { rexp_ip: rexp_ip+".test" }
---
while line:
line = i.readline()
print(re.sub(rexp_ip, lambda m: str(repl.get(m.group())), line))
It seems pretty close but I'm not sure whether I'm on the right track here.
I would be open to bash, sed, awk, perl, python, or any other solution.
This seems fairly simple to me and I may be over-thinking it!
I am guessing I'm not the first person who has tried this and maybe I'm 'reinventing the wheel' here.
Any insight would be appreciated.
I may have solved my own problem using perl with the /e switch:
$ perl -lpe 's/([0-9]{1,3}\.[0-9]{1,3}\.[0-9]{1,3}\.[0-9]{1,3})/(`printf $1;geoiplookup $1`)/eg' < log.csv
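For comparison, here is a plain-bash sketch of the same idea (untested; it assumes geoiplookup's default one-line "GeoIP Country Edition: XX, Name" output format and a log.csv in the current directory):

#!/bin/bash
# For each line, find every distinct IPv4 address and append ",<country code>".
while IFS= read -r line; do
    for ip in $(grep -oE '[0-9]{1,3}\.[0-9]{1,3}\.[0-9]{1,3}\.[0-9]{1,3}' <<< "$line" | sort -u); do
        cc=$(geoiplookup "$ip" | awk -F': ' '{print $2}' | cut -d, -f1)
        line=${line//"$ip"/"$ip,$cc"}
    done
    printf '%s\n' "$line"
done < log.csv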

Compress heredoc declaration to one line in bash?

I have this which works to declare a JSON string in a bash script:
local my_var="foobar"
local json=`cat <<EOF
{"quicklock":"${my_var}"}
EOF`
The above heredoc works, but I can't seem to format it any other way; it literally has to look exactly like that, lol.
Is there any way to get the command to be on one line, something like this:
local json=`cat <<EOF{"quicklock":"${my_var}"}EOF`
That would be so much nicer, but it doesn't seem to take, obviously because that's not how EOF works, I guess lol.
I am looking for a shorthand way to declare JSON in a file that:
Does not require a ton of escape chars.
That allows for dynamic interpolation of variables.
Note: The actual JSON I want to use has multiple dynamic variables with many key/value pairs. Please extrapolate.
I'm not a JSON guy and don't really understand the "well-formed" arguments in the discussion above, but you can use a 'here-string' rather than a 'here-document', like this:
my_var="foobar"
json=`cat <<<{\"quicklock\":\"${my_var}\"}`
Why not use jq? It's pretty good at managing string interpolation, and it lints your structure.
$ echo '{}' >> foo.json
$ declare myvar="assigned-var"
$ jq --arg ql "$myvar" '.quicklock=$ql' foo.json
The text that comes out the other end of that call to jq can then be written to a file or whatever you want to do with it. The text would look something like this:
{"quicklock": "assigned-var"}
You can do this with printf:
local json="$(printf '{"quicklock":"%s"}' "$my_var")"
(Never mind that SO's syntax highlighting looks odd here. POSIX shell command substitution allows nesting one level of quotes.)
A note (thanks to Charles Duffy's comment on the question): I'm assuming $my_var is not controlled by user input. If it is, you'll need to be careful to ensure it is legal for a JSON string. I highly recommend barring non-ASCII characters, double quotes, and backslashes. If you have jq available, you can use it as Charles noted in the comments to ensure you have well-formed output.
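For reference, a sketch of that jq route (assuming jq is available): jq -n builds the document from scratch, so the output is guaranteed well-formed no matter what $my_var contains.

json=$(jq -cn --arg ql "$my_var" '{quicklock: $ql}')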
You can define your own helper function to address the situation with missing bash syntax:
function begin() { eval echo $(sed "${BASH_LINENO[0]}"'!d;s/.*begin \(.*\) end.*/\1/;s/"/\\\"/g' "${BASH_SOURCE[0]}"); }
Then you can use it as follows.
my_var="foobar"
json=$(begin { "quicklock" : "${my_var}" } end)
echo "$json"
This fragment displays the desired output:
{ "quicklock" : "foobar" }
This is just a proof of concept. You can define your syntax in any way you want (such as ending the input with a custom EOF string, or correctly escaping invalid characters). For example, since Bash allows function identifiers using characters other than alphanumeric characters, it is possible to define a syntax like this:
json=$(/ { "quicklock" : "${my_var}" } /)
Moreover, if you relax the first criterion (escape characters), ordinary assignment will nicely solve this problem:
json="{ \"quicklock\" : \"${my_var}\" }"
How about just using the shell's natural concatenation of strings? If you concatenate the variables rather than interpolating them, you can avoid escapes and get everything on one line:
my_var1="foobar"
my_var2="quux"
json='{"quicklock":"'${my_var1}'","slowlock":"'$my_var2'"}'
That said, this is a pretty crude scheme, and as others have pointed out you'll have problems if the variables, say, contain quote characters.
Since "no escape chars" is a strong requirement, here is a here-doc based solution:
#!/bin/bash
my_var='foobar'
read -r -d '' json << EOF
{
"quicklock": "$my_var"
}
EOF
echo "$json"
It will give you the expected output.
Just be careful: if you put the first EOF inside double quotes:
read -r -d '' json << "EOF"
$my_var would not be treated as a variable but as plain text, so you would get this output:
{
"quicklock": "$my_var"
}

Ubuntu Bash Script : error while executing function [duplicate]

This question already has answers here: How can I store a command in a variable in a shell script?
These work as advertised:
grep -ir 'hello world' .
grep -ir hello\ world .
These don't:
argumentString1="-ir 'hello world'"
argumentString2="-ir hello\\ world"
grep $argumentString1 .
grep $argumentString2 .
Despite 'hello world' being enclosed by quotes in the second example, grep interprets 'hello (and hello\) as one argument and world' (and world) as another, which means that, in this case, 'hello will be the search pattern and world' will be the search path.
Again, this only happens when the arguments are expanded from the argumentString variables. grep properly interprets 'hello world' (and hello\ world) as a single argument in the first example.
Can anyone explain why this is? Is there a proper way to expand a string variable that will preserve the syntax of each character such that it is correctly interpreted by shell commands?
Why
When the string is expanded, it is split into words, but it is not re-evaluated to find special characters such as quotes or dollar signs or ... This is the way the shell has 'always' behaved, since the Bourne shell back in 1978 or thereabouts.
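You can watch the splitting happen, and see that the quotes are not re-evaluated, with a quick demonstration (printf repeats its format for each remaining argument):

argumentString1="-ir 'hello world'"
printf '<%s> ' $argumentString1; echo
# prints: <-ir> <'hello> <world'>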
Fix
In bash, use an array to hold the arguments:
argumentArray=(-ir 'hello world')
grep "${argumentArray[#]}" .
Or, if brave/foolhardy, use eval:
argumentString="-ir 'hello world'"
eval "grep $argumentString ."
On the other hand, discretion is often the better part of valour, and working with eval is a place where discretion is better than bravery. If you are not completely in control of the string that is eval'd (if there's any user input in the command string that has not been rigorously validated), then you are opening yourself to potentially serious problems.
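To make the risk concrete, here is a harmless demonstration (the injected command is just an echo, but it could be anything):

argumentString="-ir 'hello world' . ; echo INJECTED"
eval "grep $argumentString"
# grep runs first, and then the injected echo INJECTED runs too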
Note that the sequence of expansions for Bash is described in Shell Expansions in the GNU Bash manual. Note in particular sections 3.5.3 Shell Parameter Expansion, 3.5.7 Word Splitting, and 3.5.9 Quote Removal.
When you put quote characters into variables, they just become plain literals (see http://mywiki.wooledge.org/BashFAQ/050; thanks @tripleee for pointing out this link).
Instead, try using an array to pass your arguments:
argumentString=(-ir 'hello world')
grep "${argumentString[#]}" .
In looking at this and related questions, I'm surprised that no one brought up using an explicit subshell. For bash, and other modern shells, you can execute a command line explicitly. In bash, it requires the -c option.
argumentString="-ir 'hello world'"
bash -c "grep $argumentString ."
This works exactly as the original questioner desired. There are two restrictions to this technique:
You can only use single quotes within the command or argument strings.
Only exported environment variables will be available to the command.
Also, this technique handles redirection and piping, and other shellisms work as well. You can also use bash internal commands, as well as any other command that works at the command line, because you are essentially asking a bash subshell to interpret the text directly as a command line. Here's a more complex example, a somewhat gratuitously complex ls -l variant. (Note the escaped \$ below: it defers expansion, so pwd and $prefix are evaluated in the subshell rather than when cmd is assigned.)
cmd="prefix=`pwd` && ls | xargs -n 1 echo \'In $prefix:\'"
bash -c "$cmd"
I have built command processors both this way and with parameter arrays. Generally, this way is much easier to write and debug, and it's trivial to echo the command you are executing. OTOH, param arrays work nicely when you really do have abstract arrays of parameters, as opposed to just wanting a simple command variant.

Run an R function with a JSON string parameter from the shell

I have a function that works with a JSON string.
When I try in R:
my_function('{"menu":{"id":"file","value":"File","popup":{"menuitem":[{"value":"New","onclick":"CreateNewDoc()"},{"value":"Open","onclick":"OpenDoc()"},{"value":"Close","onclick":"CloseDoc()"}]}}}')
it works well.
But when I try it from the shell:
R -e "source('./my_function.R'); my_function('{"menu":{"id":"file","value":"File","popup":{"menuitem":[{"value":"New","onclick":"CreateNewDoc()"},{"value":"Open","onclick":"OpenDoc()"},{"value":"Close","onclick":"CloseDoc()"}]}}}')"
It fails with the error:
unexpected character 'm'
It seems that the problem is with the quotes in the JSON string. How can I solve it?
P.S. I need to call my_function directly from the shell.
Thank you!
Write it in a main script, like main.r:
source('./my_function.R')
my_function('{"menu":{"id":"file","value":"File","popup":{"menuitem":[{"value":"New","onclick":"CreateNewDoc()"},{"value":"Open","onclick":"OpenDoc()"},{"value":"Close","onclick":"CloseDoc()"}]}}}')
Execute it from command terminal like,
Rscript main.r
Please make sure you have the R path configured.
You can't mix the quotes as you are doing. The shell is reading from your opening double quotes until the first double quotes it finds ( which is in your JSON string). It then sees an m (in menu) which it can't handle and gives the error message.
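One way around it, as a sketch (shown with a shortened JSON payload): keep the R string in single quotes and escape the JSON's double quotes from the shell, so they pass through literally:

# within the shell's double quotes, each \" becomes a literal ",
# so R sees a single-quoted string containing valid JSON
R -e "source('./my_function.R'); my_function('{\"menu\":{\"id\":\"file\"}}')"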

Is JSON safe to use as a command line argument or does it need to be sanitized first?

Is the following dangerous?
$ myscript '<somejsoncreatedfromuserdata>'
If so, what can I do to make it not dangerous?
I realize that this can depend on the shell, OS, utility used for making system calls (if being done inside a programming language), etc. However, I'd just like to know what kind of things I should watch out for.
Yes. That is dangerous.
JSON can include single quotes in string values (they do not need to be escaped); see the syntax diagrams at json.org.
Imagine the data is:
{"pwned": "you' & kill world;"}
Happy coding.
I would consider piping the data into the program in question (e.g. use "popen", or a version of "exec" that passes arguments directly); this avoids the issues that come from passing through the shell in the first place. Just as with SQL, using placeholders eliminates the need to trifle with "escaping".
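A minimal sketch of the piping approach (myscript and $user_json are hypothetical; it assumes the program reads JSON from stdin):

# the data never appears on a command line, so the shell never interprets it
printf '%s' "$user_json" | myscript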
If passing through a shell is the only way, then this may be an option (it is not tested, but something similar holds for a <script> context): for every character in the JSON that is either outside the ASCII range from space to ~, or that has a special meaning inside the shell's single quotes (such as \ and '), encode the character using JSON's \uXXXX form. Leave alone " and any character, such as a digit, that can appear outside of string data; that is the limitation of this trivial approach. Done this way, only potentially harmful characters appearing within the strings in the JSON are encoded, and the result contains no \\ pairs, no trailing \, and no ' characters.
It's ok. Just escape the character you use to wrap the string:
' should become '\''
So the JSON string
{"pwned": "you' & kill world;"}
becomes
{"pwned": "you'\'' & kill world;"}
and your final command, as the shell sees it, will be:
$ myscript '{"pwned": "you'\'' & kill world;"}'