I'm trying to put a tilde character in a variable that I'm going to use in a template in Ansible and for the life of me I cannot achieve what I want, as the tilde is being expanded in all sorts of weird ways.
What I want to achieve is to have some_var defined in my vars file so that I can use it in a template like so:
random_setting: "{{ some_var }}" and get this as a result: random_setting: ~, i.e. pure tilde, no quotes added.
Instead I keep getting this: random_setting: '~' (which is not acceptable for my use case) or this: random_setting: '' (which is just as bad).
My question is: how do I escape the tilde character so that I can use it without it being either surrounded by quotes or expanded in some obscure way? I've already tried a few tricks, including encoding the ~ character with base64 and using the | b64decode filter in Ansible, but nothing seems to work.
I think you might be confusing the real value with the output of Ansible.
If you run this:
---
- hosts: localhost
  connection: local
  vars:
    var1: "~"
  tasks:
    - template: src=tilde-template.j2 dest=result.txt
with tilde-template.j2:
{{ var1 }}
and check the contents of result.txt, it will contain just the tilde.
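If the template itself carries the key from the question and the Jinja expression is left unquoted (key name copied from the question, purely as an illustration), for example:

random_setting: {{ var1 }}

then result.txt reads random_setting: ~ with no quotes added. The quoted '~' from the question is most likely just how Ansible prints the value back as YAML (for example in debug output), not what ends up in the rendered file.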
This is my first post and I'm also very new to programming. Sorry if the terminology I use doesn't always make perfect sense. Feel free to correct any nonsense that would make your eyes bleed.
I am actually a network engineer, but with the current trend in my field I need to start coding and automating; I had postponed it until my company had a real use case. Well, that use case arrived and it is called ACI.
I've been learning how to automate many basic things with Ansible, and so far so good.
My current use case requires a playbook that will concatenate two CSV files with different columns into one single CSV file which will later be used to set variables in other plays.
We mainly work with CSV files containing system names, VLAN IDs and Leaf ports, something like this:
VPC_SYS_NAME, VLAN_ID, LEAF_PAIR
sys1, 3001, 101-102
sys2, 2500, 111-112
... , ..., ... ...
So far what I have tried is to take this data, read it with the read_csv module in ansible, and use the fields in each column as variables to loop in another play:
- name: read the csv file
  read_csv:
    path: list.csv
    delimiter: ','
  register: csv

- name: GET EPG UNI PATH FROM VLAN ID
  aci_rest:
    host: "{{ ansible_host }}"
    username: "{{ username }}"
    password: "{{ password }}"
    validate_certs: False
    method: get
    path: api/class/fvAEPg.json?query-target-filter=eq(fvAEPg.name,"{{item.VLAN_ID}}")
  loop: "{{ csv.list }}"
  register: register_as_variable
Once this play has finished, it will register the output into another variable, in this case, called register_as_variable.
I then parse this output with json_query and set it into a new variable:
- set_fact:
    fact1: "{{ register_as_variable | json_query('results[].imdata[].fvAEPg.attributes.dn') }}"
Lastly, I copy this output into another CSV file.
With the Ansible shell module, using cat and awk, I remove any unwanted characters and change the CSV file from a single row into a headerless column, getting something like this:
"uni/tn-tenant/ap-AP01/epg-3001",
"uni/tn-tenant/ap-AP01/epg-2500",
"uni/tn-tenant/ap-AP01/epg-...",
Up to this point, it works as I expect it (even if it is clearly not the cleanest way).
Where I am struggling at the moment is finding a way to merge/concatenate both the original CSV (with the system name, VLAN ID, etc.) and the newly created CSV (with the output "uni/tn-tenant/ap-AP01/epg-....") into one single "master" CSV file that would be used by other plays. The "master" CSV file should look something like this:
VPC_SYS_NAME, VLAN_ID, LEAF_PAIR, MO_PATH
sys1, 3001, 101-102, "uni/tn-tenant/ap-AP01/epg-3001",
sys2, 2500, 111-112, "uni/tn-tenant/ap-AP01/epg-2500",
... , ..., ... ..., "uni/tn-tenant/ap-AP01/epg-....",
Adding the MO_PATH header can be done with sed -i '1iMO_PATH' file.csv, but merging the columns of both files in a given order is what I'm unable to accomplish.
So far I have tried to use pandas and cat, but without success.
I would be extremely thankful if anyone could help me just a bit or guide me in the right direction.
Thanks!
Hello and welcome to StackOverflow! A former network engineer is here to help :)
The easiest way to merge two files line by line (if you are sure that their order is correct) is to use the paste utility.
I have the following files:
1.csv
VPC_SYS_NAME,VLAN_ID,LEAF_PAIR
sys1,3001,101-102
sys2,2500,111-112
2.csv
"uni/tn-tenant/ap-AP01/epg-3001",
"uni/tn-tenant/ap-AP01/epg-2500",
Then I came up with the following.
Adding a new header to the resulting file 3.csv:
echo "$(head -n 1 1.csv),MO_PATH" > 3.csv
Here we read the header of 1.csv, add the missing column, and redirect the output to 3.csv (overwriting it completely).
Merging the two files with the paste utility, while skipping the header of 1.csv:
tail -n+2 1.csv | paste -d"," - 2.csv >> 3.csv
Let's split this one:
tail -n+2 1.csv - reads 1.csv starting from the 2nd line and writes it to stdout
paste -d"," - 2.csv - merges the two files line by line, using , as the delimiter, while getting the contents of the first file from stdin (represented as -). We used the pipe symbol | to pass the stdout of the tail command to the stdin of the paste command
>> - appends the output to the already existing 3.csv
The result:
VPC_SYS_NAME,VLAN_ID,LEAF_PAIR,MO_PATH
sys1,3001,101-102,"uni/tn-tenant/ap-AP01/epg-3001",
sys2,2500,111-112,"uni/tn-tenant/ap-AP01/epg-2500",
And for the pipes to work, don't forget to use the shell module instead of command, since this question is tagged ansible.
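For reference, wrapped into Ansible tasks this could look roughly like the sketch below (task names are made up, file names as in the example above):

- name: add the MO_PATH header to the merged file
  shell: echo "$(head -n 1 1.csv),MO_PATH" > 3.csv

- name: merge the two csv files line by line
  shell: tail -n+2 1.csv | paste -d"," - 2.csv >> 3.csv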
I am facing trouble sending data from an Azure web API that contains a legitimate backslash (\). The data field is a user id which is of the following pattern:
Domain\UserId
I want to store it in the database as it is, but DotLiquid doesn't process it.
I tried using escape, escape_once and replace
{{ body.requestor | escape_once }}
{{ body.requestor | escape }}
{{ body.requestor | replace "\", "\\"}}
but none of them worked. I can't ask the caller of my web API to pass the user id with two backslashes - \\. I have to make a change in my web API to accept the user ids as they are.
Any inputs/pointers are appreciated.
Am I too late to the party? But here is the answer. First, Replace is case-sensitive. Second, you need to use a colon ":" for parameters. And third, I have no explanation, but I suspect that it takes the item to find with the escape applied and the item to replace with without it. Here is the program:
using System;
using System.Collections.Generic;
using DotLiquid;
using DotLiquid.NamingConventions;

// verbatim string: "" is a literal quote inside the template
string templateString =
@"Nothing: '{{ k3 }}'
Replace with dash: '{{ k3|Replace:""\\"", ""-"" }}'
Replace with double slash: '{{ k3|Replace:""\\"", ""\\"" }}'";
// C# naming convention, so the filter is addressed as Replace
Template.NamingConvention = new CSharpNamingConvention();
var t = Template.Parse(templateString);
string output = t.Render(Hash.FromDictionary(new Dictionary<string, object> { { "k3", "Domain\\user" } }));
Console.WriteLine(output);
Output:
Nothing: 'Domain\user'
Replace with dash: 'Domain-user'
Replace with double slash: 'Domain\\user'
This may be a bug in DotLiquid's implementation of the escape standard filter, or arguably a point where the Shopify specification is too vague, hence implementations differ. DotLiquid uses .NET regex replace, so a "\" in the pattern argument is interpreted as the beginning of an escape, per https://learn.microsoft.com/en-us/dotnet/standard/base-types/character-escapes-in-regular-expressions
So you need {{ body.requestor | replace: "\\", "\\" }}
(!)
The pattern to search for interprets the escape (so a single \ is matched), while the replacement string does not interpret the escape (so it's the actual double \\ string).
This question appeared when I was looking for the same thing, only for the Ruby gem liquid implementation. David Burg's answer gave me a hint, and this is what works:
{{ "yes \ no" | replace "\", "\\\" }}
This will replace the single backslash with a double backslash.
You nearly got it, the correct solution is:
{{ body.requestor | replace: "\", "\\\\" }}
One backslash for the first argument and four for the 2nd.
The first argument is literal.
The 2nd argument gets parsed, and because we want two backslashes, we need to escape each of them with another backslash, making a total of four backslashes.
I am using Salt with the Jinja2 "regex_search" filter and I am trying to extract some digits (the release version) from the archive file name, then use the value to create a symlink that contains it. I've tried different combinations using "list", "join" and other filters to get rid of this Unicode char, but without success.
Example:
"release_info" variable gets value "release-name-0.2345.577_20190101_1030.tar.gz" and I need to get only digits between the dots.
Here is the corresponding part of the sls file:
symlink to current release {{ release_info }}:
  file.symlink:
    - name: /home/{{ component.software['component_name'] }}/latest
    - target: /home/{{ component.software['component_name'] }}/{{ release_info |regex_search('(\d+\.\d+\.\d+)') }}
    - user: support
    - group: support
The expected result is "/home/support/0.2345.577", but I have "/home/support/(u'0.2345.577',)"
If I try to pipe it through the "yaml" or "json" filter like:
{{ release_info |regex_search('(\d+\.\d+\.\d+)') | yaml }}
I've got:
/home/support/[0.2345.577]
which is not what I am looking for.
PS: I've got it working, but it seems to me like not a good approach, just a workaround:
{{ release_info |regex_search('(\d+\.\d+\.\d+)') |yaml |replace('[','') |replace(']','') }}
Hello Todor and welcome to Stack Overflow!
I have tried the example that you have posted and here is how to achieve what you want.
Note: I have changed the regex pattern a little in order to support other possibilities that could have more digits, e.g. 0.1.2.3.4 and so on, but of course you can use your pattern as long as it works for you as expected.
Solution 1:
{{ release_info | regex_search("(\d(\.\d+){1,})") | first }}
The result before using first:
('0.2345.577', '.577')
The result after using first:
0.2345.577
Solution 2:
{{ release_info | regex_search("(\d\.\d+\.\d+)") | first }}
The result before using first:
('0.2345.577',)
The result after using first:
0.2345.577
first is a built-in filter in Jinja that returns the first item in a sequence. You can check the List of built-in filters for more information about the other filters.
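Put back into the sls from the question (paths and names copied from there), the state would then look roughly like this:

symlink to current release {{ release_info }}:
  file.symlink:
    - name: /home/{{ component.software['component_name'] }}/latest
    - target: /home/{{ component.software['component_name'] }}/{{ release_info | regex_search('(\d+\.\d+\.\d+)') | first }}
    - user: support
    - group: support

which should give the expected /home/support/0.2345.577 target.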
I am trying to pass a JSON string in the environment.
- name: Start {{service_name}}
  shell: "<<starting springboot jar>> --server.port={{service_port}}\""
  environment:
    - SPRING_APPLICATION_JSON: '{"test-host.1":"{{test_host_1}}","test-host.2":"{{test_host_2}}"}'
test_host_1 is 172.31.00.00
test_host_2 is 172.31.00.00
But in the Spring logs, I get a JSON parse exception which prints:
Caused by: com.fasterxml.jackson.core.JsonParseException: Unexpected character (''' (code 39)): was expecting double-quote to start field name
at [Source: {'test-host.1': '172.31.00.00', 'test-host.2': '172.31.00.00'}; line: 1, column: 3]
As seen, double quotes are converted to single quotes !!!
I tried escaping double quotes but with no luck.
Any idea why it happens, or any work around?
There is a quirk in the Ansible template engine.
If a string looks like an object (starts with { or [), Ansible converts it into an object. See the code.
To prevent this, you may use one of STRING_TYPE_FILTERS:
- SPRING_APPLICATION_JSON: "{{ {'test-host.1':test_host_1,'test-host.2':test_host_2} | to_json }}"
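Put back into the task from the question (the shell line is kept as the question's placeholder), that could look roughly like this:

- name: Start {{service_name}}
  shell: "<<starting springboot jar>> --server.port={{service_port}}\""
  environment:
    - SPRING_APPLICATION_JSON: "{{ {'test-host.1': test_host_1, 'test-host.2': test_host_2} | to_json }}"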
P.S. This is why the hack with the space character from techraf's answer works: Ansible misses the startswith("{") comparison and doesn't convert the string to an object.
Quick hack: add a space to the variable definition (after the first single quote) - a single space doesn't influence the actual variable value (space will be ignored):
- name: Start {{service_name}}
  shell: "<<starting springboot jar>> --server.port={{service_port}}\""
  environment:
    - SPRING_APPLICATION_JSON: ' {"test-host.1":"{{test_host_1}}","test-host.2":"{{test_host_2}}"}'
With the space Ansible passes to shell (test1, test2 are values I set):
SPRING_APPLICATION_JSON='"'"' {"test-host.1":"test1","test-host.2":"test2"}'"'"'
Without the space:
SPRING_APPLICATION_JSON='"'"'{'"'"'"'"'"'"'"'"'test-host.2'"'"'"'"'"'"'"'"': '"'"'"'"'"'"'"'"'test2'"'"'"'"'"'"'"'"', '"'"'"'"'"'"'"'"'test-host.1'"'"'"'"'"'"'"'"': '"'"'"'"'"'"'"'"'test1'"'"'"'"'"'"'"'"'}'"'"'
The order is reversed too. It seems that without the space it interprets the JSON, and with the space it treats it as a string.
I don't really get why it happens this way...
I want to change a glob such as c{at,lif} into a regex. What would it look like? I've tried using /c[at,lif]/ but that did not work.
For basic GREP operation see e.g. http://www.regular-expressions.info/refquick.html
From http://www.regular-expressions.info/alternation.html:
If you want to search for the literal text cat or dog, separate both options with a vertical bar or pipe symbol: cat|dog. If you want more options, simply expand the list: cat|dog|mouse|fish.
This suggests the following should work:
/c(at|lif)/
Obligatory What Was Wrong With Yours, Then:
/c[at,lif]/
The square brackets [..] are not used in GREP for grouping, but to define a character class. That is, here you create a custom class which allows one of the characters at,lif. Thus it matches ca or c, or cf -- but always only one character. Adding a repetition code c[at,lif]+ only appears to work because it will then match both cat and clif, but also cilt, calf, and c,a,t.