How to capture terminal screen output (with ANSI color) to an image file?

I tried the following command to capture the output of a command (grep as an example) with color, but the escape codes end up in the result literally, as ^[[01;31m^[[Ka^[[m^[[K.
grep --color=always a <<< a |
a2ps -=book -B -q --medium=A4dj --borders=no -o out1.ps &&
gs \
-sDEVICE=png16m \
-dNOPAUSE -dBATCH -dSAFER \
-dTextAlphaBits=4 -q \
-r300x300 \
-sOutputFile=out2.png out1.ps
Is there a way to capture the color in the image? Thanks.
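a2ps treats its input as plain text and does not interpret ANSI escape sequences, which is why they end up in the PostScript literally. As a hedged alternative (not from the original thread): convert the ANSI-colored output to HTML first and render that to an image, assuming the aha (ANSI HTML Adapter) and wkhtmltoimage tools are available:
grep --color=always a <<< a |
aha > out.html &&                  # translate ANSI color escapes into styled HTML
wkhtmltoimage out.html out.png     # render the HTML page to a PNG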

Related

Can you separate distinct JSON attributes into two files using jq?

I am following this tutorial from Vault about creating your own certificate authority. I'd like to split the response (change the output to an API call using cURL to see the response) into two distinct files: one containing the certificate and issuing_ca attributes, the other containing the private_key. The tutorial uses jq to parse JSON objects, but my unfamiliarity with jq isn't helping here, and most searches return info on how to merge JSON using jq.
I've tried running something like
vault write -format=json pki_int/issue/example-dot-com \
common_name="test.example.com" \
ttl="24h" \
format=pem \
jq -r '.data.certificate, .data.issuing_ca > test.cert.pem \
jq -r '.data.private_key' > test.key.pem
or
vault write -format=json pki_int/issue/example-dot-com \
common_name="test.example.com" \
ttl="24h" \
format=pem \
| jq -r '.data.certificate, .data.issuing_ca > test.cert.pem \
| jq -r '.data.private_key' > test.key.pem
but no dice.
This is not an issue with the jq invocation, but with the way the output files get written. With the usage you indicated, after the first jq writes test.cert.pem, the JSON on the read end of the pipe has been consumed and is no longer available to extract the private_key contents.
To duplicate the contents at the write end of the pipe, use tee along with process substitution. The following works in bash, zsh, or ksh93, but not in a POSIX Bourne shell (sh):
vault write -format=json pki_int/issue/example-dot-com \
common_name="test.example.com" \
ttl="24h" \
format=pem \
| tee >( jq -r '.data.certificate, .data.issuing_ca' > test.cert.pem) \
>(jq -r '.data.private_key' > test.key.pem) \
>/dev/null
See it in action:
jq -n '{data:{certificate: "foo", issuing_ca: "bar", private_key: "zoo"}}' \
| tee >( jq -r '.data.certificate, .data.issuing_ca' > test.cert.pem) \
>(jq -r '.data.private_key' > test.key.pem) \
>/dev/null
and then observe the contents of both files.
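Given the stub document above, you should find:
cat test.cert.pem   # contains foo and bar, one per line
cat test.key.pem    # contains zoo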
You could abuse the ability of jq (version 1.6 or later) to write to standard error separately from standard output.
vault write -format=json pki_int/issue/example-dot-com \
common_name="test.example.com" \
ttl="24h" \
format=pem \
| jq -r '.data as $f | ($f.private_key | stderr) | ($f.certificate, $f.issuing_ca)' > test.cert.pem 2> test.key.pem
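The same stub document as above gives a quick dry run. One hedge: depending on the jq version, stderr may write the string in JSON-quoted form and without a trailing newline, so inspect test.key.pem after trying it:
jq -n '{data:{certificate: "foo", issuing_ca: "bar", private_key: "zoo"}}' \
| jq -r '.data as $f | ($f.private_key | stderr) | ($f.certificate, $f.issuing_ca)' > test.cert.pem 2> test.key.pem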
There's a general technique for this type of problem that is worth mentioning because it has minimal prerequisites (just jq and awk), and because it scales well with the number of files. Furthermore, it is quite efficient in that only one invocation each of jq and awk is needed. The idea is to set up a pipeline of the form: jq ... | awk ...
There are many variants of the technique, but in the present case the following would suffice:
jq -rc '
.data
| "test.cert.pem",
"\t\(.certificate)",
"\t\(.issuing_ca)",
"test.key.pem",
"\t\(.private_key)"
' | awk -F\\t 'NF == 1 {fn=$1; next} {print substr($0, 2) > fn}'
Notice that this works even if the items of interest are strings with embedded tabs.
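As before, you can dry-run the whole pipeline with a stub document:
jq -n '{data:{certificate: "foo", issuing_ca: "bar", private_key: "zoo"}}' \
| jq -rc '
.data
| "test.cert.pem",
"\t\(.certificate)",
"\t\(.issuing_ca)",
"test.key.pem",
"\t\(.private_key)"
' | awk -F\\t 'NF == 1 {fn=$1; next} {print substr($0, 2) > fn}'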

CURL ERROR: parse error: Invalid numeric literal at line 1, column 8

I'm writing a bash script, and when I run it I get the error above ... I can't figure out what the problem is. I'm not good at scripting yet.
#!/bin/bash
data=LOGIN
password=123PASSWD
note_link=$(curl -s 'https://cryptgeon.nicco.io' \
-H 'X-Requested-With: XMLHttpRequest' \
--data-urlencode "data=$data" \
--data "has_manual_pass=false&duration_hours=0&dont_ask=false&data_type=T&notify_email=&notify_ref=" \
| jq -r --arg arg $password '.note_link + "#" + $arg')
echo "note URL is $note_link"
curl's -s option silences errors as well, but in this case you want to see the errors to understand what is going wrong, so use -sS instead.
Also, jq can only parse JSON. If the input is not JSON, it fails with exactly the error you got. So first capture the output, then try to parse it with jq, and display it if parsing fails:
#!/bin/bash
data=LOGIN
password=123PASSWD
curl_output=$(curl -sS 'https://cryptgeon.nicco.io' \
-H 'X-Requested-With: XMLHttpRequest' \
--data-urlencode "data=$data" \
--data "has_manual_pass=false&duration_hours=0&dont_ask=false&data_type=T&notify_email=&notify_ref=")
if note_link=$(jq -r --arg pass "$password" '.note_link + "#" + $pass' <<<"$curl_output" 2>/dev/null); then
echo "note URL is $note_link"
else
printf >&2 %s\\n "Could not parse the curl output:" "$curl_output"
fi
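For reference, the parse error in the title is jq's standard complaint when its input is not JSON at all, which is typically what a failing HTTP endpoint returns (hypothetical input below; the exact line and column in the message depend on the input):
echo '<html>Service unavailable</html>' | jq -r '.note_link'
# parse error: Invalid numeric literal ...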

How to insert a variable in an fswatch regex?

I'm trying to use a variable to identify mxf or mov file extensions. The following works when I explicitly name the file extensions in the regular expression.
${FSWATCH_PATH} -0 \
-e ".*" --include ".*\.[ mxf|mov ]" \
--event Updated --event Renamed --event MovedTo -l $LATENCY \
$LOCAL_WATCHFOLDER_PATH \
| while read -d "" event
do
<code here>
done
How can I use a variable for the file extensions, where the variable name is FileTriggerExtensions? The code below doesn't work:
FileTriggerExtensions=mov|mxf
${FSWATCH_PATH} -0 \
-e ".*" --include ".*\.[ $FileTriggerExtensions ]" \
--event Updated --event Renamed --event MovedTo -l $LATENCY \
$LOCAL_WATCHFOLDER_PATH \
| while read -d "" event
do
done
I guess you are using Bash or a similar shell? In
FileTriggerExtensions=mov|mxf
the unquoted | is the shell's pipe operator, so the shell runs the assignment FileTriggerExtensions=mov as one command and pipes it into a command named mxf, hence:
-bash: mxf: command not found
Use quotes or escape the pipe symbol.
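A minimal sketch of the corrected script. Two assumptions on my part: [ mov|mxf ] in your original is a bracket expression matching single characters, so an alternation group (mov|mxf) is more likely what you want, and that syntax needs fswatch's -E/--extended flag:
FileTriggerExtensions='mov|mxf'   # quoted, so | is literal
${FSWATCH_PATH} -0 -E \
-e ".*" --include ".*\.(${FileTriggerExtensions})$" \
--event Updated --event Renamed --event MovedTo -l $LATENCY \
$LOCAL_WATCHFOLDER_PATH \
| while read -d "" event
do
<code here>
done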

Sending HTML by cURL on Mailgun

I tried to send this by email. I tried a lot of ways and always obtained the same thing: an error.
http://goto-21.net/campaign/htmlversion?mkt_hm=0&AdministratorID=47507&CampaignID=58&StatisticID=62&MemberID=733807&s=994508d6292a660150ccc60c3f0310d4&isDemo=0
I tried with this:
curl -s --user 'api:key-3ax6xnjp29jd6fds4gc373sgvjxteol0' \
Xhttps://api.mailgun.net/v2/samples.mailgun.org/messages \
-F from='Excited User ' \
-F to='foo#example.com' \
-F cc='bar#example.com' \
-F bcc='baz#example.com' \
-F subject='Hello' \
-F text='Testing some Mailgun awesomness!' \
-F html=' CODE HERE ' \
And this:
curl -s --user 'api:key-3ax6xnjp29jd6fds4gc373sgvjxteol0' \
Xhttps://api.mailgun.net/v2/samples.mailgun.org/messages \
-F from='Excited User ' \
-F to='foo#example.com' \
-F cc='bar#example.com' \
-F bcc='baz#example.com' \
-F subject='Hello' \
-F text='Testing some Mailgun awesomness!' \
--form-string html=' CODE HERE ' \
But it doesn't work... always ''syntax error''.
Can anyone help me?
Thank you!
Use the following cURL command to send HTML emails using the command line.
Replace the uppercase parameter with your own and you're ready to send!
curl -s --user 'api:YOUR-API-KEY' \
https://api.mailgun.net/v2/YOURDOMAIN/messages \
-F from='YOU#YOURPROVIDER.COM' \
-F to=RECEIVER#PROVIDER.com \
-F subject='Hello World with HTML' \
-F html='<html><head><title>Hello</title></head><body>Hello <strong>World</strong></body></html>' \
-F text='Hello world'
What this does is send an email from an address attached to your domain to a receiver, with an HTML version and a plain-text version as a fallback!
Happy mailgunning!
The original request looks like this:
curl -s --user 'api:key-3ax6xnjp29jd6fds4gc373sgvjxteol0' \
https://api.mailgun.net/v3/samples.mailgun.org/messages \
-F from='Excited User <excited#samples.mailgun.org>' \
-F to='devs#mailgun.net' \
-F subject='Hello' \
-F text='Testing some Mailgun awesomeness!'
but you have a big X before https.
curl -s --user 'api:key-3ax6xnjp29jd6fds4gc373sgvjxteol0' \
-------> X <------- https://api.mailgun.net/v2/samples.mailgun.org/messages \
-F from='Excited User ' \
-F to='foo#example.com' \
-F cc='bar#example.com' \
-F bcc='baz#example.com' \
-F subject='Hello' \
-F text='Testing some Mailgun awesomness!' \
--form-string html=' CODE HERE ' \
Maybe this is what causes the error.

wget parsing of HTML title in bash failing on 404 errors

I'm building a GSA keyword list. I have a list of keywords, and the urls they are supposed to link to. I need to come up with a list of titles for the links. The best place I can come up with is the title tag in the HTML.
Given a list formatted like this:
bash,PhraseMatch,http://stackoverflow.com/questions/tagged/bash,
html,PhraseMatch,http://stackoverflow.com/questions/tagged/html,
carreers,PhraseMatch,http://careers.stackoverflow.com/faq,
I want a list like this:
bash,PhraseMatch,http://stackoverflow.com/questions/tagged/bash,Newest 'bash' Questions
html,PhraseMatch,http://stackoverflow.com/questions/tagged/html,Newest 'html' Questions
carreers,PhraseMatch,http://careers.stackoverflow.com/faq,Stack Overflow Carreers 2.0
All it is doing is looking up the URL, getting the title tag, and appending it to the end of the line. Here is what I have so far:
{
for line in $( cut -d ',' -f 3 input.csv );
{
wget --no-check-certificate --quiet -O - $line \
| paste -sd ' ' - \
| grep -o -e '<head[^>]*>.*</head>' \
| grep -o -e '<title>.*</title>' \
| cut -d '>' -f 2 \
| cut -d '<' -f 1 \
| cut -d '-' -f 1 \
| tr -d ' ' \
| sed 's| *\(.*\)|\1|g' \
| paste -s -d '\n' - \
;
}
} | paste -d ' ' input.csv - > output.csv
The problem I am having is that some of the pages are returning various errors. In that case, I get no data back. This results in no line being generated. When I do the paste to merge the two streams, they aren't the same size.
I'm looking for a way to check for empty data and return an empty line. Help?
Ignoring the issue of parsing HTML with a collection of command-line tools, you can substitute a fixed error string for the output of a pipeline that doesn't complete. (I don't think I'm inserting the check at the right place in the pipeline, but hopefully you can make that correction.)
set -o pipefail    # a failure in any stage fails the whole pipeline
while IFS=, read first second line rest; do
wget --no-check-certificate --quiet -O - "$line" |
paste -sd ' ' - |
grep -o -e '<head[^>]*>.*</head>' |
grep -o -e '<title>.*</title>' |
cut -d '>' -f 2 |
cut -d '<' -f 1 |
cut -d '-' -f 1 |
tr -d ' ' |
sed 's| *\(.*\)|\1|g' |
paste -s -d '\n' - \
|| echo "<no output found>" # If any part of the pipeline fails
done < input.csv | paste -d ' ' input.csv - > output.csv
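The marker line is what keeps both streams the same length, and set -o pipefail is what makes it fire: without it, the pipeline's exit status is that of the final paste, which succeeds even when wget fails. A hedged way to watch the fallback in isolation (example.invalid is a reserved name that never resolves, so the wget is guaranteed to fail):
set -o pipefail
wget --no-check-certificate --quiet -O - "http://example.invalid/" |
grep -o -e '<title>.*</title>' |
paste -s -d '\n' - \
|| echo "<no output found>"   # printed because pipefail propagates the failure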