I am trying to learn Yesod and implement a simple REST app where, every time I get a GET request, I write something to a file. Right now I have the following handler function:
getTestR =
    do
        return $ writeFile "test.txt" "Just something"
        return $ object ["result" .= "Ok"]
What I was expecting is that the file test.txt would be created and I would obtain the JSON {"result":"Ok"}. However, I am obtaining the JSON, but the file is not being created.
I guess the writeFile is not being evaluated because of lazy evaluation, but I have no idea how to overcome this problem. Thanks in advance.
Just use liftIO. return does not execute an IO action; it only wraps the (still unexecuted) action up as a value, so the write never happens. liftIO lifts the IO action into the Handler monad so that it actually runs:
getTestR =
    do
        liftIO $ writeFile "test.txt" "Just something"
        return $ object ["result" .= "Ok"]
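For reference, the types make the difference clear. Here is a minimal sketch, assuming a scaffolded Yesod site where Data.Aeson's object and .= are in scope (the Text annotation just avoids an ambiguous-literal error under OverloadedStrings):

getTestR :: Handler Value
getTestR = do
    -- liftIO :: IO a -> Handler a: the write runs when the handler runs,
    -- whereas plain return would only pass the unexecuted action along as a value.
    liftIO $ writeFile "test.txt" "Just something"
    return $ object ["result" .= ("Ok" :: Text)]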
I have a Snakemake pipeline where I get the input/output paths for my file folders from a JSON file and use the expand function to construct the paths.
import json

with open('config.json', 'r') as f:
    config = json.load(f)

wildcard = ["1234", "5678"]

rule them_all:
    input:
        expand('config["data_input"]/data_{wc}.tab', wc = wildcard)
    output:
        expand('config["data_output"]/output_{wc}.rda', wc = wildcard)
    shell:
        "Rscript ./my_script.R"
My config.json is
{
    "data_input": "/very/long/path",
    "data_output": "/slightly/different/long/path"
}
When I attempt a dry run, though, I get the following error:
$ snakemake -np
Building DAG of jobs...
MissingInputException in line 12 of /path/to/Snakefile:
Missing input files for rule them_all:
config["data_input"]/data_1234.tab
config["data_input"]/data_5678.tab
The files are there and their path is /very/long/path/data_1234.tab.
This is probably low-hanging fruit, but what am I doing wrong in the expansion syntax? Or is it the way I read the JSON file?
expand() does not evaluate dictionary access inside its first argument: a quoted string like 'config["data_input"]/data_{wc}.tab' is taken literally, so the config lookup never happens. The dictionary value has to be passed in through a placeholder instead.
The correct syntax, in this case, would be e.g.
expand('{input_folder}/data_{wc}.tab', wc = wildcard, input_folder = config["data_input"])
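Applied to the rule in the question, the whole Snakefile would then read roughly as follows (a sketch; it assumes the output paths are handled the same way via config["data_output"], which the question suggests but the answer above did not spell out):

import json

with open('config.json', 'r') as f:
    config = json.load(f)

wildcard = ["1234", "5678"]

rule them_all:
    input:
        expand('{input_folder}/data_{wc}.tab',
               wc = wildcard,
               input_folder = config["data_input"])
    output:
        expand('{output_folder}/output_{wc}.rda',
               wc = wildcard,
               output_folder = config["data_output"])
    shell:
        "Rscript ./my_script.R"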
I'm working on parsing JSON data using JSON.sh. I want to read data from a JSON file (test.json) whose content looks something like:
{
    "/home/ukrishnan/projects/test.yml": {
        "LOG_DRIVER": "syslog",
        "IMAGE": "mysql:5.6"
    },
    "/home/ukrishnan/projects/mysql/app.xml": {
        "ENV_ACCOUNT_BRIDGE_ENDPOINT": "/u01/src/test/sample.txt"
    }
}
I try to parse this JSON using JSON.sh:
test_parser=`sh ./lib/JSON.sh < test/test.json`
echo $test_parser
It prints:
["/home/ukrishnan/projects/test.yml","LOG_DRIVER"] "syslog" ["/home/ukrishnan/projects/test.yml","IMAGE"] "mysql:5.6" ["/home/ukrishnan/projects/test.yml"] {"LOG_DRIVER":"syslog","IMAGE":"mysql:5.6"} ["/home/ukrishnan/projects/mysql/app.xml","ENV_ACCOUNT_BRIDGE_ENDPOINT"] "/u01/src/test/sample.txt" ["/home/ukrishnan/projects/mysql/app.xml"] {"ENV_ACCOUNT_BRIDGE_ENDPOINT":"/u01/src/test/sample.txt"} [] {"/home/ukrishnan/projects/test.yml":{"LOG_DRIVER":"syslog","IMAGE":"mysql:5.6"},"/home/ukrishnan/projects/mysql/app.xml":{"ENV_ACCOUNT_BRIDGE_ENDPOINT":"/u01/src/test/sample.txt"}}
Whereas the same command (sh ./lib/JSON.sh < test/test.json), run directly in the terminal, prints with line breaks:
["/home/ukrishnan/projects/test.yml","LOG_DRIVER"] "syslog"
["/home/ukrishnan/projects/test.yml","IMAGE"] "mysql:5.6"
["/home/ukrishnan/projects/test.yml"] {"LOG_DRIVER":"syslog","IMAGE":"mysql:5.6"}
["/home/ukrishnan/projects/mysql/app.xml","ENV_ACCOUNT_BRIDGE_ENDPOINT"] "/u01/src/test/sample.txt"
["/home/ukrishnan/projects/mysql/app.xml"] {"ENV_ACCOUNT_BRIDGE_ENDPOINT":"/u01/src/test/sample.txt"}
[] {"/home/ukrishnan/projects/test.yml":{"LOG_DRIVER":"syslog","IMAGE":"mysql:5.6"},"/home/ukrishnan/projects/mysql/app.xml":{"ENV_ACCOUNT_BRIDGE_ENDPOINT":"/u01/src/test/sample.txt"}}
I want to read this and assign the pieces to bash variables, like:
file_name='/home/ukrishnan/projects/test.yml'
key='LOG_DRIVER'
value='syslog'
As I'm almost completely new to shell scripting, grep, and awk, I don't have much idea how to achieve this. Any help would be greatly appreciated.
I wrote a JSON serializer / deserializer for gawk, if you're interested. Save that script and modify it, replacing everything above # === FUNCTIONS === with the following:
#!/usr/bin/gawk -f

# capture JSON string from beginning to end into a scalar variable
{ json = json ORS $0 }

END {
    # objectify JSON string to the multilevel array "obj"
    deserialize(json, obj)
    for (filename in obj) {
        print "file_name=" quote(filename)
        for (key in obj[filename]) {
            # print key="value"
            print key "=" quote(obj[filename][key])
        }
    }
}
Do chmod 755 json.awk and execute it. Output will resemble this:
$ ./json.awk test5.json
file_name="/home/ukrishnan/projects/mysql/app.xml"
ENV_ACCOUNT_BRIDGE_ENDPOINT="/u01/src/test/sample.txt"
file_name="/home/ukrishnan/projects/test.yml"
LOG_DRIVER="syslog"
IMAGE="mysql:5.6"
Hopefully the logic is reasonably easy to follow. If you prefer to output filename=, key=, and value= on every loop iteration, modify the nested for loops accordingly:
for (filename in obj) {
    for (key in obj[filename]) {
        print "file_name=" quote(filename)
        print "key=" quote(key)
        print "value=" quote(obj[filename][key])
    }
}
That change will result in the following output:
$ ./json.awk test5.json
file_name="/home/ukrishnan/projects/mysql/app.xml"
key="ENV_ACCOUNT_BRIDGE_ENDPOINT"
value="/u01/src/test/sample.txt"
file_name="/home/ukrishnan/projects/test.yml"
key="LOG_DRIVER"
value="syslog"
file_name="/home/ukrishnan/projects/test.yml"
key="IMAGE"
value="mysql:5.6"
Anyway, with that output, you can do something silly in BASH like this to populate and act upon the variables:
#!/bin/bash

# (script name fixed to match the json.awk used above; quote $line for safety)
./json.awk test5.json | while read -r line; do
    eval "$line"
    [ "${line/=*/}" = "value" ] && {
        echo "bash: file_name=$file_name"
        echo "bash: key=$key"
        echo "bash: value=$value"
        echo "------"
    }
done
It'd probably be more graceful just to do all processing within gawk from start to finish and not mess with the polyglot handoff, though.
Getting back to json.awk: if you prefer to keep json.awk modular for easy reuse in future projects, you could remove everything above # === FUNCTIONS ===, create a separate main.awk containing the code block at the top of this answer, and @include "json.awk" as a helper library pretty much anywhere outside of END {...} (just below the shebang, for example).
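A minimal main.awk along those lines might look like the following (a sketch: it assumes json.awk defines the deserialize() and quote() functions used above, and it relies on gawk's @include directive):

#!/usr/bin/gawk -f

@include "json.awk"    # helper library providing deserialize() and quote()

# capture the whole JSON input into a scalar variable
{ json = json ORS $0 }

END {
    # objectify the JSON string into the multilevel array "obj"
    deserialize(json, obj)
    for (filename in obj) {
        for (key in obj[filename]) {
            print "file_name=" quote(filename)
            print "key=" quote(key)
            print "value=" quote(obj[filename][key])
        }
    }
}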
JSON.sh (from http://json.org) offers a nice bash-friendly means of flattening out a JSON file; you've already shown in your question what that looks like. The flattened form follows the format:
[node] tab value
Thinking in UNIX-script terms about extracting the information you want, you'll note that the lines you're interested in actually follow this pattern:
["filename","key"] tab "value"
In regex notation, we replace:
filename with (.*)
key with (.*)
tab with \t
value with (.*)
We can retrieve the first, second and third matching groups with \1, \2, \3 respectively.
When used in sed, note that the brackets [ ] must be backslash-escaped to be matched literally, and the capture groups are written \( \) in basic regex syntax, resulting in the following script:
./lib/JSON.sh < test/test.json | sed 's/\["\(.*\)","\(.*\)\"]\t"\(.*\)"/\1,\2,\3/;t;d'
/home/ukrishnan/projects/test.yml,LOG_DRIVER,syslog
/home/ukrishnan/projects/test.yml,IMAGE,mysql:5.6
/home/ukrishnan/projects/mysql/app.xml,ENV_ACCOUNT_BRIDGE_ENDPOINT,/u01/src/test/sample.txt
Now we put the lines in a loop, and for each line we extract filename, key, and value:
for line in $(./lib/JSON.sh < test/test.json | sed 's/\["\(.*\)","\(.*\)\"]\t"\(.*\)"/\1,\2,\3/;t;d')
do
    IFS="," read -ra arr <<< "$line"
    filename=${arr[0]}
    key=${arr[1]}
    value=${arr[2]}
    cat <<EOF
filename : $filename
key : $key
value : $value
EOF
done
Which outputs:
filename : /home/ukrishnan/projects/test.yml
key : LOG_DRIVER
value : syslog
filename : /home/ukrishnan/projects/test.yml
key : IMAGE
value : mysql:5.6
filename : /home/ukrishnan/projects/mysql/app.xml
key : ENV_ACCOUNT_BRIDGE_ENDPOINT
value : /u01/src/test/sample.txt
I'm using Symfony's Console component to write some command-line tools, one of which uses WP-CLI to set up a WordPress site. Specifically, using wp option update, I'm running into issues with both JSON and quotes.
For example, running something like:
shell_exec("wp option update blogdescription ''");
Results in a literal '' in the database. In fact, any time I use that command successfully, whatever I've tried to set in the database ends up wrapped in the single quotes.
And then something like this:
$github_updater_args = [
    'github_access_token' => '',
    'bitbucket_username' => 'username',
    'bitbucket_password' => 'password',
    'all-in-one-seo-populate-keywords' => '1'
];
$github_updater_args = json_encode($github_updater_args);
var_dump($github_updater_args);
shell_exec("wp option update github_updater '$github_updater_args' --format=json");
The dump results in:
string(200) "{"github_access_token":"","bitbucket_username":"devs#webspecdesign.com","bitbucket_password":"xxxxxxxxx","webspec-design-wordpress-core":"1","all-in-one-seo-populate-keywords":"1","webspec-smtp":"1"}"
Which is valid JSON, but the wp command resulted in the following:
Error: Invalid JSON: '{github_access_token:,bitbucket_username:username,bitbucket_password:password,all-in-one-seo-populate-keywords:1}'
You'll notice the quotes have been stripped out, which I assume is what it's complaining about? Or is that just how it dumps out the error?
Either way, I then decided I would hardcode the JSON, so just:
shell_exec('wp option update github_updater \'{"github_access_token": "", "bitbucket_username": "username", "bitbucket_password": "password"}\' --format=json');
Which gave me the following error:
Error: Too many positional arguments: , bitbucket_username: username, bitbucket_password: password}'
I wager the two issues are related. I thought maybe it was a WP-CLI issue, so I opened an issue there, but it was closed without reaching an answer. The more I stare at it, however, the more I think it might be a Symfony thing. Perhaps I need to use the Process component rather than shell_exec? Is that what it's designed for?
Update: Tried the Symfony Process Component:
$process = new Process("wp option update blogdescription ''");
$process->run();
Still got my two quotes. So nothing doing there.
You are looking for escapeshellarg() to escape your JSON data; escapeshellcmd() is for sanitizing an entire command, not a single argument. Try something like this:
$github_updater_args = [
    'github_access_token' => '',
    'bitbucket_username' => 'username',
    'bitbucket_password' => 'password',
    'all-in-one-seo-populate-keywords' => '1'
];
$github_updater_args = json_encode($github_updater_args);
$cmd = sprintf("wp option update github_updater %s --format=json", escapeshellarg($github_updater_args));
shell_exec($cmd);
Edit
Either there is something missing from your question or there is a bug in the CLI script you are executing. I wrote two quick test PHP scripts to illustrate that the JSON escaping works correctly.
test.php
<?php
$github_updater_args = [
    'github_access_token' => '',
    'bitbucket_username' => 'username',
    'bitbucket_password' => 'password',
    'all-in-one-seo-populate-keywords' => '1'
];
$github_updater_args = json_encode($github_updater_args);
$cmd = sprintf("php j.php %s", escapeshellarg($github_updater_args));
echo shell_exec($cmd);
echo PHP_EOL;
j.php
<?php
var_dump(json_decode($argv[1]));
output
~$ php test.php
object(stdClass)#1 (4) {
  ["github_access_token"]=>
  string(0) ""
  ["bitbucket_username"]=>
  string(8) "username"
  ["bitbucket_password"]=>
  string(8) "password"
  ["all-in-one-seo-populate-keywords"]=>
  string(1) "1"
}
I'm trying to replace an old Tcl interface to C++ using SWIG. Here is an example class:
#include <iostream>
#include <string>

class test {
    std::string str;
public:
    test(const char * s) : str(s) {}
    void print() const { std::cout << str << std::endl; }
};
and here is the standard way to use it:
load ./example.so example
test s "this is a test string"
s print
But I want to preserve the simplicity of the old interface, which does not require the quotes (""). I've found that I can do something like:
load ./example.so example
proc TEST {args} { test [lindex $args 0] [lrange $args 1 end] }
TEST s2 this is another test string
s2 print
This looks simple and works flawlessly; however, I cannot have the proc definition in the user script, and I'm not sure where else I could place it. Is there a way to put it in the .i file?
I think you can put this in your .i file:
%init %{
    Tcl_Eval(interp, "proc TEST {args} { ... }");
%}
That will insert a call to Tcl_Eval (which will make the procedure) into your module's initialization function, and as long as the interp context is available there (WARNING! Check this!) then you'll get everything nicely packaged.
If you've got lots of code, consider calling Tcl_EvalFile instead, or calling Tcl code that will source the script code (necessary if you've got a need for complex scripting to locate the file to read).
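For instance, wiring in the TEST proc from the question would look like this (a sketch, subject to the same caveat about interp being available):

%init %{
    /* Define the TEST wrapper proc from the question at module load time.
       Nothing in this particular proc body needs escaping inside a C string. */
    Tcl_Eval(interp,
        "proc TEST {args} { test [lindex $args 0] [lrange $args 1 end] }");
%}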
The script works well when run manually, but when I schedule it as a cron job it shows:
malformed JSON string, neither array, object, number, string or atom, at character offset 0 (before "<html>\r\n<head><tit...") at /usr/local/lib/perl5/site_perl/5.14.2/JSON.pm line 171.
The script itself:
# modules used below
use REST::Client;
use JSON;
use Data::Dumper;

# REST config variables
$ENV{'PERL_LWP_SSL_VERIFY_NONE'} = 0;
print "test\n";
my $client = REST::Client->new();
$client->addHeader('Authorization', 'Basic YWRtaW46cmFyaXRhbg==');
$client->addHeader('content_type', 'application/json');
$client->addHeader('accept', 'application/json');
$client->setHost('http://10.10.10.10');
$client->setTimeout(1000);
$useragent = $client->getUseragent();
print "test\n";

# Getting racks by pod
$req = '/api/v2/racks?name_like=2t';
#print " rekvest {$req}\n";
$client->request('GET', qq($req));
$racks = from_json($client->responseContent());
$datadump = Dumper (from_json($client->responseContent()));
crontab -l
*/2 * * * * /usr/local/bin/perl /folder/api/2t.pl > /dmitry/api/damnitout 2>&1
I'd appreciate any suggestions.
Thank you,
Dmitry
It is difficult to say what is really happening, but in my experience 99% of issues with running things from crontab stem from differences in environment variables.
A typical way to debug this: at the beginning of your script, add a block like this:
foreach my $key (keys %ENV) {
    print "$key = $ENV{$key}\n";
}
Run it in the console, look at the output, and save it to a log file.
Now repeat the same under crontab and save that output to a log file too (you have already done that, which is good).
See if there is any difference in the environment variables between the two runs and try to fix it; in Perl, the easiest way is probably to alter the environment by changing %ENV. Once all the differences are sorted out, there is no reason for this not to work.
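For example, if PATH or PERL5LIB turn out to differ, you can pin them near the top of the script (a sketch; the values below are placeholders, copy whatever your interactive shell actually prints):

# Placeholder values: substitute the ones from your working console run.
$ENV{PATH}     = '/usr/local/bin:/usr/bin:/bin';
$ENV{PERL5LIB} = '/usr/local/lib/perl5/site_perl/5.14.2';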
Good luck!