Prolog JSON handling - json

server() :- http_server(http_dispatch, [port(45000)]).

serverTest(Request) :-
    http_read_json(Request, JSONIn),
    json_to_prolog(JSONIn, PrologIn),
    format(PrologIn).
I have this Prolog program, but I can't handle the PrologIn variable very well. I get this error:
Type error: `text' expected, found `json([id=3])'
I know that means I can't use format with PrologIn, but how do I handle the information inside? Meaning, how do I extract the "id = 3" info?
Edit:
This is the full program.
(If I load more modules than necessary, it's because I'm doing other stuff with the program and didn't filter it down to this specific case.)
:- use_module(library(http/thread_httpd)).
:- use_module(library(http/http_dispatch)).
:- use_module(library(http/http_parameters)).
:- use_module(library(http/http_ssl_plugin)).
:- use_module(library(http/http_open)).
:- use_module(library(http/http_client)).
:- use_module(library(http/http_json)).
:- use_module(library(http/json)).
:- http_handler('/test', serverTest, []).
The rest is the two predicates shown before the edit.
I test this by first going to the Prolog console and typing "server()."; this starts the server. Then I use Postman as follows: select POST; in the Headers, set the key Content-Type to the value application/json; then, in the Body, select raw (JSON (application/json)) and write this in the text area:
{
"id" : 3
}
This is how I test it. I want to be able to handle the id=3 information in the Prolog predicate (serverTest).

You really need to show the full program, how you start the server, and how you query it. Otherwise, one can only guess.
Anyway, there are a few problems with this: format(PrologIn).
First, as the error tells you, this is a term, and format expects text for formatted output. At the very least, you would have to write:
format("~w", [PrologIn])
See the documentation on format/2. Basically, if your term looks like json([id=3]), you should get json([id=3]) printed.
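For illustration, a quick toplevel check, using nothing beyond format/2 and the term from the error message:

?- format("~w", [json([id=3])]).
json([id=3])
true.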
Now the next question: where would this print to? When you start a server with the HTTP package libraries, input and output are redirected so that you can read requests and write responses. There are many examples in the library documentation.
Then the next thing: how you get the 3 out of there. If you additionally load the http_json plugin module:
:- use_module(library(http/http_json)).
then you can directly use, as shown in the code example,
http_read_json_dict(Request, DictIn)
Now DictIn is a "dict" probably looking like this: _{id:3}. See the documentation on dicts.
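For instance, get_dict/3 is the predicate behind the dot notation, so a quick toplevel check might look like this:

?- get_dict(id, _{id:3}, V).
V = 3.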
You don't have to use dicts; you can just inspect the json term using normal pattern matching and list processing. Dicts are just easier (as in, less typing) for some use cases.
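As a minimal sketch of the no-dicts route, assuming the classic json([Key=Value, ...]) shape shown above (json_id/2 is just an illustrative helper name):

% Extract the value paired with the id key from a classic json/1 term.
json_id(json(Pairs), Id) :-
    memberchk(id = Id, Pairs).

?- json_id(json([id=3]), Id).
Id = 3.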
Edit
Here is a minimal example that works for me. This is the server code:
:- use_module(library(http/thread_httpd)).
:- use_module(library(http/http_dispatch)).
:- use_module(library(http/http_json)).

:- http_handler('/test', test, []).

server :-
    http_server(http_dispatch, [port(45000)]).

test(Request) :-
    http_read_json_dict(Request, JSON),
    format(user_error, "~w~n", [JSON.id]).
From the top level, after consulting the file, I run:
?- server.
% Started server at http://localhost:45000/
true.
At that point, I use curl from another command line like this:
$ curl -H "Content-Type: application/json" -d '{"id":3}' http://localhost:45000/test
curl: (52) Empty reply from server
and I get the 3 printed out on the Prolog toplevel where the server is running:
?- 3
This is of course not ideal, so I replace the last line in the server code with the following:
reply_json(_{foobar:JSON.id}).
and then on the Prolog toplevel, where the server is running, I use make/0:
?- make.
% Updating index for library .../lib/swipl-7.3.35/library/
% ... compiled 0.00 sec, 0 clauses
true.
Now, when I again use curl:
$ curl -H "Content-Type: application/json" -d '{"id":3}' http://localhost:45000/test
{"foobar":3}
This is all you need!

I don't know your Prolog web library, but I'm guessing http_read_json already binds its second argument to a Prolog term, so the call to json_to_prolog is unnecessary and incorrect.
Try
serverTest(Request) :- http_read_json(Request, JSONIn), format(JSONIn).
If you want to isolate the id number from what you receive, this could be as simple as:
serverTest(Request) :-
    http_read_json(Request, json([id=X])),
    % ... do something with X here ...
    true.
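If the goal is to send the value back to the client, a hedged variant of that clause could reply with JSON directly (a sketch; reply_json/1 comes from library(http/http_json), which is already loaded above, and got_id is just an illustrative key name):

serverTest(Request) :-
    http_read_json(Request, json([id=X])),
    % Echo the received id back as a JSON object.
    reply_json(json([got_id=X])).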

Related

hard-coded output without expansion in Snakefile

I have a Snakefile as follows:

SAMPLES, = glob_wildcards("data/{sample}_R1.fq.gz")

rule all:
    input:
        expand("samtools_sorted_out/{sample}.raw.snps.indels.g.vcf", sample=SAMPLES),
        expand("samtools_sorted_out/combined_gvcf")

rule combine_gvcf:
    input: "samtools_sorted_out/{sample}.raw.snps.indels.g.vcf"
    output: directory("samtools_sorted_out/combined_gvcf")
    params:
        gvcf_file_list="gvcf_files.list",
        gatk4="/storage/anaconda3/envs/exome/share/gatk4-4.1.0.0-0/gatk-package-4.1.0.0-local.jar"
    shell: """
        java -DGATK_STACKTRACE_ON_USER_EXCEPTION=true \
            -jar {params.gatk4} GenomicsDBImport \
            -V {params.gvcf_file_list} \
            --genomicsdb-workspace-path {output}
        """
When I test it with a dry run, I get this error:
RuleException in line 335 of /data/yifangt/exomecapture/Snakefile:
Wildcards in input, params, log or benchmark file of rule combine_gvcf cannot be determined from output files:
'sample'
There are two things I need some help with:
The {output} is a folder that will be created by the shell part;
The {output} folder is hard-coded manually, as required by the command line (and its contents are unknown ahead of time).
The problem seems to be that {output} is used without expansion, as opposed to {input}, which does get expanded.
How should I handle this situation? Thanks a lot!
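For what it's worth, one common way out (sketched here, not tested against this workflow) is to aggregate over all samples in the input of combine_gvcf with expand, so that no unresolved {sample} wildcard remains once the output is fixed:

rule combine_gvcf:
    # expand() resolves {sample} here, so the rule no longer needs to
    # infer it from the (wildcard-free) output path.
    input: expand("samtools_sorted_out/{sample}.raw.snps.indels.g.vcf", sample=SAMPLES)
    output: directory("samtools_sorted_out/combined_gvcf")
    # params and shell unchanged from the question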

I am using identical syntax in jq to change JSON values, yet one case works while the other turns bash interactive; how can I fix this?

I am trying to update a simple JSON file (it consists of one object with several key/value pairs), and I am using the same command yet getting different results (sometimes even having the whole JSON wiped by the 2nd command). The command I am trying is:
cat ~/Desktop/config.json | jq '.Option = "klay 10"' | tee ~/Desktop/config.json
This command perfectly replaces the value of the minerOptions key with "klay 10", my intended output.
Then I try to run the same process on the newly updated file (just the value changed for that one key) and only get an interactive terminal with no result. ps unfortunately isn't helpful in showing what's going on. This is what I do after getting that first command to change the value of the key:
cat ~/Desktop/config.json | jq ‘.othOptions = "-epool etc-eu1.nanopool.org:14324 -ewal 0xc63c1e59c54ca935bd491ac68fe9a7f1139bdbc0 -mode 1"' | tee ~/Desktop/config.json
which I would have expected to replace the othOptions key value with the assigned result, just as the last one did. I tried directly sending the stdout to the file, but no result there either. I even tried piping one more time, creating a temp file and then moving it over the original; all of these, as opposed to the first, identical command, just return > and absolutely zero output. When I quit the process, the file has the same value as before, not the new one.
What am I missing here that causes the same command to behave differently with different inputs? (The key in the second command comes right after the first and has an identical structure; it's not creating an object or anything, just a key/value pair like the first.) I thought it could be tee, but any other approach, like passing stdout to the file, produces the same constant >, waiting for input.
I genuinely looked everywhere I could online for why this could be happening before resorting to SE; it's giving me such a headache for what I thought should be simple.
As @GordonDavisson pointed out, using tee to overwrite the input file is a (well-known - see e.g. the jq FAQ) recipe for disaster. If you absolutely positively want to overwrite the file unconditionally, then you might want to consider using sponge, as in
jq ... config.json | sponge config.json
or more safely:
cp -p config.json config.json.bak && jq ... config.json | sponge config.json
For further details about this and other options, search for ‘sponge’ in the FAQ.
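If sponge isn't available, an equivalent pattern (a sketch, shown with the first command from the question) writes to a temporary file and only renames it over the original after jq has finished:

tmp=$(mktemp) &&
  jq '.Option = "klay 10"' ~/Desktop/config.json > "$tmp" &&
  mv "$tmp" ~/Desktop/config.json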

Send JSON from rsyslog to Kibana

I'm using rsyslog to watch over my syslogs and send them over to Logstash+Kibana.
My syslog messages are logged as JSON. They can look something like this:
{"foo":"bar", "timegenerated": 43843274834}
My rsyslog configuration is as follows:
module(load="omelasticsearch")

# define a template to print all fields of the message
template(name="messageToES" type="list" option.json="on") {
    property(name="msg")
}

*.* action(type="omelasticsearch"
           server="localserverhere"
           serverport="80"
           template="messageToES")
Kibana is fine, since if I run a curl command against it, it receives the record. The command is below:
curl -XPOST myserver/test/bar -d '{"test": "baz", "timegenerated":1447145221519}'
When I run rsyslog and point it to a dummy server, I can see the incoming requests with valid JSON. However, when I point it back to my Logstash server, nothing shows up in Logstash or Kibana.
Does anyone know how to send syslogs as JSON to Kibana/Logstash?
I've never used it, but it looks like you are missing things from your config file. The docs have a pretty thorough example:
module(load="omelasticsearch")

template(name="testTemplate"
         type="list"
         option.json="on") {
    constant(value="{")
    constant(value="\"timestamp\":\"")    property(name="timereported" dateFormat="rfc3339")
    constant(value="\",\"message\":\"")   property(name="msg")
    constant(value="\",\"host\":\"")      property(name="hostname")
    constant(value="\",\"severity\":\"")  property(name="syslogseverity-text")
    constant(value="\",\"facility\":\"")  property(name="syslogfacility-text")
    constant(value="\",\"syslogtag\":\"") property(name="syslogtag")
    constant(value="\"}")
}

action(type="omelasticsearch"
       server="myserver.local"
       serverport="9200"
       template="testTemplate"
       searchIndex="test-index"
       searchType="test-type"
       bulkmode="on"
       queue.type="linkedlist"
       queue.size="5000"
       queue.dequeuebatchsize="300"
       action.resumeretrycount="-1")
Based on what you are trying to do, it looks like you need to plug in localserverhere where it shows myserver.local. It also looks like you have ES accepting stuff on port 80, so you'd put in 80 instead of 9200.

{"errorMessages":["Unexpected character (''' (code 39)): expected a valid value

I found "Query using POST" here and tried to use the curl command from the command line. I installed curl for Windows by referring to this.
Here is my curl string:
curl -D- -u admin:password -X POST -H "Content-Type: application/json" --data
'{"jql":"project = CI","startAt":0,"maxResults":50,"fields":["summary","status","assignee"]}'
"https://myclientname.atlassian.net/rest/api/2/search"
This is what I'm doing, and this is the error I get:
{"errorMessages":["Unexpected character (''' (code 39)): expected a valid value
(number, String, array, object, 'true', 'false' or 'null')\n
at [Source: org.apache.catalina.connector.CoyoteInputStream#1626cb2; line: 1, column: 2]"]}
Is there any problem with building this curl string on Windows? How can I correct it and get the JSON object? Please note that the user ID, password, and client name are correct. Thanks.
This seems to be a Windows issue. Do not use the ' (single-quote) character.
Instead, use the " (double-quote) character for enclosing the string. Then, if you have inner quotes, use """ (3x double-quotes) to escape them.
Example: "{ """name""":"""Frodo""", """age""":123 }"
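Applied to the curl command from the question, the Windows-style quoting would look roughly like this (a sketch for cmd.exe, same endpoint and credentials as above; the line is broken here only for readability):

curl -D- -u admin:password -X POST -H "Content-Type: application/json" --data
"{"""jql""":"""project = CI""","""startAt""":0,"""maxResults""":50,"""fields""":["""summary""","""status""","""assignee"""]}"
"https://myclientname.atlassian.net/rest/api/2/search"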
I tried the cURL you pointed to in your question, but with no luck. The cURL that comes with Git is not working either. However, the one I installed with CygWin works, and the same command also works in Ubuntu, which basically indicates that your command itself is OK.
If you are working on Windows, I recommend a tool called Fiddler. It can perform almost any HTTP request you may need. Good luck!
Update:
Here are the steps to make an HTTP POST request with Fiddler.
1) After starting Fiddler, you will see the GUI like Figure 1. The upper right panel is where you should input stuff like JIRA's website, the request type, and the content you want to post. To be specific, under the "Composer" tab, you need to select "POST" as your request type, put JIRA's URL there, and keep HTTP/1.1 selected. You should put the request headers under the URL bar. At a minimum, you should input two things in the HTTP header: the content type, which is "application/json", and the authorization header. The authentication is a Base64 string; you can get your Base64 string here from your "admin:password". If you want to know more about the basic authentication method, please refer to Jira's website here. The lower right panel of the GUI is where you should put your post content.
2) When you have this stuff ready, you can click the "Execute" button at the upper right corner of the GUI. The execution result will be shown in the left panel. As Figure 2 shows, if you get a result with status 200, congratulations, you got it. If you get another type of result, please google the error code or leave a comment here.
3) Double-click the result, and the returned JSON content will be shown in the lower right panel, as in Figure 3. You can try the different tabs to see the returned content. For example, if you go to "TextView", you will get the returned JSON as a pure string.
Please comment if you have any further questions.
Please verify whether you have any value wrapped in single quotes, e.g.
"NetworkType": 'Test'
Try this. It should work.
curl -D- -u admin:password -X POST -H "Content-Type: application/json" --data
\"{"jql":"project = CI","startAt":0,"maxResults":50,"fields":["summary","status","assignee"]}\"
"https://myclientname.atlassian.net/rest/api/2/search"
Don't forget the backslash-escaped quotes (\") before and after the JSON.
This worked for me:
curl.exe -u elastic:Password! -k -X POST "https://localhost:9200/_security/user/kibana_system/_password?pretty" -H "Content-Type: application/json" --data '{"""password""" : """CHANGEME"""}'
Notice the format of the last parameter (--data): it uses single quotes (') as string delimiters and triple double-quotes (""") inside.

Ways to parse JSON using KornShell

I have working code for parsing JSON output using KornShell by treating it as a string of characters. The issue I have is that the vendor keeps changing the position of the field that I am interested in. I understand that in JSON we can parse it by key/value pairs.
Is there something out there that can do this? I am interested in a specific field, and I would like to use it to run checks on the status of another REST API call.
My sample JSON output is like this:
JSONDATA value:
{
    "status": "success",
    "job-execution-id": 396805,
    "job-execution-user": "flexapp",
    "job-execution-trigger": "RESTAPI"
}
I would need the job-execution-id value to monitor this job through the rest of the script.
I am using the following command to parse it:
RUNJOB=$(print ${DATA} |cut -f3 -d':'|cut -f1 -d','| tr -d [:blank:]) >> ${LOGDIR}/${LOGFILE}
The problem with this is that it is field-delimited by :, and the field positions have been known to change between vendor releases.
So I am trying to see if there is a utility out there that would always give me the key/value pair "job-execution-id": 396805, no matter where it is in the JSON output.
I started looking at jsawk, but it requires the js interpreter to be installed on our machines, which I don't want. Any hint on how to find which RPM I need to solve this?
I am using RHEL 5.5.
Any help is greatly appreciated.
The ast-open project has libdss (and a dss wrapper), which supposedly could be used with ksh. Documentation is sparse and limited to a few messages on the ast-user mailing list.
The regression tests for libdss contain some json and xml examples.
I'll try to find more info.
Python is included by default with CentOS, so one thing you could do is pass your JSON string to a Python script and use Python's JSON parser. You can then grab the value written out by the script. An example you could modify to meet your needs is below.
Note that by specifying other dictionary keys in the Python script you can get any of the values you need, without having to worry about the order changing.
Python script:
# get_job_execution_id.py
# The try/except is because you'll probably have Python 2.4 on CentOS 5.5,
# and the straight "import json" statement won't work unless you have Python 2.6+.
try:
    import json
except ImportError:
    import simplejson as json

import sys

# Parse the JSON string passed as the first command-line argument
# and print just the job-execution-id value.
json_data = sys.argv[1]
data = json.loads(json_data)
job_execution_id = data['job-execution-id']
sys.stdout.write(str(job_execution_id))
KornShell script that executes it:
#!/bin/ksh
# get_job_execution_id.sh
JSON_DATA='{"status":"success","job-execution-id":396805,"job-execution-user":"flexapp","job-execution-trigger":"RESTAPI"}'
EXECUTION_ID=$(python get_job_execution_id.py "$JSON_DATA")
echo $EXECUTION_ID
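Running the shell script should then print the id no matter where the key sits in the JSON (a quick check, assuming both files are in the current directory):

$ ksh get_job_execution_id.sh
396805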