Get a GitHub folder's contents as a JSON file - json

If I construct this URL, I can see the contents of the Images folder as JSON in Firefox.
https://api.github.com/repos/DessoCode/ESP32/contents/Images?ref=main
However, this doesn't seem to be true JSON, since my parser can't parse it. I'm using an ESP32 with Arduino.
The code works with a true JSON link (for example: http://arduinojson.org/example.json).
My question is: how do I change the first URL so it has a .json extension?
Thank you so much!

I figured it out: I needed to read the input stream in first; then I could deserialize it and use the values.
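In case it helps anyone, here is a minimal sketch of that approach for the ESP32 with HTTPClient and ArduinoJson 6. The document capacity, the insecure TLS client, and the fetchImageList() name are illustrative assumptions, not part of the original solution:

#include <WiFi.h>
#include <WiFiClientSecure.h>
#include <HTTPClient.h>
#include <ArduinoJson.h>

// Assumes WiFi is already connected and Serial has been started.
void fetchImageList() {
  WiFiClientSecure client;
  client.setInsecure();  // skip certificate validation for brevity (assumption, not production advice)

  HTTPClient http;
  http.begin(client, "https://api.github.com/repos/DessoCode/ESP32/contents/Images?ref=main");
  http.addHeader("User-Agent", "ESP32");  // the GitHub API rejects requests without a User-Agent

  if (http.GET() == HTTP_CODE_OK) {
    String payload = http.getString();                          // read the whole response in first...
    DynamicJsonDocument doc(8192);                              // capacity is a guess; size it to your folder
    DeserializationError err = deserializeJson(doc, payload);   // ...then deserialize it
    if (!err) {
      // The endpoint returns a JSON array with one object per file in the folder
      for (JsonVariant file : doc.as<JsonArray>()) {
        Serial.println(file["path"].as<const char*>());         // e.g. "Images/robot.bmp"
      }
    } else {
      Serial.println(err.c_str());
    }
  }
  http.end();
}

Each element of the returned array also carries fields such as name, size and download_url if you need more than the path.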

The alternative is to install the GitHub CLI gh (through one of the linux_arm64 packages) and use gh api to execute the API call, with -q or --jq <string> to query and select values from the response using jq syntax.
No need to serialize/deserialize.
gh api repos/DessoCode/ESP32/contents/Images?ref=main --jq ".[].path"
Images/robot.bmp
Images/skulltest.bmp
Images/yellow.bmp

Related

Request a specific variable of a JSON stream using bash/terminal

I am working on testing a Flutter web server. I am writing a simple bash script to fetch some JSON data from an API request. The API request returns the following information as a JSON response.
{
  "code_version": {
    "engine_name": "flutter_renderV1",
    "proxy": "10.1.1.1:1090",
    "test_rate": true,
    "test_density": "0.1",
    "mapping_eng": "flutter_default_mapper"
  },
  "developer_info": {
    "developerid": "30242",
    "context": true,
    "request_timestamp": "156122441"
  }
}
Once this is received, I save it into a local file named server_response{$id}.json. I need to collect the test_density value under code_version. I have tried several awk and sed commands to fetch the data, but unfortunately I cannot get the exact output in my terminal.
You need to install a powerful JSON query processor such as jq; you can easily install it from here.
Once jq is installed, try the following command to extract the value from the JSON.
Suppose your file is named server_response_123.json:
jq '.code_version.test_density' server_response_123.json
The output will be:
"0.1"

Read and write to JSON on x10Hosting

I have an Angular 2 project. How can I read and write to a JSON file on my server?
I can do what I want within my code itself, but I don't want to have to change my code, recompile, and upload my website every time.
Any help? Examples are greatly appreciated.
Angular can read the remote JSON file using the HTTP Client, but it can't directly write to the remote file.
For writing, you can use a server-side script such as PHP (supported by x10Hosting) to provide a URL that Angular can post to (also using the HTTP Client) to update the JSON.
For example something like this PHP:
$data = json_decode(file_get_contents('./data.json')); // read and decode the json
$data->something = $_POST['something']; // update the something property
file_put_contents('./data.json', json_encode($data)); // write back to data.json

Post parameter to REST API

I need to post JSON data and its parameters to a REST API. I know there might be some issues when using JSON cross-domain, but I tried the Mozilla add-on "HTTP Requester" and php-curl, and I get a result in JSON format: {"success":false}.
Is there any way to test the REST API with JSON data? If so, please give me an example of passing JSON data to a REST API along with a parameter.
If you have curl installed, at the command line you can do:
curl -d "{\"success\": false}" http://path/to/api?param=value

How can I get the JSON object that represents a Yahoo! pipe

It seems that Yahoo Pipes are represented using JSON. I want to download these JSON objects for research purposes. Usually a Yahoo pipe is rendered in a browser editor through a URL like this: http://pipes.yahoo.com/pipes/pipe.edit?_id=XgRo96h13BGtJWvS8SvLAg, but you can't get the JSON object corresponding to this pipe from there. Does anyone know how to get the JSON objects representing Yahoo pipes and store them in some persistent form?
It is possible to get hold of a JSON description of a Yahoo Pipe using a URL of the form:
http://pipes.yahoo.com/pipes/pipe.info?_out=json&_id=PIPE_ID
The pipe2py python library demonstrates how to grab the JSON description of a pipe and "compile" it to a Python equivalent that can be run on your own server.
The post Exporting Yahoo Pipe Definitions, Compiling Them to Python, and Running Them in Scraperwiki describes how you can use pipe2py in the Scraperwiki environment to compile and execute pipes on Scraperwiki using pipe definitions imported directly from Yahoo Pipes, or exported from Yahoo Pipes and then stored locally in a Scraperwiki database table.
When I load that page in a browser I can see that it makes an ajax request for:
http://pipes.yahoo.com/pipes/ajax.pipe.load?id=XgRo96h13BGtJWvS8SvLAg&_out=json&modinfo=true&rnd=7560&.crumb=MjvGjpzhPLl
That's your object, but I'm not sure if I'm answering your question of how to "get it". If you need to get it through a program, you would need a script that logs into Pipes and extracts that URL.
A quick way, while not automated, is to use an HTTP analyzer. Here's a process for getting the object using HttpFox (I use v0.8.9) for Firefox. With the analyzer running, load the edit page for a pipe, like the one you linked:
http://pipes.yahoo.com/pipes/pipe.edit?_id=XgRo96h13BGtJWvS8SvLAg
Look at the request with a URL that starts with:
http://pipes.yahoo.com/pipes/ajax.pipe.load?id=....
Next, explore the content of the request (there's a 'Content' tab in HttpFox). That's the JSON object representing the pipe structure.
Use pipe.run?[your pipe id here]&_render=json as opposed to pipe.edit
So in your case, to get the JSON it would be: http://pipes.yahoo.com/pipes/pipe.run?_id=XgRo96h13BGtJWvS8SvLAg&_render=json
How you implement the client depends on what you like writing in and what other functionality you need.
You could also do it the other way around and use the Web Service module to post the data to a script that can extract the JSON and persist it to a database. You could check out json.org.

List of Slaves connected to master - Hudson

Is there a way to find this programmatically? I need it as part of an automated run, so it would be very helpful if there is an existing remote API call that can give this.
You don't need to parse the HTML - most of the Hudson pages can be turned into API calls by adding a URL suffix, e.g. make GET calls to:
http://hudson:8080/computer/api/json
Swap the json suffix for xml or python if you prefer either of those formats over JSON.
If you use just the API suffix, you'll get a short generic help page on the API.
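If the consumer is a program rather than a browser, something along these lines can pull the node list from that endpoint. This is only a sketch: the choice of libcurl and nlohmann/json is an assumption, the host name is the placeholder from above, and the fields (computer, displayName, offline) are what the JSON API typically returns.

#include <curl/curl.h>
#include <nlohmann/json.hpp>
#include <iostream>
#include <string>

// libcurl write callback: append the response body into a std::string
static size_t append_body(char* data, size_t size, size_t nmemb, void* out) {
    static_cast<std::string*>(out)->append(data, size * nmemb);
    return size * nmemb;
}

int main() {
    std::string body;
    CURL* curl = curl_easy_init();
    if (!curl) return 1;

    // Same endpoint as above; "hudson:8080" is a placeholder host name
    curl_easy_setopt(curl, CURLOPT_URL, "http://hudson:8080/computer/api/json");
    curl_easy_setopt(curl, CURLOPT_WRITEFUNCTION, append_body);
    curl_easy_setopt(curl, CURLOPT_WRITEDATA, &body);
    CURLcode rc = curl_easy_perform(curl);
    curl_easy_cleanup(curl);
    if (rc != CURLE_OK) return 1;

    // The response carries a "computer" array with one entry per node (master and slaves)
    auto doc = nlohmann::json::parse(body);
    for (const auto& node : doc["computer"]) {
        std::cout << node["displayName"].get<std::string>()
                  << (node["offline"].get<bool>() ? " (offline)" : "") << "\n";
    }
    return 0;
}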
Groovy script to get all computers:
def jenkins = Jenkins.instance
def computers = jenkins.computers
computers.each {
    println "${it.displayName} ${it.hostName}"
}
Look at http://hudson:8080/computer/