My 4D database has a method that calls an external application to make an HTTP request to an API site, e.g. https://somewhere.com/api/?key=somekey&param=someparam. The request returns a JSON response. However, the external application only reports whether the call succeeded or failed, so I am not able to extract the JSON response.
My 4D database is still on version 12 and I have no plans to migrate to the latest version yet. Is there any way for me to make an HTTP request and get the JSON response? I was thinking of using the built-in PHP engine to make a cURL call. Has anybody done this in 4D?
I recommend using Keisuke Miyako's OAuth plugin: https://github.com/miyako/4d-plugin-oauth. It comes with a cURL library and a JSON parsing library. I use it to pull JSON data from an API source. It looks like he has deprecated the plug-in, but the page has links to the separate components.
http://sources.4d.com/trac/4d_keisuke/wiki/Plugins/
  // cURL option arrays (left empty to use the plug-in defaults)
ARRAY LONGINT($optionNames;0)
ARRAY TEXT($optionValues;0)
C_BLOB($inData;$outData)
C_TEXT($url;$jsonText)
C_LONGINT($error)

$url:="https://api.atsomewhere.com/blahblah"
$error:=cURL ($url;$optionNames;$optionValues;$inData;$outData)

If ($error=0)
	  // Convert the response BLOB to text and parse it as JSON
	$jsonText:=BLOB to text($outData;UTF8 text without length)
	$root:=JSON Parse text ($jsonText)
	JSON GET CHILD NODES ($root;$nodes;$types;$names)
	$node:=JSON Get child by name ($root;"Success";JSON_CASE_INSENSITIVE)
	$status:=JSON Get bool ($node)
	If ($status=1)
		  // Drill into the "Response" object and read the fields we need
		$ResponseRoot:=JSON Get child by name ($root;"Response";JSON_CASE_INSENSITIVE)
		$node1:=JSON Get child by name ($ResponseRoot;"SourceId";JSON_CASE_INSENSITIVE)
		$node2:=JSON Get child by name ($ResponseRoot;"SourceName";JSON_CASE_INSENSITIVE)
		$output1:=JSON Get text ($node1)
		$output2:=JSON Get text ($node2)
	End if
End if
4D v12 has built-in support for PHP, so I used the PHP Execute command to call a PHP file. But since the PHP interpreter bundled with 4D v12 does not include the cURL extension, I used file_get_contents() instead.
My 4D code is as follows:
C_TEXT($result)
C_TEXT($param1)
C_BOOLEAN($isOk)
$param1:="Tiger"
//someFunction is a function in index.php. $result will hold the JSON return value.
//I pass value "Tiger" as parameter
$isOk:=PHP Execute("C:\\index.php";"someFunction";$result;$param1)
C:\index.php contains the PHP script that 4D v12 will run. The code is
<?php
function someFunction($p1){
	$somekey = 'A$ga593^bna,al';
	// Build the request URL; urlencode() protects special characters in the parameter
	$api_URL = 'https://somewhere.com/api/?key=' . $somekey . '&param=' . urlencode($p1);
	// Return the response body (the JSON text) as a string, or false on failure
	return file_get_contents($api_URL);
}
?>
This approach only works for GET requests, but that already serves my purpose.
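If a POST request were ever needed, file_get_contents() can also send one through a stream context, so the same 4D-plus-PHP approach could be extended. This is only a sketch, with a hypothetical function name and payload, and it assumes the PHP build has the openssl wrapper enabled for https:

<?php
function someFunctionPost($p1){
	// Hypothetical endpoint; replace with the real API URL.
	$api_URL = 'https://somewhere.com/api/';
	$options = array(
		'http' => array(
			'method'  => 'POST',
			'header'  => "Content-Type: application/x-www-form-urlencoded\r\n",
			// http_build_query() builds and URL-encodes the request body.
			'content' => http_build_query(array('key' => 'somekey', 'param' => $p1))
		)
	);
	$context = stream_context_create($options);
	// Returns the response body as a string, or false on failure.
	return file_get_contents($api_URL, false, $context);
}
?>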
Related
I've been doing research in order to write an API for a school project, and when referencing the API documentation of YouTube and Twitter, I see API URLs like this
https://api.twitter.com/1.1/account/settings.json
My understanding was that you execute a method on the back end which returns information to the caller, but I thought those files had to have an extension like .py or .java, depending on the language you're using, and that JSON was just the return type. I've been unable to find any information on how a .json file works in this example. Is there code in settings.json that is being executed?
JSON is just a format for your data, which you can then use, for example, in JavaScript.
It is back-end-language independent. By this I mean that the front end of the application does not care who produced the .json file.
Maybe it was a Java application, maybe Python, or a PHP application; it does not matter. It can also be a static file with fixed content that just happens to be in JSON format.
After you receive such a thing on the front end, you can do whatever you want with it. From your perspective it will probably be some kind of nested array.
In the example you provided, I get:
{"errors":[{"code":215,"message":"Bad Authentication data."}]}
And that is fine, it's just the data you get. It is in JSON format, that is true. But you don't care that the path has .json in the URL; it could have any extension. What is important is what's inside.
That is the beauty of JSON. You can prepare a static file with mocked data in JSON format and use it while developing the front end of the application. When you wish, you can switch to a back-end application that returns real data for your app.
You can also see here how to return JSON from a PHP script:
Returning JSON from a PHP Script
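A minimal sketch of such a PHP script might look like this (the field names and values here are invented, just to show the shape):

<?php
// Tell the client that the response body is JSON, not HTML.
header('Content-Type: application/json');

// Any PHP array can be serialized; this data is only an example.
$settings = array(
	'screen_name' => 'example',
	'language'    => 'en',
	'protected'   => false
);

echo json_encode($settings);
?>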
Or here to see how to do it in Python (Django Framework):
Creating a JSON response using Django and Python
I have an Angular 2 project; how can I read and write a JSON file on my server?
I could do what I want within my code itself, but I don't want to have to change my code, recompile, and upload my website every time.
Any help? Examples are greatly appreciated.
Angular can read the remote JSON file using the HTTP Client, but it can't directly write to the remote file.
For writing, you can use a server-side script such as PHP (supported by x10Hosting) to provide a URL that Angular can post to (also using the HTTP Client) in order to update the JSON.
For example something like this PHP:
$data = json_decode(file_get_contents('./data.json')); // read and decode the existing JSON
$data->something = $_POST['something']; // update the "something" property
file_put_contents('./data.json', json_encode($data)); // write the updated JSON back to data.json
I'd like to know if it's possible to turn a JSON API like this one:
http://graph.facebook.com/10152830671619648/photos?fields=id,name,source
to a MySQL data table in a database.
What language should I use for this? PHP, JavaScript?
Thanks for answering!
EDIT : Actually, I'd like to create a system to manage the comics I need to buy with a simple interface. All the information about a comic book will be stored in a database (id, name, image link, if I need it, if I have it, if I read it).
In PHP, there are several ways to get the JSON object from a URL, including fopen, file_get_contents, and cURL, the latter being the most reliable option if there are restrictions on the server (see the sketch after the links below).
JSON url to PHP object:
- Get JSON object from URL
Serialize and store to db:
- How do I store an php object in a MySQL table?
or write object properties to db table columns:
- mysqli: http://php.net/manual/en/mysqli-stmt.bind-param.php
- PDO: http://php.net/manual/en/pdostatement.bindparam.php
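Putting those pieces together, a rough PHP sketch might look like the following. The database credentials, the comics table, and its columns (source_id, name, image_link) are assumptions based on the comic-collection idea, not an existing schema:

<?php
// Fetch the JSON with cURL (works even if allow_url_fopen is disabled).
$ch = curl_init('http://graph.facebook.com/10152830671619648/photos?fields=id,name,source');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$json = curl_exec($ch);
curl_close($ch);

// Decode the response into an associative array.
$result = json_decode($json, true);

// Assumed credentials and schema.
$db = new mysqli('localhost', 'user', 'password', 'comics_db');
$stmt = $db->prepare('INSERT INTO comics (source_id, name, image_link) VALUES (?, ?, ?)');

foreach ($result['data'] as $photo) {
	$id     = $photo['id'];
	$name   = isset($photo['name']) ? $photo['name'] : ''; // not every photo has a caption
	$source = $photo['source'];
	// Bind the values to the prepared statement and insert one row per photo.
	$stmt->bind_param('sss', $id, $name, $source);
	$stmt->execute();
}

$stmt->close();
$db->close();
?>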
JavaScript can be used as well, using Ajax both to retrieve the JSON object and to store it in the db, e.g. by sending it to a server-side script which does the storing. You can use PHP, PHP and JavaScript (Ajax), JavaScript and some JavaScript Ajax APIs, etc.
I'm learning how to use a third-party API called Wunderground and I don't know how to request, receive, and use their results, which are in JSON format.
On their website a developer can sign up for free and receives an API key. You can then request weather data in the following URL format: http://api.wunderground.com/api/KEY/FEATURE/[FEATURE…]/[SETTING…]/q/QUERY.FORMAT
So I have tried it in my web browser by typing some parameters and I received a very long JSON file with the correct information (I checked). The problem is I don't have the slightest idea how to create a variable that can make this request, and even if I were able to do that, I don't know where I should receive the file or how to get only the results I want (in this case the current weather).
You have to use Titanium.Network.HTTPClient to make the request.
For code examples related to JSON parsing you can use:
Appcelerator: Using JSON to Build a Twitter Client
HTTPClient()
It seems that Yahoo Pipes are represented using JSON. I want to download these JSON objects for research purposes. Usually a Yahoo pipe is rendered in a browser editor through a URL like this: http://pipes.yahoo.com/pipes/pipe.edit?_id=XgRo96h13BGtJWvS8SvLAg, but that doesn't give you the JSON object corresponding to the pipe. Does anyone know how to get JSON objects representing Yahoo pipes and store them in any persistent form?
It is possible to get hold of a JSON description of a Yahoo Pipe using a URL of the form:
http://pipes.yahoo.com/pipes/pipe.info?_out=json&_id=PIPE_ID
The pipe2py python library demonstrates how to grab the JSON description of a pipe and "compile" it to a Python equivalent that can be run on your own server.
The post Exporting Yahoo Pipe Definitions, Compiling Them to Python, and Running Them in Scraperwiki describes how you can use pipe2py in the Scraperwiki environment to compile and execute pipes on Scraperwiki using pipe definitions imported directly from Yahoo Pipes, or exported from Yahoo Pipes and then stored locally in a Scraperwiki database table.
When I load that page in a browser I can see that it makes an ajax request for:
http://pipes.yahoo.com/pipes/ajax.pipe.load?id=XgRo96h13BGtJWvS8SvLAg&_out=json&modinfo=true&rnd=7560&.crumb=MjvGjpzhPLl
That's your object, but I'm not sure if I'm answering your question of how to "get it". If you need to get it through a program, you would need a script that logs into Pipes and extracts that URL.
A quick way, while not automated, is to use an HTTP analyzer. Here's a process for getting the object using HttpFox (I use v0.8.9) for Firefox. With the analyzer running, load the edit page for a pipe, like the one you linked:
http://pipes.yahoo.com/pipes/pipe.edit?_id=XgRo96h13BGtJWvS8SvLAg
Look at the request with a URL that starts with:
http://pipes.yahoo.com/pipes/ajax.pipe.load?id=....
Next, explore the content of the request (there's a 'Content' tab in HttpFox). That's the JSON object representing the pipe structure.
Use pipe.run?[your pipe id here]&_render=json as opposed to pipe.edit
So in your case, to get the JSON it would be: http://pipes.yahoo.com/pipes/pipe.run?_id=XgRo96h13BGtJWvS8SvLAg&_render=json
I guess how you implement the client is dependent on what you like writing in/what other functionality you need.
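For example, a minimal PHP sketch that fetches the pipe's JSON and persists it to disk might look like this (the output file name is arbitrary):

<?php
// URL from the answer above; _render=json asks Pipes to return JSON.
$url = 'http://pipes.yahoo.com/pipes/pipe.run?_id=XgRo96h13BGtJWvS8SvLAg&_render=json';

$json = file_get_contents($url);

// Decode to make sure we actually received valid JSON before saving it.
if (json_decode($json) !== null) {
	// Persist the raw JSON to disk (or insert it into a database instead).
	file_put_contents('pipe_XgRo96h13BGtJWvS8SvLAg.json', $json);
}
?>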
You could also do it the other way around and use the Web Service module to post the data to a script that can extract the JSON and persist it to a database. You could check out json.org.