Call remote server through Oracle APEX - external

How can I call an external or remote server from Oracle APEX using RESTful services? Kindly let me know.
Thanks & Regards,
Yokes G

One way to do it is to set up a REST Data Source entry. I am using Oracle APEX 20.2.
For example, a source called myRest:
- Type: Simple HTTP
- Remote Server: the server name without https:; include the endpoint
- JSON Operation: GET
- Parameters: the header parameters
It is very easy to copy and paste these values from Postman.
Then you can use the following type of code to access this REST service:
declare
    l_params apex_exec.t_parameters;
begin
    apex_exec.add_parameter( l_params, 'content', 'hello david' );
    apex_exec.add_parameter( l_params, 'to', '123456' );

    apex_exec.execute_rest_source(
        'myRest',
        'GET',
        NULL,
        l_params );
end;
Note that I couldn't get the JSON body to work properly using REST Data Sources, so I use apex_web_service.make_rest_request instead, as follows:
declare
    l_host      constant varchar2(200) := 'https://platform.xxx.com';
    l_host_path constant varchar2(200) := l_host || '/v1/message';
    l_clob      clob;
    l_body      clob;
begin
    l_body := '{
        "messages": [
            {
                "channel": "xxx",
                "to": "123",
                "content": "Hello from APEX"
            }
        ]
    }';

    apex_web_service.g_request_headers(1).name  := 'Content-Type';
    apex_web_service.g_request_headers(1).value := 'application/json';
    apex_web_service.g_request_headers(2).name  := 'Authorization';
    apex_web_service.g_request_headers(2).value := 'yourkey';

    l_clob := apex_web_service.make_rest_request(
        p_url         => l_host_path,
        p_http_method => 'POST',
        p_body        => l_body);
end;
This second method is nice because everything is in one place and you don't need to reference any REST Data Sources.

Related

Bulk insert rows from an array to SQL Server with Golang

I have a list of structs as follows:
rows = [
    {
        "name": <name1>,
        "age":  <age1>,
        "job":  <job1>
    },
    {
        "name": <name2>,
        "age":  <age2>,
        "job":  <job2>
    },
    {
        "name": <name3>,
        "age":  <age3>,
        "job":  <job3>
    },
    etc...
]
I want to insert this into a SQL table. So far I have been running a loop through the array and inserting each row one by one. But is there any other way to insert all the rows with just one query? I know about BULK INSERT, but my understanding is that for BULK INSERT I would have to import data from an external file. I don't want to do that. How do I use the data from this array and perform a bulk insert?
type Person struct {
    Name string
    Age  int
    Job  string
}

func InsertPeople(db *sql.DB, personSlice []*Person) error {
    var queryString = `INSERT INTO "person_table" (
        "name"
        , "age"
        , "job"
    ) VALUES `

    numOfFields := 3
    params := make([]interface{}, len(personSlice)*numOfFields)
    for i, p := range personSlice {
        pos := i * numOfFields
        params[pos+0] = p.Name
        params[pos+1] = p.Age
        params[pos+2] = p.Job
        queryString += `(?, ?, ?),`
    }
    queryString = queryString[:len(queryString)-1] // drop the trailing comma

    _, err := db.Exec(queryString, params...)
    return err
}
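To sanity-check what InsertPeople sends to the driver, the query/params construction can be factored out and exercised without a database. This is only a sketch; the table name and the `?` placeholder style are taken from the answer above, and the placeholder style your driver actually accepts may differ:

```go
package main

import (
	"fmt"
	"strings"
)

// Person mirrors the struct used in the answer above.
type Person struct {
	Name string
	Age  int
	Job  string
}

// buildBulkInsert reproduces the statement construction from InsertPeople
// so the generated SQL and flattened parameter list can be inspected.
func buildBulkInsert(people []*Person) (string, []interface{}) {
	head := `INSERT INTO "person_table" ("name", "age", "job") VALUES `
	placeholders := make([]string, 0, len(people))
	params := make([]interface{}, 0, len(people)*3)
	for _, p := range people {
		// one (?, ?, ?) group and three values per row
		placeholders = append(placeholders, "(?, ?, ?)")
		params = append(params, p.Name, p.Age, p.Job)
	}
	return head + strings.Join(placeholders, ", "), params
}

func main() {
	query, params := buildBulkInsert([]*Person{
		{Name: "alice", Age: 30, Job: "dev"},
		{Name: "bob", Age: 41, Job: "ops"},
	})
	fmt.Println(query)
	fmt.Println(len(params)) // one value per placeholder: 6
}
```

The same pair would then go straight into db.Exec(query, params...).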
I don't think you are going to be able to do any kind of super-optimized bulk insert without placing a file on the server.
I am not sure whether the standard database/sql library supports it, but using the sqlx extension you can build a single INSERT statement with named bindvars that map to struct fields. You can then pass a slice of these structs to a method like NamedExec.
Something like this:
users := []User{
    {
        Name:  "alex",
        Email: "alex@example.com",
    },
    {
        Name:  "muhammed",
        Email: "muhammed@example.com",
    },
}

db.NamedExec("insert into users (NAME, EMAIL) values (:Name, :Email);", users)
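As a rough illustration of the idea behind NamedExec (this is not sqlx's actual implementation): each :Name-style parameter is rewritten to a driver placeholder, and its value is looked up per row, here from a plain map instead of a struct:

```go
package main

import (
	"fmt"
	"regexp"
)

// bindNamed rewrites :Name-style parameters to ? placeholders and
// returns the bound values for one row, looked up by parameter name.
// Sketch only; sqlx also handles struct fields, slices, and per-driver
// placeholder styles.
func bindNamed(query string, row map[string]interface{}) (string, []interface{}) {
	re := regexp.MustCompile(`:(\w+)`)
	var args []interface{}
	out := re.ReplaceAllStringFunc(query, func(m string) string {
		args = append(args, row[m[1:]]) // m[1:] strips the leading ':'
		return "?"
	})
	return out, args
}

func main() {
	q, args := bindNamed(
		"insert into users (NAME, EMAIL) values (:Name, :Email);",
		map[string]interface{}{"Name": "alex", "Email": "alex@example.com"},
	)
	fmt.Println(q)    // insert into users (NAME, EMAIL) values (?, ?);
	fmt.Println(args) // [alex alex@example.com]
}
```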

SQL Server OPENROWSET not pulling JSON Data

I am trying to save the JSON data 'WhitelistedOrigins' in a SQL table; however, I keep receiving NULL entries.
I have used the same method to import attributes into a table before and it worked, although that JSON was formatted differently.
JSON:
"Client": {
    "whiteListedOrigins": [
        "file://",
        "https://mobile.gtent.eu",
        "https://mobile.assists.co.uk",
        "https://valueadds3.active.eu",
        "https://flash3.active.eu",
        "https://valueadds3.assists.co.uk"
    ]
}
SQL:
DECLARE @JSON VARCHAR(MAX)

SELECT @JSON = BulkColumn
FROM OPENROWSET
    (BULK 'C:\config.json', SINGLE_CLOB)
AS A

UPDATE dbo.CommonBaseline
SET CommonBaseline.whiteListedOrigins = whiteListedOrigins
FROM OPENJSON (@JSON, '$.Client')
WITH (
    whiteListedOrigins VARCHAR(MAX))

Result: whiteListedOrigins comes back NULL.
You need to use OPENJSON() with an explicit schema and the AS JSON option in the column definition.
If you want to return a nested JSON fragment from a JSON property, you have to provide the AS JSON flag. Without this option, if the property can't be found, OPENJSON returns a NULL value instead of the referenced JSON object or array, or it returns a run-time error in strict mode.
Statement:
DECLARE @Json nvarchar(max) = N'{
    "Client": {
        "whiteListedOrigins": [
            "file://",
            "https://mobile.gtent.eu",
            "https://mobile.assists.co.uk",
            "https://valueadds3.active.eu",
            "https://flash3.active.eu",
            "https://valueadds3.assists.co.uk"
        ]
    }
}'

SELECT *
FROM OPENJSON (@Json, '$.Client') WITH (
    whiteListedOrigins nvarchar(MAX) AS JSON
)
Output:
--------------------
whiteListedOrigins
--------------------
[
"file://",
"https://mobile.gtent.eu",
"https://mobile.assists.co.uk",
"https://valueadds3.active.eu",
"https://flash3.active.eu",
"https://valueadds3.assists.co.uk"
]
Note: your JSON is not valid without the surrounding { and }.

How to get odata.nextLink from the returned JSON object using oracle's apex_json

I am retrieving data from an OData service. The response contains a link for loading more data, i.e. odata.nextLink, which I need to read in order to load more data. How do I do this using Oracle's apex_json?
The server's response is something like this:
{
    "odata.metadata":"http://localhost:60497/odata/$metadata#tables","value":[
        {"id":001,"name":"abc" }
        .
        .
    ],
    "odata.nextLink":"http://localhost:60497/odata/tables?$skip=10"
}
Normally I would parse the data and then retrieve the information like this: next_link := apex_json.get_varchar2('odata.nextLink'); but since the key contains a dot, that won't work.
In this case, apex_json.get_varchar2('odata.nextLink') interprets nextLink as a child of an odata object, as the following example shows:
DECLARE
    v_json     APEX_JSON.T_VALUES;
    v_str_json VARCHAR2(4000);
    next_link  VARCHAR2(4000);
BEGIN
    v_str_json := '{
        "odata" : {"nextLink" : "mylink"},
        "odata.metadata":"http://localhost:60497/odata/$metadata#tables","value":[
            {"id":001,"name":"abc" }
        ],
        "odata.nextLink":"http://localhost:60497/odata/tables?$skip=10"
    }';

    APEX_JSON.PARSE(v_json, v_str_json);
    next_link := APEX_JSON.GET_VARCHAR2(p_path => 'odata.nextLink', p_values => v_json);
    dbms_output.put_line(next_link);
END;
Output: mylink
I do not know how to make "odata.nextLink" be interpreted literally, and not as a path. If nobody knows a way, you can either:
1 - stop sending "odata." at the beginning of each attribute name; or
2 - use replace to strip the prefix before parsing.

Parsing a JSON to meet minimum requirements inside a stored procedure

I have been looking at what I believe to be every single page on SQL Server and half of Stack Overflow, and I can't find a proper solution to this.
Our challenge is to deal with an existing application that sends/receives JSON from SQL Server, so we have to build a STRONG JSON architecture on SQL Server.
We need to validate the format of the JSON (the legacy system has its own standard) so messages are in the exact expected format.
The thing is, the JSON functions are not as advanced as the XML ones, and there seems to be no way to validate a schema in SQL Server.
We tried with sp_prepare and sp_execute, but that does not seem to work.
We tested something like this:
Declare @ptSQL1 int;
Exec sp_prepare @ptSQL1 output,
    N'@P1 nvarchar(128), @json NVARCHAR(1000)',
    N'SELECT *
      INTO temp_tblPersons
      FROM OPENJSON (@json, ''$.root'')
      WITH (
          Cname NVARCHAR(100) ''strict $.FirstName'',
          Csurname NVARCHAR(100) ''lax $.surname''
      ) as J
      where Csurname like @P1';

DECLARE @json7 NVARCHAR(1000)
SET @json7 = N'{
    "root": [
        { "FirstName": "Charles", "surname": "perez" },
        { "FirstName": "Jade", "surname": "pelaz" },
        { "FirstName": "Jim", "surname": "alvarez" },
        { "FirstName": "Luke", "surname": "alonso" },
        { "FirstName": "Ken" }
    ]
}'

IF (@ptSQL1 = 0) PRINT 'THE SUPPLIED JSON IS NOT VALID'
ELSE Exec sp_execute @ptSQL1, N'a%', @json7;
but it does not follow the sp_prepare/sp_execute behavior.
Our intention is to validate a minimum schema before proceeding to process the data, and if the schema doesn't meet the standard, return an ERROR.
How can this be accomplished?
(Not sure where we read that @ptSQL1 = 0, but I believe we read it somewhere.)

Our intention is to validate a minimum schema before proceeding to process the data, and if the schema doesn't meet the standard, return an ERROR.
The JSON must be parsed in order to validate the schema. A prepare doesn't actually execute the query, so it won't parse the JSON document; plus, sp_prepare and sp_execute are internal API system stored procedures not intended to be called directly in T-SQL.
Although one can't currently validate JSON schema in T-SQL (without writing a custom SQLCLR assembly), you could just use TRY/CATCH and handle errors. The example below handles JSON errors differently, but I would personally just THROW all errors and handle specific ones in the app code.
DECLARE @json NVARCHAR(1000);
DECLARE @P1 NVARCHAR(128) = 'a%';

SET @json = N'{
    "root": [
        { "FirstName": "Charles", "surname": "perez" },
        { "FirstName": "Jade", "surname": "pelaz" },
        { "FirstName": "Jim", "surname": "alvarez" },
        { "FirstName": "Luke", "surname": "alonso" },
        { "FirstName": "Ken" }
    ]
}';

BEGIN TRY
    SELECT *
    INTO temp_tblPersons
    FROM OPENJSON (@json, '$.root')
    WITH (
        Cname NVARCHAR(100) 'strict $.FirstName',
        Csurname NVARCHAR(100) 'lax $.surname'
    ) AS J
    WHERE Csurname LIKE @P1;
END TRY
BEGIN CATCH
    DROP TABLE IF EXISTS temp_tblPersons;
    IF ERROR_MESSAGE() LIKE N'%JSON%'
    BEGIN
        PRINT 'THE SUPPLIED JSON IS NOT VALID';
    END
    ELSE
    BEGIN
        THROW;
    END;
END CATCH;

Extract specific values from JSON array

I have a JSON response that is formatted this way:
- Client 1
    - Date: 15.07.2017
    - Name: John
    - URL: www.google.com
- Client 2
    - Date: 15.07.2017
    - Name: Jane
    - URL: www.google.com
- Client N...
How could I extract only the Name & URL values from each client so I could add them to a listbox, for example? Also, please note that "Client 1" could be named otherwise, like "User 1" or just "1"; that's not important, but the code should extract the values regardless of the parent object name.
PS: Sorry for misleading; the JSON format above was pseudo-code from memory. The actual format is:
[
    {
        "date": "xxx",
        "name": "xxx",
        "url":  "xxx"
    },
    {
        "date": "xxx",
        "name": "xxx",
        "url":  "xxx"
    },
    {
        "date": "xxx",
        "name": "xxx",
        "url":  "xxx"
    }
]
Answer in case anybody is looking.
// Requires System.IOUtils, System.JSON and FMX.ListBox in the uses clause.
procedure Answer;
var
  JSON: string;
  ClientItem: TJSONValue;
  ClientList: TJSONArray;
  ListBoxItem: TListBoxItem;
begin
  JSON := TFile.ReadAllText('.\your-file.json');
  ClientList := TJSONObject.ParseJSONValue(JSON) as TJSONArray;
  if Assigned(ClientList) then
  try
    ListBox.Items.BeginUpdate;
    try
      for ClientItem in ClientList do
      begin
        ListBoxItem := TListBoxItem.Create(ListBox);
        ListBoxItem.StyleLookup := 'CustomListbox';
        ListBoxItem.StylesData['URL'] := ClientItem.GetValue<string>('url');
        ListBoxItem.StylesData['Name'] := ClientItem.GetValue<string>('name');
        ListBox.AddObject(ListBoxItem);
      end;
    finally
      ListBox.Items.EndUpdate;
    end;
  finally
    ClientList.Free;
  end;
end;