Database used:
I'm using an Oracle 19c database, so I tried to use the JSON functions already available in PL/SQL (for instance JSON_TABLE) to import JSON into a database table.
What I'm doing:
I'm just calling an API, getting JSON from it, and then I would like to import the data into the database, regardless of what the data is and in what structure it arrives.
Problem:
I would like to iterate over the JSON data without knowing the element names inside that JSON.
I would like to know where I actually am (the name of the current node), and the names of the child elements, so I could dynamically create tables from those names, add relations between them, and import all the data.
What I have tried:
So far I was doing it manually: I had to create the tables myself. Importing the data required knowledge of the object names, and also of the JSON structure that I wanted to import. It works, but oh well... I would like to create something more universal. All of this had to be done because I don't know of any way to walk through the structure of a JSON document without knowing the object names and, in general, the entire JSON structure.
Any ideas how to walk through a JSON structure without knowing the object names and the relations between them?
Learn the new PL/SQL JSON data structures: JSON Data Structures.
procedure parse_json(p_json in blob) is
  l_elem json_element_t := json_element_t.parse(p_json);
  l_obj  json_object_t;
  l_arr  json_array_t;
  l_keys json_key_list;
begin
  case
    when l_elem.is_object then
      l_obj  := treat(l_elem as json_object_t);
      l_keys := l_obj.get_keys;
      for i in 1..l_keys.count loop
        -- work with the keys
        if l_obj.get(l_keys(i)).is_object then
          null; -- this key holds an object
        end if;
        if l_obj.get(l_keys(i)).is_array then
          null; -- this key holds an array
        end if;
      end loop;
    when l_elem.is_array then
      l_arr := treat(l_elem as json_array_t);
      for i in 0..l_arr.get_size - 1 loop
        -- work with the array elements
        case
          when l_arr.get(i).is_scalar then
            if l_arr.get(i).is_string then
              null; -- handle a string value
            elsif l_arr.get(i).is_number then
              null; -- handle a number value
            elsif l_arr.get(i).is_timestamp then
              null; -- handle a timestamp value
            elsif l_arr.get(i).is_boolean then
              null; -- handle a boolean value
            end if;
          when l_arr.get(i).is_object then
            null; -- handle a nested object
          when l_arr.get(i).is_array then
            null; -- handle a nested array
        end case;
      end loop;
    else
      null; -- a top-level scalar
  end case;
end parse_json;
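The skeleton above only looks one level deep. To walk the entire document without knowing any names in advance, make it recursive. Here's a rough, untested sketch of that idea: it prints the path of every node with dbms_output (so at each step you know where you are and what the child element names are), but you could just as well collect the names to generate tables from them:

procedure walk(p_elem json_element_t, p_path in varchar2) is
  l_obj  json_object_t;
  l_arr  json_array_t;
  l_keys json_key_list;
begin
  if p_elem.is_object then
    l_obj  := treat(p_elem as json_object_t);
    l_keys := l_obj.get_keys;
    for i in 1..l_keys.count loop
      dbms_output.put_line(p_path || '.' || l_keys(i)); -- name of the current node
      walk(l_obj.get(l_keys(i)), p_path || '.' || l_keys(i)); -- recurse into the child
    end loop;
  elsif p_elem.is_array then
    l_arr := treat(p_elem as json_array_t);
    for i in 0..l_arr.get_size - 1 loop
      walk(l_arr.get(i), p_path || '[' || i || ']'); -- recurse into each element
    end loop;
  end if; -- scalars end the recursion
end walk;

Call it as walk(json_element_t.parse(p_json), '$') to dump every path in the document.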
You can also use the truly helpful JSON Data Guide and the DBMS_JSON package to map out the JSON object for you, and even automatically create a view using JSON_TABLE.
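For example, something along these lines (a rough sketch: the table my_json_tab and its JSON column json_doc are placeholder names of mine): gather a hierarchical data guide from the documents you have already loaded, then let DBMS_JSON build a relational view over them:

declare
  l_dg clob;
begin
  -- derive a data guide from the existing documents
  select json_dataguide(json_doc, dbms_json.format_hierarchical)
    into l_dg
    from my_json_tab;
  -- create a view that exposes the JSON as relational columns
  dbms_json.create_view('MY_JSON_VIEW', 'MY_JSON_TAB', 'JSON_DOC', l_dg);
end;
/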
Regards
I'm just new to APEX, PL/SQL and APIs/JSON, so please bear with me.
I need to create a search page where the data will be coming from the API.
I tried to do it with a Web Source module but unfortunately I got an error. I already checked with the DBA team, etc., and the error was still there; I think it's a version issue or something, so I dropped that idea, even though it would have helped me a lot.
So the workaround is that PL/SQL will connect to the API.
So it goes like this:
In APEX, I will input some data in the textbox, and when I click the search button it will fetch the data from the API into the interactive report.
UPDATED:
This is what I have, and I believe there's also a JSON conversion step that I need to do.
declare
  v_url         varchar2(1000);
  v_wallet_path varchar2(120) := '<walletvalue>';
  v_body        clob := '{<json body>}';
  l_response    clob;
begin
  apex_web_service.g_request_headers.delete;
  apex_web_service.g_request_headers(1).name  := 'Ocp-Apim-Subscription-Key';
  apex_web_service.g_request_headers(1).value := '<key value>';
  v_url := '<url>';
  l_response := apex_web_service.make_rest_request(
    p_url         => v_url,
    p_http_method => 'POST',
    p_wallet_path => v_wallet_path,
    p_wallet_pwd  => '<password>',
    p_body        => v_body);
  if apex_web_service.g_status_code = 200 then --OK
    null; --dbms_output.put_line(l_response);
  else --ERROR?
    dbms_output.put_line('ERROR');
  end if;
end;
Can someone please help me? I've been thinking about this for weeks and I don't know where to start. What are the things I need to have and to know, and what are the steps to create the page?
I know this is a lot, but I will really appreciate your help! Thanks in advance!
This is a very broad question, so my answer is also pretty vague.
I don't think you want to create a function - before the Web Source module was introduced, this kind of thing was often done in an on-submit page process. In your process you'd need to:
Call the web API, pass in your search term, and get back a response. The old way to do this was with UTL_HTTP, but the newer APEX_WEB_SERVICE package made it much easier.
Using APEX_COLLECTION, create/truncate a collection and save the response clob into the collection's clob001 field.
Edit: here's a code snippet for that
l_clob := apex_web_service.make_rest_request(....);
APEX_COLLECTION.CREATE_OR_TRUNCATE_COLLECTION(p_collection_name => 'API_RESPONSE');
APEX_COLLECTION.ADD_MEMBER(
  p_collection_name => 'API_RESPONSE',
  p_clob001         => l_clob);
Then create an interactive report. The source will be a SQL query which takes the collection's clob, parses it as JSON, and converts it into tabular format (rows and columns) using JSON_TABLE.
Edit: here's an example
SELECT jt.id, jt.name
FROM apex_collections c
CROSS JOIN JSON_TABLE(
  c.clob001, -- the following lines depend on your JSON structure
  '$[*]'
  columns(
    id number path '$.id',
    name varchar2(10) path '$.name')
) jt
WHERE c.collection_name = 'API_RESPONSE'
Alternatively, you could parse the clob using JSON_TABLE as part of your page process, save the output table into a collection using APEX_COLLECTION.CREATE_COLLECTION_FROM_QUERY, and then just query that collection for your interactive report.
Edit: I'm not sure if this would work, but something like:
APEX_COLLECTION.CREATE_COLLECTION_FROM_QUERY (
  p_collection_name => 'API_RESPONSE',
  p_query => 'SELECT t.id, t.name
              FROM JSON_TABLE(
                l_clob,
                ''$[*]''
                columns(
                  id number path ''$.id'',
                  name varchar2(10) path ''$.name'')
              ) t');
Side note: as a very different option, you could also call the web service using JavaScript/jQuery/AJAX. I think this would be more complicated, but it's technically possible.
I am using a stored procedure to INSERT into and UPDATE a number of tables. Some of the data is derived from a JSON parameter.
Although I have successfully used json_to_recordset() to extract named data from the JSON parameter, I cannot figure out how to use it in an UPDATE statement. Also, I need to use some items of data from the JSON parameter a number of times.
Q: Is there a way to use json_to_recordset() to extract named data into a temporary table so I can reuse the data items throughout my stored procedure? Or maybe I should SELECT INTO variables within the stored procedure?
Q: Failing that, can anyone please provide a simple example of how to update a table using data returned from json_to_recordset()? I must also include data not from the JSON parameter, such as now()::timestamp(0).
This is how I have used json_to_recordset() so far:
INSERT INTO myRealTable (
    rec_timestamp,
    rec_data1,
    rec_data2
)
SELECT
    now()::timestamp(0),
    x.json_data1,
    x.json_data2
FROM json_to_recordset(json_parameter) AS x
(
    json_data1 int,
    json_data2 boolean
);
Thank you.
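For the UPDATE part, one way to cover both questions at once is to extract the recordset a single time in a CTE and join it in an UPDATE ... FROM. A rough sketch (assuming this runs inside a plpgsql function where json_parameter is available, and that myRealTable has a rec_id key column to join on; both assumptions are mine):

WITH x AS (
    SELECT *
    FROM json_to_recordset(json_parameter) AS t (
        rec_id     int,
        json_data1 int,
        json_data2 boolean
    )
)
UPDATE myRealTable AS r
SET rec_timestamp = now()::timestamp(0),
    rec_data1     = x.json_data1,
    rec_data2     = x.json_data2
FROM x
WHERE r.rec_id = x.rec_id;

If you need the extracted rows in several statements, a CREATE TEMPORARY TABLE ... AS SELECT over the same json_to_recordset() call would let you reuse them for the rest of the procedure.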
I am using Delphi 7, Windows 7 and Absolute Database.
Quick background: I work for a charity shop that relies on items being donated to us. To reclaim 'Gift Aid' from HMRC we have to submit details of all sales together with the name and address of the donator of each sale. We help people with special needs, so accurate data input is important.
Up to now, post code verification was easy (based on our local area): basically the format AA00_0AA or AA0_0AA. As our name has become better known, not all post codes follow these rules.
I have access to the UK Royal Mail database of addresses in the UK, so I thought to actually compare the inputted post code with a real post code. The RM CSV file is huge, so I use GSplit3 to break it down into more manageable files. This leaves me with 492 CSV files, each consisting of about 62000 lines. Note that I am only interested in the post codes, so there is massive duplication.
To load these files into a dataset (without duplication) I first loaded the file names into a listbox and ran a loop to iterate through all the files and copy them to the dataset. To avoid duplication I tried putting a unique index on the field, but even running outside of Delphi I still got an error message about duplication. I then tried capturing the text of the last record to be appended and comparing it with the next record:
procedure TForm1.importClick(Sender: TObject);
var
  i: Integer;
  lstfile: string;
begin
  for i := 0 to ListBox1.Items.Count - 1 do
  begin
    lstfile := '';
    cd.Active := False; // cd is a csv dataset loaded with the csv file
    cd.FileName := 'C:\paf 112018\CSV PAF\' + ListBox1.Items[i];
    cd.Active := True;
    while not cd.Eof do
    begin
      if lstfile = cd.Fields[0].AsString then
        cd.Next
      else
        table1.Append;
      table1.Fields[0].AsString := cd.Fields[0].AsString;
      lstfile := cd.Fields[0].AsString;
      cd.Next;
    end;
  end;
  table1.Edit;
  table1.Post;
end;
This seemed to work OK, although the total number of records in the dataset seemed low. I checked for my own post code and it wasn't there, although another post code was located. So obviously records had been skipped. I then tried loading the CSV file into a string list with dupIgnore, then copying the string list to the dataset.
unit Unit1;

interface

uses
  Windows, Messages, SysUtils, Variants, Classes, Graphics, Controls, Forms,
  Dialogs, Grids, DBGrids, SMDBGrid, StdCtrls, DB, ABSMain, SdfData;

type
  TForm1 = class(TForm)
    cd: TSdfDataSet;
    dscd: TDataSource;
    dst: TDataSource;
    ABSDatabase1: TABSDatabase;
    table1: TABSTable;
    table1PostCode: TStringField;
    Label2: TLabel;
    ListBox1: TListBox;
    getfiles: TButton;
    import: TButton;
    procedure getfilesClick(Sender: TObject);
    procedure importClick(Sender: TObject);
  private
    { Private declarations }
  public
    { Public declarations }
  end;

var
  Form1: TForm1;
  num: Integer;
implementation
{$R *.dfm}
procedure ListFileDir(Path: string; FileList: TStrings);
var
  SR: TSearchRec;
begin
  if FindFirst(Path + '*.csv', faAnyFile, SR) = 0 then
  begin
    repeat
      if (SR.Attr <> faDirectory) then
      begin
        FileList.Add(SR.Name);
      end;
    until FindNext(SR) <> 0;
    FindClose(SR);
  end;
end;
procedure TForm1.getfilesClick(Sender: TObject);
begin //Fill listbox with csv files
  ListFileDir('C:\paf 112018\CSV PAF\', ListBox1.Items);
end;
//start to iterate through files to append to dataset
procedure TForm1.importClick(Sender: TObject);
var
  i, y: Integer;
  myl: TStringList;
begin
  for i := 0 to ListBox1.Items.Count - 1 do
  begin
    myl := TStringList.Create;
    myl.Sorted := True;
    myl.Duplicates := dupIgnore;
    cd.Active := False;
    cd.FileName := 'C:\paf 112018\CSV PAF\' + ListBox1.Items[i];
    cd.Active := True;
    while not cd.Eof do
    begin
      if (cd.Fields[0].AsString = '') then
        cd.Next
      else
        myl.Add(cd.Fields[0].AsString);
      cd.Next;
    end;
    for y := 0 to myl.Count - 1 do
    begin
      table1.Append;
      table1.Fields[0].AsString := myl.Strings[y];
    end;
    myl.Destroy;
  end;
  table1.Edit;
  table1.Post;
end;
procedure TForm1.Button1Click(Sender: TObject);
begin
  table1.Locate('Post Code', edit1.Text, []);
end;
procedure TForm1.Button2Click(Sender: TObject);
var
  sel: string;
begin
  q.Close;
  q.SQL.Clear;
  q.SQL.Add('Select * from postc where [Post Code] like :sel');
  q.ParamByName('sel').AsString := Edit1.Text;
  q.Active := True;
end;
end.
This starts well but soon starts to slow down, I presume because of memory leaks. I have tried myl.Free, FreeAndNil(myl) and finally Destroy, but they all slow down quickly. I am not an expert, but I do enjoy using Delphi and normally manage to solve problems through your pages or by googling; this time, though, I am stumped. Can anyone suggest a better method, please?
The following shows how to add postcodes from a file containing a list of them to a query-type DataSet such as TAdoQuery or your q dataset (your code doesn't say what type q is, as far as I can see).
It also seems from what you say that although you treat your postcode file as a CSV file, you shouldn't actually need to: if the records contain only one field, there should be no need for commas, because there is nothing to separate; the file should simply contain one postcode per line. Consequently, there doesn't seem to be any need to incur the overhead of loading it as a CSV file, so you should be able simply to load it into a TStringList and add the postcodes from there.
I'm not going to attempt to correct your code, just show a very simple example of how I think it should be done.
So the following code opens a postcode list file, which is assumed to contain one postcode per line, checks whether each entry in it already exists in your table of postcodes, and adds it if not.
procedure TForm1.AddPostCodes(const PostCodeFileName: String);
// The following shows how to add postcodes to a table of existing ones
// from a file named PostCodeFileName which should include an explicit path
var
  PostCodeList: TStringList;
  PostCode: String;
  i: Integer;
begin
  PostCodeList := TStringList.Create;
  try
    PostCodeList.LoadFromFile(PostCodeFileName);
    if qPostCodes.Active then
      qPostCodes.Close;
    qPostCodes.SQL.Text := 'select * from postcodes order by postcode';
    qPostCodes.Open;
    try
      qPostCodes.DisableControls; // Always call DisableControls + EnableControls
      // when iterating a dataset which has db-aware controls connected to it
      for i := 0 to PostCodeList.Count - 1 do
      begin
        PostCode := PostCodeList[i]; // the PostCode local variable is just to assist debugging
        if not qPostCodes.Locate('PostCode', PostCode, [loCaseInsensitive]) then
          qPostCodes.InsertRecord([PostCode]); // InsertRecord does not need to be followed by a call to Post.
      end;
    finally
      qPostCodes.EnableControls;
      qPostCodes.Close;
    end;
  finally
    PostCodeList.Free; // Don't use .Destroy!
  end;
end;
Btw, regarding the comment about bracketing an iteration of a dataset with calls to DisableControls and EnableControls: the usual reason for doing this is to avoid the overhead of updating the GUI's display of any db-aware controls connected to the dataset. However, one of the reasons I'm not willing to speculate about what is causing your slowdown is that TAdoQuery, which is one of the standard dataset types that comes with Delphi, benefits massively from DisableControls/EnableControls even when there are NO db-aware controls connected to it. This is because of a quirk in the coding of TAdoQuery. Whatever dataset you are using may have a similar quirk.
I'm creating a CRUD program using Go and I have quite a big struct with over 70 fields that I want to add to a MySQL database.
I was wondering if there's a way to automatically map the struct into my database, so I wouldn't have to create the table manually; it would just copy my struct.
I haven't found a way to totally automate that process, but at least you can create them using tags and only a little bit of code.
Workaround example:
There are some GitHub projects in the wild which help you achieve this.
For example structable.
You'd have to add tags to your struct's members.
Example from GitHub:
type Stool struct {
    Id       int    `stbl:"id, PRIMARY_KEY, AUTO_INCREMENT"`
    Legs     int    `stbl:"number_of_legs"`
    Material string `stbl:"material"`
    Ignored  string // will not be stored. No tag.
}
When you have that part, you can bind the struct to a table and insert it like in the following example (also from the GitHub page):
stool := new(Stool)
stool.Material = "Wood"
db := getDb() // Get a sql.Db. You're on the hook to do this part.
// Create a new structable.Recorder and tell it to
// bind the given struct as a row in the given table.
r := structable.New(db, "mysql").Bind("test_table", stool)
// This will insert the stool into the test_table.
err := r.Insert()
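structable's Recorder binds and inserts rows, but if you also want the CREATE TABLE statement itself generated from the same tags, a small reflection helper can derive it. A minimal sketch (my own code, not part of structable; the Go-type-to-SQL-type mapping is a bare-bones assumption):

package main

import (
    "fmt"
    "reflect"
    "strings"
)

// Stool matches the tagged struct from the structable example above.
type Stool struct {
    Id       int    `stbl:"id, PRIMARY_KEY, AUTO_INCREMENT"`
    Legs     int    `stbl:"number_of_legs"`
    Material string `stbl:"material"`
    Ignored  string // no tag, so skipped
}

// createTableSQL derives a CREATE TABLE statement from `stbl` tags.
func createTableSQL(table string, v interface{}) string {
    t := reflect.TypeOf(v)
    var cols []string
    for i := 0; i < t.NumField(); i++ {
        tag := t.Field(i).Tag.Get("stbl")
        if tag == "" {
            continue // untagged fields are not stored
        }
        parts := strings.Split(tag, ",")
        def := strings.TrimSpace(parts[0])
        // map Go types to SQL types (deliberately minimal)
        switch t.Field(i).Type.Kind() {
        case reflect.Int, reflect.Int32, reflect.Int64:
            def += " INT"
        case reflect.String:
            def += " VARCHAR(255)"
        default:
            def += " TEXT"
        }
        hasPK, hasAI := false, false
        for _, p := range parts[1:] {
            switch strings.TrimSpace(p) {
            case "PRIMARY_KEY":
                hasPK = true
            case "AUTO_INCREMENT":
                hasAI = true
            }
        }
        if hasAI {
            def += " AUTO_INCREMENT"
        }
        if hasPK {
            def += " PRIMARY KEY"
        }
        cols = append(cols, def)
    }
    return fmt.Sprintf("CREATE TABLE %s (%s)", table, strings.Join(cols, ", "))
}

func main() {
    fmt.Println(createTableSQL("test_table", Stool{}))
}

Run against the Stool struct above, this prints CREATE TABLE test_table (id INT AUTO_INCREMENT PRIMARY KEY, number_of_legs INT, material VARCHAR(255)), which you could then execute via db.Exec before using the Recorder.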
Try gen-model:
go get -u github.com/DaoYoung/gen-model
gen-model init # then set values in .gen-model.yaml
gen-model create
Done.
Trying to use an XML column in a SQL Server 2008 database is something new for me. I have seen a few posts related to this, but I am finding it difficult.
Let me put my question in its simplest form.
I have a database table dbo.cXML that has the columns EmailID (nvarchar(128)), ClientID (int) and cycleXML (XML).
My middleware component implements complex business logic and spits out the XML after processing.
Now I have a requirement that needs the following:
a) A stored procedure with parameters to check the above table to see if there is already an XML for a given EmailID and ClientID. If a record exists, use an UPDATE query to update the entire XML; otherwise simply insert the XML.
b) A stored procedure should be able to send the complete XML back to my middleware component on request.
Can someone please help me with the pseudo code? I appreciate your help.
Thanks,
Yagya
Not totally sure what you mean by requirement b), but does this help you?
IF EXISTS (SELECT 1 FROM dbo.cXML WHERE EmailID = @YourEmailID AND ClientID = @YourClientID)
BEGIN
    UPDATE dbo.cXML
    SET cycleXML = @YourXML
    WHERE EmailID = @YourEmailID AND ClientID = @YourClientID
END
ELSE BEGIN
    INSERT INTO dbo.cXML (EmailID, ClientID, cycleXML)
    SELECT @YourEmailID, @YourClientID, @YourXML
END
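As for requirement b), if it just means sending the complete XML back for a given key, a plain SELECT in a stored procedure should do; something like this (procedure and parameter names are mine):

CREATE PROCEDURE dbo.GetCycleXML
    @EmailID nvarchar(128),
    @ClientID int
AS
BEGIN
    SET NOCOUNT ON;
    -- return the complete XML for the given EmailID/ClientID
    SELECT cycleXML
    FROM dbo.cXML
    WHERE EmailID = @EmailID
      AND ClientID = @ClientID;
END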