I'm a newbie to SQL Server. I have a table Accounts which is defined as:
OrganizationId int,
AccountDetails varchar(max)
The AccountDetails column contains XML data.
The data in the table looks like this:
1 | <Account><Id>100</Id><Name>A</Name></Account>
2 | <Account><Id>200</Id><Name>B</Name></Account>
3 | <Account><Id>300</Id><Name>C</Name></Account>
4 | <Account><Id>400</Id><Name>D</Name></Account>
I need to write a SQL query to get the records from this table where the account Id is 200 or 400.
The query should return two rows (#2 and #4) in JSON format, like this:
result1: { "account_id": 200, "account_name": "B" }
result2: { "account_id": 400, "account_name": "D" }
I'm wondering how to go about this.
Thank you.
For #1 above (the filtering), should I be trying to cast the AccountDetails column to XML and then use the nodes() feature for querying/filtering?
For #2 (the JSON output), should I be writing a SQL function that queries the XML and builds the JSON as needed?
As already mentioned, it is much better to use a proper XML data type for the AccountDetails column.
Please try the following solution. It works on SQL Server 2016 and later.
SQL
-- DDL and sample data population, start
DECLARE @tbl TABLE (OrganizationId INT IDENTITY PRIMARY KEY, AccountDetails NVARCHAR(MAX));
INSERT @tbl (AccountDetails) VALUES
('<Account><Id>100</Id><Name>A</Name></Account>'),
('<Account><Id>200</Id><Name>B</Name></Account>'),
('<Account><Id>300</Id><Name>C</Name></Account>'),
('<Account><Id>400</Id><Name>D</Name></Account>');
-- DDL and sample data population, end
;WITH rs AS
(
    SELECT t.OrganizationId
         , account_id   = x.value('(/Account/Id/text())[1]', 'INT')
         , account_name = x.value('(/Account/Name/text())[1]', 'VARCHAR(20)')
    FROM @tbl AS t
    CROSS APPLY (VALUES (TRY_CAST(AccountDetails AS XML))) AS t1(x)
)
SELECT *
     , JSONData = (SELECT rs.account_id, rs.account_name FOR JSON PATH, WITHOUT_ARRAY_WRAPPER)
FROM rs
WHERE rs.account_id IN (200, 400);
Output:

OrganizationId  account_id  account_name  JSONData
--------------  ----------  ------------  -------------------------------------
2               200         B             {"account_id":200,"account_name":"B"}
4               400         D             {"account_id":400,"account_name":"D"}
We have JSON entries in an MS SQL database. I would like to export the JSON entries in the "Data" column that match a list of EMPNO values. Could someone please help?
ColumnName: Data
Data inside the column:
output:{
"Request":{
"Person":{
"DisplayName":"Test User",
"EMPNO":"000001",
"Entity":"01",
"Country":"AA"
},
"DomainOverride":null,
"ReasonForGen":"Only",
"Email":"123#test.com",
"CurrentSIP":"123#test.com"
},
"EmailAddress":"123#testcom",
"SIPAddress":"123#test.com",
"Status":"NoChange"
}
The query I need, in layman's terms:
select DisplayName,EMPNO,Entity,Country,DomainOverride,ReasonForGen,Email
from Table1
where data.output.Request.EMPNO in ([EMPNO list])
You can use JSON_VALUE. Something like this:
select JSON_VALUE(data,'$.output.Request.Person.DisplayName'), ...
from Table1
where JSON_VALUE(data,'$.output.Request.Person.EMPNO') in ([EMPNO list])
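A fuller sketch of the same idea, assuming the Table1/Data names from the question, that the column holds valid JSON with a top-level "output" property, and placeholder EMPNO values:

-- JSON path keys are case-sensitive, so they must match the stored document
SELECT JSON_VALUE(Data, '$.output.Request.Person.DisplayName') AS DisplayName,
       JSON_VALUE(Data, '$.output.Request.Person.EMPNO')       AS EMPNO,
       JSON_VALUE(Data, '$.output.Request.Person.Entity')      AS Entity,
       JSON_VALUE(Data, '$.output.Request.Person.Country')     AS Country,
       JSON_VALUE(Data, '$.output.Request.DomainOverride')     AS DomainOverride,
       JSON_VALUE(Data, '$.output.Request.ReasonForGen')       AS ReasonForGen,
       JSON_VALUE(Data, '$.output.Request.Email')              AS Email
FROM Table1
WHERE JSON_VALUE(Data, '$.output.Request.Person.EMPNO') IN ('000001', '000002'); -- placeholder EMPNO list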
You may try to use OPENJSON() to parse the JSON text and get objects and values from the JSON input as a table. You need to use OPENJSON() with explicit schema in the WITH clause to define the columns:
Table:
CREATE TABLE Data (
JsonData nvarchar(max)
)
INSERT INTO Data
(JsonData)
VALUES
(N'{
"output":{
"Request":{
"Person":{
"DisplayName":"Test User",
"EMPNO":"000001",
"Entity":"01",
"Country":"AA"
},
"DomainOverride":null,
"ReasonForGen":"Only",
"Email":"123#test.com",
"CurrentSIP":"123#test.com"
},
"EmailAddress":"123#testcom",
"SIPAddress":"123#test.com",
"Status":"NoChange"
}
}')
Statement:
SELECT
j.DisplayName,
j.EMPNO,
j.Entity,
j.Country,
j.DomainOverride,
j.ReasonForGen,
j.Email
FROM Data d
CROSS APPLY OPENJSON(d.JsonData) WITH (
DisplayName nvarchar(50) '$.output.Request.Person.DisplayName',
EMPNO nvarchar(50) '$.output.Request.Person.EMPNO',
Entity nvarchar(50) '$.output.Request.Person.Entity',
Country nvarchar(50) '$.output.Request.Person.Country',
DomainOverride nvarchar(50) '$.output.Request.DomainOverride',
ReasonForGen nvarchar(50) '$.output.Request.ReasonForGen',
Email nvarchar(50) '$.output.Request.Email'
) j
-- Use additional WHERE clause
--WHERE j.EMPNO IN ('000001', '000002')
Result:
DisplayName  EMPNO   Entity  Country  DomainOverride  ReasonForGen  Email
-----------  ------  ------  -------  --------------  ------------  ------------
Test User    000001  01      AA       NULL            Only          123@test.com
I have data like this:
I want to query result like this:
Here is my code
SELECT
     PML_CODE
    ,PML_NAME_ENG
    ,(
        SELECT
             PML_ID
            ,PML_NO
            ,PML_CODE
            ,PML_NAME_ENG
            ,PML_FORMULA
        FROM DSP.PARAMET_LIST AS A WITH(NOLOCK)
        WHERE A.PML_ID = B.PML_ID
        FOR JSON PATH, WITHOUT_ARRAY_WRAPPER
     ) AS BR_OBJECT
FROM DSP.PARAMET_LIST AS B WITH(NOLOCK)
My code does what I want, but is there a better, faster way to write this query?
Next time, please do not post pictures. Instead, provide some DDL, fill it with sample data, and state your own attempts and the expected output. That makes it easier for us to understand and answer your question.
You can try it like this:
DECLARE @tbl TABLE(PML_ID BIGINT, PML_NO INT, PML_CODE VARCHAR(10), PML_NAME_ENG VARCHAR(10), PML_FORMULA VARCHAR(10));
INSERT INTO @tbl VALUES
(2017102600050,1,'KHR','Riel','01')
,(2017102600051,2,'USD','Dollar','02')
,(2017102600052,3,'THB','Bath','05')
SELECT
     PML_CODE
    ,PML_NAME_ENG
    ,BR_OBJECT
FROM @tbl
CROSS APPLY
(
    SELECT
    (
        SELECT
             PML_ID
            ,PML_NO
            ,PML_CODE
            ,PML_NAME_ENG
            ,PML_FORMULA
        FOR JSON PATH, WITHOUT_ARRAY_WRAPPER
    )
) AS A(BR_OBJECT);
The big difference from your approach is that the CROSS APPLY works over the columns we already have, instead of calling a correlated sub-query against the table a second time.
You can just concatenate the values. Be sure to cast the integers and to handle NULL values. For example, if a column value is NULL there are two options: omit the property, or add the property with a JSON null.
For SQL Server 2016 SP1 and later you can use FOR JSON. Basically, you should end up with something like this:
DECLARE @DataSource TABLE
(
[PML_ID] VARCHAR(64)
,[PML_NO] INT
,[PML_CODE] VARCHAR(3)
,[PML_NAME_ENG] NVARCHAR(32)
,[PML_FORMULA] VARCHAR(2)
);
INSERT INTO @DataSource ([PML_ID], [PML_NO], [PML_CODE], [PML_NAME_ENG], [PML_FORMULA])
VALUES ('201710260000000050', 1, 'KHR', 'Riel', '01')
      ,('201710260000000051', 2, 'USD', 'Dollar', '02')
      ,('201710260000000052', 3, 'THB', 'Bath', '05');
SELECT [PML_CODE]
,[PML_NAME_ENG]
,'{"PML_ID":'+ [PML_ID] +',"PML_NO":'+ CAST([PML_NO] AS VARCHAR(12)) +',"PML_CODE":'+ [PML_CODE] +',"PML_NAME_ENG":'+ [PML_NAME_ENG] +',"PML_FORMULA":'+ [PML_FORMULA] +'}' AS [BR_OBJECT]
FROM #DataSource;
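For the NULL case mentioned above, here is a minimal sketch of the second option (keep the property and emit a JSON null); ISNULL catches the NULL produced by concatenating with a NULL value:

-- '"' + NULL + '"' evaluates to NULL, which ISNULL then turns into the literal text null
SELECT [PML_CODE]
      ,[PML_NAME_ENG]
      ,'{"PML_CODE":"' + [PML_CODE] + '","PML_FORMULA":'
        + ISNULL('"' + [PML_FORMULA] + '"', 'null') + '}' AS [BR_OBJECT]
FROM @DataSource;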
-- SQL Server 2016 SP1 and later
SELECT DS1.[PML_CODE]
      ,DS1.[PML_NAME_ENG]
      ,DS.[BR_OBJECT]
FROM @DataSource DS1
CROSS APPLY
(
    SELECT *
    FROM @DataSource DS2
    WHERE DS1.[PML_CODE] = DS2.[PML_CODE]
      AND DS1.[PML_NAME_ENG] = DS2.[PML_NAME_ENG]
    FOR JSON AUTO
) DS ([BR_OBJECT]);
Background
I need to fetch a few thousands rows from Oracle and convert them to JSON for use in SlickGrid.
Currently I am fetching the rows in PHP, converting them from ISO to UTF-8 with iconv, and exporting to JSON with json_encode. The whole operation takes about 1 second on the DB side and 5 seconds to generate the JSON. That is way too long.
The question
I have read that Oracle 12c supports JSON, but I cannot find exactly what I need.
Is there a way to return the result of a standard sql query in a json format?
Ideally, I would like to issue a query similar to this:
SELECT * from table AS JSON
and receive valid JSON similar to this:
[{"col1": "value1", "col2": 2}, {"col1": "valueOfRow2", "col2": 3}]
An important thing is that I need the Unicode sequences escaped for me, as I use the ISO-8859-2 charset on the client side, and the JSON has to be either in UTF-8 or have the sequences escaped.
Oracle 12c version 12.1.0.2 (the latest version as of 11.11.2014) adds JSON support:
https://docs.oracle.com/database/121/NEWFT/chapter12102.htm#BGBGADCC
It's been available since October 17th. https://blogs.oracle.com/db/entry/oracle_database_12c_release_1
If you are unable to patch/work with that version there is an excellent package written by Lewis Cunningham and Jonas Krogsboell: PL/JSON
* http://pljson.sourceforge.net/
It's an excellent package (I have used it in numerous database installations).
The examples included are good and cover most scenarios.
declare
ret json;
begin
ret := json_dyn.executeObject('select * from tab');
ret.print;
end;
/
12cR2 (available in the Oracle Cloud) supports this natively.
SQL> select JSON_ARRAY(EMPLOYEE_ID, FIRST_NAME,LAST_NAME) from HR.EMPLOYEES;
JSON_ARRAY(EMPLOYEE_ID,FIRST_NAME,LAST_NAME)
--------------------------------------------------------------------------------
[100,"Steven","King"]
[101,"Neena","Kochhar"]
or
SQL> select JSON_OBJECT('ID' is EMPLOYEE_ID , 'FirstName' is FIRST_NAME,'LastName' is LAST_NAME) from HR.EMPLOYEES;
JSON_OBJECT('ID'ISEMPLOYEE_ID,'FIRSTNAME'ISFIRST_NAME,'LASTNAME'ISLAST_NAME)
----------------------------------------------------------------------------
{"ID":100,"FirstName":"Steven","LastName":"King"}
{"ID":101,"FirstName":"Neena","LastName":"Kochhar"}
The 12.2 release includes new capabilities for generating JSON documents directly from SQL queries. The easiest way to achieve this is to use the functions JSON_OBJECT and JSON_ARRAYAGG.
create table tab as
select level col1, 'value '||level col2 from dual connect by level <= 2
/
select max (rownum) rn, json_arrayagg (
json_object (
key 'col1' value col1,
key 'col2' value col2
) format json returning clob
) as json_doc
from tab;
Result:
RN JSON_DOC
---------- ---------------------------------------------------------
2 [{"col1":1,"col2":"value 1"},{"col1":2,"col2":"value 2"}]
Test with large amount of data:
select rn, length (json_doc) json_size, json_doc from (
<query mentioned above here>
cross join (select dummy from dual connect by level <= 1e5)
);
RN JSON_SIZE JSON_DOC
---------- ---------- ---------------------------------------------------------
200000 5600001 [{"col1":1,"col2":"value 1"},{"col1":2,"col2":"value 2"},
On the slow test machine it took about 1 second to create 5.6 MB of JSON.
In release 19c the syntax of the JSON_OBJECT function is simplified.
The query above now looks like this:
select json_arrayagg (
json_object (*) returning clob
) as json_doc
from tab;
On Live SQL.
You can use XMLType to convert the result of a SQL query into XML and then JSON. See the following article for a solution that works for Oracle since version 9. You can also download the package itstar_xml_util:
http://stefan-armbruster.com/index.php/12-it/pl-sql/12-oracle-xml-and-json-goodies
A simple example with the emp table:
declare
    l_sql_string varchar2(2000);
    l_xml        xmltype;
    l_json       xmltype;
begin
    l_sql_string := 'select a.empno, a.ename, a.job from emp a';

    -- Create the XML from SQL
    l_xml := itstar_xml_util.sql2xml(l_sql_string);

    -- Display the XML
    dbms_output.put_line(l_xml.getclobval());

    l_json := itstar_xml_util.xml2json(l_xml);

    -- Display the JSON
    dbms_output.put_line(l_json.getclobval());
end;
The result looks like this:
{"ROWSET": [
{
"EMPNO": 7839,
"ENAME": "KING",
"JOB": "PRESIDENT"
},
{
"EMPNO": 7698,
"ENAME": "BLAKE",
"JOB": "MANAGER"
},
[...]
{
"EMPNO": 7934,
"ENAME": "MILLER",
"JOB": "CLERK"
}
]}
Starting with Oracle 19c, the syntax to construct a JSON representation for a row of a table is simplified.
For example, to convert all the rows of hr.employees to separate JSON documents, use:
SELECT JSON_OBJECT(*) FROM hr.employees ;
{
"EMPLOYEE_ID" : 100,
"FIRST_NAME" : "Steven",
"LAST_NAME" : "King",
"EMAIL" : "SKING",
"PHONE_NUMBER" : "515.123.4567",
"HIRE_DATE" : "2003-06-17T00:00:00",
"JOB_ID" : "AD_PRES",
"SALARY" : 24000,
"COMMISSION_PCT" : null,
"MANAGER_ID" : null,
"DEPARTMENT_ID" : 90
} --row 1
{
"EMPLOYEE_ID" : 101,
"FIRST_NAME" : "Neena",
"LAST_NAME" : "Kochhar",
"EMAIL" : "NKOCHHAR",
"PHONE_NUMBER" : "515.123.4568",
"HIRE_DATE" : "2005-09-21T00:00:00",
"JOB_ID" : "AD_VP",
"SALARY" : 17000,
"COMMISSION_PCT" : null,
"MANAGER_ID" : 100,
"DEPARTMENT_ID" : 90
} --row 2
...
LIVE SQL example
Oracle 12c support for JSON is the ability to store JSON objects, query them and select from them.
You have data in tabular format and only need to display it as JSON. So you can simply concatenate rows into {'col1': 'rowN1', 'col2': 'rowN2'} and do the rest on the client side.
Or you can use LISTAGG to get the whole document. Example:
http://technology.amis.nl/2011/06/14/creating-json-document-straight-from-sql-query-using-listagg-and-with-clause/
Just mind the SQL VARCHAR2 limit of 4000 characters.
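A minimal LISTAGG sketch along those lines, assuming the two-column tab table used in the examples above; note it does not escape quotes inside the values and the aggregated string is subject to the 4000-character limit just mentioned:

-- Build one JSON object per row, aggregate with a comma separator, wrap in [ ]
select '[' ||
       listagg('{"col1":"' || col1 || '","col2":"' || col2 || '"}', ',')
         within group (order by col1) ||
       ']' as json_doc
from tab;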
You could also look into http://database-geek.com/2009/03/25/json-in-and-out-of-oracle-json-data-type/, but I don't think an Oracle object type will improve your performance.
Another approach is to export XML using XMLType and then convert the XML to JSON. XMLType takes care of special characters, and the API is quite stable (you will not need to rewrite your program for Oracle 14).
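For the XML half of that approach, a hedged sketch using the built-in DBMS_XMLGEN package; the XML-to-JSON step still needs a converter such as the xml2json routine shown in the itstar_xml_util answer above:

-- Returns a CLOB of the form <ROWSET><ROW>...</ROW></ROWSET> for the given query
select dbms_xmlgen.getxml('select col1, col2 from tab') as xml_doc
from dual;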
To add to the answers for Oracle 12.2, you can create the JSON you want like this:
SELECT JSON_ARRAY(
JSON_OBJECT (
KEY 'number' VALUE s.number,
KEY 'name' VALUE s.sname,
KEY 'location' VALUE s.loc
)
) AS student_det
FROM student s;
Just try this:
:) life is happy
with data as (
    select xmlelement(e, regexp_replace('{"name":"' || colname || '"}', '[[:cntrl:]]', ''), ',') col1
    from tblname
)
select rtrim(replace(replace(replace(xmlagg(col1).getclobval(), '&' || 'quot;', '"'), '<E>', ''), '</E>', ''), ',') as very_long_json
from data;
I tested this in 19c:
SQL> select JSON_OBJECT(*) from HR.EMPLOYEES;
------------------------------------------------------------------------------
{"ID":100,"FirstName":"Steven","LastName":"King", ...}
{"ID":101,"FirstName":"Neena","LastName":"Kochhar", ...}
Or:
SQL> select json_arrayagg(JSON_OBJECT(*) returning clob ) from HR.EMPLOYEES;
------------------------------------------------------------------------------
[ {"ID":100,"FirstName":"Steven","LastName":"King", ...},{"ID":101,"FirstName":"Neena","LastName":"Kochhar", ...}]
I do not see a Python solution here (in case you need to dump the JSON).
I wrote json-ora-extract for medium-sized extracts (the data set has to fit in available memory).
It uses the cx_Oracle and json Python modules to read data from an Oracle database (any version) and dump it into a *.json file.
There is also an option to create a compressed *.gz file.