I need to convert the following JSON to CSV in ABAP ADT.
data(rep) = {
"name": "John",
"age": "22",
"gender": "male",
}
{
"name": "ram",
"age": "21",
"gender": "male",
}
{
"name": "Janu",
"age": "22",
"gender": "female",
}
Which function is used to convert it to CSV?
I would recommend first parsing the JSON string into an internal table and then creating the CSV from that internal table.
TYPES: BEGIN OF t_person,
name TYPE string,
age TYPE i,
gender TYPE string,
END OF t_person.
TYPES: tt_person TYPE STANDARD TABLE OF t_person WITH DEFAULT KEY.
DATA: lt_csv TYPE truxs_t_text_data.
"Note: an ABAP string literal cannot span lines, so the JSON is built by
"concatenation; the trailing commas from the question are removed to make it valid JSON
DATA(json) = `{"VALUES":[` &&
  `{"name": "John", "age": "22", "gender": "male"},` &&
  `{"name": "ram", "age": "21", "gender": "male"},` &&
  `{"name": "Janu", "age": "22", "gender": "female"}` &&
  `]}`.
DATA(lt_persons) = VALUE tt_person( ).
"Convert to internal table
CALL TRANSFORMATION id SOURCE XML json RESULT values = lt_persons.
"Convert to CSV
CALL FUNCTION 'SAP_CONVERT_TO_TEX_FORMAT'
EXPORTING
i_field_seperator = ';' " sic: SAP's parameter name really is spelled this way
TABLES
i_tab_sap_data = lt_persons
CHANGING
i_tab_converted_data = lt_csv
EXCEPTIONS
conversion_failed = 1
OTHERS = 2.
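Outside of ABAP, the same two-step idea (first parse the JSON into a structure, then serialize that structure with a `;` separator) can be sketched in Python with the standard `json` and `csv` modules; the field names simply mirror the example data:

```python
import csv
import io
import json

# Step 1: parse the JSON array into a list of dicts (the "internal table")
data = json.loads("""[
  {"name": "John", "age": "22", "gender": "male"},
  {"name": "ram",  "age": "21", "gender": "male"},
  {"name": "Janu", "age": "22", "gender": "female"}
]""")

# Step 2: serialize as semicolon-separated CSV, matching i_field_seperator = ';'
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["name", "age", "gender"], delimiter=";")
writer.writeheader()
writer.writerows(data)
csv_text = buf.getvalue()
print(csv_text)
```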
Related
Is it possible to obtain this output with a command-line tool (jq or another)?
From CSV:
id,age,gender
1,39,M
2,25,M
To a JSON column array:
[{"id":["1","2"]},{"age":["39","25"]},{"gender":["M","M"]}]
Or, from JSON:
[
{
"id": "1",
"age": "39",
"gender": "M"
},
{
"id": "2",
"age": "25",
"gender": "M"
}
]
To the same JSON column array:
[{"id":["1","2"]},{"age":["39","25"]},{"gender":["M","M"]}]
I hope that makes sense. Thank you in advance for your answers.
There is no canonical way to read in CSV, as the format is not standardized. Existing usages mostly differ in the field separator and the escape sequence (used if either one happens to be part of the data), which you would have to consider explicitly when reading in raw text manually.
Therefore, I regard it as safer to follow your second approach. Here's one way for the given input:
[
{
"id": "1",
"age": "39",
"gender": "M"
},
{
"id": "2",
"age": "25",
"gender": "M"
}
]
jq 'map(to_entries) | transpose | map({(first.key): map(.value)}) | add'
{
"id": [
"1",
"2"
],
"age": [
"39",
"25"
],
"gender": [
"M",
"M"
]
}
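For comparison, the same transposition (collect, per key, the values from every row) can be written in Python; like the jq filter, this assumes all rows share the same keys:

```python
import json

rows = json.loads("""[
  {"id": "1", "age": "39", "gender": "M"},
  {"id": "2", "age": "25", "gender": "M"}
]""")

# Equivalent of jq's map(to_entries) | transpose | map({(first.key): map(.value)}) | add:
# for each key of the first row, gather that key's value from every row
columns = {key: [row[key] for row in rows] for key in rows[0]}
print(json.dumps(columns))
```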
How do you find elements for a key in another JSON using DataWeave?
I have 2 JSON arrays: one is standing lookup data, and the second I am creating dynamically; I need to look up values based on a key from my lookup JSON.
For example, my lookup array looks like:
[{"id": 1, "shortname": "John", "fullname": "John Doe", "age" : 40,"designation": "Engineer"},
{"id": 2, "shortname": "Mary", "fullname": "Mary Jane","age" : 36,"designation": "Manager"}]
Now, when looping through my payload, I only get the id from my API response, and I need to add the shortname and fullname for that id in the final response. Something like below:
Input Payload is:
[{
"id": 1,
"project": "ABC",
"rate": "150"
},
{
"id": 2,
"project": "ABC",
"rate": "200"
}]
Output Payload required is:
[{
"id": 1,
"fullname": "John Doe",
"shortName": "John",
"project": "ABC",
"rate": "150"
},
{
"id": 2,
"fullname": "Mary Jane",
"shortName": "Mary",
"project": "ABC",
"rate": "200"
}]
The dataweave expression I am trying to write is:
payload map(item, index) -> {
"id": item.id,
"fullname": "Lookup from other array based on id",
"shortname": "Lookup from other array based on id",
"project": item.project,
"rate": item.rate
}
How can I get the value of fullname and shortname from another array based on the id as I am modifying the response payload? I don't want all the data from my lookup Array rather only specific fields.
You could use one of the join functions from the Arrays module, like leftJoin(). Then, after joining, do the mapping to the final expected output.
In general, don't think about transformations in terms of 'looping'. DataWeave is a functional language; think in terms of mapping, filtering, etc.
%dw 2.0
output application/json
import * from dw::core::Arrays
var lookup=[{"id": 1, "shortname": "John", "fullname": "John Doe", "age" : 40,"designation": "Engineer"},
{"id": 2, "shortname": "Mary", "fullname": "Mary Jane","age" : 36,"designation": "Manager"}]
---
leftJoin(payload, lookup, (left) -> left.id, (right) -> right.id )
map(item, index) -> {
(item.l),
"fullname": item.r.fullname,
"shortname": item.r.shortname
}
Output (for the input shared):
[
{
"id": 1,
"project": "ABC",
"rate": "150",
"fullname": "John Doe",
"shortname": "John"
},
{
"id": 2,
"project": "ABC",
"rate": "200",
"fullname": "Mary Jane",
"shortname": "Mary"
}
]
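The leftJoin above boils down to an index-by-id lookup. A rough Python equivalent of the same idea, with the lookup table and payload inlined as in the question:

```python
lookup = [
    {"id": 1, "shortname": "John", "fullname": "John Doe", "age": 40, "designation": "Engineer"},
    {"id": 2, "shortname": "Mary", "fullname": "Mary Jane", "age": 36, "designation": "Manager"},
]
payload = [
    {"id": 1, "project": "ABC", "rate": "150"},
    {"id": 2, "project": "ABC", "rate": "200"},
]

# Build an id -> record index once, then enrich each payload item
# with only the two fields needed from the lookup record
by_id = {rec["id"]: rec for rec in lookup}
result = [
    {**item,
     "fullname": by_id[item["id"]]["fullname"],
     "shortname": by_id[item["id"]]["shortname"]}
    for item in payload
]
```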
I have two JSON objects:
"Celebrity": [
{
"Name": "SRK",
"Surname": "Kajol"
},{
"Name": "Ajay",
"Surname": "Devgan"
}]
"Cricketer": [
{
"Name": "Virat",
"Surname": "Kohli"
},{
"Name": "Sachin",
"Surname": "Tendulkar"
}]
I want to merge the above two arrays into a single JSON array:
{"Celebrity": [
{
"Name": "SRK",
"Surname": "Kajol"
},{
"Name": "Ajay",
"Surname": "Devgan"
}],
"Cricketer": [
{
"Name": "Virat",
"Surname": "Kohli"
},{
"Name": "Sachin",
"Surname": "Tendulkar"
}]}
How do I do this merge in MS SQL?
Original answer:
A possible approach is using JSON_MODIFY() and JSON_QUERY(), but you need at least SQL Server 2016 to use the built-in JSON support. The idea is to extract the "Cricketer" JSON array from the second JSON (using JSON_QUERY() with the appropriate path) and append it to the first JSON (using JSON_MODIFY()):
JSON:
DECLARE @json1 nvarchar(max) = N'{"Celebrity": [
{"Name": "SRK", "Surname": "Kajol"},
{"Name": "Ajay", "Surname": "Devgan"}
]}'
DECLARE @json2 nvarchar(max) = N'{"Cricketer": [
{"Name": "Virat", "Surname": "Kohli"},
{"Name": "Sachin", "Surname": "Tendulkar"}
]}'
Statement:
SELECT @json1 = JSON_MODIFY(@json1, '$."Cricketer"', JSON_QUERY(@json2, '$."Cricketer"'))
SELECT @json1
Result:
{"Celebrity": [
{"Name": "SRK", "Surname": "Kajol"},
{"Name": "Ajay", "Surname": "Devgan"}
],"Cricketer":[
{"Name": "Virat", "Surname": "Kohli"},
{"Name": "Sachin", "Surname": "Tendulkar"}
]}
Update:
If you want to build a JSON output from different tables, the approach below is also an option:
SELECT
Celebrity = (SELECT Name, Surname FROM TableA FOR JSON AUTO),
Cricketer = (SELECT Name, Surname FROM TableB FOR JSON AUTO)
FOR JSON PATH, WITHOUT_ARRAY_WRAPPER
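The append step that JSON_MODIFY performs is just "extract one top-level key from the second object and attach it to the first". The same operation in Python, for comparison:

```python
import json

json1 = '{"Celebrity": [{"Name": "SRK", "Surname": "Kajol"}, {"Name": "Ajay", "Surname": "Devgan"}]}'
json2 = '{"Cricketer": [{"Name": "Virat", "Surname": "Kohli"}, {"Name": "Sachin", "Surname": "Tendulkar"}]}'

merged = json.loads(json1)
# Extract the "Cricketer" array (the JSON_QUERY step) and attach it (the JSON_MODIFY step)
merged["Cricketer"] = json.loads(json2)["Cricketer"]
print(json.dumps(merged))
```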
Given that I have a valid Avro schema as below:
{
"type": "record",
"namespace": "com.example",
"name": "Employee",
"doc": "Avro Schema for our Employee",
"fields": [
{ "name": "first_name", "type": "string", "doc": "First Name of Customer" },
{ "name": "last_name", "type": "string", "doc": "Last Name of Customer" },
{ "name": "age", "type": "int", "doc": "Age at the time of registration" }
]
}
and a JSON array as below:
[
{
"first_name": "Alex",
"last_name": "Dan",
"age": 35
},
{
"first_name": "Bill",
"last_name": "Lee",
"age": 36
},
{
"first_name": "Charan",
"last_name": "Vaski",
"age": 37
}
]
What is the most efficient way to convert the JSON array to a list of Avro GenericRecord?
I have the following code, which converts one JSON object to one GenericRecord:
Schema schema = parser.parse(schemaString);
GenericDatumReader<GenericRecord> reader = new GenericDatumReader<>(schema);
JsonDecoder jsonDecoder = DecoderFactory.get().jsonDecoder(schema, jsonString);
GenericRecord record = reader.read(null, jsonDecoder);
System.out.println(record);
Here is the most optimized version I could get so far:
ObjectMapper objectMapper = new ObjectMapper();
JsonNode jsonNode = objectMapper.readTree(jsonArray);
Schema schema = parser.parse(schemaString);
GenericDatumReader<GenericRecord> reader = new GenericDatumReader<>(schema);
JsonDecoder jsonDecoder = DecoderFactory.get().jsonDecoder(schema, "");
for (int i = 0; i < jsonNode.size(); i++) {
jsonDecoder.configure(jsonNode.get(i).toString());
GenericRecord record = reader.read(null, jsonDecoder);
System.out.println(record);
}
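The shape of that loop (parse the array once, then feed each element to a record builder that enforces the schema) can be sketched in Python with only the standard json module. Note this is an illustration of the pattern, not the Avro API: the `schema_fields` dict is a hand-rolled stand-in for real schema validation.

```python
import json

# Stand-in for the Avro schema: expected field names and their Python types
schema_fields = {"first_name": str, "last_name": str, "age": int}

json_array = """[
  {"first_name": "Alex",   "last_name": "Dan",   "age": 35},
  {"first_name": "Bill",   "last_name": "Lee",   "age": 36},
  {"first_name": "Charan", "last_name": "Vaski", "age": 37}
]"""

# Parse the whole array once, then build one "record" per element,
# checking each field's presence and type against the schema
records = []
for obj in json.loads(json_array):
    record = {name: obj[name] for name in schema_fields}
    for name, typ in schema_fields.items():
        assert isinstance(record[name], typ), f"{name} is not {typ}"
    records.append(record)
```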
I am getting this result when I am using the Graph API; it is in array format:
{
"id": "216805086",
"name": "raj sharma",
"first_name": "raj ",
"last_name": "sharma",
"link": "https://www.facebook.com/raj.sharma.5",
"username": "raj .sharma.5",
"favorite_teams": [
{
"id": "198358615428",
"name": "Mumbai Indians"
},
{
"id": "190313434323691",
"name": "Indian Cricket Team"
}
],
"favorite_athletes": [
{
"id": "100787839962234",
"name": "Saina Nehwal"
}
],
"gender": "male",
"email": "raj.discoverme@gmail.com",
"timezone": 5.5,
"locale": "en_GB",
"verified": true,
"updated_time": "2013-08-13T06:01:17+0000"
}
I am working in PHP with a phpMyAdmin (MySQL) database. Now I want to insert this data into my database. Should I make a column for each of id, name, first_name, last_name, link, favorite_teams, etc., or should I make one column for all of this?
How do I insert this data into the database?
Actually, this is not an array; it is JSON. JSON has two structures:
JSON array [ ]
JSON object { }
You are getting a JSON object as your output. There is a function in PHP called json_decode().
Go through this and you will get the idea.
Storing facebook app data in a database is against Facebook policy http://developers.facebook.com/policy/
$data = '{
"id": "216805086",
"name": "raj sharma",
"first_name": "raj ",
"last_name": "sharma",
"link": "https://www.facebook.com/raj.sharma.5",
"username": "raj .sharma.5",
"favorite_teams": [
{
"id": "198358615428",
"name": "Mumbai Indians"
},
{
"id": "190313434323691",
"name": "Indian Cricket Team"
}
],
"favorite_athletes": [
{
"id": "100787839962234",
"name": "Saina Nehwal"
}
],
"gender": "male",
"email": "raj.discoverme@gmail.com",
"timezone": 5.5,
"locale": "en_GB",
"verified": true,
"updated_time": "2013-08-13T06:01:17+0000"
}';
//decode to get as php variable
$values = json_decode($data,true); //true to decode as a array not an object
// mysql_query() was removed in PHP 7; use a prepared statement via mysqli instead
// (assumes $mysqli is an open mysqli connection)
$stmt = $mysqli->prepare("INSERT INTO TableName (id, name, first_name, last_name, link, username) VALUES (?, ?, ?, ?, ?, ?)");
$stmt->bind_param('ssssss', $values['id'], $values['name'], $values['first_name'], $values['last_name'], $values['link'], $values['username']);
$stmt->execute();
json_decode() takes a JSON-encoded string and converts it into a PHP variable.
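Whatever the language, the key point is that decoded values should reach SQL through placeholders, not string concatenation, so the driver handles escaping. A small Python/sqlite3 sketch of the same insert (the table name and columns mirror the answer above; sqlite3 stands in for MySQL here):

```python
import json
import sqlite3

data = ('{"id": "216805086", "name": "raj sharma", "first_name": "raj ", '
        '"last_name": "sharma", "link": "https://www.facebook.com/raj.sharma.5", '
        '"username": "raj.sharma.5"}')
values = json.loads(data)

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE TableName "
             "(id TEXT, name TEXT, first_name TEXT, last_name TEXT, link TEXT, username TEXT)")
# Placeholders instead of concatenation: the driver escapes each value safely
conn.execute(
    "INSERT INTO TableName (id, name, first_name, last_name, link, username) "
    "VALUES (?, ?, ?, ?, ?, ?)",
    (values["id"], values["name"], values["first_name"],
     values["last_name"], values["link"], values["username"]),
)
row = conn.execute("SELECT name, username FROM TableName").fetchone()
```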