Insert JSON data into a text-type column of a Postgres table

I am trying to insert JSON data into a text column of a Postgres table.
For example, table1 has these columns:
id | name | occupation | skills
---|------|------------|--------------------------------
1  | John | engineer   | {"java":"true","oracle":"true"}
2  | mary | engineer   | {".net":"true","mysql":"true"}
In the table above, skills is of type text and we are inserting JSON data into it.
How can I insert JSON data into a text column of a Postgres table?

Convert your JSON to a string, then save it:
"{\"java\":\"true\",\"oracle\":\"true\"}"
If you are inserting JSON data directly into the Postgres DB, you can convert the JSON to an escaped string with an online converter.
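Rather than hand-escaping or using an online converter, a serializer can produce the string for you. A minimal Python sketch (the table and column names are taken from the question; the commented-out `cur.execute` call assumes a psycopg2-style driver):

```python
import json

# Build the skills data as a native dict, then serialize it to a string
skills = {"java": "true", "oracle": "true"}
skills_text = json.dumps(skills)

# skills_text is now a plain string suitable for a text column:
# '{"java": "true", "oracle": "true"}'

# With a parameterized query the driver handles quoting/escaping,
# e.g. (psycopg2-style placeholders, shown as an assumption):
# cur.execute(
#     "INSERT INTO table1 (id, name, occupation, skills) VALUES (%s, %s, %s, %s)",
#     (1, "John", "engineer", skills_text),
# )
```

Serializing with `json.dumps` also guarantees the stored string round-trips back to the same structure via `json.loads`.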

Related

How to create a JSON object for each column in a row

I have a table itemmaster in PostgreSQL:
id | attribute1 | attribute2 | attribute3
1  | Good       | Average    | Best
I want output as JSON like:
[{"attribute1":"Good"},{"attribute2":"Average"},{"attribute3":"Best"}]
I want to use this JSON nested inside another object. I have tried row_to_json and the JSON object builder functions but am not getting the exact result.
select json_build_array(json_build_object('attribute1', itemmaster.attribute1),
json_build_object('attribute2', itemmaster.attribute2),
json_build_object('attribute3', itemmaster.attribute3))
from itemmaster;
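For reference, the shape that json_build_array over json_build_object produces can be sketched client-side in Python (the row values here are hypothetical, mirroring the question's sample row):

```python
import json

# One row of itemmaster (hypothetical values from the question)
row = {"attribute1": "Good", "attribute2": "Average", "attribute3": "Best"}

# Mirror json_build_array(json_build_object(...), ...): a list of
# single-key objects, one per attribute
result = [{key: row[key]} for key in ("attribute1", "attribute2", "attribute3")]

print(json.dumps(result))
# → [{"attribute1": "Good"}, {"attribute2": "Average"}, {"attribute3": "Best"}]
```

Because the result is an ordinary JSON array, it can be embedded as a value inside a larger object, which matches the "nested JSON" requirement in the question.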

Create a Hive external table with complex data types and load from a CSV or TSV where some columns hold serialized JSON objects

I have a CSV (or TSV) with one column ('nw_day' in the example below) holding a serialized array and another column ('res_m' in the example below) holding a serialized JSON object. It also has columns of STRING, TIMESTAMP, and FLOAT data types.
The TSV looks somewhat like this (first row shown):
+----------+---------------------+-------+-----------------------------------------------+------------------------------------------------------------------------+
| com_id | w_start_time | cap | nw_day | res_m |
+----------+---------------------+-------+-----------------------------------------------+------------------------------------------------------------------------+
| dtf_id | 2019-04-24 06:00:03 | 444.3 | {'Fri','Mon','Sat','Sun','Thurs','Tue','Wed'} | {"some_str":"str_one","some_n":1,"some_t":2019-04-24 06:00:03.700+0000}|
+----------+---------------------+-------+-----------------------------------------------+------------------------------------------------------------------------+
I have tried the following statement, but it does not give me the results I expect.
CREATE EXTERNAL TABLE IF NOT EXISTS table_name(
com_id STRING,
w_start_time TIMESTAMP,
cap FLOAT,
nw_day array <STRING>,
res_m STRUCT <
some_str: STRING,
some_n: BIGINT,
some_t: TIMESTAMP
>)
COMMENT 's_e_s'
ROW FORMAT DELIMITED
FIELDS TERMINATED BY '\t'
COLLECTION ITEMS TERMINATED BY ','
STORED AS TEXTFILE
LOCATION '/location/to/folder/containing/csv'
TBLPROPERTIES ("skip.header.line.count"="1");
So I am trying to deserialize those objects into Hive complex data types with ARRAY and STRUCT. But that is not exactly what I get when I run
select * from table_name limit 1;
which gives me
+----------+---------------------+-------+----------------------------------------------------------------------+----------------------------------------------------------------------------------------------------------------------------+
| com_id | w_start_time | cap | nw_day | res_m |
+----------+---------------------+-------+----------------------------------------------------------------------+----------------------------------------------------------------------------------------------------------------------------+
| dtf_id | 2019-04-24 06:00:03 | 444.3 | ["{'Fri'"," 'Mon'"," 'Sat'"," 'Sun'"," 'Thurs'"," 'Tue'"," 'Wed'}"] | {"some_str":"{\"some_str\":\"str_one\",\"some_n\":1,\"some_t\":2019-04-24 06:00:03.700+0000}\","some_n":null,"some_t":null}|
+----------+---------------------+-------+----------------------------------------------------------------------+----------------------------------------------------------------------------------------------------------------------------+
So it treats the whole object as a string and splits that string on the delimiter.
I need some help understanding how to load data from CSV/TSV into complex data types in Hive.
I found a similar question, but the requirement there is a little different and no complex data types are involved.
Any help would be much appreciated. If this cannot be done and a preprocessing step has to be added before loading, some examples of loading input data into Hive complex data types would help me. Thanks in advance!
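If a preprocessing step is acceptable, one Python sketch rewrites both problem columns into plain delimited form that lines up with the table's COLLECTION ITEMS delimiter. Note one assumption: the timestamp inside res_m is quoted here so the value parses as JSON (the unquoted form in the sample row is not valid JSON):

```python
import json

# Hypothetical raw values, as they appear in the TSV from the question
nw_day_raw = "{'Fri','Mon','Sat','Sun','Thurs','Tue','Wed'}"
res_m_raw = '{"some_str":"str_one","some_n":1,"some_t":"2019-04-24 06:00:03.700"}'

# Array column: drop the braces and quotes so Hive sees Fri,Mon,...
days = [d.strip().strip("'") for d in nw_day_raw.strip("{}").split(",")]
nw_day_hive = ",".join(days)

# Struct column: emit the field values in declaration order, also
# comma-separated, so they line up with STRUCT<some_str, some_n, some_t>
res_m = json.loads(res_m_raw)
res_m_hive = ",".join(str(res_m[k]) for k in ("some_str", "some_n", "some_t"))

print(nw_day_hive)  # Fri,Mon,Sat,Sun,Thurs,Tue,Wed
print(res_m_hive)   # str_one,1,2019-04-24 06:00:03.700
```

Run over each TSV row before loading, this yields text that the DDL above (FIELDS TERMINATED BY '\t', COLLECTION ITEMS TERMINATED BY ',') can parse into the ARRAY and STRUCT columns without the mis-split shown in the question's output.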

Using MySQL JSON field to join on a table with custom fields

So I made this system to store custom objects with custom fields for an app that I'm developing. First I have object_def where I save the object definitions:
id | name | fields
------------------------------------------------------------
101 | Group 1 | [{"name": "Title", "id": "AbCdE123"}, ...]
102 | Group 2 | [{"name": "Name", "id": "FgHiJ456"}, ...]
So we have id (INT), name (VARCHAR), and fields (LONGTEXT). fields holds the object's field definitions, each shaped like {id: string, type: string, name: string}.
Now in the object table, I have this:
id | object_def_id | object_values
------------------------------------------------------------
235 | 101 | {"AbCdE123": "The Object", ... }
236 | 102 | {"FgHiJ456": "John Perez", ... }
Where object_values is a LONGTEXT also. With that system, I'm able to show the objects on a table in my app using JSON.parse().
Now I've learned that there is a JSON type in MySQL and I want to use it to do queries and such (I'm really new to this).
I've changed the LONGTEXT columns to JSON, and now I want a SELECT that shows results like this:
#Select objects in group 1:
id | group | Title | ... | other_custom_field
-------------------------------------------------------
235 | Group 1 | The Object | ... | other_custom_value
#Select objects in group 2:
id | group | Name | ... | other_custom_field
-------------------------------------------------------
236 | Group 2 | John Perez | ... | other_custom_value
Id, then the group name (I can do this with an INNER JOIN), and then all the custom fields with their respective values.
Is this possible? How can I achieve this (hopefully without changing my database structure)? I'm learning MySQL, SQL and databases as I go so I really appreciate your help. Thanks!
Problems I see with your design:
Incorrect JSON format.
[{name: 'Title', id: 'AbCdE123'}, ...]
Should be:
[{"name": "Title", "id": "AbCdE123"}, ...]
You should use the JSON data type instead of LONGTEXT, because JSON will at least reject invalid JSON syntax.
Setting column headings based on data. You can't do this in SQL. Columns and headings must be fixed at the time you prepare the query. You can't do an SQL query that changes its own column headings.
Your object def has an array of attributes, but there's no way in MySQL 5.7 to loop over the "rows" of a JSON array. You'll need the JSON_TABLE() function in MySQL 8.0.
That will get you closer to being able to look up object values, but then you'll still have to pivot the data into the result set you describe, with one attribute in each column, as if the data had been stored in a traditional way. But SQL doesn't allow you to do dynamic pivoting in a single query. You can't make an SQL query that dynamically grows its own select-list based on the data it finds.
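Since SQL can't grow its own select-list, the pivot usually has to happen client-side. A minimal Python sketch, using hypothetical rows mirroring the question's object_def and object tables:

```python
import json

# Hypothetical rows mirroring object_def and object from the question
object_def = {"id": 101, "name": "Group 1",
              "fields": '[{"name": "Title", "id": "AbCdE123"}]'}
obj = {"id": 235, "object_def_id": 101,
       "object_values": '{"AbCdE123": "The Object"}'}

# Pivot: map each field id to its human-readable name, then look up
# the corresponding value for this object
fields = json.loads(object_def["fields"])
values = json.loads(obj["object_values"])
row = {"id": obj["id"], "group": object_def["name"]}
for f in fields:
    row[f["name"]] = values.get(f["id"])

print(row)  # {'id': 235, 'group': 'Group 1', 'Title': 'The Object'}
```

The column set here grows with the data, which is exactly what a single SQL query cannot do; the application layer ends up owning that step.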
This all makes me wonder...
Why don't you just store the data in the traditional way?
Create a table per object type. Add one column to that table per attribute. That way you get column names. You get column types. You get column constraints — for example, how would you simulate NOT NULL or UNIQUE in your current system?
If you don't want to use SQL, then don't. There are alternatives, like document databases or key/value databases. But don't torture poor SQL by using it to implement an Inner-Platform.

Laravel - Retrieve a row from the DB as JSON when one column is in JSON format

I store my data in a DB where one column keeps its data in JSON format.
When I retrieve a row and return it as a JSON response, that column comes back as a string instead of a JSON object.
DB:
id | name | map_id | map_settings | created_at
1 | Europe | 2 | {"zoom":7,"minZoom":5,"maxZoom":9,"zoomControl":true,"disableDefaultUI":true,"center":"new google.maps.LatLng(51.954422144707960, 19.140930175781250)"} | 2018-08-19 05:19:50
PHP
$mapConfig = MapConfig::with(['places'])->where(['id'=>$id])->get();
return response()->json($mapConfig);
Result
id: 1,
name: "Europe",
map_id: 2,
map_settings: "{\"zoom\":7,\"minZoom\":5,\"maxZoom\":9,\"zoomControl\":true,\"disableDefaultUI\":true,\"center\":\"new google.maps.LatLng(51.954422144707960, 19.140930175781250)\"}"
Why that map_settings is not in correct JSON format? And how to do it?
Thank you.
You have to parse map_settings with json_decode because it is a string.
You can use Laravel attribute casting to convert your JSON to an array; then it will work fine with the response's json function.
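The decode-before-re-encode idea (what json_decode or attribute casting does in PHP) can be sketched in Python; the map_settings value here is a shortened, assumed version of the one in the question:

```python
import json

# map_settings arrives from the text column as a string (assumed value)
map_settings = '{"zoom": 7, "minZoom": 5, "maxZoom": 9}'

# Re-encoding the raw string double-escapes it, which is the symptom
# shown in the question's Result section
double_encoded = json.dumps({"map_settings": map_settings})
# → the inner value stays a string full of \" escapes

# Decoding first yields a real object, so the response nests cleanly
decoded = json.loads(map_settings)
clean = json.dumps({"map_settings": decoded})
print(clean)  # {"map_settings": {"zoom": 7, "minZoom": 5, "maxZoom": 9}}
```

Casting the attribute in the Eloquent model makes the framework perform this decode automatically before the response is serialized.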

Is it possible to use a LOAD DATA INFILE type command to UPDATE rows in the db?

Pseudo table:
| primary_key | first_name | last_name | date_of_birth |
| 1 | John Smith | | 07/04/1982 |
At the moment first_name contains a users full name for many rows. The desired outcome is to split the data, so first_name contains "John" and last_name contains "Smith".
I have a CSV file which contains the desired format of data:
| primary_key | first_name | last_name |
| 1 | John | Smith |
Is there a way of using the LOAD DATA INFILE command to process the CSV file to UPDATE all rows in this table using the primary_key - and not replace any other data in the row during the process (i.e. date_of_birth)?
In this situation I usually LOAD DATA INFILE to a temp table with identical structure. Then I do INSERT with ON DUPLICATE KEY UPDATE from the temp table to the real table. This allows for data type checking without wrecking your real table; it's relatively quick and it doesn't require fiddling with your .csv file.
No. While LOAD DATA INFILE has a REPLACE option, it will actually replace the row in question - that is, delete the existing one and insert a new one.
If you configure your LOAD DATA INFILE to insert only certain columns, all the others will be set to their default values, not to the values they currently contain.
Can you modify your CSV file to contain a bunch of UPDATE statements instead? That should be reasonably straightforward with some regex replaces.
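That last suggestion can be sketched as a small Python script instead of regex replaces; the table name is hypothetical, the statements assume primary_key is numeric, and the quote-doubling is a naive stand-in for proper SQL escaping:

```python
import csv
import io

# Hypothetical CSV content matching the question's desired format
csv_text = "primary_key,first_name,last_name\n1,John,Smith\n"

updates = []
for row in csv.DictReader(io.StringIO(csv_text)):
    # Only first_name/last_name are touched; date_of_birth is left alone
    first = row["first_name"].replace("'", "''")  # naive SQL escaping
    last = row["last_name"].replace("'", "''")
    updates.append(
        f"UPDATE pseudo_table SET first_name = '{first}', "
        f"last_name = '{last}' WHERE primary_key = {int(row['primary_key'])};"
    )

print("\n".join(updates))
```

Because each statement sets only the two name columns and filters on primary_key, the other columns (such as date_of_birth) are untouched, which is the behavior LOAD DATA INFILE's REPLACE option cannot give you.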