Postgres UPDATE with json_to_record?

I'm trying to find a simple solution to update data with the json_to_record function. I want to update the data every night with a JSON array I get by web scraping. Any idea how I can solve the problem when there's a new article?
Thanks a lot :)
INSERT INTO prices
Select * from
json_to_record(
'{"distributor": "1", "articelnr": 4711, "price": "700", "delivery": "too late", "created_on": "2022-09-25 03:14:07"}'
) AS x(distributor INT, articelnr VARCHAR, price VARCHAR, delivery VARCHAR, created_on TIMESTAMP)
Is it possible to use an UPDATE statement in a similar way, like this:
UPDATE prices
SET
json_to_record(
'{"distributor": "1", "articelnr": 4711, "price": "700", "delivery": "too late", "created_on": "2022-09-25 03:14:07"}'
) AS x(distributor INT, articelnr VARCHAR, price VARCHAR, delivery VARCHAR, created_on TIMESTAMP)

According to the UPDATE syntax it is possible to update a table using a sub-select, e.g.:
update prices set
(distributor, articelnr, price, delivery, created_on) =
(
select *
from json_to_record(
'{"distributor": "1", "articelnr": 4711, "price": "700", "delivery": "too late", "created_on": "2022-09-25 03:14:07"}'
) as x(distributor int, articelnr varchar, price varchar, delivery varchar, created_on timestamp)
)
where distributor = 1 and articelnr = 4711
Unfortunately, the sub-select results are not visible in the WHERE clause. Hence, using the query in the FROM clause seems more suitable, e.g.:
update prices p set
(distributor, articelnr, price, delivery, created_on) =
(x.distributor, x.articelnr, x.price, x.delivery, x.created_on)
from (
select *
from json_to_record(
'{"distributor": "1", "articelnr": 4711, "price": "700", "delivery": "too late", "created_on": "2022-09-25 03:14:07"}'
) as x(distributor int, articelnr varchar, price varchar, delivery varchar, created_on timestamp)
) as x
where p.distributor = x.distributor and p.articelnr = x.articelnr
Note that you need unique columns (typically a primary key) to identify the row to update. The examples assume that they are (distributor, articelnr).
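Since the nightly job delivers a whole JSON array that may contain new articles, a possible upsert sketch (not from the original answer) uses json_to_recordset together with INSERT ... ON CONFLICT; it assumes a unique constraint on (distributor, articelnr):
-- sketch only: assumes a unique constraint exists, e.g.
--   alter table prices add constraint prices_uq unique (distributor, articelnr);
insert into prices (distributor, articelnr, price, delivery, created_on)
select *
from json_to_recordset(
    '[{"distributor": "1", "articelnr": 4711, "price": "700", "delivery": "too late", "created_on": "2022-09-25 03:14:07"}]'
) as x(distributor int, articelnr varchar, price varchar, delivery varchar, created_on timestamp)
on conflict (distributor, articelnr) do update
set price      = excluded.price,
    delivery   = excluded.delivery,
    created_on = excluded.created_on;
New articles are inserted; rows whose (distributor, articelnr) already exist get their price, delivery and created_on refreshed.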

Related

Need to convert the SQL Query to Gorm query

I have this SQL query
Select CONCAT(kafka_user_stream.FirstName,' ', kafka_user_stream.LastName) AS "Full Name",
kafka_user_stream.UID AS "User ID",
kafka_user_stream.CountryCode AS "Country",
kafka_user_stream.CreatedAt AS "Registration Date & Time",
COUNT(jackpotmessage_stream.UID) AS "Win Count"
FROM kafka_user_stream LEFT JOIN
jackpotmessage_stream ON jackpotmessage_stream.UID = kafka_user_stream.UID
WHERE "Type"='goldenTicketWin'
GROUP BY "Full Name", "User ID", "Country", "Registration Date & Time"
ORDER BY "Win Count" DESC
I want to convert it to Gorm. I can run it using
err = s.db.Exec("...QUERY")
but I cannot extract data from the above query. I need to extract all of the above fields (Full Name, User ID, etc.) and store them in a struct.
In the above query, kafka_user_stream and jackpotmessage_stream are tables populated from a Kafka stream. I am using go-gorm and Go.
I tried the Gorm documentation as well as a few other references, but I am unable to find a solution. I would be very thankful for any leads, insights, or help.
With the native go/mysql driver, you should use the Query() and Scan() methods, not Exec(), to get results from the database and store them in a struct.
In GORM, you can use the SQL Builder for your custom queries:
type Result struct {
    ID   int
    Name string
    Age  int
}
var result Result
db.Raw("SELECT id, name, age FROM users WHERE name = ?", 3).Scan(&result)
I figured out a slightly different way, as suggested by Aykut, which works fine:
rows, _err := s.gdb.Raw(`Select CONCAT(kafka_user_stream.FirstName,' ', kafka_user_stream.LastName) AS "FullName",
kafka_user_stream.UID AS "UserID",
kafka_user_stream.CountryCode AS "Country",
kafka_user_stream.CreatedAt AS "CreatedAt",
COUNT(jackpotmessage_stream.UID) AS "WinCount"
FROM kafka_user_stream LEFT JOIN
jackpotmessage_stream ON jackpotmessage_stream.UID = kafka_user_stream.UID
WHERE "Type"='goldenTicketWin'
GROUP BY "FullName", "UserID", "Country", "CreatedAt"
ORDER BY "WinCount" DESC;`).Rows()

How to correctly insert entries into `Json Array` within MySQL8?

I am using MySQL 8 and my table structure looks as below:
CREATE TABLE t1 (id INT, `group` VARCHAR(255), names JSON);
I am able to correctly insert records using the following INSERT statement:
INSERT INTO t1 VALUES
(1100000, 'group1', '[{"name": "name1", "type": "user"}, {"name": "name2", "type": "user"}, {"name": "techDept", "type": "dept"}]');
The JSON format has two types: user and dept.
Now, I have an array of users and depts which looks as below:
SET @userlist = '["user4", "user5"]';
SET @deptlist = '["dept4", "dept5"]';
For a new group group2, I want to insert all those users and depts from the userlist and deptlist arrays into the t1 table using a single query, and I have written the following query:
SELECT JSON_ARRAY_INSERT(JSON_OBJECT('name', @userlist , 'type', 'user'), ('name', @deptlist , 'type', 'gdl'));
But it fails with:
Incorrect parameter count in the call to native function 'JSON_ARRAY_INSERT'
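No answer is shown in this thread; one possible approach, purely as a sketch and assuming MySQL 8's JSON_TABLE and JSON_ARRAYAGG are available (the id value 1100001 is only illustrative), is to expand both arrays into rows, tag each row with its type, and aggregate the objects back into one JSON array:
-- sketch only: not from the original thread
INSERT INTO t1 (id, `group`, names)
SELECT 1100001, 'group2',
       JSON_ARRAYAGG(JSON_OBJECT('name', u.name, 'type', u.type))
FROM (
  SELECT jt.name, 'user' AS type
  FROM JSON_TABLE(@userlist, '$[*]' COLUMNS (name VARCHAR(255) PATH '$')) AS jt
  UNION ALL
  SELECT jt.name, 'dept'
  FROM JSON_TABLE(@deptlist, '$[*]' COLUMNS (name VARCHAR(255) PATH '$')) AS jt
) AS u;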

How to import JSON values inside MySQL (10.2.36-MariaDB) table?

I have the following JSON file:
{
"ID": 5464015,
"CUSTOMER_ID": 1088020,
"CUSOTMER_NAME": "My customer 1"
}
{
"ID": 5220812,
"CUSTOMER_ID": 523323,
"CUSOTMER_NAME": "My customer 2"
}
{
"ID": 5205039,
"CUSTOMER_ID": 1934806,
"CUSOTMER_NAME": "My customer 3"
}
From a shell script, I would like to import these values into a MariaDB table (MariaDB Server version: 10.2.36-MariaDB) with the related columns already created:
ID
CUSTOMER_ID
CUSTOMER_NAME
But for CUSTOMER_NAME, I don't want to import double quotes at the beginning and at the end of the value.
Is there a simple way to do it?
Or, if that is not possible, if I have a txt or csv file like this:
5464015,1088020,"My customer 1"
5220812,523323,"My customer 2"
5205039,1934806,"My customer 3"
How to import it?
Many thanks
CREATE TABLE test (ID INT, CUSTOMER_ID INT, CUSTOMER_NAME VARCHAR(255));
SET @data := '
[ { "ID": 5464015,
"CUSTOMER_ID": 1088020,
"CUSTOMER_NAME": "My customer 1"
},
{ "ID": 5220812,
"CUSTOMER_ID": 523323,
"CUSTOMER_NAME": "My customer 2"
},
{ "ID": 5205039,
"CUSTOMER_ID": 1934806,
"CUSTOMER_NAME": "My customer 3"
}
]
';
INSERT INTO test
SELECT *
FROM JSON_TABLE(@data,
"$[*]" COLUMNS( ID INT PATH "$.ID",
CUSTOMER_ID INT PATH "$.CUSTOMER_ID",
CUSTOMER_NAME VARCHAR(255) PATH "$.CUSTOMER_NAME")
) AS jsontable;
SELECT * FROM test;
ID        CUSTOMER_ID   CUSTOMER_NAME
5464015   1088020       My customer 1
5220812   523323        My customer 2
5205039   1934806       My customer 3
A solution which should work on 10.2.36-MariaDB (all constructions used are legal for this version):
CREATE TABLE test (ID INT, CUSTOMER_ID INT, CUSTOMER_NAME VARCHAR(255))
WITH RECURSIVE
cte1 AS ( SELECT LOAD_FILE('C:/ProgramData/MySQL/MySQL Server 8.0/Uploads/json.txt') jsondata ),
cte2 AS ( SELECT 1 level, CAST(jsondata AS CHAR) oneobject, jsondata
FROM cte1
UNION ALL
SELECT level + 1,
TRIM(SUBSTRING(jsondata FROM 1 FOR 2 + LOCATE('}', jsondata))),
TRIM(SUBSTRING(jsondata FROM 1 + LOCATE('}', jsondata) FOR LENGTH(jsondata)))
FROM cte2
WHERE jsondata != '' )
SELECT oneobject->>"$.ID" ID,
oneobject->>"$.CUSTOMER_ID" CUSTOMER_ID,
oneobject->>"$.CUSTOMER_NAME" CUSTOMER_NAME
FROM cte2 WHERE level > 1;
Tested on MySQL 8.0.16 (I have no available MariaDB now):
The content of the json.txt file matches the one shown in the question (the misprint in the attribute name was corrected).
PS: Of course, the SELECT itself may be used to insert the data into an existing table.
If you have access to PHP, a simple script is a good method, as it can turn the JSON into an array (and automatically remove said quotes around the text), and then you can decide which columns in the JSON map to which MySQL columns.
Depending on your MySQL version, you may have access to this utility to import JSON from the command line:
https://mysqlserverteam.com/import-json-to-mysql-made-easy-with-the-mysql-shell/
But it may not work if your columns don't match the MySQL columns perfectly (I believe it is not case sensitive, however).
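For the CSV fallback mentioned at the end of the question, a minimal sketch (not from the original answers) using LOAD DATA; the file path is illustrative and LOCAL loading has to be enabled on both the client and the server:
-- sketch only: OPTIONALLY ENCLOSED BY '"' strips the quotes around the customer names
LOAD DATA LOCAL INFILE '/path/to/customers.csv'
INTO TABLE test
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n'
(ID, CUSTOMER_ID, CUSTOMER_NAME);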

How to compare json attribute ignoring type in mysql

In MySQL 5.7, I have a table:
CREATE TABLE `item` (
  `id` INT(11) NOT NULL AUTO_INCREMENT,
  `name` VARCHAR(50) NOT NULL,
  `price` DECIMAL(10,2) NOT NULL,
  `attributes` JSON NULL DEFAULT NULL,
  PRIMARY KEY (`id`)
);
Then I INSERT INTO `item` VALUES (1, 'trademark', 250000.00, '{"sn": "5108174", "type": 1, "group": "2501 2502 2503 2506 2507 2508 2509 2510 2511 2512", "regDate": 1210694400}');
I can select the row by:
select * from item where price = '250000.00'
Or
select * from item where price = 250000.00
because MySQL implicitly converts types for normal columns by default.
But it's different when I filtered by json attribute:
#Return empty
select * from item where json_extract(attributes, '$.type') = '1';
#Return one row
select * from item where json_extract(attributes, '$.type') = 1;
Since attributes.type is a number, I can convert it to a string to ignore the type:
select * from item where concat(json_extract(attributes, '$.type'),'') = '1';
or
select * from item where cast(json_extract(attributes, '$.type') as char) = '1';
Unfortunately, the magic is lost when attributes.type is a string:
INSERT INTO `item` VALUES (1, 'trademark', 250000.00, '{"sn": "5108174", "type": "1", "group": "2501 2502 2503 2506 2507 2508 2509 2510 2511 2512", "regDate": 1210694400}')
So when I receive attributes.type from a URL parameter, it's hard to know its real type in order to select the data.
Is there any way to compare a JSON attribute while ignoring its type, like a normal field?
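No answer is shown above; one possible approach (my suggestion, not from the original post) is to unquote the extracted value, so that both the JSON number 1 and the JSON string "1" compare equal to the text '1':
-- sketch only: JSON_UNQUOTE(JSON_EXTRACT(...)) returns '1' for "type": 1 as well as for "type": "1"
select * from item where json_unquote(json_extract(attributes, '$.type')) = '1';
-- equivalent shorthand in MySQL 5.7.13+
select * from item where attributes->>'$.type' = '1';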

Insert into Linked table in access without the primary key

I'm trying to insert into a linked SQL table in Access 2007; below is my query:
INSERT INTO tblProducts ( ProductPrefix, ProductCode, ProductDescription, MadeFrom, MadeFromDescription, SamFamilySort1, SamFamilySort2, SamFamilySort3, SamFamilySort4, SamFamilySort5, Grade, Length, Thickness, fWidth, Factor, CubicMtrs, CubicMtrsFull, [Weight(T)], DrawingFilepath, EFACSProductGrouping, BatchSize, PackSize, Density, createdby, createddate, ProductType, customer, DimA, DimB, DimC, DimD, DimE, DimF, DimG, DimH, DimI, DimJ, DimK, DimL, DimM, DimN, DimO, DimP, DimQ, DimR, DimS, DimT, DimU, DimV, DimW, DimX, DimY, DimZ, TolA, TolB, TolC, TolD, TolE, TolF, TolG, TolH, TolI, TolJ, TolK, TolL, TolM, TolN, TolO, TolP, TolQ, TolR, TolS, TolT, TolU, TolV, TolW, TolX, TolY, TolZ, Dimension, Main, Saws, Moulders, PaintLines, XCut, DET, Wrapper, Blocks, HingeRecess, reorderpolicy, machinedaway, UseOtherM3XC, UseOtherM3MS, ShrinkWrap, ShrinkWrapPackSize, SW, samtype1, vtype1, vtype2, profile, productchamp, UOM, SAMPartGrp, PostingClass, ProductID )
SELECT DISTINCT tblProducts.ProductPrefix, tblProducts.ProductCode, tblProducts.ProductDescription, tblProducts.MadeFrom, tblProducts.MadeFromDescription, tblProducts.SamFamilySort1, tblProducts.SamFamilySort2, tblProducts.SamFamilySort3, tblProducts.SamFamilySort4, tblProducts.SamFamilySort5, tblProducts.Grade, tblProducts.Length, tblProducts.Thickness, tblProducts.fWidth, tblProducts.Factor, tblProducts.CubicMtrs, tblProducts.CubicMtrsFull, tblProducts.[Weight(T)], tblProducts.DrawingFilepath, tblProducts.EFACSProductGrouping, tblProducts.BatchSize, tblProducts.PackSize, tblProducts.Density, tblProducts.createdby, Date() AS Expr1, tblProducts.ProductType, tblProducts.customer, tblProducts.DimA, tblProducts.DimB, tblProducts.DimC, tblProducts.DimD, tblProducts.DimE, tblProducts.DimF, tblProducts.DimG, tblProducts.DimH, tblProducts.DimI, tblProducts.DimJ, tblProducts.DimK, tblProducts.DimL, tblProducts.DimM, tblProducts.DimN, tblProducts.DimO, tblProducts.DimP, tblProducts.DimQ, tblProducts.DimR, tblProducts.DimS, tblProducts.DimT, tblProducts.DimU, tblProducts.DimV, tblProducts.DimW, tblProducts.DimX, tblProducts.DimY, tblProducts.DimZ, tblProducts.TolA, tblProducts.TolB, tblProducts.TolC, tblProducts.TolD, tblProducts.TolE, tblProducts.TolF, tblProducts.TolG, tblProducts.TolH, tblProducts.TolI, tblProducts.TolJ, tblProducts.TolK, tblProducts.TolL, tblProducts.TolM, tblProducts.TolN, tblProducts.TolO, tblProducts.TolP, tblProducts.TolQ, tblProducts.TolR, tblProducts.TolS, tblProducts.TolT, tblProducts.TolU, tblProducts.TolV, tblProducts.TolW, tblProducts.TolX, tblProducts.TolY, tblProducts.TolZ, tblProducts.Dimension, tblProducts.Main, tblProducts.Saws, tblProducts.Moulders, tblProducts.PaintLines, tblProducts.XCut, tblProducts.DET, tblProducts.Wrapper, tblProducts.Blocks, tblProducts.HingeRecess, tblProducts.reorderpolicy, tblProducts.machinedaway, tblProducts.useotherm3XC, tblProducts.useotherm3MS, tblProducts.ShrinkWrap, tblProducts.ShrinkWrapPackSize, tblProducts.SW, tblProducts.samtype1, tblProducts.vtype1, tblProducts.vtype2, tblProducts.profile, tblProducts.productchamp, tblProducts.UOM, tblProducts.SAMPartGrp, tblProducts.PostingClass, tblProducts.ProductID
FROM tblProducts
This works fine and uploads all records in the table with new keys if I want to (I don't). I want to recreate only one product, so I've tried adding the below:
WHERE (((tblProducts.ProductID)=[tests]));
where [tests] is a popup box for user entry,
but I get an error.
My primary key in the table is called [ProductID]. Is it possible to add a WHERE [ProductID] = 1234 to this query somehow?
Notice that the very last item in the column list of the INSERT INTO clause is ProductID. So, you are trying to insert a new row with an existing primary key value, and that won't work. As a simplified example:
INSERT INTO tblProducts (ProductDescription, ProductID)
SELECT tblProducts.ProductDescription, tblProducts.ProductID
FROM tblProducts
WHERE tblProducts.ProductID=1
will fail with a primary key violation. You need to remove ProductID from both the INSERT INTO and SELECT clauses, and only use it in the WHERE clause:
INSERT INTO tblProducts (ProductDescription)
SELECT tblProducts.ProductDescription
FROM tblProducts
WHERE tblProducts.ProductID=1
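To keep the popup prompt from the question, the same pattern with a declared parameter might look like the following sketch (the parameter name [tests] mirrors the question; it assumes ProductID is a Long Integer):
PARAMETERS [tests] Long;
INSERT INTO tblProducts (ProductDescription)
SELECT tblProducts.ProductDescription
FROM tblProducts
WHERE tblProducts.ProductID = [tests];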