Ecto: schemaless many_to_many with unique constraint

What is the right way to propagate an error from Postgres when inserting a duplicate element into a many-to-many relation defined through a join table that has a unique index:
schema "books" do
...
many_to_many :authors, Author, join_through: "books_authors"
end
preload(book, :authors)
|> change
|> put_assoc(:authors, authors)
|> unique_constraint(:authors, name: :books_authors_book_id_author_id_index)
So this throws the error:
** (Postgrex.Error) ERROR (unique_violation): duplicate key value violates unique constraint "books_authors_book_id_author_id_index"
A possible solution is to use a BookAuthor join schema, but is there a way to make it work in a schemaless fashion?
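For reference, a sketch of the join-schema approach mentioned above (module names such as MyApp.Book and MyApp.Author are assumptions): with an explicit BookAuthor schema, the Postgres unique violation can be turned into a changeset error instead of an exception.
defmodule MyApp.BookAuthor do
  use Ecto.Schema
  import Ecto.Changeset

  # If the join table has no id column, add `@primary_key false` above the schema.
  schema "books_authors" do
    belongs_to :book, MyApp.Book
    belongs_to :author, MyApp.Author
  end

  def changeset(book_author, attrs \\ %{}) do
    book_author
    |> cast(attrs, [:book_id, :author_id])
    |> unique_constraint(:author_id, name: :books_authors_book_id_author_id_index)
  end
end

# In the Book schema, expose the join rows and go through them:
#   has_many :book_authors, MyApp.BookAuthor
#   has_many :authors, through: [:book_authors, :author]
#
# Inserting a duplicate pair via BookAuthor.changeset/2 and Repo.insert/1
# then returns {:error, changeset} rather than raising Postgrex.Error.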

Related

Delete many2many column in table and the table itself - odoo 13

While running a pre-migration script to delete a (wizard) transient model, I ended up with the issue mentioned below.
from openupgradelib import openupgrade

@openupgrade.migrate()
def migrate(env, version):
    openupgrade.delete_records_safely_by_xml_id(
        env,
        ["module_name.view_id"],
        delete_childs=True,
    )
    try:
        env.cr.execute("DROP TABLE IF EXISTS table_name CASCADE")
        env.cr.execute("DROP TABLE IF EXISTS dependent_table_names CASCADE")
    except Exception as e:
        raise Exception("Exception--------------", e)
Error:
psycopg2.errors.ForeignKeyViolation: update or delete on table "ir_model" violates foreign key constraint "ir_model_relation_model_fkey" on table "ir_model_relation"
Similar issue: https://github.com/odoo/odoo/issues/54178
According to the issue above, having a Many2many field on a transient model can cause this problem, and that is true in my case as well: I have many2many fields. No solution is given there.
I tried deleting the problematic (Many2many) fields before dropping the columns, but many2many fields are not stored as columns on the model's own table, so they can't be located that way in the db. I'm kind of stuck.
openupgrade.drop_columns(
    env.cr,
    [
        ("table_name", "any_other_column_name"),  # ---> This works
        ("table_name", "many2many_column_name"),  # ---> This doesn't
    ],
)
Is there any way to get rid of many2many fields from the model? Any help is appreciated.
Could you try this:
Let's say your transient model is my_transient_model and the Many2many field is e.g. sale_line_ids = fields.Many2many('sale.order.line').
First thing to know: did you specify the relation table, like
sale_line_ids = fields.Many2many('sale.order.line', 'my_relation_table_name') ?
If so, 'my_relation_table_name' is the name you want to delete from ir_model_relation.
If not, the default relation table name is my_transient_model_sale_order_line_rel (the model name, then _, then the target model's name with _ instead of ., then _rel).
Second step: delete the row from ir_model_relation:
DELETE FROM ir_model_relation WHERE name='my_transient_model_sale_order_line_rel';
Then you should be able to drop the Many2many relation table:
DROP TABLE my_transient_model_sale_order_line_rel;
(Of course, replace my_transient_model_sale_order_line_rel with my_relation_table_name if you specified the relation table explicitly, as in the example.)
Hope it helped, keep me updated :)
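Putting those two steps together in a pre-migration script might look roughly like this (just a sketch; the relation table name is the placeholder from above):
from openupgradelib import openupgrade

@openupgrade.migrate()
def migrate(env, version):
    # Remove the registry row that references the relation table first,
    # otherwise deleting the model trips ir_model_relation's foreign key.
    env.cr.execute(
        "DELETE FROM ir_model_relation WHERE name = %s",
        ("my_transient_model_sale_order_line_rel",),
    )
    # Now the relation table itself can be dropped.
    env.cr.execute(
        "DROP TABLE IF EXISTS my_transient_model_sale_order_line_rel CASCADE"
    )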

Fluent NHibernate Schema output with errors when using list

I have two tables which are mapped Many-To-One. However, it is important to maintain the order of the second table. When I use automapping, the Fluent automapper creates a bag, so I forced a list with this override:
.Override(Of ingredients)(Function(map) map.HasMany(Function(x) x.PolygonData).AsList())
(VB.NET syntax)
So with AsList the generated mapping XML now contains a list instead of a bag. Fine so far. However, the generated statement cannot be handled by MySQL. I use MySQL55Dialect to create the statements and I run exactly that version, but it produces the following CREATE TABLE:
create table `ingredients` (
Id INTEGER NOT NULL AUTO_INCREMENT,
Name FLOAT,
Amout FLOAT,
Soup_id INTEGER,
Index INTEGER,
primary key (Id)
)
It crashes because of the line "Index INTEGER," but I don't know what to do here. Any ideas?
Thanks!!
Best,
Chris
I would suspect that Index is a reserved keyword in MySQL. To avoid the conflict, we can give the index column a different name (sorry for the C# notation):
HasMany(x => x.PolygonData)
.AsList(idx => idx.Column("indexColumnName").Type<int>())
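Alternatively, if you want to keep the column named Index, NHibernate treats names wrapped in backticks as identifiers to be quoted with the dialect's quote character, so a sketch like this should also get past the reserved word (untested):
HasMany(x => x.PolygonData)
    .AsList(idx => idx.Column("`Index`").Type<int>())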

JSON Schema validation in PostgreSQL?

I can't find any information about JSON Schema validation in PostgreSQL. Is there any way to implement JSON Schema validation on the PostgreSQL JSON data type?
There is another PostgreSQL extension that implements JSON validation. The usage is almost the same as "Postgres-JSON-schema":
CREATE TABLE example (id serial PRIMARY KEY, data jsonb);
-- do is_jsonb_valid instead of validate_json_schema
ALTER TABLE example ADD CONSTRAINT data_is_valid CHECK (is_jsonb_valid('{"type": "object"}', data));
INSERT INTO example (data) VALUES ('{}');
-- INSERT 0 1
INSERT INTO example (data) VALUES ('1');
-- ERROR: new row for relation "example" violates check constraint "data_is_valid"
-- DETAIL: Failing row contains (2, 1).
I've done some benchmarking validating tweets and it is 20x faster than "Postgres-JSON-schema", mostly because it is written in C instead of SQL.
Disclaimer: I wrote this extension.
There is a PostgreSQL extension that implements JSON Schema validation in PL/PgSQL.
It is used like this (taken from the project README file):
CREATE TABLE example (id serial PRIMARY KEY, data jsonb);
ALTER TABLE example ADD CONSTRAINT data_is_valid CHECK (validate_json_schema('{"type": "object"}', data));
INSERT INTO example (data) VALUES ('{}');
-- INSERT 0 1
INSERT INTO example (data) VALUES ('1');
-- ERROR: new row for relation "example" violates check constraint "data_is_valid"
-- DETAIL: Failing row contains (2, 1).
What you need is something to translate JSON Schema constraints into PostgreSQL ones, e.g.:
{
"properties": {
"age": {"minimum": 21}
},
"required": ["age"]
}
to:
SELECT ... FROM ...
WHERE (elem->>'age')::int >= 21
I'm not aware of any existing tools. I know of something similar for MySQL which might be useful for writing your own, but nothing for using the JSON type in PostgreSQL.
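For example, the schema above could be hand-translated into a check constraint along these lines (table and column names here are placeholders, and a jsonb column is assumed):
ALTER TABLE people
  ADD CONSTRAINT age_is_valid
  CHECK (data ? 'age' AND (data->>'age')::numeric >= 21);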

DoctrineORM : Error while inserting data into a table which has foreign key attribute

I have the following database structure:
The "subcat_id" column in the "Course" table points to the "id" column in the "sub_category" table.
The "instructor_id" column in the "Course" table points to the "id" column in the "user" table.
I want to insert data into the "course" table. I am using the Symfony2 framework with Doctrine as the database library. When I try to insert data into the course table using the following statements:
$newCourse = new \FrontEndBundle\Entity\Course();
$newCourse->setSubcat($data['subcat']);
$newCourse->setName($data['coursename']);
$newCourse->setInstructor($instructorId);
$newCourse->setDescription($data['description']);
$em->persist($newCourse);
$em->flush();
I get an error ($newCourse is an object of the Course entity class).
I think the error relates to a foreign key issue. Can anyone help me with how to insert data into the course table correctly?
Thanks in advance!
This problem is because you are passing an id, which is not a SubCategory object, to the setter on the Course entity.
So you have to retrieve your SubCategory object first, then set it on the Course entity.
Try it this way:
$subCat = $this->getDoctrine()
    ->getRepository('FrontEndBundle:SubCategory')
    ->find($data['subcat']);
and then
$newCourse->setSubcat($subCat);
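The instructor setter most likely has the same problem; fetching the User entity the same way (the repository name here is an assumption) would look like:
$instructor = $this->getDoctrine()
    ->getRepository('FrontEndBundle:User')
    ->find($instructorId);

$newCourse->setInstructor($instructor);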

Why is ActiveRecord trying to delete the row corresponding to an object when it's deleted from an array?

Inside an AR object class definition (I'm not sure that's relevant, but I'm including that info in case it is), I have a method that collects an array of other AR objects and selectively deletes some of them inside a loop. Essentially:
class SomeApplicationModel < ActiveRecord::Base
  def user_method
    c_list = [array of c objects]
    p_list.each do |p|
      # ... a bunch of logic to determine if c should be deleted
      c_list.delete(c)
    end
  end
end
When it hits c_list.delete(c), I get an error in the logs from a relation that includes c and p:
ActiveRecord::StatementInvalid - Mysql2::Error: Cannot delete or update a parent row: a foreign key constraint fails (`stagingdb/c_p`, CONSTRAINT `c_p_ibfk_1` FOREIGN KEY (`cp_id`) REFERENCES `cs` (`id`)): DELETE FROM `cs` WHERE `cs`.`id` = 147:
Why is it trying to delete the record in the db that corresponds to c here (the stack trace pegs the .delete line as where the error is thrown)?
If c_list is an ActiveRecord::Relation instead of an Array, then delete will remove the value from the database. You may want to check the value of c_list.class. Depending on what you want to do, you could either use to_a to work with an Array or translate your filtering logic into a set of ActiveRecord::Relation methods.
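A small sketch of both options (the association and variable names here are illustrative only):
# Option 1: materialize a plain Array, so delete only affects the in-memory copy.
c_list = p.cs.to_a
c_list.delete(c)            # no SQL is issued

# Option 2: keep it a Relation and do the filtering in the database instead.
c_list = p.cs.where.not(id: ids_to_exclude)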