SQLAlchemy not marking session as dirty for changed attribute

When attempting to change an object loaded in with SQLAlchemy, the session.dirty object is not behaving the way I'd expect:
o = sqla.session.query(sqla.Game).first()
<sqla.Game at 0x7fdcc1f707b8>
o.wiki
<null>
o.wiki = 'test'
o.wiki
'test'
sqla.session.is_modified(o)
False
sqla.session.dirty
IdentitySet([])
inspect(o).attrs['wiki'].history
History(added=(), unchanged=['test'], deleted=())
Committing this to the database does in fact update it, but I'm really unclear on why it's marked as "unchanged". If I modify a relationship on the object, that does properly show up in the "added" and "deleted" areas of the history. I'm loading the models via automap, and the session does not have autocommit on.
I have also tried manually calling flag_modified, and specifying the column directly (without automap) to no avail.

Annnd I figured it out. I had autoflush on by default. Turning it to False fixed the issue.
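For reference, a minimal sketch of what that change looks like (the engine URL and the Game class are placeholders standing in for the question's automap setup, not the actual code):

from sqlalchemy import create_engine
from sqlalchemy.orm import sessionmaker

engine = create_engine("mysql://user:pass@localhost/mydb")  # hypothetical URL

# autoflush defaults to True; passing False here is the change that
# resolved the behaviour described above.
Session = sessionmaker(bind=engine, autoflush=False)
session = Session()

game = session.query(Game).first()  # Game comes from automap in the question
game.wiki = 'test'
session.is_modified(game)           # expected to report True now
session.commit()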

Related

Django save/update not changing data in database

Django 1.11.7
MySQL
I was trying to change the value of an object like this:
# change the value of the filed and save
def patch(...):
    instance.field_name = new_name
    instance.save()
    print(instance.filed_name)
When I run the code, the print shows new_name, but when I check the database manually the value is still old_name.
Then I tried ways like:
instance.save(update_fields=['field'])
and
ModelName.objects.filter(id=instance.id).update(field_name=new_name)
but got the same problem. Meanwhile, the rest of the project works perfectly except for this segment of code.
Any idea what caused this problem or suggestion on how to solve it?
Is that piece of code inside a transaction? Maybe the transaction gets rolled back somewhere later.
When you read from the DB are you inside a transaction? Some transaction modes may not show you this change.
Are you sure that field_name is the correct field name? Maybe you have a typo and are just setting a plain attribute on the object without changing the model field. From what I see, you sometimes type "field_name" and sometimes "filed_name".
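A quick way to test the two suggestions above (a hypothetical sketch; field_name and new_name are the question's placeholders, not a real model):

from django.db import transaction

def patch(instance, new_name):
    instance.field_name = new_name
    instance.save(update_fields=['field_name'])

    # Re-read the row instead of trusting the in-memory object; if this still
    # shows the old value, the UPDATE never reached the database (for example
    # because an enclosing transaction was rolled back later).
    instance.refresh_from_db(fields=['field_name'])
    print(instance.field_name)

# If the save runs inside an atomic block, an exception later in the same
# block rolls the UPDATE back:
# with transaction.atomic():
#     patch(instance, new_name)
#     raise Exception("boom")  # this undoes the save above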

SQLAlchemy override reflected columns dynamically

I'm using SA in a script that will periodically 'copy' a subset of MySQL tables from a 'production' replica to dev/test systems. I had written code to simply reflect the source tables and call meta.create_all(destination_engine). Due to the nature of FKs, I now know I need to apply use_alter=True to the ForeignKeys on the tables as I create them so that I won't get CircularDependencyErrors or other problems. I have to assume I don't know how many FKs there are, or their names, until I go through the metadata.
I'm new to SA and typically a Java programmer (as you will tell :D). I first tried to change the use_alter attribute iteratively:
tablesd = smeta.tables.items()
for tname, t in tablesd:
    for c in t.columns:
        for fk in c.foreign_keys:
            fk.use_alter = True
smeta.create_all(to_engine)
EDIT: It's important to note that create_all() does NOT throw a CircularDependencyError after I set the use_alter property like I do above. If I remove that code, create_all() does not work. It just doesn't seem to be removing the FKs from the CREATE statements...
This obviously didn't work. I then read Overriding Reflected Columns in the SA docs, sample being:
mytable = Table('mytable', meta,
    Column('id', Integer, primary_key=True),  # override reflected 'id' to have primary key
    Column('mydata', Unicode(50)),            # override reflected 'mydata' to be Unicode
    autoload=True)
I'd guess reflecting each table individually and then adding use_alter=True in the FK definition would work, but I CANNOT assume the names, values, or number of FKs/columns. I read a lot about using DeclarativeBase to do something like this, but I'm not really sure how that would work...
How can I take my arbitrary list of tables, reflect them, then Override the use_alter option on their respective foreign keys? Am I thinking about this the wrong way?
The answer ended up being inside the problem (imagine that...). Although each ForeignKey object has a use_alter value that can be set, the Constraints also have a separate use_alter property that can be set (I was not able to find this in the API documentation). After running it through PyDev's debugger, I noticed the former were being set, but all the keys that had Constraints associated with them still had it as False. I set them to True thusly:
for fk in table.foreign_keys:
    fk.use_alter = True
    fk.constraint.use_alter = True
This produced the SQL I was looking for: the tables were created correctly with no CircularDependencyErrors, and metadata.sorted_tables worked fine with no errors. I was actually able to refactor my code and do things the RIGHT way!
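Putting it all together, a condensed sketch of the approach (the engine URLs are placeholders, and this assumes the plain MetaData.reflect() / create_all() workflow from the question):

from sqlalchemy import create_engine, MetaData

source_engine = create_engine("mysql://user:pass@prod-replica/mydb")  # hypothetical
dest_engine = create_engine("mysql://user:pass@dev-box/mydb")         # hypothetical

metadata = MetaData()
metadata.reflect(bind=source_engine)

# Flag every foreign key AND its owning constraint so the FKs are emitted as
# ALTER TABLE statements after all of the CREATE TABLE statements.
for table in metadata.tables.values():
    for fk in table.foreign_keys:
        fk.use_alter = True
        fk.constraint.use_alter = True

metadata.create_all(dest_engine)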
For anyone looking to do DB-->DB reflecting with complex FKs using SQLAlchemy, this answer and Tyler Lesmann's article are for you.
UPDATE: Using this method has passed a peer review and is now being used as production code. Seems to work well!

Define custom POST method for MyDAC

I have three tables: objects (primary key object_ID), flags (primary key flag_ID) and object_flags (a cross-table between objects and flags with some extra info).
I have a query returning all flags, plus a one or a zero indicating whether a given object has a certain flag:
SELECT
    f.*,
    of.*,
    of.object_ID IS NOT NULL AS object_has_flag
FROM
    flags f
    LEFT JOIN object_flags of
        ON (f.flag_ID = of.flag_ID) AND (of.object_ID = :objectID);
In the application (which is written in Delphi), all rows are loaded in a component. The user can assign flags by clicking check boxes in a table, modifying the data.
Suppose one line is edited. Depending on the value of object_has_flag, the following things have to be done:
If object_has_flag was true and still is true, an UPDATE should be done on the relevant row in objects_flags.
If object_has_flag was false but is now true, an INSERT should be done
If object_has_flag was true, but is now false, the row should be deleted
It seems that this cannot be done in one query https://stackoverflow.com/questions/7927114/conditional-replace-or-delete-in-one-query.
I'm using MyDAC's TMyQuery as a dataset. I have written separate code that executes the necessary queries to save changes to a row, but how do I couple this to the dataset? What event handler should I use, and how do I tell the TMyQuery that it should refresh instead of post?
EDIT: apparently, it is not completely clear what the problem is. The standard UpdateSQL, DeleteSQL and InsertSQL cannot be used because sometimes after editing a line (not deleting it or inserting a line), an INSERT or DELETE has to be done.
The short answer is, to paraphrase your answer here:
Look up the documentation for "Updating Data with MyDAC Dataset Components" (as of MyDAC 5.80).
Every TCustomDADataSet (such as TMyQuery) descendant has the capability to set update SQL statements using SQLInsert, SQLUpdate and SQLDelete properties.
TMyUpdateSQL is also a promising component for custom update operations.
It seems that the easiest way is to use the BeforePost event, and determine what has to be done using the OldValue and NewValue properties of several fields.

ActiveRecord caching and update_attributes

If a model changes an attribute locally, then changes it back, ActiveRecord doesn't send the change to the DB. This is great for performance, but if something else changes the database, and I want to revert it to the original value, the change doesn't take:
model = Model.find(1)
model.update_attribute(:a, 1) # start it off at 1
# some code here that changes model.a to 2
model.a = 2 # I know it changed, reflecting my local model of the change
model.update_attribute(:a, 1) # try to change it back, DOESN'T WORK
The last line doesn't work because AR thinks in the DB it's still 1, even though something else changed it to 2. How can I force an AR update, or update the cache directly if I know the new value?
Side note: the code that changes it is an update_all query that locks the record, but it has side effects that mess up the cache. Multiple machines read this table. If there's a better way to do this I'd love to know.
Model.update_all(["locked_by = ?", lock_name], ["id = ? AND locked_by IS NULL", id])
Use the reload method for this.
model.reload(:select => "a")
OR
You can try the will_change! method (it's not clear how your change happens, but you can try this method).
model.update_attribute(:a, 1) # start it off at 1
model.a_will_change! # forewarn the model about the change
model.a = 2 # perform the change
model.update_attribute(:a, 1)
The answer by Harish Shetty is correct: you need to call reload on the reference. However, I found a better way to do that automatically.
In the model whose attribute you want to reload, create an after_update callback and call reload directly there, like so:
after_update :reload_attr

def reload_attr
  reload select: "attr"
end

Linq update with explicit operator

I am trying to do an update with LINQ using an explicit cast, and the changes aren't submitting.
Here is the code
Image update = db.Images.Where(i => i.ImageId == imageWithChanges.ImageId).SingleOrDefault();
update = (Image)imageWithChanges;
db.SubmitChanges();
I have an explicit operator in my image class. Can anyone help?
Thanks
The line
update = (Image)imageWithChanges;
is not changing anything. It merely reassigns the local variable update to point at a different object; the entity loaded from the context is untouched. If you want to actually change the image, you'd probably have to copy each property from imageWithChanges to update.
Another way you can do this is to attach imageWithChanges to db.Images and say it is a modified instance:
db.Images.Attach((Image)imageWithChanges, true); // true means "it's modified"
db.SubmitChanges();
You say you got it fixed, but don't tell how.
For all others who read this: I agree with Ruben, you have to Attach it. The error it gives you is valid; you either have to handle concurrency checking (with a timestamp or version number) or let "last in wins" (by setting UpdateCheck to Never for all your entity's properties).