Database model custom JSONField values persisting between test cases

I'm using a Django database model to store objects corresponding to a remote ticket service. In testing this model, I'm running several tests to make sure that log messages work correctly for the database model - I'm using Django-nose to run these tests.
The database model looks something like this, using the JSONField from this StackOverflow answer with a small modification to support lists as well as dicts:
class TicketEntity(django.db.models.Model):
    tickets = JSONField(null=True, blank=True, default=[], serialize=True)
    queued_ticket_data = JSONField(null=True, blank=True, default=[], serialize=True)
    ...

    @classmethod
    def create(cls, *args, **kwargs):
        instance = cls(*args, **kwargs)
        instance.save()
        return instance

    def queue_output_for_ticket(self, log_data):
        self.queued_ticket_data += [log_data]
        self.save()

    def push_ticket(self):
        self.tickets += [self.queued_ticket_data]
        self.queued_ticket_data = []
        self.save()
        return True
The tests (in the order they appear to get run) look like this:
def test_ticket_entity_push_ticket(self):
    entity = TicketEntity.create()
    log_data = "log output"
    log_data_1 = "more log output"
    entity.queue_output_for_ticket(log_data)
    entity.queue_output_for_ticket(log_data_1)
    entity.push_ticket()
    self.assertEquals(entity.tickets, [[log_data, log_data_1]])
    self.assertFalse(entity.queued_ticket_data)

def test_ticket_entity_queue_output_for_ticket(self):
    entity = TicketEntity.create()
    log_data = "Here's some output to log.\nHere's another line of output.\nAnd another."
    entity.queue_output_for_ticket(log_data)
    self.assertEquals(entity.queued_ticket_data, [log_data])
The first test passes perfectly fine. The second test fails on its assert statement, because entity.queued_ticket_data looks like this:
["log output",
"more log output",
"Here's some output to log.\nHere's another line of output.\nAnd another."]
Those first two elements are there at the very start of the test, immediately after we call TicketEntity.create(). They shouldn't be there - the new instance should be clean because we're creating it from scratch.
Similarly, tickets is pre-filled. The TicketEntity has other fields that are not JSONFields, and they do not exhibit this problem.
Can someone shed some light on why this problem might be happening, and what kind of modification we'd need for a fix?
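For what it's worth, one classic Python pitfall matches these symptoms exactly (hedged, since the linked JSONField implementation isn't shown here): default=[] hands a single shared list object to every field instance, so in-place mutation leaks state across instances. A minimal sketch of the usual fix, assuming the custom JSONField accepts callable defaults the way Django's built-in fields do:

tickets = JSONField(null=True, blank=True, default=list, serialize=True)
queued_ticket_data = JSONField(null=True, blank=True, default=list, serialize=True)

With a callable default, each new TicketEntity gets a fresh empty list instead of a reference to the shared one.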

Related

Odoo 10 QWEB report: how to pass a value that is used in methods in the parser?

I have a model that saves reports in binary fields for archiving. To do that I use get_pdf():
document = self.env['report'].sudo().get_pdf(ids, report_name)
The problem is when I want to create a report that doesn't use the model's own fields but has to compute values from models related to the record that is passed in ids.
My report model
class ReportHistory(models.AbstractModel):
    _name = 'report.hr.report_history'

    def _get_report(self, ids):
        record = self.env['hr.history'].search([('id', '=', ids[0])])
        return record

    def _get_company(self, ids):
        rec = self._get_report(ids)
        if len(rec) > 0:
            return rec[0].company_name
My biggest problem is that I can't debug, so I can't see what data is passed. print, the logger, and raise ValidationError won't work, probably because I'm running Odoo on a Windows PC.
Every answer I found said to pass values to the report like this, but it doesn't work:
@api.model
def render_html(self, docids, data=None):
    docargs = {
        'doc_ids': self.ids,
        'doc_model': self.model,
        'data': data,
        'company': self._get_company,
    }
    return self.env['report'].render()
So how do I correctly pass values from methods to the report? Or did I just make a dumb mistake?
Try this:
return self.env['report'].render(report_name, docargs)
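For context, a minimal sketch of the corrected method; the template name 'hr.report_history' is an assumption derived from the _name above, not something stated in the thread:

@api.model
def render_html(self, docids, data=None):
    docargs = {
        'doc_ids': self.ids,
        'doc_model': self.model,
        'data': data,
        'company': self._get_company,
    }
    # Pass both the template name and the values dict to render().
    return self.env['report'].render('hr.report_history', docargs)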

Implementing MySQL "generated columns" on Django 1.8/1.9

I discovered the new generated columns functionality of MySQL 5.7 and wanted to replace some properties of my models with that kind of column. Here is a sample model:
class Ligne_commande(models.Model):
    Quantite = models.IntegerField()
    Prix = models.DecimalField(max_digits=8, decimal_places=3)
    Discount = models.DecimalField(max_digits=5, decimal_places=3, blank=True, null=True)

    @property
    def Prix_net(self):
        if self.Discount:
            return (1 - self.Discount) * self.Prix
        return self.Prix

    @property
    def Prix_total(self):
        return self.Quantite * self.Prix_net
I defined generated field classes as subclasses of Django fields (e.g. GeneratedDecimalField as a subclass of DecimalField). This worked in a read-only context, and Django migrations handle it correctly, except for one detail: MySQL's generated columns do not support forward references, and Django migrations do not respect the order in which fields are defined in a model, so the migration file must be edited to reorder operations.
After that, trying to create or modify an instance returned the MySQL error: 'error totally whack'. I suppose Django tries to write the generated fields and MySQL doesn't like that. After taking a look at the Django code, I realized that, at the lowest level, Django uses the _meta.local_concrete_fields list and sends it to MySQL. Removing the generated fields from this list fixed the problem.
I encountered another problem: during the modification of an instance, generated fields don't reflect the changes that have been made to the fields from which they are computed. If generated fields are used during instance modification, as in my case, this is problematic. To fix that point, I created a "generated field descriptor".
Here is the final code of all of this.
Creation of generated fields in the model, replacing the properties defined above:
Prix_net = mk_generated_field(models.DecimalField, max_digits=8, decimal_places=3,
                              sql_expr='if(Discount is null, Prix, (1.0 - Discount) * Prix)',
                              pyfunc=lambda x: x.Prix if not x.Discount else (1 - x.Discount) * x.Prix)
Prix_total = mk_generated_field(models.DecimalField, max_digits=10, decimal_places=2,
                                sql_expr='Prix_net * Quantite',
                                pyfunc=lambda x: x.Prix_net * x.Quantite)
Function that creates generated fields. Classes are dynamically created for simplicity:
import re  # used by db_type below

from django.db.models import fields

def mk_generated_field(field_klass, *args, sql_expr=None, pyfunc=None, **kwargs):
    assert issubclass(field_klass, fields.Field)
    assert sql_expr
    generated_name = 'Generated' + field_klass.__name__
    try:
        generated_klass = globals()[generated_name]
    except KeyError:
        globals()[generated_name] = generated_klass = type(generated_name, (field_klass,), {})

        def __init__(self, sql_expr, pyfunc=None, *args, **kwargs):
            self.sql_expr = sql_expr
            self.pyfunc = pyfunc
            self.is_generated = True  # mark the field
            # null must be True otherwise migration will ask for a default value
            kwargs.update(null=True, editable=False)
            super(generated_klass, self).__init__(*args, **kwargs)

        def db_type(self, connection):
            assert connection.settings_dict['ENGINE'] == 'django.db.backends.mysql'
            result = super(generated_klass, self).db_type(connection)
            # double any single '%' because it will clash with later Django formatting
            sql_expr = re.sub('(?<!%)%(?!%)', '%%', self.sql_expr)
            result += ' GENERATED ALWAYS AS (%s)' % sql_expr
            return result

        def deconstruct(self):
            name, path, args, kwargs = super(generated_klass, self).deconstruct()
            kwargs.update(sql_expr=self.sql_expr)
            return name, path, args, kwargs

        generated_klass.__init__ = __init__
        generated_klass.db_type = db_type
        generated_klass.deconstruct = deconstruct

    return generated_klass(sql_expr, pyfunc, *args, **kwargs)
The function to register generated fields in a model. It must be called at django start-up, for example in the ready method of the AppConfig of the application.
from django.utils.datastructures import ImmutableList

def register_generated_fields(model):
    local_concrete_fields = list(model._meta.local_concrete_fields[:])
    generated_fields = []
    for field in model._meta.fields:
        if hasattr(field, 'is_generated'):
            local_concrete_fields.remove(field)
            generated_fields.append(field)
            if field.pyfunc:
                setattr(model, field.name, GeneratedFieldDescriptor(field.pyfunc))
    if generated_fields:
        model._meta.local_concrete_fields = ImmutableList(local_concrete_fields)
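For illustration, a minimal sketch of wiring this up at start-up; the app label 'shop' is a placeholder, and Ligne_commande is the model from the example above:

from django.apps import AppConfig

class ShopConfig(AppConfig):
    name = 'shop'  # hypothetical app name

    def ready(self):
        # Models are loaded by now, so it is safe to patch their _meta.
        from .models import Ligne_commande
        register_generated_fields(Ligne_commande)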
And the descriptor. Note that it is used only if a pyfunc is defined for the field.
class GeneratedFieldDescriptor(object):
    attr_prefix = '_GFD_'

    def __init__(self, pyfunc, name=None):
        self.pyfunc = pyfunc
        self.nickname = self.attr_prefix + (name or str(id(self)))

    def __get__(self, instance, owner):
        if instance is None:
            return self
        if hasattr(instance, self.nickname) and not instance.has_changed:
            return getattr(instance, self.nickname)
        return self.pyfunc(instance)

    def __set__(self, instance, value):
        setattr(instance, self.nickname, value)

    def __delete__(self, instance):
        delattr(instance, self.nickname)
Note the instance.has_changed attribute, which must tell whether the instance is being modified; I found a solution for this here.
I have done extensive tests of my application and it works fine, but I am far from using all of Django's functionality. My question is: could this setup clash with some use cases of Django?

Django REST Errors when serializing model with ManyToManyField

I've created a class modeling a group of files within a software product build, on a Django server using the Django REST Framework package. The design is that a group of files (a Depot instance) should be able to be assigned to multiple Build instances (e.g. both the "alpha" and "beta" builds using the same exact audio file depot). However, at the time that the depot is created, it is being created as part of the creation of a single Build on the client; it is only later that a utility script will allow an existing Depot to be added to other Builds.
It seemed natural to me that the Depot class should represent this relationship with a ManyToManyField. The problem is that the serializer does not seem to know what to do with this ManyToManyField. I've tried several workarounds, but each has its own error. I've tried having my DepotSerializer be either a rest_framework.serializers.Serializer or a rest_framework.serializers.ModelSerializer, but that seems largely unrelated to this problem.
Models.py:
class Depot(models.Model):
    name = models.CharField(max_length=64)
    builds = models.ManyToManyField(Build)

    TYPE_EXECUTABLE = 0
    TYPE_CORE = 1
    TYPE_STREAMING = 2
    depot_type = models.IntegerField(choices=(
        (TYPE_EXECUTABLE, 'Executable'),
        (TYPE_CORE, 'Core'),
        (TYPE_STREAMING, 'Streaming'),
    ))

    def __str__(self):
        return self.name
Views.py:
class DepotCreate(mixins.CreateModelMixin,
                  generics.GenericAPIView):
    serializer_class = DepotSerializer
    queryset = Depot.objects.all()

    def post(self, request, *args, **kwargs):
        return self.create(request, *args, **kwargs)
Serializers.py version 1:
class DepotSerializer(serializers.ModelSerializer):
    builds = serializers.PrimaryKeyRelatedField()

    class Meta:
        model = Depot
        fields = ('id', 'name', 'builds', 'depot_type')
        read_only_fields = ('id',)

    def validate(self, attrs):
        build = attrs['builds']
        if build is None:
            raise serializers.ValidationError("Build could not be found")
        for depot in build.depot_set.all():
            if depot.name == attrs['name']:
                raise serializers.ValidationError("Build already contains a depot \"{}\"".format(depot.name))
        return attrs

    def restore_object(self, attrs, instance=None):
        # existence of the build has already been validated
        return Depot(**attrs)
This version results in the following error during the Depot init call:
Exception Type: TypeError
Exception Value:
'builds' is an invalid keyword argument for this function
Exception Location: /webapps/cdp_admin_django/lib/python3.4/site-packages/django/db/models/base.py in __init__, line 417
That appears to indicate that the Depot model cannot handle the 'builds' parameter despite the fact that it has a 'builds' ManyToManyField member.
Serializers.py 'restore_object' ver 2:
def restore_object(self, attrs, instance=None):
    # existence of the build has already been validated
    build = attrs['builds']
    depotObj = Depot(name=attrs['name'], depot_type=attrs['depot_type'])
    depotObj.builds.add(build)
    return depotObj
This gave me the error:
Exception Type: ValueError
Exception Value:
"<Depot: depot_test4>" needs to have a value for field "depot" before this many-to-many relationship can be used.
Exception Location: /webapps/cdp_admin_django/lib/python3.4/site-packages/django/db/models/fields/related.py in __init__, line 524
After quite a bit of investigation, I found that ManyToMany relationships can give you trouble if you don't save the MySQL row before attempting to manipulate that field. Hence, restore_object ver 3:
def restore_object(self, attrs, instance=None):
    # existence of the build has already been validated
    build = attrs['builds']
    depotObj = Depot(name=attrs['name'], depot_type=attrs['depot_type'])
    depotObj.save()
    depotObj.builds.add(build)
    return depotObj
This does successfully create the table entry for this instance, but ends up throwing the following error:
Exception Type: IntegrityError
Exception Value:
(1062, "Duplicate entry '5' for key 'PRIMARY'")
Exception Location: /webapps/cdp_admin_django/lib/python3.4/site-packages/MySQLdb/connections.py in defaulterrorhandler, line 38
This error takes place during rest_framework/mixins.py's call to serializer.save(force_insert=True), which looks like it is supposed to force the creation of a new table entry, presumably conflicting with my earlier call to Model.save.
Does anyone know the correct approach for a design like this? I feel like this can't be that unusual of a table structure.
EDIT 10/20/2014:
After the suggestion below, I experimented with writing a new ModelSerializer for one of my models; for the most part because of these types of order-of-operations problems, I'd backed off from using ModelSerializer and did all of my data-to-object field processing in views.py by reading serializer.data.
Having a PrimaryKeyRelatedField(many=True) in the ModelSerializer DID help. Notably, I was able to create a serializer instance with existing models and get the correct serializer.data. However, I still have the problem where restore_object can do everything except create a new model instance and pass down the ManyToManyField value. I still get "TypeError: '[PrimaryKeyRelatedField name]' is an invalid keyword argument for this function" if I pass the field to the model's init func. I still cannot save the model before the REST library does it itself. In addition, in this mode, the serializer populates serializer.data with the values of the Model, not the values provided in the data input. So if you do not use the PrimaryKeyRelatedField's attrs value in restore_object, it is discarded.
It appears that I need to override ModelSerializer.save to do some kind of pre-save, apply the ManyToMany input, and then post-save, but I would need the attrs values so I can apply and modify the ManyToManyField at that time. I realize that the serializer does have the init_data field for seeing the original inputs, but in the case where the serializer is being used to deserialize a list of data into a list of new objects, I don't think there's a way to trace which serializer.init_data corresponds with which serializer.object.
In your serializer version 1, you do not have to add
builds = serializers.PrimaryKeyRelatedField()
as the model serializer will create this for you. In fact, if you look at the example in the documentation (http://www.django-rest-framework.org/api-guide/relations/), you'll see that PrimaryKeyRelatedField is applied when there is a FK 'to' the current model (not an M2M relation).
I would remove this from the serializer and see then what's going on.
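For example, a sketch of the trimmed-down serializer (hedged, since the exact DRF version isn't stated; on some versions you may still need an explicit many=True on the relation):

class DepotSerializer(serializers.ModelSerializer):
    class Meta:
        model = Depot
        fields = ('id', 'name', 'builds', 'depot_type')
        read_only_fields = ('id',)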

How do we update an HSTORE field with Flask-Admin?

How do I update an HSTORE field with Flask-Admin?
The regular ModelView doesn't show the HSTORE field in Edit view. It shows nothing. No control at all. In list view, it shows a column with data in JSON notation. That's fine with me.
Using a custom ModelView, I can change the HSTORE field into a TextAreaField. This will show me the HSTORE field in JSON notation when in edit view. But I cannot edit/update it. In list view, it still shows me the object in JSON notation. Looks fine to me.
class MyView(ModelView):
    form_overrides = dict(attributes=fields.TextAreaField)
When I attempt to save/edit the JSON, I receive this error:
sqlalchemy.exc.InternalError
InternalError: (InternalError) Unexpected end of string
LINE 1: UPDATE mytable SET attributes='{}' WHERE mytable.id = ...
^
'UPDATE mytable SET attributes=%(attributes)s WHERE mytable.id = %(mytable_id)s' {'attributes': u'{}', 'mytable_id': 14L}
Now -- using code, I can get something to save into the HSTORE field:
class MyView(ModelView):
    form_overrides = dict(attributes=fields.TextAreaField)

    def on_model_change(self, form, model, is_created):
        model.attributes = {"a": "1"}
        return
This basically overrides the model and puts this object into it. I can then see the object in the list view and the edit view. Still not good enough -- I want to save/edit the object that the user typed in.
I tried to parse and save the content from the form into JSON and back out. This doesn't work:
class MyView(ModelView):
    form_overrides = dict(attributes=fields.TextAreaField)

    def on_model_change(self, form, model, is_created):
        x = form.data['attributes']
        y = json.loads(x)
        model.attributes = y
        return
json.loads(x) says this:
ValueError: Expecting property name: line 1 column 1 (char 1)
and here are some sample inputs that fail:
{u's': u'ff'}
{'s':'ff'}
However, this input works:
{}
Blank also works
This is my SQL Table:
CREATE TABLE mytable (
    id BIGSERIAL UNIQUE PRIMARY KEY,
    attributes hstore
);
This is my SQA Model:
class MyTable(Base):
    __tablename__ = u'mytable'
    id = Column(BigInteger, primary_key=True)
    attributes = Column(HSTORE)
Here is how I added the views to the admin object:
admin.add_view(ModelView(models.MyTable, db.session))
Add the view using the custom model view:
admin.add_view(MyView(models.MyTable, db.session))
(But I don't add both views at the same time -- I get a Blueprint name collision error -- separate issue.)
I also attempted to use a form field converter. I couldn't get it to actually hit the code.
class MyModelConverter(AdminModelConverter):
    def post_process(self, form_class, info):
        raise Exception('here I am')  # but it never hits this
        return form_class

class MyView(ModelView):
    form_overrides = dict(attributes=fields.TextAreaField)
This answer gives you a bit more than was asked.
First of all, it "extends" hstore to be able to store actual JSON, not just key-value pairs.
So this structure is also OK:
{"key":{"inner_object_key":{"Another_key":"Done!","list":["no","problem"]}}}
First, your ModelView should use the custom converter:
class ExtendedModelView(ModelView):
    model_form_converter = CustomAdminConverter
The converter itself should know how to handle the hstore dialect:
class CustomAdminConverter(AdminModelConverter):
    @converts('sqlalchemy.dialects.postgresql.hstore.HSTORE')
    def conv_HSTORE(self, field_args, **extra):
        return DictToHstoreField(**field_args)
As you can see, this one uses a custom WTForms field which converts data in both directions:
class DictToHstoreField(TextAreaField):
    def process_data(self, value):
        if value is None:
            value = {}
        else:
            # Unwrap any values that look like serialized JSON objects/lists.
            for key, obj in value.iteritems():
                if (obj.startswith("{") and obj.endswith("}")) or (obj.startswith("[") and obj.endswith("]")):
                    try:
                        value[key] = json.loads(obj)
                    except ValueError:
                        pass
        self.data = json.dumps(value)

    def process_formdata(self, valuelist):
        if valuelist:
            self.data = json.loads(valuelist[0])
            # hstore values must be strings, so re-serialize nested structures.
            for key, obj in self.data.iteritems():
                if isinstance(obj, dict) or isinstance(obj, list):
                    self.data[key] = json.dumps(obj)
                if isinstance(obj, int):
                    self.data[key] = str(obj)
The final step is to actually use this data in the application.
I did not make this generic for SQLAlchemy, since it was used with flask-restful, so I only have an adaptation for flask-restful in one direction, but I think it's easy to get the idea from here and do the rest.
If your case is simple key-value storage, nothing additional needs to be done; just use it as is.
But if you want to unwrap the JSON somewhere in your code, just wrap this check in a function wherever you use it (see the sketch after this snippet):
if (value.startswith("{") and value.endswith("}")) or (value.startswith("[") and value.endswith("]")):
    value = json.loads(value)
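As a sketch, the same check wrapped into a helper; the function name is mine, not from the answer:

def maybe_json_loads(value):
    # Unwrap a JSON-looking string; leave anything else untouched.
    if (value.startswith("{") and value.endswith("}")) or (value.startswith("[") and value.endswith("]")):
        try:
            return json.loads(value)
        except ValueError:
            pass
    return value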
Creating dynamic fields for a nicer, non-JSON way of editing the data is also possible by extending FormField and adding some JavaScript for adding/removing fields, but that is a whole different story; in my case I needed actual JSON storage, with blackjack and lists :)
I was working with the Postgres JSON datatype. The above solution worked great with minor modifications.
I tried:
'sqlalchemy.dialects.postgresql.json.JSON',
'sqlalchemy.dialects.postgresql.JSON',
'dialects.postgresql.json.JSON',
'dialects.postgresql.JSON'
The above versions did not work.
Finally, the following change worked:
@converts('JSON')
And I changed the DictToHstoreField class to the following:
class DictToJSONField(fields.TextAreaField):
    def process_data(self, value):
        if value is None:
            value = {}
        self.data = json.dumps(value)

    def process_formdata(self, valuelist):
        if valuelist:
            self.data = json.loads(valuelist[0])
        else:
            self.data = '{}'
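Putting those pieces together, a sketch of the JSON converter method, mirroring the HSTORE converter above:

class CustomAdminConverter(AdminModelConverter):
    @converts('JSON')
    def conv_JSON(self, field_args, **extra):
        return DictToJSONField(**field_args)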
Although this might not be the answer to your question: by default, SQLAlchemy's ORM doesn't detect in-place changes to HSTORE field values. Fortunately there's a solution: SQLAlchemy's MutableDict type:
from sqlalchemy.ext.mutable import MutableDict

class MyClass(Base):
    __tablename__ = 'mytable'
    id = Column(Integer, primary_key=True)
    attributes = Column(MutableDict.as_mutable(HSTORE))
Now when you change something in place:
my_object.attributes['some_key'] = 'some value'
the hstore field will be updated after session.commit().

PYQT : qCombobox displaying Column "Name" but passing Column "ID"

I've been trying very hard to get this working but so far haven't found the correct route.
I am using PyQt, and I am querying a MySQL database, building from it a model with all the columns. Up to here it's all good.
I've created a combobox that displays the correct text using setModelColumn(1).
What I need now is for this combobox to emit, on "activated", the unique ID of the selected record, so I am able to create a category relationship.
What exactly is the best way to do this? I feel I've arrived at a dead end; any help would be appreciated.
Best,
Cris
The best way would be subclassing QComboBox. You can't override the activated signal, but you can create a custom signal that will also be emitted with the ID whenever activated is emitted. You can then connect to this signal and do your stuff. It will be something like this:
class MyComboBox(QtGui.QComboBox):
    activatedId = QtCore.pyqtSignal(int)  # correct this if your ID is not an int

    def __init__(self, parent=None):
        super(MyComboBox, self).__init__(parent)
        self.activated.connect(self.sendId)

    @QtCore.pyqtSlot(int)
    def sendId(self, index):
        model = self.model()
        uniqueIdColumn = 0  # if the ID is elsewhere, adjust
        uniqueId = model.data(model.createIndex(index, uniqueIdColumn, 0), QtCore.Qt.DisplayRole)
        self.activatedId.emit(uniqueId)
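A short usage sketch; model stands in for the MySQL-backed model from the question:

combo = MyComboBox()
combo.setModel(model)    # the model queried from MySQL
combo.setModelColumn(1)  # display the "Name" column

def on_activated_id(unique_id):
    print("selected record id:", unique_id)

combo.activatedId.connect(on_activated_id)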
Edit
Here is a similar version without Signals. This will return uniqueId whenever you call sendId with an index of the combobox.
class MyComboBox(QtGui.QComboBox):
    def __init__(self, parent=None):
        super(MyComboBox, self).__init__(parent)

    def sendId(self, index):
        model = self.model()
        uniqueIdColumn = 0  # if the ID is elsewhere, adjust
        uniqueId = model.data(model.createIndex(index, uniqueIdColumn, 0), QtCore.Qt.DisplayRole)
        return uniqueId
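Usage is then a direct call, for example:

unique_id = combo.sendId(combo.currentIndex())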