Share Meta classes between two SQLAlchemyObjectType - sqlalchemy

I'm using the graphene_sqlalchemy package in my Flask app (https://github.com/graphql-python/graphene-sqlalchemy) and I need to make two queries using the same model. I was wondering whether it is possible for two types to share the same model, as below:
class A(SQLAlchemyObjectType):
    x = graphene.Int()

    class Meta:
        model = Model
        interfaces = (graphene.relay.Node,)

class B(SQLAlchemyObjectType):
    y = graphene.Int()

    class Meta:
        model = Model
        interfaces = (graphene.relay.Node,)
or using inheritance:
class A(SQLAlchemyObjectType):
    x = graphene.Int()

    class Meta:
        model = Model
        interfaces = (graphene.relay.Node,)

class B(A):
    y = graphene.Int()
But I'm getting errors with both of these approaches. Is there any workaround for it? Thank you in advance!

Related

How do I implement polymorphic dataclass models with default values in the base model in SQLAlchemy 2.0 / Declarative?

I'm in the process of migrating to SQLAlchemy 2.0 and adopting the new Declarative syntax with MappedAsDataclass. Previously, I implemented joined table inheritance for my models. The (simplified) code looks like this:
from sqlalchemy import ForeignKey, String
from sqlalchemy.orm import DeclarativeBase, Mapped, MappedAsDataclass, mapped_column

class Base(MappedAsDataclass, DeclarativeBase):
    pass

class Foo(Base):
    __tablename__ = "foo"
    id: Mapped[int] = mapped_column(primary_key=True)
    type: Mapped[str] = mapped_column(String(50))
    foo_value: Mapped[float] = mapped_column(default=78)
    __mapper_args__ = {"polymorphic_identity": "foo", "polymorphic_on": "type"}

class Bar(Foo):
    __tablename__ = "bar"
    id: Mapped[int] = mapped_column(ForeignKey("foo.id"), primary_key=True)
    bar_value: Mapped[float]
    __mapper_args__ = {"polymorphic_identity": "bar"}
The important bit for the question is the default value on foo_value. Because of its presence, a TypeError: non-default argument 'bar_value' follows default argument is raised. While moving fields around in the definition of a single class could make this error go away (but why is it raised in the first place, since the field order is not really important?), that's not possible with inherited models.
How can I fix or work around this limitation? Am I missing something relevant from the documentation?
It seems I needed to use insert_default with MappedAsDataclass instead of default, as described in the docs.

SQLAlchemy 1.4 abstracting MetaData.reflect into function not returning MetaData object?

I have a class that acts as a PostgreSQL database interface. It has a number of methods which do things with the MetaData, such as get table names, drop tables, etc.
These methods keep calling the same two lines to set up MetaData. I am trying to tidy this up by abstracting this MetaData setup into its own function which is initiated when the class is instantiated, but this isn't working, as the function keeps returning NoneType instead of the MetaData instance.
Here is an example of the class, BEFORE adding the MetaData function:
class Db:
    def __init__(self, config):
        self.engine = create_async_engine(ENGINE, echo=True, future=True)
        self.session = sessionmaker(self.engine, expire_on_commit=False, class_=AsyncSession)

    def get_table_names(self):
        meta = MetaData()
        meta.reflect(bind=sync_engine)
        return meta.tables.keys()
This works well, returns a list of table keys:
dict_keys(['user', 'images', 'session'])
When I try to shift the MetaData call into its own function like so:
class Db:
    def __init__(self, config):
        self.engine = create_async_engine(ENGINE, echo=True, future=True)
        self.session = sessionmaker(self.engine, expire_on_commit=False, class_=AsyncSession)
        self.meta = self.get_metadata()

    def get_metadata(self):
        meta = MetaData()
        return meta.reflect(bind=sync_engine)

    def get_table_names(self):
        return self.meta.tables.keys()
It returns this error:
in get_table_names return self.meta.tables.keys()
AttributeError: 'NoneType' object has no attribute 'tables'
How can I achieve this sort of functionality by calling self.meta() from within the various class methods?
MetaData.reflect() alters the metadata in place and returns None, so you need to return the meta variable explicitly:
class Db:
    # ...
    def get_metadata(self):
        meta = MetaData()
        meta.reflect(bind=sync_engine)
        return meta
    # ...
Although it might be better to do this in a factory function, like def db_factory(config): and inject these things already prepped in the class constructor, def __init__(self, metadata, engine, session):. Just a thought.
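To illustrate that suggestion, here is a rough sketch of the factory approach. The name db_factory is made up, and a synchronous SQLite engine is used for brevity in place of the async engine from the question:

```python
from sqlalchemy import MetaData, create_engine

class Db:
    # The constructor receives everything already prepped
    def __init__(self, metadata, engine):
        self.meta = metadata
        self.engine = engine

    def get_table_names(self):
        return self.meta.tables.keys()

def db_factory(url):
    # Build the engine and reflect the schema once, up front
    engine = create_engine(url)
    meta = MetaData()
    meta.reflect(bind=engine)
    return Db(metadata=meta, engine=engine)
```

This keeps the reflection out of the class entirely, so Db never has to worry about whether its metadata is populated.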
Just wanted to post an answer, as with someone else's help I was able to solve this. The code should look like this:
class Db:
    def __init__(self, config):
        self.engine = create_async_engine(ENGINE, echo=True, future=True)
        self.session = sessionmaker(self.engine, expire_on_commit=False, class_=AsyncSession)
        self._meta = MetaData()

    @property
    def meta(self):
        self._meta.reflect(bind=sync_engine)
        return self._meta

    def get_table_names(self):
        return self.meta.tables.keys()

Python SQLAlchemy question - what is the DictMixIn class used by the Record class?

I am very new to Python and SQLAlchemy. I stumbled upon this code while learning to use SQLAlchemy with Flask. Can you please help me understand the DictMixIn class - what are we doing here? Why are we using this?
class DictMixIn:
    def to_dict(self):
        return {
            column.name: getattr(self, column.name)
            if not isinstance(
                getattr(self, column.name), (datetime.datetime, datetime.date)
            )
            else getattr(self, column.name).isoformat()
            for column in self.__table__.columns
        }

class Record(Base, DictMixIn):
    __tablename__ = "Records"
    id = Column(Integer, primary_key=True, index=True)
    date = Column(Date)
    country = Column(String, index=True)
    cases = Column(Integer)
    deaths = Column(Integer)
    recoveries = Column(Integer)
At the end, the following code snippet was used - I believe they are using the to_dict function above to print it. Am I right?
def show_records():
    records = app.session.query(models.Record).all()
    return jsonify([record.to_dict() for record in records])
The original code is here - https://github.com/edkrueger/sars-flask
I would really appreciate your help.
A mixin lets you add methods and properties to a class outside of the direct class hierarchy. You can then use the same mixin to inject methods/properties into many classes, i.e. preventing duplication by not writing that same to_dict() into each database model. This mixin helps extend model classes in this case, but the pattern is a general Python pattern, not specific to SQLAlchemy.
This specific mixin just lets you convert a database model instance into a simple dictionary. It seems that the intent as you pointed out is to jsonify a database model instance. JSON does not support Dates so that is why special care is taken to convert datetimes into a str with isoformat().
record.to_dict() might output something like:
{
    "id": 100,
    "date": "2021-09-11",
    "country": "USA",
    "cases": 100000000,
    "deaths": 600000,
    "recoveries": 95000000
}
mixin explained on stackoverflow
multiple inheritance in python docs -- the mixin pattern utilizes multiple inheritance in Python
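To see the pattern without any SQLAlchemy machinery, here is a stdlib-only sketch of the same idea; the class names are invented for illustration:

```python
import datetime

class JSONSafeMixin:
    """Mixin: any class that inherits this gains to_dict() for free."""

    def to_dict(self):
        # Convert instance attributes to a plain dict, turning dates
        # into ISO strings because JSON has no date type
        return {
            k: (v.isoformat() if isinstance(v, (datetime.datetime, datetime.date)) else v)
            for k, v in vars(self).items()
        }

class Record(JSONSafeMixin):
    def __init__(self, id, date, country):
        self.id = id
        self.date = date
        self.country = country
```

Any number of other classes can mix in JSONSafeMixin the same way, which is exactly why the repository above defines DictMixIn once instead of repeating to_dict() per model.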

Django ListView returns an HttpResponse, but I need JSON

I am having a problem around the Class Based Views in Django. I need to create JSON responses, instead of rendering to a template.
class AllItem(ListView):
    model = MyModel
    context_object_name = 'items'
    template_name = 'mymodel/index.html'
I also have a serializer class
class SpendingConfigSerializer(serializers.ModelSerializer):
    class Meta:
        model = SpendingConfig
        fields = ('record_date', 'amount', 'name')
Does anyone know how to connect them?
Thanks
Z.
You can make use of a generic view such as ListCreateAPIView and specify your SpendingConfigSerializer as its serializer:
from rest_framework import generics

class UserList(generics.ListCreateAPIView):
    queryset = SpendingConfig.objects.all()
    serializer_class = SpendingConfigSerializer

Django - serializing queryset along with related models

There are hundreds of posts related to an elementary goal.
I have a simple model:
class ModelA(models.Model):
    # I've got only two fields in reality,
    # but let's suppose there are 150 of them
    pass

class ModelB(models.Model):
    # fields
    pass

class ModelC(models.Model):
    field_c = models.IntegerField()
    modelA = models.ForeignKey(ModelA)
    modelB = models.ForeignKey(ModelB)

model_c_instance = ModelC.objects.select_related().get(pk=pk)
All I want to do is to come up with a JSON object which would include the fields for ModelA and ModelB.
Wadofstaff's approach (post) does not work with Django 1.7 and higher.
This post does not shed light on how I should serialize the objects.
This post talks about Full Serializers, but there's no code snippet showing how it is used.
My final JSON object should look like
[{
    "model": "userinfo",
    "fields": {
        "field_c": "9966",
        "modelA": [{
            # modelA fields
        }],
        etc...
    }
}]
Do I need REST framework? Or Full Serializers?
Could anyone please suggest a structured answer to this topic? I haven't been able to come up with a solution in two weeks.
You want to use a nested relationship. You don't show anything about the userinfo model so this answer doesn't take that into account. You could do something similar to this pseudo-code:
# serializers.py
from rest_framework import serializers

class ModelASerializer(serializers.ModelSerializer):
    class Meta:
        model = ModelA

class ModelBSerializer(serializers.ModelSerializer):
    class Meta:
        model = ModelB

class ModelCSerializer(serializers.ModelSerializer):
    # ForeignKeys are to-one relations, so no many=True here
    modelA = ModelASerializer()
    modelB = ModelBSerializer()

    class Meta:
        model = ModelC