I am attempting to use the script found here.
I am connecting to an MS SQL database and attempting to copy it into a MySQL database. When the script gets to this line:
table.metadata.create_all(dengine)
I get the error of:
sqlalchemy.exc.CircularDependencyError
I researched this error and found that it occurs when using autoload=True to create a table. The suggested fix is to avoid autoload=True and to pass the use_alter=True flag when defining the foreign key, but since I'm not defining the tables manually, I can't set that flag.
Any help on how to correct this issue, or on a better way to accomplish what I am trying to do would be greatly appreciated. Thank you.
You can iterate through all constraints and set use_alter on them:
from sqlalchemy.schema import ForeignKeyConstraint

for table in metadata.tables.values():
    for constraint in table.constraints:
        if isinstance(constraint, ForeignKeyConstraint):
            constraint.use_alter = True
Or, similarly, iterate through them and attach each one as an AddConstraint operation, bound to run after the whole metadata is created:
from sqlalchemy import event
from sqlalchemy.schema import AddConstraint

for table in metadata.tables.values():
    for constraint in table.constraints:
        event.listen(
            metadata,
            "after_create",
            AddConstraint(constraint)
        )
see Controlling DDL Sequences
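Putting the first approach together with the reflection workflow from the question, here is a rough sketch; the connection URLs and the engine names sengine/dengine are placeholders, not taken from the original script:

from sqlalchemy import create_engine, MetaData
from sqlalchemy.schema import ForeignKeyConstraint

# Placeholder connection strings for the source (MS SQL) and destination (MySQL)
sengine = create_engine("mssql+pyodbc://...")
dengine = create_engine("mysql+pymysql://...")

# Reflect every table from the source database (equivalent to autoload=True)
metadata = MetaData()
metadata.reflect(bind=sengine)

# Mark every reflected foreign key to be added via ALTER TABLE afterwards,
# which breaks the circular dependency during CREATE TABLE
for table in metadata.tables.values():
    for constraint in table.constraints:
        if isinstance(constraint, ForeignKeyConstraint):
            constraint.use_alter = True

metadata.create_all(dengine)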
I'm working with the SQLAlchemy Expression Language (not the ORM), and I'm trying to figure out how to update a query result.
I've discovered that RowProxy objects don't support assignment, throwing an AttributeError instead:
# Get a row from the table
row = engine.execute(mytable.select().limit(1)).fetchone()
# Check that `foo` exists on the row
assert row.foo is None
# Try to update `foo`
row.foo = "bar"
AttributeError: 'RowProxy' object has no attribute 'foo'
I've found this solution, which makes use of the ORM, but I'm specifically looking to use the Expression Language.
I've also found this solution, which converts the row to a dict and updates the dict, but that seems like a hacky workaround.
So I have a few questions:
Is this in fact the only way to do it?
Moreover, is this the recommended way to do it?
And lastly, the lack of documentation made me wonder: am I just misusing SQLAlchemy by trying to do this?
You are misusing SQLAlchemy. The usage you've described, mutating a result row and having the change persisted, is exactly what the ORM provides. If you only want to restrict yourself to SQLAlchemy Core, then you need to issue an explicit UPDATE:
engine.execute(mytable.update().where(mytable.c.id == <id>).values(foo="bar"))
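For completeness, a minimal sketch of the full Core round trip, using the mytable/id/foo names from the question (some_id is a placeholder value):

from sqlalchemy import select

# Issue an explicit UPDATE instead of mutating the fetched row
engine.execute(
    mytable.update()
           .where(mytable.c.id == some_id)
           .values(foo="bar")
)

# Result rows are read-only snapshots; re-select to see the new value
row = engine.execute(
    select([mytable]).where(mytable.c.id == some_id)
).fetchone()
assert row.foo == "bar"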
I'm trying to load some JSON from a REST API (using Neo4j 3.0.4 & APOC apoc-3.0.4.1-all) that has null values in it. This is throwing up this error:
"Cannot merge node using null property value"
The nulls can be spread across multiple keys and it varies which keys have null values. Hence I'd prefer to avoid specifying which individual keys to handle nulls for if possible.
I found the apoc.map.clean(map,[keys],[values]) procedure but not much info on how to use it. Is this the best procedure to use for every key, or is there a simpler way?
Thanks!
Thanks stdob - I managed to find another post you had written which helped me understand the solution. I needed to substitute the first property for one that is never null:
MERGE (label:Label {key2: json.key2})
ON CREATE SET label.key3 = json.key3, label.key1 = json.key1
I am having trouble sending a SQL statement through a DbContext using context.Database.ExecuteSqlCommand().
I am trying to execute
CREATE TABLE Phones([Id] [uniqueidentifier] NOT NULL PRIMARY KEY,
[Number] [int],[PhoneTypeId] [int])
GO
ALTER TABLE [dbo].[Phones] ADD CONSTRAINT [DF_Phones_Id]
DEFAULT (newid()) FOR [Id]
GO
This fails with the error string
Incorrect syntax near the keyword 'ALTER'.
Incorrect syntax near 'GO'.
However, that exact statement runs without errors in SSMS. Are there any issues I need to resolve regarding the default constraint through the DbContext? I have seen problems with people using constraints and not having IsDbGenerated set to true, but I am not sure how that would apply here.
GO is not a part of SQL, so it can't be executed with ExecuteSqlCommand(). Think of GO as a way to separate batches when using Management Studio or the command-line tools. Instead, just remove the GO statements and you should be fine. If you run into errors because you need to run your commands in separate batches, just call ExecuteSqlCommand() once for each batch you want to run.
I know necroposting is bad manners, but maybe this post will save someone some time. As mentioned in Dave's post, GO is not part of SQL, so we can create a little workaround to make it work:
var text = System.IO.File.ReadAllText("initialization.sql");
var parts = text.Split(new string[] { "GO" }, System.StringSplitOptions.None);
foreach (var part in parts)
{
    context.Database.ExecuteSqlCommand(part);
}
context.SaveChanges();
This way your commands are split into separate batches and executed without problems.
Dave Markle beat me to it. In fact, you can change "GO" to any other string to separate batches.
An alternative implementation here is to use SMO instead of the Entity Framework. There is a useful method there called ExecuteNonQuery that I think will make your life a lot simpler. Here is a good implementation example.
I want to programmatically generate ALTER TABLE statements in SQLAlchemy to add a new column to a table. The column to be added should take its definition from an existing mapped class.
So, given a SQLAlchemy Column instance, can I generate the SQL schema definition(s) I would need for ALTER TABLE ... ADD COLUMN ... and CREATE INDEX ...?
I've played at a Python prompt and been able to see a human-readable description of the data I'm after:
>>> DBChain.__table__.c.rName
Column('rName', String(length=40, convert_unicode=False, assert_unicode=None, unicode_error=None, _warn_on_bytestring=False), table=<Chain>)
When I call engine.create_all() the debug log includes the SQL statements I'm looking to generate:
CREATE TABLE "Chain" (
...
"rName" VARCHAR(40),
...
)
CREATE INDEX "ix_Chain_rName" ON "Chain" ("rName")
I've heard of sqlalchemy-migrate, but that seems to be built around static changes and I'm looking to dynamically generate schema-changes.
(I'm not interested in defending this design, I'm just looking for a dialect-portable way to add a column to an existing table.)
After tracing engine.create_all() with a debugger I've discovered a possible answer:
>>> engine.dialect.ddl_compiler(
...     engine.dialect,
...     DBChain.__table__.c.rName) \
...     .get_column_specification(
...         DBChain.__table__.c.rName)
'"rName" VARCHAR(40)'
The index can be created with:
sColumnElement = DBChain.__table__.c.rName
if sColumnElement.index:
    sIndex = sa.schema.Index(
        "ix_%s_%s" % (rTableName, sColumnElement.name),
        sColumnElement,
        unique=sColumnElement.unique)
    sIndex.create(engine)
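Putting the two pieces together, a rough sketch of emitting the ALTER TABLE itself; the table name "Chain" and the use of engine.execute() follow the older SQLAlchemy API used above, so treat this as illustrative rather than canonical:

column = DBChain.__table__.c.rName

# Compile just the column definition with the dialect's DDL compiler
compiler = engine.dialect.ddl_compiler(engine.dialect, column)
spec = compiler.get_column_specification(column)   # e.g. '"rName" VARCHAR(40)'

# Splice it into a dialect-specific ALTER TABLE statement
engine.execute('ALTER TABLE "Chain" ADD COLUMN %s' % spec)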
I am working to use Django's ContentType framework to create some generic relations for my models; after looking at how the Django developers do it in django.contrib.comments.models, I thought I would imitate their approach/conventions:
(from django.contrib.comments.models, line 21):
content_type = models.ForeignKey(ContentType,
                                 verbose_name='content type',
                                 related_name="content_type_set_for_%(class)s")
object_pk = models.TextField('object ID')
content_object = generic.GenericForeignKey(ct_field="content_type", fk_field="object_pk")
That's taken from their source and, of course, their source works for me (I have comments with object_pk values stored just fine, integers actually); however, I get an error during syncdb on table creation that ends with:
_mysql_exceptions.OperationalError: (1170, "BLOB/TEXT column 'object_pk' used in key specification without a key length")
Any ideas why they can do it and I can't?
After looking around, I noticed that the docs actually state:
Give your model a field that can store a primary-key value from the models you'll be relating to. (For most models, this means an IntegerField or PositiveIntegerField.)
This field must be of the same type as the primary key of the models that will be involved in the generic relation. For example, if you use IntegerField, you won't be able to form a generic relation with a model that uses a CharField as a primary key.
But why can they do it and not me?!
Thanks.
PS: I even tried creating an AbstractBaseModel with these three fields, making it abstract=True and using that (in case that had something to do with it) ... same error.
After I typed out that really long question, I looked at the MySQL error again and realized it was stemming from:
class Meta:
    unique_together = (("content_type", "object_pk"),)
Apparently, I can't have it both ways, which leaves me torn. I'll have to open a new question about whether it is better to leave my object_pk options open (suppose I use a TextField as a primary key?) or to enforce the unique_together constraint...
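For reference, a sketch of the second option, keeping the uniqueness constraint by restricting object_pk to integer primary keys as the docs quoted above suggest; the model name RelatedItem is illustrative and the imports follow the same older-style contenttypes API as the question:

from django.contrib.contenttypes.models import ContentType
from django.contrib.contenttypes import generic
from django.db import models

class RelatedItem(models.Model):
    content_type = models.ForeignKey(ContentType,
                                     verbose_name='content type',
                                     related_name="content_type_set_for_%(class)s")
    # An integer column can be indexed on MySQL, so unique_together works,
    # but generic relations are then limited to models with integer primary keys
    object_pk = models.PositiveIntegerField('object ID')
    content_object = generic.GenericForeignKey(ct_field="content_type",
                                               fk_field="object_pk")

    class Meta:
        unique_together = (("content_type", "object_pk"),)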