How do you update a SQLAlchemy RowProxy?

I'm working with the SQLAlchemy Expression Language (not the ORM), and I'm trying to figure out how to update a query result.
I've discovered that RowProxy objects don't support assignment, throwing an AttributeError instead:
# Get a row from the table
row = engine.execute(mytable.select().limit(1)).fetchone()
# Check that `foo` exists on the row
assert row.foo is None
# Try to update `foo`
row.foo = "bar"
AttributeError: 'RowProxy' object has no attribute 'foo'
I've found this solution, which makes use of the ORM, but I'm specifically looking to use the Expression Language.
I've also found this solution, which converts the row to a dict and updates the dict, but that seems like a hacky workaround.
So I have a few questions:
Is this in fact the only way to do it?
Moreover, is this the recommended way to do it?
And lastly, the lack of documentation made me wonder: am I just misusing SQLAlchemy by trying to do this?

You are misusing SQLAlchemy. The usage you've described is precisely the benefit an ORM provides. If you only want to restrict yourself to SQLAlchemy Core, then you need to do:
engine.execute(mytable.update().where(mytable.c.id == <id>).values(foo="bar"))
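To round this out, here is a minimal sketch of the Core-only pattern, assuming mytable has an integer primary-key column id (as in the update above) and engine is an existing Engine; the names are placeholders:
with engine.begin() as conn:
    # Read the row you want to change
    row = conn.execute(mytable.select().limit(1)).fetchone()

    # Rows are read-only; to change data, emit an UPDATE keyed on the primary key
    conn.execute(
        mytable.update()
        .where(mytable.c.id == row.id)
        .values(foo="bar")
    )

    # Re-select if you need the refreshed values back in Python
    refreshed = conn.execute(
        mytable.select().where(mytable.c.id == row.id)
    ).fetchone()
engine.begin() commits the transaction on exit, which also covers SQLAlchemy 1.4+ connections that no longer autocommit.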

Related

How to make SQLAlchemy see implicit lateral joins as in json_each or jsonb_each?

I'm trying to figure out the proper way of using json_each. I've seen some tricks like using column or text. So far the cleanest way I've found uses table_valued, which works except for the cross-join warning.
term = 'connection'
about_exp = func.json_each(EventHistory.event, '$.about').table_valued('value')
events = s.query(EventHistory).filter(about_exp.c.value == term)
EventHistory contains one json field that looks like this: {"about": ["antenna", "connection", "modem", "network"]}
The resulting query works as expected but I'm getting the following warning:
SAWarning: SELECT statement has a cartesian product between FROM element(s) "event_history" and FROM element "anon_1". Apply join condition(s) between each element to resolve.
For anyone who would like to experiment, here is a working example in the form of unit tests: https://gist.github.com/PiotrCzapla/579f76bdf95a485eaaafed1492d9a70e
So far the only way I've found to avoid the warning is to add join(about_exp, true()):
from sqlalchemy import true
about_exp = func.json_each(EventHistory.event, '$.about').table_valued('value')
events = s.query(EventHistory).join(about_exp, true()).filter(
about_exp.c.value == term
)
But it needs an additional import of true and an extra join() call; if anyone has a better solution, please let me know.
As of SQLAlchemy 1.4.33, you can use the joins_implicitly=True option for table_valued().
term = 'connection'
about_exp = func.json_each(EventHistory.event, '$.about').table_valued('value', joins_implicitly=True)
events = s.query(EventHistory).filter(about_exp.c.value == term)
From the SQLAlchemy documentation for table_valued(): joins_implicitly – when True, the table valued function may be used in the FROM clause without any explicit JOIN to other tables in the SQL query, and no "cartesian product" warning will be generated. May be useful for SQL functions such as func.json_each().
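For a fully self-contained reproduction, here is a sketch assuming SQLAlchemy 1.4.33+ and SQLite (whose built-in json_each stands in for a real backend); the model definition mirrors the one described in the question:
from sqlalchemy import create_engine, func, Column, Integer, JSON
from sqlalchemy.orm import declarative_base, Session

Base = declarative_base()

class EventHistory(Base):
    __tablename__ = "event_history"
    id = Column(Integer, primary_key=True)
    event = Column(JSON)

engine = create_engine("sqlite://")
Base.metadata.create_all(engine)

with Session(engine) as s:
    s.add(EventHistory(event={"about": ["antenna", "connection", "modem", "network"]}))
    s.commit()

    term = 'connection'
    about_exp = func.json_each(
        EventHistory.event, '$.about'
    ).table_valued('value', joins_implicitly=True)

    # No SAWarning about a cartesian product is emitted here
    events = s.query(EventHistory).filter(about_exp.c.value == term).all()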

SQLAlchemy Join using PyBigquery to filter results

Using a SQLAlchemy class, I'm trying to generate a query that resembles
SELECT
DISTINCT(non_unique_key)
FROM
`tablename`,
UNNEST(tasks_dns) AS dns
WHERE
create_date_utc = TIMESTAMP("2020-12-31T23:59:59")
AND dns LIKE "%whatever%"
Since this is an implicit join using unnest(), I don't have a clue how to construct my statement.
Using a combination of .label() and moving the unnest() call around, I've managed to move the unnest clause to either the SELECT or WHERE clauses, but not in the FROM.
For example,
session.query(Table.non_unique_key).filter(func.unnest(Table.dns) != '').filter(Table.create_date == "2021-04-22")
leaves me with
SELECT `tablename`.`non_unique_key` AS `tablename_non_unique_key`
FROM `tablename`
WHERE unnest(`tablename`.`tasks_dns`) IS NOT NULL AND `tablename`.`create_date_utc` = %(create_date_utc_1)s
So far, using join() has just caused exceptions about not having a column to join on. I understand what that means, but I'm not sure how to get around it, since unnest() is basically expanding a nested data type that has no column to join on, which is probably where my ignorance of how to properly use SQLAlchemy's join() method comes in.
Is this just a SQLAlchemy / BigQuery dialect issue at this point? Or am I just a dunce? I know the dialect library is still in its infancy, but even with Postgres I would have thought this would be a fairly common query pattern.
After some additional digging, I've figured it out:
Model().query().select_from(func.unnest(Model.col1).alias("whatever")).filter()....
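To spell that out a bit more, here is a hedged sketch assuming SQLAlchemy 1.4+, an open session, and a mapped class Table with the non_unique_key, tasks_dns and create_date_utc columns from the question; column_valued() is used as an alternative to the alias() call above:
from sqlalchemy import func

# Renders as UNNEST(`tablename`.`tasks_dns`) AS `dns` in the FROM clause,
# while `dns` itself is usable as a scalar column in the WHERE clause
dns = func.unnest(Table.tasks_dns).column_valued("dns")

query = (
    session.query(Table.non_unique_key)
    .distinct()
    # bound as a parameter rather than the literal TIMESTAMP(...) of the hand-written SQL
    .filter(Table.create_date_utc == "2020-12-31T23:59:59")
    .filter(dns.like("%whatever%"))
)
Referencing dns in the filter pulls UNNEST(...) into the FROM clause as a comma join, which is the implicit-join form BigQuery expects; depending on your SQLAlchemy version you may still see the cartesian-product warning from the previous question, which newer releases let you silence with a joins_implicitly flag.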

Neo4j JSON APOC load - skip nulls

I'm trying to load some JSON from a REST API (using Neo4j 3.0.4 & APOC apoc-3.0.4.1-all) that has null values in it. This throws the following error:
"Cannot merge node using null property value"
The nulls can be spread across multiple keys and it varies which keys have null values. Hence I'd prefer to avoid specifying which individual keys to handle nulls for if possible.
I found the apoc.map.clean(map,[keys],[values]) procedure, but not much info on how to use it. Is this the best procedure to use for every key, or is there a simpler way?
Thanks!
Thanks stdob - I managed to find another post you had written which helped me understand the solution. I needed to substitute the first property for one that is never null.
MERGE (label:Label {key2: json.key2})
ON CREATE SET label.key3 = json.key3, label.key1 = json.key1

Same ID occurring multiple times with the SQL IN operator in Rails

I use the following sql statement:
Keyword.where("id IN (#{params[:keyword_ids]})").order("find_in_set(id, '#{params[:keyword_ids]}')")
The problem with this statement is that if keyword_ids holds the same id more than once, the call returns it only once.
But I need both the same count (not fulfilled) and the same order (which this statement does fulfill) as in the array, regardless of whether the same id occurs more than once.
How should I change the statement to fix this?
Thanks, dot
Well, that's not a bug, it's a feature ;)
My first recommendation would be to sanitize your input. Passing params[:keyword_ids] directly to the database, despite the help the Rails framework gives you, is likely to lead to some kind of vulnerability sooner or later.
Secondly, the easiest solution is probably to keep the query as is, convert the results to a map, and then map the input params onto that result.
Something like:
keywords = Keyword.where('id IN (?)', checked_keyword_ids)
keyword_map = Hash[keywords.map { |kw| [kw.id, kw] }]
checked_keyword_ids.map { |id| keyword_map[id] }

Creating an OR statement using existing conditions hash

I am working on a problem where I need to add an OR clause to a set of existing conditions. The current conditions are built in a hash in a method and at the end, they are used in the where clause. Here is a simplified example:
...
conds.merge!({:users => {:archived => false}})
Model.where(conds)
I am trying to add an OR clause to the current set of conditions, so it would be something like '(conditions) OR new_condition'. I'd like to add the OR statement without converting each addition to the conds hash into a string; that would be my last option. I was hoping someone has done something like this before (without using Arel). I seem to recall that in Rails 2 there was a way to parse a conditions hash using a method on the model (something like Model.some_method(conds) producing the where-clause string). Maybe that would be a good option, so I could just add the OR clause onto that string. Any ideas are appreciated. Thank you for your help!
I found a way to do what I needed. Instead of changing all of the conditions that I am building, I am parsing the conditions to SQL using sanitize_sql_for_conditions. This is a private method in ActiveRecord, so I had to put a method on the model to allow me to access it. Here is my model method:
def self.convert_conditions_hash_to_sql(conditions)
self.sanitize_sql_for_conditions(conditions)
end
Once I convert my conditions to text, I can add my OR clause (along with the appropriate parentheses) to the end of the original conditions. It would go something like this:
Model.where("(#{Model.convert_conditions_hash_to_sql(conds)}) OR (model.type = ? AND model.id IN (?))", model_type, model_id_array)