Retrieving a key value from WTForms SelectField using Flask [duplicate] - mysql

This question already has an answer here:
Python sqlite3 parameterized drop table
(1 answer)
Closed 5 years ago.
I am having a problem with WTForms in Flask. I want to create an add_menu function that adds a menu item to the database. The user chooses "Appetizer", "Main Dish", or "Drinks" from a SelectField, and the item should be inserted into the corresponding table in the database (I use MySQL). For some reason, when I use menu_type = form.menu_type.data it gives me the following error:
mysql_exceptions.ProgrammingError: (1064, "You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near ''main_dishes'(name,ingredients,price) VALUES('Salmon', 'duude,frv', '35')' at line 1")
It picks up the right value, but I get these awkward '' quotes in front of the main_dishes string.
My code looks as follows:
class MenuForm(Form):
    menu_type = SelectField('Menu Type', [validators.DataRequired()], choices=[('appetizers', 'Appetizer'), ('main_dishes', 'Main Dish'), ('desserts', 'Dessert'), ('drinks', 'Drinks')], coerce=str)
    name = StringField('Name', [validators.Length(min=1, max=2000)])
    ingredients = TextAreaField('Ingredients', [validators.Length(min=10)])
    price = DecimalField('Price (Manat)', [validators.DataRequired()])

@app.route('/add_menu', methods=['GET', 'POST'])
@is_logged_in
def add_menu():
    form = MenuForm(request.form)
    if request.method == 'POST' and form.validate():
        menu_type = form.menu_type.data  # <--- Here is the problem
        name = form.name.data
        ingredients = form.ingredients.data
        price = form.price.data

        # Create cursor
        cur = mysql.connection.cursor()
        # Execute
        cur.execute("INSERT INTO %s(name,ingredients,price) VALUES(%s, %s, %s)", (menu_type, name, ingredients, price))
        # Commit to DB
        mysql.connection.commit()
        # Close connection
        cur.close()

        flash('Menu is Added', 'success')
        return redirect(url_for('dashboard'))
    return render_template('add_menu.html', form=form)

The driver substitutes every %s parameter as a quoted string literal, so the table name goes into the query as 'main_dishes', and MySQL rejects a quoted string where an identifier is expected. Parameter binding only works for values, not for table or column names, so build the query string with the table name first and then bind the remaining values:
query = "INSERT INTO {}(name,ingredients,price) VALUES(%s, %s, %s)".format(menu_type)
cur.execute(query, (name,ingredients,price))
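Since the table name ends up in the SQL string rather than in a bound parameter, it is worth checking it against the form's own choices before formatting it in. A minimal sketch of that check, assuming the MenuForm above; the ALLOWED_TABLES set and the flask abort() call (from flask import abort) are additions for illustration, not part of the original code:
ALLOWED_TABLES = {'appetizers', 'main_dishes', 'desserts', 'drinks'}  # mirrors the SelectField choices

menu_type = form.menu_type.data
if menu_type not in ALLOWED_TABLES:
    abort(400)  # reject anything that is not one of the known table names

query = "INSERT INTO {}(name, ingredients, price) VALUES (%s, %s, %s)".format(menu_type)
cur.execute(query, (name, ingredients, price))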

Related

Python MySQL Connector inserting, but info is not actually in database

I am making a login and registration project. The login code looks similar to this, except its query is obviously not an INSERT statement. Below is the registration code that registers a user: the front end sends JSON to my back end and I parse it. I remember this was working yesterday. Today I've noticed it's not inserting data into the database, yet I am getting the "true" response that says it inserted. I check the table and nothing is getting inserted.
def auth(n):
    cnx = mysql.connector.connect(user='dbuser', password='dbpass', host='localhost', port='3306', database='dbname')
    cursor = cnx.cursor(buffered=True)
    value_list = list()
    for value in n.values():
        value_list.append(value)
    value_string = str(value_list)
    a = value_string.strip("[")
    b = a.strip("]")
    c = b.replace("'", "")
    d = c.split(', ')
    authquery = ("INSERT INTO members (id, firstname, lastname, email, password, history) VALUES (id, %s, %s, %s, %s);")
    cursor.execute(authquery, d)
    if cursor.rowcount:
        return "true"
    else:
        return "false"
    cursor.close()
    cnx.commit()
    cnx.close()
Does anyone know why it is doing this? I even tried creating a new table and even a new database.
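One hedged observation on the code as posted: the function returns from the if/else before cnx.commit() ever runs, and mysql.connector does not autocommit by default, so the INSERT is discarded when the connection goes away. A minimal sketch of the tail of auth() with the commit moved before the return (everything above cursor.execute stays the same as in the question):
    cursor.execute(authquery, d)
    inserted = bool(cursor.rowcount)
    cnx.commit()        # commit before returning; code placed after a return never runs
    cursor.close()
    cnx.close()
    return "true" if inserted else "false"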

Inserting data into a SQL server from an excel file

First of all, sorry for my lack of knowledge regarding databases; this is my first time working with them.
I am having some issues trying to get the data from an Excel file into a database.
Using answers from the site, I managed to more or less connect to the database by doing this:
import pandas as pd
import pyodbc
server = 'XXXXX'
db = 'XXXXXdb'
# create Connection and Cursor objects
conn = pyodbc.connect('DRIVER={SQL Server};SERVER=' + server + ';DATABASE=' + db + ';Trusted_Connection=yes')
cursor = conn.cursor()
# read data from excel
data = pd.read_excel('data.csv')
But I don't really know what to do now.
I have 3 tables, which are connected by a 'productID'; my Excel file mimics the database, meaning that every column in the Excel file has a place to go in the DB.
My plan was to read the Excel file, make a list from each column, and then insert each column value into the DB, but I have no idea how to write a query that can do this.
Once I get the query I think the data insertion can be done like this:
query = "xxxxxxxxxxxxxx"
for row in data:
#The following is not the real code
productID = productID
name = name
url = url
values = (productID, name, url)
cursor.execute(query,values)
conn.commit()
conn.close
Database looks like this.
https://prnt.sc/n2d2fm
http://prntscr.com/n2d3sh
http://prntscr.com/n2d3yj
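For the row-by-row loop described above, a minimal sketch with pyodbc might look like the following. The table and column names (Products, ProductID, Name, Url) are placeholders, not the real schema from the screenshots, and the DataFrame is assumed to use those same column headers:
import pandas as pd
import pyodbc

server = 'XXXXX'
db = 'XXXXXdb'
conn = pyodbc.connect('DRIVER={SQL Server};SERVER=' + server + ';DATABASE=' + db + ';Trusted_Connection=yes')
cursor = conn.cursor()

data = pd.read_excel('data.xlsx')  # assumed Excel file; use read_csv for a .csv

# pyodbc uses ? placeholders; one execute per DataFrame row
query = "INSERT INTO Products (ProductID, Name, Url) VALUES (?, ?, ?)"
for row in data.itertuples(index=False):
    cursor.execute(query, (row.ProductID, row.Name, row.Url))

conn.commit()
conn.close()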
EDIT:
I tried doing something like this, but I'm getting a 'not all arguments converted during string formatting' TypeError.
import pymysql
import pandas as pd

connStr = pymysql.connect(host='xx.xxx.xx.xx', port=xxxx, user='xxxx', password='xxxxxxxxxxx')
df = pd.read_csv('GenericProducts.csv')
cursor = connStr.cursor()

query = "INSERT INTO [Productos]([ItemID],[Nombre])) values (?,?)"
for index, row in df.iterrows():
    #cursor.execute("INSERT INTO dbo.Productos([ItemID],[Nombre])) values (?,?,?)", row['codigoEspecificoProducto'], row['nombreProducto'])
    codigoEspecificoProducto = row['codigoEspecificoProducto']
    nombreProducto = row['nombreProducto']
    values = (codigoEspecificoProducto, nombreProducto)
    cursor.execute(query, values)

connStr.commit()
cursor.close()
connStr.close()
I think my problem is in how I'm defining the query; surely that's not the right way.
Try this. You seem to have switched libraries from pyodbc to pymysql, which expects %s placeholders instead of ?. MySQL also quotes identifiers with backticks rather than square brackets, and there was an extra closing parenthesis after [Nombre]:
import pymysql
import pandas as pd

connStr = pymysql.connect(host='xx.xxx.xx.xx', port=xxxx, user='xxxx', password='xxxxxxxxxxx')
df = pd.read_csv('GenericProducts.csv')
cursor = connStr.cursor()

# pymysql uses %s placeholders, and MySQL identifiers are quoted with backticks
query = "INSERT INTO `Productos` (`ItemID`, `Nombre`) VALUES (%s, %s)"
for index, row in df.iterrows():
    #cursor.execute(query, (row['codigoEspecificoProducto'], row['nombreProducto']))  # one-line alternative
    codigoEspecificoProducto = row['codigoEspecificoProducto']
    nombreProducto = row['nombreProducto']
    values = (codigoEspecificoProducto, nombreProducto)
    cursor.execute(query, values)

connStr.commit()
cursor.close()
connStr.close()
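As a follow-up design note: instead of one execute() per row, pymysql's executemany() can send all rows in a single call, which is usually faster for bulk loads. A small sketch reusing df, cursor and connStr from the code above:
# Plain tuples of the two columns, in insert order
rows = list(df[['codigoEspecificoProducto', 'nombreProducto']].itertuples(index=False, name=None))
cursor.executemany("INSERT INTO `Productos` (`ItemID`, `Nombre`) VALUES (%s, %s)", rows)
connStr.commit()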

How to insert data into a table containing one single column?

I'm currently learning Python and MySQL and have an issue inserting data when my table has a single column (actually one auto-incremented id and one column).
I tried several syntaxes, "playing" with quotes and parentheses, and several ways to call the execute() method, but nothing worked.
Here is my statement:
import mysql.connector
db_name = "purbeurre"
list_categories = ['Drinks', 'Meat', 'Bread']
cnx = mysql.connector.connect(user='toto', password='toto', host='123.456.0.78')
cursor = cnx.cursor()
cursor.execute("USE {}".format(db_name))
insert_categories = ("INSERT INTO Categories (name) VALUES (%s)")
cursor.executemany(insert_categories, list_categories)
The error is : "ValueError: Could not process parameters"
If I add a column, the statement becomes this one and works fine:
import mysql.connector
db_name = "purbeurre"
list_categories = [('Drinks', 'Liquid products'), ('Meat', 'All kind of meat'), ('Bread', 'Bakery products')]
cnx = mysql.connector.connect(user='toto', password='toto', host='123.456.0.78')
cursor = cnx.cursor()
cursor.execute("USE {}".format(db_name))
insert_categories = ("INSERT INTO Categories (name, description) VALUES (%s, %s)")
cursor.executemany(insert_categories, list_categories)
As you can see, the only difference is the number of columns.
Any idea of what happens?
I received the answer: the list of data was not correctly defined. Here is the correct syntax:
list_categories = [('Drinks',), ('Meat',), ('Bread',)]
Note the comma before the closing parenthesis; it ensures each element in the list is a one-element tuple.
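Putting the corrected list together with the original statement, a minimal end-to-end sketch (same connection details as in the question):
import mysql.connector

cnx = mysql.connector.connect(user='toto', password='toto', host='123.456.0.78')
cursor = cnx.cursor()
cursor.execute("USE purbeurre")

# Each parameter set passed to executemany must itself be a sequence,
# so single values are written as one-element tuples.
list_categories = [('Drinks',), ('Meat',), ('Bread',)]
cursor.executemany("INSERT INTO Categories (name) VALUES (%s)", list_categories)

cnx.commit()
cursor.close()
cnx.close()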

MySQL query in R to compare with certain value [duplicate]

This question already has answers here:
Pass string variable in R script to use it in SQL statement
(4 answers)
Closed 6 years ago.
args <- commandArgs(trailingOnly = TRUE)
id = as.character(args)
mysqlconnection = dbConnect(MySQL(), user = 'root', password = '', dbname = 'manu',host = 'localhost')
sql<-sprintf("select * from net where ips1=%s;",id)
up = dbGetQuery(mysqlconnection, sql)
I am trying to retrieve records from the table net using R.
I want to retrieve the records with a specific id, which is passed as a command line argument. However, I am getting an error near "ips1=%s" saying that the SQL syntax I used is incorrect. Any help?
Please try enclosing the string value you compare against in single quotes ':
sql <- sprintf("select * from net where ips1='%s';",id)

MySQL Dynamic Query Statement in Python with Dictionary

Very similar to this question: MySQL Dynamic Query Statement in Python.
However, what I am looking to do, instead of using two lists, is to use a dictionary.
Let's say I have this dictionary:
instance_insert = {
    # sql column     variable value
    'instance_id': 'instnace.id',
    'customer_id': 'customer.id',
    'os': 'instance.platform',
}
I want to populate a MySQL database with an INSERT statement, using the sql column entries as the column names and the variable names as the variables holding the values to be inserted into the table.
I'm kind of lost because I don't understand exactly what this statement does, but it was pulled from the question I linked, where the author was using two lists to do what he wanted:
sql = "INSERT INTO instance_info_test VALUES (%s);" % ', '.join('?' for _ in instance_insert)
cur.execute (sql, instance_insert)
Also, I would like it to be dynamic in the sense that I can add/remove columns from the dictionary.
Before you post, you might want to try searching for something more specific to your question. For instance, when I Googled "python mysqldb insert dictionary", I found a good answer on the first page, at http://mail.python.org/pipermail/tutor/2010-December/080701.html. Relevant part:
Here's what I came up with when I tried to make a generalized version
of the above:
def add_row(cursor, tablename, rowdict):
    # XXX tablename not sanitized
    # XXX test for allowed keys is case-sensitive
    # filter out keys that are not column names
    cursor.execute("describe %s" % tablename)
    allowed_keys = set(row[0] for row in cursor.fetchall())
    keys = allowed_keys.intersection(rowdict)
    if len(rowdict) > len(keys):
        unknown_keys = set(rowdict) - allowed_keys
        print >> sys.stderr, "skipping keys:", ", ".join(unknown_keys)
    columns = ", ".join(keys)
    values_template = ", ".join(["%s"] * len(keys))
    sql = "insert into %s (%s) values (%s)" % (
        tablename, columns, values_template)
    values = tuple(rowdict[key] for key in keys)
    cursor.execute(sql, values)

filename = ...
tablename = ...
db = MySQLdb.connect(...)
cursor = db.cursor()
with open(filename) as instream:
    row = json.load(instream)
    add_row(cursor, tablename, row)
Peter
If you know your inputs will always be valid (the table name is valid and the columns are present in the table), and you're not importing from a JSON file as the example does, you can simplify this function. But it will accomplish what you want. While it may initially seem like DictCursor would be helpful, DictCursor is useful for returning rows as dictionaries; it can't execute from a dict.
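If the inputs are trusted, the generalized function above can be boiled down to a few lines. A hedged sketch with MySQLdb, using the instance_insert dictionary shape from the question; the connection details and the concrete values are placeholders, not anything from the original post:
import MySQLdb

def insert_dict(cursor, table, rowdict):
    # Build "col1, col2, ..." and "%s, %s, ..." from the dictionary keys;
    # only the values go through parameter binding (table/column names are not sanitized here).
    columns = ", ".join(rowdict.keys())
    placeholders = ", ".join(["%s"] * len(rowdict))
    sql = "INSERT INTO %s (%s) VALUES (%s)" % (table, columns, placeholders)
    cursor.execute(sql, list(rowdict.values()))

# Placeholder values standing in for instnace.id, customer.id and instance.platform
instance_insert = {
    'instance_id': 'i-0123456789',
    'customer_id': 'cust-42',
    'os': 'linux',
}

db = MySQLdb.connect(host="localhost", user="user", passwd="pass", db="test")  # placeholder credentials
cursor = db.cursor()
insert_dict(cursor, "instance_info_test", instance_insert)
db.commit()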