I'm currently learning Python and MySQL and have an issue inserting data when my table has a single column (actually an auto-incremented id plus one column).
I tried several syntaxes, "playing" with quotes and parentheses, and several ways to call the execute() method, but nothing worked.
Here is my statement:
import mysql.connector
db_name = "purbeurre"
list_categories = ['Drinks', 'Meat', 'Bread']
cnx = mysql.connector.connect(user='toto', password='toto', host='123.456.0.78')
cursor = cnx.cursor()
cursor.execute("USE {}".format(db_name))
insert_categories = ("INSERT INTO Categories (name) VALUES (%s)")
cursor.executemany(insert_categories, list_categories)
The error is: "ValueError: Could not process parameters"
If I add a column, the statement becomes the following and works fine:
import mysql.connector
db_name = "purbeurre"
list_categories = [('Drinks', 'Liquid products'), ('Meat', 'All kind of meat'), ('Bread', 'Bakery products')]
cnx = mysql.connector.connect(user='toto', password='toto', host='123.456.0.78')
cursor = cnx.cursor()
cursor.execute("USE {}".format(db_name))
insert_categories = ("INSERT INTO Categories (name, description) VALUES (%s)")
cursor.executemany(insert_categories, list_categories)
As you can see, the only difference is the number of columns.
Any idea of what happens?
I found the answer.
The list of data was not correctly defined; here is the correct syntax:
list_categories = [('Drinks',), ('Meat',), ('Bread',)]
Note the comma before the closing parenthesis: it ensures each element of the list is a tuple.
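Putting it together, a minimal working version of the snippet from the question (same connection details as above; note that mysql.connector does not autocommit, so a commit() is needed for the inserted rows to persist):

import mysql.connector

db_name = "purbeurre"
list_categories = [('Drinks',), ('Meat',), ('Bread',)]  # each element is a one-item tuple

cnx = mysql.connector.connect(user='toto', password='toto', host='123.456.0.78')
cursor = cnx.cursor()
cursor.execute("USE {}".format(db_name))

insert_categories = "INSERT INTO Categories (name) VALUES (%s)"
cursor.executemany(insert_categories, list_categories)
cnx.commit()  # persist the inserts; mysql.connector does not autocommit by default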
I have connected to Spotify's API in Python to extract the top twenty tracks of a searched artist. I am trying to store the data in MySQL Workbench, in a database named 'spotify_api' with a table I created called 'spotify'. Before I added the code to connect to MySQL Workbench, my code worked correctly and extracted the list of tracks, but I have run into issues getting it to connect to my database. Below is the code I have written to both extract the data and store it in my database:
import spotipy
from spotipy.oauth2 import SpotifyClientCredentials
import mysql.connector
mydb = mysql.connector.connect(
    host = "localhost",
    user = "root",
    password = "(removed for question)",
    database = "spotify_api"
)
mycursor = mydb.cursor()
sql = 'DROP TABLE IF EXISTS spotify_api.spotify;'
mycursor.execute(sql)
sp = spotipy.Spotify(auth_manager=SpotifyClientCredentials(client_id="(removed for question)",
                                                           client_secret="(removed for question)"))
results = sp.search(q='sza', limit=20)
for idx, track in enumerate(results['tracks']['items']):
    print(idx, track['name'])
    sql = "INSERT INTO spotify_api.spotify (tracks, items) VALUES (" + \
        str(idx) + ", '" + track['name'] + "');"
    mycursor.execute(sql)
    mydb.commit()
    print(mycursor.rowcount, "record inserted.")
mycursor.execute("SELECT * FROM spotify_api.spotify;")
myresult = mycursor.fetchall()
for x in myresult:
    print(x)
mycursor.close()
Every time I run my code in the VS Code terminal, I receive an error stating that my table doesn't exist. This is what it states:
"mysql.connector.errors.ProgrammingError: 1146 (42S02): Table 'spotify_api.spotify' doesn't exist"
I'm not sure what I need to fix in my code or in my database in order to eliminate this error and get my data stored in my table. The table has two columns, 'tracks' and 'items', but I don't know whether the issue lies in my database or in my code.
Well, it seems pretty clear. You ran
DROP TABLE IF EXISTS spotify_api.spotify;
...
INSERT INTO spotify_api.spotify (tracks, items) VALUES ...
We won't even raise the spectre of the Chuck Berry track titled little ol' Bobby Tables here.
You DROP'd it, then tried to INSERT into it.
That won't work.
You'll need to CREATE TABLE prior to the INSERT.
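A minimal sketch of that order of operations, reusing mydb, mycursor and results from the question (the column types here are assumptions); using a parameterized INSERT also lets the driver handle quoting, which keeps Bobby Tables out of the picture:

# Create the table before inserting into it (column types are an assumption)
mycursor.execute("""
    CREATE TABLE IF NOT EXISTS spotify_api.spotify (
        tracks INT,
        items VARCHAR(255)
    )
""")

for idx, track in enumerate(results['tracks']['items']):
    # Let the driver quote the values instead of concatenating strings
    mycursor.execute(
        "INSERT INTO spotify_api.spotify (tracks, items) VALUES (%s, %s)",
        (idx, track['name'])
    )

mydb.commit()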
I am querying my SQL table using the code below and converting the result to a list. Why does the list have unwanted commas and parentheses?
The query result
[(34830,), (34650,), (35050,), (34500,), (35050,), (34500,), (34725,), (34550,), (34725,), (34760,), (34760,)]
It should just return a list with only numbers in it, right?
The schema is simple (link text, price int);
What is the problem here? Is there something wrong with my code?
import pymysql
connection = pymysql.connect(host='localhost',
                             user='root',
                             password='passme',
                             db='hpsize')  # connection object to pass the database details
sql = "SELECT price FROM dummy WHERE link ='https://www.flipkart.com/bose-noise-cancelling-700-anc-enabled-bluetooth-headset/p/itma57a01d3bd591?pid=ACCFGYZEVVGYM8FP'"
my_cursor = connection.cursor()
my_cursor.execute(sql)
result = list(my_cursor.fetchall())
print(result)
connection.close()
try
connection.row_factory = lambda cursor, row: row[0]
instead of list(my_cursor.fetchall())
then
result = connection.execute("""SELECT * FROM dummy""").fetchall()
or you can also use strip() to cut the unwanted part
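Alternatively, since each row pymysql returns is a one-element tuple, the simplest way to get a flat list of prices is to unpack the tuples yourself. A minimal sketch reusing my_cursor and sql from the question:

my_cursor.execute(sql)
# fetchall() returns rows like (34830,); keep only the first element of each
result = [row[0] for row in my_cursor.fetchall()]
print(result)  # e.g. [34830, 34650, 35050, ...]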
First of all, sorry for my lack of knowledge regarding databases; this is my first time working with them.
I am having some issues trying to get the data from an excel file and putting it into a data base.
Using answers from the site, I managed to kind of connect to the database by doing this.
import pandas as pd
import pyodbc
server = 'XXXXX'
db = 'XXXXXdb'
# create Connection and Cursor objects
conn = pyodbc.connect('DRIVER={SQL Server};SERVER=' + server + ';DATABASE=' + db + ';Trusted_Connection=yes')
cursor = conn.cursor()
# read data from excel
data = pd.read_excel('data.csv')
But I don't really know what to do now.
I have 3 tables, which are connected by a 'productID'; my Excel file mimics the database, meaning that all the columns in the Excel file have a place to go in the DB.
My plan was to read the Excel file and make lists with each column, then insert each column value into the DB, but I have no idea how to create a query that can do this.
Once I get the query I think the data insertion can be done like this:
query = "xxxxxxxxxxxxxx"
for row in data:
    #The following is not the real code
    productID = productID
    name = name
    url = url
    values = (productID, name, url)
    cursor.execute(query,values)

conn.commit()
conn.close()
Database looks like this.
https://prnt.sc/n2d2fm
http://prntscr.com/n2d3sh
http://prntscr.com/n2d3yj
EDIT:
I tried doing something like this, but I'm getting a 'not all arguments converted during string formatting' TypeError.
import pymysql
import pandas as pd
connStr = pymysql.connect(host = 'xx.xxx.xx.xx', port = xxxx, user = 'xxxx', password = 'xxxxxxxxxxx')
df = pd.read_csv('GenericProducts.csv')
cursor = connStr.cursor()
query = "INSERT INTO [Productos]([ItemID],[Nombre])) values (?,?)"
for index,row in df.iterrows():
    #cursor.execute("INSERT INTO dbo.Productos([ItemID],[Nombre])) values (?,?,?)", row['codigoEspecificoProducto'], row['nombreProducto'])
    codigoEspecificoProducto = row['codigoEspecificoProducto']
    nombreProducto = row['nombreProducto']
    values = (codigoEspecificoProducto,nombreProducto)
    cursor.execute(query,values)
connStr.commit()
cursor.close()
connStr.close()
I think my problem is in how I'm defining the query; surely that's not the right way.
Try this. You seem to have changed the library from pyodbc to pymysql, which expects %s instead of ?.
import pymysql
import pandas as pd
connStr = pymysql.connect(host = 'xx.xxx.xx.xx', port = xxxx, user = 'xxxx', password = 'xxxxxxxxxxx')
df = pd.read_csv('GenericProducts.csv')
cursor = connStr.cursor()
query = "INSERT INTO [Productos]([ItemID],[Nombre]) values (%s,%s)"
for index,row in df.iterrows():
#cursor.execute("INSERT INTO dbo.Productos([ItemID],[Nombre]) values (%s,%s)", row['codigoEspecificoProducto'], row['nombreProducto'])
codigoEspecificoProducto = row['codigoEspecificoProducto']
nombreProducto = row['nombreProducto']
values = (codigoEspecificoProducto,nombreProducto)
cursor.execute(query,values)
connStr.commit()
cursor.close()
connStr.close()
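If the whole CSV has to be loaded anyway, another option is to batch the rows into a single executemany() call instead of looping with iterrows(). A sketch under the same assumptions about the CSV column names (host, port and credentials are placeholders):

import pymysql
import pandas as pd

# Placeholder connection details; 3306 is the default MySQL port
connStr = pymysql.connect(host='xx.xxx.xx.xx', port=3306, user='xxxx', password='xxxxxxxxxxx')
df = pd.read_csv('GenericProducts.csv')
cursor = connStr.cursor()

query = "INSERT INTO Productos (ItemID, Nombre) VALUES (%s, %s)"

# Build a list of (ItemID, Nombre) tuples from the two DataFrame columns
rows = list(df[['codigoEspecificoProducto', 'nombreProducto']].itertuples(index=False, name=None))
cursor.executemany(query, rows)

connStr.commit()
cursor.close()
connStr.close()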
I am having a problem with WTForms in Flask. I want to create an add_menu function which adds a menu item to the database. The user can choose "Appetizer", "Main Dish", or "Drinks" from a SelectField, and whenever the user chooses a value from the SelectField it should be added to the corresponding table in the database (I use MySQL). For some reason, when I use menu_type = form.menu_type.data it gives me the following error:
mysql_exceptions.ProgrammingError: (1064, "You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near ''main_dishes'(name,ingredients,price) VALUES('Salmon', 'duude,frv', '35')' at line 1")
It picks up the right value, but I have these awkward '' quotes around the main_dishes string.
My code looks as follows:
class MenuForm(Form):
    menu_type = SelectField('Menu Type', [validators.DataRequired()], choices=[('appetizers','Appetizer'),('main_dishes','Main Dish'),('desserts','Dessert'),('drinks','Drinks')], coerce=str)
    name = StringField('Name', [validators.Length(min=1, max=2000)])
    ingredients = TextAreaField('Ingredients', [validators.Length(min=10)])
    price = DecimalField('Price (Manat)', [validators.DataRequired()])

@app.route('/add_menu', methods=['GET','POST'])
@is_logged_in
def add_menu():
    form = MenuForm(request.form)
    if request.method == 'POST' and form.validate():
        menu_type = form.menu_type.data  # <--- Here is the problem
        name = form.name.data
        ingredients = form.ingredients.data
        price = form.price.data

        # Create cursor
        cur = mysql.connection.cursor()
        # Execute
        cur.execute("INSERT INTO %s(name,ingredients,price) VALUES(%s, %s, %s)", (menu_type,name,ingredients,price))
        # Commit to DB
        mysql.connection.commit()
        # Close connection
        cur.close()

        flash('Menu is Added', 'success')
        return redirect(url_for('dashboard'))
    return render_template('add_menu.html', form=form)
The table name is substituted as a quoted string and the query executed as such.
You may want to build your query with the table name before binding parameterized values.
query = "INSERT INTO {}(name,ingredients,price) VALUES(%s, %s, %s)".format(menu_type)
cur.execute(query, (name,ingredients,price))
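Since the table name cannot be bound as a parameter, it is also worth checking it against the SelectField choices before formatting it into the query, so only known table names ever reach the SQL string. A minimal sketch (the ALLOWED_TABLES set is an assumption based on the form's choices):

ALLOWED_TABLES = {'appetizers', 'main_dishes', 'desserts', 'drinks'}

if menu_type not in ALLOWED_TABLES:
    # Reject anything that isn't one of the SelectField choices
    flash('Unknown menu type', 'danger')
    return redirect(url_for('dashboard'))

query = "INSERT INTO {}(name,ingredients,price) VALUES(%s, %s, %s)".format(menu_type)
cur.execute(query, (name, ingredients, price))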
Very similar to this question MySQL Dynamic Query Statement in Python
However, what I am looking to do, instead of two lists, is to use a dictionary.
Let's say I have this dictionary:
instance_insert = {
    # sql column        variable value
    'instance_id' : 'instance.id',
    'customer_id' : 'customer.id',
    'os' : 'instance.platform',
}
And I want to populate a MySQL database with an INSERT statement, using the SQL column as the column name and the variable name as the variable that holds the value to be inserted into the table.
I'm kind of lost because I don't understand exactly what this statement does, but it was pulled from the question I linked, where the author was using two lists to do what he wanted.
sql = "INSERT INTO instance_info_test VALUES (%s);" % ', '.join('?' for _ in instance_insert)
cur.execute (sql, instance_insert)
Also, I would like it to be dynamic in the sense that I can add/remove columns in the dictionary.
Before you post, you might want to try searching for something more specific to your question. For instance, when I Googled "python mysqldb insert dictionary", I found a good answer on the first page, at http://mail.python.org/pipermail/tutor/2010-December/080701.html. Relevant part:
Here's what I came up with when I tried to make a generalized version
of the above:
def add_row(cursor, tablename, rowdict):
    # XXX tablename not sanitized
    # XXX test for allowed keys is case-sensitive
    # filter out keys that are not column names
    cursor.execute("describe %s" % tablename)
    allowed_keys = set(row[0] for row in cursor.fetchall())
    keys = allowed_keys.intersection(rowdict)
    if len(rowdict) > len(keys):
        unknown_keys = set(rowdict) - allowed_keys
        print >> sys.stderr, "skipping keys:", ", ".join(unknown_keys)
    columns = ", ".join(keys)
    values_template = ", ".join(["%s"] * len(keys))
    sql = "insert into %s (%s) values (%s)" % (
        tablename, columns, values_template)
    values = tuple(rowdict[key] for key in keys)
    cursor.execute(sql, values)

filename = ...
tablename = ...
db = MySQLdb.connect(...)
cursor = db.cursor()
with open(filename) as instream:
    row = json.load(instream)
add_row(cursor, tablename, row)
Peter
If you know your inputs will always be valid (table name is valid, columns are present in the table), and you're not importing from a JSON file as the example is, you can simplify this function. But it'll accomplish what you want to accomplish. While it may initially seem like DictCursor would be helpful, it looks like DictCursor is useful for returning a dictionary of values, but it can't execute from a dict.
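For the dictionary in the question, usage might look something like this (a sketch; the connection details and the concrete values standing in for instance.id, customer.id and instance.platform are placeholders):

import MySQLdb

# Placeholder connection details
db = MySQLdb.connect(host='localhost', user='user', passwd='password', db='mydb')
cursor = db.cursor()

# Placeholder values for instance.id, customer.id and instance.platform
instance_insert = {
    'instance_id': 'i-0123456789abcdef0',
    'customer_id': 42,
    'os': 'linux',
}

add_row(cursor, 'instance_info_test', instance_insert)
db.commit()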