Python rowcount returning -1 - MySQL

Can someone please help? I can't believe I am not getting this right. I am trying to count the number of rows that I have in a MySQL database. The correct number is 7, but every time I execute the following I get an answer of -1. The connection is established successfully. I am using Python 3.4.4.
import mysql.connector

config = {
    'user': 'root',
    'password': '',
    'host': '127.0.0.1',
    'database': 'test'
}
cnx = mysql.connector.MySQLConnection(**config)
if cnx.is_connected():
    print('Connected to MySQL database')
cursor = cnx.cursor()
cursor.execute("SELECT * FROM test")
numrows = int(cursor.rowcount)
print(numrows)

According to the documentation:
For nonbuffered cursors, the row count cannot be known before the rows have been fetched. In this case, the number of rows is -1 immediately after query execution and is incremented as rows are fetched.
Which means that, in order for .rowcount to work in this case, you have to fetch the results first.
cursor.execute("SELECT * FROM test")
rows = cursor.fetchall()
numrows = int(cursor.rowcount)
Of course, you can also get the length of the rows list.
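Since the point is just counting rows, two alternatives are worth noting. The sketch below uses the stdlib sqlite3 module as a stand-in so it runs without a MySQL server; the DB-API calls have the same shape. (With mysql.connector specifically, you can also create the cursor as cnx.cursor(buffered=True), which fetches the result set during execute() and makes rowcount valid immediately.)

```python
import sqlite3

# In-memory SQLite stands in for the MySQL server so the sketch is
# runnable anywhere; the DB-API calls mirror mysql.connector's.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE test (id INTEGER)")
cur.executemany("INSERT INTO test VALUES (?)", [(1,), (2,), (3,)])

# Option 1: fetch the rows, then count them client-side.
cur.execute("SELECT * FROM test")
rows = cur.fetchall()
print(len(rows))  # 3

# Option 2: let the database do the counting, so no rows are transferred.
cur.execute("SELECT COUNT(*) FROM test")
total = cur.fetchone()[0]
print(total)  # 3
conn.close()
```

If all you need is the count, `SELECT COUNT(*)` is the cheapest option, since it never ships the rows to the client.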

Why does FastAPI take upwards of 10 minutes to insert 100,000 rows into a SQL database

I've tried using SQLAlchemy as well as raw mysql.connector here, but committing an insert into a SQL database from FastAPI takes forever.
I wanted to make sure it wasn't just my DB, so I tried it on a local script and it ran in a couple seconds.
How can I work with FastAPI to make this query possible?
Thanks!
@router.post('/')
def postStockData(data: List[pydanticModels.StockPrices], raw_db = Depends(get_raw_db)):
    cursor = raw_db[0]
    cnxn = raw_db[1]
    # i = 0
    # for row in data:
    #     if i % 10 == 0:
    #         print(i)
    #         db.flush()
    #     i += 1
    #     db_pricing = models.StockPricing(**row.dict())
    #     db.add(db_pricing)
    # db.commit()
    SQL = "INSERT INTO " + models.StockPricing.__tablename__ + " VALUES (%s, %s, %s)"
    print(SQL)
    valsToInsert = []
    for row in data:
        rowD = row.dict()
        valsToInsert.append((rowD['date'], rowD['symbol'], rowD['value']))
    cursor.executemany(SQL, valsToInsert)
    cnxn.commit()
    return {'message': 'Pricing Updated'}
You are killing performance because you are taking a "RBAR" (row by agonizing row) approach, which is not suitable for an RDBMS...
You use a loop and execute an SQL INSERT of only one row at a time...
When the RDBMS receives a query, the sequence of execution is the following:
1. authenticate the user who issued the query
2. parse the string to verify the syntax
3. look up metadata (tables, columns, datatypes...)
4. analyze which operations on these tables and columns the user is granted
5. create an execution plan sequencing all the operations needed for the query
6. set up locks for concurrency
7. execute the query (inserting only 1 row)
8. return an error or an OK message
Every step consumes time... and you are paying for all these steps 100,000 times because of your loop.
Usually, when inserting many rows into a table, there should be just one query, even if the INSERT concerns 10,000,000,000 rows from a file!
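The contrast can be sketched with the stdlib sqlite3 module (used here only as a stand-in so the example runs without a server): a single executemany() call carries the whole batch in one statement, instead of paying the per-statement overhead once per row.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE blocking_map (block_key TEXT, donor_id INTEGER)")

rows = [("key%d" % i, i) for i in range(1000)]

# RBAR: one INSERT statement per row -- each statement is parsed,
# planned and executed separately:
# for row in rows:
#     cur.execute("INSERT INTO blocking_map VALUES (?, ?)", row)

# Set-based: one call carries the whole batch.
cur.executemany("INSERT INTO blocking_map VALUES (?, ?)", rows)
conn.commit()

cur.execute("SELECT COUNT(*) FROM blocking_map")
inserted = cur.fetchone()[0]
print(inserted)  # 1000
conn.close()
```

With a real MySQL server the same idea applies: batch the rows into one executemany() (or one multi-row INSERT, or LOAD DATA for files) rather than committing row by row.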

Comparing user input to usernames in database

I am having an issue comparing user input to a database of already used usernames. The database is working exactly how it is supposed to, but what seems like a simple task is proving to be more difficult than I thought. I figure I am missing something really simple! I get no errors from the code, but it does not print "username is taken" when the username is in fact in the database.
Thus far I have tried a for loop to compare the user's input to the database, and I have tried making a list of the usernames in the database and iterating through the list to compare the user input:
### check to see if the user is already in the database
import mysql.connector

# Database entry
mydb = mysql.connector.connect(
    host='Localhost',
    port='3306',
    user='root',
    passwd='passwd',  # changed for help
    database='pylogin'
)
searchdb = 'SELECT username FROM userpass'
mycursor = mydb.cursor()
mycursor.execute(searchdb)
username = mycursor.fetchall()
print(username)
user = input('username')
for i in username:
    if user == username:
        print('username is taken')
    else:
        print("did not work")
Output that does not work:
[('Nate',), ('test',), ('test1',), ('test2',), ('n',), ('test4',)] username: n
('Nate',)
('test',)
('test1',)
('test2',)
('n',)
('test4',)
I expect the above code to iterate through each database entry and compare it to the user's input to verify that the username is not already taken. It should print "username is taken" instead it prints "did not work".
Welcome to Stack Overflow, Nate!
You can use:
mycursor.execute("SELECT * FROM userpass WHERE username = %s", (user,))
results = mycursor.fetchall()
This stores, in a variable called 'results', all (*) of the column values for any record in the userpass table whose username equals the value of the user variable.
You can then use an if statement:
if results:
which runs if results holds any rows (that is, if there is a record in the table whose username matches the user variable), AKA if the username is taken.
When it runs, this if statement can then print 'username is taken'.
The full code:
import mysql.connector

# Database entry
mydb = mysql.connector.connect(
    host='Localhost',
    port='3306',
    user='root',
    passwd='passwd',  # changed for help
    database='pylogin'
)
mycursor = mydb.cursor()
user = input('username')
mycursor.execute("SELECT * FROM userpass WHERE username = %s", (user,))
results = mycursor.fetchall()
if results:
    print('username is taken')
else:
    print("did not work")
Edit: when adding this code to the rest of the program and testing, I found it reports "username is taken" every time, even for a new username. If you have any recommendations to fix this, I am open to them. I am going to continue work on this and will post results when the program is working properly.
I would like to thank you guys for your help; I did find the answer I was looking for!
### check to see if user is already in database
import mysql.connector

# Database entry
mydb = mysql.connector.connect(
    host='Localhost',
    port='3306',
    user='root',
    passwd='passwd',  # changed for help
    database='pylogin'
)
mycursor = mydb.cursor()
user = input('username')
mycursor.execute('SELECT * FROM userpass WHERE username = %s', (user,))
results = mycursor.fetchall()
if results:
    print('Username is taken')
else:
    print('did not work')
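For reference, the whole check can be pushed into the database with a parameterized query instead of looping over fetched tuples in Python. This sketch uses the stdlib sqlite3 module as a stand-in so it runs without a MySQL server (with mysql.connector the placeholder is %s rather than ?):

```python
import sqlite3

# sqlite3 stands in for MySQL here so the sketch runs standalone.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE userpass (username TEXT)")
cur.executemany("INSERT INTO userpass VALUES (?)",
                [("Nate",), ("test",), ("n",)])

def username_taken(cur, user):
    # Let the database do the comparison; any matching row means taken.
    cur.execute("SELECT 1 FROM userpass WHERE username = ?", (user,))
    return cur.fetchone() is not None

taken = username_taken(cur, "n")          # 'n' is in the table
available = username_taken(cur, "fresh")  # 'fresh' is not
print(taken, available)
conn.close()
```

Parameterized queries also avoid the SQL injection risk that comes with building the WHERE clause from raw user input.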

Fetch rows from Mysql and display it in html using Django

I am fairly new to Django. I want to know how to fetch rows from mysql and get it in views.py and send it to html where it will be displayed.
My views.py:
def fetchDate1(request):
    query = request.session.get('query')
    date1 = request.session.get('date1')
    db = pymysql.connect(host="localhost",  # your host
                         user="root",       # username
                         passwd="=",        # password
                         db="Golden")       # name of the database
    # Create a Cursor object to execute queries.
    cur = db.cursor()
    # Select data from table using SQL query.
    stmt = "SELECT * FROM golden_response WHERE query LIKE '%s' AND DATE(updated_at) = '%s' " % (
        query.replace("'", r"\'"), date1)
    cur.execute(stmt)
    if cur.rowcount is None:
        return None
    else:
        rows = cur.fetchall()
        row_headers = [x[0] for x in cur.description]  # this will extract row headers
        json_data = []
        for result in rows:
            json_data.append(dict(zip(row_headers, result)))
        return json.dumps(json_data)
I don't know where I am going wrong. I have also saved the required configuration in settings.py.
However, when I try to run my program:
ProgrammingError: (1146, "Table 'Golden.django_session' doesn't exist")
Please help!
I would hypothesize that you have not run the initial migrations:
python manage.py makemigrations
python manage.py migrate
Moreover, you should check the database connection parameters in settings.py like this:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.mysql',
        'NAME': 'db_name',
        'USER': 'db_user',
        'PASSWORD': 'db_user_password',
        'HOST': 'db_server',
        'PORT': '3306',
    }
}
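Separately from the migration issue, the rows-to-JSON pattern in the view works with any DB-API cursor, since .description carries the column names. A minimal sketch, using the stdlib sqlite3 module as a stand-in so it runs without MySQL or a configured Django project:

```python
import json
import sqlite3

# Any DB-API cursor exposes .description after a SELECT; sqlite3 is
# used here only so the sketch runs standalone.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE golden_response (query TEXT, updated_at TEXT)")
cur.execute("INSERT INTO golden_response VALUES ('hello', '2019-01-01')")

cur.execute("SELECT * FROM golden_response")
rows = cur.fetchall()
row_headers = [col[0] for col in cur.description]  # column names
json_data = [dict(zip(row_headers, row)) for row in rows]
payload = json.dumps(json_data)
print(payload)
conn.close()
```

In a real Django view you would typically return this via JsonResponse (or render it into a template context) rather than returning a raw JSON string.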

Values are not inserted into MySQL table using pool.apply_async in python2.7

I am trying to run the following code to populate a table in parallel for a certain application. First, the following function is defined; it is supposed to connect to my db and execute the SQL command with the given values (to insert into the table).
def dbWriter(sql, rows):
    # load cnf file
    MYSQL_CNF = os.path.abspath('.') + '/mysql.cnf'
    conn = MySQLdb.connect(db='dedupe',
                           charset='utf8',
                           read_default_file=MYSQL_CNF)
    cursor = conn.cursor()
    cursor.executemany(sql, rows)
    conn.commit()
    cursor.close()
    conn.close()
And then there is this piece:
pool = dedupe.backport.Pool(processes=2)
done = False
while not done:
    chunks = (list(itertools.islice(b_data, step)) for step in
              [step_size] * 100)
    results = []
    for chunk in chunks:
        print len(chunk)
        results.append(pool.apply_async(dbWriter,
                                        ("INSERT INTO blocking_map VALUES (%s, %s)",
                                         chunk)))
    for r in results:
        r.wait()
    if len(chunk) < step_size:
        done = True
pool.close()
Everything runs and there are no errors, but at the end my table is empty, meaning the insertions were somehow not successful. I have tried many things to fix this (including adding column names to the insert) after many Google searches and have not been successful. Any suggestions would be appreciated. (Running the code in Python 2.7 on gcloud (Ubuntu); note that indents may be a bit messed up after pasting here.)
Please also note that "chunk" follows exactly the required data format.
Note: this is part of this example.
Please note that the only thing I am changing in the linked example is that I am separating the steps for creating the tables and inserting into them, since I am running my code on the gcloud platform and it enforces GTID standards.
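One debugging pitfall worth noting, because it fits the "no errors but an empty table" symptom exactly: r.wait() only blocks until the task finishes, it never re-raises an exception that occurred inside the worker; r.get() does. A minimal sketch (failing_writer is a made-up stand-in for dbWriter, and the thread-backed multiprocessing.dummy.Pool is used so the example is self-contained; its API is the same as the process pool's):

```python
from multiprocessing.dummy import Pool  # thread-backed Pool, same API

def failing_writer(sql, rows):
    # Stand-in for dbWriter; simulates the worker raising
    # (e.g. a MySQL error) instead of inserting.
    raise ValueError("insert failed")

pool = Pool(processes=2)
result = pool.apply_async(failing_writer,
                          ("INSERT INTO blocking_map VALUES (%s, %s)",
                           [(1, 2)]))
result.wait()  # returns quietly even though the worker raised
caught = None
try:
    result.get()  # re-raises the worker's exception in the caller
except ValueError as exc:
    caught = exc
pool.close()
pool.join()
print(caught)
```

Switching the r.wait() loop to r.get() is a cheap way to find out whether the workers are actually failing silently.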
The solution was changing the dbWriter function to:
conn = MySQLdb.connect(host=...,    # host ip
                       user=...,    # username
                       passwd=...,  # password
                       db='dedupe')
cursor = conn.cursor()
cursor.executemany(sql, rows)
cursor.close()
conn.commit()
conn.close()

How to query multiple times and close the connection at the end?

I'd like to open a connection to mysql database and retrieve data with different queries. Do I need to close the connection every time I fetch the data or is there a better way to query multiple times and close the connection only at the end?
Currently I do this:
db = dbConnect(MySQL(), user='root', password='1234', dbname='my_db', host='localhost')
query1=dbSendQuery(db, "select * from table1")
data1 = fetch(query1, n=10000)
query2=dbSendQuery(db, "select * from table2") ##ERROR !
and I get the error message:
Error in mysqlExecStatement(conn, statement, ...) :
RS-DBI driver: (connection with pending rows, close resultSet before continuing)
Now, if I clear the result with dbClearResult(query1), I need to redo the connection (dbConnect...).
Is there a better/more efficient way to fetch everything instead of opening/closing the connection every time?
Try dbGetQuery(...) instead of using dbSendQuery(...) and fetch(), like this:
db = dbConnect(MySQL(), user='root', password='1234', dbname='my_db', host='localhost')
query1 = dbGetQuery(db, "select * from table1")
query2 = dbGetQuery(db, "select * from table2")
From the help page:
The function ‘dbGetQuery’ does all these in one operation (submits the statement, fetches all output records, and clears the result set).
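The same one-connection, many-queries pattern holds in Python's DB-API as well, provided each result set is fully fetched before the next query runs. A sketch with the stdlib sqlite3 module (table1/table2 mirror the R example):

```python
import sqlite3

# One connection serves several queries; close it once at the end.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE table1 (x INTEGER)")
cur.execute("CREATE TABLE table2 (y INTEGER)")
cur.execute("INSERT INTO table1 VALUES (1)")
cur.execute("INSERT INTO table2 VALUES (2)")

# Each query is fully fetched before the next one runs, so there is
# no pending result set left on the connection.
cur.execute("SELECT * FROM table1")
data1 = cur.fetchall()
cur.execute("SELECT * FROM table2")
data2 = cur.fetchall()

conn.close()  # close only once, at the end
print(data1, data2)
```

fetchall() plays the role that dbGetQuery plays in R: it drains the result set, leaving the connection free for the next statement.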