I'm working on the front end of a webapp, and my co-developer is using Pyramid and SQLAlchemy. We've just moved from SQLite to MySQL. I installed MySQL 5.6.15 (via Homebrew) on my OS X machine to get the Python MySQLdb install to work (via pip in a virtualenv).
Because secure_auth is now ON by default in MySQL >= 5.6.5, I can only connect to the remote database (which is pre 5.6.5) with the --skip-secure-auth flag, which works fine in a terminal.
However, in the Python Pyramid code, it only seems possible to add this flag as an argument to create_engine(), but I can't find create_engine() in my co-dev's code, only the connection string below in an initialisation config file. He's not available, this isn't my area of expertise, and we launch next week :(
sqlalchemy.url = mysql+mysqldb://gooddeeds:deeds808letme1now@146.227.24.38/gooddeeds_development?charset=utf8
I've tried appending various "secure auth" strings to the above with no success. Am I looking in the wrong place? Has MySQLdb set secure_auth to ON because I'm running MySQL 5.6.15? If so, how can I change that?
If you are forced to use the old passwords (bah!) when using MySQL 5.6, and you are using MySQLdb with SQLAlchemy, you'll have to add --skip-secure-auth to an option file and use URL:
from sqlalchemy import create_engine
from sqlalchemy.engine.url import URL

dialect_options = {
    'read_default_file': '/path/to/your/mysql.cnf',
}
engine = create_engine(URL(
    'mysql',
    username='..', password='..',
    host='..', database='..',
    query=dialect_options
))
The mysql.cnf would contain:
[client]
skip-secure-auth
For Pyramid, you can do the following. Add a line in your configuration ini-file that holds the connection arguments:
sqlalchemy.url = mysql://scott:tiger@localhost/test
sqlalchemy.connect_args = { 'read_default_file': '/path/to/foo' }
Now you need to change slightly the way the settings are read and used. In the file that launches your Pyramid app, do the following:
def main(global_config, **settings):
    try:
        settings['sqlalchemy.connect_args'] = eval(settings['sqlalchemy.connect_args'])
    except KeyError:
        settings['sqlalchemy.connect_args'] = {}

    engine = engine_from_config(settings, 'sqlalchemy.')
    # rest of code..
The trick is to evaluate the string from the ini file, which contains a dictionary with the extra options for the SQLAlchemy dialect.
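If you would rather not call eval() on a value coming from the ini file, ast.literal_eval from the standard library parses a plain dictionary literal without executing arbitrary code. A minimal sketch of the same main() with that one swap (everything else unchanged from the snippet above):

import ast

from sqlalchemy import engine_from_config

def main(global_config, **settings):
    try:
        # literal_eval only accepts Python literals, so anything other than a
        # plain dict in the ini file raises instead of being executed.
        settings['sqlalchemy.connect_args'] = ast.literal_eval(
            settings['sqlalchemy.connect_args'])
    except KeyError:
        settings['sqlalchemy.connect_args'] = {}
    engine = engine_from_config(settings, 'sqlalchemy.')
    # rest of code..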
I have been trying to connect to the MySQL database that I am trying to create. I recently downloaded MySQL, MySQL Workbench, Connector/ODBC, and the ODBC Manager, but I can't find a way to fix the connection error.
Do I need to download anything else? I can't find a solution on the internet or YouTube for Mac.
packages_required = c("quantmod", "RSQLite", "data.table", "lubridate", "pbapply", "DBI", "odbc")
install.packages(packages_required)
library("quantmod")
library("RSQLite")
library("data.table")
library("lubridate")
library("pbapply")
library("odbc")
PASS <- new.env()
assign("pwd","My Password",envir=PASS)
library("DBI")
con <- dbConnect(odbc(), Driver = "/usr/local/mysql-connector-odbc-8.0.28-macos11-x86-64bit/lib/libmyodbc8w.so",
                 Server = "localhost", Database = "data", UID = "root", PWD = PASS$pwd,
                 Port = 3306)
-----------------------------------------------------------------------------------------
> con <- dbConnect(odbc(), Driver = "/usr/local/mysql-connector-odbc-8.0.28-macos11-x86-64bit/lib/libmyodbc8w.so",
+ Server = "localhost", Database = "data", UID = "root", PWD = PASS$pwd,
+ Port = 3306)
Error: nanodbc/nanodbc.cpp:1021: 00000: [
>
Thank you
Presuming you're on Windows, try creating an ODBC connection using the most recent driver. The ODBC Data Sources tool should already be installed; you just need to open it and create a new connection.
Press the Windows key (or click the search spyglass) and type in "ODBC." The "ODBC Data Sources (64-bit)" tool should come up.
How to Create an ODBC Connection in Windows
1. Open the "ODBC Data Sources (64-bit)" application.
2. Click "Add".
3. Choose "MySQL ODBC 8.0 Unicode Driver" (or whatever the newest version you have is). If you don't have it, you can download it here: https://dev.mysql.com/downloads/connector/odbc/
4. Enter the following information: data source name (the example code below uses "my_odbc_connection"), TCP/IP Server, Port, User and Password.
5. Click "Details" to expand the box.
6. In the "Connection" tab, you may need to check the "Enable Cleartext Authentication" box. This could depend on your system configuration.
7. Click "Test" to test the connection. If everything went right you should get a "Connection Successful" message. If you aren't able to get a successful connection, make sure that you have access and that your connection information doesn't have any typos.
After making a successful connection, perform these 2 additional steps (the drop-downs won't populate until you connect successfully):
8. Click the "Database" drop-down to choose the default database that you'll be writing data to. If you will be writing to more than one database, you may need to create a separate connection for each database, specifying the default database differently for each one.
9. Click the "Character Set" drop-down and choose utf8.
You should now be able to use the "DBI" and "odbc" packages to read, write, etc. any data directly from R. Specific settings listed above may or may not apply depending on your situation.
See the example code below.
Further reading: https://www.r-bloggers.com/setting-up-an-odbc-connection-with-ms-sql-server-on-windows/
#~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
#~~ Load or install packages
#~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
# Load or install librarian
if(require(librarian) == FALSE){
  install.packages("librarian")
  if(require(librarian) == FALSE){stop("Unable to install and load librarian")}
}
# Load multiple packages using the librarian package
librarian::shelf(tidyverse, readxl, DBI, lubridate, odbc, quiet = TRUE)
#~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
#~~ Read
#~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
# Connect to a pre-defined ODBC connection named "my_odbc_connection"
conn <- DBI::dbConnect(odbc::odbc(), "my_odbc_connection")
# Create a query
query <- "
SELECT *
FROM YOUR_SCHEMA.YOUR_TABLE;
"
# Run the query
df_data <- DBI::dbGetQuery(conn,query)
# Close the open connection
try(DBI::dbDisconnect(conn), silent = TRUE)
#~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
#~~ Write
#~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
# Define the connection and database you'll be writing to
conn <- DBI::dbConnect(odbc::odbc(), "my_odbc_connection", db ="YOUR_DEFAULT_DB")
# Define variable types for your data frame. As a general rule, it's a good idea to define your data types rather than let the package guess.
field_types <- c("INTEGER","VARCHAR(20)","DATE","DATETIME","VARCHAR(20)","VARCHAR(50)","VARCHAR(20)")
names(field_types) <- names(df_data)
# Record start time
start_time <- Sys.time()
# Example write statement
DBI::dbWriteTable(conn,"YOUR_TABLE_NAME",YOUR_DATA_FRAME,overwrite=TRUE,field.types=field_types, row.names = FALSE)
# Print time difference
print("Writing complete.")
print(Sys.time() - start_time)
# Close the open connection
try(DBI::dbDisconnect(conn), silent = TRUE)
So I am trying to use my (continuously updating) MySQL database for some visualizations that I want to put into my Streamlit app. In other words, I want to use the data from a MySQL database in my Streamlit application.
For this purpose I consulted the official streamlit documentation here.
The problem here is that the tutorial tells me to create a file like this: .streamlit/secrets.toml and fill it with the following information (copy-pasting the syntax):
[
mysql
]
host = "localhost"
port = 3306
database = "xxx"
user = "xxx"
password = "xxx"
Everything was going well up until now, but when I paste my secrets.toml info into the SECRETS MANAGEMENT widget (which is prompted when I am creating a new app on Streamlit Cloud), it gives me a syntax error.
Invalid format: please enter valid TOML.
Up until this point I was going by the book (the tutorial). To get around this, I tried using only the variable definitions, like the following (since I am not familiar with the TOML syntax):
db_user = "root"
db_name = "dbname"
db_password = "123abc"
Am I doing this right? Or am I missing something obvious?
With all of that aside, I also need to know how to install dependencies on Streamlit Cloud for my app. For example, I need the mysql-connector-python module, but I don't see any console with which I can do that.
NOTE:
This is my first time deploying an app on the cloud
[
mysql
]
It should be [mysql] on one line.
In your GitHub repo, add a requirements.txt file with your dependencies.
Streamlit Cloud will install those packages for your app.
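In other words, once the table header sits on a single line the file parses. As a rough sketch of how the app might then read that section through Streamlit's st.secrets (assuming mysql-connector-python is listed in requirements.txt):

import mysql.connector
import streamlit as st

# st.secrets exposes the [mysql] table from secrets.toml as a dict-like object.
conn = mysql.connector.connect(
    host=st.secrets["mysql"]["host"],
    port=st.secrets["mysql"]["port"],
    database=st.secrets["mysql"]["database"],
    user=st.secrets["mysql"]["user"],
    password=st.secrets["mysql"]["password"],
)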
I want to point out another way we can use a database within a Streamlit app, rather than the conventional method.
We can refer to this Medium.com article here.
It explains a way in which we can use the pandas library to load a database, and it also updates in real time. With this knowledge, connecting to a database becomes a "Python" problem, not a "Streamlit" problem.
Assuming we are using MySQL
We can, according to the official tutorial for a MySQL database, create a .streamlit/secrets.toml file in which we will store our information (related to our database) as below:
# .streamlit/secrets.toml
[mysql]
host = "localhost"
port = 3306
database = "xxx"
user = "xxx"
password = "xxx"
Also install mysql-connector-python and import it in your application file. You will also need pandas and toml, of course:
pip install mysql-connector-python pandas toml
Here is what each of them does:
| Library | Its use |
| ---------------------- | -------------------------------------------------------- |
| mysql-connector-python | to connect to our database |
| pandas | to read and convert our database table into a DataFrame |
| toml | to read details from the secrets.toml file |
STEP 1
We read details from secrets.toml
import toml

# Reading data from secrets.toml
toml_data = toml.load("secrets.toml")
# Saving each credential into a variable
HOST_NAME = toml_data['mysql']['host']
DATABASE = toml_data['mysql']['database']
PASSWORD = toml_data['mysql']['password']
USER = toml_data['mysql']['user']
PORT = toml_data['mysql']['port']
STEP 2
Connecting to our Database:
import mysql.connector as connection

# Using the variables we read from secrets.toml
mydb = connection.connect(host=HOST_NAME, database=DATABASE, user=USER, passwd=PASSWORD, use_pure=True)
STEP 3
Making queries from our database:
import pandas as pd

query = pd.read_sql('SELECT * FROM mytable;', mydb)
The query variable now holds a DataFrame that can be displayed in Streamlit or in a Jupyter notebook.
Likewise, we can run any MySQL query (standard syntax applies) we want against our database.
This information is based on my own experience.
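Putting the three steps together, here is a minimal sketch of what the application file could look like; the table name mytable is just a placeholder, and st.dataframe is only one of several ways Streamlit can render a DataFrame:

import mysql.connector as connection
import pandas as pd
import streamlit as st
import toml

# STEP 1: read the credentials from secrets.toml
creds = toml.load("secrets.toml")["mysql"]

# STEP 2: connect to the database
mydb = connection.connect(host=creds["host"], database=creds["database"],
                          user=creds["user"], passwd=creds["password"],
                          use_pure=True)

# STEP 3: query into a DataFrame and render it in the app
df = pd.read_sql("SELECT * FROM mytable;", mydb)
st.dataframe(df)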
To prepare my upgrade from MySQL 5.7 to MySQL 8, I want to run the upgrade checker utility. Here's what I did so far:
installed mysqlsh on my machine
started mysqlsh
executed util.checkForServerUpgrade targeting the server that I want to upgrade
Here's the exact command that I used in step 3:
util.checkForServerUpgrade('root@my-remote-host:3306', { "password":"my-password" })
This runs fine but some checks are not executed because I don't provide the configPath parameter. For example, here's a warning that I get:
14) Removed system variables for error logging to the system log configuration
To run this check requires full path to MySQL server configuration file to be specified at 'configPath' key of options dictionary
More information:
https://dev.mysql.com/doc/relnotes/mysql/8.0/en/news-8-0-13.html#mysqld-8-0-13-logging
Does anybody know the value that I should provide for the configPath parameter?
I tried to do the same thing using the util.checkForServerUpgrade command with configPath defined, without success. I then tried to run the same check directly from outside the mysqlsh shell, with success:
mysqlsh -- util check-for-server-upgrade root@localhost --target-version=8.0.13 --output-format=JSON --config-path=/etc/mysql/my.cnf
and it worked. Note that when I tried to run, from mysqlsh in the root@localhost session, the command:
util.checkForServerUpgrade({"configPath":"/etc/mysql/my.cnf"})
mysqlsh replied with:
"Util.checkForServerUpgrade: Argument #1: Invalid values in connection options: configPath (ArgumentError)"
Try including the connection string as well, for example:
util.checkForServerUpgrade('root@localhost', {'configPath': '/etc/my.cnf'})
This worked for me, but without the connection string it doesn't.
I have been using Heroku for a while to host my Discord bot. It has been connecting to a MySQL database hosted on ClearDB successfully. However, very recently, whenever I use the bot and it tries to connect to the database, it throws this error:
2026 (HY000): SSL connection error: error:1425F102:SSL routines:ssl_choose_client_version:unsupported protocol
It has been working completely fine until now, and I haven't changed anything. For background, all I did was delete a pipeline and make my app a standalone app without any pipeline. Just in case this helps.
Is this because Heroku has been updated? How can I fix my bot? Let me know if you need any more information.
Any help is appreciated, and Thank You in advance!
EDIT:
Database connection code:
import mysql.connector
def create_conn():
    conn = None
    try:
        conn = mysql.connector.connect(host="HOST",
                                       database="DB",
                                       user="USER",
                                       password="PWD")
    except Exception as e:
        print(e)
    return conn

def execute_query(query, params, fetchall=True):
    conn = create_conn()
    if conn:
        cursor = conn.cursor()
        cursor.execute(query % params)
        try:
            if fetchall:
                results = cursor.fetchall()
            else:
                results = cursor.fetchone()
        except:
            results = None
        conn.commit()
        cursor.close()
        conn.close()
        return results
    else:
        return False
The database connection used to work, and still works when I run it on my testing machine, a Raspberry Pi.
EDIT 2:
requirements.txt:
aiohttp==3.6.3
async-timeout==3.0.1
attrs==20.3.0
CacheControl==0.12.6
cachetools==4.2.0
certifi==2020.12.5
cffi==1.14.4
chardet==3.0.4
click==7.1.2
cryptography==3.3.1
cssselect==1.1.0
cssutils==1.0.2
discord==1.0.1
discord-pretty-help==1.2.0
discord.py==1.6.0
emoji==0.6.0
Flask==1.1.2
google-api-core==1.24.1
google-api-python-client==1.12.8
google-auth==1.24.0
google-auth-httplib2==0.0.4
google-cloud-core==1.5.0
google-cloud-firestore==2.0.2
google-cloud-storage==1.35.0
google-crc32c==1.1.0
google-resumable-media==1.2.0
googleapis-common-protos==1.52.0
grpcio==1.34.0
gunicorn==20.0.4
httplib2==0.18.1
idna==2.8
importlib-metadata==3.3.0
itsdangerous==1.1.0
jeepney==0.6.0
Jinja2==2.11.2
keyring==21.8.0
lxml==4.6.2
MarkupSafe==1.1.1
msgpack==1.0.2
multidict==4.7.6
mysql-connector-python==8.0.22
numpy==1.19.4
pandas==1.1.5
premailer==3.7.0
proto-plus==1.13.0
protobuf==3.14.0
pyasn1==0.4.8
pyasn1-modules==0.2.8
pycparser==2.20
python-dateutil==2.8.1
python-dotenv==0.15.0
pytz==2020.4
requests==2.25.1
rsa==4.7
schedule==0.6.0
SecretStorage==3.3.0
six==1.15.0
typing-extensions==3.7.4.3
uritemplate==3.0.1
urllib3==1.26.2
Werkzeug==1.0.1
yagmail==0.14.245
yarl==1.5.1
zipp==3.4.0
Just in case, you can turn off SSL with:
conn = mysql.connector.connect(host="HOST",
                               database="DB",
                               user="USER",
                               password="PWD",
                               ssl_disabled=True)
I'm not quite sure about this, but I'm pretty sure you have to disable SSL for it to work. Hope this helps.
Clearly, you need to enforce an SSL connection between your app and MySQL.
If you are using a Ruby stack, then follow the steps below and your SSL error problem should be solved.
Download the CA, Client, and Private Key files from your ClearDB dashboard and place them in the root of the application’s filesystem.
Make sure you have OpenSSL installed, which you can find here for Unix/Linux/OS X and here for Windows.
Due to the MySQL client library configuration used on Heroku, you will need to strip the password from the private key file, which can be done like this:
$ openssl rsa -in cleardb_id-key.pem -out cleardb_id-key-no-password.pem
You can now delete the cleardb_id-key.pem and rename cleardb_id-key-no-password.pem to cleardb_id-key.pem, which you will use with your app.
Set the DATABASE_URL config variable with the value of your modified CLEARDB_DATABASE_URL, like this:
$ heroku config:add DATABASE_URL="mysql2://abc1223:dfk243@us-cdbr-east.cleardb.com/my_heroku_db?sslca=cleardb-ca-cert.pem&sslcert=cleardb_id-cert.pem&sslkey=cleardb_id-key.pem&reconnect=true"
Notice how we added the "reconnect=true" parameter to the end of the URL? This is so that your application will automatically reconnect to ClearDB in the event of a connection timeout.
From here, simply restart your application (if Heroku didn’t already do that for you), and as long as you specified the correct file names and paths to the certificates in your DATABASE_URL, your app will now connect via SSL to ClearDB.
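The steps above are for a Ruby stack; for the Python bot in this question, mysql-connector-python can, as far as I know, be pointed at the same certificate files through its ssl_ca, ssl_cert and ssl_key options. A rough sketch only, reusing the placeholder host and credentials from the DATABASE_URL example above:

import mysql.connector

# Sketch: host, credentials and file names are placeholders taken from the example URL above.
conn = mysql.connector.connect(
    host="us-cdbr-east.cleardb.com",
    database="my_heroku_db",
    user="abc1223",
    password="dfk243",
    ssl_ca="cleardb-ca-cert.pem",    # CA file from the ClearDB dashboard
    ssl_cert="cleardb_id-cert.pem",  # client certificate
    ssl_key="cleardb_id-key.pem",    # private key (password stripped as above)
)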
# default: on
# description: mysqlchk
service mysqlchk
{
    # this is a config for xinetd, place it in /etc/xinetd.d/
    disable        = no
    flags          = REUSE
    socket_type    = stream
    type           = UNLISTED
    port           = 9200
    wait           = no
    user           = root
    server         = /usr/bin/mysqlclustercheck
    log_on_failure += USERID
    only_from      = 0.0.0.0/0
    #
    # Passing arguments to clustercheck
    # <user> <pass> <available_when_donor=0|1> <log_file> <available_when_readonly=0|1> <defaults_extra_file>"
    # Recommended: server_args = user pass 1 /var/log/log-file 0 /etc/my.cnf.local"
    # Compatibility: server_args = user pass 1 /var/log/log-file 1 /etc/my.cnf.local"
    # 55-to-56 upgrade: server_args = user pass 1 /var/log/log-file 0 /etc/my.cnf.extra"
    #
    # recommended to put the IPs that need
    # to connect exclusively (security purposes)
    per_source     = UNLIMITED
}
It is kind of strange that the script works fine when run manually, but when it runs via /etc/xinetd.d/ it does not work as expected.
In the mysqlclustercheck script, instead of using the --user= and --password= syntax, I am using the --login-path= syntax.
The script runs fine when I run it from the command line, but the status for xinetd was showing signal 13. After debugging, I have found that even a simple command like this is not working:
mysql_config_editor print --all >>/tmp/test.txt
We don't see any output generated when it is run via xinetd (mysqlclustercheck).
Have you tried the following instead of /usr/bin/mysqlclustercheck?
server = /usr/bin/clustercheck
I am wondering if you could test your binary location with the Linux which command.
It has been a long time since this question was asked, but it just came to my attention.
First of all, as mentioned, the Percona cluster check script is called clustercheck, so make sure you are using the correct name and the correct path.
Secondly, since the script runs fine from the command line, it seems to me that the path of the mysql client command is not known by xinetd when it runs the cluster check script.
Since the mysqlclustercheck script, as offered by Percona, uses only the binary name mysql without specifying the absolute path, I suggest you do the following:
Find where the mysql client command is located on your system:
ccloud#gal1:~> sudo -i
gal1:~ # which mysql
/usr/local/mysql/bin/mysql
gal1:~ #
Then edit the script /usr/bin/mysqlclustercheck, and in the following line:
MYSQL_CMDLINE="mysql --defaults-extra-file=$DEFAULTS_EXTRA_FILE -nNE --connect-timeout=$TIMEOUT \
place the exact path of mysql client command you found in the previous step.
I also see that you are not using MySQL connection credentials for connecting to the MySQL server. The mysqlclustercheck script, as offered by Percona, uses a user/password pair in order to connect to the MySQL server.
So normally, you should execute the script in the command line like:
gal1:~ # /usr/sbin/clustercheck haproxy haproxyMySQLpass
HTTP/1.1 200 OK
Content-Type: text/plain
Where haproxy/haproxyMySQLpass is the MySQL connection user/pass for HAProxy monitoring user.
Additionally, you should pass them to your script in its xinetd settings, like:
server = /usr/bin/mysqlclustercheck
server_args = haproxy haproxyMySQLpass
Last but not least, the signal 13 you are getting is because something tries to write output from a script run by xinetd. If, for example, in your mysqlclustercheck you add a statement like
echo "debug message"
you are probably going to see the broken pipe signal (13 in POSIX).
Finally, I had issues with this script on SLES 12.3, and I eventually managed to run it not as 'nobody' but as 'root'.
Hope it helps