I am trying to access data from a View in a MySQL database using RStudio. When I connect to the database, it shows the names of all Views in the Connections window. I can even return the names of every View using the dbListTables() function.
But when I try to run the tbl() function, I get the following error:
Error: nanodbc/nanodbc.cpp:1655: HY000: [MySQL][ODBC 8.0(a) Driver][mysqld-5.5.5-10.3.34-
MariaDB-log]Prepared statement needs to be re-prepared
<SQL> 'SELECT *
FROM `database_view` AS `q01`
WHERE (0 = 1)'
Here are the packages I have loaded:
library(tidyverse)
library(dbplyr)
library(DBI)
library(odbc)
Here is my tbl() code:
tbl(con, "database_view")
Here is my connection code (I replaced the actual values with placeholders in brackets):
con <- DBI::dbConnect(odbc::odbc(),
Driver = "MySQL ODBC 8.0 ANSI Driver",
Server = "[Server]",
UID = "[UID]",
PWD = "[PWD]",
Port = 3306,
Database = "[Database]")
Any help would be much appreciated!
I have referred to various articles on Stack Overflow and other external sources, but I have somehow been unable to get an answer for this. I want to read a table from the MySQL database I manage in MySQL Workbench into a dataframe in Colab.
1st Method
In this method, the first block of code executes successfully.
Note: I have hidden the database, table, and password names for security reasons.
Source -
import sqlalchemy

USER = 'root'
PASSWORD = 'PASSWORD'
DATABASE = 'DATABASE'
TABLE = 'TABLE'
connection_string = f'mysql+pymysql://root:PASSWORD/DATABASE'
engine = sqlalchemy.create_engine(connection_string)
I am getting an error for the second block of code. Is it because my password ends with #786, or is there some other reason?
query = f"SELECT * FROM DATABASE.TABLE"
import pandas as pd
df = pd.read_sql_query(query, engine)
OperationalError: (pymysql.err.OperationalError) (2003, "Can't connect to MySQL server on '786#3306' ([Errno -2] Name or service not known)") (Background on this error at: https://sqlalche.me/e/14/e3q8)
2nd Method
In this method, the first block of code again executes successfully.
Note: I have hidden the database, table, and password names for security reasons.
from sqlalchemy import create_engine

connection_string = 'mysql+pymysql://root:PASSWORD#3306/DATABASE'
connect_args = {'ssl': {'ca': '/content/rds-ca-2015-root.pem'}}
db = create_engine(connection_string, connect_args=connect_args)
I am getting an error for the second block of code.
query = """SELECT * FROM DATABASE.TABLE"""
events_df = pd.read_sql(query, con=db)
FileNotFoundError: [Errno 2] No such file or directory
My questions:
1) Why am I getting an error for the second block of code in both methods?
2) Is there a workaround or an alternative approach/code with which I can successfully connect Colab to the MySQL database and read a table?
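A note on both errors, in case it helps future readers: the '786#3306' host in the first traceback suggests that the '#' in the password is being parsed as a URL delimiter, and the FileNotFoundError in the second method means the CA file path given in connect_args does not exist in the Colab filesystem. Below is a minimal sketch of the kind of connection I would try, assuming a MySQL server that is reachable from Colab; the HOST, PORT, and credential values are placeholders, not taken from the original post. The key point is URL-encoding the password with urllib.parse.quote_plus so characters such as '#' survive:

from urllib.parse import quote_plus
import sqlalchemy
import pandas as pd

# Placeholder values -- substitute your own. Note that Colab cannot reach a MySQL
# server running on your local machine via 'localhost'; use a publicly reachable host.
USER = 'root'
PASSWORD = 'mypassword#786'   # a password containing '#'
HOST = 'your-mysql-host'
PORT = 3306
DATABASE = 'DATABASE'
TABLE = 'TABLE'

# quote_plus() percent-encodes '#', '@', etc. so SQLAlchemy parses the URL correctly
connection_string = f'mysql+pymysql://{USER}:{quote_plus(PASSWORD)}@{HOST}:{PORT}/{DATABASE}'
engine = sqlalchemy.create_engine(connection_string)

df = pd.read_sql_query(f'SELECT * FROM {DATABASE}.{TABLE}', engine)
print(df.head())

If the SSL connect_args from the second method are still needed, upload the .pem file to Colab first (or mount it from Drive) and confirm the path with os.path.exists() before creating the engine.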
I'm trying to connect to my Google Cloud MySQL database through a Google Cloud Function to read some data. The function builds successfully, but when it is executed, only this is displayed:
Error: (pymysql.err.OperationalError) (2003, "Can't connect to MySQL server on 'localhost' ([Errno 111] Connection refused)") (Background on this error at: http://sqlalche.me/e/e3q8)
Here is my connection code:
import sqlalchemy
# Depending on which database you are using, you'll set some variables differently.
# In this code we are inserting only one field with one value.
# Feel free to change the insert statement as needed for your own table's requirements.
# Uncomment and set the following variables depending on your specific instance and database:
connection_name = "single-router-309308:europe-west4:supermarkt-database"
db_name = "supermarkt-database"
db_user = "hidden"
db_password = "hidden"
# If your database is MySQL, uncomment the following two lines:
driver_name = 'mysql+pymysql'
query_string = dict({"unix_socket": "/cloudsql/{}".format(connection_name)})
# If the type of your table_field value is a string, surround it with double quotes. < SO note: I didn't really understand this line. Is this the problem?
def insert(request):
    request_json = request.get_json()
    stmt = sqlalchemy.text('INSERT INTO products VALUES ("Testid", "testname", "storename", "testbrand", "4.20", "1kg", "super lekker super mooi", "none")')
    db = sqlalchemy.create_engine(
        sqlalchemy.engine.url.URL(
            drivername=driver_name,
            username=db_user,
            password=db_password,
            database=db_name,
            query=query_string,
        ),
        pool_size=5,
        max_overflow=2,
        pool_timeout=30,
        pool_recycle=1800
    )
    try:
        with db.connect() as conn:
            conn.execute(stmt)
    except Exception as e:
        return 'Error: {}'.format(str(e))
    return 'ok'
I got it mostly from following this tutorial: https://codelabs.developers.google.com/codelabs/connecting-to-cloud-sql-with-cloud-functions#0. I'm also using Python 3.7, as used in the tutorial.
The SQLAlchemy documentation describes this OperationalError as an error that is not necessarily under the control of the programmer.
For context, the account used to connect has the Cloud SQL Admin role, and the Cloud SQL Admin API is enabled. Thanks in advance for the help!
PS: I did find this answer: Connecting to Cloud SQL from Google Cloud Function using Python and SQLAlchemy, but I have no idea where the firewall settings for Cloud SQL can be found. I didn't find them in SQL > Connection / Overview or Firewall.
Alright so I figured it out! In Edit Function > Runtime, Build and Connection Settings, head over to Connection Settings and make sure "Only route requests to private IPs through the VPC connector" is enabled. The VPC connector requires different authorization.
Also, apparently I needed my TABLE name, not my DATABASE name, as the db_name variable. Thanks @guillaume blaquiere for your assistance!
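As a side note for anyone debugging the same 2003 error: as far as I can tell, PyMySQL reports unix-socket failures using its default host string 'localhost', so the message does not necessarily mean a TCP connection was attempted. A quick sanity check, sketched below using the connection_name from the question (adapt the names to your own deployment; check_socket is just a hypothetical debug entry point), is to have the function report whether the Cloud SQL socket is actually mounted:

import os

# Instance connection name from the question; substitute your own.
connection_name = "single-router-309308:europe-west4:supermarkt-database"

def check_socket(request):
    # Hypothetical debug entry point: if this reports False, the function has no
    # Cloud SQL connection (or VPC connector) configured, which would explain the error.
    socket_path = "/cloudsql/{}".format(connection_name)
    return "socket exists: {}".format(os.path.exists(socket_path))

If it reports False, the fix is in the function's connection settings (as described above) rather than in the SQLAlchemy code.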
What would be the ODBC equivalent of the following:
hconn = database('{schema name}','{username}','{password}',...
'com.mysql.jdbc.Driver',...
'jdbc:mysql://{hostname}:{port}/{schema name}?...
useSSL=true&requireSSL=false&autoReconnect=true&');
I am using MATLAB's Database Toolbox, version 7.1.
You can use the following code to open a traditional connection to the database, specifying the ODBC data source name (for example, ODBC DB):
conn = database('ODBC DB','myuser','mypass');
If you want to use Windows Authentication instead, all you have to do is specify the authenticated ODBC data source name (for example, ODBC DB AUTH) and provide a blank username and password:
conn = database('ODBC DB AUTH','','');
Refer to this page for more information.
I used the following code to connect R to my own MySQL server (i.e. localhost server).
library(DBI)
library(RMySQL)  # provides the MySQL() driver

con <- dbConnect(MySQL(), user="root", password="********", dbname="try", host="localhost")
dbListTables(con)           # to see what tables the database has
#data(test)                 # shows an error because it's not yet in R, it's still on the server
dbListFields(con, 'test')   # to see what fields the table has
rs <- dbSendQuery(con, "SELECT * FROM test")  # data is still on the server
data <- fetch(rs, n = -1)   # using fetch to bring the data into R
Now I have to connect to someone else's MySQL server (i.e., the IP would be different and the server would be on their machine) to get the data from them.
So, what details do I need, and what modifications do I need to make to the code?
Thank you.
Set the correct host (the remote server's domain name or, at a pinch, its dotted IP address), user name, and password, as defined by the admin of the remote MySQL server.
Once those are set in con <- dbConnect(...), everything else should be the same.
con <- dbConnect(MySQL(),user="fred",password="********",dbname="try",host="yoursql.wherever.com")
Note that you might have problems if your local network policy blocks any of the ports that MySQL uses (3306 by default).
I am trying to use SAS v9.0 to connect to a MySQL database on a Windows 7 machine. Below is the code I am using.
proc sql;
connect to odbc(datasrc=localhost user=root password=password);
create table tmp as
select *
from connection to odbc
(
select * from mysql.time_zone
)
;
quit;
This is giving me the following error.
ERROR: CLI error trying to establish connection: [Microsoft][ODBC Driver Manager] Data source name not found and no default driver specified
I am using the MySQL ODBC 5.1 Driver.
Looks like you haven't set up your ODBC data source in Windows. Go to Start -> Control Panel -> System and Security -> Administrative Tools -> ODBC Data Sources. Select "Add" and provide all the information asked for. Then supply that ODBC data source name in the datasrc option when you submit your PROC SQL.