I have migrated MySQL from an older version to v8.0.19. On the older version this code worked fine, but now I am facing some issues with the connection pool, as shown below:
R SQL connection pool:
library(pool)
library(DBI)
library(RMariaDB)
pool = dbPool(
  drv = MariaDB(),
  dbname = "mydb",
  username = "root",
  password = Sys.getenv("MYSQL_PASSWORD"),
  host = "localhost",
  sslmode = "require",
  port = 3306
)
statement = paste0("select * from Employee where Id IN (", ids, ")")
con3 = poolCheckout(pool)
x = dbGetQuery(con3, statement)  # throwing an error here
poolReturn(con3)
But it works when I change the same code to:
statement = paste0("select * from Employee where Id IN (", ids, ")")
con3 = poolCheckout(pool)
x = dbGetQuery(pool, statement)  # no error here
poolReturn(con3)
Am I doing this the wrong way?
What's going on there with (", ids, ")?
x = dbGetQuery(con3, statement) is not the same as in the second version; you have pool there!
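This doesn't explain the error itself, but for reference, here is a minimal sketch of the two usage patterns the pool package supports, assuming the pool created above and an ids value such as "1,2,3" (that value is hypothetical). Whichever pattern you pick, the object you pass to dbGetQuery() should be the same one you checked out, and the pool should eventually be closed with poolClose():

library(DBI)
library(pool)

ids <- "1,2,3"  # hypothetical value; in the question, ids is pasted into the IN (...) clause
statement <- paste0("SELECT * FROM Employee WHERE Id IN (", ids, ")")

# Pattern 1: query the pool object directly; pool checks a connection out and returns it for you
x <- dbGetQuery(pool, statement)

# Pattern 2: check a connection out explicitly, query that same connection, then return it
con3 <- poolCheckout(pool)
x <- dbGetQuery(con3, statement)
poolReturn(con3)

# When the app shuts down
poolClose(pool)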
I got a data frame in R by querying a SQL Server DB. Now I want to loop over each row and insert it into a MySQL DB.
I tried dbWriteTable but it didn't work.
library(RODBC)
library(odbc)
library(RMySQL)
con <- dbConnect(odbc(),
Driver = "SQL Server",
Server = "XX",
Database = "XX",
UID = "XX",
PWD = "XX",
Port = XX)
mydb = dbConnect(MySQL(), user='XX', password='XX', dbname='YY', host='YYY')
resultset <- dbGetQuery(con, "SET NOCOUNT ON
DECLARE @StartDate DateTime
DECLARE @EndDate DateTime
SET @StartDate = dateadd(d,-1,getdate())
SET @EndDate = getdate()
SET NOCOUNT OFF
SELECT …..
LEFT JOIN ... ON ….
LEFT JOIN …. ON x.Key = y.Key
WHERE temp.StartDateTime >= @StartDate")
nrows <- nrow(resultset)
colnames(resultset) <- c("tagName", "date_inserted", "value")
So here I have my result in resultset, but I don't know how to insert resultset into MySQL.
dbWriteTable(mydb, name='data', value=resultset[0,],append=TRUE)
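# note: resultset[0, ] selects zero rows (R indexing starts at 1), so nothing is appended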
dbReadTable(mydb, "data")
I expect to insert the data, but I don't know whether it should be a for loop (one query per row) or how it is done.
More details in these images:
[image: my data set]
[image: MySQL DB structure]
Try a parameterized insert using the RODBCext package. I have used the following function in the past.
It will append the records to your database.
library(RODBC)
library(RODBCext)
First we need to connect to the database using RODBC.
sql.driver = "MySQL ODBC 5.3 ANSI Driver" # need to figure the version out
sql.server = "your_server_here"
sql.port = "3306" # or whatever your port number is
sql.user = "your_user_name_here"
sql.pass = "your_password_name_here"
sql.db = "your_database_name_here"
con.string = paste0("Driver=", sql.driver, ";",
"Server=", sql.server, ";",
"Port=", sql.port, ";",
"Uid=", sql.user, ";",
"Pwd=", sql.pass, ";",
"Database=", sql.db, ";")
ch = odbcDriverConnect(con.string)
Then here is the custom function saveTable(). You will want to run it with your specific inputs, described in the comments below; a usage example follows the function.
saveTable <- function(con, table_name, df) {
# con = the ODBC connection (e.g., ch)
# table_name = the SQL database table to append to
# df = the data.frame() to append
sql_code = paste("INSERT INTO",table_name,"(",paste(colnames(df),collapse=", "),") VALUES (",paste(rep("?",ncol(df)),collapse=","),")")
sqlExecute(con, sql_code, df)
}
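A possible usage sketch, assuming the connection ch defined above, an existing MySQL table named data, and a data frame like the resultset from the question whose column names match the table's columns (the table and column names come from the question, so adjust them to your own schema):

# Append every row of resultset to the existing MySQL table "data"
saveTable(ch, "data", resultset)

# Optional sanity check: count the rows now in the table
sqlQuery(ch, "SELECT COUNT(*) FROM data")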
I'm using Python 3.6, mysql-connector-python 8.0.11 and MySQL Community Server 8.0.11 (GPL). The table in question uses the InnoDB engine.
When using the MySQL Workbench I can enter:
USE test; START TRANSACTION; SELECT * FROM tasks WHERE task_status != 1 LIMIT 1 FOR UPDATE;
And it provides 1 record as expected:
When I use a Python 3 script (from the same machine, with the same access, etc.):
* SQL QRY: START TRANSACTION; SELECT * FROM test WHERE task_status != 1 LIMIT 1 FOR UPDATE;
* SQL RES: No result set to fetch from.
This is the debug output from my script. If I change the query to a plain SELECT, I do get output.
* SQL QRY: SELECT * FROM test WHERE task_status != 1 LIMIT 1;
* SQL RES: [(1, 0, 'TASK0001')]
I know SELECT * isn't the way to go, but I'm just trying to get some response for now.
I'm trying to allow multiple worker scripts to pick up a task without two workers taking the same task:
Do a SELECT and row-lock the task so that other workers' SELECT queries don't see it,
Set the task status to 'being processed' and unlock the record.
This is my first venture into locking, so this is new ground. I'm able to do normal queries and populate tables, so I have some experience, just not with locking.
TABLE creation:
create table test
(
id int auto_increment
primary key,
task_status int not null,
task_ref varchar(16) not null
);
Questions:
Is this the correct mindset? I.e., is there a more Pythonic/MySQL-native way to do this?
Is there a specific way I need to initiate the MySQL connection? Why would it work in MySQL Workbench but not via the script? I've tried the mysql client directly and that works too, so I think it is the Python connector that needs setting up correctly, as it is the only component not working.
Currently I'm using autocommit=1 on the connector and buffered=True on the cursor. I know you can set autocommit=0 in SQL before the START TRANSACTION, so I understand I may need to do this for the locking, but for all other transactions I would prefer to keep autocommit on. Is this OK and/or doable?
CODE:
#!/usr/bin/env python
import mysql.connector
import pprint
conn = mysql.connector.connect(user='testuser',
password='testpass',
host='127.0.0.1',
database='test_db',
autocommit=True)
dbc = conn.cursor(buffered=True)
qry = "START TRANSACTION; SELECT * FROM 'test' WHERE task_status != 1 LIMIT 1 ON UPDATE;"
sql_select = dbc.execute(qry)
try:
output = dbc.fetchall()
except mysql.connector.Error as e:
print(" * SQL QRY: {0}".format(qry))
print(" * SQL RES: {0}".format(e))
exit()
else:
print(" * SQL QRY: {0}".format(qry))
print(" * SQL RES: {0}".format(output))
Many Thanks,
Frank
So after playing around a bit, I worked out (by trial and error) that the proper way to do this is to just put 'FOR UPDATE' at the end of the normal query:
Full code is below (including option to add dummy records for testing):
#!/usr/bin/env python
import mysql.connector
import pprint
import os
conn = mysql.connector.connect(user='testuser',
password='testpass',
host='127.0.0.1',
database='test_db',
autocommit=True)
dbc = conn.cursor(buffered=True)
worker_pid = os.getpid()
all_done = False
create = False
if create:
items = []
for i in range(10000):
items.append([0, 'TASK%04d' % i])
dbc.executemany('INSERT INTO test (task_status, task_ref) VALUES (%s, %s)', tuple(items))
conn.commit()
conn.close()
exit()
while all_done is False:
print(all_done)
qry = (
"SELECT id FROM test WHERE task_status != 1 LIMIT 1 FOR UPDATE;"
)
sql_select = dbc.execute(qry)
try:
output = dbc.fetchall()
except mysql.connector.Error as e:
print(" * SQL QRY: {0}".format(qry))
print(" * SQL RES: {0}".format(e))
exit()
else:
print(" * SQL QRY: {0}".format(qry))
print(" * SQL RES: {0}".format(output))
if len(output) == 0:
print("All Done = Yes")
all_done = True
continue
else:
print("Not Done yet!")
if len(output) > 0:
test_id = output[0][0]
print("WORKER {0} FOUND: '{1}'".format(worker_pid, test_id))
qry = "UPDATE test SET task_status = %s, task_ref = %s WHERE id = %s;"
sql_select = dbc.execute(qry, tuple([1, worker_pid, test_id]))
conn.commit()
try:
output = dbc.fetchall()
except mysql.connector.Error as e:
print(" * SQL QRY: {0}".format(qry))
print(" * SQL RES: {0}".format(e))
else:
print(" * SQL QRY: {0}".format(qry))
print(" * SQL RES: {0}".format(output))
print(all_done)
Hope this helps someone else save some time; there are a lot of places with different information, but searches for python3, mysql-connector and transactions didn't get me anywhere.
Good Luck,
Frank
I have a query to run in R which retrieves data from the database and performs operations on it. When I run it in MySQL Workbench it works just fine, but in R it takes far too long and may hang the entire system. I also tried to run it from the command prompt but got the error:
Error: memory exhausted (limit reached?)
MySQL query:
library(DBI)
library(RMySQL)
con <- dbConnect(RMySQL::MySQL(),
dbname ="mydb",
host = "localhost",
port = 3306,
user = "root",
password = "")
pedigree <- dbGetQuery(con, "SELECT aa.name as person, mother as mom, father as dad
FROM addweight LEFT JOIN aa ON addweight.name2 = aa.name2 or addweight.name = aa.name
LEFT JOIN death ON addweight.name2 = death.name2 or addweight.name = death.name
Where((death.dodeath > curdate() OR aa.name2 NOT IN (SELECT name2 FROM death) OR aa.name NOT IN (SELECT name FROM death) OR aa.name NOT IN (SELECT name FROM death)) AND (dob < curdate() AND domove < curdate()))")
The solution could be to replace the dbGetQuery call with dbSendQuery and dbFetch calls.
The simple steps could be:
library(RMySQL)
# From OP
con <- dbConnect(RMySQL::MySQL(),
dbname ="mydb",
host = "localhost",
port = 3306,
user = "root",
password = "")
# iterationresults is a table in your database; replace the query with your own
rs = dbSendQuery(con, "select * from iterationresults")
# Fetch the rows in chunks of 20 and repeat until all rows have been retrieved
df = dbFetch(rs, n = 20)
# For repeated calls, append each new chunk to what has been fetched so far
while (!dbHasCompleted(rs)) {
  df <- rbind(df, dbFetch(rs, n = 20))
}
# OR Fetch all rows in one go
df = dbFetch(rs, n=-1)
# Free all resources
dbClearResult(rs)
# Close connection
dbDisconnect(con)
# df will contain results i.e.
df
# ID Truck_ID Speed trip_id
#1 11 TTI 039 6 217
#2 12 TTI 039 6 217
# ........
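If the full result genuinely does not fit in memory, a variant of the same approach (a sketch, not part of the original answer; run it in place of the fetch steps above, before the connection is closed, and note that the results.csv filename is just an example) is to process or write out each chunk as it is fetched instead of accumulating everything in df:

rs <- dbSendQuery(con, "select * from iterationresults")
while (!dbHasCompleted(rs)) {
  chunk <- dbFetch(rs, n = 1000)
  # handle each chunk here, e.g. append it to a CSV on disk
  write.table(chunk, "results.csv", sep = ",", row.names = FALSE,
              col.names = !file.exists("results.csv"), append = TRUE)
}
dbClearResult(rs)
dbDisconnect(con)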
I need to join two tables where the common ID column I want to use has a different name in each table. The two tables do share a column name, but it is a false match, so the join does not work when dplyr takes the default and joins on the columns named "id".
Here's some of the code involved in this problem:
library(dplyr)
library(RMySQL)
SDB <- src_mysql(host = "localhost", user = "foo", dbname = "bar", password = getPassword())
# Then reference a tbl within that src
administrators <- tbl(SDB, "administrators")
members <- tbl(SDB, "members")
Here are 3 attempts -- that all fail -- to pass along the information that the common column on the members side is "id" and on the administrators side it's "idmember":
sqlq <- semi_join(members,administrators, by=c("id","idmember"))
sqlq <- inner_join(members,administrators, by= "id.x = idmember.y")
sqlq <- semi_join(members,administrators, by.x = id, by.y = idmember)
Here's an example of the kinds of error messages I'm getting:
Error in mysqlExecStatement(conn, statement, ...) :
RS-DBI driver: (could not run statement: Unknown column '_LEFT.idmember' in 'where clause')
The examples I see out there pertain to data tables and data frames on the R side. My question is about how dplyr sends "by" statements to a SQL engine.
In the next version of dplyr, you'll be able to do:
inner_join(members, administrators, by = c("id" = "idmember"))
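Applied to the lazy tables defined in the question, that would look roughly like this (a sketch, assuming a dplyr version that supports the named by= syntax; collect() runs the generated SQL and pulls the joined rows into R):

joined <- inner_join(members, administrators, by = c("id" = "idmember"))
result <- collect(joined)  # execute the query on the MySQL side and bring rows into R
head(result)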
Looks like this is an unresolved issue:
https://github.com/hadley/dplyr/issues/177
However, you can use merge:
> admin <- as.tbl(data.frame(id = c("1", "2", "3"), false = c(TRUE, FALSE, FALSE)))
> members <- as.tbl(data.frame(idmember = c("1", "2", "4"), false = c(TRUE, TRUE, FALSE)))
> merge(admin, members, by.x = "id", by.y = "idmember")
id false.x false.y
1 1 TRUE TRUE
2 2 FALSE TRUE
If you need to do left or outer joins, you can always use the all.x or all arguments to merge (a sketch follows after the SQL example below). A thought, though... you've got a SQL DB, why not use it?
> con2 <- dbConnect(MySQL(), host = "localhost", user = "foo", dbname = "bar", password = getPassword())
> dbGetQuery(con2, "select * from admin join members on id = idmember")
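And if you do stay on the R side, here is the left/outer join sketch with merge, using the same toy tables as above (all.x = TRUE keeps admin rows that have no match):

# Left join: keep every row of admin, filling unmatched members columns with NA
merge(admin, members, by.x = "id", by.y = "idmember", all.x = TRUE)

# Full outer join: keep unmatched rows from both sides
merge(admin, members, by.x = "id", by.y = "idmember", all = TRUE)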
I'm using Orbeon 3.9.0 PE RC1 with liferay-portal-6.0.5. When using a localhost MySQL persistence layer, it works,
but when I try to use a remote (local network) MySQL database, Form Builder can't publish any form and no data is shown.
properties-local.xml configuration:
Error log sample:
2011-04-07 12:37:18,118 INFO ProcessorService - /fr/service/mysql/search/orbeon/builder - Received request
2011-04-07 12:37:20,853 ERROR SQLProcessor - PreparedStatement:
select
(
select count(*) from orbeon_form_data
where
(app, form, document_id, last_modified) in (
select app, form, document_id, max(last_modified) last_modified
from orbeon_form_data
where
app = ?
and form = ?
group by app, form, document_id)
and deleted = 'N'
) total,
(
select count(*) from (
select
data.created, data.last_modified, data.document_id
, extractValue(data.xml, '/*/xhtml:head/xforms:model[@id = ''fr-form-model'']/xforms:instance[@id = ''fr-form-metadata'']/*/application-name') detail_1
, extractValue(data.xml, '/*/xhtml:head/xforms:model[@id = ''fr-form-model'']/xforms:instance[@id = ''fr-form-metadata'']/*/form-name') detail_2
, extractValue(data.xml, '/*/xhtml:head/xforms:model[@id = ''fr-form-model'']/xforms:instance[@id = ''fr-form-metadata'']/*/title[@xml:lang = ''en'']') detail_3
, extractValue(data.xml, '/*/xhtml:head/xforms:model[@id = ''fr-form-model'']/xforms:instance[@id = ''fr-form-metadata'']/*/description[@xml:lang = ''en'']') detail_4
from orbeon_form_data data,
(
select max(last_modified) last_modified, app, form, document_id
from orbeon_form_data
where
app = ?
and form = ?
group by app, form, document_id
) latest
where
data.last_modified = latest.last_modified
and data.app = latest.app
and data.form = latest.form
and data.document_id = latest.document_id
and data.deleted = 'N'
order by created desc
)a
) search_total
2011-04-07 12:37:20,868 INFO DatabaseContext - Rolling back JDBC connection for datasource: jdbc/mysql.
2011-04-07 12:37:20,868 ERROR ProcessorService - Exception at oxf:/apps/fr/persistence/mysql/search.xpl (executing XSLT transformation)
com.mysql.jdbc.exceptions.MySQLSyntaxErrorException: FUNCTION orbeon.extractValue does not exist
at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:936)
at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:2985)
at com.mysql.jdbc.MysqlIO.sendCommand(MysqlIO.java:1631)
at com.mysql.jdbc.MysqlIO.sqlQueryDirect(MysqlIO.java:1723)
at com.mysql.jdbc.Connection.execSQL(Connection.java:3256)
at com.mysql.jdbc.PreparedStatement.executeInternal(PreparedStatement.java:1313)
at com.mysql.jdbc.PreparedStatement.execute(PreparedStatement.java:874)
at org.apache.tomcat.dbcp.dbcp.DelegatingPreparedStatement.execute(DelegatingPreparedStatement.java:169)
at org.apache.tomcat.dbcp.dbcp.DelegatingPreparedStatement.execute(DelegatingPreparedStatement.java:169)
at org.orbeon.oxf.processor.sql.interpreters.QueryInterpreter.end(QueryInterpreter.java:600)
at org.orbeon.oxf.processor.sql.SQLProcessor$InterpreterContentHandler.endElement(SQLProcessor.java:540)
at org.orbeon.oxf.processor.sql.SQLProcessor$ForwardingContentHandler.endElement(SQLProcessor.java:635)
at org.orbeon.oxf.processor.sql.SQLProcessor$InterpreterContentHandler.endElement(SQLProcessor.java:542)
at org.orbeon.oxf.processor.sql.SQLProcessor$ForwardingContentHandler.endElement(SQLProcessor.java:635)
at org.orbeon.oxf.processor.sql.SQLProcessor$InterpreterContentHandler.endElement(SQLProcessor.java:542)
at org.orbeon.oxf.processor.sql.SQLProcessor$ForwardingContentHandler.endElement(SQLProcessor.java:635)
at org.orbeon.oxf.processor.sql.SQLProcessor$InterpreterContentHandler.endElement(SQLProcessor.java:542)
at org.orbeon.oxf.processor.sql.SQLProcessor$RootInterpreter.endElement(SQLProcessor.java:290)
at org.orbeon.oxf.xml.SAXStore.replay(SAXStore.java:288)
at org.orbeon.oxf.xml.SAXStore.replay(SAXStore.java:202)
at org.orbeon.oxf.processor.sql.SQLProcessor.execute(SQLProcessor.java:251)
at org.orbeon.oxf.processor.sql.SQLProcessor$1.readImpl(SQLProcessor.java:89)
at org.orbeon.oxf.processor.impl.ProcessorOutputImpl$TopLevelOutputFilter.read(ProcessorOutputImpl.java:263)
at org.orbeon.oxf.processor.impl.ProcessorOutputImpl.read(ProcessorOutputImpl.java:406)
...
Since the error you're getting is FUNCTION orbeon.extractValue does not exist, I suspect this is because the remote MySQL server is an older version that doesn't support extractValue(). The MySQL persistence layer relies on XML functions introduced in MySQL 5.1, so you need to be running MySQL 5.1 (released in November 2008) or newer.