I am running an R script that adds data to a MySQL database. I usually format the data and add it as it comes in every couple of hours (the data stream is not continuous). My first set of data was added to the MySQL database properly. The second batch of data cannot be added properly.
con = dbConnect(MySQL(), user='root', password='xxxxxx', dbname='test', host='localhost')
dbWriteTable(con, 'Tables', value = parseTweets(filterStream(file.name= "", track=c("lebron"), timeout=10, oauth=my_oauth)))
When I rerun the last call (dbWriteTable), it gives me the following error:
Error: Error in .local(conn, statement, ...) :
could not run statement: Table 'tables' already exists
I also used
dbWriteTable(con, 'Tables', value = parseTweets(filterStream(file.name= "", track=c("lebron"), timeout=10, oauth=my_oauth)), append = TRUE)
but it produces the same error.
For some reason, setting append to TRUE is not working. Instead, give it a number. See the code below for a better understanding:
dbWriteTable(con, 'Tables', value = parseTweets(filterStream(file.name= "", track=c("lebron"), timeout=100, oauth=my_oauth)), overwrite = 0, row.names = 0, append = 1)
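For reference, the numeric flags are just the logical values written as numbers; a sketch of the equivalent call with logicals is below. Since TRUE and 1 are interchangeable in R, explicitly passing overwrite = FALSE and row.names = FALSE (rather than leaving the defaults) may be what actually fixes the append:
dbWriteTable(con, 'Tables',
             value = parseTweets(filterStream(file.name = "", track = c("lebron"),
                                              timeout = 100, oauth = my_oauth)),
             overwrite = FALSE, row.names = FALSE, append = TRUE)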
Related
I am trying to run the following code to populate a table in parallel for a certain application. First, the following function is defined; it is supposed to connect to my DB and execute the SQL command with the values given (to insert into the table).
import os
import MySQLdb

def dbWriter(sql, rows):
    # load credentials from the cnf file
    MYSQL_CNF = os.path.abspath('.') + '/mysql.cnf'
    conn = MySQLdb.connect(db='dedupe',
                           charset='utf8',
                           read_default_file=MYSQL_CNF)
    cursor = conn.cursor()
    cursor.executemany(sql, rows)
    conn.commit()
    cursor.close()
    conn.close()
And then there is this piece:
pool = dedupe.backport.Pool(processes=2)
done = False
while not done:
    chunks = (list(itertools.islice(b_data, step)) for step in
              [step_size] * 100)
    results = []
    for chunk in chunks:
        print len(chunk)
        results.append(pool.apply_async(dbWriter,
                                        ("INSERT INTO blocking_map VALUES (%s, %s)",
                                         chunk)))
    for r in results:
        r.wait()
    if len(chunk) < step_size:
        done = True
pool.close()
Everything runs and there are no errors. But at the end, my table is empty, meaning the insertions somehow did not succeed. I have tried many things to fix this (including adding column names to the insert statement) after many Google searches, without success. Any suggestions would be appreciated. (I am running the code in Python 2.7 on gcloud (Ubuntu); note that the indentation may have gotten mangled by pasting here.)
Please also note that "chunk" follows exactly the required data format.
Note: this is part of this example. The only thing I am changing in the linked example is that I am separating the steps for creating the tables and inserting into them, since I am running my code on the gcloud platform and it enforces GTID standards.
The solution was changing the dbWriter function to:
conn = MySQLdb.connect(host = # host ip,
                       user = # username,
                       passwd = # password,
                       db = 'dedupe')
cursor = conn.cursor()
cursor.executemany(sql, rows)
cursor.close()
conn.commit()
conn.close()
I am using the RMySQL package to write (append) data to an existing table.
I am using R, version 3.3.2.
My code looks like this:
library(RMySQL)
df_final <- some_data
m<-dbDriver("MySQL")
mydb <- dbConnect(m, user='odvjet12_mislav',
password='my_pass',
host='91.234.46.219',
dbname='odvjet12_fina_pn')
dbWriteTable(mydb, value = df_final, name = "fina_pn", append = TRUE, row.names = FALSE)
This code worked fine for some time, but in the last ten days it always returns an error:
Error in .local(conn, statement, ...) :
could not run statement: The used command is not allowed with this MySQL version
I don't understand how it is possible for the code to work for some time and then start returning an error. I would appreciate any feedback on this issue.
Best,
Mislav Šagovac
You could also use dbGetQuery from the RMySQL package and iterate over the rows, which was my solution when I hit a similar error for a data frame I wanted to write to a MySQL DB (plain INSERT statements avoid the bulk-load mechanism that dbWriteTable relies on, which some servers disallow):
mydb = dbConnect(MySQL(), user='user', password='password', dbname='databasename', host='hostname')
for (i in 1:nrow(df)) {
  dbGetQuery(mydb, paste0("INSERT INTO MYTABLE (COL1,COL2) VALUES (", df$col1[i], ",", df$col2[i], ")"))
}
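One caveat with this pattern (my addition): character values need quotes, and ideally escaping, inside the SQL string. A sketch assuming col2 is a character column, using sprintf and RMySQL's dbEscapeStrings:
for (i in 1:nrow(df)) {
  # escape the string value before interpolating it into the statement
  val <- dbEscapeStrings(mydb, df$col2[i])
  dbGetQuery(mydb, sprintf("INSERT INTO MYTABLE (COL1, COL2) VALUES (%s, '%s')",
                           df$col1[i], val))
}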
I have a data frame containing a column 'Quarter', with values like "16/17 Q1", "16/17 Q2", ..., and a column 'Vendor', with values like "a", "b", ... .
I am trying to write this data frame into the database using
query <- paste("INSERT INTO cc_demo (Quarter,Vendor) VALUES(dd$FY_QUARTER,dd$VENDOR.x)")
but it is throwing an error:
Error in .local(conn, statement, ...) :
could not run statement: Unknown column 'dd$FY_QUARTER' in 'field list'
I am new to RMySQL. Please suggest a way to write the entire data frame.
To write a data frame to a MySQL DB you need to:
Create a connection to your database, for which you need to specify:
MySQL connection
User
Password
Host
Database name
library("RMySQL")
connection <- dbConnect(MySQL(), user = 'root', password = 'password', host = 'localhost', dbname = 'TheDB')
Using the connection, create a table and then export data to the database:
dbWriteTable(connection, "testTable", testTable)
You can overwrite an existing table like this:
dbWriteTable(connection, "testTable", testTable_2, overwrite=TRUE)
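And to append rows instead of overwriting (a sketch; testTable_3 is a hypothetical data frame whose columns match the table):
dbWriteTable(connection, "testTable", testTable_3, append = TRUE, row.names = FALSE)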
I would advise against hand-writing SQL queries when you can use very handy functions such as dbWriteTable from the RMySQL package. But for the sake of practice, below is an example of how you could write an SQL query that does multiple inserts for a MySQL database:
# Set up a data.frame
dd <- data.frame(Quarter = c("16/17 Q1", "16/17 Q2"), Vendors = c("a","b"))
# Begin the query
sql_qry <- "insert into cc_demo (Quarter,Vendor) VALUES"
# Finish it with
sql_qry <- paste0(sql_qry, paste(sprintf("('%s', '%s')", dd$Quarter, dd$Vendors), collapse = ","))
You should get:
"insert into cc_demo (Quarter,Vendor) VALUES('16/17 Q1', 'a'),('16/17 Q2', 'b')"
You can provide this query to your database connection in order to run it.
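For example (a sketch, assuming mydb is an open connection such as the one created in an earlier answer):
dbGetQuery(mydb, sql_qry)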
I hope this helps.
I would like to extract all CREATE statements in my 50 MySQL databases via SHOW CREATE TABLE db.table, e.g. SHOW CREATE TABLE db1.mytable, SHOW CREATE TABLE db2.sometable, or SHOW CREATE TABLE db3.mytable1. Each of the DBs has some tables inside: db1(table, mytable, ...), db2(table1, sometable), and so on.
To illustrate the DBs via an example query:
SELECT *
FROM db.table1 m
LEFT JOIN db1.sometable o ON m.id = o.id
LEFT JOIN db2.sometables t ON m.id = t.id
LEFT JOIN db3.sometable s ON s.column='john'
library(RMySQL)
library(DBI)
con <- dbConnect(RMySQL::MySQL(),
                 username = "",
                 password = "",
                 host = "",
                 port = 3306,
                 dbname = mydbname)  # when using dbs <- dbGetQuery(con, "SHOW DATABASES") I have to set dbname = mydbname to get all DBs
Using dbs <- dbGetQuery(con, "SHOW DATABASES") I can extract all 50 databases in the connection as a character vector. I would like to loop over each DB in dbs and apply SHOW CREATE TABLE to each row/DB. I suppose I have to parse each row/DB into dbname = mydbname and dbs <- dbGetQuery(con, "SHOW CREATE TABLE"), but I just can't figure out how to write the loops.
I tried:
apply(dbs, 1, function(row) {
  dbname <- row[]
  for (i in 1:length(dbname)) {
    create <- dbGetQuery(con, "SHOW CREATE TABLE")
  }
})
But that doesn't seem right. I suppose I have to include con in the loop somehow. Otherwise I'll get:
Error in .local(drv, ...) : object 'dbname' not found
So I tried:
apply(dbs, 1, function(row) {
  dbname <- row[]
  for (i in 1:length(dbname)) {
    con <- dbConnect(RMySQL::MySQL(),
                     username = "",
                     password = "",
                     host = "",
                     port = 3306,
                     dbname = [i])
    create <- dbGetQuery(con, "SHOW CREATE TABLE")
  }
})
I suppose this comes close to the solution, but I am missing something:
dbs <- dbGetQuery(con, "show databases")
library(foreach)
foreach(i = 1:length(dbs)) %dopar% {
  query <- paste("SHOW CREATE TABLE", dbs[i])
  creates <- dbGetQuery(con, query)
}
Consider this approach: import a data frame of each database (leaving out the system ones, INFORMATION_SCHEMA and MYSQL) and its corresponding tables, then run SHOW CREATE TABLE statements, and finally merge the original data frame with the row-bound data frame of CREATE statements.
Now, the one caveat is tables whose names repeat across databases. To return distinct values of such combinations, aggregate() with the head function is used.
con <- dbConnect(RMySQL::MySQL(),
username = "****", password = "****",
host = "****", port = 3306,
dbname= "****")
dbtbls <- dbGetQuery(con, "SELECT `TABLE_SCHEMA` AS `Database`,
`TABLE_NAME` AS `Table`
FROM `INFORMATION_SCHEMA`.`TABLES`
WHERE `TABLE_TYPE` = 'BASE TABLE'
AND `TABLE_SCHEMA` NOT LIKE '%SCHEMA%'
AND `TABLE_SCHEMA` NOT LIKE '%MYSQL%' ")
# LIST OF SQL STATEMENTS
sql <- paste0("SHOW CREATE TABLE ", dbtbls$Database, ".", dbtbls$Table)
# LIST OF DATAFRAMES
createstmts <- lapply(sql, function(x) dbGetQuery(con, x))
dbDisconnect(con)
# ROW BIND LIST INTO ONE DATAFRAME TO MERGE WITH ORIGINAL
stmtsdf <- do.call(rbind, createstmts)
finaldf <- merge(dbtbls, stmtsdf, by='Table')
# RETURN DISTINCT RECORDS
finaldf <- aggregate(.~Database+Table, finaldf, FUN=head, 1)
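As a possible follow-up (my addition, a sketch): dump the collected statements to a .sql file. SHOW CREATE TABLE returns the statement in a column literally named Create Table, hence the backticks:
writeLines(finaldf$`Create Table`, "create_statements.sql")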
mysqldump --no-data
does exactly what you are asking for. (There may be other parameters desirable to avoid/include CREATE DATABASE, etc.)
If the requirement is to subsequently pull the CREATEs into R, then I ask whether this is a one-time task, or a recurring task. For one-time, I would suggest that, overall, the mysqldump approach might be simpler.
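For the one-time case you could even drive it from R via system() (a sketch; it assumes mysqldump is on the PATH and that credentials are picked up from an option file such as ~/.my.cnf):
# Capture every CREATE statement (no row data) as a character vector in R
creates <- system("mysqldump --no-data --all-databases --skip-comments", intern = TRUE)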
First, you can simply use
for (i in 1:length(dbs)) { }
Or you can look into the apply functions, particularly sapply. There you can parse each dbConnection string, connect, and get all tables as a list or vector. Then you can loop inside those to get the CREATE TABLE statements.
So, it is basically apply inside apply.
For a good explanation of apply functions, you can look into http://www.r-bloggers.com/using-apply-sapply-lapply-in-r/
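A minimal sketch of that apply-inside-apply idea (my code; it reuses the question's connection and assumes the account can read every non-system database):
dbs <- dbGetQuery(con, "SHOW DATABASES")$Database
dbs <- setdiff(dbs, c("information_schema", "mysql", "performance_schema"))

# Outer loop over databases, inner loop over each database's tables
creates <- unlist(lapply(dbs, function(db) {
  tbls <- dbGetQuery(con, sprintf("SHOW TABLES FROM `%s`", db))[[1]]
  sapply(tbls, function(tbl) {
    dbGetQuery(con, sprintf("SHOW CREATE TABLE `%s`.`%s`", db, tbl))[["Create Table"]]
  })
}))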
I just started using the R package RMySQL in order to get around some memory limitations on my computer. I am trying to take a matrix with 100 columns in R (called data.df) and then make a new table in an SQL database that has "100 choose 2" (= 4950) columns, where each column is a linear combination of two columns from the initial matrix. So far I have something like this:
countnumber <- 1
con <- dbConnect(MySQL(), user = "root", password = "password", dbname = "myDB")
temp <- as.data.frame(data.df[,1] - data.df[,2])
colnames(temp) <- paste(pairs[[countnumber]][1], pairs[[countnumber]][2], sep = "")
dbWriteTable(con, "spreadtable", temp, row.names=T, overwrite = T)
for (i in 1:(n - 1)) {
  for (j in (i + 1):n) {
    if (!((i == 1) && (j == 2))) {  # this part excludes the first iteration already taken care of
      temp <- as.data.frame(data.df[, i] - data.df[, j])
      colnames(temp) <- "hola"
      dbWriteTable(con, "spreadtable", value = temp, append = TRUE, overwrite = FALSE, row.names = FALSE)
      countnumber <- countnumber + 1
    }
  }
}
I've also tried toying around with the "field.types" argument of RMySQL::dbWriteTable(), which was suggested at RMySQL dbWriteTable with field.types. Sadly it hasn't helped me out too much.
Questions:
Is making your own sql database a valid solution to the memory-bound nature of R, even if it has 4950 columns?
Is the dbWriteTable() the proper function to be using here?
Assuming the answer is "yes" to both of the previous questions...why isn't this working?
Thanks for any help.
[EDIT]: code with error output:
names <- as.data.frame(index)
names <- t(names)
#dim(names) is 1 409
con <- dbConnect(MySQL(), user = "root", password = "password", dbname = "taylordatabase")
dbGetQuery(con, dbBuildTableDefinition(MySQL(), name="spreadtable", obj=names, row.names = F))
# I would prefer these to be double types with 8 decimal places instead of text
#dim(temp) is 1 409
temp <- as.data.frame(data.df[,1] - (ratios[countnumber]*data.df[,2]))
temp <- t(temp)
temp <- as.data.frame(temp)
dbWriteTable(con, name = "spreadtable", temp, append = T)
The table is created successfully in the database (I will change the variable types later), but the dbWriteTable() line produces the error:
Error in mysqlExecStatement(conn, statement, ...) :
RS-DBI driver: (could not run statement: Unknown column 'row_names' in 'field list')
[1] FALSE
Warning message:
In mysqlWriteTable(conn, name, value, ...) : could not load data into table
If I make a slight change, I get a different error message:
dbWriteTable(con, name = "spreadtable", temp, append = T, row.names = F)
and
Error in mysqlExecStatement(conn, statement, ...) :
RS-DBI driver: (could not run statement: Unknown column 'X2011_01_03' in 'field list')
[1] FALSE
Warning message:
In mysqlWriteTable(conn, name, value, ...) : could not load data into table
I just want to use "names" as a bunch of column labels; they were initially dates. The actual data should be "temp".
Having a table with 4950 columns is OK in itself; the problem is which columns you need.
If you always SELECT *, you will eventually exhaust all your system memory (in the case that the table has that many columns).
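For example, pulling back only the pairs you need at the moment instead of the whole table (a sketch; AB and AC are hypothetical column names):
needed <- dbGetQuery(con, "SELECT `AB`, `AC` FROM spreadtable")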
Why not share the error message if you have encountered any problems?