I'm trying to upload a CSV file with a lot of potential clients (15,000) into a MySQL table. I want to keep them in a table so I can later retrieve their info, complete forms, and create users.
Right now I'm at the beginning, trying to import the CSV into MySQL.
I read some solutions that use the smarter_csv gem. Do I need to write a migration with the DB structure and then execute the rake task, or is the migration not needed for this?
The kind of code I want to use to import the CSV is something like what I read in earlier posts such as Ruby on Rails - Import Data from a CSV file:
require 'smarter_csv'

options = {}
# Process the CSV in chunks; each chunk is an array of row hashes
SmarterCSV.process('input_file.csv', options) do |chunk|
  chunk.each do |data_hash|
    Moulding.create!(data_hash)
  end
end
You may use my importer gem. It also uses SmarterCSV.
https://github.com/michaelnera/active_record_importer
I am exporting my whole Neo4j DB to JSON using the APOC APIs and then importing it again the same way. The import query executes successfully, but I cannot find any data in Neo4j.
Export query:
CALL apoc.export.json.all('complete-db.json',{useTypes:true, storeNodeIds:false})
Import query:
CALL apoc.load.json('complete-db.json')
When I execute:
MATCH (n) RETURN n
It shows no results found.
This is a little bit confusing, but apoc.load.json just reads (loads) data from the JSON file/URL.
It doesn't import the data or create the graph. You need to create the graph (nodes and/or relationships) yourself using Cypher statements.
In this case, you just read the file and didn't do anything with it, so the statement executed successfully. Your query isn't an import query, it's a JSON load query.
Refer to the following example of an import using apoc.load.json:
CALL apoc.load.json('complete-db.json') YIELD value
UNWIND value.items AS item
CREATE (i:Item {name: item.name, id: item.id})
apoc.import.json does what you need.
The export-import process:
Export:
CALL apoc.export.json.all('file:///complete-db.json', {useTypes:true, storeNodeIds:false})
Import:
CALL apoc.import.json("file:///complete-db.json")
(@rajendra-kadam explains why your version does not work; this is the complementary API call to apoc.export.json.all that you were expecting.)
I work as a Business Analyst and am new to Python.
In one of my projects, I want to extract data from a .csv file and load it into my MySQL DB (staging).
Can anyone guide me with sample code and the frameworks I should use?
Here is a simple program that creates a SQLite database. You can read the CSV file and use dynamic_data_entry to insert each row into your desired target table.
import sqlite3
import time
import datetime
import random

conn = sqlite3.connect('test.db')
c = conn.cursor()

def create_table():
    c.execute('CREATE TABLE IF NOT EXISTS stuffToPlot(unix REAL, datestamp TEXT, keyword TEXT, value REAL)')

def data_entry():
    # Insert one hard-coded row
    c.execute("INSERT INTO stuffToPlot VALUES(1452549219, '2016-01-11 13:53:39', 'Python', 6)")
    conn.commit()

def dynamic_data_entry():
    # Build one row at runtime and insert it with ? placeholders
    unix = time.time()
    date = str(datetime.datetime.fromtimestamp(unix).strftime('%Y-%m-%d %H:%M:%S'))
    keyword = 'python'
    value = random.randrange(0, 10)
    c.execute("INSERT INTO stuffToPlot(unix, datestamp, keyword, value) VALUES(?,?,?,?)",
              (unix, date, keyword, value))
    conn.commit()

def read_from_db():
    c.execute('SELECT * FROM stuffToPlot')
    for row in c.fetchall():
        print(row)

create_table()
read_from_db()
c.close()
conn.close()
You can iterate through the data in the CSV and load it into SQLite3, as in the sketch below. Please refer to the following link as well:
Quick easy way to migrate SQLite3 to MySQL?
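For completeness, here is a minimal sketch of that loop, assuming a headerless clients.csv whose columns match a hypothetical clients staging table:

import csv
import sqlite3

conn = sqlite3.connect('test.db')
c = conn.cursor()

# Hypothetical staging table; match the columns to your CSV layout
c.execute('CREATE TABLE IF NOT EXISTS clients(name TEXT, email TEXT, phone TEXT)')

with open('clients.csv', newline='') as f:
    reader = csv.reader(f)
    # executemany consumes the reader row by row; ? placeholders handle quoting
    c.executemany('INSERT INTO clients(name, email, phone) VALUES (?,?,?)', reader)

conn.commit()
conn.close()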
If that's a properly formatted CSV file, you can use the LOAD DATA INFILE MySQL command and you won't need any Python. Then, after it is loaded into the staging area (without processing), you can continue transforming it using SQL or the ETL tool of your choice.
https://dev.mysql.com/doc/refman/8.0/en/load-data.html
One drawback is that you need to map all of the columns, but even if the file contains data you don't need, you might still prefer to load everything into staging.
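If you would rather drive it from Python anyway, here is a hedged sketch using pymysql; the file name, table name, and column layout are assumptions, and the server must also have local_infile enabled:

import pymysql

# local_infile=True lets the client send the file; the server must allow it too
conn = pymysql.connect(host='localhost', user='user', password='secret',
                       db='staging', local_infile=True)
try:
    with conn.cursor() as cur:
        # Assumed file, table, and CSV layout; adjust to your schema
        cur.execute(r"""
            LOAD DATA LOCAL INFILE 'clients.csv'
            INTO TABLE clients_staging
            FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
            LINES TERMINATED BY '\n'
        """)
    conn.commit()
finally:
    conn.close()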
I have an implementation that is best served by pickling a pandas DataFrame and storing it in a DB.
This works fine when the database is SQLite, but it fails with a load error when it is MySQL.
I have found other people with similar issues on Stack Overflow and Google, but it seems that everybody's solution is to store the DataFrame via SQL.
As a last resort I would go down that route, but it would be a shame to do that for this use case.
Has anybody got a solution to get the same behaviour from MySQL as from SQLite here?
I simply dump the dataframe with
pickledframe = pickle.dumps(frame)
and store pickledframe as a BinaryField
pickledframe = models.BinaryField(null=True)
I load it in with
unpickled = pickle.loads(pickledframe)
With SQLite it works fine; with MySQL, upon trying to load it, I get:
Exception Type: UnpicklingError
Exception Value: invalid load key, ','.
Thanks
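One workaround to try, as a sketch rather than a confirmed fix: the "invalid load key" error suggests the pickled bytes are being altered somewhere in transit, so base64-encoding the payload keeps it ASCII-safe regardless of backend:

import base64
import pickle

import pandas as pd

frame = pd.DataFrame({'a': [1, 2, 3]})

# Encode the pickled bytes as ASCII-safe text before storing them
payload = base64.b64encode(pickle.dumps(frame))

# ...store payload in the BinaryField, then decode on the way back out
unpickled = pickle.loads(base64.b64decode(payload))
print(unpickled.equals(frame))  # True if the bytes survived the round trip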
I am trying to export the output of a query on a MySQL database to a CSV file on the local system using Python. There are two issues. First, using fetchall() I am not getting any data (the same query in the database produces more than 5,000 rows), though I did get data output initially. Secondly, I would like to know how to put the username and password in a separate file that the user cannot access but that will be imported into this script when it runs.
import csv
import pymysql
import pymysql.cursors

# newline='' keeps csv.writer from inserting blank lines on Windows
d = open('c:/Users/dasa17/Desktop/pylearn/Roster.csv', 'w', newline='')
c = csv.writer(d)

connection = pymysql.connect(host='xxxxx', user='xxxxx', password='xxxx',
                             db='xxxx', charset='utf8mb4',
                             cursorclass=pymysql.cursors.DictCursor)
a = connection.cursor()
a.execute("select statement")

# With DictCursor each row is a dict; write its values, not the dict itself
for item in a.fetchall():
    c.writerow(item.values())

a.close()
d.close()
connection.close()
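For the second issue, a common approach is a config file that the script reads at runtime (a sketch, assuming a hypothetical db.ini kept out of the user's reach via file permissions):

import configparser

import pymysql
import pymysql.cursors

# db.ini (hypothetical), readable only by the account running the script:
# [mysql]
# host = xxxxx
# user = xxxxx
# password = xxxx
# db = xxxx

config = configparser.ConfigParser()
config.read('db.ini')
creds = config['mysql']

connection = pymysql.connect(host=creds['host'], user=creds['user'],
                             password=creds['password'], db=creds['db'],
                             charset='utf8mb4',
                             cursorclass=pymysql.cursors.DictCursor)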
Relatively new to Ruby, running: Ruby 1.9.2 and MySQL 5.5.19
I have a csv file full of data that I'd like to add to an existing table on a mysql database. The csv file does not have headers. Yes, I'm aware that there are multiple questions on this topic already. Here's how it breaks down:
Answer #1: Use LOAD DATA INFILE
Unfortunately, LOAD DATA INFILE gives me the following error: "Can't get stat of 'filename.csv' (Errcode: 2)"
This appears to be some kind of permissions issue. I've tried this both directly at the mysql prompt (as root), and through a Ruby script. I've tried various chmod options on the csv file, as well as moving the csv file to various directories where other users have said it works for them. No luck. Regardless, most people at this point recommend...
Answer #2: Use LOAD DATA LOCAL INFILE
Unfortunately this also returns an error. Apparently LOCAL INFILE is a MySQL option that is turned off by default because it's a security risk. I've tried turning it on, but still get nothing but errors, such as:
ERROR 1148 (42000): The used command is not allowed with this MySQL version
and also
undefined method `execute' for # (NoMethodError)
Answer #3: I can find various answers involving Rails, which don't fit the situation. This isn't for a web application (although a web app might access it later); I'm just trying to add the data to the database for now as a one-time thing to do some data analysis.
The Ruby file should be incredibly simple:
require 'rubygems'
require 'csv'    # or fastercsv?
require 'mysql'

db = Mysql.connect('localhost', 'root', '', 'databasename')
CSV.foreach('filename.csv') do |row|
  ?????
  db.query("INSERT INTO tablename ?????")
end
P.S. Much thanks in advance. Please, no answers that point to LOAD DATA INFILE or LOAD DATA LOCAL INFILE; I've already wasted enough hours trying to get those to work...
Regarding MySQL:
LOAD DATA INFILE '/complete/path/csvdata.csv' INTO TABLE mytable(column1,column2,...);
Regarding Ruby:
require 'csv'
require 'mysql'

db = Mysql.real_connect('localhost', 'root', 'password', 'database')

# CSV::Reader was the pre-1.9 API; CSV.foreach works on Ruby 1.9.2
CSV.foreach('filename.csv') do |row|
  # Escape and quote each field, then join them into a VALUES list
  values = row.map { |v| "'#{db.escape_string(v.to_s)}'" }.join(',')
  db.query("insert into table(column, column ...) values(#{values})")
end
db.close
It assumes the CSV file contains ALL the required columns.