I'm looking for a clever way to extract 500-plus rows of data from an Excel spreadsheet and enter it into my database.
The spreadsheet is like this
My table 'tbl_foot_teams' is set out as
id | name | rating
Quite simply, I need to get the two columns from the spreadsheet into the database fields name and rating.
Is there any efficient way to achieve this?
Entering them individually will take me a ridiculous amount of time!
Thanks
Save the Excel file as CSV and use the LOAD DATA INFILE command to import the data.
Your Excel file has no id field. Make the id field in the table AUTO_INCREMENT, and use a command like this:
LOAD DATA INFILE 'file_name.csv' INTO TABLE tbl_foot_teams
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\r\n'
-- IGNORE 1 LINES  -- uncomment the IGNORE clause if the csv file has column headers
(name, rating)
SET id = NULL; -- NULL lets AUTO_INCREMENT assign a unique id to each row
Also, have a look at the GUI Data Import tool (Excel or CSV format) in dbForge Studio for MySQL.
In phpMyAdmin you have an Import from Excel option.
If you don't have one, you may have Import from CSV, so just convert the spreadsheet to CSV.
If you have none of the above, you can write a PHP function that opens the text file, explodes it by rows, and then explodes each row into values.
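That last approach, sketched in Python instead of PHP (a rough sketch only; the file name and connection details are placeholders, and the table name comes from the question at the top):

import mysql.connector  # assumes the mysql-connector-python package

# Hypothetical connection details and file name; adjust to suit.
db = mysql.connector.connect(host="localhost", user="user",
                             password="password", database="mydb")
cursor = db.cursor()

with open("teams.csv") as f:
    next(f)  # skip the header row, if there is one
    for line in f:                                   # "explode" by rows
        name, rating = line.rstrip("\n").split(",")  # "explode" by values
        cursor.execute(
            "INSERT INTO tbl_foot_teams (name, rating) VALUES (%s, %s)",
            (name, rating),
        )

db.commit()
db.close()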
If we are talking about 50 rows, you can easily create a new column in the spreadsheet with a formula that concatenates your values into an INSERT statement. Something like:
=concat( "insert into tbl_foot_teams ( name , rating) values ( " , $b8 , " ...
Then copy and paste the calculated formula results into your database client.
You don't specify which database you're using, but an easy way to do this with MySQL would be to export the spreadsheet as a CSV file and then import it into MySQL with mysqlimport.
This is described in a comment on this MySQL page, from user Philippe Jausions:
If you are one of the many people trying to import a CSV file into MySQL using mysqlimport under the MS-Windows command/DOS prompt, try the following:

mysqlimport --fields-optionally-enclosed-by=""" --fields-terminated-by=, --lines-terminated-by="\r\n" --user=YOUR_USERNAME --password YOUR_DATABASE YOUR_TABLE.csv

Between quotes " and backslashes \ it can really give you a hard time finding the proper combination under Windows...

I usually run this command from the folder containing the YOUR_TABLE.csv file.

If you have a header in your .csv file with the names of the columns or other "junk" in it, just add --ignore-lines=X to skip the first X lines (i.e. --ignore-lines=1 to skip 1 line).

If your fields are (optionally) enclosed by double-quotes ", which are themselves doubled inside a value (i.e. a doubled double-quote "" = 1 double-quote "), then also use --fields-escaped-by=\ (the default) and NOT --fields-escaped-by="""
Working from the Excel end, you can use ADO; see, for example, Excel VBA: writing to mysql database:
Dim cn As ADODB.Connection
Dim strFile As String, strCon As String, strSQL As String

''Not the best way to get the name, just convenient for notes
strFile = Workbooks(1).FullName

''Connect to the workbook itself via the Jet OLE DB provider
strCon = "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=" & strFile _
   & ";Extended Properties=""Excel 8.0;HDR=Yes;IMEX=1"";"

Set cn = CreateObject("ADODB.Connection")
cn.Open strCon ''the connection must be opened before Execute

''For this to work, you must create a DSN and use its name in place of
''DSNName; however, you can also use the full connection string
strSQL = "INSERT INTO [ODBC;DSN=DSNName;].NameOfMySQLTable " _
   & "SELECT AnyField AS NameOfMySQLField FROM [Sheet1$];"
cn.Execute strSQL
After converting to a CSV file, you can import it into any database that supports importing from CSV files (MySQL, PostgreSQL, etc.), but you would have to do this from the command line:
Importing CSV files in PostgreSQL
Import CSV to Oracle table
MySQL 5.1 LOAD DATA INFILE syntax
Import CSV File into MSSQL
You can use a programming language such as Python, with one library for reading the Excel spreadsheet and another for interfacing with your SQL database.
You connect to the database, load the Excel file, loop through each row, and extract whichever column data you want. Then you execute an INSERT statement with that data.
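A minimal sketch of that approach, assuming the openpyxl and mysql-connector-python packages, that row 1 is a header, and that the data sits in the first two columns of the first sheet (file name, connection details, and table are placeholders):

import openpyxl
import mysql.connector

wb = openpyxl.load_workbook("teams.xlsx", read_only=True)
sheet = wb.active

db = mysql.connector.connect(host="localhost", user="user",
                             password="password", database="mydb")
cursor = db.cursor()

# Loop through each row and extract the two columns we want,
# skipping the header row on row 1.
for name, rating in sheet.iter_rows(min_row=2, max_col=2, values_only=True):
    cursor.execute(
        "INSERT INTO tbl_foot_teams (name, rating) VALUES (%s, %s)",
        (name, rating),
    )

db.commit()
db.close()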
If you have an installation of phpMyAdmin on a server, you can use the Import from CSV option, though you would first have to re-save your Excel spreadsheet as a CSV (Comma-Separated Values) file.
Related
I have multiple txt files in a directory and I want to insert all of them into MySQL; the content of each file should occupy a row. In MySQL, I have 2 columns: ID (auto increment) and LastName (nvarchar(45)). First I used INSERT INTO, and I got an error that I cannot insert too many lines. Then I used the following code, but it gives me error 1064. Here is the code:
import MySQLdb
import sys
import os

result = os.listdir("path")
for x in result:
    db = MySQLdb.connect("localhost", "root", "password", "myblog")
    cursor = db.cursor()
    file = os.path.join('path\\' + x)
    cursor.execute("LOAD DATA LOCAL INFILE file INTO clamp_test(ID, LastName)");
    file.close()
    db.commit()
    db.close
Can you please tell me what I am doing wrong? Is this the right way to insert multiple lines of an unstructured file into MySQL?
One issue is the string literal containing the SQL text
"LOAD DATA LOCAL INFILE file INTO ..."
^^^^
Here, file is just characters that are part of the literal string. It's not a reference to a variable to be evaluated.
The SQL statement we need to send to MySQL would need to look like
LOAD DATA LOCAL INFILE '/mydir/myfile.txt' INTO ...
                       ^^^^^^^^^^^^^^^^^^^
In the code, it looks like what we want to happen is to have the variable file evaluated, and the result of the evaluation incorporated into the SQL text string.
Something like this:
"LOAD DATA LOCAL INFILE '%s' INTO ... " % (file)
I would like to export each table of my SQLite3 database to CSV files for further manipulation with Python, and after that I want to export the CSV files into a different database format (PSQL). The ID column in SQLite3 is of type GUID, hence gibberish when I export tables to CSV as text:
l_yQ��rG�M�2�"�o
I know that there is a way to turn it into a readable format, since the SQLite Manager add-on for Firefox does this automatically, sadly without any reference regarding how or which query is used:
X'35B17880847326409E61DB91CC7B552E'
I know that QUOTE(ID) displays the desired hexadecimal string, but I don't know how to dump it to the CSV instead of the BLOB.
I found out what my error was: not why it doesn't work, but how to get around it.
So I tried to export my tables as stated in https://www.sqlite.org/cli.html , namely with the multiline commands, which didn't work:
sqlite3 'path_to_db'
.headers on
.mode csv
.output outfile.csv
SELECT statement
and so on.
I was testing a few things, and since I'm lazy while testing, I used the single-line variant, which got the job done:
sqlite3 -header -csv 'path_to_db' "SELECT QUOTE (ID) AS Hex_ID, * FROM Table" > 'output_file.csv'
Of course it would be better to specify all the column names instead of using *, but this suffices as an example.
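If you would rather skip the CSV round-trip for the later Python manipulation, the same QUOTE trick works through Python's sqlite3 module; a sketch, where the Name column and MyTable name are hypothetical stand-ins for your own schema:

import csv
import sqlite3

conn = sqlite3.connect("path_to_db")
cur = conn.cursor()

# quote(ID) renders the GUID blob in the readable X'...' hex form;
# "Name" and "MyTable" are hypothetical, substitute your own schema.
cur.execute("SELECT quote(ID) AS Hex_ID, Name FROM MyTable")

with open("output_file.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow([col[0] for col in cur.description])  # header row
    writer.writerows(cur)

conn.close()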
How can I load 10,000 rows from a test.xls file into a MySQL db table?
When I use the query below, it shows an error.
LOAD DATA INFILE 'd:/test.xls' INTO TABLE karmaasolutions.tbl_candidatedetail (candidate_firstname,candidate_lastname);
My primary key is candidateid, which has the properties below.
The test.xls file contains data like the following.
I have added rows starting from candidateid 61 because up to 60 there are already candidates in the table.
Please suggest a solution.
Export your Excel spreadsheet to CSV format.
Import the CSV file into MySQL using a command similar to the one you are currently trying:
LOAD DATA INFILE 'd:/test.csv'
INTO TABLE karmaasolutions.tbl_candidatedetail
(candidate_firstname,candidate_lastname);
Importing data from Excel (or any other program that can produce a text file) is very simple using the LOAD DATA command from the MySQL command prompt.
Save your Excel data as a csv file (in Excel 2007, using Save As).
Check the saved file using a text editor such as Notepad to see what it actually looks like, i.e. what delimiter was used etc.
Start the MySQL Command Prompt (I'm lazy, so I usually do this from the MySQL Query Browser - Tools - MySQL Command Line Client to avoid having to enter username and password etc.)
Enter this command:
LOAD DATA LOCAL INFILE 'C:\temp\yourfile.csv' INTO TABLE database.table FIELDS TERMINATED BY ';' ENCLOSED BY '"' LINES TERMINATED BY '\r\n' (field1, field2);
[Edit: Make sure to check your single quotes (') and double quotes (") if you copy and paste this code - it seems WordPress changes them into similar but different characters.]
Done! Very quick and simple once you know it :)
Some notes from my own import - may not apply to you if you run a different language version, MySQL version, Excel version, etc.
TERMINATED BY - this is why I included step 2. I thought a csv would default to comma-separated, but at least in my case semicolon was the default.
ENCLOSED BY - my data was not enclosed by anything, so I left this as an empty string ''.
LINES TERMINATED BY - at first I tried with only '\n' but had to add the '\r' to get rid of a carriage return character being imported into the database.
Also make sure that if you do not import into the primary key field/column, it has auto increment on; otherwise only the first row will be imported.
Original Author reference
I'm trying to migrate some MySQL tables to Amazon Redshift, but ran into some problems.
The steps are simple:
1. Dump the MySQL table to a csv file
2. Upload the csv file to S3
3. Copy the data file to Redshift
Error occurs in step 3:
The SQL command is:
copy TABLE_A from 's3://ciphor/TABLE_A.csv' CREDENTIALS
'aws_access_key_id=xxxx;aws_secret_access_key=xxxx' delimiter ',' csv;
The error info:
An error occurred when executing the SQL command: copy TABLE_A from 's3://ciphor/TABLE_A.csv' CREDENTIALS 'aws_access_key_id=xxxx;aws_secret_access_key=xxxx ERROR: COPY CSV is not supported [SQL State=0A000] Execution time: 0.53s 1 statement(s) failed.
I don't know if there are any limitations on the format of the csv file, say the delimiters and quotes; I cannot find it in the documentation.
Can anyone help?
The problem was finally resolved by using:
copy TABLE_A from 's3://ciphor/TABLE_A.csv' CREDENTIALS
'aws_access_key_id=xxxx;aws_secret_access_key=xxxx' delimiter ','
removequotes;
More information can be found here http://docs.aws.amazon.com/redshift/latest/dg/r_COPY.html
Now Amazon Redshift supports the CSV option for the COPY command. It's better to use this option to import CSV-formatted data correctly. The format is shown below.
COPY [table-name] FROM 's3://[bucket-name]/[file-path or prefix]'
CREDENTIALS 'aws_access_key_id=xxxx;aws_secret_access_key=xxxx' CSV;
The default delimiter is ( , ) and the default quote character is ( " ). You can also import TSV-formatted data with the CSV and DELIMITER options, like this:
COPY [table-name] FROM 's3://[bucket-name]/[file-path or prefix]'
CREDENTIALS 'aws_access_key_id=xxxx;aws_secret_access_key=xxxx' CSV DELIMITER '\t';
There is a disadvantage to the old way (DELIMITER and REMOVEQUOTES): REMOVEQUOTES does not support having a newline or a delimiter character within an enclosed field. If the data can include these kinds of characters, you should use the CSV option.
See the following link for the details.
http://docs.aws.amazon.com/redshift/latest/dg/r_COPY.html
If you want to save yourself some code and you have a very basic use case, you can use Amazon Data Pipeline.
It starts a spot instance, performs the transformation within the Amazon network, and is a really intuitive tool (but very simple, so you can't do complex things with it).
You can try this:
copy TABLE_A from 's3://ciphor/TABLE_A.csv' CREDENTIALS 'aws_access_key_id=xxxx;aws_secret_access_key=xxxx' csv;
CSV itself means comma-separated values, so there is no need to provide a delimiter with this option. Please refer to the link:
http://docs.aws.amazon.com/redshift/latest/dg/copy-parameters-data-format.html#copy-format
I always use this code:
COPY clinical_survey
FROM 's3://milad-test/clinical_survey.csv'
iam_role 'arn:aws:iam::123456789123:role/miladS3xxx'
CSV
IGNOREHEADER 1
;
Description:
1- COPY is followed by the name of your target table
2- FROM gives the address of the file stored in S3
3- iam_role is a substitute for CREDENTIALS. Note that the iam_role should be defined in the IAM management menu of your console, and then in the trust menu assigned to the user as well (that is the hardest part!)
4- CSV uses the comma delimiter
5- IGNOREHEADER 1 is a must! Otherwise it will throw an error (it skips one row of the CSV and considers it a header)
Since the resolution has already been provided, I'll not repeat the obvious.
However, in case you receive some further error which you're not able to figure out, simply execute this on your workbench while connected to any of your Redshift accounts:
select * from stl_load_errors [where ...];
stl_load_errors contains all the Amazon RS load errors in a historical fashion, where a normal user can view details corresponding to his or her own account, but a superuser has access to all of them.
The details are captured elaborately at :
Amazon STL Load Errors Documentation
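For instance, a minimal psycopg2 sketch (connection details are placeholders) that pulls the most recent load errors:

import psycopg2

# Placeholder connection details; point these at your Redshift cluster.
conn = psycopg2.connect(host="mycluster.xxxx.redshift.amazonaws.com",
                        port=5439, dbname="mydb",
                        user="myuser", password="mypassword")
cur = conn.cursor()

# A few of the most useful stl_load_errors columns, newest first.
cur.execute("""
    select starttime, filename, line_number, colname, err_reason
    from stl_load_errors
    order by starttime desc
    limit 10;
""")
for row in cur.fetchall():
    print(row)

conn.close()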
A little late to comment, but it can be useful:
You can use an open-source project to copy tables directly from MySQL to Redshift: sqlshift.
It only requires Spark, and if you have YARN then that can be used as well.
Benefits: it will automatically decide the distkey and interleaved sortkey using the primary key.
It looks like you are trying to load a local file into a Redshift table.
The CSV file has to be on S3 for the COPY command to work.
If you can extract data from the table to a CSV file, you have one more scripting option. You can use a Python/boto/psycopg2 combo to script your CSV load to Amazon Redshift.
In my MySQL_To_Redshift_Loader I do the following:
Extract data from MySQL into a temp file:
from subprocess import Popen, PIPE

# Build the mysql client invocation from the script's options.
loadConf = [db_client_dbshell, '-u', opt.mysql_user, '-p%s' % opt.mysql_pwd,
            '-D', opt.mysql_db_name, '-h', opt.mysql_db_server]
...
# SELECT ... INTO OUTFILE dumps the query result to a delimited file.
q = """
%s %s
INTO OUTFILE '%s'
FIELDS TERMINATED BY '%s'
ENCLOSED BY '%s'
LINES TERMINATED BY '\r\n';
""" % (in_qry, limit, out_file, opt.mysql_col_delim, opt.mysql_quote)

# Pipe the generated SQL into the mysql client.
p1 = Popen(['echo', q], stdout=PIPE, stderr=PIPE, env=env)
p2 = Popen(loadConf, stdin=p1.stdout, stdout=PIPE, stderr=PIPE)
...
Compress and load the data to S3 using the boto Python module and multipart upload:
import boto
from boto.s3.key import Key

conn = boto.connect_s3(AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY)
bucket = conn.get_bucket(bucket_name)
k = Key(bucket)
k.key = s3_key_name
# Upload with a progress callback; reduced_redundancy toggles the
# cheaper S3 storage class.
k.set_contents_from_file(file_handle, cb=progress, num_cb=20,
                         reduced_redundancy=use_rr)
Use the psycopg2 COPY command to append the data to the Redshift table:
sql="""
copy %s from '%s'
CREDENTIALS 'aws_access_key_id=%s;aws_secret_access_key=%s'
DELIMITER '%s'
FORMAT CSV %s
%s
%s
%s;""" % (opt.to_table, fn, AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY,opt.delim,quote,gzip, timeformat, ignoreheader)
I'm working with a client who has an existing system, built on what is apparently a Paradox database. I've got the database, in the form of a zip file containing .DB, .MB and .PX files, one for each table.
I need to take (some of) this data and import it into a Web application that's using MySQL. Does anybody have a way for me to extract this data that doesn't involve installing Paradox?
If not, does Paradox export in some readable format? Either as SQL or something that can be parsed reasonably easily? The person in charge of this system for my client is a volunteer (they're a non-profit), so I'd like to go to him with a solution - because last time I asked for the data, I got this, which is clearly no good.
The Wikipedia article about Paradox lists two other tools that might be of interest, both under the GPL license:
pxlib: Library to read and write Paradox databases
pxtools: convert a Paradox-database into a SQL-database
And if you have Delphi and want to write a converter yourself (which would need the BDE to work), you can take a look at this article or at the source code of ConvertCodeLib on this website. Both make use of TClientDataSet, which can write a CDS (binary format) or an XML file.
Both the Paradox for DOS and Paradox for Windows platforms will export data tables in Delimited Text, Fixed-Length Text, and Lotus 1-2-3 formats. The older Paradox for DOS also writes Lotus Symphony, while the slightly less antique Paradox for Windows does a passable Excel 5.
However, someone will have to sit down and export the tables one by one, or write a script to do it. Of course you'd need to have Paradox installed to write the script.
-Al.
MS has instructions for using the MS Jet driver to read data from files produced by Paradox 3-5. That can act as (at least) an ODBC driver, so you can use it to read a Paradox file from just about anything that knows how to use ODBC.
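One hedged possibility from Python is pyodbc; a sketch under several assumptions: you're on Windows with the 32-bit Jet/Paradox ODBC driver installed, a 32-bit Python to match, and the driver, directory, and table names below match your machine (check pyodbc.drivers() for the exact driver string):

import pyodbc

# The driver name, directory, and table name are all assumptions.
conn = pyodbc.connect(
    r"Driver={Microsoft Paradox Driver (*.db )};"
    r"DefaultDir=C:\paradox\data;DBQ=C:\paradox\data;"
)
cursor = conn.cursor()

# Each .DB file in the directory appears as a table.
for row in cursor.execute("SELECT * FROM Messages"):
    print(row)

conn.close()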
You have a few options:
Get your hands on the original Paradox software and use it to export the database into CSV format. Unfortunately, Borland no longer sells it, and the most recent version doesn't run well on Windows XP or above.
Access the database using either a Paradox or dBase/xBase ODBC driver. Paradox and xBase are very similar, so you may be able to extract the data using drivers meant for either of them. You may be able to get a Paradox ODBC driver somewhere on firebirdsql.org.
Use Borland Delphi to write a program which will export the data you need. As someone else mentioned, you can get a free version called Turbo Explorer. You will also have to install the BDE separately, as it doesn't come with Turbo Explorer.
I've been working on a gigantic data migration from Paradox to MySQL. My general approach has been to export CSV files from Paradox and then import the CSV files from the MySQL command line. However, this system breaks down when there are M (memo) fields in Paradox, because that data doesn't get pulled into the CSV file as expected.
Here's my long-winded process for getting Paradox data into MySQL, hopefully it helps somebody!
Open the Paradox file in Paradox and export it to a dBase (.dbf) file. This exports the memo data into dBase's blob format.
Open the .dbf file in Paradox. It might be necessary to convert double fields to long integer or number before opening the file in dbfviewer; the double format appears not to work. Save the file.
Use this program to open up the dbase file and then export to Excel: http://dbfviewer.org/
Export -> XLS-File … this opens it in Excel
Now we need to create a macro, because Excel doesn't have any native way to enclose CSV fields with quotes or anything else. I've pasted the macro below, but here are the reference sites that I found. One site had better instructions but corrupted text:
http://www.mrexcel.com/forum/showthread.php?320531-export-as-csv-file-enclosed-quotes
http://www.markinns.com/articles/full/export_excel_csvs_with_double_quotes/
In Excel, replace all " with ' via Find and Replace; any " characters inside records will mess things up.
In Excel, press ALT-F11 to open the macro editor.
Insert -> Module
Create this macro to save CSV files with every field enclosed in double quotes:
Sub CSVFile()
    Dim SrcRg As Range
    Dim CurrRow As Range
    Dim CurrCell As Range
    Dim CurrTextStr As String
    Dim ListSep As String
    Dim FName As Variant

    FName = Application.GetSaveAsFilename("", "CSV File (*.csv), *.csv")
    If FName <> False Then
        ListSep = Application.International(xlListSeparator)
        ' Export the current selection, or the whole used range if
        ' nothing bigger than one cell is selected
        If Selection.Cells.Count > 1 Then
            Set SrcRg = Selection
        Else
            Set SrcRg = ActiveSheet.UsedRange
        End If
        Open FName For Output As #1
        For Each CurrRow In SrcRg.Rows
            CurrTextStr = ""
            ' Wrap every cell value in double quotes
            For Each CurrCell In CurrRow.Cells
                CurrTextStr = CurrTextStr & """" & CurrCell.Value & """" & ListSep
            Next
            ' Strip trailing separators
            While Right(CurrTextStr, 1) = ListSep
                CurrTextStr = Left(CurrTextStr, Len(CurrTextStr) - 1)
            Wend
            Print #1, CurrTextStr
        Next
        Close #1
    End If
End Sub
Then Run -> Run Macro
Set up target MySQL db schema with text fields where we want the blobs to go
From the MySQL command line, here's an example of how to do the import:
LOAD DATA LOCAL INFILE 'C:/data.csv'
INTO TABLE table_name
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\r\n'
(column1, column2)
Paradox is a native format for the Borland Database Engine, which is included with various Delphi programming products. Ownership has changed hands at least once recently, but at one point there were free "Express" versions of Delphi available that would let you write a simple program to export this stuff. If a free version is no longer available, the lowest available SKU should include BDE functionality.
Using MS Access 2007, you can import Paradox 7 and below using the BDE distribution included with the free Paradox Database Editor program (google it). Use a connection such as:
DoCmd.TransferDatabase acImport, "ODBC Database", _
    "Paradox 3.X;HDR=NO;IMEX=2;ACCDB=YES;DATABASE=C:\apache\Archive;TABLE=Messages#db", _
    acTable, DailyArchiveName, "MyDatabase" 'acTable (not acReport), since a data table is being imported