I'm using Postgres and MySQL, and I want to migrate data from Postgres to MySQL.
I export the data from Postgres to CSV and then reimport it into MySQL.
The CSV exported from Postgres is named test.csv; the data column has the Postgres type json.
test.csv
id,name,data
"1","someName","{""email"": ""a#gmail.com"", ""country"": ""US""}"
In MySQL I reimport it into a table with the same columns using:
LOAD DATA LOCAL INFILE '/tmp/test.csv' INTO TABLE someTable FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n' IGNORE 1 LINES;
The import succeeds, but the data column is NULL, and I don't know why.
If I use the command:
LOAD DATA LOCAL INFILE '/tmp/test.csv' INTO TABLE someTable fields terminated by ',' optionally enclosed by '\"' escaped by '\'' lines terminated by '\n' IGNORE 1 LINES;
MySQL throws this exception:
Invalid JSON text: "The document root must not be followed by other values." at position 4 in value for column 'someTable.data'.
How can I import a CSV with a JSON column into MySQL? Please help, thank you.
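Outside the original thread, here's a quick way to confirm that the doubled quotes Postgres writes inside the JSON column are the issue. Python's csv module un-doubles quotes inside a quoted field the same way MySQL does when the fields are ENCLOSED BY '"' with an empty escape character, so a small sketch using the sample row above shows the JSON parsing cleanly once the quoting is handled:

```python
import csv
import io
import json

# Simulate the file Postgres exported: quotes inside the JSON value are doubled.
raw = '''id,name,data
"1","someName","{""email"": ""a#gmail.com"", ""country"": ""US""}"
'''

# csv.reader collapses each doubled "" inside a quoted field into a single ",
# analogous to LOAD DATA with ENCLOSED BY '"' and ESCAPED BY ''.
rows = list(csv.reader(io.StringIO(raw)))
header, record = rows[0], rows[1]

data = json.loads(record[2])  # the JSON column now parses cleanly
print(data["country"])        # -> US
```

If this matches your file, the LOAD DATA fix should be to keep ENCLOSED BY '"' and set ESCAPED BY '' (empty), so MySQL reads a doubled "" as one literal quote instead of applying the backslash/quote escaping that produced the "document root must not be followed by other values" error.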
Related
I'm trying to import data from a CSV file into a MySQL database table that contains a column of type float.
When I use the command:
LOAD DATA INFILE 'd:/dropbox/Dropbox/applications/glimp/f.csv'
INTO TABLE iot_dataa
FIELDS TERMINATED BY ';'
LINES TERMINATED BY '\n'
(duration,date,fields1,fields2,fields3,fields4,rate,fields5,fields6,deviceid);
The float column ends up with a value of '0' on every line.
(Screenshots of the CSV file and the MySQL table definition were attached here.)
LOAD DATA INFILE 'returns_inprogress_06-Feb-2016-11-35.csv'
IGNORE INTO TABLE fkreturn
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
ESCAPED BY '"'
LINES TERMINATED BY '\n'
(`FSN`,`Product`,`Order Id`,`ORDER ITEM ID`,`SKU Code`,`Return ID`,`Return Created On`,`Return Action`,`Return Type`,`Is FA`,`New Order Item Id`,`Sub Reason`,`Comments`,`Shipment Status`,`Tracking Id`,`Good Quantity`,`Bad Quantity`,`Total Price`,`Quantity`);
When executed, it gives the error "#1290 - The MySQL server is running with the --secure-file-priv option so it cannot execute this statement". I am using WAMP on Windows, and the CSV file is in the www folder. All I want is to ignore duplicates of the primary key (Return ID) and upload the rest. Should this field also be marked as unique, or is there another easy way to upload a CSV into the table while ignoring duplicates?
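Not from the original thread, but two notes may help. The #1290 error usually means the server only reads files from the directory shown by SHOW VARIABLES LIKE 'secure_file_priv'; moving the file there, or using LOAD DATA LOCAL INFILE, typically avoids it. For the duplicates, one option outside MySQL is to pre-filter the CSV before loading it. A sketch with the standard csv module, using a made-up three-column sample (the real file has the full column list shown above):

```python
import csv
import io

# Hypothetical sample with a duplicate "Return ID"; only three of the
# real file's columns are shown for brevity.
raw = '''FSN,Product,Return ID
f1,widget,R100
f2,gadget,R101
f3,widget,R100
'''

seen = set()
unique_rows = []
reader = csv.reader(io.StringIO(raw))
header = next(reader)
rid_idx = header.index("Return ID")

for row in reader:
    if row[rid_idx] in seen:
        continue          # skip repeated Return IDs, keeping the first occurrence
    seen.add(row[rid_idx])
    unique_rows.append(row)

print(len(unique_rows))   # 2
```

Writing unique_rows back out to a new CSV and loading that file sidesteps the need for a unique index, though marking Return ID unique and using IGNORE also works.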
Is there any way to export database records as a CSV file from MySQL Workbench using command-line syntax?
SELECT orderNumber, status, orderDate, requiredDate, comments
FROM orders
WHERE status = 'Cancelled'
INTO OUTFILE 'C:/tmp/cancelled_orders.csv'
FIELDS ENCLOSED BY '"' TERMINATED BY ';' ESCAPED BY '"'
LINES TERMINATED BY '\r\n';
more information here:
http://www.mysqltutorial.org/mysql-export-table-to-csv/
MySQL Workbench comes with a table data export function, which supports CSV and JSON formats. You can start it from the context menu in the schema tree:
The simple wizard then allows you to select the output data type:
Similarly, there's a table data import wizard that can read CSV and JSON data.
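As an aside not in the original answers: INTO OUTFILE writes on the server's filesystem, so a client-side alternative is to fetch the rows and write them with a CSV library. A sketch using Python's standard library, with sqlite3 standing in for the database connection (with a real MySQL driver such as mysql-connector-python the pattern is the same; only the connect call changes):

```python
import csv
import io
import sqlite3

# In-memory stand-in for the orders table from the query above.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (orderNumber INTEGER, status TEXT)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [(1, "Cancelled"), (2, "Shipped"), (3, "Cancelled")])

buf = io.StringIO()
# delimiter=";" and QUOTE_ALL mirror TERMINATED BY ';' / ENCLOSED BY '"'.
writer = csv.writer(buf, delimiter=";", quoting=csv.QUOTE_ALL)
cur = conn.execute(
    "SELECT orderNumber, status FROM orders WHERE status = 'Cancelled'")
writer.writerow([col[0] for col in cur.description])  # header row
writer.writerows(cur.fetchall())

print(buf.getvalue())
```

In a real script, io.StringIO would be replaced by open('cancelled_orders.csv', 'w', newline='').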
I'm using the Ubuntu 14.04 operating system and I want to import an Excel sheet (.xls) into the database.
I am using the phpMyAdmin MySQL administration tool. If anyone can give me the commands to import XLS into MySQL, I would be grateful.
Import a CSV file into a MySQL table
Change the file type to CSV
Create a table matching the Excel column names
Place the CSV file in /var/lib/mysql
Open the phpMyAdmin SQL command pane and execute this SQL command:
LOAD DATA INFILE 'testlist.csv'
INTO TABLE TEST
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 ROWS;
Use the load data capability. See
http://dev.mysql.com/doc/refman/5.1/en/load-data.html
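Beyond the original answer: LOAD DATA is sometimes unavailable from phpMyAdmin (for example when LOCAL is disabled by the server), in which case the same import can be scripted client-side. A minimal sketch with Python's standard library, using sqlite3 as a stand-in for the MySQL connection and a made-up testlist.csv matching the TEST table above:

```python
import csv
import io
import sqlite3

# Hypothetical testlist.csv contents, matching the TEST table from the answer.
raw = '''id,name
1,"alpha"
2,"beta"
'''

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE TEST (id INTEGER, name TEXT)")

reader = csv.reader(io.StringIO(raw))
next(reader)  # the equivalent of IGNORE 1 ROWS: skip the header line
conn.executemany("INSERT INTO TEST VALUES (?, ?)", list(reader))

count = conn.execute("SELECT COUNT(*) FROM TEST").fetchone()[0]
print(count)  # 2
```

With a MySQL driver, only the connect call and the placeholder style (%s instead of ?) would change.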
Export from a MySQL table to a CSV file
SELECT ... FROM ... WHERE ...
INTO OUTFILE 'testlist.csv'
FIELDS TERMINATED BY ','
Here is an example that produces a file in the comma-separated values (CSV) format used by many programs:
SELECT * INTO OUTFILE '/tmp/result.txt'
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n'
FROM test_table;
I'm migrating an old Sybase database to a new MySQL db.
Since Sybase can export data to .dat files (similar to CSV), I decided to use that.
The problem is that Sybase uses commas as the column separator, and commas inside strings are supposed to be ignored because the strings are enclosed in single quotes; MySQL, however, does not treat them that way.
Is there a way to solve the problem? Here's my query:
LOAD DATA LOCAL INFILE 'C:\\UNLOAD\\166.dat'
INTO TABLE linea_col
COLUMNS TERMINATED BY ','
LINES TERMINATED BY '\r\n';
Thank you in advance.
If they are enclosed by single quotes try this:
LOAD DATA LOCAL INFILE 'C:\\UNLOAD\\166.dat'
INTO TABLE linea_col
COLUMNS TERMINATED BY ','
ENCLOSED BY '\''
LINES TERMINATED BY '\r\n';
If that doesn't work, then try ENCLOSED BY '''' instead, but I'm 99.99% certain the first is correct.
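To add a quick check not in the original answer: Python's csv module can confirm the single-quote enclosure handles embedded commas, since quotechar="'" mirrors the ENCLOSED BY '\'' clause above. A sketch with a made-up .dat line:

```python
import csv
import io

# A line shaped like the Sybase .dat export: single-quote-enclosed strings,
# one of which contains a comma.
raw = "'1','Rossi, Mario','Milano'\r\n"

# quotechar="'" plays the role of MySQL's ENCLOSED BY '\''.
rows = list(csv.reader(io.StringIO(raw), quotechar="'"))
print(rows[0])  # ['1', 'Rossi, Mario', 'Milano']
```

If the comma stays inside the second field here, the LOAD DATA statement with ENCLOSED BY '\'' should parse the real file the same way.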
I'm not familiar with how DAT files may differ from CSV, but you can use CSV as the common format.
You can export from Sybase using BCP and forcing the output to use CSV format
bcp database.dbo.tbl_name out c:\temp\output.csv -c -t, -U user_name -P password -S server_name
Then import into MySQL with
load data local infile 'uniq.csv' into table tblUniq fields terminated by ','
enclosed by '"'
lines terminated by '\n'
(uniqName, uniqCity, uniqComments)