I am trying to import CSV data into MySQL using the LOAD DATA LOCAL INFILE syntax. This is normally a fairly simple task, but in this case the data includes a geometry field that is tripping me up.
When I try to run the import, I'm getting errors like this:
SQLSTATE[HY000]: General error: 4079 Illegal parameter data type longblob for operation 'st_geometryfromwkb'
The records in my CSV file look like this:
'Somewhere', -0.574823, 51.150771, '0x0101000000000000000000F03F000000000000F0BF'
So I have a location name, lat/long coords and a geometry field in binary WKB format. (The example above is a simple geometry that translates to POINT(1 -1); the real data has complex polygons, but the content isn't relevant; the issue is the same with this simple example.)
My table looks like this:
CREATE TABLE IF NOT EXISTS `mapping` (
`id` int AUTO_INCREMENT PRIMARY KEY,
`location` varchar(80) DEFAULT NULL,
`longitude` double DEFAULT NULL,
`latitude` double DEFAULT NULL,
`geom` geometry NOT NULL,
INDEX mapping_by_location (location),
SPATIAL KEY `mapping_by_geom` (`geom`)
) ENGINE=MyISAM DEFAULT CHARSET=utf8;
And my import query looks like this:
LOAD DATA LOCAL INFILE '{$file}'
REPLACE INTO TABLE `mapping`
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '\"'
LINES TERMINATED BY '\n'
(@col1, @col2, @col3, @col4)
SET
`location` = @col1,
`latitude` = @col2,
`longitude` = @col3,
`geom` = GeomFromWKB(@col4);
As stated, with this import query, I am getting the Illegal parameter data type error shown at the top of this question.
However, the query works if I replace the final line with a hard-coded geometry string, like this:
`geom` = GeomFromWKB(0x0101000000000000000000F03F000000000000F0BF);
Obviously this isn't any good, as I need the field to load from the CSV, not a hard-coded value in the query; but it does work, whereas loading the same value from the CSV in @col4 does not.
I have tried a bunch of variations on this query - with and without the call to GeomFromWKB(), with both X'...' and 0x... notations for the hex value; nothing seems to work.
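For example, one of the "without GeomFromWKB()" variations simply assigned the raw column value, i.e. the final line of the statement above became:

`geom` = @col4;

(shown only to illustrate the kind of change I mean; it doesn't work either).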
Can anyone give me some help please?
Related
I'm working on a new web project right now, but the data is stored in an Excel file and I don't want to add it to the database manually. Do you think this is possible?
There are a few ways of doing it:
You can use LOAD DATA.
Let's say you have the table below:
CREATE TABLE `set_of_data` (
`id` int NOT NULL AUTO_INCREMENT,
`x` varchar(10) DEFAULT NULL,
`y` varchar(10) DEFAULT NULL,
PRIMARY KEY (`id`)
) ENGINE=InnoDB ;
Your Excel file should be saved in .csv format.
Then you can use LOAD DATA:
LOAD DATA INFILE '/var/lib/mysql/your_data.csv' -- path of your file on the server; it could be '/var/lib/mysql-files/your_data.csv'
IGNORE INTO TABLE set_of_data
FIELDS TERMINATED BY ';'
LINES TERMINATED BY '\n'
IGNORE 1 ROWS
(id,x,y);
Another way is to create an Excel formula that builds the INSERT statements for your data and run them yourself.
This is for small tables, without much data.
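A rough sketch of that formula approach, assuming the x values sit in column A and the y values in column B of the sheet (the column letters and sample values are made up for illustration): put a formula like the one in the comment below into an empty column, copy it down, and paste the generated text into your MySQL client.

-- Helper-column formula in Excel (e.g. in C2, then copied down):
--   ="INSERT INTO set_of_data (x, y) VALUES ('" & A2 & "', '" & B2 & "');"
-- Pasting the generated column into a MySQL client then runs statements like:
INSERT INTO set_of_data (x, y) VALUES ('1.2', '3.4');
INSERT INTO set_of_data (x, y) VALUES ('5.6', '7.8');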
I am trying to load data for Q1 2012 from the below link
https://s3.amazonaws.com/capitalbikeshare-data/index.html
My code is as follows:-
DROP DATABASE IF EXISTS bike;
CREATE DATABASE bike;
USE bike;
DROP TABLE IF EXISTS bike_2012;
CREATE TABLE bike_2012(
bike_duration INT NULL,
bike_start_date TIMESTAMP NULL,
bike_end_date TIMESTAMP NULL,
bike_s_station_no INT(5) NULL,
bike_s_station_name VARCHAR(255) NULL,
bike_e_station_no INT(5) NULL,
bike_e_station_name VARCHAR(255) NULL,
bike_number CHAR(6) NULL,
bike_member_type VARCHAR(25) NULL,
bike_ride_number INT auto_increment PRIMARY KEY);
LOAD DATA LOCAL INFILE 'C:/LAGASA_2018/MSBA/Data_Sources/2012-capitalbikeshare-tripdata/2012Q1-capitalbikeshare-tripdata.csv'
INTO TABLE bike_2012
FIELDS TERMINATED BY ',' ENCLOSED BY '"' LINES TERMINATED BY '/n'
IGNORE 1 LINES
(bike_duration, @bike_start_date, @bike_end_date, bike_s_station_no, bike_s_station_name,
bike_e_station_no, bike_e_station_name, bike_number, bike_member_type)
SET bike_start_date = STR_TO_DATE(@bike_start_date, '%c/%e/%Y'),
bike_end_date = STR_TO_DATE(@bike_end_date, '%c/%e/%Y');
SELECT * FROM bike_2012 LIMIT 10;
I am facing the following issues:-
Some columns that have integer data also have string data, so those parts are not getting loaded correctly. I tried to add OPTIONALLY ENCLOSED BY '"' but it's not working (my attempt is sketched just after this list).
Unable to change date to SQL date format
Other errors like Row doesn't contain data for all columns and data truncated for date columns are appearing.
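For reference, the OPTIONALLY ENCLOSED BY attempt only changed the FIELDS clause of the statement above to:

FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'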
I have been struggling to correct this. Please help.
Thanks and Regards
You won't be able to simply load a malformed CSV into the DB and fix it afterwards.
If you have access to PHP, Python, or another language that has a driver to connect to your DB engine, load that file into an array, or use something similar to fgets() in PHP to read it line by line, and process each row separately: fix/convert the data and then push it to the DB engine (I would even suggest grouping the inserts for speed).
You are dealing not only with conversion; there might also be issues with string encoding (you didn't specify any in your CREATE TABLE, which might cause a problem in itself).
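A minimal sketch of both points, assuming a trimmed-down version of the bike table (the table name, column subset, and sample values here are made up purely for illustration):

-- Declare the character set explicitly instead of relying on the server default:
CREATE TABLE bike_2012_clean (
bike_duration INT NULL,
bike_start_date TIMESTAMP NULL,
bike_member_type VARCHAR(25) NULL
) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4;

-- "Grouping inserts" = one statement carrying many already-cleaned rows:
INSERT INTO bike_2012_clean (bike_duration, bike_start_date, bike_member_type)
VALUES (221, '2012-01-01 00:04:00', 'Member'),
(1421, '2012-01-01 00:10:00', 'Casual');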
I am trying to import a small data set of Berlin street addresses using MySQL's LOAD DATA statement. The problem is that after the import runs, all of the beautiful ß characters in the German street names have become "ÃŸ" character pairs.
Here's the create-table statement I used for this table:
CREATE TABLE `subway_distances` (
`STN` varchar(255) DEFAULT NULL,
`HNR` int(9) DEFAULT NULL,
`Lat` decimal(36,15) DEFAULT NULL,
`Lon` decimal(36,15) DEFAULT NULL,
`Distance` decimal(45,20) DEFAULT NULL
) ENGINE=MyISAM DEFAULT CHARSET=utf8
... and here is my MySQL shell code:
charset utf8;
TRUNCATE TABLE subway_distances;
LOAD DATA LOCAL INFILE '/path/to/output.csv'
INTO TABLE berlin.subway_distances
FIELDS TERMINATED BY ',' ENCLOSED BY '"' ESCAPED BY '\\';
SELECT * FROM subway_distances LIMIT 0,10;
I have looked at output.csv in vim, and the eszett character appears to be fine there.
I am assuming that I simply need a different encoding declaration in MySQL, but I'm not sure where to start.
I am also assuming that collation doesn't matter yet, since I'm not comparing values -- just purely trying to get a valid import.
I found an answer to this relatively quickly. It looks like I just need to specify the CHARACTER SET value in my LOAD DATA statement. So the new statement looks like this:
LOAD DATA LOCAL INFILE '/path/to/output.csv'
INTO TABLE berlin.subway_distances
CHARACTER SET 'utf8'
FIELDS TERMINATED BY ',' ENCLOSED BY '"' ESCAPED BY '\\';
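As a quick sanity check after re-running the import with that statement (assuming the street names live in the STN column; adjust as needed):

-- Should now show intact ß characters instead of mangled byte pairs:
SELECT STN FROM berlin.subway_distances WHERE STN LIKE '%ß%' LIMIT 10;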
I've searched all over and can't seem to figure this out, so here is my first Stack Exchange question.
I'm using a Java program to run the bulk load process, but I've also tried it straight from my SQL client, MySQL Workbench, and I get the same error:
LOAD DATA INFILE '/path/to/file/infile.csv'
INTO TABLE t1
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
(category, item, date_time, v1, v2, v3);
Error:
Error Code: 1062. Duplicate entry ''Book'-'Fiction'-2014-04-16 09:33:00' for key 'PRIMARY'
Using my SQL client I've confirmed that there is no such current record in the table; in fact I don't have any records for the same category-item pair in the same month. I have many (~16,000) CSV files to load into my MySQL database each month; each file corresponds to a separate category-item pair with different values over the course of the month. I have been successful with this method so far, having loaded over 50 million records, however I can't seem to load any more without getting this same error.
My table uses 3 fields to create the PRIMARY KEY: 2 VARCHARs and a DATETIME.
CREATE TABLE `t1` (
`category` varchar(10) NOT NULL,
`item` varchar(15) NOT NULL DEFAULT '',
`date_time` datetime NOT NULL,
`v1` double DEFAULT NULL,
`v2` double DEFAULT NULL,
`v3` double DEFAULT NULL,
PRIMARY KEY (`category`,`item`,`date_time`)
) ENGINE=InnoDB DEFAULT CHARSET=latin1
I have worked with databases in the past, but nowhere near this many records; I don't know if that is the problem.
I could switch to using an auto-incremented id for my primary key, but it may take up more room considering the large number of records, and I may get duplicates for my (category, item, date_time), which would be problematic.
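For reference, the switch I'm considering would look something like this (the table name is just a placeholder; same columns as above, and note that without the composite key nothing would stop duplicate (category, item, date_time) rows, which is exactly my worry):

CREATE TABLE `t1_alt` (
`id` bigint unsigned NOT NULL AUTO_INCREMENT,
`category` varchar(10) NOT NULL,
`item` varchar(15) NOT NULL DEFAULT '',
`date_time` datetime NOT NULL,
`v1` double DEFAULT NULL,
`v2` double DEFAULT NULL,
`v3` double DEFAULT NULL,
PRIMARY KEY (`id`)
) ENGINE=InnoDB DEFAULT CHARSET=latin1;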
I know that MySQL permits a “relaxed” format for values specified as strings, and I may need to do some additional formatting to figure this out.
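As I understand it, the “relaxed” format just means any punctuation character can act as the delimiter and one-digit parts are accepted; for example, these all resolve to the same DATETIME value:

SELECT CAST('2014-04-16 09:33:00' AS DATETIME),
CAST('2014/4/16 9:33:00' AS DATETIME),
CAST('2014.04.16 09:33:00' AS DATETIME);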
I deleted the first line of my csv file with the value ''Book'-'Fiction'-2014-04-16 09:33:00', but then I get the same 1062 error for the next date time value ''Book'-'Fiction'-2014-04-16 09:35:00'.
I thought it might be the way I am formatting my DATETIME string, but I am using the "YYYY-MM-DD HH:MM:SS" format, which has worked on thousands of other LOAD DATA INFILE runs. Just to be safe I tried using the STR_TO_DATE() function; see below.
LOAD DATA INFILE '/path/to/file/infile.csv'
INTO TABLE t1
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
(category, item, @date_var, v1, v2, v3)
SET date_time = STR_TO_DATE(@date_var, '%Y-%m-%d %H:%i:%s');
Any help would be appreciated.
I was wondering how to import a text file into MySQL Workbench.
I have a text file delimited by | where the first row contains the column names,
FEATURE_ID|FEATURE_NAME|FEATURE_CLASS
and it is followed by the data after that:
1388627|Etena|Populated Place
What is the best way to import this .txt file into MySQL workbench?
Thanks!
It's not clear what exactly you intend to achieve, but if you want to import a delimited text file into the DB then you can use LOAD DATA INFILE like this:
LOAD DATA INFILE '/path/file.txt'
INTO TABLE tablename
FIELDS TERMINATED BY '|'
LINES TERMINATED BY '\n'
IGNORE 1 LINES;
UPDATE:
First, of course, you need to create the table (if it's not done yet), like this:
CREATE TABLE `tablename` (
`FEATURE_ID` int(11) unsigned NOT NULL,
`FEATURE_NAME` varchar(512) DEFAULT NULL,
`FEATURE_CLASS` varchar(512) DEFAULT NULL,
PRIMARY KEY (`FEATURE_ID`)
)
You might need to adjust data types, lengths, and constraints on that table. For example, you might not need a PK on that table.
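For instance, a trimmed-down variant along those lines, dropping the PK and shortening the class column (the exact length is just a guess; adjust it to your data):

CREATE TABLE `tablename` (
`FEATURE_ID` int(11) unsigned NOT NULL,
`FEATURE_NAME` varchar(255) DEFAULT NULL,
`FEATURE_CLASS` varchar(100) DEFAULT NULL
);

The LOAD DATA statement above works the same way against either version of the table.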