import text file into mysql workbench? - mysql

I was wondering how to import a text file into MySQL Workbench.
I have a text file delimited by | where the first row contains the column names,
FEATURE_ID|FEATURE_NAME|FEATURE_CLASS
followed by the data rows, for example:
1388627|Etena|Populated Place
What is the best way to import this .txt file into MySQL workbench?
Thanks!

It's not clear exactly what you intend to achieve, but if you want to import a delimited text file into the database, then you can use LOAD DATA INFILE like this:
LOAD DATA INFILE '/path/file.txt'
INTO TABLE tablename
FIELDS TERMINATED BY '|'
LINES TERMINATED BY '\n'
IGNORE 1 LINES;
UPDATE:
First, of course, you need to create the table (if it's not done yet), like this:
CREATE TABLE `tablename` (
`FEATURE_ID` int(11) unsigned NOT NULL,
`FEATURE_NAME` varchar(512) DEFAULT NULL,
`FEATURE_CLASS` varchar(512) DEFAULT NULL,
PRIMARY KEY (`FEATURE_ID`)
)
You might need to adjust data types, lengths, and constraints on that table. For example, you might not need a PK on that table.
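If the server refuses a plain LOAD DATA INFILE (for example because of the secure_file_priv setting), a common fallback is the LOCAL variant, which reads the file from the client machine instead; this sketch assumes local_infile is enabled on both the client and the server:
LOAD DATA LOCAL INFILE '/path/file.txt'
INTO TABLE tablename
FIELDS TERMINATED BY '|'
LINES TERMINATED BY '\n'
IGNORE 1 LINES;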

Related

LOAD DATA INFILE the entire file into a field

I am storing the contents of text files in a table
CREATE TABLE Pages
(
ID int(11) unsigned NOT NULL,
Text mediumtext COMPRESSED,
PRIMARY KEY(ID)
) ENGINE=ARIA DEFAULT CHARSET=utf8 COLLATE utf8_general_ci ROW_FORMAT=DYNAMIC
I try to insert each file's contents directly via LOAD DATA INFILE:
LOAD DATA INFILE 'file.txt' INTO TABLE table
FIELDS TERMINATED BY '\0' LINES TERMINATED BY '' (Text)
SET ID=$id
The problem is that if I use FIELDS TERMINATED BY '' as well (which would be the ideal), it gives the error:
You can't use fixed rowlength with BLOBs; please use 'fields
terminated by'
I used '\0' assuming the null character does not exist in the text file. Although it works, is there a more standard way to do so?
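A commonly used alternative for getting a whole file into a single column is the LOAD_FILE() function; the sketch below assumes the FILE privilege and a path the server can read (under secure_file_priv if that is set), and the ID value is just a placeholder:
INSERT INTO Pages (ID, Text)
VALUES (42, LOAD_FILE('/var/lib/mysql-files/file.txt'));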

load data local infile imports only 200k out of 400k records

Hello! I am new to MySQL, so kindly explain in as simple language as possible!
I have a CSV with 400k rows and want to import it into MySQL. I am using the LOAD DATA LOCAL INFILE command for this purpose:
LOAD DATA LOCAL INFILE 'C:/ProgramData/MySQL/MySQL Server 8.0/Uploads/Comorbidity Covid-19.csv'
INTO TABLE `comorbidity covid-19`
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 ROWS;
The issue is that only about 200k records are being imported while the CSV contains 400k records. Why is this happening? I executed the command both in the command prompt and in MySQL Workbench, but both give the same output. Also, the date column is not being imported correctly: instead of dates, it shows 0000-00-00 in every row.
PS: OPT_LOCAL_INFILE=1 in Manage Database Connections!
PS : Here is some sample data
What I did first was create an empty table in the database with the respective column types: I right-clicked on Tables, selected the Create Table option, and chose the proper type for each column. Date as of and Start Date were given the DATE type, and so on. Then I executed the above query both in the command prompt and in Workbench to import the rows.
SHOW CREATE TABLE comorbidity gives this result:
CREATE TABLE `comorbidity` (
`Date as of` date NOT NULL,
`Start Date` date NOT NULL,
`State` varchar(20) NOT NULL,
`Condition group` varchar(50) NOT NULL,
`Condition` varchar(45) NOT NULL,
`Age group` varchar(15) NOT NULL,
`Covid19 deaths` int NOT NULL,
`Number of mentions` int NOT NULL
) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4 COLLATE=utf8mb4_0900_ai_ci
It may be that the date is not in the correct format; that is why it looks wrong. Try to fix the field's format in Excel, or convert it during the load with a date function such as STR_TO_DATE().
On the subject of importing all the records, check if there is any character that interrupts the execution.
The reason only 200k records were being imported was that I was using:
LINES TERMINATED BY '\n'
When I changed it to:
LINES TERMINATED BY '\r\n'
All 400k records were imported.
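Putting both fixes together, a full statement might look like the sketch below, run against the comorbidity table shown above; the '%m/%d/%Y' format string is only an assumption and has to match what is actually in the CSV:
LOAD DATA LOCAL INFILE 'C:/ProgramData/MySQL/MySQL Server 8.0/Uploads/Comorbidity Covid-19.csv'
INTO TABLE comorbidity
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\r\n'
IGNORE 1 ROWS
(@date_as_of, @start_date, State, `Condition group`, `Condition`, `Age group`, `Covid19 deaths`, `Number of mentions`)
SET `Date as of` = STR_TO_DATE(@date_as_of, '%m/%d/%Y'),
    `Start Date` = STR_TO_DATE(@start_date, '%m/%d/%Y');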

I have a lot of data in Excel and I want to add it to a MySQL database, how can I do it?

I'm working on a new web project right now, but the data is stored in Excel. I don't want to add it to the list manually; do you think this is possible?
You have a few ways of doing it:
You can use LOAD DATA.
Let's say you have the table below:
CREATE TABLE `set_of_data` (
`id` int NOT NULL AUTO_INCREMENT,
`x` varchar(10) DEFAULT NULL,
`y` varchar(10) DEFAULT NULL,
PRIMARY KEY (`id`)
) ENGINE=InnoDB ;
Your Excel file should be in .csv format. Then you can use LOAD DATA:
LOAD DATA INFILE '/var/lib/mysql/your_data.csv' -- path of your file on the server; it could be '/var/lib/mysql-files/your_data.csv'
IGNORE INTO TABLE set_of_data
FIELDS TERMINATED BY ';'
LINES TERMINATED BY '\n'
IGNORE 1 ROWS
(id,x,y);
Another way is to create an Excel formula that builds the INSERT statements for your data and run them.
This is for small tables, without much data.
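As an illustration of the formula approach (the cell references A2 and B2 and the sample values are hypothetical), a helper column in Excel can concatenate each row into an INSERT statement, which you then paste into your SQL client:
-- Excel formula, filled down the sheet:
-- ="INSERT INTO set_of_data (x, y) VALUES ('"&A2&"', '"&B2&"');"
-- It produces statements such as:
INSERT INTO set_of_data (x, y) VALUES ('a1', 'b1');
INSERT INTO set_of_data (x, y) VALUES ('a2', 'b2');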

LOAD DATA LOCAL INFILE COMMAND several errors

I am trying to load the data for Q1 2012 from the link below:
https://s3.amazonaws.com/capitalbikeshare-data/index.html
My code is as follows:
DROP DATABASE IF EXISTS bike;
CREATE DATABASE bike;
USE bike;
DROP TABLE IF EXISTS bike_2012;
CREATE TABLE bike_2012(
bike_duration INT NULL,
bike_start_date TIMESTAMP NULL,
bike_end_date TIMESTAMP NULL,
bike_s_station_no INT(5) NULL,
bike_s_station_name VARCHAR(255) NULL,
bike_e_station_no INT(5) NULL,
bike_e_station_name VARCHAR(255) NULL,
bike_number CHAR(6) NULL,
bike_member_type VARCHAR(25) NULL,
bike_ride_number INT auto_increment PRIMARY KEY);
LOAD DATA LOCAL INFILE 'C:/LAGASA_2018/MSBA/Data_Sources/2012-capitalbikeshare-tripdata/2012Q1-capitalbikeshare-tripdata.csv'
INTO TABLE bike_2012
FIELDS TERMINATED BY ',' ENCLOSED BY '"' LINES TERMINATED BY '/n'
('bike_duration', #bike_start_date, #bike_end_date, 'bike_s_station_no','bike_s_station_name',
'bike_e_station_no','bike_e_station_name','bike_number','bike_member_type')
SET 'bike_start_date' = STR_TO_DATE(#bike_start_date, '%c/%e/%Y')
SET 'bike_end_date' = STR_TO_DATE(#bike_end_date, '%c/%e/%Y')
IGNORE 1 LINES;
SELECT * FROM bike_2012 LIMIT 10;
I am facing the following issues:
Some columns that have integer data also have string data, so those parts are not getting loaded correctly. I tried to add OPTIONALLY ENCLOSED BY '"' but it's not working.
I am unable to change the date to the SQL date format.
Other errors, like "Row doesn't contain data for all columns" and "Data truncated" for the date columns, are appearing.
I have been struggling to correct this. Please help.
Thanks and Regards
You won't be able to simply load a malformed CSV into the DB and fix it afterwards.
If you have access to PHP/Python or another language that has a driver to connect to your DB engine, load the file into an array, or use something similar to fgets() in PHP to read it line by line, process each row separately, fix/convert the data, and then push it to the DB engine (I would even suggest grouping the inserts for speed).
You are dealing not only with conversion; there might also be issues with string encoding (you didn't specify any in your CREATE TABLE, which might cause a problem in itself).
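If the file itself is well-formed, some of these errors come from the statement rather than the data: '/n' should be '\n' (or '\r\n' for Windows line endings), the column names must not be wrapped in single quotes, IGNORE 1 LINES belongs before the column list, and a single SET clause takes comma-separated assignments. A cleaned-up sketch might look like this; the STR_TO_DATE format string is only an assumption and must match the dates actually in the CSV:
LOAD DATA LOCAL INFILE 'C:/LAGASA_2018/MSBA/Data_Sources/2012-capitalbikeshare-tripdata/2012Q1-capitalbikeshare-tripdata.csv'
INTO TABLE bike_2012
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\r\n'
IGNORE 1 LINES
(bike_duration, @bike_start_date, @bike_end_date, bike_s_station_no, bike_s_station_name,
bike_e_station_no, bike_e_station_name, bike_number, bike_member_type)
SET bike_start_date = STR_TO_DATE(@bike_start_date, '%Y-%m-%d %H:%i:%s'),
    bike_end_date = STR_TO_DATE(@bike_end_date, '%Y-%m-%d %H:%i:%s');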

MySQL how to specify string position with LOAD DATA INFILE

I have ASCII files with a fixed number of characters on each line and no delimiters. I'd like to use LOAD DATA INFILE to import them into my table.
Example of file:
USALALABAMA
USARARKANSAS
USFLFLORIDA
The structure for this table:
country Char(2)
state Char(2)
name Varchar(70)
CREATE TABLE `states` (
`country` char(2) COLLATE latin1_general_ci NOT NULL,
`state` char(2) COLLATE latin1_general_ci NOT NULL,
`name` varchar(70) COLLATE latin1_general_ci NOT NULL
) ENGINE=MyISAM DEFAULT CHARSET=latin1_general_ci COLLATE=latin1_general_ci;
Is it possible to specify a start and end position for each column?
According to the documentation, you can load a fixed format file without using a temporary table.
If the FIELDS TERMINATED BY and FIELDS ENCLOSED BY values are both empty (''), a fixed-row (nondelimited) format is used. With fixed-row format, no delimiters are used between fields (but you can still have a line terminator). Instead, column values are read and written using a field width wide enough to hold all values in the field. For TINYINT, SMALLINT, MEDIUMINT, INT, and BIGINT, the field widths are 4, 6, 8, 11, and 20, respectively, no matter what the declared display width is.
The positions are derived from the column definitions, which in your case match the structure of the file. So you just need to do:
LOAD DATA INFILE 'your_file' INTO TABLE your_table
FIELDS TERMINATED BY ''
LINES TERMINATED BY '\r\n'
SET name = trim(name);
First create a temporary table into which you will load all the lines; then you can load the data from the temporary table into the main table, splitting it into fields using SUBSTRING.
Something like this:
CREATE TEMPORARY TABLE tmp_lines
(countrystring TEXT);
LOAD DATA INFILE 'yourfilegoeshere' INTO TABLE tmp_lines
FIELDS TERMINATED BY ''
LINES TERMINATED BY '\r\n';
INSERT INTO main_table SELECT SUBSTRING(countrystring,1,2), SUBSTRING(countrystring,3, 2), SUBSTRING(countrystring,5) from tmp_lines;
Another way to do this is to assign each line to a variable and split it directly in your LOAD statement.
LOAD DATA INFILE 'yourfilegoeshere' INTO TABLE main_table
LINES TERMINATED BY '\r\n' (@_var)
SET
field1 = TRIM(SUBSTR(@_var FROM 1 FOR 2)),
field2 = TRIM(SUBSTR(@_var FROM 3 FOR 2)),
field3 = TRIM(SUBSTR(@_var FROM 5 FOR 70));
Just be sure not to specify any field separator, otherwise you will have to use more variables. Note that I'm using TRIM to clean the data in the same statement.
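For instance, if the lines did contain a separator such as '|', each piece would have to be captured into its own variable, along the lines of this hypothetical variant:
LOAD DATA INFILE 'yourfilegoeshere' INTO TABLE main_table
FIELDS TERMINATED BY '|'
LINES TERMINATED BY '\r\n' (@c1, @c2, @c3)
SET
field1 = TRIM(@c1),
field2 = TRIM(@c2),
field3 = TRIM(@c3);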