Importing PIPE delimited format txt into MySQL via phpMyAdmin

I am importing several thousand lines of data from a .txt file containing two columns, and the format is as follows:
A8041550408#=86^:|blablablablablablablablablablablablablablablablablablablabla1
blablablablablablablablablablablablablablablablablablablabla2
blablablablablablablablablablablablablablablablablablablabla3
A8041550408#=86^:|blablablablablablablablablablablablablablablablablablablabla1
blablablablablablablablablablablablablablablablablablablabla2
A8041550408#=86^:|blablablablablablablablablablablablablablablablablablablabla1
blablablablablablablablablablablablablablablablablablablabla2
blablablablablablablablablablablablablablablablablablablabla3
blablablablablablablablablablablablablablablablablablablabla4
etc....
What I have done so far is create a table with the two fields, but when I try to import the .txt file as a CSV and set "Columns separated by" to |, I get an error:
"Invalid column count in CSV input on line 2."
Which is quite obvious since the second line of the .txt file is empty.
Moreover, I have tried importing the file as a CSV using LOAD DATA, and that didn't work either: it just filled up the table with random words and phrases from the .txt file.
So my question is: how can I import the data from this file?

You have to fix your file; in its current state you cannot expect the import module to be able to understand it. First step would be to remove the empty lines: How to remove blank lines from a Unix file
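Once the empty lines are gone, and assuming each remaining line really does hold one key and one value separated by a single pipe, a LOAD DATA statement along the following lines should do the import; the path, table and column names here are only placeholders:
-- placeholder names: adjust the path, table and columns to your own schema
load data local infile '/path/to/cleaned.txt'
into table my_table
fields terminated by '|'
lines terminated by '\n'
(code, description);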

Related

CSV import (neo4j browser) returning empty nodes only i.e. without properties

I am unable to successfully import a csv file in the neo4j browser, as the nodes are created but they do not show the properties. Does anyone see the problem? I will describe how I proceeded:
This is how the csv file looks
I have tested the csv file with LOAD CSV WITH HEADERS FROM "file:///testCSV3.csv" AS line
WITH line LIMIT 4
RETURN line
and the result is ok (I guess?):
I then tried various things, as e.g. this query:
LOAD CSV WITH HEADERS FROM "file:///testCSV3.csv" AS line
CREATE (:Activity {activityName: line.MyActivity, time: toInteger(line.Timestamp)})
The outcome is nodes without properties:
Any ideas what I am missing? Why are the properties activityName and time not showing up? - Thanks in advance!
(You should have shown your raw CSV file, to make the problem clearer.)
I assume your raw file starts out like this:
ID ;Timestamp;MyActivity
1;1;Run
2;2;Talk
3;3;Eat
LOAD CSV is sensitive to extra spaces, so your ID header should not be followed by a space. Also, the default field terminator is a comma not a semicolon, so you need to specify the FIELDTERMINATOR option to override the default.
Your results would be more reasonable if you removed the extra space and changed your query to this:
LOAD CSV WITH HEADERS FROM "file:///testCSV3.csv" AS line FIELDTERMINATOR ';'
WITH line LIMIT 4
RETURN line
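Applying the same FIELDTERMINATOR fix to the CREATE query from the question (and with the stray space after ID removed from the file) would then give something like:
// same query as in the question, now with FIELDTERMINATOR ';'
LOAD CSV WITH HEADERS FROM "file:///testCSV3.csv" AS line FIELDTERMINATOR ';'
CREATE (:Activity {activityName: line.MyActivity, time: toInteger(line.Timestamp)})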

Can I selectively import data from a text file into MySQL?

I have a 13 GB .txt file which I am importing into MySQL; however, I don't want to import all of the data. For example, there are many columns that are either completely empty or contain irrelevant information - I only want to import ~100 of the 360 columns I've been provided. If I only create headers for the columns I want, can I select the specific corresponding data from the .txt file to be uploaded?
Normally I would use a text editor to remove the superfluous data, but I do not possess a text editor that can handle a file of this size.
You can ignore specific columns in the input file by assigning them to a user-defined variable instead of a database column.
For example if you had a CSV file with 4 columns and just wanted to import columns 1 and 4 into your table you could do something like this:
load data infile '/tmp/so42140337.csv'
into table so42140337
fields terminated by ','
lines terminated by '\n'
(c1, @dummy, @dummy, c2);
Given the size of your input file it may be more efficient to import it in chunks rather than importing the entire file in one command. You can use the pt-fifo-split tool for this, following the pattern in this blog post.
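For completeness, a minimal sketch of the target table used in the example above; the column types are assumptions, so pick whatever matches your actual data:
-- column types are assumptions; match them to your data
create table so42140337 (
  c1 varchar(255),
  c2 varchar(255)
);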

cannot load simple csv file into tableau public 9.3

I am trying to load the following simple csv file into tableau public 9.3:
customers,item1,item2,item3,item4
1,0,0,0,0
2,0,0,0,0
3,0,0,0,0
However, it doesn't read the file as separate columns, despite the field separator being set to Comma; instead it treats the whole line as one column. Any help would be greatly appreciated.
If you change your locale settings to English US you will be able to load the file. You should also be able to work around this by creating a schema.ini file.
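As a rough sketch of the schema.ini approach (the file name testdata.csv is made up for illustration; the schema.ini has to sit in the same folder as the CSV, and the exact attributes depend on the Microsoft text driver being used):
[testdata.csv]
Format=CSVDelimited
ColNameHeader=True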
Go to Data > Manage fields > [Field] Options
You can also control imported CSV behavior after import, either by splitting individual columns (the split will be preserved on update as well) or, at the CSV level, through the text-file options reached via the menu path above.
That didn't work for me, so I reopened the .csv file in Excel and saved it again in .csv format with ',' as the delimiter.
After that my file looks like a .csv with ';' as the delimiter, and it works with Tableau.

Importing .csv files and saving as .dta

I have a folder containing a number of csv files, e.g. "leeds dz.csv", "leeds gh.csv", "leeds fr.csv". The first part of the file names is constant (i.e. always "leeds").
I want to import each to Stata individually, convert to .dta file and save it. Currently I have this code:
cd "etcetc"
clear
local myfilelist : dir . files"*.csv"
foreach file of local myfilelist {
drop _all
insheet using `file', comma
local outfile = subinstr("`file'",".csv","",.)
save "`outfile'", replace
}
The code works fine if I rename all the .csv files manually to delete the "leeds" part, ie if each .csv is named "dz.csv" instead of "leeds dz.csv" etc.
However, if I do not do this deletion I receive the error "invalid 'dz.csv' "
I'm guessing this has something to do with the third line of my code, in particular the "*.csv", but I'm unsure how to adapt the code, or why it won't allow me to import files with a space in the name.
The line
insheet using `file', comma
will be problematic with any filename containing spaces.
Try
insheet using "`file'", comma
The help for insheet is quite explicit on this:
If filename is specified without an extension, .raw is assumed. If your
filename contains embedded spaces, remember to enclose it in double
quotes.

How to import .txt to MySQL table

How do I import a .txt file into a MySQL table?
My .txt file is like this...
ex : AF0856427R1 000002200R HADISUMARNO GGKAMP MALANG WET 3 6 00705 AFAAADF16000-AD-FA P00.001.0 1 000001.00022947.70023290.00 T511060856425A 022014B
There are 39 fields in my file.
Try the mysqlimport command.
The name of the text file should be the name of the table into which you want the data to be imported. For example, if your file name is patient.txt, the data will be imported into the patient table.
mysqlimport [options] db_name textfile
There are a lot of options that you can pass in. The documentation is here.
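For example, a hedged invocation might look like this; the user, database name, delimiter and file path are all placeholders to adapt to your setup:
# placeholders: replace the user, database, delimiter and file path with your own
mysqlimport --local --user=youruser -p --fields-terminated-by='|' your_db /path/to/patient.txt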
Especially since some of your fields are terminated by spaces and some are based on string length, I would definitely first do some string manipulation with your favorite tool (sed, awk, and perl are all likely very good choices).
Create an intermediary comma separated file. If you have commas in the existing file, you can easily use some other character. The goal is to create a file that has one consistent separator.
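For instance, a rough awk sketch of that conversion; the field offsets and the ';' output separator are invented for illustration and would have to be matched to the real record layout:
# hypothetical fixed-width offsets; adjust them (and the ';' separator) to the real layout
awk '{ print substr($0, 1, 11) ";" substr($0, 13, 10) ";" substr($0, 24) }' input.txt > intermediate.csv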
You've used the phpMyAdmin tag, so from your table go to the Import tab, select the file, and pick CSV from the dropdown of file types. Edit the options according to what your file looks like (for instance, perhaps § is your column separator and you might leave the next two options blank). Then try the import and check the data to make sure it all arrived in the columns you expected.
Good luck.