Inserting data from text files into pre-existing columns - mysql

I am trying to insert data from a text file (18.9GB large) that looks like this:
as8dyta89sd6892yhgeui2eg
asoidyaos8yd98t2y492g4n2
as8agfuigf98safg82b1hfdy
They are all 32 characters long. I have a database named hashmasher and a table called combinations, with columns named unhashed and sha256. Currently I have data stored only in the unhashed column, looking like:
unhashed | sha256
data | (completely empty)
Now I am wondering how I could insert the data into the existing rows, adding it only to the second column, so that, for example, the above would become:
unhashed | sha256
data | firstlineoftextfile
data | secondlineoftextfile
If I use LOAD DATA INFILE it will load the data into NEW rows (that's what I've been told), and it will load it into the unhashed column as well as the sha256 column.
TL;DR I want to insert data from a text file into the second column of pre-existing rows.

Insert your data with LOAD DATA INFILE into a new table. It may be temporary, to speed things up a bit. Then use INSERT ... SELECT with a JOIN to merge the two tables.
I understand it can take a few hours with a 19 GB table.
Things are more complicated, since your original file contains one value per row. You may want to fix it up with a sed/awk script so that there are two values per row and LOAD DATA INFILE works.
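A minimal sketch of this first approach, assuming the file has already been rewritten into tab-separated unhashed/sha256 pairs; the names and column sizes are illustrative, and the merge is shown as an UPDATE ... JOIN because the goal is to fill a column of rows that already exist:
-- Staging table for the preprocessed pairs (names and types are assumptions).
CREATE TEMPORARY TABLE staging (
    unhashed VARCHAR(64),
    sha256   VARCHAR(64)
);

-- Bulk-load the preprocessed file, one "unhashed<TAB>sha256" pair per line.
LOAD DATA INFILE '/tmp/pairs.tsv'
INTO TABLE staging
FIELDS TERMINATED BY '\t'
LINES TERMINATED BY '\n';

-- Merge: fill the empty sha256 column of the existing rows.
UPDATE combinations c
JOIN staging s ON s.unhashed = c.unhashed
SET c.sha256 = s.sha256;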
The other approach is to stay with sed/awk scripting, convert your original file into a file containing a series of UPDATE statements, and then pipe the result to MySQL.
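Such a generated file would simply contain one statement per input line, along the lines of the following (the WHERE values are invented placeholders, since the question does not say how each hash maps to an existing unhashed value):
UPDATE combinations SET sha256 = 'as8dyta89sd6892yhgeui2eg' WHERE unhashed = 'first_unhashed_value';
UPDATE combinations SET sha256 = 'asoidyaos8yd98t2y492g4n2' WHERE unhashed = 'second_unhashed_value';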

Update table from csv file select first field?

I have a series of CSV files that I want to import into MySQL. To first populate the table I did the following:
mysql -u root -p apn -e "LOAD DATA LOCAL INFILE '/opt/cell.csv' INTO TABLE data FIELDS TERMINATED BY ',';"
where the CSV contents are:
89xx,31xx,88xx,35xx,ACTIVATED,250.0MB,GPRS,96xx,0,0,2,false,DEFAULT
The one unique field is the first, starting with '89xx' (which goes into the column named 'iccid').
What I want to do now is update the table, but I'm clueless how to use the first entry in the CSV to update the rest of the row. It is mainly the 4th field that needs to be updated over time, as that is the value that will change (data usage for a specific cellular device). I don't have much of a problem emptying the table before doing a whole new import, but I was thinking it would be better practice to just update, since I will eventually need to update several times a day.
Since I have no practical skills in any language, or MySQL for that matter, would it be best to just insert into a temp table and update from that?
You can use the REPLACE keyword before INTO to update/replace your rows:
LOAD DATA LOCAL INFILE '/opt/cell.csv' REPLACE INTO TABLE data FIELDS TERMINATED BY ',';
To silently skip lines that would duplicate an existing unique index instead, use the IGNORE keyword before INTO.
See the LOAD DATA manual.
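For example, a sketch of the full reload (assuming iccid is not yet a unique key; REPLACE matches incoming lines against a PRIMARY KEY or UNIQUE index and rewrites the whole row from the CSV):
ALTER TABLE data ADD UNIQUE KEY uk_iccid (iccid);  -- only needed if iccid is not already unique

LOAD DATA LOCAL INFILE '/opt/cell.csv'
REPLACE INTO TABLE data
FIELDS TERMINATED BY ',';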

How to get MySQL to load data from a TEXT column that contains CSV data as multiple columns?

We want our users to be able to upload CSV files into our software and have the files be put into a temporary table so they can be analyzed and populated with additional data (upload process id, user name, etc).
Currently our application allows users to upload files into the system, but the files end up as TEXT values in a MySQL table (technically BLOB, but for our purposes I will call it TEXT, as the only type of file I am concerned with is CSV).
After a user uploads the CSV and it becomes a TEXT value, I want to take the TEXT value and interpret it as a CSV import, with multiple columns, to populate another table within MySQL without using a file output.
A simple insert-select into won't work as the TEXT is parsed as one big chunk (as it should be) instead of multiple columns.
insert into db2.my_table (select VALUE from db1.FILE_ATTACHMENT where id = 123456)
Most examples I have found export data from the DB as a file, then import it back in, i.e. something like:
SELECT VALUE INTO OUTFILE '/tmp/test.csv'
followed by something like:
LOAD DATA INFILE '/tmp/test.csv' INTO TABLE db2.my_table;
But I would like to do the entire process within MySQL if possible, without using the above "SELECT INTO OUTFILE/LOAD DATA INFILE" method.
Is there a way to have MySQL treat the TEXT value as multiple columns instead of one big block? Or am I stuck exporting to a file and then re-importing?
Thanks!
There is a flaw in your data-loading approach.
Instead of keeping each whole row in a single column, keep each value in its own column.
For example, suppose the CSV file contains n columns; create a table with n columns and load into it directly:
LOAD DATA INFILE '/tmp/test.csv'
INTO TABLE table_name
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 ROWS; -- skip the header row if the CSV file has one
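Once the rows are in that table, they can be copied into the working table together with the extra metadata the question mentions; a sketch with purely illustrative table, column, and value names:
INSERT INTO analyzed_uploads (col1, col2, col3, upload_process_id, uploaded_by)
SELECT col1, col2, col3, 42, 'some_user'
FROM table_name;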

How to replace a Column simultaneously with LOAD INFILE in MySQL

Suppose we have a table with a DECIMAL column holding values such as 128.98, 283.98, 21.20.
I want to import some CSV files into this table. However, in the columns of these files I have values like 235,69 and 23,23, with a comma instead of a point.
I know I can REPLACE that column, but is there some way of doing that before LOAD INFILE?
I do not believe you can replace that column and load the data at the same time. It looks like you will have to take several steps to get the result you want:
1. Load the data into a raw table first using the LOAD DATA INFILE command. This table can be identical to the main table; you can use CREATE TABLE ... LIKE to create it.
2. Process the data in the raw table (i.e. change the comma to a point where applicable).
3. Select the data from the raw table and insert it into the main table, either with row-by-row processing or a bulk insert.
This can all be done in a stored procedure (SP) or by a third-party script written in Python, PHP, etc.
If you want to know more about SPs in MySQL, here is a useful link.
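A sketch of those steps with illustrative names; here the raw table stores the value as text rather than DECIMAL so that '235,69' survives the load intact, and the field separator is assumed to be a semicolon:
-- Raw/staging table: same shape as the main table, but the amount is text.
CREATE TABLE prices_raw (
    id     INT,
    amount VARCHAR(20)
);

LOAD DATA INFILE '/tmp/prices.csv'
INTO TABLE prices_raw
FIELDS TERMINATED BY ';'
LINES TERMINATED BY '\n';

-- Swap the decimal comma for a point and bulk-insert into the main table.
INSERT INTO prices (id, amount)
SELECT id, CAST(REPLACE(amount, ',', '.') AS DECIMAL(10,2))
FROM prices_raw;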

Import SQL File into existing DB-Table (phpmyadmin)

I have an SQL file defining a table with 2 columns, product-id and product-reference. I need to import that file into an existing table that has the same structure (plus some extra columns), where product-id corresponds to the product-id from the backup file. Is there a simple way to do that via phpMyAdmin?
One approach is to use LOAD DATA INFILE (see here) with the SET option to assign column values. Columns that are not being set are given their default values, which is typically NULL.
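A sketch of that first approach, assuming the backup were available as a plain two-column, tab-separated data file (all names here are illustrative); columns not named in the column list or SET clause keep their defaults:
LOAD DATA INFILE '/tmp/products.tsv'
INTO TABLE products
FIELDS TERMINATED BY '\t'
(product_id, product_reference)
SET imported_at = NOW();  -- purely illustrative extra column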
Personally, I would load the data into a staging table with two columns and then insert the data from the staging table into the final table. This makes it easier to validate the data before putting it into the "real" table.
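A sketch of that staging route, assuming the backup has been imported (for example through phpMyAdmin's Import tab) into a two-column table called product_backup; the names are illustrative:
-- If the products are new rows in the real table
-- (columns not listed get their defaults):
INSERT INTO products (product_id, product_reference)
SELECT product_id, product_reference
FROM product_backup;

-- Or, if the rows already exist and only the reference needs filling in:
UPDATE products p
JOIN product_backup b ON b.product_id = p.product_id
SET p.product_reference = b.product_reference;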

Load xml data into sql database using phpmyadmin

I know this is a really basic question, but I am struggling with my first import of data from an XML file. I have created the table "Regions", which has just two columns, ID and Name. The XML file contains the same column names.
In order to bulk import the data, I am using the following SQL command:
LOAD XML LOCAL INFILE 'C:\Users\Dell\Desktop\regions.xml'
INTO TABLE Regions (ID, Name)
but I am getting the error #1148 - The used command is not allowed with this MySQL version
Now, having researched this on the internet, allowing this command requires a change to one of the configuration files, but my service provider doesn't give me access to it. Is there an alternative way to write the SQL that does exactly the same thing as the code above, which is basically just importing the data from an XML file?
Many thanks
Since LOAD XML LOCAL INFILE isn't enabled for you, it appears you have only one option left, and that's to create a set of INSERT statements, one per row. If you convert your XML file to CSV using Excel, that's an easy step. Assuming you have rows of data like this
  A  |  B
-----|-------------------------
  1  |  Region 1
  2  |  Region 2
I would create a formula like this in column C
=CONCATENATE("INSERT INTO Regions(ID,Name) VALUES(",A1,",'",B1,"');")
This will result in INSERT INTO Regions(ID,Name) VALUES(1,'Region 1'); for your first row. Fill the formula down to the last row of your spreadsheet, then select all of the INSERT statements and copy them into the SQL query box in phpMyAdmin, and you should be able to insert your values.
I've used this method many times when I needed to import data into a database.