how to sum varchar values in MySQL

I am loading multiple text files into a database using the LOAD DATA INFILE statement. There seems to be an issue when trying to load numeric values into their respective numeric fields. I did some research, and per the MySQL documentation, all data loaded in is treated as text, so all the values are being inserted as NULL.
LOAD DATA INFILE regards all input as strings, so you cannot use
numeric values for ENUM or SET columns the way you can with INSERT
statements. All ENUM and SET values must be specified as strings.
I tried casting the specific fields as numeric or decimal, and I still get NULL values in the table.
E.g.:
LOAD DATA INFILE 'blabla.txt' INTO TABLE example
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"' ESCAPED BY ''
LINES TERMINATED BY '\r\n'
IGNORE 1 LINES
(field1, field2, field3, @problemfield)
SET problemfield = CAST(REPLACE(REPLACE(REPLACE(@problemfield, ',', ''), '(', '-'), ')', '') AS DECIMAL(10,0));
I am using the REPLACE calls because negatives sometimes appear in parentheses in the data.
There were similar questions on Stack Overflow about casting while loading (I can't find the links now), and many responses suggest loading the data in as text, transferring it to a new numeric field, and then deleting the old field. Is that an optimal solution? How is this issue usually handled? I am sure this scenario must happen a lot (load all this text data and perform operations on it).

Load your data into a staging table. Manipulate as required. Write to your real tables from your staging table.
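
A minimal sketch of that staging approach, reusing the file and columns from the question (staging_example is a hypothetical table name):

CREATE TABLE staging_example (
    field1 VARCHAR(255),
    field2 VARCHAR(255),
    field3 VARCHAR(255),
    problemfield VARCHAR(255)  -- everything stays text at this stage
);

LOAD DATA INFILE 'blabla.txt' INTO TABLE staging_example
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"' ESCAPED BY ''
LINES TERMINATED BY '\r\n'
IGNORE 1 LINES;

-- Clean and convert while copying into the real table:
INSERT INTO example (field1, field2, field3, problemfield)
SELECT field1, field2, field3,
       CAST(REPLACE(REPLACE(REPLACE(problemfield, ',', ''), '(', '-'), ')', '') AS DECIMAL(10,0))
FROM staging_example;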

Related

Which field delimiter can be used in SELECT... INTO OUTFILE CSV when the fields have different special characters

I want to export the MySQL data from a table into a CSV file using the SELECT... INTO OUTFILE command. The rows contain special characters like commas (,) and semicolons (;), and some columns even contain tabs, which keeps the column values from being exported to their corresponding columns.
Sample query that I executed:
SELECT * FROM table_name
INTO OUTFILE '/path/to/outfile.csv'
FIELDS TERMINATED BY ';' ESCAPED BY '\"'
LINES TERMINATED BY '\r\n';
I have to import this data into a similar table in BigQuery. Some fields have a semicolon (;) in their values, so when I use (;) as the field separator it splits a single column value in two. The same applies to '\t' as well.
I also have no idea what kinds of special characters this table data will contain.
Kindly suggest either of the following:
Which delimiter can be used to separate fields without affecting the original data?
Is there any other format, like Parquet, that the MySQL OUTFILE command can write?
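
For what it's worth, INTO OUTFILE only writes delimited or fixed-width text, so a format like Parquet is not an option there. A common workaround for the delimiter problem (a sketch, not from the original thread) is to keep a comma delimiter but enclose every field in quotes, so embedded commas, semicolons, and tabs survive inside the quoted values. Note that MySQL escapes embedded quotes with the ESCAPED BY character rather than doubling them, so the importer's quote and escape settings may need to match:

SELECT * FROM table_name
INTO OUTFILE '/path/to/outfile.csv'
FIELDS TERMINATED BY ',' ENCLOSED BY '"' ESCAPED BY '\\'
LINES TERMINATED BY '\n';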

How to use LOAD DATA INFILE correctly?

I have many data records that I want to import into a database table. I'm using phpMyAdmin. I typed:
LOAD DATA INFILE 'C:/Users/Asus/Desktop/cobacobaa.csv' INTO TABLE akdhis_kelanjutanstudi
but the result was an error saying I have a duplicate entry "0" for the primary key. I do not know why it said that, because in my data there is actually no duplicate entry.
Could you please help me solve this problem? Thanks in advance.
I would guess that your primary key is a number. The problem would then be that the value starts with a double quote. When converting a string to a number, MySQL converts the leading numeric characters; with no such characters, the value is zero.
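The conversion behavior is easy to demonstrate (a quick illustration, not part of the original answer):

SELECT CAST('"123"' AS UNSIGNED);  -- 0: the leading quote stops numeric conversion
SELECT CAST('123abc' AS UNSIGNED); -- 123: leading digits convert, the rest is dropped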
The following might fix your problem:
LOAD DATA INFILE 'C:/Users/Asus/Desktop/cobacobaa.csv'
INTO TABLE akdhis_kelanjutanstudi
FIELDS TERMINATED BY ',' ENCLOSED BY '"';
I usually load data into a staging table, where all the values are strings, and then convert the staging table into the final version. This may use a bit more space in the database, but I find that debugging goes much faster when something goes wrong.

How to get MySQL to load data from a TEXT column that contains CSV data as multiple columns?

We want our users to be able to upload CSV files into our software and have the files be put into a temporary table so they can be analyzed and populated with additional data (upload process id, user name, etc).
Currently our application allows users to upload files into the system, but the files end up as TEXT values in a MySQL table (technically BLOB, but for our purposes I will call it TEXT, as the only type of file I am concerned with is CSV).
After a user uploads the CSV and it becomes a TEXT value, I want to take the TEXT value and interpret it as a CSV import, with multiple columns, to populate another table within MySQL without using a file output.
A simple INSERT ... SELECT won't work, as the TEXT is parsed as one big chunk (as it should be) instead of as multiple columns.
INSERT INTO db2.my_table
SELECT VALUE FROM db1.FILE_ATTACHMENT WHERE id = 123456;
Most examples I have found export data from the DB as a file, then import it back in, i.e. something like:
SELECT VALUE INTO OUTFILE '/tmp/test.csv'
followed by something like:
LOAD DATA INFILE '/tmp/test.csv' INTO TABLE db2.my_table;
But I would like to do the entire process within MySQL if possible, without using the above "SELECT INTO OUTFILE/LOAD DATA INFILE" method.
Is there a way to have MySQL treat the TEXT value as multiple columns instead of one big block? Or am I stuck exporting to a file and then re-importing?
Thanks!
There is a flaw in your data-load approach.
Instead of keeping each row in a single column, keep each value in its respective column.
For example, suppose the CSV file contains n columns; create a table with n columns.
LOAD DATA INFILE '/tmp/test.csv'
INTO TABLE table_name
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 ROWS;  -- skips the header row if the CSV file has one
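
If the goal is to split the TEXT value entirely inside MySQL, as the question asks, SUBSTRING_INDEX can handle simple cases. A sketch, assuming the attachment holds a single CSV row of three plain comma-separated values with no embedded commas or quotes (col1 through col3 are hypothetical); a multi-line file would still have to be split into rows first:

INSERT INTO db2.my_table (col1, col2, col3)
SELECT
    SUBSTRING_INDEX(VALUE, ',', 1),                            -- first field
    SUBSTRING_INDEX(SUBSTRING_INDEX(VALUE, ',', 2), ',', -1),  -- second field
    SUBSTRING_INDEX(VALUE, ',', -1)                            -- third field
FROM db1.FILE_ATTACHMENT
WHERE id = 123456;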

Get data from CSV file and add it to an array, then encrypt

LOAD DATA INFILE '$file'
INTO TABLE `table`
COLUMNS TERMINATED BY ','
LINES TERMINATED BY '\n'
IGNORE 1 LINES
(number, type)
If I can't just encrypt the data directly in the query, is it possible to get all the results, add them to an array, encrypt them one by one, and then insert them into the database?
Another way:
Create a new CSV file (from the original CSV file) with encrypted content, then use the same LOAD DATA INFILE query to import the content at once.
That avoids inserting rows one by one.
Correct me if I'm wrong.
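
That said, LOAD DATA INFILE can transform values during the load itself by reading a field into a user variable and applying a SET clause, the same technique used earlier in this thread. A sketch, assuming the type column is VARBINARY or BLOB (AES_ENCRYPT returns binary data) and 'secret_key' is a placeholder:

LOAD DATA INFILE '$file'
INTO TABLE `table`
COLUMNS TERMINATED BY ','
LINES TERMINATED BY '\n'
IGNORE 1 LINES
(number, @raw_type)                               -- read the raw value into a user variable
SET type = AES_ENCRYPT(@raw_type, 'secret_key');  -- encrypt during the load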

bulk insert into mysql

I have a GridView on an ASP.NET page with a checkbox column. I need to insert the checked rows from the GridView into a MySQL table.
One of the easiest ways would be to find the selected rows and insert them one by one in a loop.
However, that is time-consuming, considering there may be 10,000 rows at any given time, and with such a long-running process there is a risk of losing the connection in the course of the insertion.
Is there any way to expedite insertion of huge number of rows?
Thanks,
Balaji G
You can first get all the checked records, write them out in tab-delimited or comma-delimited format, and then use the LOAD DATA INFILE syntax to do the bulk insertion.
Here's the sample format:
LOAD DATA INFILE 'data.txt' INTO TABLE tbl_name
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\r\n';
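
If writing an intermediate file out from the application is awkward, batched multi-row INSERT statements are a common alternative (a sketch; tbl_name and the column names are placeholders):

INSERT INTO tbl_name (col1, col2)
VALUES ('a1', 'b1'),
       ('a2', 'b2'),
       ('a3', 'b3');  -- send a few hundred rows per statement instead of one INSERT per row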