How to create a MySQL table from a CSV file header? - mysql

My goal is to create a MySQL table containing data from my CSV file.
Is there any way to do this?
I don't know whether it is possible through the CSV alone. What I want is to generate the MySQL table using the CSV headers as field names. I know how to load a CSV into a table that has already been created manually, but I want to generate the table itself from the CSV using a batch script. Below is the batch script I currently use to load the CSV into a manually created table:
LOAD DATA LOCAL INFILE "C:\\EQA\\project\\input1.csv"
INTO TABLE request_table
CHARACTER SET latin1
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\r\n'
IGNORE 1 ROWS;
The code above loads the CSV into request_table, which already exists in the database.
But I want to create the table dynamically from the CSV.
Is that possible? If so, can someone help me accomplish this?
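There is no single MySQL statement that creates a table from a CSV header; LOAD DATA only fills an existing table. One way is a small script that reads just the header row and generates the CREATE TABLE statement, which your batch script could run before the load step. A sketch in Python (the all-TEXT column type and the identifier-sanitizing rule are assumptions; adapt as needed):

```python
import csv
import re

def create_table_sql(csv_path, table_name):
    """Read the header row of a CSV file and build a CREATE TABLE
    statement using the header cells as field names.

    Every column is created as TEXT for simplicity; adjust the types
    afterwards if you know them.
    """
    with open(csv_path, newline="", encoding="latin-1") as f:
        header = next(csv.reader(f))
    # Sanitize header cells into safe identifiers (assumed rule:
    # collapse anything that is not a letter, digit or underscore).
    cols = [re.sub(r"\W+", "_", h.strip()) or "col" for h in header]
    col_defs = ", ".join(f"`{c}` TEXT" for c in cols)
    return f"CREATE TABLE `{table_name}` ({col_defs});"
```

You would feed the returned statement to the mysql client first, then run your existing LOAD DATA script unchanged.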

Related

How to get MySQL to load data from a TEXT column that contains CSV data as multiple columns?

We want our users to be able to upload CSV files into our software and have the files be put into a temporary table so they can be analyzed and populated with additional data (upload process id, user name, etc).
Currently our application allows users to upload files into the system, but the files end up as TEXT values in a MySQL table (technically BLOB, but for our purposes, I will call it TEXT, as the only type of files I am concerned with are CSVs).
After a user uploads the CSV and it becomes a TEXT value, I want to take the TEXT value and interpret it as a CSV import, with multiple columns, to populate another table within MySQL without using a file output.
A simple insert-select into won't work as the TEXT is parsed as one big chunk (as it should be) instead of multiple columns.
INSERT INTO db2.my_table (SELECT VALUE FROM db1.FILE_ATTACHMENT WHERE id = 123456);
Most examples I have found export data from the DB as a file, then import it back in, i.e. something like:
SELECT VALUE INTO OUTFILE '/tmp/test.csv'
followed by something like:
LOAD DATA INFILE '/tmp/test.csv' INTO TABLE db2.my_table;
But I would like to do the entire process within MySQL if possible, without using the above "SELECT INTO OUTFILE/LOAD DATA INFILE" method.
Is there a way to have MySQL treat the TEXT value as multiple columns instead of one big block? Or am I stuck exporting to a file and then re-importing?
Thanks!
There is a flaw in your data-load approach.
Instead of keeping each row in a single column, keep each value in its own column.
For example, if the CSV file contains n columns, create a table with n columns:
LOAD DATA INFILE '/tmp/test.csv'
INTO TABLE table_name
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 ROWS; -- skips the header line if one is present in the CSV file
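If the CSV must stay in the TEXT column, one pragmatic workaround outside pure SQL is to parse it in application code: fetch the TEXT value, split it with a CSV parser, and re-insert the rows. A minimal sketch in Python, assuming a DB-API connection and hypothetical target columns:

```python
import csv
import io

def rows_from_text(csv_text, skip_header=True):
    """Parse a CSV document held in a string (e.g. fetched from a TEXT
    column) into a list of row tuples ready for cursor.executemany()."""
    reader = csv.reader(io.StringIO(csv_text))
    rows = [tuple(r) for r in reader if r]
    return rows[1:] if skip_header else rows

# Hypothetical usage with a DB-API connection `conn`:
# cur = conn.cursor()
# cur.execute("SELECT VALUE FROM db1.FILE_ATTACHMENT WHERE id = %s", (123456,))
# text = cur.fetchone()[0]
# cur.executemany(
#     "INSERT INTO db2.my_table (col1, col2) VALUES (%s, %s)",
#     rows_from_text(text))
```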

Excel file to mysql

I am creating a web application to collect some specific data from users. I want the user to upload an Excel file containing the data on the web page I created, and for that file's data to be stored in a MySQL database.
Is it possible? How?
It's possible.
I would convert the Excel file to a CSV file, or make the user upload a CSV file instead. Excel already has this feature built in.
Then in MySQL you can turn the csv file into a tmp table with ease:
LOAD DATA LOW_PRIORITY LOCAL INFILE 'C:\\Users\\Desktop\\nameoffile.csv' REPLACE INTO TABLE `tmp_table` CHARACTER SET latin1 FIELDS TERMINATED BY ';' LINES TERMINATED BY '\r\n';
After that you transfer your data from the tmp table into the tables you'd like and finally you delete the temporary table.
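The transfer-then-clean-up step can be scripted as a fixed statement sequence; a minimal sketch, assuming a DB-API connection and hypothetical table and column names:

```python
# Hypothetical statement sequence for the tmp-table workflow above.
STATEMENTS = [
    # 1. Move the staged rows into the real table (names assumed).
    "INSERT INTO customers (name, email) SELECT name, email FROM tmp_table;",
    # 2. Remove the staging table once the data has been copied.
    "DROP TABLE tmp_table;",
]

def run_transfer(conn):
    """Run the transfer and clean-up against an open DB-API connection."""
    cur = conn.cursor()
    for stmt in STATEMENTS:
        cur.execute(stmt)
    conn.commit()
```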

Replace contents of MySQL table with contents of csv file on a remote server

I'm a newbie here trying to import some data into my wordpress database (MySQL) and I wonder if any of you SQL experts out there can help?
Database type: MySQL
Table name: wp_loans
I would like to completely replace the data in table wp_loans with the contents of file xyz.csv located on a remote server, for example https://www.mystagingserver.com/xyz.csv
All existing data in the table should be replaced with the contents of the CSV file.
The 1st row of the CSV file is the table headings so can be ignored.
I'd also like to automate the script to run daily at say 01:00 in the morning if possible.
UPDATE
Here is the SQL I'm using to try and replace the table contents:
LOAD DATA INFILE 'https://www.mystagingserver.com/xyz.csv'
REPLACE
INTO TABLE wp_loans
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
IGNORE 1 LINES;
I would recommend a cron job to automate the process. BCP (bulk copy) is a SQL Server tool, so since you are using MySQL, use LOAD DATA INFILE instead - https://mariadb.com/kb/en/load-data-infile/
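Note that MySQL's LOAD DATA cannot read from a URL, so the job has to download the file first. A sketch in Python, assuming the mysql client is on the PATH, credentials come from ~/.my.cnf, and the database name is hypothetical; a crontab entry such as `0 1 * * * /usr/bin/python3 /path/to/refresh.py` would run it daily at 01:00:

```python
import subprocess
import urllib.request

CSV_URL = "https://www.mystagingserver.com/xyz.csv"
LOCAL_PATH = "/tmp/xyz.csv"

def build_load_sql(local_path, table="wp_loans"):
    """Build the LOAD DATA statement: REPLACE overwrites rows with the
    same key, and IGNORE 1 LINES skips the header row."""
    return (
        f"LOAD DATA LOCAL INFILE '{local_path}' "
        f"REPLACE INTO TABLE {table} "
        "FIELDS TERMINATED BY ',' ENCLOSED BY '\"' "
        "IGNORE 1 LINES;"
    )

def refresh():
    # 1. Download the remote CSV -- LOAD DATA cannot open a URL.
    urllib.request.urlretrieve(CSV_URL, LOCAL_PATH)
    # 2. TRUNCATE first so the table is completely replaced, then load.
    #    (Database name and credentials are assumptions.)
    subprocess.run(
        ["mysql", "--local-infile=1", "wordpress_db",
         "-e", "TRUNCATE wp_loans; " + build_load_sql(LOCAL_PATH)],
        check=True,
    )
```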

Get data from CSV file and add it to an array, then encrypt

LOAD DATA INFILE '$file'
INTO TABLE table
COLUMNS TERMINATED BY ','
LINES TERMINATED BY '\n'
IGNORE 1 LINES
(number, type)
If I can't just encrypt the data directly in the query, is it possible to get all the results into an array, encrypt them one by one, and then insert them into the database?
Another way:
Create a new CSV file (from the original CSV file) with the content already encrypted, then use the same LOAD DATA INFILE query to import everything at once.
That saves you from inserting rows one by one.
Correct me if I'm wrong.
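A sketch of that approach in Python: rewrite the CSV with the sensitive columns transformed, then point the same LOAD DATA INFILE query at the new file. The `encrypt` function here is a base64 placeholder, not real encryption; substitute a proper cipher in practice.

```python
import base64
import csv

def encrypt(value):
    # Placeholder only -- base64 is encoding, not encryption; swap in
    # a real cipher (e.g. AES via a crypto library) for production.
    return base64.b64encode(value.encode()).decode()

def encrypt_csv(src_path, dst_path, columns):
    """Copy a CSV, encrypting the values in the named header columns,
    so the result can be imported with the same LOAD DATA INFILE query."""
    with open(src_path, newline="") as src, \
         open(dst_path, "w", newline="") as dst:
        reader = csv.reader(src)
        writer = csv.writer(dst)
        header = next(reader)
        writer.writerow(header)
        idxs = [header.index(c) for c in columns]
        for row in reader:
            for i in idxs:
                row[i] = encrypt(row[i])
            writer.writerow(row)
```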

How to export from excel to MySql with dynamic fields inside

I have a problem:
I want to export from Excel to a MySQL database table, but the destination table fields are dynamic.
Let's say:
table A for storing field_information (e.g. field_name, field_type)
table B for storing field_answers (e.g. field_info_id, value)
example of excel spreadsheet file (which I convert into csv format):
name,school,news;
"test","test_school","test_news";
I know I can export from Excel to MySQL (with static fields) using the following syntax:
LOAD DATA LOCAL INFILE '\\path' INTO TABLE test FIELDS TERMINATED BY ',' ENCLOSED BY '"' LINES TERMINATED BY '\r\n'
but what if the fields are dynamic?
How can I programmatically know which row of the spreadsheet should go to which database table?
Can anyone help? Thanks.
How will you programmatically know which row of the spreadsheet should go to which database table? How about some example rows of the spreadsheet?
Assuming I'm understanding your question correctly,
I'd sort the spreadsheet and then do two imports, to the two db tables.
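Assuming the header row names the fields and every later row holds answers, the split into the two tables can be sketched like this in Python (the field_info_id numbering and the "text" field_type are assumptions):

```python
import csv
import io

def split_for_dynamic_tables(csv_text):
    """Turn a CSV with an arbitrary header into rows for the two tables
    described above: table A gets one row per header cell, table B one
    row per value, linked by the header cell's position as field_info_id."""
    reader = csv.reader(io.StringIO(csv_text))
    header = next(reader)
    field_information = [(i + 1, name, "text") for i, name in enumerate(header)]
    field_answers = [
        (i + 1, value)
        for row in reader if row
        for i, value in enumerate(row)
    ]
    return field_information, field_answers
```

Each returned list can then be fed to its own executemany() insert, one per table.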