Import multiple table data into MySQL using a single CSV file - mysql

I have this table in a CSV file.
Now I have two MySQL tables dealing with this CSV file:
data A, B, C has to be stored in Table1, whereas data D, E, F, G, H has to be stored in Table2.
Given the CSV file formatted as above, how can I upload its data to the MySQL database so that the same file provides input for both tables?

Try this; it works well. You can add as many values as needed, depending on the number of columns in your CSV file.
<?php
$fname = $_FILES['csv_file']['name'];
$chk_ext = explode(".", $fname);
$filename = $_FILES['csv_file']['tmp_name'];
$handle = fopen($filename, "r");
if (!$handle) {
    die('Cannot open file for reading');
}
while (($data = fgetcsv($handle, 10000, ",")) !== FALSE)
{
    // Columns A, B, C go into Table1 (column names are placeholders)
    $query = "INSERT INTO Table1 (colA, colB, colC)
              VALUES ('$data[0]', '$data[1]', '$data[2]')";
    mysql_query($query)
        or die(mysql_error());
    // Columns D, E, F, G, H go into Table2
    $query1 = "INSERT INTO Table2 (colD, colE, colF, colG, colH)
               VALUES ('$data[3]', '$data[4]', '$data[5]', '$data[6]', '$data[7]')";
    mysql_query($query1)
        or die(mysql_error());
}
fclose($handle);
?>

Probably only via a custom script (PHP or similar)...
But it's usually not hard to split the CSV into two files and import each into its own table using MySQL's LOAD DATA INFILE directly. That will be MUCH faster than using phpMyAdmin or similar scripts.
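For example, the splitting step could be a short PHP script like this (a sketch; the file names and the 3/5 column split are assumptions based on the question):
<?php
// Split source.csv: columns A-C go to one file, D-H to another
$in   = fopen("source.csv", "r");
$out1 = fopen("table1.csv", "w");
$out2 = fopen("table2.csv", "w");
while (($row = fgetcsv($in, 10000, ",")) !== FALSE) {
    fputcsv($out1, array_slice($row, 0, 3)); // A, B, C
    fputcsv($out2, array_slice($row, 3, 5)); // D, E, F, G, H
}
fclose($in);
fclose($out1);
fclose($out2);
?>
Each resulting file can then be imported with a single LOAD DATA INFILE statement per table.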

Load the data into a temporary table, then split it up into the two target tables.
Then drop the temporary table. I read somewhere that PHP can do this, but I cannot help with the PHP code.
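The SQL steps would look something like this (shown via PHP/mysqli for consistency with the rest of the thread; all table, column, and file names are assumptions, and LOAD DATA LOCAL INFILE must be enabled on both client and server):
<?php
$db = new mysqli("localhost", "user", "pass", "mydb");

// 1. Stage the whole CSV in a temporary table
$db->query("CREATE TEMPORARY TABLE staging (
                a VARCHAR(255), b VARCHAR(255), c VARCHAR(255),
                d VARCHAR(255), e VARCHAR(255), f VARCHAR(255),
                g VARCHAR(255), h VARCHAR(255))");
$db->query("LOAD DATA LOCAL INFILE '/path/to/file.csv'
            INTO TABLE staging
            FIELDS TERMINATED BY ',' LINES TERMINATED BY '\\n'");

// 2. Split the staged rows into the two real tables
$db->query("INSERT INTO Table1 (colA, colB, colC)
            SELECT a, b, c FROM staging");
$db->query("INSERT INTO Table2 (colD, colE, colF, colG, colH)
            SELECT d, e, f, g, h FROM staging");

// 3. Drop the staging table (it would also vanish when the connection closes)
$db->query("DROP TEMPORARY TABLE staging");
?>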

Related

Importing data from a VERY large text file into MySQL [duplicate]

I have a very large CSV file (150 MB). What is the best way to import it to MySQL?
I have to do some manipulation in PHP before inserting it into the MySQL table.
You could take a look at LOAD DATA INFILE in MySQL.
You might be able to do the manipulations once the data is loaded into MySQL, rather than first reading it into PHP. First store the raw data in a temporary table using LOAD DATA INFILE, then transform it into the target table using a statement like the following:
INSERT INTO targettable (x, y, z)
SELECT foo(x), bar(y), z
FROM temptable
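For completeness, the staging step itself can be a single statement; a sketch (the file path, delimiters, and header handling are assumptions):
<?php
mysql_query("LOAD DATA INFILE '/path/to/file.csv'
             INTO TABLE temptable
             FIELDS TERMINATED BY ','
             LINES TERMINATED BY '\\n'
             IGNORE 1 LINES") // skip the header row, if the file has one
    or die(mysql_error());
?>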
I would just open it with fopen and use fgetcsv to read each line into an array.
Pseudo-PHP follows:
mysql_connect("localhost", "user", "pass"); // connect to db
$filehandle = fopen("/path/to/file.csv", "r");
while (($data = fgetcsv($filehandle, 1000, ",")) !== FALSE) {
    // $data is an array holding one row's fields
    // do your parsing here and insert into the table
}
fclose($filehandle);

How to write a value stored in a variable to CSV in MySQL

MySQL: I want to write a timestamp stored in a variable to a CSV file, but I only know how to export the results of a query.
Here's how I do it (using my own SQL() function to retrieve the result set via fetch_all(MYSQLI_ASSOC), but you're clearly ok on that bit):
$out = fopen($file, 'w');
$rows = SQL($sqlstatement);
fputcsv($out, array_keys($rows[0]));
foreach($rows as $row) { fputcsv($out, $row); }
fclose($out);
The key is PHP's built-in fputcsv function. The first fputcsv call writes the column headings (what a fantastic language PHP is...); the calls in the loop write the data rows.
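If you don't have a helper like SQL(), the same idea works with plain mysqli (the connection details, query, and file name below are placeholders):
<?php
$db = new mysqli("localhost", "user", "pass", "mydb");
$rows = $db->query("SELECT id, created_at FROM events")->fetch_all(MYSQLI_ASSOC);

$out = fopen("export.csv", "w");
if ($rows) {
    fputcsv($out, array_keys($rows[0])); // header row from the column names
    foreach ($rows as $row) {
        fputcsv($out, $row);             // one CSV line per result row
    }
}
fclose($out);
?>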

Include spaces while reading CSV file

I have an old server running PHP 5.2. I want to migrate it to a server that uses PHP 5.4
One of my scripts is reading a csv file and the results are different.
I have a line like this in the CSV:
Id, Date, Description
On my old server this returns an array:
array('Id', 'Date', 'Description');
On my new server I get this:
array('Id', ' Date', ' Description');
This is causing bugs. Technically I could go through every row and trim the spaces, but I have files with about 500,000 lines, and adding even a simple extra step might slow down the code.
I was wondering, is there a way to make the new server behave like the old one (without downgrading PHP, obviously)?
EDIT: Here is the script itself:
$header = null; // initialized so the first row is recognized as the header
if (($handle = fopen($_FILES["csvfile"]["tmp_name"], 'r')) !== FALSE) {
    while (($row = fgetcsv($handle, 10000, ',')) !== FALSE) {
        if (!$header)
            $header = $row;
        else
            $filec[] = array_combine($header, $row);
    }
    fclose($handle);
}
Use one of the following:
Encode the CSV file as UTF-8
Preprocess the CSV file
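Preprocessing can be as cheap as trimming each row as it is read; a sketch based on the script from the question (array_map adds one linear pass per row, which is unlikely to dominate the I/O cost):
<?php
$header = null;
if (($handle = fopen($_FILES["csvfile"]["tmp_name"], 'r')) !== FALSE) {
    while (($row = fgetcsv($handle, 10000, ',')) !== FALSE) {
        $row = array_map('trim', $row); // strip the stray spaces around each field
        if (!$header)
            $header = $row;
        else
            $filec[] = array_combine($header, $row);
    }
    fclose($handle);
}
?>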

How to import an Excel file into a MySQL database using phpMyAdmin

I want to import an Excel sheet (.xls format) into a MySQL database through phpMyAdmin's import option. I understand that we need to convert it to CSV format first before we can import it into phpMyAdmin. But unfortunately, when I convert to CSV, some special characters or symbols become question marks (?) or other characters/symbols. Please advise me on this, as I am really new to phpMyAdmin.
Thanks
I have answered a similar question here: https://stackoverflow.com/a/16330428/1570901
If you are familiar with HTML and PHP, you can use the SimpleXLSX library and the script below to create your own Excel-to-MySQL import. It may take a few minutes to build, but once it's done you can use it for a lifetime.
// CREATE A HTML FORM TO UPLOAD THE EXCEL SHEET
// THEN CREATE A PHP SCRIPT LIKE BELOW
require 'simplexlsx.class.php';
if (isset($_FILES['Filedata'])) {
    $file = $_FILES['Filedata']['tmp_name']; // UPLOADED EXCEL FILE
    $xlsx = new SimpleXLSX($file);
    list($cols, $rows) = $xlsx->dimension();
    foreach ($xlsx->rows() as $k => $r) { // LOOP THROUGH EXCEL WORKSHEET
        $q  = "INSERT INTO TABLENAME (COL1, COL2) VALUES (";
        $q .= "'" . mysql_escape_string($r[0]) . "', "; // EXCEL DATA
        $q .= "'" . mysql_escape_string($r[1]) . "'";   // EXCEL DATA
        $q .= ")";
        $sql = mysql_query($q) or die(mysql_error());
    } // FOREACH ENDS HERE
} // IF ENDS HERE

How could a query fail to insert data into MySQL that is retrieved from the web?

I need to insert some data into MySQL. I am not sure whether I need to check the inputs or format/strip them before they can be inserted into the database fields, since results returned from the web may contain characters that MySQL does not accept (I think). I am having trouble inserting tweets into a MySQL table. The field type is varchar. This is the insert statement in the PHP script:
$json = $_POST['msg_top'];
$msg = json_decode($json);
foreach ($msg->entry as $status)
{
    $t = $status->content;
    $query = "INSERT INTO msg2(id,msg,msg_id,depth) VALUES ('','$t','ID','3')";
    mysql_query($query);
    if (!mysql_query($query, $dbh))
    { die('error:' . mysql_error()); }
}
Yes, it's very important to escape all values before using them in an SQL command.
$json = $_POST['msg_top'];
$msg = json_decode($json);
foreach ($msg->entry as $status) {
    // Escape the tweet so quotes and other special characters
    // can't break out of the SQL string
    $t = mysql_real_escape_string($status->content);
    $query = "INSERT INTO msg2(id,msg,msg_id,depth) VALUES ('','$t','ID','3')";
    // Run the query once (the original code executed it twice,
    // inserting every row two times)
    if (!mysql_query($query, $dbh)) {
        die('error:' . mysql_error());
    }
}
Also, other possible issues with your query:
If the id field is auto_increment'ing, you don't need it in the field or value list.
I may be missing something, but why are you using the string 'ID' for the msg_id field?
As for troubleshooting this, I'd recommend appending all of the $query strings to a log file for later inspection. Then, if the problem isn't readily apparent, you can manually run the statements against the database (e.g. via phpMyAdmin) and check any error codes from there.
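The logging itself can be a single line dropped in before each mysql_query call (the log path is an assumption; FILE_APPEND preserves earlier entries):
<?php
// Record every statement so failed inserts can be replayed by hand later
file_put_contents('/tmp/query.log', $query . "\n", FILE_APPEND);
?>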