I want to import an Excel sheet (.xls) into a MySQL database through phpMyAdmin's import option. I understand that we need to convert the file to CSV format first before we can import it into phpMyAdmin. But unfortunately, when I convert to CSV, some special characters or symbols become question marks (?) or other different characters/symbols. Please advise me on this, as I am really new to phpMyAdmin.
Thanks
I have answered a similar question here: https://stackoverflow.com/a/16330428/1570901
If you are familiar with HTML and PHP, you can create your own Excel-to-MySQL import using the SimpleXLSX library and a short script. It may take a few minutes to create, but once you create it you can use it for a lifetime.
// CREATE AN HTML FORM TO UPLOAD THE EXCEL SHEET
// THEN CREATE A PHP SCRIPT LIKE BELOW
require 'simplexlsx.class.php';
if (isset($_FILES['Filedata'])) {
    $file = $_FILES['Filedata']['tmp_name']; // UPLOADED EXCEL FILE
    $xlsx = new SimpleXLSX($file);
    list($cols, $rows) = $xlsx->dimension();
    foreach ($xlsx->rows() as $k => $r) { // LOOP THROUGH EXCEL WORKSHEET
        $q  = "INSERT INTO TABLENAME (COL1, COL2) VALUES (";
        $q .= "'" . mysql_escape_string($r[0]) . "', "; // EXCEL DATA
        $q .= "'" . mysql_escape_string($r[1]) . "'"; // EXCEL DATA
        $q .= ")";
        $sql = mysql_query($q);
    } // FOREACH ENDS HERE
} // IF ENDS HERE
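For completeness, the upload form mentioned in the first comment can be as simple as the sketch below (import.php is an assumed name for the script above; the Filedata field name matches the script, and enctype="multipart/form-data" is required for file uploads to work):
<form action="import.php" method="post" enctype="multipart/form-data">
    <input type="file" name="Filedata">
    <input type="submit" value="Import">
</form>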
MySQL: I want to write a timestamp stored in a variable to a CSV file, and I only know how to write out a query result.
Here's how I do it (using my own SQL() function to retrieve the result set via fetch_all(MYSQLI_ASSOC), but you're clearly ok on that bit):
$out = fopen($file, 'w');
$rows = SQL($sqlstatement);
fputcsv($out, array_keys($rows[0]));
foreach($rows as $row) { fputcsv($out, $row); }
fclose($out);
The key is the built-in PHP fputcsv function. The first use of fputcsv writes the column headings (what a fantastic language PHP is...), the others (in the loop) write the rows.
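For the timestamp-in-a-variable part of the question, fputcsv just takes an array, so a minimal sketch (with $timestamp standing in for however the variable was obtained) would write it as its own row before closing the file:
$timestamp = date('Y-m-d H:i:s'); // or whatever value is already in the variable
fputcsv($out, array($timestamp)); // a one-column row holding the timestamp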
I have a query:
"SELECT Time, Date, Name, Email FROM table"
It converts the results to JSON to be passed via AJAX. The problem is I want a new column in the SQL, so I add it to the query:
"SELECT Time, Date, Name, Email, Address FROM table"
Now the JSON encode does not work. I have tried changing data types and using UTF-8, but this did not work; none of the other columns use UTF-8 but still work anyway. Thanks.
This is my code to encode to JSON, which does work until I add the new column from SQL:
$myArray = array(); // initialise the output array before pushing rows onto it
if ($result = $mysqli->query($query)) {
    while ($row = $result->fetch_object()) {
        array_push($myArray, $row);
    }
    echo json_encode($myArray);
}
Solved
The problem was that the last column I was trying to get was called "Show". SHOW is a reserved keyword in MySQL, which is why the query broke when that column was added. I renamed the column to "lol" (temporarily) and it works!
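If renaming the column is not an option, quoting the reserved identifier with backticks in the query should also work; a minimal sketch:
$query = "SELECT Time, Date, Name, Email, `Show` FROM table";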
I have an old server running PHP 5.2. I want to migrate it to a server that uses PHP 5.4.
One of my scripts reads a CSV file, and the results are different.
I have a line like this in the CSV:
Id, Date, Description
On my old server this returns an array:
array('Id', 'Date', 'Description');
On my new server I get this:
array('Id', ' Date', ' Description');
Which is causing bugs. Now technically I could go into every row and trim the spaces, but I have files with about 500,000 lines, and adding even a simple processing step might slow down the code.
I was wondering, is there a way to make the new server act like the old one? (Without downgrading PHP, obviously.)
EDIT: Here is the script itself:
$header = null; // so the first row is recognised as the header
$filec = array();
if (($handle = fopen($_FILES["csvfile"]["tmp_name"], 'r')) !== FALSE) {
    while (($row = fgetcsv($handle, 10000, ',')) !== FALSE) {
        if (!$header)
            $header = $row;
        else
            $filec[] = array_combine($header, $row);
    }
    fclose($handle);
}
Use one of the following:
- Encode the CSV file as UTF-8
- Preprocess the CSV file (a sketch follows below)
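For the preprocessing option, a minimal sketch, assuming the stray spaces only ever sit at the edges of fields (trim is applied to every value in every row):
while (($row = fgetcsv($handle, 10000, ',')) !== FALSE) {
    $row = array_map('trim', $row); // strip the surrounding whitespace PHP 5.4 now preserves
    if (!$header)
        $header = $row;
    else
        $filec[] = array_combine($header, $row);
}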
I'm working on a bilingual (English & Traditional Chinese) website with the content stored in a db. I usually export the tables as CSV, input the data in bulk, and then re-import it back into the table. The Chinese characters display both in the db and on the website. However, whenever I export tables with Traditional Chinese characters, they turn into question marks.
I've tried changing the collation of the entire table as well as individual columns into various settings (big, binary, utf8, etc) but nothing seems to work. I've also tried using character sets in the export interface but it doesn't fix the problem either.
Is this a problem with phpMyAdmin, or are there some settings that could fix this? Your help would be much appreciated.
I kinda solved this problem of exporting a table containing Chinese as CSV, though not really with phpMyAdmin. You can use this export script to export the table that contains Chinese, and it'll display properly:
<?php
/*
* PHP code to export MySQL data to CSV
* http://salman-w.blogspot.com/2009/07/export-mysql-data-to-csv-using-php.html
*
* Sends the result of a MySQL query as a CSV file for download
*/
/*
* establish database connection
*/
$conn = mysql_connect('MYSQL_HOST', 'MYSQL_USERNAME', 'MYSQL_PASSWORD') or die(mysql_error());
mysql_select_db('MYSQL_DATABASE', $conn) or die(mysql_error($conn));
mysql_query("SET character_set_results=utf8", $conn);
/*
* execute sql query
*/
$query = sprintf('SELECT * FROM MYSQL_TABLE');
$result = mysql_query($query, $conn) or die(mysql_error($conn));
/*
* send response headers to the browser
* following headers instruct the browser to treat the data as a csv file called export.csv
*/
header('Content-Type: text/csv');
header('Content-Disposition: attachment;filename=export.csv');
echo "\xEF\xBB\xBF";
/*
 * output header row (if at least one row exists)
 */
$row = mysql_fetch_assoc($result);
if ($row) {
    echocsv(array_keys($row));
}
/*
 * output data rows (if at least one row exists)
 */
while ($row) {
    echocsv($row);
    $row = mysql_fetch_assoc($result);
}
/*
* echo the input array as csv data maintaining consistency with most CSV implementations
* - uses double-quotes as enclosure when necessary
* - uses double double-quotes to escape double-quotes
* - uses CRLF as a line separator
*/
function echocsv($fields)
{
    $separator = '';
    foreach ($fields as $field) {
        if (preg_match('/\\r|\\n|,|"/', $field)) {
            $field = '"' . str_replace('"', '""', $field) . '"';
        }
        echo $separator . $field;
        $separator = ',';
    }
    echo "\r\n";
}
?>
This code was copied and modified from the following sources:
http://salman-w.blogspot.hk/2009/07/export-mysql-data-to-csv-using-php.html
When exported to CSV, Russian characters won't display
Turns out the problem wasn't in the CSV file but with Excel. No matter what the encoding of the CSV file is, Excel will open it using the system's default ANSI code page unless the file starts with a byte order mark, which screws up the Chinese. The above code will:
- export the CSV as UTF-8, thanks to mysql_query("SET character_set_results=utf8", $conn);
- force Excel to open the CSV as UTF-8, thanks to echo "\xEF\xBB\xBF";
Although this kinda solves the problem, it'd be great if someone could figure out how to do this through phpMyAdmin without the need for a custom export script.
I have this table in a CSV file.
Now I have two MySQL tables dealing with this CSV file: data A, B, C has to be stored in Table1, whereas data D, E, F, G, H has to be stored in Table2.
Given the CSV file formatted as above, how can I upload its data to the MySQL database so that the same file can feed input into the different tables?
Try this; it works well. You can add as many values as needed, depending on the number of columns in the CSV file.
<?php
$fname = $_FILES['csv_file']['name'];
$chk_ext = explode(".", $fname);
$filename = $_FILES['csv_file']['tmp_name'];
$handle = fopen($filename, "r");
if (!$handle) {
    die('Cannot open file for reading');
}
while (($data = fgetcsv($handle, 10000, ",")) !== FALSE) {
    // columns A, B, C go to the first table
    $query = "INSERT INTO Table1 (colA, colB, colC)
        VALUES ('$data[0]', '$data[1]', '$data[2]')";
    mysql_query($query)
        or die(mysql_error());
    // columns D, E, F, G, H go to the second table
    $query1 = "INSERT INTO Table2 (colD, colE, colF, colG, colH)
        VALUES ('$data[3]', '$data[4]', '$data[5]', '$data[6]', '$data[7]')";
    mysql_query($query1)
        or die(mysql_error());
}
fclose($handle);
?>
Probably only via a custom script (PHP or so)...
But it's usually not hard to split the CSV into two files and import both into separate tables using MySQL's LOAD DATA INFILE statement directly. It will be MUCH faster than using phpMyAdmin or similar scripts. A minimal sketch follows.
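Something like this, where the file path, table, and column names are placeholders, and the delimiters would need to match the actual CSV (LOAD DATA INFILE also requires the FILE privilege and a path readable by the MySQL server):
LOAD DATA INFILE '/path/to/part1.csv'
INTO TABLE Table1
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES
(colA, colB, colC);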
Load the data into a temporary table, then split it up into the two tables, and drop the temporary table afterwards. I read somewhere that PHP can do this, but I cannot help with the PHP code.
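In plain SQL, a sketch of that staging approach might look like this (the staging table, its column types, and the target column names are invented for illustration):
CREATE TEMPORARY TABLE staging (
    A VARCHAR(255), B VARCHAR(255), C VARCHAR(255), D VARCHAR(255),
    E VARCHAR(255), F VARCHAR(255), G VARCHAR(255), H VARCHAR(255)
);
LOAD DATA INFILE '/path/to/file.csv'
INTO TABLE staging
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES;
INSERT INTO Table1 (colA, colB, colC) SELECT A, B, C FROM staging;
INSERT INTO Table2 (colD, colE, colF, colG, colH) SELECT D, E, F, G, H FROM staging;
DROP TEMPORARY TABLE staging;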