Include spaces while reading CSV file - csv

I have an old server running PHP 5.2 and I want to migrate to a server that uses PHP 5.4.
One of my scripts reads a CSV file, and the results differ between the two servers.
I have a line like this in the CSV:
Id, Date, Description
On my old server this returns an array:
array('Id', 'Date', 'Description');
On my new server I get this:
array('Id', ' Date', ' Description');
This is causing bugs. Technically I could go through every row and trim the spaces, but my files have about 500,000 lines, and even a simple extra step might slow down the code.
Is there a way to make the new server behave like the old one (without downgrading PHP, obviously)?
EDIT: Here is the script itself:
$header = null;
$filec = array();
if (($handle = fopen($_FILES["csvfile"]["tmp_name"], 'r')) !== FALSE) {
    while (($row = fgetcsv($handle, 10000, ',')) !== FALSE) {
        if (!$header) {
            $header = $row;                          // first row holds the column names
        } else {
            $filec[] = array_combine($header, $row); // key each data row by the header
        }
    }
    fclose($handle);
}

Use one of the following:
Encode the CSV file as UTF-8.
Preprocess the CSV file, trimming each field as you read it (see the sketch below).
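If preprocessing is acceptable, the trimming can be folded into the loop that already reads the file, so it only adds one array_map() call per row. A minimal sketch based on the script from the question (the trim call is my addition; everything else is the original code):
$header = null;
$filec = array();
if (($handle = fopen($_FILES["csvfile"]["tmp_name"], 'r')) !== FALSE) {
    while (($row = fgetcsv($handle, 10000, ',')) !== FALSE) {
        $row = array_map('trim', $row); // strip leading/trailing spaces, restoring the old behaviour
        if (!$header) {
            $header = $row;
        } else {
            $filec[] = array_combine($header, $row);
        }
    }
    fclose($handle);
}
array_map('trim', $row) is a single call into a built-in function per row, so the overhead should be small compared with the cost of parsing the line in fgetcsv itself, even at 500,000 lines.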

Related

CSV import problem since converting to PHP 8.1

I have the following WordPress function that worked in PHP 7. Since converting to 8.1, it's not working.
function dropdown_handler() {
    $output = drop_function();
    //send back text to replace shortcode in post
    return $output;
}
function drop_function() {
    //get the csv file with amounts
    if ($file_handle = fopen("wp-content/plugins/drop/amounts.csv", "r")) {
        while (!feof($file_handle)) {
            $lines[] = fgetcsv($file_handle, 1024);
        }
        fclose($file_handle);
        $lines = str_replace("Â£", "£", $lines);
    }
    else {
        echo "Sorry, something went wrong";
    }
In my error log I'm seeing "PHP Warning: Array to string conversion in" relating to the $lines = str_replace line but I think there's something wrong with the fopen statement.
Basically, the word Array is being stored in the $lines variable rather than the contents of the CSV file.
Any ideas please?
Your code was always broken, it's just broken in a slightly more obvious way than it used to be...
$lines[] = fgetcsv($file_handle, 1024);
fgetcsv, unless it fails, returns an array; you then add this array as a new item to another array, $lines. The result is an array of arrays, like this:
$lines = [
    ['line 1 first item', 'line 1 second item'],
    ['line 2 first item', 'line 2 second item'],
];
Later, you pass this whole array to str_replace; but str_replace only knows how to deal with a single dimension of array.
So this works:
$singleLine = ['line 1 first item', 'line 1 second item'];
var_dump(str_replace('item', 'ITEM', $singleLine));
But this doesn't:
var_dump(str_replace('item', 'ITEM', $lines));
Running that example on multiple versions of PHP reveals that under PHP 7.x, str_replace reacted by simply leaving the inner arrays untouched - in other words, it did nothing.
In PHP 8, it instead tries to turn each inner array into a string, issuing the warning and producing the word "Array" (which will then have any substitutions applied to it).
The fix for both PHP versions is to run the str_replace on each of the inner arrays, most simply by using array_map:
var_dump(
    array_map(
        fn($innerArray) => str_replace('item', 'ITEM', $innerArray),
        $lines
    )
);
Alternatively, you can just delete the str_replace line completely, since you were apparently happy enough when it wasn't actually doing anything.
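Applied to the loop from the question, that fix might look roughly like this (a sketch only: the file path and replacement strings are the ones from the question, and the array_filter line is an extra precaution I've added because the feof-style loop leaves one false entry at the end):
if ($file_handle = fopen("wp-content/plugins/drop/amounts.csv", "r")) {
    while (!feof($file_handle)) {
        $lines[] = fgetcsv($file_handle, 1024);
    }
    fclose($file_handle);
    $lines = array_filter($lines);                // drop the false that fgetcsv returns at end of file
    $lines = array_map(
        fn($row) => str_replace("Â£", "£", $row), // run the replacement on each row of fields
        $lines
    );
}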

How to write value stored in a variable to CSV in MySQL

MySQL: I want to write a timestamp stored in a variable to a CSV file and I only know how to store a query.
Here's how I do it (using my own SQL() function to retrieve the result set via fetch_all(MYSQLI_ASSOC), but you're clearly ok on that bit):
$out = fopen($file, 'w');
$rows = SQL($sqlstatement);
fputcsv($out, array_keys($rows[0]));
foreach($rows as $row) { fputcsv($out, $row); }
fclose($out);
The key is the built-in PHP fputcsv function. The first use of fputcsv writes the column headings (what a fantastic language PHP is...), the others (in the loop) write the rows.
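If the value you want in the CSV is already sitting in a PHP variable (the timestamp in your case), you don't need a result set at all: fputcsv will write any array you hand it. A minimal sketch (the variable name, column heading, and file name here are made up for illustration):
$timestamp = date('Y-m-d H:i:s');   // or whatever value your variable already holds
$out = fopen('export.csv', 'w');
fputcsv($out, array('timestamp'));  // heading row
fputcsv($out, array($timestamp));   // data row built from the variable
fclose($out);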

How to import Excel file into mysql database using phpmyadmin

I want to import an Excel sheet (.xls format) into a MySQL database through phpMyAdmin's import option. I understand that we need to convert it to CSV format first before we can import it into phpMyAdmin. But unfortunately, when I convert to CSV, some special characters or symbols become question marks (?) or other different characters/symbols. Please advise me on this as I am really new to phpMyAdmin.
Thanks
I have answered a similar question here: https://stackoverflow.com/a/16330428/1570901
If you are familiar with HTML and PHP, you can use the simple SimpleXLSX library and the script below to create your own Excel-to-MySQL import. It may take a few minutes to create, but once you have it you can use it for a lifetime.
// CREATE A HTML FORM TO UPLOAD EXCEL SHEET
// THEN CREATE A PHP SCRIPT LIKE BELOW
require 'simplexlsx.class.php';
if (isset($_FILES['Filedata'])) {
    $file = $_FILES['Filedata']['tmp_name']; // UPLOADED EXCEL FILE
    $xlsx = new SimpleXLSX($file);
    list($cols, $rows) = $xlsx->dimension();
    foreach ($xlsx->rows() as $k => $r) { // LOOP THROUGH EXCEL WORKSHEET
        $q  = "INSERT INTO TABLENAME (COL1, COL2) VALUES (";
        $q .= "'" . mysql_escape_string($r[0]) . "', "; // EXCEL DATA
        $q .= "'" . mysql_escape_string($r[1]) . "'";   // EXCEL DATA
        $q .= ")";
        $sql = mysql_query($q);
    } // FOREACH ENDS HERE
} // IF ENDS HERE
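Note that the mysql_* functions in that answer were removed in PHP 7, so on a current server the inserts would need to go through mysqli or PDO instead. A rough equivalent of the insert loop using a mysqli prepared statement (the connection details and table/column names are placeholders):
$mysqli = new mysqli('localhost', 'db_user', 'db_pass', 'db_name'); // placeholder credentials
$stmt = $mysqli->prepare("INSERT INTO TABLENAME (COL1, COL2) VALUES (?, ?)");
foreach ($xlsx->rows() as $r) {            // same SimpleXLSX loop as above
    $stmt->bind_param('ss', $r[0], $r[1]); // bound parameters replace mysql_escape_string
    $stmt->execute();
}
$stmt->close();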

perl script to create xml from mysql query - out of memory

I need to generate an XML file from database records, and I get the error "out of memory". Here's the script I am using; I found it on Google, but it's not suitable for me and it also exhausts the server's allocated memory. It's a start, though.
#!/usr/bin/perl
use warnings;
use strict;
use XML::Simple;
use DBI;
my $dbh = DBI->connect('DBI:mysql:db_name;host=host_address','db_user','db_pass')
or die DBI->errstr;
# Get an array of hashes
my $recs = $dbh->selectall_arrayref('SELECT * FROM my_table',{ Columns => {} });
# Convert to XML where each hash element becomes an XML element
my $xml = XMLout( {record => $recs}, NoAttr => 1 );
print $xml;
$dbh->disconnect;
This script only prints the records, because I tested with a where clause for a single row id.
First of all, I couldn't manage to make it save the output to a file.xml.
Second, I need somehow to split the "job" in multiple jobs and then put together the XML file all in one piece.
I have no idea how to achieve both.
Constraint: No access to change server settings.
These are the problem lines:
my $recs = $dbh->selectall_arrayref('SELECT * FROM my_table',{ Columns => {} });
This reads the whole table into memory, representing every single row as an array of values.
my $xml = XMLout( {record => $recs}, NoAttr => 1 );
This is probably an even larger structure: it is the whole XML string built in one go.
The lowest memory-use solution needs to involve loading the table one item at a time, and printing that item out immediately. In DBI, it is possible to make a query so that you fetch one row at a time in a loop.
You will need to play with this before the result looks like your intended output (I haven't tried to match your XML::Simple output; I'm leaving that to you):
print "<records>\n";
my $sth = $dbh->prepare('SELECT * FROM my_table');
$sth->execute;
while ( my $row = $sth->fetchrow_arrayref ) {
    # Convert db row to XML row
    print XMLout( { row => $row }, NoAttr => 1 ), "\n";
}
print "</records>\n";
Perl can use e.g. open( FILEHANDLE, mode, filename ) to open a file and print FILEHANDLE $string to write to it; or you could simply run your script and redirect its output to a file, e.g. perl myscript.pl > table.xml
It's the SELECT * with no constraints that will be killing your memory. Add some constraint to your query (e.g. a date or id range) and use a loop to execute the query and write your output in chunks. That way you won't need to load the whole table into memory before you get started on the output.

import multiple table data into mysql using single csv file

I have this table in a CSV file.
I have two MySQL tables dealing with this CSV file:
data A, B, C has to be stored in Table1, whereas data D, E, F, G, H has to be stored in Table2.
Given this CSV file, how can I upload its data to the MySQL database,
so that input for different tables can be done from the same file?
Try this; it works well. You can add as many values as needed, depending on the number of columns you have in the CSV file.
<?php
$fname = $_FILES['csv_file']['name'];
$chk_ext = explode(".", $fname);
$filename = $_FILES['csv_file']['tmp_name'];
$handle = fopen($filename, "r");
if (!$handle) {
    die('Cannot open file for reading');
}
while (($data = fgetcsv($handle, 10000, ",")) !== FALSE)
{
    // first table takes the first columns of the row
    $query = "INSERT INTO table1 (col1_csv, col2_csv)
        values ('$data[0]', '$data[1]')";
    mysql_query($query)
        or die(mysql_error());
    // second table takes the remaining columns
    $query1 = "INSERT INTO table2 (col3_csv, col4_csv)
        values ('$data[2]', '$data[3]')";
    mysql_query($query1)
        or die(mysql_error());
}
fclose($handle);
?>
Probably only via a custom script (PHP or so)...
But it's usually not hard to split the CSV into 2 files and import both into separate tables using MySQL's LOAD DATA INFILE statement directly. It will be MUCH faster than using phpMyAdmin or similar scripts.
Load the data into a temporary table, then split it up into the two tables, and finally drop the temporary table. I read somewhere that PHP can do this, but I cannot help with the PHP code; a rough sketch of the idea is below.
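A sketch of that staging-table approach using PHP and mysqli (the staging table, the column names a to h, the file path, and the connection details are all placeholders; Table1/Table2 are the tables from the question, and LOAD DATA LOCAL INFILE must be enabled on both the client and the server):
$mysqli = mysqli_init();
$mysqli->options(MYSQLI_OPT_LOCAL_INFILE, true);                      // allow LOAD DATA LOCAL INFILE from this client
$mysqli->real_connect('localhost', 'db_user', 'db_pass', 'db_name'); // placeholder credentials
$mysqli->query("CREATE TEMPORARY TABLE staging (
    a VARCHAR(255), b VARCHAR(255), c VARCHAR(255),
    d VARCHAR(255), e VARCHAR(255), f VARCHAR(255), g VARCHAR(255), h VARCHAR(255))");
// Load the whole CSV into the staging table in one pass
$mysqli->query("LOAD DATA LOCAL INFILE '/path/to/file.csv'
    INTO TABLE staging
    FIELDS TERMINATED BY ',' LINES TERMINATED BY '\\n'");
// Split the staged rows into the two destination tables
$mysqli->query("INSERT INTO Table1 (a, b, c) SELECT a, b, c FROM staging");
$mysqli->query("INSERT INTO Table2 (d, e, f, g, h) SELECT d, e, f, g, h FROM staging");
$mysqli->query("DROP TEMPORARY TABLE staging");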