phpMyAdmin CSV import: how to skip error lines - mysql

I'm trying to import CSV files of about 3 GB into phpMyAdmin. Some of them contain extra terminating characters, and the import then stops because of the malformed fields.
I have two columns that I want to fill. I'm using : as the terminating character, but when a line contains more than one of them the import just stops. The files are too big to edit by hand. I want to skip the bad lines, or find some other solution. How can I do this?
The CSV files look like this:
ahoj123:dublin
cat:::dog
pes::lolko

As a solution to your problem, I have written a simple PHP script that will "fix" your file for you.
It will open "test.csv" with contents of:
ahoj123:dublin
cat:::dog
pes::lolko
and convert it to the following, saving the result to "fixed_test.csv":
ahoj123:dublin
cat:dog
pes:lolko
Bear in mind that I am basing this on your example, so I am letting $last keep its EOL character, since there is no reason to remove or edit it.
PHP file:
<?php
$filename = "test.csv";
$handle = fopen($filename, "r") or die("Could not open $filename" . PHP_EOL);

$new_filename = "fixed_test.csv";
$new_handle = fopen($new_filename, "w") or die("Could not open $new_filename" . PHP_EOL);

// Stream line by line and write as we go, so a 3 GB file
// never has to be held in memory all at once.
while (!feof($handle)) {
    $line = fgets($handle);
    if ($line === false) {
        break;
    }
    // Keep only the first and last :-separated fields, dropping
    // whatever the stray colons in between produced.
    $elements = explode(':', $line);
    $first = $elements[0];
    $last = $elements[count($elements) - 1];
    fwrite($new_handle, "$first:$last");
}

fclose($handle);
fclose($new_handle);
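If you then want to load the fixed file without going through the phpMyAdmin import screen, you can do it from PHP as well. A minimal sketch using PDO; the DSN, the credentials, and the users(name, city) table are placeholder assumptions, not from the question:
<?php
// Hypothetical connection details and target table; adjust to your schema.
$pdo = new PDO('mysql:host=localhost;dbname=test', 'user', 'password');
$stmt = $pdo->prepare('INSERT INTO users (name, city) VALUES (?, ?)');

$handle = fopen('fixed_test.csv', 'r');
while (($line = fgets($handle)) !== false) {
    $parts = explode(':', rtrim($line, "\r\n"));
    if (count($parts) === 2) { // skip any line that still looks malformed
        $stmt->execute($parts);
    }
}
fclose($handle);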

Related

How to extract row data from a csv file using perl?

I have a CSV file like:
Genome Name,Resistance_phenotype,Amikacin,Gentamycin,Aztreonam
AB1,,Susceptible,Resistant,Resistant
AB2,,Susceptible,Susceptible,Susceptible
AB3,,Resistant,Resistant,NA
I need to fill the 2nd column, i.e. Resistance_phenotype, with MDR, XDR or Susceptible, for which I have to match the antibiotic resistance profile: if, as in the first data row, Gentamycin and Aztreonam are both Resistant, the 2nd column should be filled with MDR, and if, as in the second data row, all 3 are Susceptible, the 2nd column of that row should be filled with Susceptible.
I have written the code below, which only displays columns of the CSV file. I am stuck on what to do next.
#!/usr/bin/perl
use strict;
use warnings;

my $file = 'text.csv';
my @data;
open(my $fh, '<', $file) or die "Can't read file '$file' [$!]\n";
while (my $line = <$fh>) {
    chomp $line;
    my @fields = split(/,/, $line);
    print $fields[0], "\n";
    #print $fields[1], "\n";
}
close $fh;
The desired output is:
Genome Name,Resistance_phenotype,Amikacin,Gentamycin,Aztreonam
AB1,MDR,Susceptible,Resistant,Resistant
AB2,Susceptible,Susceptible,Susceptible,Susceptible
AB3,MDR,Resistant,Resistant,NA
Use the Text::CSV_XS module. Read a line, assign the right value to that column, then print it again. In your sample code you were only printing one column instead of all of them; the module will handle all of that for you:
use Text::CSV_XS;
my $csv = Text::CSV_XS->new;
# replace *DATA and *STDOUT with whatever filehandles you want
# to read then write.
while( my $row = $csv->getline(*DATA) ) {
$row->[1] = 'Some value';
$csv->say( *STDOUT, $row );
}
__DATA__
Genome Name,Resistance_phenotype,Amikacin,Gentamycin,Aztreonam
AB1,,Susceptible,Resistant,Resistant
AB2,,Susceptible,Susceptible,Susceptible
AB3,,Resistant,Resistant,NA
The output is:
"Genome Name","Some value",Amikacin,Gentamycin,Aztreonam
AB1,"Some value",Susceptible,Resistant,Resistant
AB2,"Some value",Susceptible,Susceptible,Susceptible
AB3,"Some value",Resistant,Resistant,NA

How to import mocked data from an Excel file into a MySQL database table?

I am not very familiar with databases, and I have the following situation: I am working with MySQL and I have to import some data mocked up in a Microsoft Excel file.
In this file the cells of the first row represent the table's fields (each cell is a field), and the cells of the rows below contain the values for those fields.
At first I thought about developing a Java program that accesses the Excel file, parses it and populates my DB. But this solution is unsustainable because I have a lot of Excel files (each file contains the mocked data for a specific table).
Can I directly use an Excel file (or a version converted to .csv) to populate a MySQL table? Is there an easy way to do it?
If so, what precautions should be taken into account?
MySQL query (if you have admin rights), which loads the CSV straight into the table and skips the header row:
LOAD DATA INFILE "/mydata.csv"
INTO TABLE test
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY "\n"
IGNORE 1 LINES;
Or an easy PHP script
$result = $db_con->query('SELECT * FROM `some_table`');
if (!$result) die('Couldn\'t fetch records');
$num_fields = mysql_num_fields($result);
$headers = array();
for ($i = 0; $i < $num_fields; $i++) {
$headers[] = mysql_field_name($result , $i);
}
$fp = fopen('php://output', 'w');
if ($fp && $result) {
header('Content-Type: text/csv');
header('Content-Disposition: attachment; filename="export.csv"');
header('Pragma: no-cache');
header('Expires: 0');
fputcsv($fp, $headers);
while ($row = $result->fetch_array(MYSQLI_NUM)) {
fputcsv($fp, array_values($row));
}
die;
}
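As for precautions: save each Excel sheet as plain .csv (UTF-8 if you have non-ASCII data), make sure the column order in the file matches the table definition, and keep in mind that LOAD DATA INFILE is restricted by the server's secure_file_priv setting and requires the FILE privilege; LOAD DATA LOCAL INFILE reads the file from the client machine instead, but has to be enabled on both client and server.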

Yii2. Not able to save big file to BLOB database field

I use the following code in order to save an uploaded file:
$file = UploadedFile::getInstance($model, 'uploadedFile'); // get the uploaded file
$content = file_get_contents($file->tempName);
$model->content = $content;
$model->save();
With the code above I can save files up to approximately 1 MB, but larger files throw an error after $model->save():
PDOStatement::execute(): MySQL server has gone away
I use the mediumblob column type. What could be the problem?
The problem was max_allowed_packet = 1M inside my.ini. Raising it to something larger than the biggest file you expect to store (for example max_allowed_packet = 64M under the [mysqld] section) and restarting MySQL fixes the error; mediumblob itself can hold up to 16 MB.
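If you want to check what the server currently allows before changing anything, you can read the variable from PHP; a minimal sketch, assuming a mysqli connection in $db (hypothetical handle name):
$result = $db->query("SHOW VARIABLES LIKE 'max_allowed_packet'");
$row = $result->fetch_assoc();
// the value is reported in bytes
echo $row['Variable_name'] . ' = ' . $row['Value'];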

Linux - Breaking up a large JSON file into chunks to minify

I am working with a 6.0 MB JSON file that is being used with about 100 other scripts on a server that will soon be set up. I wish to compress the file by deleting all of the extra spaces, tabs, returns, etc., but all of the sources I've found for compressing the file can't handle the file's size (it's around 108,000 lines of code). I need to break the file up in a way that it will be easy to reassemble once each chunk has been compressed. Does anyone know how to break it up in an efficient way? Help would be much appreciated!
Because Python could already handle the large file, I ended up using IPython and writing a .py script that dumps the JSON without the extra whitespace. To use this script, one would type:
$ ipython -i compression_script.py
This is the code within compression_script.py:
import json

filename = raw_input('Enter the file you wish to compress: ')  # file we want to compress
newname = 'compressed_' + filename  # by default the new file is 'compressed_' + filename

with open(filename) as fp:
    jload = json.load(fp)

# indent=None plus the compact separators strips all insignificant whitespace
newfile = json.dumps(jload, indent=None, separators=(',', ':'))

with open(newname, 'w') as f:
    f.write(newfile)

print('Compression complete! Type quit to exit IPython')
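(The same script works on Python 3 if you swap raw_input for input.)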
You can do it in PHP as well, like this:
$myfile = fopen("newfile.txt", "w") or die("Unable to open file!");
$handle = fopen("somehugefile.json", "r");
if ($handle) {
    while (!feof($handle)) {
        $buffer = fgets($handle, 5096);
        // strip line breaks and tabs; spaces are left alone because
        // they may be significant inside JSON string values
        $buffer = str_replace(array("\r", "\n", "\t"), "", $buffer);
        fwrite($myfile, $buffer);
    }
    fclose($handle);
    fclose($myfile);
}
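For what it's worth, at 6 MB you may not need to chunk at all on the PHP side either: json_decode and json_encode know where whitespace is significant, so a straight round-trip minifies safely (a minimal sketch, reusing the file names above):
$data = json_decode(file_get_contents('somehugefile.json'));
// json_encode emits compact JSON (no extra whitespace) by default
file_put_contents('newfile.txt', json_encode($data));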

Populate HTML table with remote files

I want to create an HTML page with a table that populates itself with info from two .txt files that live on a remote Linux server.
Alternatively, I could populate an HTML page on that remote server with the same info from those two .txt files and then access that page through Apache's web server.
Something as basic as possible would be nice, though I understand it may be complicated to do with HTML alone.
Honestly, any help at all would be appreciated.
I would personally do it in PHP. You can read the file and echo it into a table, and then use the lines of the file for anything you want. All you have to do is change $filepath to point at your text file.
Edit: I've updated the code to handle the constraints the OP mentioned in the comments. There is probably a more optimized way of performing the task, but this works and should introduce some new concepts if you are new to PHP:
<?php
$filepath = 'files/the_file.txt';
if (file_exists($filepath)) {
$file = fopen($filepath, 'r');
echo '<table border=1>';
while (!feof($file)) {
$line = fgets($file);
$first_char = $line[0];
if ($first_char != '*' && $first_char != '^' && trim($line) != '') {
if (strstr($line, '|')) {
$split = explode('|', $line);
echo '<tr>';
foreach($split as $line) {
echo '<td>'.$line.'</td>';
}
echo '</tr>';
} else {
echo '<tr><td>'.$line.'</td></tr>';
}
}
}
echo '</table>';
} else {
echo 'the file does not exist';
}
?>
I'll do my best to explain it line by line instead of flooding the script with comments:
set your file path
If the file exists, continue on. If not, throw the error located at the bottom of the script
open the file
create the table ('<table>')
while the text file is being read, do a series of things: first, get the line. If the first character of the line is a * or ^, or if the line is empty after trimming, skip it completely. Otherwise, continue on
if the line contains a | character, split (explode) the line at each |. For each piece of the resulting array, echo out a new cell in the current row with that content. Otherwise, no | was found and you can just echo the whole line into a row normally
once you are finished up, end the table ('</table>')
Edit #2: The original solution I posted:
<?php
$filepath = '/var/www/files/the_file.txt';
if (file_exists($filepath)) {
$file = fopen($filepath, 'r');
echo '<table border=1>';
while (!feof($file)) {
$line = fgets($file);
echo '<tr><td>'.$line.'</td></tr>';
}
echo '</table>';
} else {
echo 'the file does not exist';
}
?>
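One note on the "remote" part: as used above, file_exists() and fopen() read local paths, but fopen() also accepts a URL when allow_url_fopen is enabled in php.ini, so if the .txt files live on another server you can read them the same way. A minimal sketch, with a placeholder host name:
<?php
// Hypothetical URL; requires allow_url_fopen = On in php.ini.
$filepath = 'http://remote-server.example.com/files/the_file.txt';
// file_exists() does not understand URLs, so test the handle instead.
$file = @fopen($filepath, 'r');
if ($file) {
    while (($line = fgets($file)) !== false) {
        echo htmlspecialchars($line), '<br>';
    }
    fclose($file);
} else {
    echo 'could not reach the remote file';
}
?>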
HTML can't do anything by itself; HTML is a presentation format.
PHP, JavaScript or Bash could do the job, in very different ways:
PHP: the server reads the 2 remote files, assembles the HTML, and sends the resulting page to the client
JavaScript: the page itself fetches the 2 files and inserts them into itself
Bash + curl: a Bash (or PHP, Python...) script creates a .html file containing the data of the 2 files
One of these might help you, if you can precreate the HTML rather than doing it dynamically. These scripts take CSV as input and output an HTML table:
http://stromberg.dnsalias.org/svn/to-table/
http://stromberg.dnsalias.org/svn/to-table2/