Extract PDF from MySQL Dump Saved as Text

I have a MySQL database dump saved as a text file and am trying to find a way of extracting the individual PDF documents stored in one of the tables. All my research online so far has drawn a blank.
The data in the exported text file is in the following format:
DROP TABLE IF EXISTS `codocs`;
CREATE TABLE `codocs` (
  `ID` int(10) unsigned NOT NULL AUTO_INCREMENT,
  `COCODE` varchar(8) NOT NULL,
  `FILENAME` varchar(100) NOT NULL,
  `DATE` date NOT NULL,
  `USER` varchar(10) NOT NULL,
  `DOCUMENT` mediumblob NOT NULL,
  PRIMARY KEY (`ID`),
  KEY `oc` (`COCODE`)
) ENGINE=InnoDB DEFAULT CHARSET=latin1;
LOCK TABLES `codocs` WRITE;
/*!40000 ALTER TABLE `codocs` DISABLE KEYS */;
INSERT INTO `codocs` (`ID`, `COCODE`, `FILENAME`, `DATE`, `USER`, `DOCUMENT`)
VALUES
(1,'123456','document-2016-01-18.pdf','2016-01-21','user1',X'8CB7638C2840B32D3AB66DDBB66DDBB66DDBB6BDC7B66DDBB6B1C7336F9F736EDECD4DBE1FE7477752D555ABBB562A59D5A40A2262B48C74CC0450A48747734B04508C040C04F64656 …
D2495CC3D8C1FCB8845D1D6F6C717E5EFB493B431B1250782FFFC12FD518D0E4EBF951D3B98F3C7971C1235F54B793172A427FF0F'),
(2,'234567','document-2016-01-18.pdf','2016-01-22','user1',X'8CF763702E4EF02D0AC7B6ED64C7B66DDB7E62DBB6EDECD8C98E6DDBB66D3B797FE79C5BEFAD5BF5FF70AA66BAAA7B7AD674AD999A5A4DAE282A4EC744CF4204437E7038BB4804C344C448646F6C4504C3CB4B04C3A0EAE900206210317231B2B137FFCF57343207381331FF9 …
971C1235F54B793172A427FF0F'),
(3,'…
Any assistance would be greatly appreciated.
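For reference, each DOCUMENT value in such a dump is a MySQL hexadecimal literal (X'…'), so the files can in principle be recovered straight from the text without restoring the database. A rough sketch, under the assumptions that the whole dump fits in memory and each hex literal is complete (the excerpts above are truncated); 'dump.sql' and the output naming are placeholders:
<?php
// Rough sketch: pull each (FILENAME, X'...') pair out of the dump text and
// write the decoded bytes out as a file. Assumes one complete hex literal
// per record.
$dump = file_get_contents('dump.sql');
preg_match_all("/'([^']+\\.pdf)'[^X]*X'([0-9A-Fa-f]+)'/", $dump, $matches, PREG_SET_ORDER);
foreach ($matches as $i => $m) {
    $bytes = pack('H*', $m[2]);                    // hex literal back to raw bytes
    file_put_contents(($i + 1) . '-' . $m[1], $bytes);
}
If the decoded files still do not open, the stored bytes were probably not raw PDFs in the first place (see the note on the leading bytes further down).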
Update: 20220112
I have since restored the database from the SQL dump and have subsequently created the following PHP files to try to display the PDFs stored in the codocs table:
db.php - contains the MySQL database connection - this is working
records_list.php - lists all the records in the codocs table, including a button on each returned row to view the stored PDF - this is working
view_pdf.php - receives the ID of the record clicked in records_list.php, passes that ID to the SELECT statement, and displays the raw mediumblob data stored in the database for that record (presumably the correct one, as different data is returned for each row clicked in records_list.php) -
this is not working as intended
The following code is for the view_pdf.php file:
<?php
$pdf_id = $_REQUEST['pdfID'];
require_once "db.php";
if (isset($pdf_id)) {
    $myID = $pdf_id;
    $sql = "select * from codocs where ID='" . $myID . "'";
    if (!$result = mysqli_query($con, $sql)) {
        echo mysqli_error($con);
    } else {
        $row = mysqli_fetch_array($result);
        echo $row["DOCUMENT"];
        mysqli_close($con);
    }
}
?>
As mentioned, just the raw mediumblob data appears to be returned.
If the following line is replaced:
echo $row["DOCUMENT"];
with
echo '<object data="data:application/pdf;base64,'.base64_decode($row['DOCUMENT']).'" type="application/pdf" style="height:1000px;width:100%"></object>';
or
echo base64_decode($row['DOCUMENT']);
it makes no difference. Raw code continues to be returned.
If the original line of code referred to above is replaced with
header('Content-type: application/pdf');
echo $row["DOCUMENT"];
a downloadable pdf is offered and can be saved but is unreadable with the following warning: "This PDF document might not be displayed correctly." and the following error: "Unable to open document file:///...document.pdf. File type unknown (application/octet-stream) is not supported."
Can anyone advise how the code above can be amended to allow the retrieval of the stored pdf files?
Is the X that precedes the single quotation marks surrounding the mediumblob data in the SQL dump file of any significance?
Any assistance would be greatly appreciated.
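On the X: X'…' is MySQL's hexadecimal-literal notation for binary string values, so its presence is expected for a mediumblob column and is not itself the problem. More telling is that a genuine PDF always begins with the bytes %PDF (hex 25 50 44 46), while the literals in the dump begin 8C B7 …, which suggests the stored bytes are not a raw PDF. A quick check, reusing $row from view_pdf.php above:
// Sanity check on the stored blob: a raw PDF starts with "%PDF".
$magic = substr($row['DOCUMENT'], 0, 4);
if ($magic === '%PDF') {
    echo 'Looks like a raw PDF';
} else {
    echo 'Not a raw PDF; leading bytes: ' . bin2hex($magic);
}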
Further Update 20220112:
The following examples all produce unreadable PDFs, but generate 'pdf' files of differing sizes:
Record 554:
Using the following replacement code:
header('Content-type: application/pdf');
echo $row["DOCUMENT"];
generates an unreadable file 82.2 KB in size.
Using the following replacement code:
header('Content-type: application/pdf');
echo '<object data="data:application/pdf;base64,'.base64_decode($row['DOCUMENT']).'" type="application/pdf" style="height:1000px;width:100%"></object>';
generates an unreadable file 15.6 KB in size.
Using the following replacement code:
header('Content-type: application/pdf');
echo '<object data="data:application/pdf;base64,'.base64_encode($row['DOCUMENT']).'" type="application/pdf" style="height:1000px;width:100%"></object>';
generates an unreadable file 109.7 KB in size.
Any thoughts on helping to resolve the issue would be very welcome.
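For what it is worth, a leading byte of 0x8C is consistent with a raw DEFLATE stream (what PHP's gzdeflate() produces), though that is only a guess from the excerpt above. A hedged rewrite of view_pdf.php that tries that interpretation and sends proper headers; headers must go out before any other output, and the parameterized query also removes the injection risk in the original:
<?php
require_once "db.php";
// Sketch only: the gzdeflate() guess is based on the leading 0x8C byte seen
// in the dump, not a known fact. mysqli_stmt_get_result() needs the mysqlnd
// driver.
$stmt = mysqli_prepare($con, "SELECT FILENAME, DOCUMENT FROM codocs WHERE ID = ?");
mysqli_stmt_bind_param($stmt, 'i', $_REQUEST['pdfID']);
mysqli_stmt_execute($stmt);
$row = mysqli_fetch_assoc(mysqli_stmt_get_result($stmt));

$pdf = $row['DOCUMENT'];
if (substr($pdf, 0, 4) !== '%PDF') {
    $inflated = @gzinflate($pdf);          // try raw DEFLATE
    if ($inflated !== false && substr($inflated, 0, 4) === '%PDF') {
        $pdf = $inflated;
    }
}
header('Content-Type: application/pdf');
header('Content-Length: ' . strlen($pdf));
header('Content-Disposition: inline; filename="' . basename($row['FILENAME']) . '"');
echo $pdf;
If the %PDF check still fails after inflating, comparing bin2hex(substr($pdf, 0, 16)) against known file signatures would be the next diagnostic step.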

Related

Website Displays ?'s instead of Korean

I uploaded my website to the new server. It works perfectly on my test server at home, and the setup is no different; the databases were copied over word for word. But on the live site, anything in Korean displays as ??????. The database stores it correctly and the pages all have <meta charset="UTF-8">. I cannot figure out what I am missing.
EDIT: The text displays fine in the database when I use phpMyADMIN
With PDO (the PHP API), you need to set the charset: $conn->exec('SET CHARACTER SET utf8');.
PHP example:
<?php
//한국어/조선말
header('Content-Type: text/html; charset=utf8');
$username = 'user';
$password = 'password';
$host = 'domain';
$db = 'dbtest';
try {
    // note: MySQL's charset name is "utf8", not "utf-8"
    $conn = new PDO('mysql:host=' . $host . ';dbname=' . $db . ';charset=utf8', $username, $password);
    $conn->exec('SET CHARACTER SET utf8'); // this solves the problem
    $stmte = $conn->prepare('SELECT id, text FROM test LIMIT 10');
    $exec = $stmte->execute();
    if ($exec) {
        while ($reg = $stmte->fetch(PDO::FETCH_OBJ)) {
            echo 'id: ' . $reg->id . '<br />';
            echo 'text: ' . $reg->text . '<br /><hr />';
        }
    } else {
        echo 'Error SELECT';
    }
} catch (PDOException $e) {
    echo 'PDOException: ', $e->getMessage();
}
?>
MySQL example:
CREATE DATABASE `dbtest` DEFAULT CHARACTER SET utf8 COLLATE utf8_general_ci;
USE `dbtest`;
CREATE TABLE IF NOT EXISTS `test` (
    `id` int(11) NOT NULL AUTO_INCREMENT,
    `text` varchar(300) DEFAULT NULL,
    PRIMARY KEY (`id`)
) ENGINE=InnoDB DEFAULT CHARSET=utf8;
INSERT INTO `test` (`id`, `text`) VALUES (1, '한국어/조선말');
Use phpMyAdmin on your server to verify that the database is "utf8"; if it shows something else, that is the problem.
If your database is correct (verified as utf8), then the problem is in some PHP file.
To resolve it, save all PHP files (both the main files and the includes) as "UTF-8 without BOM", using Notepad++ for example.
Add at the top of your PHP files:
<?php
header('Content-Type: text/html; charset=utf8');
?>
Included files should also be saved as UTF-8 without BOM, for example:
<?php
include('YOUR FILE INCLUDED.php'); // save "YOUR FILE INCLUDED.php" as UTF-8 without BOM
?>
Maybe some page is in ANSI (for example "form.php").
Note: all PHP files must be saved as UTF-8 without BOM.
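An aside that is not part of the original answer: if any of the pages use mysqli rather than PDO, the equivalent of the charset fix is mysqli_set_charset():
<?php
// mysqli equivalent of the PDO charset fix above (credentials are the same
// hypothetical ones as in the PDO example).
$con = mysqli_connect('domain', 'user', 'password', 'dbtest');
mysqli_set_charset($con, 'utf8'); // same effect as SET CHARACTER SET utf8
?>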
Try adding lang attribute to your html tag
<html lang="ko">
The issue is most likely a difference in database collation settings between your home test server & your new remote server. Meaning that while your database was transferred correctly, the way that data is then spit out of the database is a whole other thing.
What is the collation of the database giving you the issue? By default, most MySQL installs set latin1_swedish_ci instead of utf8_general_ci for newly created databases.
Change the collation of the database & try again.
ALTER DATABASE [name of your database] CHARACTER SET utf8;
If this is a specific table, the collation can be changed as so:
ALTER TABLE [name of your table] CONVERT TO CHARACTER SET utf8;
And if it is a specific column in a table:
ALTER TABLE [name of your table] MODIFY [name of your column] [other settings] CHARACTER SET utf8 COLLATE utf8_general_ci;
Or perhaps you could export the current database, create a new database with this command & reimport the data:
CREATE DATABASE [name of your database] CHARACTER SET utf8 COLLATE utf8_general_ci;
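To see which character sets and collations are actually in effect before altering anything (a diagnostic aside, reusing the dbtest/test names from the earlier example):
SHOW VARIABLES LIKE 'character_set_%';  -- server/client/connection charsets
SHOW CREATE DATABASE `dbtest`;          -- database default charset
SHOW FULL COLUMNS FROM `test`;          -- per-column collations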
And if you want to make a permanent change to the MySQL install on the machine giving you an issue, go and edit my.cnf. The following would set the whole chain to UTF-8:
[client]
default-character-set=utf8
[mysql]
default-character-set=utf8
[mysqld]
collation-server = utf8_unicode_ci
init-connect='SET NAMES utf8'
character-set-server = utf8
EDIT: The original poster states that the connection & DB are all UTF-8 clean. But what about trying an edit to Apache's default character set? Open the character set file for Apache like so:
sudo nano /etc/apache2/conf.d/charset
And uncomment the line that looks like this:
#AddDefaultCharset UTF-8
So it looks like this:
AddDefaultCharset UTF-8
And restart Apache. This is not a great idea for a long-term setup in my humble opinion, but if it solves the issue it indicates there is something in your codebase that can be changed to achieve the same result without having to force Apache to force UTF-8.

CakePHP can't find table after creating a table

I create a table directly with a query; I only want to import some data. Therefore I execute a dynamically built query, and I try to execute this query in a Component class. (I use a random existing model to execute this query; is there a better way?)
$query= "CREATE TABLE IF NOT EXISTS testerdbs (
'Ü1' varchar(6) COLLATE utf8_swedish_ci DEFAULT NULL,
'Ü2' varchar(6) COLLATE utf8_swedish_ci DEFAULT NULL,
'Ü3' int(3) DEFAULT NULL,
'Ü4' varchar(6) COLLATE utf8_swedish_ci DEFAULT NULL,
'Ü5' date DEFAULT NULL
)"
$data = ClassRegistry::init('import_files');
$data->query($query);
This works fine.
In the same request i want to access the created table in the controller.
App::import('Model', "testerdb");
//$this->loadModel("testerdb");
$newTable = ClassRegistry::init("testerdb");
echo '<pre>', print_r($newTable->getColumnTypes()), '</pre>';
If I try to execute this in same request i always get the error:
Error: Table testerdbs for model testerdb was not found in datasource default.
If I do exactly the same request again, everything works fine...
I googled for about an hour and it seems that Cake caches the model. If I execute the request again, Cake re-caches all the tables and then finds my new table. So I hoped to load or import the created table within the same request, but it doesn't work.
Is there another way to load the table? Where is my mistake?
Thanks for help!
This might be a bit stale, but I just spent the last week trying to work around the problem and maybe this will help someone.
The root problem is that the cache of table names is initialized before you created the temporary table, so the 'setSource' function returns an error that the temporary table does not exist.
The solution is to override the 'setSource' function in the model that you are creating for 'testerdb' and remove the check on table existence (i.e. everything within the test if (method_exists($db, 'listSources'))).
Your model definition should look something like this:
App::uses('AppModel', 'Model');
class testerdb extends AppModel {
    public function setSource($tableName) {
        $this->setDataSource($this->useDbConfig);
        $db = ConnectionManager::getDataSource($this->useDbConfig);
        $this->table = $this->useTable = $tableName;
        $this->tableToModel[$this->table] = $this->alias;
        $this->schema();
    }
}
Many thanks to whoever posted the link below. This has worked with my CakePHP 2.0 instance.
http://web2.0goodies.com/blog/uncategorized/mysql-temporary-tables-and-cakephp-1-3/
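If overriding setSource feels heavy, a lighter alternative that may be enough (an assumption about CakePHP 2's model cache, not something tested in this thread) is to flush the cached metadata after creating the table, before initializing the model:
// After running the CREATE TABLE query, drop CakePHP 2's cached table
// metadata so the next ClassRegistry::init() sees the new table.
Cache::clear(false, '_cake_model_');  // cached table lists/schemas
ClassRegistry::flush();               // discard already-built model instances
$newTable = ClassRegistry::init('testerdb');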
Why would you only want to have a temporary table? I would just temporarily store whatever data you are importing in an in-memory model or data-structure.
If the table is not temporary, then just create it statically before you run your program.

DBI::mysql and File::Temp

I'm trying to load data into a MySQL database using the LOAD DATA LOCAL INFILE statement. On normal files, this works fine.
If I create a temporary file with File::Temp, store CSV data in it, close the file and then directly LOAD it into the database using
$dbh->do("LOAD DATA LOCAL INFILE '$tempfile' INTO TABLE $temptable" FIELDS TERMINATED BY ',');
the last two records are reproducibly omitted. However, if I do anything with the tempfile between creation and LOADing, for example with
`touch $tempfile`;
everything works as expected.
Is this an issue with the MySQL driver having trouble with freshly created tempfiles? Is it a filesystem (ext4) issue, maybe a cache flush not happening in time? Am I missing something here?
EDIT: Actually, all records are omitted if the temporary CSV file is not created by a format-converter subroutine, but by hand as shown below. I also included the code for the database interaction. Note the commented touch $tmpfh, which, when uncommented, would make the example work.
Adding UNLINK => 0 to File::Temp->new() does not make a difference.
my $tmpfh = File::Temp->new();
print $tmpfh <<EOT;
record1,textfield1
record2,textfield2
record3,textfield3
record4,textfield4
record5,textfield5
EOT
# `touch $tmpfh`; # uncomment this line to make it work
# get db handle
my $dbh = DBI->connect("DBI:mysql:$dbname:$dbserver", $username, $pwd);
# drop and recreate temp table
$dbh->do("DROP TABLE IF EXISTS $temptable") or die;
$dbh->do("CREATE TABLE $temptable (
`id` INT(11) NOT NULL PRIMARY KEY AUTO_INCREMENT,
`header` VARCHAR(255) NOT NULL,
`sequence` MEDIUMBLOB)")
or die;
# load data into temp table
my $nrecords = $dbh->do("LOAD DATA LOCAL INFILE '$tmpfh'
INTO TABLE $temptable
FIELDS TERMINATED BY ','
(header, sequence)")
or die;
$dbh->disconnect();
printf "Loaded %d records from %s into %s on %s.\n", $nrecords, $tmpfh, $dbname, $dbserver;
Close the file handle to flush the buffer. Keep the "UNLINK => 0" if you want the file to remain when the object goes out of scope.
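Concretely, under that diagnosis, the one missing line in the question's code would go right after the heredoc (a sketch; File::Temp objects stringify to their filename even after the handle is closed, and the file itself stays around until the object is destroyed):
close($tmpfh) or die "Cannot close temp file: $!"; # flush buffered rows to disk before LOAD DATA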

Help Importing an Excel File into MySQL using phpMyAdmin

I am uploading Excel files (.xls) containing numeric values such as 884.557 and 731.0547 into a MySQL database using phpMyAdmin's built-in Import function. However, I am having horrible rounding/truncation issues. For some reason, some values like 884.557 and 731.0547 are changed to 99.99999 or 9.99999. However, other values like 127.0947 are imported correctly. Can anyone help? If possible, I would still like to use the built-in phpMyAdmin Import function because it is useful.
If you are familiar with HTML and PHP, you can use the SimpleXLSX library to build your own Excel-to-MySQL import. It may take a few minutes to create, but once created you can use it for a lifetime.
CREATE A HTML FORM TO UPLOAD EXCEL SHEET
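The form itself is left to the reader; a minimal sketch (the field name Filedata matches the PHP script below, and the action URL is a placeholder):
<form action="import.php" method="post" enctype="multipart/form-data">
    <input type="file" name="Filedata" />
    <input type="submit" value="IMPORT" />
</form>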
THEN CREATE A PHP SCRIPT LIKE BELOW
require 'simplexlsx.class.php';
if (isset($_FILES['Filedata'])) {
    $file = $_FILES['Filedata']['tmp_name']; // UPLOADED EXCEL FILE
    $xlsx = new SimpleXLSX($file);
    list($cols, $rows) = $xlsx->dimension();
    foreach ($xlsx->rows() as $k => $r) { // LOOP THROUGH EXCEL WORKSHEET
        $q = "INSERT INTO TABLENAME (COL1, COL2) VALUES (";
        $q .= "'" . mysql_escape_string($r[0]) . "', "; // EXCEL DATA
        $q .= "'" . mysql_escape_string($r[1]) . "'";   // EXCEL DATA
        $q .= ")";
        $sql = mysql_query($q);
    } // FOREACH ENDS HERE
} // IF ENDS HERE
This is what I normally do:
1. Save the Excel file in CSV format.
2. Manually create the database table, specifying the data type for every column of interest.
3. Upload the CSV file into the selected table, ignoring the "column names" row since the columns were already defined in step 2.
Decimals are truncated because phpMyAdmin has some unexplained algorithm for determining the data type and the size allocated to a column. To prevent that, you create the table yourself as described in step 2.
Hope it helps!
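For step 2 above, defining the numeric columns explicitly as DECIMAL sidesteps phpMyAdmin's type guessing entirely; a sketch with hypothetical names, sized to hold values like 884.557 and 731.0547:
CREATE TABLE measurements (
    id INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
    reading DECIMAL(10,4) NOT NULL  -- holds 884.5570 and 731.0547 exactly
);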

Can I Import an updated structure into a MySQL table without losing its current content?

We use MySQL tables to which we add new fields from time to time as our product evolves.
I'm looking for a way to export the structure of the table from one copy of the db, to another, without erasing the contents of the table I'm importing to.
For example say I have copies A and B of a table, and I add fields X,Y,Z to table A. Is there a way to copy the changed structure (fields X,Y,Z) to table B while keeping its content intact?
I tried to use mysqldump, but it seems I can only copy the whole table with its content, overwriting the old one, or I can use the "-d" flag to avoid copying data (dumping structure only), but this will create an empty table when imported, again overwriting old data.
Is there any way to do what I need with mysqldump, or some other tool?
What I usually do is store each and every ALTER TABLE statement run on the development table(s), and apply them to the target table(s) whenever necessary.
There are more sophisticated ways to do this (like structure comparison tools and such), but I find this practice works well. Doing this on a manual step by step basis also helps prevent accidental alteration or destruction of data by structural changes that change a field's type or maximum length.
I just had the same problem and solved it this way:
Export the structure of the table to update.
Export the structure of the development table.
Run this code on the first file; "update.sql" needs to be changed according to your exported filename.
cat update.sql|awk -F / '{
if(match($0, "CREATE TABLE")) {
{ FS = "`" } ; table = $2
} else {
if(match($0," `")) {
gsub(",",";",$0)
print "ALTER TABLE `" table "` ADD" $0
}
}
}' > update_alter.sql
run the same command for the second file
cat development.sql|awk -F / '{
if(match($0, "CREATE TABLE")) {
{ FS = "`" } ; table = $2
} else {
if(match($0," `")) {
gsub(",",";",$0)
print "ALTER TABLE `" table "` ADD" $0
}
}
}' > development_alter.sql
run this command to find the differences in the output files
diff --changed-group-format='%<' --unchanged-group-format='' development_alter.sql update_alter.sql > update_db.sql
In the file update_db.sql there will now be the code you are looking for.
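To make the diff step concrete (hypothetical table and column names, not from the original answer): if only the development copy has a column `z`, development_alter.sql contains a line like the one below that update_alter.sql lacks, and the diff keeps exactly that line in update_db.sql:
ALTER TABLE `mytable` ADD `z` varchar(255) DEFAULT NULL;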
Lazy way: export your old data and structure, import your actual structure, then import only your old data. Worked for me in testing.
For your case, it might just need the following update:
alter table B add column x varchar(255);
alter table B add column y varchar(255);
alter table B add column z varchar(255);
update A, B
set
    B.x = A.x,
    B.y = A.y,
    B.z = A.z
where A.id = B.id; -- a key that exists on both tables
There is a handy way of doing this, but it needs a little editing in a text editor:
It takes about 10 minutes at most in Gedit under Linux!
Export your table & save it as localTable.sql.
Open it in a text editor (Gedit). You will see something like this:
CREATE TABLE IF NOT EXISTS `localTable` (
  `id` int(8) NOT NULL AUTO_INCREMENT,
  `date` int(10) NOT NULL,
  # Lots more fields .....
  # Other fields here
After that, just remove:
anything after the closing ) parenthesis,
the line CREATE TABLE IF NOT EXISTS `localTable` (,
and all key lines such as ADD PRIMARY KEY (`id`); and ADD KEY `created_by` (`created_by`);.
Then change every , at the end of a line to ; (i.e. ,\n to ;\n) so that each line can be executed on its own, and keep only the fields you are interested in.
You will have this:
`id` int(8) NOT NULL AUTO_INCREMENT;
`date` int(10) NOT NULL;
# Lots more fields .....
# Other fields here
Add ALTER TABLE `localTable` ADD to the beginning of each line:
ALTER TABLE `localTable` ADD `id` int(8) NOT NULL AUTO_INCREMENT;
ALTER TABLE `localTable` ADD `date` int(10) NOT NULL;
ALTER TABLE `localTable` ADD ... # and so on for each remaining field
That's it. We could turn this into an automated script with a small shell script to do the job.
Once you know what you have to do, import the result into the remote table ;)
Thanks
No, it isn't possible, because MySQL here is using the MariaDB version. In the MariaDB version the structure of a table is arranged in memory, and that memory is shared with your byte data.
So when we try to import a structure (or a table), it alters that whole memory block.