I have an XML file like this:
test.xml
<?xml version="1.0" encoding="utf-8" ?>
<plugin name="tree">
<title>Test</title>
<description>some description</description>
<files>
<file>test.tmp</file>
</files>
<install><![CDATA[
global $test;
]]></install>
<hooks>
<hook name="hookname"><![CDATA[
global $local;
]]></hook>
</hooks>
<phrases>
<phrase key="category"><![CDATA[Show categories]]></phrase>
</phrases>
</plugin>
and I would like to import it into a MySQL table, 'mytable':
CREATE TABLE mytable (
  plugin varchar(255),
  title varchar(255),
  description varchar(255),
  file varchar(255),
  install varchar(255),
  hook varchar(255),
  phrase varchar(255)
);
I used the command below:
LOAD XML LOCAL INFILE 'test.xml'
INTO TABLE mytable(plugin,title,description,file,install,hook,phrase);
It runs successfully, but with 0 rows:
The query has been successfully implemented, 0 rows have been
affected.
Thank you
Include the clause ROWS IDENTIFIED BY '<plugin>'. With that, your query should look like:
LOAD XML LOCAL INFILE "D:\\test.xml"
INTO TABLE mytable
ROWS IDENTIFIED BY '<plugin>';
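For context: when ROWS IDENTIFIED BY is omitted, LOAD XML falls back to looking for &lt;row&gt; elements, and this file contains none, which is why zero rows were imported. A quick illustration of that mismatch, sketched with Python's ElementTree (not MySQL's actual parser, just the shape of the rule):

```python
import xml.etree.ElementTree as ET

xml_text = """<?xml version="1.0" encoding="utf-8" ?>
<plugin name="tree">
  <title>Test</title>
</plugin>"""

root = ET.fromstring(xml_text)
# LOAD XML's default row tag is <row>; this document contains none,
# so the default scan matches nothing.
default_rows = root.findall(".//row")
print(len(default_rows))     # 0 -> nothing to import
# Naming <plugin> as the row tag matches the document's root element instead.
print(root.tag == "plugin")  # True
```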
It looks like your XML file's structure is not correct, so even though one row gets inserted, not all of the values get extracted (they remain NULL).
Make the small changes below.
Create the table structure:
CREATE TABLE mytable (
plugin_name varchar(255),
title varchar(255),
description varchar(255),
`file` varchar(255),
`install` varchar(255),
hook varchar(255),
phrase varchar(255));
Change your XML file to:
<?xml version="1.0" encoding="utf-8" ?>
<plugin plugin_name="tree">
<title>Test</title>
<description>some description</description>
<file>test.tmp</file>
<install><![CDATA[
global $test;
]]></install>
<hook name="hookname"><![CDATA[
global $local;
]]></hook>
<phrase key="category"><![CDATA[Show categories]]></phrase>
</plugin>
Now if you use:
LOAD XML LOCAL INFILE "D:\\test.xml"
INTO TABLE mytable
ROWS IDENTIFIED BY '<plugin>';
all the data gets extracted fine.
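The reason this layout works: LOAD XML maps both the attributes and the direct child elements of each row element onto columns of the same name. A rough illustration of that mapping on a trimmed-down row, using Python's ElementTree (illustration only, not MySQL's implementation):

```python
import xml.etree.ElementTree as ET

row = ET.fromstring(
    '<plugin plugin_name="tree">'
    '  <title>Test</title>'
    '  <file>test.tmp</file>'
    '</plugin>'
)

# Attributes and child elements both become column values keyed by name.
values = dict(row.attrib)
for child in row:
    values[child.tag] = (child.text or "").strip()
print(values)  # {'plugin_name': 'tree', 'title': 'Test', 'file': 'test.tmp'}
```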
Besides the answer given above, I just wanted to add that the database fields' character casing must be the same as the XML fields' casing; otherwise LOAD XML won't be able to import those specific columns.
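The casing point is easy to trip over because XML tag names are case-sensitive even though MySQL column names usually are not. Any XML parser shows the tag side of it; a tiny sketch (not MySQL's actual matching code):

```python
import xml.etree.ElementTree as ET

row = ET.fromstring("<plugin><title>Test</title></plugin>")
print(row.findtext("title"))  # Test
print(row.findtext("Title"))  # None -- XML tag matching is case-sensitive
```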
Related
I have a MySQL database dump saved as a text file and am trying to find a way of extracting the individual PDF documents stored in one of the tables. All my research online so far has drawn a blank.
The data in the exported text file is in the following format:
DROP TABLE IF EXISTS `codocs`;
CREATE TABLE `codocs` (
`ID` int(10) unsigned NOT NULL AUTO_INCREMENT,
`COCODE` varchar(8) NOT NULL,
`FILENAME` varchar(100) NOT NULL,
`DATE` date NOT NULL,
`USER` varchar(10) NOT NULL,
`DOCUMENT` mediumblob NOT NULL,
PRIMARY KEY (`ID`),
KEY `oc` (`COCODE`)
) ENGINE=InnoDB DEFAULT CHARSET=latin1;
LOCK TABLES `codocs` WRITE;
/*!40000 ALTER TABLE `codocs` DISABLE KEYS */;
INSERT INTO `codocs` (`ID`, `COCODE`, `FILENAME`, `DATE`, `USER`, `DOCUMENT`)
VALUES
(1,'123456','document-2016-01-18.pdf','2016-01-21','user1',X'8CB7638C2840B32D3AB66DDBB66DDBB66DDBB6BDC7B66DDBB6B1C7336F9F736EDECD4DBE1FE7477752D555ABBB562A59D5A40A2262B48C74CC0450A48747734B04508C040C04F64656 …
D2495CC3D8C1FCB8845D1D6F6C717E5EFB493B431B1250782FFFC12FD518D0E4EBF951D3B98F3C7971C1235F54B793172A427FF0F'),
(2,'234567','document-2016-01-18.pdf','2016-01-22','user1',X'8CF763702E4EF02D0AC7B6ED64C7B66DDB7E62DBB6EDECD8C98E6DDBB66D3B797FE79C5BEFAD5BF5FF70AA66BAAA7B7AD674AD999A5A4DAE282A4EC744CF4204437E7038BB4804C344C448646F6C4504C3CB4B04C3A0EAE900206210317231B2B137FFCF57343207381331FF9 …
971C1235F54B793172A427FF0F'),
(3,'…
Any assistance would be greatly appreciated.
Update: 20220112
I have since restored the database from the sql dump and have subsequently created the following php files to try to display the pdfs stored in the codocs table:
db.php - contains the mysql database connection - this is working
records_list.php - lists all the records in the codocs table including a button on each returned row to view the stored pdf - this is working
view_pdf.php - receives the ID of the record clicked in records_list.php, passes that ID to the SELECT statement, and displays the raw mediumblob data stored in the database (presumably the correct record, since different data is returned for each row clicked in records_list.php) -
this is not working as intended
The following code is for the view_pdf.php file:
<?php
$pdf_id = $_REQUEST['pdfID'];
require_once "db.php";
if (isset($pdf_id)) {
    $myID = $pdf_id;
    $sql = "select * from codocs where ID='" . $myID . "'";
    if (!$result = mysqli_query($con, $sql)) {
        echo mysqli_error($con);
    } else {
        $row = mysqli_fetch_array($result);
        echo $row["DOCUMENT"];
        mysqli_close($con);
    }
}
?>
As mentioned, just the raw mediumblob data appears to be returned.
If the following line is replaced:
echo $row["DOCUMENT"];
with
echo '<object data="data:application/pdf;base64,'.base64_decode($row['DOCUMENT']).'" type="application/pdf" style="height:1000px;width:100%"></object>';
or
echo base64_decode($row['DOCUMENT']);
it makes no difference. Raw code continues to be returned.
If the original line of code referred to above is replaced with
header('Content-type: application/pdf');
echo $row["DOCUMENT"];
a downloadable pdf is offered and can be saved but is unreadable with the following warning: "This PDF document might not be displayed correctly." and the following error: "Unable to open document file:///...document.pdf. File type unknown (application/octet-stream) is not supported."
Can anyone advise how the code above can be amended to allow the retrieval of the stored pdf files?
Is the X that precedes the single quotation marks surrounding the mediumblob data in the SQL dump file of any significance?
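On the X question: X'…' is MySQL's hexadecimal-literal syntax, i.e. the raw bytes of the BLOB written out as hex digits, so a dump value can be turned back into the original bytes with a plain hex decode. A minimal sketch (the literal below is a made-up, shortened 8-byte example, not data from the dump):

```python
# X'...' in a MySQL dump is a hex literal: every two hex digits are one byte
# of the original BLOB.
blob_literal = "X'255044462D312E34'"   # hypothetical, shortened value
hex_digits = blob_literal[2:-1]        # strip the leading X' and trailing '
data = bytes.fromhex(hex_digits)
print(data)  # b'%PDF-1.4' -- a real PDF begins with this magic header
```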
Any assistance would be greatly appreciated.
Further Update 20220112:
The following replacements all generate unreadable 'pdf' files, but of differing sizes:
Record 554:
Using the following replacement code:
header('Content-type: application/pdf');
echo $row["DOCUMENT"];
generates an unreadable file 82.2Kb in size.
Using the following replacement code:
header('Content-type: application/pdf');
echo '<object data="data:application/pdf;base64,'.base64_decode($row['DOCUMENT']).'" type="application/pdf" style="height:1000px;width:100%"></object>';
generates an unreadable file 15.6Kb in size.
Using the following replacement code:
header('Content-type: application/pdf');
echo '<object data="data:application/pdf;base64,'.base64_encode($row['DOCUMENT']).'" type="application/pdf" style="height:1000px;width:100%"></object>';
generates an unreadable file 109.7Kb in size.
Any thoughts on helping to resolve the issue would be very welcome.
I am using a Gradle Spring project with Liquibase. To run Liquibase I run the JAR created by compiling the project. I am trying to use the Liquibase "includeAll" tag in an XML changelog to run all formatted SQL changelog scripts inside a directory I've called includeAllScriptsTest (currently containing only one .sql file named test.sql with one changeset).
If I try to use includeAll in my master db-changelog file, at run time liquibase returns the error: file:///.../conf/db/db.changelog-master.xml/ is not a recognized file type.
In an attempt to get around this, I reference another XML file, includeAll-changelog.xml, from my db-changelog-master.xml. This file contains the includeAll tag; its contents are below.
<?xml version="1.0" encoding="utf-8"?>
<databaseChangeLog
xmlns="https://www.liquibase.org/xml/ns/dbchangelog"
xmlns:xsi="https://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://www.liquibase.org/xml/ns/dbchangelog conf/xsd/dbchangelog-3.8.xsd">
<includeAll path="***/includeAllScriptsTest/" relativeToChangelogFile="true"/>
</databaseChangeLog>
Inside the folder includeAllScriptsTest is a single file called test.sql, with the contents as below:
--liquibase formatted sql
--changeset author:1 dbms:MySQL splitStatements:true endDelimiter://
DROP PROCEDURE IF EXISTS `TESTINCLUDEALL`;//
CREATE PROCEDURE `TESTINCLUDEALL`()
BEGIN
SELECT * FROM TABLE;
END;//
However, this gives me a different error at run time: cvc-elt.1.a: Cannot find the declaration of element 'databaseChangeLog'. I've found conflicting information online regarding whether it is possible to use includeAll with the version of Liquibase used by my project. I'm currently using 3.4.1.
cvc-elt.1.a: Cannot find the declaration of element
'databaseChangeLog'
This is an XML parsing error, indicating that databaseChangeLog is not declared in the XML schema. Note that your changelog declares the namespaces with https:// URLs; the Liquibase namespace URI is http://www.liquibase.org/xml/ns/dbchangelog, and namespace URIs must match the schema exactly.
Maybe you can try using the following XSD in your changelog:
<?xml version="1.0" encoding="UTF-8"?>
<databaseChangeLog xmlns="http://www.liquibase.org/xml/ns/dbchangelog"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://www.liquibase.org/xml/ns/dbchangelog
http://www.liquibase.org/xml/ns/dbchangelog/dbchangelog-3.8.xsd">
<includeAll path="***/includeAllScriptsTest/" relativeToChangelogFile="true"/>
</databaseChangeLog>
If this still doesn't help, please have a look at a few points in the answer on this post.
I have a simple XML file. I need to read the data and save it to a MySQL database table (1 or 2 tables). The file is like the following:
<?xml version="1.0" encoding="utf-8" ?>
<rss version="2.0" xmlns:g="http://">
<myfile>
<title><![CDATA[All data]]></title>
<stock>
<name><![CDATA[my name]]></name>
<qty><![CDATA[0]]></qty>
<price><![CDATA[4.99]]></price>
<image><![CDATA[http://fashiondropshippers.com/media/catalog/product/i/m/image_463.jpg]]></image>
</stock>
</myfile>
</rss>
I am trying to do that in Symfony 4 using the Crawler component. The code in my controller is the following:
$crawler = new Crawler();
$crawler->addContent(file_get_contents('http://localhost/XML/myxml.xml'));

foreach ($crawler as $domElement) {
    var_dump($domElement->nodeValue);
}

return new JsonResponse($domElement->nodeValue);
It displays the data, with errors. Now I need to save that data into MySQL database tables. Could you please tell me how to proceed further?
Many thanks in advance !
OK. If you want to save data to the DB you have to do the following:
Add Doctrine to your project: https://symfony.com/doc/current/doctrine.html#installing-doctrine
Create an entity: https://symfony.com/doc/current/doctrine.html#creating-an-entity-class
Make migrations: https://symfony.com/doc/current/doctrine.html#migrations-creating-the-database-tables-schema and execute them
Save your data to DB https://symfony.com/doc/current/doctrine.html#persisting-objects-to-the-database
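Independently of the framework wiring above, the extraction step is just walking the &lt;stock&gt; elements and reading their children. Sketched here in plain Python with ElementTree, only to show the shape of the data you would hand to your entity (field names taken from the sample XML):

```python
import xml.etree.ElementTree as ET

xml_text = """<rss version="2.0">
<myfile>
  <title><![CDATA[All data]]></title>
  <stock>
    <name><![CDATA[my name]]></name>
    <qty><![CDATA[0]]></qty>
    <price><![CDATA[4.99]]></price>
  </stock>
</myfile>
</rss>"""

root = ET.fromstring(xml_text)
# One dict per <stock> element; these would become entity objects in Symfony.
stocks = [
    {
        "name": s.findtext("name"),
        "qty": int(s.findtext("qty")),
        "price": float(s.findtext("price")),
    }
    for s in root.iter("stock")
]
print(stocks)  # [{'name': 'my name', 'qty': 0, 'price': 4.99}]
```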
I'm struggling to parse XML into MySQL. The alternative of parsing the XML to CSV is not feasible, as xmlstarlet does not appear to be available for AIX 7.1.0.0.
Investigating the MySQL Reference Manual, I realized that the XML I'm dealing with is not fully supported. I have four different types of XML files. Let's take one as an example.
<MovementReport version="0100">
<ControlArea>
<Sender>
<Division>WCS1</Division>
<Confirmation>2</Confirmation>
</Sender>
<CreationDateTime>2018-04-17T15:39:32Z</CreationDateTime>
<RefId>
<Id>6897731</Id>
</RefId>
</ControlArea>
<DataArea>
<RequestId>080030603</RequestId>
<FromLocation>
<MHA>ID1</MHA>
<Rack></Rack>
<X></X>
<Y></Y>
</FromLocation>
<StUnit>
<StUnitId>M1813236 </StUnitId>
</StUnit>
<ToLocation>
<MHA>A</MHA>
<Rack>011</Rack>
<X>065</X>
<Y>019</Y>
</ToLocation>
<ReasonCode>00</ReasonCode>
<StandAloneFlag>W</StandAloneFlag>
<Information>No info!</Information>
</DataArea>
</MovementReport>
I have to use ROWS IDENTIFIED BY in order to have some columns populated. I tried almost all tags in the above command and came up with the following SQL:
USE xml_lcs; TRUNCATE TEST01;
LOAD XML LOCAL INFILE '33770626.xml'
INTO TABLE TEST01 ROWS IDENTIFIED BY '<DataArea>'
SET N_ID='A';
LOAD XML LOCAL INFILE '33770626.xml'
INTO TABLE TEST01 ROWS IDENTIFIED BY '<ToLocation>'
(@MHA, @Rack, @X, @Y)
SET t_MHA=@MHA, t_Rack=@Rack, t_X=@X, t_Y=@Y;
LOAD XML LOCAL INFILE '33770626.xml'
INTO TABLE TEST01 ROWS IDENTIFIED BY '<StUnit>'
SET N_ID='A';
LOAD XML LOCAL INFILE '33770626.xml'
INTO TABLE TEST01 ROWS IDENTIFIED BY '<FromLocation>'
(@MHA, @Rack, @X, @Y)
SET f_MHA=@MHA, f_Rack=@Rack, f_X=@X, f_Y=@Y;
LOAD XML LOCAL INFILE '33770626.xml'
INTO TABLE TEST01 ROWS IDENTIFIED BY '<Sender>'
SET N_ID='A';
LOAD XML LOCAL INFILE '33770626.xml'
INTO TABLE TEST01 ROWS IDENTIFIED BY '<RefId>'
SET N_ID='A';
LOAD XML LOCAL INFILE '33770626.xml'
INTO TABLE TEST01 ROWS IDENTIFIED BY '<ReasonCode>'
SET N_ID='A';
The above SQL code results in the following table.
All columns are VARCHAR.
I would like to have one row for each file, so at the end the above XML file would result in..
Any idea how to achieve that ?
Thanks a lot for your time and help.
Ema
You should use another language to parse the file and insert the data. But if you want a MySQL-only solution, you can use your query to store the data in a temporary table, then use an aggregation query to combine the data into one row and copy it into the real table.
First create the temporary table:
CREATE TEMPORARY TABLE TEMP01 LIKE TEST01;
Use your code to load the data from XML to the temporary table. Change TEST01 to TEMP01.
After that copy the data with:
INSERT INTO TEST01 (N_ID, Division, RequestId, ... , StUnitId)
SELECT N_ID
, MIN(Division)
, MIN(RequestId)
...
, MIN(StUnitId)
FROM TEMP01
GROUP BY N_ID;
Since you have only one distinct value per column, it doesn't matter if you use MIN or MAX here. In MySQL 5.7 you can also use ANY_VALUE instead.
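What the GROUP BY/MIN() pass does can be mimicked outside SQL: each LOAD XML pass fills different columns of different rows, and MIN() per column simply picks the one non-NULL value in each group. A toy illustration (column names echo the ones above):

```python
# Rows as they land in TEMP01: each LOAD XML pass populated different columns.
rows = [
    {"N_ID": "A", "Division": "WCS1", "t_Rack": None},
    {"N_ID": "A", "Division": None,   "t_Rack": "011"},
]

# MIN() over a group ignores NULLs, so per column it returns the single
# non-NULL value -- collapsing the sparse rows into one.
merged = {
    col: min(r[col] for r in rows if r[col] is not None)
    for col in rows[0]
}
print(merged)  # {'N_ID': 'A', 'Division': 'WCS1', 't_Rack': '011'}
```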
You also have the option to use functions like LOAD_FILE() and ExtractValue(). Be aware of the necessary privileges.
Example:
SELECT LOAD_FILE('/path/to/file/33770626.xml') INTO @`xml`;
INSERT INTO `TEST01`
SELECT
NULLIF(TRIM(ExtractValue(@`xml`, 'MovementReport/DataArea/ToLocation/MHA')), ''),
NULLIF(TRIM(ExtractValue(@`xml`, 'MovementReport/ControlArea/Sender/Division')), ''),
NULLIF(TRIM(ExtractValue(@`xml`, 'MovementReport/ControlArea/RefId/Id')), ''),
NULLIF(TRIM(ExtractValue(@`xml`, 'MovementReport/DataArea/FromLocation/Rack')), ''),
.
.
.
NULLIF(ExtractValue(@`xml`, 'MovementReport/DataArea/StUnit/StUnitId'), '');
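ExtractValue() takes an XPath expression rooted at the document element, so the same paths can be checked outside MySQL before running the INSERT, e.g. with Python's ElementTree against a trimmed copy of the sample document (illustration only; ElementTree paths are relative to the root element, hence the root name is dropped):

```python
import xml.etree.ElementTree as ET

doc = ET.fromstring(
    "<MovementReport>"
    "<ControlArea><Sender><Division>WCS1</Division></Sender></ControlArea>"
    "<DataArea><ToLocation><MHA>A</MHA></ToLocation></DataArea>"
    "</MovementReport>"
)

# Same paths as in the ExtractValue() calls, minus the leading root name.
print(doc.findtext("DataArea/ToLocation/MHA"))      # A
print(doc.findtext("ControlArea/Sender/Division"))  # WCS1
```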
I have the following XML file and I need to load it into MySQL using LOAD XML LOCAL INFILE.
I'm using the following line:
LOAD XML LOCAL INFILE 'C:/_BORBA_/xml/importxml.xml' INTO TABLE importxml ROWS IDENTIFIED BY '<ide>'
If I declare "ide" as the delimiter, I can import with success, but I have to import all tags (ide, info and proc) as one record row.
How can I make it explicit that I have 3 groups of information in my XML file?
<?xml version="1.0" encoding="utf-8"?>
<infNFe versao="3.10" Id="NFe351710006">
<ide>
<cUF>35</cUF>
<cNF>99999</cNF>
<natOp>Venda de terceiros</natOp>
<indPag>1</indPag>
<mod>55</mod>
<serie>1</serie>
<nNF>888888</nNF>
<dhEmi>2017-10-26T16:50:52-02:00</dhEmi>
</ide>
<info>
<tpNF>1</tpNF>
<idDest>2</idDest>
<cMunFG>3525904</cMunFG>
<tpImp>1</tpImp>
<tpEmis>1</tpEmis>
</info>
<proc>
<cDV>8</cDV>
<tpAmb>1</tpAmb>
<finNFe>1</finNFe>
<indFinal>0</indFinal>
<indPres>9</indPres>
<procEmi>3</procEmi>
<verProc>008</verProc>
<CNPJ>99999888844</CNPJ>
</proc>
</infNFe>
Thank you for any help.
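Since the three groups are siblings under a single &lt;infNFe&gt;, one workable approach is to flatten them into one record outside of LOAD XML and then issue an ordinary INSERT. A hedged sketch in Python (tag names come from the sample file, trimmed here; everything else is hypothetical):

```python
import xml.etree.ElementTree as ET

xml_text = """<infNFe versao="3.10" Id="NFe351710006">
  <ide><cUF>35</cUF><nNF>888888</nNF></ide>
  <info><tpNF>1</tpNF><idDest>2</idDest></info>
  <proc><cDV>8</cDV><CNPJ>99999888844</CNPJ></proc>
</infNFe>"""

root = ET.fromstring(xml_text)
# Merge the children of <ide>, <info> and <proc> into one flat record,
# ready to feed a single-row INSERT.
record = {}
for group in ("ide", "info", "proc"):
    for field in root.find(group):
        record[field.tag] = field.text
print(record)
# {'cUF': '35', 'nNF': '888888', 'tpNF': '1', 'idDest': '2', 'cDV': '8', 'CNPJ': '99999888844'}
```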