MySQL to XML file

I am trying to get a MySQL database into an XML file; here is my code:
<?php
header("Content-type: text/xml");
include 'dbc.php';

$query = "SELECT * FROM airports LIMIT 50";
$result = mysql_query($query, $link)
    or die('Error querying database.');

$xml = new SimpleXMLElement('<xml/>');
while ($row = mysql_fetch_assoc($result)) {
    $draw = $xml->addChild('draw');
    $draw->addChild('ident', htmlentities(iconv("UTF-8", "ISO-8859-1//IGNORE", $row['ident'])));
    $draw->addChild('name', htmlentities(iconv("UTF-8", "ISO-8859-1//IGNORE", $row['name'])));
}
mysql_close($link);

$fp = fopen("links2.xml", "wb");
fwrite($fp, $xml->asXML());
fclose($fp);
Here is the error I'm getting:
XML Parsing Error: no element found
Location: /sql2xml2.php
Line Number 1, Column 2:
-^
What am I doing wrong?

Your XML is considered invalid by your XML reader because the thrown PHP warning ends up in the output, hence the XML parsing error.
As for the warning itself, you need to escape special entities (namely &, < and >) in your content when adding it like that. Using str_replace usually works well for just those three when it comes to XML; htmlentities may yield undesired effects, unless you supply PHP 5.4's ENT_XML1 mode.
Refer to a related answer for more information on why this happens.
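For illustration, the escaping described above might look like this (the helper name xml_escape is ours, not from the question):

// Escape the three XML-special characters; & must be replaced first
// so the other replacements aren't double-escaped.
function xml_escape($s) {
    return str_replace(array('&', '<', '>'), array('&amp;', '&lt;', '&gt;'), $s);
}

echo xml_escape('Fish & <Chips>');   // Fish &amp; &lt;Chips&gt;

// On PHP 5.4+, htmlspecialchars($s, ENT_XML1 | ENT_QUOTES, 'UTF-8')
// achieves the same in XML mode.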

If you just want to export a MySQL database to a local XML file, you can use the mysqldump tool:
mysqldump --xml -u username -p databasename [tablename] > filename.xml
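The output is wrapped in mysqldump's own elements, roughly this shape (row values invented for illustration):

<?xml version="1.0"?>
<mysqldump xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
  <database name="databasename">
    <table_data name="airports">
      <row>
        <field name="ident">KJFK</field>
        <field name="name">John F Kennedy International Airport</field>
      </row>
    </table_data>
  </database>
</mysqldump>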

Got it to work with this code:
<?php
header("Content-type: text/xml");

function getXML($query = "SELECT * FROM airports LIMIT 50")
{
    include 'dbc.php';
    $result = mysql_query($query, $link)
        or die('Error querying database.');

    $columns = "";
    echo "<xml>\n";
    while ($row = mysql_fetch_assoc($result)) {
        $columns .= "\t<airport>\n";
        foreach ($row as $key => $value) {
            // Transliterate/drop characters outside ISO-8859-1, then escape once
            // (running htmlentities twice would double-escape the entities)
            $value = htmlentities(iconv("UTF-8", "ISO-8859-1//TRANSLIT//IGNORE", $value));
            $columns .= "\t\t<$key>$value</$key>\n";
        }
        $columns .= "\t</airport>\n";
    }
    echo $columns;
    echo "</xml>\n";
}

getXML();
?>
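For reference, the mysql_* API used above is long deprecated and was removed in PHP 7. A rough equivalent with mysqli and SimpleXMLElement might look like this (connection details are assumptions; values are pre-escaped because addChild() does not escape ampersands):

<?php
header("Content-type: text/xml");
// Assumed connection; dbc.php in the question presumably does similar
$link = mysqli_connect('localhost', 'user', 'pass', 'db');

$result = mysqli_query($link, "SELECT ident, name FROM airports LIMIT 50")
    or die('Error querying database.');

$xml = new SimpleXMLElement('<xml/>');
while ($row = mysqli_fetch_assoc($result)) {
    $draw = $xml->addChild('draw');
    // ENT_XML1 (PHP 5.4+) escapes &, < and > for XML
    $draw->addChild('ident', htmlspecialchars($row['ident'], ENT_XML1));
    $draw->addChild('name',  htmlspecialchars($row['name'],  ENT_XML1));
}
mysqli_close($link);

echo $xml->asXML();  // send the XML to the browser instead of only writing a file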

Related

How to import a JSON file from an HTTPS web page into a MariaDB or MySQL database?

I am trying to import a file into a MariaDB (MySQL) database. A proof-of-concept file is available in .json format on the web at this location: https://earthquake.usgs.gov/earthquakes/feed/v1.0/summary/all_week.geojson
I know how to do this on DB2 for i:
select * from JSON_TABLE(
    SYSTOOLS.HTTPGETCLOB('https://earthquake.usgs.gov' ||
        '/earthquakes/feed/v1.0/summary/all_week.geojson', null),
    '$.features[*]'
    COLUMNS( MILLISEC BIGINT       PATH '$.properties.time',
             MAG      DOUBLE       PATH '$.properties.mag',
             PLACE    VARCHAR(100) PATH '$.properties.place'
    )) AS X;
This reads the feed from the web and lists 3 fields. Surrounding it with an INSERT clause will put it into a database file for me.
I would like to do exactly the same thing on my home server using MariaDB. Ultimately a script will run unattended on a hosted server.
A .json segment of the earthquake data looks like this:
… "features":[
{"type":"Feature",
"properties":{
"mag":1.1,
"place":"58 km WNW of Anchor Point, Alaska",
"time":1640472257402,
"updated":1640472615410,
"tz":null,
"url":"https://earthquake.usgs.gov/earthquakes/eventpage/ak021gi3al5x",
"detail":"https://earthquake.usgs.gov/earthquakes/feed/v1.0/detail/ak021gi3al5x.geojson",
"felt":null,
"cdi":null,
"mmi":null,
"alert":null,
"status":"automatic",
"tsunami":0,
"sig":19,
"net":"ak",
"code":"021gi3al5x",
"ids":",ak021gi3al5x,",
"sources":",ak,",
"types":",origin,",
"nst":null,
"dmin":null,
"rms":0.79,
"gap":null,
"magType":"ml",
"type":"earthquake",
"title":"M 1.1 - 58 km WNW of Anchor Point, Alaska"},
"geometry":{
"type":"Point",
"coordinates":[-152.8406,59.9119,89.7]
},
"id":"ak021gi3al5x"
}, ...
Just a thought, but you could maybe just write a Bash script, e.g.
#!/bin/sh
cd "$(dirname "$0")"
export PATH=/bin:/usr/bin:/usr/local/bin
TODAY=`date +"%d%b%Y-%H%M"`
MYSQL_HOST='127.0.0.1'
MYSQL_PORT='3306'
MYSQL_USER='root'
MYSQL_PASSWORD='root'
url=https://earthquake.usgs.gov/earthquakes/feed/v1.0/summary/all_week.geojson
DATA=$(curl ${url} 2>/dev/null)
printf '%s' "$DATA" | awk '{print $0}'
exit
...
...
...
and then use a tool like https://webinstall.dev/jq/, Python and whatever other tools you have on your system to extract the data that you want and then update the DB.
I am more familiar with PHP, which would also work. The following could be tidied up a bit, but it does seem to work.
earthquake.php
<?php
$database = false;
try {
    $options = array(
        PDO::ATTR_DEFAULT_FETCH_MODE => PDO::FETCH_OBJ,
        PDO::ATTR_ERRMODE => PDO::ERRMODE_WARNING,
        PDO::ATTR_EMULATE_PREPARES => true
    );
    $conn = new PDO('mysql:host=127.0.0.1;dbname=test;port=3306;charset=utf8', 'root', 'root', $options);
} catch (PDOException $e) {
    // Echo a custom message. The error code gives you some info.
    echo '[{"error":"Database connection can not be established. Please try again later. Error code: ' . $e->getCode() . '"}]';
    exit;
}

$url = "https://earthquake.usgs.gov/earthquakes/feed/v1.0/summary/all_week.geojson";
$ch = curl_init();
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_URL, $url);
$result = curl_exec($ch);

$features = json_decode($result)->features;
foreach ($features as $feature) {
    echo 'Mag: ' . $feature->properties->mag
        . ', Place: ' . $feature->properties->place
        . ', ' . gmdate("Y-m-d H:i:s", $feature->properties->time / 1000) . PHP_EOL;
    $query = 'INSERT INTO features (mag, place, time) VALUES (?, ?, ?)';
    $params = [$feature->properties->mag, $feature->properties->place, gmdate("Y-m-d H:i:s", $feature->properties->time / 1000)];
    $stmt = $conn->prepare($query)
        or die('[{"error":"Prepare Statement Failure","query":"' . $query . '"}]');
    $stmt->execute($params)
        or die('[{"error":"' . $stmt->errorInfo()[2] . '","query":"' . $query . '","params":' . json_encode($params) . '}]');
}
?>
Create a local table called features with mag, place and time columns (a setup sketch follows). If you have PHP on your system, just run the script from the CLI: php earthquake.php
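A minimal one-off setup sketch for that table, using the same PDO connection details as the script above (the column types are assumptions):

<?php
// Hypothetical one-off setup for the features table used above;
// the column types are assumptions, adjust as needed.
$conn = new PDO('mysql:host=127.0.0.1;dbname=test;port=3306;charset=utf8', 'root', 'root');
$conn->exec('CREATE TABLE IF NOT EXISTS features (
    id    INT AUTO_INCREMENT PRIMARY KEY,
    mag   DOUBLE,
    place VARCHAR(100),
    time  DATETIME
)');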
e.g. Insert into mysql from Bash script
I use Laravel a bit, and would probably actually build my own model and use Eloquent and a little UI to handle that, but using a script seems like an option.

Fetching data from database in php file

I am trying to fetch data from a table. The table contains data and the query runs, yet the following code says $u and $t are not defined: the while condition is never true.
I checked the query manually in the database, and it showed results.
$url = "http://paulgraham.com/";
$user_id = "123";
$con = mysqli_connect('127.0.0.1', 'root', '', 'mysql');
if (mysqli_connect_errno())
{
echo "Failed to connect to MySQL: " . mysqli_connect_error();
return;
}
$result = mysqli_query($con,"SELECT * FROM post_data WHERE userid =".$url." and url=".$user_id."");
while ($row = #mysqli_fetch_array($result))
{
echo "hi";
$t = $row['title'];
$u = $row['url'];
}
echo "title is : $t";
echo "url is : $u";
Given your SQL query:
"SELECT * FROM post_data WHERE userid =".$url." and url=".$user_id.""
You can see you are mixing up url and userid... Change it to:
"SELECT * FROM post_data WHERE userid =".$user_id." and url=".$url.""
Also, define your $t and $u variables before your loop in case you have no records.
Next time, try to var_dump your generated query to test it.
If you were able to see the errors the DBMS reports back to PHP, you'd probably be able to work out what's wrong with the code. Before the while loop, try...
print mysqli_error($con);
(The obvious reason it's failing is that strings must be quoted in SQL, and you've got the parameters the wrong way around; a parameterized version is sketched below.)
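A minimal prepared-statement sketch, assuming the same $con, $url and $user_id as the question (mysqli_stmt_get_result needs the mysqlnd driver). Placeholders quote the values for you, and the swapped parameters are fixed:

$stmt = mysqli_prepare($con,
    'SELECT title, url FROM post_data WHERE userid = ? AND url = ?');
mysqli_stmt_bind_param($stmt, 'ss', $user_id, $url);
mysqli_stmt_execute($stmt);
$result = mysqli_stmt_get_result($stmt);

$t = $u = null;  // defined even when no rows match
while ($row = mysqli_fetch_assoc($result)) {
    $t = $row['title'];
    $u = $row['url'];
}
echo "title is : $t";
echo "url is : $u";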

SQL to JSON/XML [duplicate]

I am trying to export my MySQL tables from my database to a JSON file, so I can list them in an array.
I can create files with this code no problem:
$sql=mysql_query("select * from food_breakfast");
while($row=mysql_fetch_assoc($sql))
{
$ID=$row['ID'];
$Consumption=$row['Consumption'];
$Subline=$row['Subline'];
$Price=$row['Price'];
$visible=$row['visible'];
$posts[] = array('ID'=> $ID, 'Consumption'=> $Consumption, 'Subline'=> $Subline, 'Price'=> $Price, 'visible'=> $visible);
}
$response['posts'] = $posts;
$fp = fopen('results.json', 'w');
fwrite($fp, json_encode($response));
fclose($fp);
Now this reads a table and draws its info from the fields inside it.
I would like to know if it is possible to make a JSON file with the names of the tables, so one level higher in the hierarchy.
I have part of the code:
$showtablequery = "
SHOW TABLES
FROM
[database]
LIKE
'%food_%'
";
$sql=mysql_query($showtablequery);
while($row=mysql_fetch_array($sql))
{
$tablename = $row[0];
$posts[] = array('tablename'=> $tablename);
}
$response['posts'] = $posts;
But now I am stuck on the line that says $ID = $row['ID'];. That relates to the info inside the table, and I do not know what to put here.
Also, as you can see, I need to filter the tables to list only those starting with food_ and drinks_.
Any help is greatly appreciated:-)
There is no 'table id' in MySQL, and therefore the result set from SHOW TABLES has no index named id. The only index in the result set is named 'Tables_in_DATABASENAME'.
Also, you should use the mysqli library, as the good old mysql library is deprecated. Here is a prepared example:
<?php
$mysqli = new mysqli(
    'yourserver',
    'yourusername',
    'yourpassword',
    'yourdatabasename'
);
if ($mysqli->connect_errno) {
    echo "Failed to connect to MySQL: (" . $mysqli->connect_errno . ") "
        . $mysqli->connect_error;
}

$result = $mysqli->query('SHOW TABLES FROM `yourdatabasename` LIKE \'%food_%\'');
if (!$result) {
    die('Database error: ' . $mysqli->error);
}

$posts = array();
// use fetch_array instead of fetch_assoc, as the column name
// ('Tables_in_DATABASENAME') varies with the database name
while ($row = $result->fetch_array()) {
    $tablename = $row[0];
    $posts[] = array(
        'tablename' => $tablename
    );
}
var_dump($posts);
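To answer the "what do I put there" part, here is a rough sketch combining the two steps: it dumps every matching table's rows under the table's name into one JSON file (same $mysqli connection as above, with the drinks_ filter added):

$response = array();
foreach (array('%food_%', '%drinks_%') as $pattern) {
    $tables = $mysqli->query("SHOW TABLES FROM `yourdatabasename` LIKE '$pattern'");
    while ($t = $tables->fetch_array()) {
        $tablename = $t[0];
        // Dump all rows of this table under its name
        $rows = $mysqli->query("SELECT * FROM `$tablename`");
        while ($row = $rows->fetch_assoc()) {
            $response[$tablename][] = $row;
        }
    }
}
file_put_contents('results.json', json_encode($response));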

Perl Import large .csv to MySQL, don't repeat data

I am trying to import several .csv files into a MySQL database. The script below works, except that it only imports the first row of my CSV data into the database: both my tables end up populated with exactly one entry.
Any help would be appreciated.
Thank you
#!/usr/bin/perl
use DBI;
use DBD::mysql;
use strict;
use warnings;

# MySQL CONFIG VARIABLES
my $host = "localhost";
my $user = "someuser";
my $pw = "somepassword";
my $database = "test";
my $dsn = "DBI:mysql:database=" . $database . ";host=" . $host;

my $dbh = DBI->connect($dsn, $user, $pw)
    or die "Can't connect to the DB: $DBI::errstr\n";
print "Connected to DB!\n";

# enter the file name that you want import
my $filename = "/home/jonathan/dep/csv/linux_datetime_test_4.26.13_.csv";
open FILE, "<", $filename or die $!;
$_ = <FILE>;
$_ = <FILE>;

while (<FILE>) {
    my @f = split(/,/, $_);
    if (length($f[4]) < 10) {
        print "No Weight\n";
    }
    else {
        # insert the data into the db
        print "insert into datetime_stamp\n";
    }
    my $sql = "INSERT INTO datetime_stamp (subject, date, time, weight)
               VALUES('$f[1]', '$f[2]', '$f[3]', '$f[4]')";
    print "$sql\n";
    my $query = $dbh->do($sql);

    my $sql = "INSERT INTO subj_weight (subject, weight) VALUES('$f[1]', '$f[2]')";
    my $query = $dbh->do($sql);
    close(FILE);
}
As has been commented, you close the input file after reading the first data entry, and so only populate your database with a single record.
However, there are a few problems with your code you may want to consider:
- You should set autoflush on the STDOUT file handle if you are printing diagnostics as the program runs. Otherwise Perl won't print the output until it either has a buffer full of text or the file handle is closed when the program exits, which means you may not see the messages you have coded until long after the event.
- You should use Text::CSV to parse CSV data instead of relying on split.
- You can interpolate variables into a double-quoted string. That avoids the use of several concatenation operators and makes the intention clearer.
- Your open is near-perfect - an unusual thing - because you correctly use the three-parameter form of open as well as testing whether it succeeded and putting $! in the die string. However, you should always use a lexical file handle instead of the old-fashioned global ones.
- You don't chomp the lines you read from the input, so the last field will have a trailing newline. Using Text::CSV avoids the need for this.
- You use indices 1 through 4 of the data split from the input record. Perl indices start at zero, so that means you are dropping the first field. Is that correct?
- Similarly, you are inserting fields 1 and 2, which appear to be subject and date, into columns called subject and weight. It seems unlikely that this can be right.
- You should prepare your SQL statements, use placeholders, and provide the actual data in an execute call.
- You seem to diagnose the data read from the file ("No Weight") but insert it into the database anyway. This may be correct, but it seems unlikely.
Here is a version of your program that includes these amendments. I hope it is of use to you.
#!/usr/bin/perl
use strict;
use warnings;

use DBI;
use Text::CSV;
use IO::Handle;

STDOUT->autoflush;

# MySQL config variables
my $host = "localhost";
my $user = "someuser";
my $pw = "somepassword";
my $database = "test";

my $dsn = "DBI:mysql:database=$database;host=$host";
my $dbh = DBI->connect($dsn, $user, $pw)
    or die "Can't connect to the DB: $DBI::errstr\n";
print "Connected to DB!\n";

my $filename = "/home/jonathan/dep/csv/linux_datetime_test_4.26.13_.csv";
open my $fh, '<', $filename
    or die qq{Unable to open "$filename" for input: $!};

my $csv = Text::CSV->new;
$csv->getline($fh) for 1, 2;    # Drop header lines

my $insert_datetime_stamp = $dbh->prepare('INSERT INTO datetime_stamp (subject, date, time, weight) VALUES (?, ?, ?, ?)');
my $insert_subj_weight    = $dbh->prepare('INSERT INTO subj_weight (subject, weight) VALUES (?, ?)');

while (my $row = $csv->getline($fh)) {
    if (length($row->[4]) < 10) {
        print qq{Invalid weight: "$row->[4]"\n};
    }
    else {
        # insert the data into the db
        print "insert into datetime_stamp\n";
        $insert_datetime_stamp->execute(@$row[1..4]);
        $insert_subj_weight->execute(@$row[1,4]);
    }
}

MySQL concatenation and Illegal mix of collations error

I keep getting an error using MySQL 5.5.27 when trying to concatenate some values. I've searched and seen a bunch of charset answers (which admittedly is a TAD over my head), but I've converted all my tables to Charset utf8-unicode-ci and still get the error.
Surely there is a way to concatenate these values, but I just don't know how. I'm an Oracle guy that is relatively new to MySQL.
Here is the SQL line:
concat(pl.last_name,'-',format(money,0))
I get:
#1270 - Illegal mix of collations (latin1_swedish_ci,IMPLICIT), (utf8_unicode_ci,COERCIBLE), (utf8_unicode_ci,COERCIBLE) for operation 'concat'
Any ideas?
If money is indeed a number stored inside a VARCHAR, you could use CAST.
Try this:
concat(pl.last_name, '-', cast(money AS unsigned)) -- casts to an integer, dropping any decimals
concat(pl.last_name, '-', substring_index(money, ',', 1)) -- keeps the part before the ',' decimal separator; if you use '.', i.e. the American notation, substitute '.' for ','
Edit
You should first try: concat(pl.last_name, '-', format(money, 0));
Here is some very basic PHP code you could use:
<?php
function selecting_data() {
    $host = "host";
    $user = "username";
    $password = "password";
    $database = "database";
    $charset = "utf8";

    $link = mysqli_connect($host, $user, $password, $database);

    if (!$link) {
        echo('Unable to connect to the database!');
    } else {
        mysqli_set_charset($link, $charset); // note: the link comes first, then the charset
        $query = "SELECT lastname, format(money,0) AS money FROM mytable"; // Select query; aliased so $rows['money'] below works
        $result = mysqli_query($link, $query);
        while ($rows = mysqli_fetch_array($result, MYSQLI_BOTH)) {
            echo $rows['lastname'] . "<br>" . $rows['money'];
        }
        mysqli_close($link);
    }
}
?>
<html>
<head><title>title</title></head>
<body>
<?php selecting_data(); ?>
</body>
</html>