How to import a JSON file from an HTTPS web page into a MariaDB or MySQL database?

I am trying to import a file into a MariaDB (MySQL) database. A proof-of-concept file is available in .json format on the web at this location: https://earthquake.usgs.gov/earthquakes/feed/v1.0/summary/all_week.geojson
I know how to do this on Db2 for i:
select * from JSON_TABLE(
    SYSTOOLS.HTTPGETCLOB('https://earthquake.usgs.gov' ||
        '/earthquakes/feed/v1.0/summary/all_week.geojson', null),
    '$.features[*]'
    COLUMNS( MILLISEC BIGINT PATH '$.properties.time',
             MAG DOUBLE PATH '$.properties.mag',
             PLACE VARCHAR(100) PATH '$.properties.place'
    )) AS X;
This reads the feed from the web and lists three fields. Surrounding it with an INSERT clause will put it into a database file (table) for me.
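For example, the wrapped version might look like this (just a sketch; the quakes target table and its column names are placeholders added here, not something defined elsewhere in this post):
INSERT INTO quakes (millisec, mag, place)
select * from JSON_TABLE(
    SYSTOOLS.HTTPGETCLOB('https://earthquake.usgs.gov' ||
        '/earthquakes/feed/v1.0/summary/all_week.geojson', null),
    '$.features[*]'
    COLUMNS( MILLISEC BIGINT PATH '$.properties.time',
             MAG DOUBLE PATH '$.properties.mag',
             PLACE VARCHAR(100) PATH '$.properties.place'
    )) AS X;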
I would like to do exactly the same thing on my home server using MariaDB. Ultimately a script will run unattended on a hosted server.
A .json segment of the earthquake data looks like this:
… "features":[
{"type":"Feature",
"properties":{
"mag":1.1,
"place":"58 km WNW of Anchor Point, Alaska",
"time":1640472257402,
"updated":1640472615410,
"tz":null,
"url":"https://earthquake.usgs.gov/earthquakes/eventpage/ak021gi3al5x",
"detail":"https://earthquake.usgs.gov/earthquakes/feed/v1.0/detail/ak021gi3al5x.geojson",
"felt":null,
"cdi":null,
"mmi":null,
"alert":null,
"status":"automatic",
"tsunami":0,
"sig":19,
"net":"ak",
"code":"021gi3al5x",
"ids":",ak021gi3al5x,",
"sources":",ak,",
"types":",origin,",
"nst":null,
"dmin":null,
"rms":0.79,
"gap":null,
"magType":"ml",
"type":"earthquake",
"title":"M 1.1 - 58 km WNW of Anchor Point, Alaska"},
"geometry":{
"type":"Point",
"coordinates":[-152.8406,59.9119,89.7]
},
"id":"ak021gi3al5x"
}, ...

Just a thought, but you could maybe just write a Bash script, e.g.
#!/bin/bash
cd "$(dirname "$0")"
export PATH=/bin:/usr/bin:/usr/local/bin
# timestamp and connection settings, ready for the later "update the DB" step
TODAY=$(date +"%d%b%Y-%H%M")
MYSQL_HOST='127.0.0.1'
MYSQL_PORT='3306'
MYSQL_USER='root'
MYSQL_PASSWORD='root'
url='https://earthquake.usgs.gov/earthquakes/feed/v1.0/summary/all_week.geojson'
# fetch the feed and, for now, just dump it to stdout
DATA=$(curl "${url}" 2>/dev/null)
printf '%s' "$DATA" | awk '{print $0}'
exit
...
...
...
and then use a tool like jq (https://webinstall.dev/jq/), Python, or whatever other tools you have on your system to extract the data that you want and then update the DB.
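If the server is new enough, the extract-and-insert step can also be pushed into SQL itself: MariaDB 10.6+ and MySQL 8.0+ ship a JSON_TABLE function, although neither can fetch an HTTPS URL directly the way SYSTOOLS.HTTPGETCLOB does, so curl still has to download the file first. A rough sketch under those assumptions (the quake table, the file path, and the LOAD_FILE approach are placeholders of mine, not tested against the poster's setup; LOAD_FILE also needs the FILE privilege and a path allowed by secure_file_priv):
-- placeholder target table for the three extracted fields
CREATE TABLE IF NOT EXISTS quake (
    millisec BIGINT,
    mag      DOUBLE,
    place    VARCHAR(100)
);

-- read the GeoJSON file that curl saved, forcing a text character set
SET @doc = CONVERT(LOAD_FILE('/var/lib/mysql-files/all_week.geojson') USING utf8mb4);

-- shred the document and insert the fields, mirroring the Db2 JSON_TABLE query
INSERT INTO quake (millisec, mag, place)
SELECT j.millisec, j.mag, j.place
FROM JSON_TABLE(@doc, '$.features[*]'
         COLUMNS( millisec BIGINT PATH '$.properties.time',
                  mag DOUBLE PATH '$.properties.mag',
                  place VARCHAR(100) PATH '$.properties.place'
         )) AS j;
That would keep the unattended job down to a curl download plus a single mysql invocation of the statements above.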
I am more familiar with PHP, which would also work. The script below could be tidied up a bit, but it does seem to work.
earthquake.php
<?php
$database = false;
try {
    $options = array(
        PDO::ATTR_DEFAULT_FETCH_MODE => PDO::FETCH_OBJ,
        PDO::ATTR_ERRMODE => PDO::ERRMODE_WARNING,
        PDO::ATTR_EMULATE_PREPARES => true
    );
    $conn = new PDO('mysql:host=127.0.0.1;dbname=test;port=3306;charset=utf8', 'root', 'root', $options);
} catch (PDOException $e) {
    // Echo a custom message. The error code gives you some info.
    echo '[{"error":"Database connection can not be established. Please try again later. Error code: ' . $e->getCode() . '"}]';
    exit;
}
$url = "https://earthquake.usgs.gov/earthquakes/feed/v1.0/summary/all_week.geojson";
$ch = curl_init();
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_URL, $url);
$result = curl_exec($ch);
$features = json_decode($result)->features;
foreach ($features as $feature) {
    echo 'Mag: ' . $feature->properties->mag . ', Place: ' . $feature->properties->place . ', ' . gmdate("Y-m-d H:i:s", $feature->properties->time / 1000) . PHP_EOL;
    $query = 'INSERT INTO features (mag, place, time) VALUES (?, ?, ?)';
    $params = [$feature->properties->mag, $feature->properties->place, gmdate("Y-m-d H:i:s", $feature->properties->time / 1000)];
    $stmt = $conn->prepare($query) or die('["status":{"error":"Prepare Statement Failure","query":"' . $query . '"}]');
    $stmt->execute($params) or die('[{"error":"' . $stmt->errorInfo()[2] . '","query":"' . $query . '","params":' . json_encode($params) . '}]');
}
?>
Create a local table called features with mag, place and time columns (the script above connects to a database named test). If you have PHP on your system, just run it from the CLI: php earthquake.php
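The column types are not spelled out anywhere above, so here is one possible definition (a sketch to match what the script inserts; the id column is just an assumption):
CREATE TABLE features (
    id    INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
    mag   DOUBLE,
    place VARCHAR(100),
    time  DATETIME
);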
e.g. see "Insert into mysql from Bash script".
I use Laravel a bit, and would probably build my own model, use Eloquent, and add a little UI to handle this, but using a script seems like a workable option.

Related

Perl - passing user-inputted CGI form data to a Perl program to then be put into a MySQL database

I've tried searching forums for a solution; however, most of the answers were too difficult to understand. So: I'm in the process of making a website for a small community, and currently we have our database and HTML design layout, but I'm getting snagged on how to push my Perl CGI form data into another Perl program to then alter my database.
Here is the Perl controller that alters the database tables (and it works):
#!/usr/bin/perl -w
#!/usr/bin/perl -wT
# DBI is the standard database interface for Perl
# DBD is the Perl module that we use to connect to the MySQL database
use DBI;
use DBD::mysql;
use warnings;
#----------------------------------------------------------------------
# open the accessDB file to retrieve the database name, host name, user name and password
open(ACCESS_INFO, "accessDB.txt") || die "Can't access login credentials";
# assign the values in the accessDB file to the variables
my $database = <ACCESS_INFO>;
my $host = <ACCESS_INFO>;
my $userid = <ACCESS_INFO>;
my $passwd = <ACCESS_INFO>;
my $tablename = "Article";
# the chomp() function will remove any newline character from the end of a string
chomp ($database, $host, $userid, $passwd);
# close the accessDB file
close(ACCESS_INFO);
#----------------------------------------------------------------------
# invoke the ConnectToMySQL sub-routine to make the database connection
$connection = ConnectToMySql($database);
if ($tablename == "Article"){
$connection = InsertArticle($database);
}
elsif ($tablename == "Category"){
$connection = InsertCategory($database);
}
elsif ($tablename == "Comment"){
$connection = InsertComment($database);
}
elsif ($tablename == "User"){
$connection = InsertUser($database);
}
else {
print "No such table found. Contact website administrator.\n"
}
sub InsertArticle{
$query = "insert into $tablename (Id, CategoryId, UserId, Title, Content) values(?, ?, ?, ?, ?)";
$statement = $connection->prepare($query);
$statement->execute('undef', '1', '1029', 'Dota2>League', 'textfromarticle');
}
sub InsertCategory{
$query = "insert into $tablename (Id, CategoryId, UserId, Title, Content) values(?, ?, ?, ?, ?)";
$statement = $connection->prepare($query);
$statement->execute('undef', '1', '1029', 'Dota2>League', 'textfromarticle');
}
sub InsertComment{
$query = "insert into $tablename (Id, CategoryId, UserId, Title, Content) values(?, ?, ?, ?, ?)";
$statement = $connection->prepare($query);
$statement->execute('undef', '1', '1029', 'Dota2>League', 'textfromarticle');
}
sub InsertUser{
$query = "insert into $tablename (Id, CategoryId, UserId, Title, Content) values(?, ?, ?, ?, ?)";
$statement = $connection->prepare($query);
$statement->execute('undef', '1', '1029', 'Dota2>League', 'textfromarticle');
}
exit;
#--- start sub-routine ------------------------------------------------
sub ConnectToMySql {
#----------------------------------------------------------------------
my ($db) = @_;
# assign the values to your connection variable
my $connectionInfo="dbi:mysql:$db;$host";
# make connection to database
my $l_connection = DBI->connect($connectionInfo,$userid,$passwd);
# the value of this connection is returned by the sub-routine
return $l_connection;
}
#--- end sub-routine --------------------------------------------------
In the future, I'll define the other tables in my database through global variables that depend on what button the user presses on the relevant webpage. As in, if they're viewing a list of articles, an option at the top would be "submit an article", and from there the CGI form would be sent to them so they can fill it out.
And here is the CGI that makes the form that would be submitted to the above controller script to alter the table:
#!/usr/bin/perl
#!/usr/bin/perl -wT
use strict;
use warnings;
use CGI;
use CGI::Carp qw(fatalsToBrowser); #remove this in production
my $q = new CGI;
print $q->header; #Content-Type: text/html; charset=ISO-8859-1
print $q->start_html(
-title => 'submit an Article', #page name
-style => {'src' => '/dmrwebsite/dmrwebsite/userinterface'}, #link to style sheet
);
print $q->start_form(
-name => 'submitting an Article',
-method => 'POST',
-enctype => &CGI::URL_ENCODED,
-onsubmit => 'return true',
-action => '/dmrwebsite/dmrwebsite/controller.addtotable.pl',
);
print $q->textfield(
-name => 'title',
-value => 'default value',
-required,
-size => 20,
-maxlength =>50,
);
print $q->textarea(
-name => 'content',
-value => 'default value',
-required,
-maxlength => 1000,
-cols => 60,
);
print $q->textarea(
-name => 'url',
-value => 'default value',
-maxlength => 100,
-cols => 60,
);
print $q->checkbox(
-name => 'humancheck',
-checked => 1,
-value => 'two',
-label => 'The number two',
);
print $q->submit(
-name => 'submit_Article',
-value => 'submit Article',
-onsubmit => 'javascript: validate_form()',
);
if ($q->param()) {
submit_form($q);
} else {
print "Please check your form for inaccuracies.";
}
sub submit_form($){
my ($q) = @_;
}
print $q->end_form; #ends the form html
print $q->end_html; #end the html document
So basically what I'm stuck on is understanding how to send the form data to the Perl script, in which I can then define the table data in my $tablename = "Article"; and $statement->execute('undef', '1', '1029', 'Dota2>League', 'textfromarticle');.
Also, I don't have a JavaScript program to pass to the parameter -onsubmit => javaapplication(),. Is that needed? Can I substitute my own Perl program to check the user-inputted fields? And how would I call this function? In the same file, or can it just be in the parent directory, like /website/perlchecker.pl?
Any help would be greatly appreciated as I'm only a couple days into using Perl let alone CGI and html. Got a couple people helping me on the front end of the website though.
Thanks,
-Ori
So many suggestions...
Firstly, your DB insertion program seems to just insert fixed data, so I'm not sure how you think that it works. Also, the if ($tablename == "Article") (and similar) line doesn't do what you want it to. You need to use eq instead of ==.
To answer the question that you asked - you need to change your database program so that it accepts input (probably command line arguments) containing the data that you want inserted into the database. You would then add to your CGI program a line that calls this program (probably using system()) passing it the data from the CGI parameters on the command line.
The code would look something like this:
my $title = $q->param('title');
my $content = $q->param('content');
# ... other params ...
system('db_script.pl', 'Article', $title, $content, ...);
But please don't do that. That's a terrible idea.
Instead, I highly recommend that you re-write your database manipulation program as a module. That way, you can load the module into any program that needs to talk to the database and access the database by calling functions rather than by calling an external program. If it was down to me, then I'd definitely use DBIx::Class to produce this library - but I realise that might well be seen as rather advanced.
Then there's the elephant in the room. You're still using CGI to write your web interface. The CGI module has been removed from the latest version of Perl as it is no longer considered best practice for writing web applications. I recommend looking at CGI::Alternatives to find out about other, more modern, tools.
But if you're determined to carry on writing your program as a CGI program, then at the very least, please don't use the HTML generation functions. We've known that including your HTML in your program source code is a terrible idea for at least fifteen years. There's no reason to still be doing it in 2015. You should really be using some kind of templating engine to separate your HTML from your Perl code. I recommend the Template Toolkit.
I'm not sure where you are learning these techniques from, but your source seems to be a good ten years behind accepted best practice.

MySQL 4.0 Gibberish Cannot Convert to Newer MySQL

One of the websites I host still runs on PHP 5.2 with MySQL 4.0. Its text is in Hebrew (it displays just fine on the site), but in the DB the text appears as gibberish containing question marks, though not exclusively. It looks something like: ?????£ ?§???¥ ???£ ?×???
I am trying to move this DB to MySQL 5.x with the same website, with no luck so far. I have tried using the MYSQL 40 compatibility mode, as well as other compatibility modes. I have made sure that the destination DB has the hebrew_bin collation as the old one, and I've played around with SET NAMES. The problem is, this is a type of gibberish I am not yet familiar with and therefore have no idea how to convert it to readable text.
I didn't get another answer here, and therefore had no choice but to write a simple script that manually does the migration process through PHP and not through MySQL dumps.
Here is the script, with some obvious modifications. Please note that the script is "dirty code": it might not be secure and it is not the most efficient, but it should get the job done if you're using it internally. Do not use it in an environment where the script is accessible to the public without further modifications.
<?php
$local = mysqli_connect([source server], [source username], [source password], [source DB name]) or die('No local connection');
mysqli_query($local, "SET NAMES 'hebrew'");
$remote = mysqli_connect([destination server], [destination username], [destination password], [destination DB name]) or die('No remote connection');
mysqli_query($remote, "SET NAMES 'utf8'");
$table_list = array('table1', 'table2', 'table3'); // you can get a list of tables automatically too if the list is very long
foreach ($table_list as $table)
{
    $query_local = "SELECT * FROM `{$table}`";
    $result_local = mysqli_query($local, $query_local) or die('Error in q1: ' . mysqli_error($local));
    $delete_remote = mysqli_query($remote, "DELETE FROM `{$table}` WHERE 1") or die('Error deleting table: ' . $table); // necessary only if you plan to run it more than once
    $i = 0;
    while ($row_local = mysqli_fetch_assoc($result_local))
    {
        foreach ($row_local as $key => $value)
            $row_local[$key] = mysqli_real_escape_string($remote, $value);
        $query_remote = "INSERT INTO `{$table}` (`" . implode('`, `', array_keys($row_local)) . "`) VALUES ('" . implode("', '", $row_local) . "')";
        $result_remote = mysqli_query($remote, $query_remote)
            or die('Error in remote q' . $i . ' in table ' . $table . ':<br /> ' . mysqli_error($remote) . '<br /> Query: ' . $query_remote);
        echo 'Successfully transferred row ' . $i . ' in table ' . $table;
        echo '<br />' . PHP_EOL;
        $i++;
    }
}
?>

Perl param() receiving from its own print HTML

I have a Perl script that reads in data from a database and prints out the result in HTML forms/tables. The form of each book also contains a submit button.
I want Perl to create a text file (or read into one already created) and print the title of the book that was inside the form submitted. But I can't seem to get param() to catch the submit action!
#!/usr/bin/perl -w
use warnings; # Allow for warnings to be sent if error's occur
use CGI; # Include CGI.pm module
use DBI;
use DBD::mysql; # Database data will come from mysql
my $dbh = DBI->connect('DBI:mysql:name?book_store', 'name', 'password')
or die("Could not make connection to database: $DBI::errstr"); # connect to the database with address and pass or return error
my $q = new CGI; # CGI object for basic stuff
my $ip = $q->remote_host(); # Get the user's ip
my $term = $q->param('searchterm'); # Set the search char to $term
$term =~ tr/A-Z/a-z/; # set all characters to lowercase for convenience of search
my $sql = '
SELECT *
FROM Books
WHERE Title LIKE ?
OR Description LIKE ?
OR Author LIKE ?
'; # Set the query string to search the database
my $sth = $dbh->prepare($sql); # Prepare to connect to the database
$sth->execute("%$term%", "%$term%", "%$term%")
or die "SQL Error: $DBI::errstr\n"; # Connect to the database or return an error
print $q->header;
print "<html>";
print "<body>";
print " <form name='book' action='bookcart.php' method=post> "; # Open a form for submitting the result of book selection
print "<table width=\"100%\" border=\"0\"> ";
my $title = $data[0];
my $desc = $data[1];
my $author = $data[2];
my $pub = $data[3];
my $isbn = $data[4];
my $photo = $data[5];
print "<tr> <td width=50%>Title: $title</td> <td width=50% rowspan=5><img src=$photo height=300px></td></tr><tr><td>Discreption Tags: $desc</td></tr><tr><td>Publication Date: $pub</td></tr><tr><td>Author: $author</td></tr><tr><td>ISBN: $isbn</td> </tr></table> <br>";
print "Add this to shopping cart:<input type='submit' name='submit' value='Add'>";
if ($q->param('submit')) {
open(FILE, ">>'$ip'.txt");
print FILE "$title\n";
close(FILE);
}
print "</form>"; # Close the form for submitting to shopping cart
You haven't used use strict to force you to declare all your variables. This is a bad idea.
You have used remote_host, which is the name of the client host system. Your server may not be able to resolve this value, in which case it will remain unset. If you want the IP address, use remote_addr
You have prepared and executed your SQL statement but have fetched no data from the query. You appear to expect the results to be in the array @data, but you haven't declared this array. You would have been told about this had you had use strict in effect
You have used the string '$ip'.txt for your file names so, if you were correctly using the IP address instead of the host name, your files would look like '92.17.182.165'.txt. Do you really want the single quotes in there?
You don't check the status of your open call, so you have no idea whether the open succeeded, or the reason why it may have failed
I doubt if you have really spent the last 48 hours coding this. I think it is much more likely that you are throwing something together in a rush at the last minute, and using Stack Overflow to help you out of the hole you have dug for yourself.
Before asking for the aid of others you should at least use minimal good-practice coding methods such as applying use strict. You should also try your best to debug your code: it would have taken very little to find that $ip has the wrong value and @data is empty.
Use strict and warnings. You want to use strict for many reasons. A decent article on this is over at PerlMonks; you can begin with this: Using strict and warnings.
You don't necessarily need the following line; you are using DBI and can access MySQL directly through DBI:
use DBD::mysql;
Many options are available with CGI; I would recommend reading the perldoc on it as well, based on your preferences and needs.
I would not use the following:
my $q = new CGI;
# I would use as so..
my $q = CGI->new;
Use remote_addr instead of remote_host to retrieve your ip address.
In the following line you are converting all uppercase to lowercase; unless you specifically need to read from your database in all lowercase, I find this unnecessary (see the quick check below).
$term =~ tr/A-Z/a-z/;
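For what it's worth, MySQL's default collations are case-insensitive (the _ci suffix), so LIKE already ignores case there and the lowercasing gains you nothing. A quick check, assuming a _ci collation on the connection:
SELECT 'Anchor Point' LIKE '%anchor%';         -- 1 with a case-insensitive (_ci) collation
SELECT BINARY 'Anchor Point' LIKE '%anchor%';  -- 0, BINARY forces a byte-wise, case-sensitive comparison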
Next, your $sql line: again, user preference, but I would look into sprintf or building the query directly inside your calls. Also, you are trying to read an array of data that does not exist; where is the call to get your data back? I recommend reading the documentation for DBI as well; there are many methods of returning your data. Say you want your data back using an array, for example...
Here is an untested example and hint to help get you started.
use strict;
use warnings;
use CGI qw( :standard );
use CGI::Carp qw( fatalsToBrowser ); # Track your syntax errors
use DBI;
# Get IP Address
my $ip = $ENV{'REMOTE_ADDR'};
# Get your query from param,
# I would also parse your data here
my $term = param('searchterm') || undef;
my $dbh = DBI->connect('DBI:mysql:db:host', 'user', 'pass',
{RaiseError => 1}) or die $DBI::errstr;
my $sql = sprintf ('SELECT * FROM Books WHERE Title LIKE %s
OR Description LIKE %s', $term, $term);
my $sth = $dbh->selectall_arrayref( $sql );
# Retrieve your result data from array ref and turn into
# a hash that has title for the key and a array ref to the data.
my %rows = ();
for my $i ( 0..$#{$sth} ) {
my ($title, $desc, $author, $pub, $isbn, $pic) = @{$sth->[$i]};
$rows{$title} = [ $desc, $author, $pub, $isbn, $pic ];
}
# Storing your table/column names
# in an array for mapping later.
my @cols;
$cols[0] = Tr(th('Title'), th('Desc'), th('Author'),
th('Published'), th('ISBN'), th('Photo'));
foreach (keys %rows) {
push @cols, Tr( td($_),
td($rows{$_}->[0]),
td($rows{$_}->[1]),
td($rows{$_}->[2]),
td($rows{$_}->[3]),
td(img({-src => $rows{$_}->[4]})));
}
print header,
start_html(-title => 'Example'),
start_form(-method => 'POST', -action => 'bookcart.php'), "\n",
table( {-border => undef, -width => '100%'}, @cols ),
submit(-name => 'Submit', -value => 'Add Entry'),
end_form,
end_html;
# Do something if submit is clicked...
if ( param('Submit') ) {
......
}
This assumes that you're using the OO approach to CGI.pm, and that $q is the relevant object. This should work, assuming that you have $q = new CGI somewhere in your script.
Can you post the rest of the script?
I've created a mockup to test this, and it works as expected:
#!/usr/bin/perl
use CGI;
my $q = new CGI;
print $q->header;
print "<form><input type=submit name=submit value='add'></form>\n";
if ($q->param('submit')) {
print "submit is \"" . $q->param('submit') . "\"\n";
}
After the submit button is clicked, the page displays that submit is "add" which means the evaluation is going as planned.
I guess what you need to do is make sure that $q is your CGI object, and move forward from there.

Perl Import large .csv to MySQL, don't repeat data

I am trying to import several .csv files into a mysql database, the script below works except that it only imports the first row of my csv data into the database. Both my tables are populated with exactly one data entry.
Any help would be appreciated.
Thank you
#!/usr/bin/perl
use DBI;
use DBD::mysql;
use strict;
use warnings;
# MySQL CONFIG VARIABLES
my $host = "localhost";
my $user = "someuser";
my $pw = "somepassword";
my $database = "test";
my $dsn = "DBI:mysql:database=" . $database . ";host=" . $host;
my $dbh = DBI->connect($dsn, $user, $pw)
or die "Can't connect to the DB: $DBI::errstr\n";
print "Connected to DB!\n";
# enter the file name that you want import
my $filename = "/home/jonathan/dep/csv/linux_datetime_test_4.26.13_.csv";
open FILE, "<", $filename or die $!;
$_ = <FILE>;
$_ = <FILE>;
while (<FILE>) {
my @f = split(/,/,$_);
if (length($f[4]) < 10) {
print "No Weight\n";
}
else {
#insert the data into the db
print "insert into datetime_stamp\n";
}
my $sql = "INSERT INTO datetime_stamp (subject, date, time, weight)
VALUES('$f[1]', '$f[2]', '$f[3]', '$f[4]')";
print "$sql\n";
my $query = $dbh->do($sql);
my $sql = "INSERT INTO subj_weight (subject, weight) VALUES('$f[1]', '$f[2]')";
my $query = $dbh->do($sql);
close(FILE);
}
As has been commented, you close the input file after reading the first data entry, and so only populate your database with a single record.
However there are a few problems with your code you may want to consider:
You should set autoflush on the STDOUT file handle if you are printing diagnostics as the program runs. Otherwise perl won't print the output until either it has a buffer full of text to print or the file handle is closed when the program exits. That means you may not see the messages you have coded until long after the event
You should use Text::CSV to parse CSV data instead of relying on split
You can interpolate variables into a double-quoted string. That avoids the use of several concatenation operators and makes the intention clearer
Your open is near-perfect - an unusual thing - because you correctly use the three-parameter form of open as well as testing whether it succeeded and putting $! in the die string. However, you should also always use a lexical file handle instead of the old-fashioned global ones
You don't chomp the lines you read from the input, so the last field will have a trailing newline. Using Text::CSV avoids the need for this
You use indices 1 through 4 of the data split from the input record. Perl indices start at zero, so that means you are dropping the first field. Is that correct?
Similarly you are inserting fields 1 and 2, which appear to be subject and date, into fields called subject and weight. It seems unlikely that this can be right
You should prepare your SQL statements, use placeholders, and provide the actual data in an execute call
You seem to diagnose the data read from the file ("No Weight") but insert the data into the database anyway. This may be correct but it seems unlikely
Here is a version of your program that includes these amendments. I hope it is of use to you.
#!/usr/bin/perl
use strict;
use warnings;
use DBI;
use Text::CSV;
use IO::Handle;
STDOUT->autoflush;
# MySQL config variables
my $host = "localhost";
my $user = "someuser";
my $pw = "somepassword";
my $database = "test";
my $dsn = "DBI:mysql:database=$database;host=$host";
my $dbh = DBI->connect($dsn, $user, $pw)
or die "Can't connect to the DB: $DBI::errstr\n";
print "Connected to DB!\n";
my $filename = "/home/jonathan/dep/csv/linux_datetime_test_4.26.13_.csv";
open my $fh, '<', $filename
or die qq{Unable to open "$filename" for input: $!};
my $csv = Text::CSV->new;
$csv->getline($fh) for 1, 2; # Drop header lines
my $insert_datetime_stamp = $dbh->prepare( 'INSERT INTO datetime_stamp (subject, date, time, weight) VALUES(?, ?, ?, ?)' );
my $insert_subj_weight = $dbh->prepare( 'INSERT INTO subj_weight (subject, weight) VALUES(?, ?)' );
while (my $row = $csv->getline($fh)) {
if (length($row->[4]) < 10) {
print qq{Invalid weight: "$row->[4]"\n};
}
else {
#insert the data into the db
print "insert into datetime_stamp\n";
$insert_datetime_stamp->execute(@$row[1..4]);
$insert_subj_weight->execute(@$row[1,4]);
}
}

MySQL to XML file

I am trying to get a MySQL database into an XML file; here is my code:
<?php
header("Content-type: text/xml");
include 'dbc.php';
$query = "SELECT * FROM airports LIMIT 50";
$result = mysql_query($query, $link)
or die('Error querying database.');
$xml = new SimpleXMLElement('<xml/>');
while($row = mysql_fetch_assoc($result)) {
$draw = $xml->addChild('draw');
$draw->addChild('ident',htmlentities(iconv("UTF-8", "ISO-8859-1//IGNORE",$row['ident'])));
$draw->addChild('name',htmlentities(iconv("UTF-8", "ISO-8859-1//IGNORE",$row['name'])));
}
mysql_close($link);
$fp = fopen("links2.xml","wb");
fwrite($fp,$xml->asXML());
fclose($fp);
Here is the error Im getting:
XML Parsing Error: no element found
Location: /sql2xml2.php
Line Number 1, Column 2:
-^
What am I doing wrong???
Your XML is considered invalid by your XML reader because of the thrown warning, hence the XML Parsing Error: junk after document element issue.
As for the warning itself, you need to escape special entities (namely &, < and >) in your content when adding it like that (using str_replace usually works well for just those three when it comes to XML; htmlentities may yield undesired effects unless you supply PHP 5.4's ENT_XML1 mode).
Refer to a related answer for more information on why this happens.
If you just want to export a MySQL database to a local XML file, you can use the mysqldump tool:
mysqldump --xml -u username -p databasename [tablename] > filename.xml
Got it to work with this code:
<?
header("content-type:text/xml");
function getXML($query = "SELECT * FROM airports limit 50")
{
    include 'dbc.php';
    $result = mysql_query($query, $link)
        or die('Error querying database.');
    $columns = "";
    echo "<xml>\n";
    while ($row = mysql_fetch_assoc($result))
    {
        $columns .= "\t<airport>\n";
        foreach ($row as $key => $value)
        {
            $value = htmlentities(iconv("UTF-8", "ISO-8859-1//TRANSLIT", $value));
            $value = htmlentities(iconv("UTF-8", "ISO-8859-1//IGNORE", $value));
            $columns .= "\t\t<$key>$value</$key>\n";
        }
        $columns .= "\t</airport>\n";
    }
    echo $columns;
    echo "</xml>\n";
}
getXML();
?>