PHP/MySQL: simplexml_load_string() error after mysql_escape_string()

I save the content of an XML file in a MySQL database with:
$content = mysql_escape_string($content);
$insert = mysql_query("insert into $db_table_xml (url,content) values ('$url','$content')" );
//content type : TEXT in MySQL
Later, I call:
simplexml_load_string($content);
and it returns an error:
Warning: simplexml_load_string() [function.simplexml-load-string]: Entity: line 361: parser error : AttValue: ' expected in D:\mkw\dev\Web\PHPnow-1.5.6\htdocs\yt2\common.php on line 84
Notice: Trying to get property of non-object in D:\mkw\dev\Web\PHPnow-1.5.6\htdocs\yt2\common.php on line 146
Warning: Invalid argument supplied for foreach() in D:\mkw\dev\Web\PHPnow-1.5.6\htdocs\yt2\common.php on line 146

This error means it is not valid XML; inspect the XML to see if there is anything wrong with it. Look at line 361 of it: there is probably a special character or a similar problem.
Edit: when you escaped the XML, you introduced invalid characters into $content itself, and that escaped string is what you later passed to simplexml_load_string(). Use a separate variable for the escaped copy:
$content_escape = mysql_escape_string($content);
$insert = mysql_query("insert into $db_table_xml (url,content) values ('$url','$content_escape')"); // content type: TEXT in MySQL
Now your original $content is not affected.

It looks to me like your variable $content contains single quotes that prematurely terminate the string in your query. If that is the case, you can do this:
$content = addslashes($content);
Then write that into your record.

Related

Why does reading JSON from database with Perl warn about "wide characters"?

I'm reading data from a database using Perl (I'm new to Perl); one of the columns holds JSON. The problem I'm having is that when I try to decode the JSON I get the error "Wide character in subroutine entry".
Table:
id | name | date | data
Sample data:
{ "duration":"24", "name":"My Test","visible":"1" }
use JSON qw(decode_json);

my $current_connection = $DATABASE_CONNECTION->prepare("SELECT * FROM data WHERE syt = 'area1'");
$current_connection->execute();

while ( my $db_data = $current_connection->fetchrow_hashref() ) {
    my $name   = $db_data->{name};
    my $object = decode_json( $db_data->{data} );
    foreach my $key ( sort keys %{$object} ) {
        $pages .= "<p> $object->{$key} </p>";
    }
}
That error means a character greater than 255 was passed to a sub expecting a string of bytes.
When stored in the database, the string is encoded using some character encoding, possibly UTF-8. You appear to have decoded the text (e.g. by using mysql_enable_utf8mb4 => 1), producing a string of Unicode Code Points. However, decode_json expects UTF-8.
The solution is to use from_json or JSON->new->decode instead of decode_json; these expect decoded text (a string of UCP).
You can verify that this is the issue using sprintf '%vX', $json.
For example, for "é" and "♠":
If you get E9 and 2660, those are their code points (UCP); use from_json or JSON->new->decode.
If you get C3.A9 and E2.99.A0, that is their UTF-8 encoding; use decode_json or JSON->new->utf8->decode.
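A minimal, self-contained sketch of that check (the two literals below are illustrative stand-ins for the fetched column; nothing here comes from the question's database):
use strict;
use warnings;
use utf8;                        # this source file itself is saved as UTF-8
use Encode qw(encode_utf8);

my $text  = "é♠";                # decoded text: a string of code points
my $bytes = encode_utf8($text);  # the same text as UTF-8 bytes

printf "text:  %vX\n", $text;    # prints "text:  E9.2660"        -> use from_json
printf "bytes: %vX\n", $bytes;   # prints "bytes: C3.A9.E2.99.A0" -> use decode_json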

Putting comma-separated values into a new file line by line

From the command line we pass multiple values separated by commas, such as sydney,delhi,NY,Russia, as an option. These values get stored in $runTest in the Perl script. Now I want to create a new file from the script with the contents of $runTest, but written line by line. For example:
INPUT (passed values from command line):
sydney,delhi,NY,Russia
OUTPUT (under new file: myfile):
sydney
delhi
NY
Russia
In a simple case like this it is better to use split on the delimiter than tr. A few minor points: use snake_case for names instead of CamelCase, and use autodie to make open, close, etc. fatal, without the need to clutter the code with or die "...":
use autodie;
my $run_test = 'sydney,delhi,NY,Russia';
open my $out, '>', 'myFile';
print {$out} map { "$_\n" } split /,/, $run_test;
close $out;
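Since the question feeds the list in from the command line rather than hard-coding it, here is a small variant of the same idea (a sketch; the script name and usage message are illustrative):
use strict;
use warnings;
use autodie;

# Hypothetical invocation:  perl split_list.pl sydney,delhi,NY,Russia
my ($run_test) = @ARGV;
die "usage: $0 value1,value2,...\n" unless defined $run_test;

open my $out, '>', 'myFile';
print {$out} map { "$_\n" } split /,/, $run_test;
close $out;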
For more robust parsing in general, beyond this simple example, prefer specialized modules such as Text::CSV or Text::CSV_XS for CSV parsing. Compared to the overly simplistic split, Text::CSV_XS correctly handles quoted fields, fields containing the delimiter (comma), and binary characters, and it provides error messages, among other things. Example:
use Text::CSV_XS;
use autodie;

open my $out, q{>}, q{myFile};

# All of these input strings are handled correctly, unlike when using "split":
# my $run_test = q{sydney,delhi,NY,Russia};
# my $run_test = q{sydney,delhi,NY,Russia,"field,with,commas"};
my $run_test = q{sydney,delhi,NY,Russia,"field,with,commas","field,with,missing,quote};

# binary    => 1 : enable parsing binary characters in quoted fields.
# auto_diag => 1 : print the internal error code and message to STDERR.
my $csv = Text::CSV_XS->new( { binary => 1, auto_diag => 1 } );

if ( $csv->parse($run_test) ) {
    print {$out} map { "$_\n" } $csv->fields;
}
else {
    print STDERR q{parse() failed on: }, $csv->error_input, qq{\n};
}

I have a warning on all of my posts on my page after I updated my theme

I have just updated my theme and the following error occurs:
Warning: sprintf(): Too few arguments in /home/itsallab/public_html/wp-content/themes/covernews/lib/breadcrumb-trail/inc/breadcrumbs.php on line 254
Line 254 of that file reads:
sprintf('%s', $item );
What could cause this error?
I found the resolution: the format string contained two extra quotes around '.esc_url( $link_item ).'. I don't know why the update changed it.
String concatenation and sprintf should not be combined in a single format string: if the concatenated value contains a % character (common in escaped URLs), sprintf treats it as another conversion specifier and then complains that it has too few arguments. Give each value its own placeholder instead:
sprintf('%s%s', esc_url( $link_item ), $item );

Reading from JSON file: garbage after JSON object

I'm using a JSON file in a Perl program, but I'm unable to parse it. It gives the following error:
garbage after JSON object, at character offset 2326471 (before "{"response":{"numFou...") at /usr/local/share/perl5/JSON.pm line 171, <$f> line 1.
Here is the code:
print "input json";
open(my $f, "<", "$ARGV[1]");
my $content=<$f>;
my $structured;
eval {
$structured = from_json($content, {utf8 => 1});
};
if ($#) {
$content =~ s/\n/ /g;
my $errMsg = $#;
$errMsg =~ s/\n/ /g;
WriteInfo("Unparseable result for url=$url, error: $errMsg\n") ;
};
How can I fix this error?
You can't fix JSON data automatically. There could be many "fixes" that would get the data through the parser, but it may be difficult to tell which of them is the correct one. You should talk to the source of the data and ask for a correct version.
It may be possible to fix the data manually, but you should only attempt this if no correct version of the data is available. Finding the error in a 2.2MB+ text file by hand isn't a trivial job, and character position 2326471 is only where the parser found an error, not where the correction should be made.
garbage after JSON object ...
This implies that from_json has found the end of the JSON data -- i.e. the final closing brace } or bracket ] -- but there is data in the string after that character. It may be that the file has been written correctly and there really is spurious data after the end of the JSON. If so, that should be obvious just by examining the data file.
Note
Unless you have redefined the $/ variable, these lines
open(my $f, "<", "$ARGV[1]");
my $content = <$f>;
will read just the first line of the file into $content. It may be that the file contains a single very long line of text (i.e. it contains no newline characters), but this line in your error handler
$content =~ s/\n/ /g;
implies that there are newlines in there.
Reading only the first line of a multi-line JSON file wouldn't cause the error that you're seeing, but it is best to read the entire file into memory before decoding it as JSON data, just in case unexpected newlines have crept into it.
Here is a better way of writing your code segment:
use JSON qw(from_json);

print "Input JSON\n";

my $content = do {
    open my $fh, '<', $ARGV[1] or die qq{Unable to open "$ARGV[1]" for input: $!};
    local $/;    # slurp mode: read the whole file at once
    <$fh>;
};

my $structured = eval {
    from_json( $content, { utf8 => 1 } );
};

if ( my $err_msg = $@ ) {
    $content =~ tr/\n/ /;
    $err_msg =~ tr/\n/ /;
    WriteInfo("Unparsable result for URL=$url, error: $err_msg\n");
}
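As an aside, if the file really does contain spurious data after the end of the JSON, the decode_prefix method (provided by the JSON module's backends such as JSON::PP and JSON::XS) can locate it. A minimal sketch, assuming $content already holds the whole file as UTF-8 bytes:
use JSON;

# decode_prefix() parses the leading JSON value and also reports how many
# characters it consumed, which pinpoints where the trailing garbage begins.
my ( $structured, $consumed ) = JSON->new->utf8->decode_prefix($content);
print "JSON ends at offset $consumed of ", length($content), "\n";
print "trailing data starts with: ", substr( $content, $consumed, 40 ), "\n";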

perl eval throwing blank exception

I am updating a few records in a database, and all the processing is done in an eval block.
The problem is that even when the records are successfully updated, I still see an exception being raised.
To debug the exception, I tried printing it with Data::Dumper, but the exception is blank.
Can anyone please help me identify what this error is and why it is thrown every time?
Environment details: Perl 5.8 and SUSE Unix.
Dump from Data Dumper:
$VAR1 = '
';
I am using various internal APIs to update these records, so I have modified my code to look similar to this:
sub main {
    eval {
        DB->updateRecord($value);
    };
    if ($@) {
        Mail->SendMail(__PACKAGE__, $@);
    }
}

package DB;

sub updateRecord {
    my ($self, $value) = @_;
    my $query = "update set column_value = $value ..<update query> ";
    API->processQuery($query);
}
Does your code use warnings;?
The symptom you're describing indicates that your code is passing die the string "\n". My guess would be that somewhere in your source there is a line that tries to die with an error message, but the error message was never initialized. It could be something like:
my $error;
if (some_test()) {
    $error = 'Some String';
}
if (some_other_test()) {
    die "$error\n";
}
If some_test() fails but some_other_test() passes, the die will report an error containing only a newline. It will also emit an uninitialized-value warning if warnings are enabled.
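A self-contained sketch of that failure mode (the two test subs are illustrative stand-ins):
use strict;
use warnings;
use Data::Dumper;

sub some_test       { 0 }    # fails, so $error is never assigned
sub some_other_test { 1 }    # passes, so the die executes

eval {
    my $error;
    $error = 'Some String' if some_test();
    die "$error\n" if some_other_test();   # warns about an uninitialized value,
};                                         # then dies with just "\n"
print Dumper($@);                          # prints the same blank $VAR1 as in the question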
Another possibility is a typo. If you don't use strict;, the error variable's name might simply be misspelled:
my $error = 'Some String';
if ($error) {
    # note the typo (transposed "ro" to "or")
    die "$erorr\n";
}
Without use strict; this can be an easy mistake to miss.