I am getting a PHP warning:
fread() expects parameter 1 to be resource, boolean given in /tmp/func.php on line 12
My code:
<?php
$sqlite = 'teleinfo.sqlite';
function getTeleinfo () {
$handle = fopen ('/dev/ttyACM0', "r"); // open the stream
while (fread($handle, 1) != chr(2));
--------
Thank you!
php --version
PHP 5.4.45-0+deb7u2 (cli) (built: Oct 27 2015 23:22:07)
Copyright (c) 1997-2014 The PHP Group
Zend Engine v2.4.0, Copyright (c) 1998-2014 Zend Technologies
'Boolean given' suggests the return value from your fopen is the value false, so your attempt to open /dev/ttyACM0 for reading failed.
You should check that the handle is valid after the fopen, before trying to use it:
$handle = fopen ('/dev/ttyACM0', "r"); // open the stream
if ($handle) {
while (fread($handle, 1) != chr(2));
...
}
As for why? Perhaps the device doesn't exist, you don't have permission, or something else has it open with an exclusive lock.
If you are following this tutorial: http://www.magdiblog.fr/gpio/teleinfo-edf-suivi-conso-de-votre-compteur-electrique/
Data is received on /dev/ttyAMA0:
stty -F /dev/ttyAMA0 1200 sane evenp parenb cs7 -crtscts
But in the PHP script it is read from /dev/ttyACM0.
You should change the filename in the PHP script!
I am trying to import a file into a MariaDB (MySQL) database. A proof-of-concept file is in .json format on the web at this location: https://earthquake.usgs.gov/earthquakes/feed/v1.0/summary/all_week.geojson
I know how to do this on Db2 for i:
select * from JSON_TABLE(
SYSTOOLS.HTTPGETCLOB('https://earthquake.usgs.gov' ||
'/earthquakes/feed/v1.0/summary/all_week.geojson',null),
'$.features[*]'
COLUMNS( MILLISEC BIGINT PATH '$.properties.time',
MAG DOUBLE PATH '$.properties.mag',
PLACE VARCHAR(100) PATH '$.properties.place'
)) AS X;
This reads the data from the web and lists three fields. Surrounding it with an INSERT clause will put it into a database file for me.
I would like to do exactly the same thing on my home server using MariaDB. Ultimately a script will run unattended on a hosted server.
A .json segment of the earthquake data looks like this:
… "features":[
{"type":"Feature",
"properties":{
"mag":1.1,
"place":"58 km WNW of Anchor Point, Alaska",
"time":1640472257402,
"updated":1640472615410,
"tz":null,
"url":"https://earthquake.usgs.gov/earthquakes/eventpage/ak021gi3al5x",
"detail":"https://earthquake.usgs.gov/earthquakes/feed/v1.0/detail/ak021gi3al5x.geojson",
"felt":null,
"cdi":null,
"mmi":null,
"alert":null,
"status":"automatic",
"tsunami":0,
"sig":19,
"net":"ak",
"code":"021gi3al5x",
"ids":",ak021gi3al5x,",
"sources":",ak,",
"types":",origin,",
"nst":null,
"dmin":null,
"rms":0.79,
"gap":null,
"magType":"ml",
"type":"earthquake",
"title":"M 1.1 - 58 km WNW of Anchor Point, Alaska"},
"geometry":{
"type":"Point",
"coordinates":[-152.8406,59.9119,89.7]
},
"id":"ak021gi3al5x"
}, ...
Just a thought, but you could maybe just write a Bash script, e.g.
#!/bin/sh
cd "$(dirname "$0")"
export PATH=/bin:/usr/bin:/usr/local/bin
TODAY=`date +"%d%b%Y-%H%M"`
MYSQL_HOST='127.0.0.1'
MYSQL_PORT='3306'
MYSQL_USER='root'
MYSQL_PASSWORD='root'
url=https://earthquake.usgs.gov/earthquakes/feed/v1.0/summary/all_week.geojson
DATA=$(curl ${url} 2>/dev/null)
printf '%s' "$DATA" | awk '{print $0}'
exit
...
...
...
and then use a tool like https://webinstall.dev/jq/, Python and whatever other tools you have on your system to extract the data that you want and then update the DB.
I am more familiar with PHP, which would also work. You could tidy the following up a bit, but it seems to work:
earthquake.php
<?php
$database = false;
try {
$options = array(PDO::ATTR_DEFAULT_FETCH_MODE => PDO::FETCH_OBJ, PDO::ATTR_ERRMODE => PDO::ERRMODE_WARNING, PDO::ATTR_EMULATE_PREPARES => true );
$conn = new PDO('mysql:host=127.0.0.1;dbname=test;port=3306;charset=utf8','root','root', $options);
} catch (PDOException $e) {
// Echo custom message. Echo error code gives you some info.
echo '[{"error":"Database connection can not be estabilished. Please try again later. Error code: ' . $e->getCode() . '"}]';
exit;
}
$url = "https://earthquake.usgs.gov/earthquakes/feed/v1.0/summary/all_week.geojson";
$ch = curl_init();
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_URL, $url);
$result = curl_exec($ch);
$features = json_decode($result)->features;
foreach ($features as $feature) {
echo 'Mag: '.$feature->properties->mag.', Place: '.$feature->properties->place.', '.gmdate("Y-m-d H:i:s", $feature->properties->time/1000).PHP_EOL;
$query = 'INSERT INTO features (mag, place, time) VALUES (?, ?, ?)';
$params = [$feature->properties->mag, $feature->properties->place, gmdate("Y-m-d H:i:s", $feature->properties->time/1000)];
$stmt = $conn->prepare($query) or die ('["status":{"error":"Prepare Statement Failure","query":"' .$query . '"}]');
$stmt->execute($params) or die('[{"error":"' . $stmt->errorInfo()[2] . '","query":"' .$query . '","params":' .json_encode($params) . '}]');
}
?>
Create a local DB with a features table that has mag, place and time columns (the PHP example above connects to a database called test). If you have PHP on your system, just run it from the CLI: php earthquake.php
See e.g. Insert into mysql from Bash script
I use Laravel a bit, and would probably actually build my own model and use Eloquent and a little UI to handle that, but using a script seems like an option.
I'm trying to parse some JSON data with the Fandom wikia API. When I browse to my marvel.fandom.com/api request I get the following JSON output: {"batchcomplete":"","query":{"pages":{"45910":{"pageid":45910,"ns":0,"title":"Uncanny X-Men Vol 1 171"}}}}
Nothing too fancy to begin with, and running it through an online JSON parser gives the following output:
{
"batchcomplete":"",
"query":{
"pages":{
"45910":{
"pageid":45910,
"ns":0,
"title":"Uncanny X-Men Vol 1 171"
}
}
}
}
which seems to be OK as far as I can see.
I want to get the pageid for several other requests, but I can't seem to get the same output through Perl.
The script:
#!/usr/bin/perl
use strict;
use warnings;
use LWP::Simple;
use JSON;
use Data::Dumper;
my $url = "https://marvel.fandom.com/api.php?action=query&titles=Uncanny%20X-Men%20Vol%201%20171&format=json";
my $json = getprint( $url);
die "Could not get $url!" unless defined $json;
my $decoded_json = decode_json($json);
print Dumper($decoded_json);
but this gives following error:
Could not get https://marvel.fandom.com/api.php?action=query&titles=Uncanny%20X-Men%20Vol%201%20171&format=json! at ./marvelScraper.pl line 11.
When I change the get to getprint for some extra info, I get this:
500 Can't connect to marvel.fandom.com:443
<URL:https://marvel.fandom.com/api.php?action=query&titles=Uncanny%20X-Men%20Vol%201%20171&format=json>
malformed JSON string, neither tag, array, object, number, string or atom, at character offset 0 (before "(end of string)") at ./script.pl line 13.
I tried this on another computer and still get the same errors.
The versions of LWP::Simple and LWP::Protocol::https:
/usr/bin/perl -MLWP::Simple -E'say $LWP::Simple::VERSION'
6.15
/usr/bin/perl -MLWP::Protocol::https -E'say $LWP::Protocol::https::VERSION'
6.09
Apparently it has something to do with Bash on Ubuntu on Windows, since on Ubuntu 18.04 I get the following response (with the same script):
JSON text must be an object or array (but found number, string, true, false or null, use allow_nonref to allow this) at ./test.pl line 13.
{"batchcomplete":"","query":{"pages":{"45910":{"pageid":45910,"ns":0,"title":"Uncanny X-Men Vol 1 171"}}}}
Actually, the very same script works from my Bash Ubuntu on Windows with the get() command instead of the getprint() you gave after editing your question.
orabig#Windows:~/DEV$ ./so.pl
$VAR1 = {
'query' => {
'pages' => {
'45910' => {
'pageid' => 45910,
'ns' => 0,
'title' => 'Uncanny X-Men Vol 1 171'
}
}
},
'batchcomplete' => ''
};
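For reference, a minimal sketch of that change, using the same URL and modules as in your question (the only difference is get() in place of getprint()):
#!/usr/bin/perl
use strict;
use warnings;
use LWP::Simple;
use JSON;
use Data::Dumper;

my $url = "https://marvel.fandom.com/api.php?action=query&titles=Uncanny%20X-Men%20Vol%201%20171&format=json";

# get() returns the response body (or undef on failure), while getprint()
# prints the body and returns the HTTP status code, which is why
# decode_json() never received the JSON string.
my $json = get($url);
die "Could not get $url!" unless defined $json;

my $decoded_json = decode_json($json);
print Dumper($decoded_json);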
So maybe you have another issue that has nothing to do with Perl or Ubuntu.
Can you try this, for example?
curl -v 'https://marvel.fandom.com/api.php?action=query&titles=Uncanny%20X-Men%20Vol%201%20171&format=json'
Maybe you just hit the site too much, and the 500 error is just the result of some anti-leech protection?
I just got the DFRobot relay and am trying to communicate with it using JSON commands.
The PDF is here.
Any help, please, on how to send an on/off command from the terminal?
That is pretty easy: just open a telnet connection to the device, for example:
telnet 192.168.1.10 2000
and copy paste one of the JSON commands from their pdf document, like:
{"relay1":"on","relay2":"on","relay3":"off","relay4":"off","relay5":"off", "relay6":"off","relay7":"off","relay8":"off"}
and hit Enter. If you want more than telnet commands, you can check this GitHub project:
https://github.com/Dzduino/DFRobot-RLY-8-Web-Control/blob/master/index.php
You can use a PHP socket like the one below in an HTML or PHP page; note that you have to use a local PHP server (e.g. XAMPP) to test it:
<?php
$addr = "192.168.1.10"; // RLY-8 Default Adress
$port = 2000; // RLY-8 Default port
$timeout = 30; // Connection Time out in Sec
if (isset($_POST["cmd"])){ // check if a submit was done, otherwise the communicatino will start after page loading
$cmd = $_POST["cmd"] ; // Capture the input Command
$fp = fsockopen ($addr, $port, $errno, $errstr, $timeout ); // initiate a socket connection
if (!$fp) {
echo "($errno) $errstr\n"; // return the error if no connection was established
} else {
fwrite ($fp, $cmd); // Send the command to the connected device
echo fread($fp, 128); // Echo the return string from the device
fclose ($fp); // close the connection
}
}
?>
I have to create a little Perl script which creates a TCP socket and puts the input from this socket into a MySQL table.
Background: I get Call Data Records from a Siemens Hipath phone system via this TCP socket.
I have to write the data, which is CSV, into a DB for future use.
The format I get is the following: 13.05.14;15:01:14;3;10;00:26;00:00:45;0123456789;;1;
I'm new to Perl and have a few "noob" questions :)
1: How is it possible to run that script in the background (as a daemon)?
2: The script only handles the last line of the delivered CSV data. If there are two lines, it ignores the first line. How can I fix that?
3: Today I got this output from my script: DBD::mysql::st execute failed: MySQL server has gone away at ./hipath-CDR-Server.pl line 41. What happened there?
Can anyone help me with these (maybe easy) questions?
This is what I have so far:
#!/usr/bin/perl
use strict;
use warnings;
use DBI;
use IO::Socket::INET;
my $dbh = DBI->connect('DBI:mysql:database=hipathCDR;host=localhost','***','***',{ RaiseError => 1, AutoCommit => 1 }, );
my $sql = 'INSERT INTO calls (Datum,Zeit,Leitung,Teilnehmer,Rufzeit,Gespraechszeit,RufNr,Gebuehr,Typ) VALUES (?,?,?,?,?,?,?,?,?)';
my $sth = $dbh->prepare($sql);
# auto-flush on socket
$| = 1;
# creating a listening socket
my $socket = new IO::Socket::INET (
LocalHost => '0.0.0.0',
LocalPort => '4444',
Proto => 'tcp',
Listen => 5,
Reuse => 1
);
die "cannot create socket $!\n" unless $socket;
while(1)
{
# waiting for a new client connection
my $client_socket = $socket->accept();
# get information about a newly connected client
my $client_address = $client_socket->peerhost();
my $client_port = $client_socket->peerport();
# read up to 1024 characters from the connected client
my $data = "";
$client_socket->recv($data, 1024);
chomp ($data);
my @values = split(';', $data);
print "Array: @values\n";
$sth->execute(@values);
# write response data to the connected client
$data = "ok";
$client_socket->send($data);
# notify client that response has been sent
shutdown($client_socket, 1);
}
$socket->close();
You have more than one question; I am going to answer the second: here is how to get all the data.
my $data = '';
my $buf;
# recv() fills $buf and returns undef on error; an empty read means
# the client closed the connection, so stop accumulating there.
while (defined $client_socket->recv($buf, 1024)) {
    last unless length $buf;
    $data .= $buf;
}
It would be safer to parse the incoming CSV-like data with Text::CSV.
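A rough sketch of that, reusing $data and $sth from your script, and assuming one semicolon-separated record per line with the nine columns from your INSERT statement:
use Text::CSV;

my $csv = Text::CSV->new({ sep_char => ';', binary => 1 });
for my $line (split /\r?\n/, $data) {
    next unless length $line;
    if ($csv->parse($line)) {
        # the trailing ';' produces an extra empty field, so keep the first nine
        my @fields = ($csv->fields)[0 .. 8];
        $sth->execute(@fields);
    } else {
        warn "Could not parse line: $line\n";
    }
}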
Answer to question three:
Most probably your MySQL server was unavailable at that time.
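Since the script holds one connection open while it waits for new clients, an idle timeout on the server side can also produce "MySQL server has gone away". A sketch of two common mitigations, using the connect arguments from your script:
# right after DBI->connect: let DBD::mysql reconnect automatically
# (only honoured while AutoCommit is on, which your script sets)
$dbh->{mysql_auto_reconnect} = 1;

# or, inside the accept loop before the execute: check and rebuild the handle
unless ($dbh->ping) {
    $dbh = DBI->connect('DBI:mysql:database=hipathCDR;host=localhost',
                        '***', '***', { RaiseError => 1, AutoCommit => 1 });
    $sth = $dbh->prepare($sql);
}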
Answer to question one:
nohup ./script.pl &
Okay, I am just kidding: How can I run a Perl script as a system daemon in linux?
use Proc::Daemon;
Proc::Daemon::Init;
my $continue = 1;
$SIG{TERM} = sub { $continue = 0 };
...
while($continue)
{
...
}
I have the following script to update one of my FTP passwords every 15 days through a cron job and e-mail the appropriate people after the attempt has been made. It randomly fails, and when I run it again manually it works. I can't seem to find where it's going wrong.
The script connects to a local MySQL database, grabs the login and password for an account, and then changes that password on FTP. Everything is successful up until the password-changing part. Again, it's random: sometimes it works, sometimes it doesn't.
Thanks!
#!/usr/bin/perl -w
#
use DBI;
use Net::FTP;
our $dbh = DBI->connect('DBI:mysql:database:127.0.0.1','user','password') or die "Aargh $!\n";
$transquery=q{SELECT dest_login,dest_password FROM list where id=123};
$sth=$dbh->prepare($transquery);
$sth->execute();
while($co=$sth->fetchrow_hashref){
$login=$co->{'dest_login'};
$pass=$co->{'dest_password'};
}
$changeresult='FAIL';
$actionlog='';
$newstring='';
$upperchars='ABCDEFGHIJKLMNOPQRSTUVWXYZ';
$lowerchars='abcdefghijklmnopqrstuvwxyz';
$allowedchars='ABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789##$';
$l=length($upperchars);
$newstring.=substr($upperchars,int(rand($l)),1);
$newstring.=substr($lowerchars,int(rand($l)),1);
$l=length($allowedchars);
for ($i=0;$i<6;$i++){
$newstring.=substr($allowedchars,int(rand($l)),1);
}
print "$newstring\n";
$actionlog .= "Setting Password for $login from $pass to $newstring\n";
$username=
eval{
$ftp=Net::FTP->new('x.x.x.x',Timeout=>480,Debug=>1) or die "Error connecting FTP $!\n";
$changepassword="$pass/$newstring/$newstring";
$ftp->login($login,$changepassword) or die "Error changing password $!\n";
#If we are here, time to update the password
$changeresult='SUCCESS';
$actionlog .= "Password successfully updated\n";
$transquery=q{UPDATE list set dest_password=(?) where id=123};
$sth=$dbh->prepare($transquery);
$sth->execute($newstring);
};
if ($@) {
$actionlog = $actionlog . "$@\n";
};
if($actionlog ne ""){
#print $actionlog;
#my $send_to = "To: someone\@example.com\n";
my $send_to = "To: databaseusers\@example.com\n";
my $sendmail = "/usr/sbin/sendmail -t";
open(SENDMAIL, "|$sendmail") or die "Cannot open $sendmail: $!";
print SENDMAIL "Reply-to: databasepassword\#example.com\n";
print SENDMAIL "Subject: Password Change Information [$changeresult]\n";
print SENDMAIL $send_to;
print SENDMAIL "Content-type: text/plain\n\n";
print SENDMAIL $actionlog;
close(SENDMAIL);
$actionlog='';
}
else{
#print "Nothing done this session\n";
}
USUW might tell you something (use strict; use warnings;).
Does anything print?
You don't do much error checking in the DBI part at the beginning; perhaps you're getting a connect error. AIX boxes used to have a problem where they would get a client port that the system was unsure was still in use. When that happened, the connection to the database would simply fail.
I finally fixed that problem for our scripts by examining $OS_ERROR (aka $!) for that particular code (Errno::EADDRINUSE) and then waiting and retrying with an exponential falloff (wait 2 seconds, then 4, then 8, ...).
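A rough sketch of that pattern applied to the DBI connect in your script (connect arguments taken from your code; the five-attempt limit is an arbitrary choice, and whether $! carries a useful code after a failed connect depends on your driver and platform):
use Errno ();   # makes named checks like $!{EADDRINUSE} available

my ($dbh, $wait) = (undef, 2);
for my $attempt (1 .. 5) {
    $dbh = DBI->connect('DBI:mysql:database:127.0.0.1', 'user', 'password');
    last if $dbh;
    # retry only the transient "address in use" case; give up on anything else
    die "DBI connect failed: $DBI::errstr\n" unless $!{EADDRINUSE};
    sleep $wait;
    $wait *= 2;    # exponential falloff: 2, 4, 8, 16 ...
}
die "Could not connect after retries: $DBI::errstr\n" unless $dbh;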
If your script "dies for some reason", then it's important that the script can tell you that reason. I would investigate error reporting in the various modules you are using.
For example, Net::FTP allows you to pass a Debug => 1 switch, and then you'll see the whole conversation.
And there is a whole lot more in DBI where you can get error reporting.
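As a rough sketch of what that could look like in your script (your Net::FTP call already passes Debug => 1; the additions here are the DBI attributes and reporting via $@ and $ftp->message, since neither DBI nor Net::FTP puts its errors in $!):
my $dbh = DBI->connect(
    'DBI:mysql:database:127.0.0.1', 'user', 'password',
    {
        RaiseError => 1,   # croak with the driver's own message on any failure
        PrintError => 0,   # avoid reporting every error twice
    },
);

my $ftp = Net::FTP->new('x.x.x.x', Timeout => 480, Debug => 1)
    or die "Error connecting FTP: $@\n";                    # new() reports via $@
$ftp->login($login, $changepassword)
    or die "Error changing password: " . $ftp->message;     # last server reply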