Perl fetch without execute error - at end of execution - mysql

My current Perl code runs through all of the rows of the database query and then, at the end, throws
DBD::mysql::st fetchrow_hashref failed: fetch() without execute() at ./recieveTxn.cgi line 168.
It almost seems like there's something that's not telling the loop to stop at the end of the rows, but I have written it just like the others.
So, for example, the query would pull up
shortcode1
shortcode2
shortcode3
and then throw the error. Here is the code:
$sql = "
SELECT
aPI.folder AS aPIfolder,
aPI.rowNum AS aPIrowNum,
hasInput,
evalCode
FROM
aPersonalItems as aPI
LEFT JOIN
pItems_special_inputs as SI
ON
aPI.folder = SI.folder AND
aPI.rowNum = SI.rowNum
WHERE
hasInput=1 AND
aPI.folder='$FORM{'folder'}'
ORDER BY
aPI.rowNum
";
$sth = $dbh->prepare( $sql );
$sth->execute();
my ($shortcoderow, $row);
my $shortcodeSQL = "
SELECT
*
FROM
pItemsShortCodes
WHERE
folder='$FORM{'folder'}'
ORDER BY
rowNum
";
my $shortcodeSTH = $dbh->prepare( $shortcodeSQL );
$shortcodeSTH->execute();
while( my $ref = $sth->fetchrow_hashref() ) {
my $shortCode;
my $isblank = 1;
my $rowNum = $ref->{'aPIrowNum'};
while(my $shortcodeRef = $shortcodeSTH->fetchrow_hashref())
#&& $rowNum == $shortcodeRef->{'rowNum'}) #line 168 above
{
$shortCode=$shortcodeRef->{'shortcode'};
print $shortCode."\n";
}
$shortcodeSTH->finish();
}

The problem is that you are processing more than one row from $sth.
Your code fetches a row from $sth, then loops through every row from $shortcodeSTH until there are no more rows, and then calls the finish() method on $shortcodeSTH. (Which is the normative pattern, since you've already fetched all the rows.)
Then your code starts through the outer loop a second time, fetching a second row from $sth. When it attempts to start through the $shortcodeSTH loop a second time, you've already fetched all of the rows and closed the statement handle, so there aren't any more rows to retrieve. (The error would be different if you hadn't called finish(); the message would be something about fetching past the end of the cursor, or having already fetched the last row, or something to that effect.)
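One way to avoid re-reading the exhausted handle is to fetch all the shortcodes once, group them by rowNum, and look them up inside the outer loop (or simply prepare and execute the shortcode query inside each outer iteration). A rough sketch of the first approach, assuming the same $dbh, %FORM, table, and column names as the code above:
# Fetch every shortcode once, grouped by rowNum, so the inner handle is
# never read again after it has been exhausted.
my %shortcodes;
my $scSTH = $dbh->prepare(
    "SELECT rowNum, shortcode FROM pItemsShortCodes WHERE folder = ? ORDER BY rowNum"
);
$scSTH->execute( $FORM{'folder'} );
while ( my $sc = $scSTH->fetchrow_hashref() ) {
    push @{ $shortcodes{ $sc->{'rowNum'} } }, $sc->{'shortcode'};
}
$scSTH->finish();

while ( my $ref = $sth->fetchrow_hashref() ) {
    my $rowNum = $ref->{'aPIrowNum'};
    for my $shortCode ( @{ $shortcodes{$rowNum} || [] } ) {
        print "$shortCode\n";
    }
}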

How can I grab multiple records from a MySQL query in Perl using array pointers?

I can do this all as one function, but in trying to port it over to my package of functions (a library) I am missing something.
Here's what I want to do from my main Perl script:
my @rows;
$result = Funx::dbcdata($myConnection,
    "SELECT * FROM Inv where name like \"%DOG%\";", \@rows);
Then in my library package I am attempting this
sub dbcdata
{
    my ($connection, $command, $array) = @_;
    my $query = $connection->prepare($command);
    my $result = $query->execute();
    my $i = 0;
    while (my $row = $query->fetchrow_arrayref() )
    {
        @{$array}[$i] = $row;
        $i++;
    }
    $query->finish;
    return $result;
}
I was hoping to get back pointers or references to each row (there were 4 in this case), but I am not. Every element in @rows is the same:
ARRAY(0x5577a0f77ec0) ARRAY(0x5577a0f77ec0) ARRAY(0x5577a0f77ec0)
ARRAY(0x5577a0f77ec0)
Nor do I know how to turn each one into the original separate row. Any help would be appreciated, thanks.
From the documentation for fetchrow_arrayref:
Note that the same array reference is returned for each fetch, so don't store the reference and then use it after a later fetch. Also, the elements of the array are also reused for each row, so take care if you want to take a reference to an element.
Sounds like you want fetchall_arrayref:
The fetchall_arrayref method can be used to fetch all the data to be returned from a prepared and executed statement handle. It returns a reference to an array that contains one reference per row.
After executing the statement, you can do something like
@{$array} = $query->fetchall_arrayref->@*;
instead of that ugly loop.
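If you'd rather keep the explicit loop, copying each row into a fresh array reference also avoids the reuse problem; a minimal sketch, replacing the assignment inside the original while loop:
# Store a copy of the row's values rather than the reused reference,
# so every element of @$array ends up pointing at distinct data.
push @{$array}, [ @$row ];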
But selectall_array might be even better. Your whole function can be replaced by a call to it:
my @rows =
    $myConnection->selectall_array(q/SELECT * FROM Inv WHERE name LIKE '%DOG%'/);
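Each element of @rows is then an independent array reference, one per database row, so the caller can use the data directly; for example:
# Print every row returned by selectall_array, tab-separated.
for my $row (@rows) {
    print join( "\t", @$row ), "\n";
}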

Issue with cakePHP running unit tests

I have the weirdest issue while trying to test something in my CakePHP 2.0 app. I have a function inside a model that queries the database to check whether the app has already sent a notification in the last 25 days:
public function checkIfNotified($userId){
$query = 'SELECT count(`user_id`) AS notify '.
'FROM `churn_stats` '.
'WHERE `user_id` = '. $userId.' '.
'AND `notified` = 1 '.
'AND TIME_TO_SEC(TIMEDIFF(NOW(),`created`)) <= 2160000 ';
$this->log($query);
$result = $this->query($query);
return $result;
}
I'm doing some unit tests to check if the method works, so I'm creating a record and trying to test that it returns true, like so:
$data['notified'] = 1;
$data['user_id'] = $userId;
$this->ChurnStats->create();
$this->ChurnStats->save($data);
$notified = $this->ChurnStats->checkIfNotified($userId);
print_r($notified);
Afterwards, the result is the following (which is wrong, since I've already inserted a row!):
Array
(
[0] => Array
(
[0] => Array
(
[notify] => 0
)
)
)
However, when I run the exact query generated by the code directly in the DB, the result is correct.
I've already lost a lot of time and I have no idea what's wrong :(.
After testing and checking everything, it turned out that another test function was somehow changing the DB or the query of the next test. The weird part was that the other test ran the same query but had no insert, update, or delete that could modify the results or environment of the next test.
In the end it all came down to one thing: the query cache. By default CakePHP caches all $this->query("..."); calls, so the fix was quite easy: deactivate it!
$result = $this->query($query, false);

Loop through query results without loading them all in array in Codeigniter [duplicate]

The normal result() method described in the documentation appears to load all records immediately. My application needs to load about 30,000 rows and, one at a time, submit them to a third-party search index API. Obviously, loading everything into memory at once doesn't work well (it errors out from using too much memory).
So my question is, how can I achieve the effect of the conventional MySQLi API method, in which you load one row at a time in a loop?
Here is something you can do.
while ($row = $result->_fetch_object()) {
    $data = array(
        'id' => $row->id,
        'some_value' => $row->some_field_name
    );
    // send row data to whatever api
    $this->send_data_to_api($data);
}
This will get one row at a time. Check the CodeIgniter source code, and you will see that this is what it does when you execute the result() method.
For those who want to save memory on a large result set: since CodeIgniter 3.0.0 there is an unbuffered_row() function.
All the methods above will load the whole result into memory (prefetching). Use unbuffered_row() for processing large result sets.
This method returns a single result row without prefetching the whole result in memory as row() does. If your query has more than one row, it returns the current row and moves the internal data pointer ahead.
$query = $this->db->query("YOUR QUERY");
while ($row = $query->unbuffered_row())
{
echo $row->title;
echo $row->name;
echo $row->body;
}
You can optionally pass ‘object’ (default) or ‘array’ in order to specify the returned value’s type:
$query->unbuffered_row(); // object
$query->unbuffered_row('object'); // object
$query->unbuffered_row('array'); // associative array
Official Document: https://www.codeigniter.com/userguide3/database/results.html#id2
Well, the thing is that result() returns the entire result of the query, while row() simply fetches the first row and discards the rest. However, the query still fetches all 30,000 rows regardless of which function you use.
One design that would fit your case would be:
$offset = (int)@$_GET['offset'];
$query = $this->db->query("SELECT * FROM table LIMIT ?, 1", array($offset));
$row = $query->row();
if ($row) {
    /* Run api with values */
    redirect(current_url().'?offset='.($offset + 1));
}
This would take one row, send it to the API, reload the page, and move on to the next row. It also prevents the page from timing out. However, it would most likely take a while with 30,000 records and refreshes, so you may want to raise the LIMIT ?, 1 to something higher than 1 and use result() with a foreach() to make multiple API calls per page load.
Well, there's the row() method, which returns just one row as an object, or the row_array() method, which does the same but returns an array (of course).
So you could do something like
$sql = "SELECT * FROM yourtable";
$resultSet = $this->db->query($sql);
$total = $resultSet->num_rows();
for($i=0;$i<$total;$i++) {
$row = $resultSet->row_array($i);
}
This fetches each row of the whole result set in a loop, which is about the same as fetching everything and looping over the result of $this->db->query($sql)->result(), I believe.
If you want one row at a time, either you make 30,000 calls, or you select all the results and fetch them one at a time, or you fetch everything and walk over the array. I can't see any other way out.

mysql, perl script freezing when retrieving data

Hi, I have this code. My Perl script hangs somewhere in the @row section. I printed the query statement below and it works when run directly in SQL.
Why is it hanging?
foreach $device (...) {
$sth2 = $dbh->prepare(qq|SELECT DISTINCT S,`W(m)`,`L(m)`,V FROM `$SQL_TABLE_NAME` WHERE DEVICE='$device'| );
$sth2->execute( );
my %TEMP = ();
while ( my #row = $sth2->fetchrow_array( ))
{
$TEMP{S}{$row[0]} = 1;
$TEMP{W}{$row[1]} = 1;
$TEMP{L}{$row[2]} = 1;
$TEMP{V}{$row[3]} = 1;
}
You seem to have a syntax error in your query (VFROM rather than V FROM).
If the query fails, there's no result set to fetch a row from. You might want to build some error handling into your code.
my $select_line = qq|SELECT DISTINCT S,`W(m)`,`L(m)`,V
FROM `$SQL_TABLE_NAME`
WHERE DEVICE='$device'|;
Have you tried printing $select_line and then running it directly in mysql?
my $sth2 = $dbh->prepare( $select_line )
    or die $DBI::errstr . " at my query: $select_line\n";
Add in or die $DBI::errstr . " at my query: $select_line\n" to verify whether your syntax is correct.
Add in die $sth2->errstr if $sth2->err; after your while:
while (fetch) {
    # stuff in here with rows
}
die $sth2->errstr if $sth2->err;
Review the CPAN DBI docs at http://metacpan.org/pod/DBI
By the way, it's better to use placeholders:
$sth2 = $dbh->prepare(qq|SELECT DISTINCT S,`W(m)`,`L(m)`,V FROM `$SQL_TABLE_NAME` WHERE DEVICE= ?| );
$sth2->execute($device);
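As a broader safety net, you can also turn on RaiseError when connecting, so every DBI call dies on failure without needing explicit checks. A minimal sketch (the DSN variables and credentials here are placeholders, not taken from the original script):
use DBI;

# Connect with RaiseError so prepare/execute/fetch failures die immediately
# instead of silently returning undef.
my $dbh = DBI->connect(
    "DBI:mysql:database=$db_name;host=$db_host",   # placeholder DSN
    $db_user, $db_pass,
    { RaiseError => 1, PrintError => 0 },
);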

How to delete multiple rows (using named parameters) in Adobe AIR

I am trying to delete multiple rows in a SQLite table in my Adobe AIR app (runtime 2.5).
Here is the statement, using the "IN" operator:
"DELETE FROM mylist WHERE tdId IN (tdId1, tdId2, tdId3, ...)";
Where tdId1, tdId2, etc. will be determined at runtime based on which row(s) the user chooses to delete. The user can delete an arbitrary number of rows.
I've tried something like:
//delete statement text
"DELETE FROM mylist WHERE tdId IN :tdId";
//delete statement parameters: take 1.
//Got "argument error: near ':tdId': syntax error"
deleteStmt.parameters[":tdId"] = "(26, 32)";
//delete statement parameters: take 2.
//Also got "argument error: near ':tdId': syntax error"
var arr:Array = [26, 32];
deleteStmt.parameters[":tdId"] = arr;
How do I go about deleting multiple rows?
[Edit] So it looks like the aforementioned cached statement with parameter [":tdId"] doesn't work when deleting multiple rows. When attempting to execute the delete statement multiple times in asynchronous mode, after the very first row in the queue is deleted, Flash throws the following error:
"Error #3110: Operation cannot be
performed while SQLStatement.executing
is true."
It would seem too much trouble to chain these deletes with callbacks. So I guess I am using my last resort: building the SQL at runtime. Conclusion: cached statements can't be used in this kind of situation...
The problem occurs when you insert the parameter "(26, 32)". Since the parameter is not purely a textual substitution, it represents a single bound value to SQL, NOT a piece of the statement string. Hence, in your first take, your statement effectively became (roughly)...
"DELETE FROM mylist WHERE tdId IN '(26,32)'"
Hence your error, due to the syntax... In your second take it gets worse...
"DELETE FROM mylist WHERE tdId IN *Array(26,32)*"
As the variable does not convert to a string value, this does not actually happen. What happens instead is that when the SQL interpreter tries to understand what follows the 'IN' keyword, it gets an ARRAY object, which it has no idea what to do with... It's not even a valid SQL type.
Solution? [I have yet to fully test it, so please do]
var toDel:Array = [26,32]
//delete statement text
var baseStr:String = "DELETE FROM mylist WHERE tdId IN (";
var midStr:String = '';
//delete statement parameters: Processing parameter
for( var i = 0; i < toDel.length; i++ ) {
deleteStmt.parameters[i] = toDel[i];
if(midStr.length > 0) { midStr += ' , '; }
midStr += '?';
}
deleteStmt.text = baseStr + midStr + ' )';
//Then execute
So what happens in this case is that you effectively execute...
"DELETE FROM mylist WHERE tdId IN ( ? , ? )"
with 26 and 32 bound to the two placeholders.
In this way you still maintain the safe (good practice) use of parameters, without converting everything to a string.
EDIT: if you don't understand the use of parameters / '?', refer to:
http://help.adobe.com/en_US/FlashPlatform/reference/actionscript/3/flash/data/SQLStatement.html#parameters
If the IN clause does not allow parameters, you can try old-school SQL style: append multiple
" OR (tdId = :param" + paramCounter.toString() + ")"
to the SQL string