WordPress Plugin Custom Database Query (SQL UPDATE SET WHERE) - mysql

Quick background:
CRUD plugin
WP 3.5
Creating this plugin for my personal knowledge - I'm sure one already exists.
Variables echo out ok.
The create portion of the plugin needed 3 hours of research to get the query to work. I tried everything in the Codex and a handful of answers from Stack Overflow until something worked.
Problem:
Updating a row in a custom table.
This query works when run directly in SQL:
UPDATE wp_current_issue SET issue_year=9999, issue_month=29 WHERE issue_id=17;
Tried:
global $wpdb;
$wpdb->update( CURRENT_ISSUE_TABLE,
    array(
        'issue_year'  => $issue_year,
        'issue_month' => $issue_month
    ),
    array( 'issue_id' => $issue_edit_id ),
    array( '%d', '%d' ),
    array( '%d' )
);
As well as: (with and without backticks around column names)
global $wpdb;
$sql = $wpdb->prepare("UPDATE wp_current_issue SET issue_year=".$issue_year.", issue_month=".$issue_month." WHERE issue_id=".$issue_edit_id);
$wpdb->query($sql);
Now the funny thing is, I went through the same thing with the INSERT INTO statement. Tried everything. Only thing that would work is this:
global $wpdb;
$sql = $wpdb->prepare(
"INSERT INTO `".CURRENT_ISSUE_TABLE."` (`issue_path`,`issue_img_path`,`issue_year`,`issue_month`) VALUES (%s,%s,%d,%d)", $fileName_issue, $fileName_img, $issue_year, $issue_month);
$wpdb->query($sql);
Anyone know how I could rewrite this query for an update statement?

Assuming I understood your problem correctly, if you want to rewrite that insert as an update, just use:
$sql = $wpdb->prepare(
    "UPDATE `" . CURRENT_ISSUE_TABLE . "` SET `issue_path` = %s, `issue_img_path` = %s, `issue_year` = %d, `issue_month` = %d
    WHERE `issue_id` = %d", $fileName_issue, $fileName_img, $issue_year, $issue_month, $issue_edit_id);
The rest is the same.
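For completeness, here is a minimal sketch of the whole update path, assuming CURRENT_ISSUE_TABLE is defined and $issue_year, $issue_month and $issue_edit_id are integers as in the question:
global $wpdb;

// %d placeholders for the two integer columns and for the integer id.
$sql = $wpdb->prepare(
    "UPDATE `" . CURRENT_ISSUE_TABLE . "` SET `issue_year` = %d, `issue_month` = %d WHERE `issue_id` = %d",
    $issue_year,
    $issue_month,
    $issue_edit_id
);
$wpdb->query( $sql );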

Related

Data does not consistently load from a custom WordPress table

global $wpdb;
$tablename_temp = $wpdb->prefix . 'faculty_temp';
$toke = $_GET['token'];
$user = $wpdb->get_results("SELECT * FROM $tablename_temp where token='$toke'");
This fetches data from a custom WordPress table, and it loads fine most of the time, but sometimes the data isn't fetched. If I refresh 10 times, the data fails to appear roughly 1 time out of 10. It seems to be a server issue: updating files in Advanced File Manager and fetching data in the WP Data Access plugin show the same problem.
First of all, your code is unsafe. You need to sanitize the input and let $wpdb escape the value that goes into the query:
global $wpdb;
$tablename_temp = $wpdb->prefix . 'faculty_temp';
$toke = isset( $_GET['token'] ) ? sanitize_text_field( wp_unslash( $_GET['token'] ) ) : '';
$user = $wpdb->get_results( $wpdb->prepare( "SELECT * FROM {$tablename_temp} WHERE token = %s", $toke ) );
That said, the code itself is valid. Perhaps the problem is that you are simultaneously writing to and deleting from the temporary table "faculty_temp", and the fetch request is sent while data is being deleted or modified.
You need to change how you work with the temp data.

Laravel parameter binding causing occasional MySQL general error

I have an array of integers that need to be inserted as a batch of rows. Each row needs some other data.
$ids = [1, 2];
$thing = 1;
$now = Carbon::now(); // This is just a timestamp.
$binding_values = trim(str_repeat("({$thing}, ?, '{$now}'),", count($ids)), ',');
The string $binding_values looks like this:
"(1, ?, '2019-01-01 00:00:00'), (1, ?, '2019-01-01 00:00:00')"
Then I prepare my query string and bind the parameters to it. The IGNORE is used because I have a composite unique index on the table. It doesn't seem relevant to the problem though so I've left the details out.
DB::insert("
INSERT IGNORE INTO table (thing, id, created_at)
VALUES {$binding_values}
", $ids);
This works almost all the time but every now and then I get an error SQLSTATE[HY000]: General error: 2031.
Is the way I'm doing this parameter binding some kind of anti-pattern with Laravel? What might the source of this error be?
Because there is no risk of injection in this method and there is no chance that this method would be extended to a use case with a risk of injection, I've modified it to bake in all the parameters and skip parameter binding. I haven't seen any errors so far.
I would still like to know what might cause this behaviour so I can manage it better in the future. I'd be grateful for any insight.
I don't see a big issue with your query other than baking parameters into the query, which is vulnerable to SQL injection.
From what I can see, your problem is that you need INSERT IGNORE INTO, which is not supported out of the box by Laravel. Have you thought about using a third-party package like staudenmeir/laravel-upsert?
An alternative could be to wrap your query in a transaction and select existing entries first, giving you the chance to not insert them a second time:
$ids = [1, 2];
$thing = 1;
$time = now();

DB::transaction(function () use ($ids, $thing, $time) {
    $existing = DB::table('some_table')->whereIn('id', $ids)->pluck('id')->toArray();
    $toInsert = array_diff($ids, $existing);

    $dataToInsert = array_map(function ($id) use ($thing, $time) {
        return [
            'id' => $id,
            'thing' => $thing,
            'created_at' => $time
        ];
    }, $toInsert);

    DB::table('some_table')->insert($dataToInsert);
});
This way you will only insert what is not present yet, while also staying within the capabilities of the framework's query builder. The only downside is that it will be slightly slower due to a second round trip to the database.
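As a side note (not part of the original answer): newer Laravel versions (roughly 5.8+; check your query builder) ship an insertOrIgnore() method that compiles to INSERT IGNORE on MySQL, so the raw SQL and the manual placeholder string become unnecessary. A minimal sketch, assuming the same $ids, $thing and $time as above and the hypothetical some_table:
$rows = array_map(function ($id) use ($thing, $time) {
    return [
        'id'         => $id,
        'thing'      => $thing,
        'created_at' => $time,
    ];
}, $ids);

// Duplicate keys hit the composite unique index and are silently skipped,
// just like the hand-written INSERT IGNORE.
DB::table('some_table')->insertOrIgnore($rows);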

Execute MySQL UPDATE command

I have the following code, but it doesn't work. I think it might be because the method was removed? Not sure what the new way to do it is, though. I'm on WordPress.
<php?
mysql_query ("UPDATE $wpdb->users
SET access_key = $newAccessKey
WHERE ID = $currentUserID");
?>
That doesn't work.
users is the table name.
Any advice?
This script is to be run on a PHP page.
First, you are starting the PHP script with the wrong syntax.
Next, check the type of access_key: if it is an integer, then what you wrote is right; if it is varchar or text, then you should wrap the value in quotes. In that case your query would be:
mysql_query ("UPDATE $wpdb->users
    SET access_key = '$newAccessKey'
    WHERE ID = $currentUserID");
I hope this helps.
Use single quotes for string variables, and to be safe with $wpdb->users, use concatenation:
mysql_query ("UPDATE " . $wpdb->users .
    " SET access_key = '$newAccessKey'
    WHERE ID = $currentUserID");
The reason it doesn't work for you can be a number of things. You can find out by enabling error reporting (ini_set('display_errors', true); error_reporting(E_ALL);) and checking for MySQL errors (echo mysql_error();).
Second, if it wasn't a typo when you were writing the question: <php? doesn't start PHP code. It should be <?php. The way you wrote it, it is ignored by both the server and the browser because it is an illegal tag, and the code in between isn't executed at all.
Apart from that, $wpdb brings its own update() function that you can use:
$wpdb->update(
$wpdb->users,
array( 'access_key' => $newAccessKey ),
array( 'ID' => $currentUserID ),
array(
'%s' // if $newAccessKey is a string
// '%d' // if $newAccessKey is an integer
),
array( '%d' )
);
That way you don't have to worry about the database connection or deprecated functions.
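For reference, here is a minimal raw-SQL sketch of the same update through $wpdb->prepare(), in case you need it elsewhere; it assumes access_key is a string column you added to the users table, as in the question:
global $wpdb;

// Prepared UPDATE: %s for the string key, %d for the integer user ID.
$wpdb->query(
    $wpdb->prepare(
        "UPDATE {$wpdb->users} SET access_key = %s WHERE ID = %d",
        $newAccessKey,
        $currentUserID
    )
);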

Import of 50K+ Records in MySQL Gives General error: 1390 Prepared statement contains too many placeholders

Has anyone ever come across this error: General error: 1390 Prepared statement contains too many placeholders
I just did an import via SequelPro of over 50,000 records and now when I go to view these records in my view (Laravel 4) I get General error: 1390 Prepared statement contains too many placeholders.
The below index() method in my AdminNotesController.php file is what is generating the query and rendering the view.
public function index()
{
    $created_at_value = Input::get('created_at_value');
    $note_types_value = Input::get('note_types_value');
    $contact_names_value = Input::get('contact_names_value');
    $user_names_value = Input::get('user_names_value');
    $account_managers_value = Input::get('account_managers_value');

    if (is_null($created_at_value)) $created_at_value = DB::table('notes')->lists('created_at');
    if (is_null($note_types_value)) $note_types_value = DB::table('note_types')->lists('type');
    if (is_null($contact_names_value)) $contact_names_value = DB::table('contacts')->select(DB::raw('CONCAT(first_name," ",last_name) as cname'))->lists('cname');
    if (is_null($user_names_value)) $user_names_value = DB::table('users')->select(DB::raw('CONCAT(first_name," ",last_name) as uname'))->lists('uname');

    // In the view, there is a dropdown box, that allows the user to select the amount of records to show per page. Retrieve that value or set a default.
    $perPage = Input::get('perPage', 10);

    // This code retrieves the order from the session that has been selected by the user by clicking on a table column title. The value is placed in the session via the getOrder() method and is used later in the Eloquent query and joins.
    $order = Session::get('account.order', 'company_name.asc');
    $order = explode('.', $order);

    $notes_query = Note::leftJoin('note_types', 'note_types.id', '=', 'notes.note_type_id')
        ->leftJoin('users', 'users.id', '=', 'notes.user_id')
        ->leftJoin('contacts', 'contacts.id', '=', 'notes.contact_id')
        ->orderBy($order[0], $order[1])
        ->select(array('notes.*', DB::raw('notes.id as nid')));

    if (!empty($created_at_value)) $notes_query = $notes_query->whereIn('notes.created_at', $created_at_value);

    $notes = $notes_query->whereIn('note_types.type', $note_types_value)
        ->whereIn(DB::raw('CONCAT(contacts.first_name," ",contacts.last_name)'), $contact_names_value)
        ->whereIn(DB::raw('CONCAT(users.first_name," ",users.last_name)'), $user_names_value)
        ->paginate($perPage)->appends(array('created_at_value' => Input::get('created_at_value'), 'note_types_value' => Input::get('note_types_value'), 'contact_names_value' => Input::get('contact_names_value'), 'user_names_value' => Input::get('user_names_value')));

    $notes_trash = Note::onlyTrashed()
        ->leftJoin('note_types', 'note_types.id', '=', 'notes.note_type_id')
        ->leftJoin('users', 'users.id', '=', 'notes.user_id')
        ->leftJoin('contacts', 'contacts.id', '=', 'notes.contact_id')
        ->orderBy($order[0], $order[1])
        ->select(array('notes.*', DB::raw('notes.id as nid')))
        ->get();

    $this->layout->content = View::make('admin.notes.index', array(
        'notes' => $notes,
        'created_at' => DB::table('notes')->lists('created_at', 'created_at'),
        'note_types' => DB::table('note_types')->lists('type', 'type'),
        'contacts' => DB::table('contacts')->select(DB::raw('CONCAT(first_name," ",last_name) as cname'))->lists('cname', 'cname'),
        'accounts' => Account::lists('company_name', 'company_name'),
        'users' => DB::table('users')->select(DB::raw('CONCAT(first_name," ",last_name) as uname'))->lists('uname', 'uname'),
        'notes_trash' => $notes_trash,
        'perPage' => $perPage
    ));
}
Any advice would be appreciated. Thanks.
Solved this issue by using the array_chunk function.
Here is the solution:
foreach (array_chunk($data, 1000) as $t) {
    DB::table('table_name')->insert($t);
}
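A slightly more defensive variant of the same idea, tying the chunk size to the 65,535-placeholder limit mentioned in the other answers instead of hard-coding 1000. This is only a sketch and assumes $data is a non-empty array of associative rows that all share the same keys:
$columnsPerRow = count(reset($data));                         // placeholders needed per row
$chunkSize     = max(1, (int) floor(60000 / $columnsPerRow)); // stay safely under 65,535

foreach (array_chunk($data, $chunkSize) as $chunk) {
    DB::table('table_name')->insert($chunk);
}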
There is a limit of 65,535 (2^16-1) placeholders in MariaDB 5.5, which is supposed to behave identically to MySQL 5.5.
Not sure if it's relevant; I tested it on PHP 5.5.12 using MySQLi / MySQLND.
This error only happens when both of the following conditions are met:
You are using the MySQL Native Driver (mysqlnd) and not the MySQL client library (libmysqlclient)
You are not emulating prepares.
If you change either one of these factors, this error will not occur. However, keep in mind that both of these are recommended, for performance or security reasons respectively, so I would not recommend this solution for anything other than a one-time or temporary problem. To prevent this error from occurring, the fix is as simple as:
$dbh->setAttribute(PDO::ATTR_EMULATE_PREPARES, true);
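If you want to flip that flag in Laravel itself (where the question's query runs), the connection's options array in config/database.php is passed through to the PDO constructor, so a sketch like the following should work; treat the exact placement as an assumption about a stock mysql connection:
// config/database.php -- excerpt of the mysql connection (sketch)
'mysql' => [
    'driver'  => 'mysql',
    // ... host, database, username, password, etc.
    'options' => [
        PDO::ATTR_EMULATE_PREPARES => true, // lifts the placeholder limit, with the caveats above
    ],
],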
While I think @The Disintegrator is correct about the placeholders being limited, I would not run one query per record.
I have a query that worked fine until I added one more column; now I have 72k placeholders and I get this error. However, that 72k is made up of 9,000 rows with 8 columns. Running this query one record at a time would take days. (I'm trying to import AdWords data into a DB, and it would literally take more than 24 hours to import a day's worth of data if I did it one record at a time. I tried that first.)
What I would recommend is something of a hack. First, determine the maximum number of placeholders you want to allow - e.g. 60k to be safe. Use this number, together with the number of columns, to work out how many complete records you can import/return at once. Create the full array of data for your query, then use array_chunk and a foreach loop to grab everything you want in the minimum number of queries. Like this:
$maxRecords = 1000;
$sql = 'INSERT INTO ... VALUES ';
$qMarks = array_fill(0, $maxRecords, '(?, ...)');
$tmp = $sql . implode(', ', $qMarks);
foreach (array_chunk($data, $maxRecords) as $dataArray) {
    if (count($dataArray) < $maxRecords) { break; }
    // Do your PDO stuff here using $tmp as your SQL statement with all those placeholders - the ?s
}
// Now insert all the leftovers with basically the same code as above, except accounting for
// the fact that you now have fewer than $maxRecords rows.
Using a Laravel model, this copies all 11,000 records from an SQLite database to a MySQL database in a few seconds. Chunk the data array into 500 records:
public function handle(): void
{
    $smodel = new Src_model();
    $smodel->setTable($this->argument('fromtable'));
    $smodel->setConnection('default'); // sqlite database
    $src = $smodel::all()->toArray();

    $dmodel = new Dst_model();
    $dmodel->setTable($this->argument('totable'));
    $dmodel->timestamps = false;
    $stack = $dmodel->getFields();
    $fields = array_shift($stack);

    $condb = DB::connection('mysql');
    $condb->beginTransaction();
    $dmodel::query()->truncate();
    $dmodel->fillable($stack);

    $srcarr = array_chunk($src, 500);
    $isOK = true;
    foreach ($srcarr as $item) {
        if (!$dmodel->query()->insert($item)) $isOK = false;
    }

    if ($isOK) {
        $this->notify("We moved the table from: {$this->argument('fromtable')} to: {$this->argument('totable')}", 'Fresher than ever!');
        $condb->commit();
    } else {
        $condb->rollBack();
    }
}
You can do it with the array_chunk function, like this:
foreach (array_chunk($data, 1000) as $key => $smallerArray) {
    foreach ($smallerArray as $index => $value) {
        $temp[$index] = $value;
    }
    DB::table('table_name')->insert($temp);
}
My fix for the above issue:
On my side, when I got this error, I fixed it by reducing the bulk insertion chunk size from 1000 to 800, and that worked for me.
There were simply too many fields in my table, and most of them contain long detail descriptions, the size of a full page of text. When I went for bulk insertion, the service crashed and threw the above error.
I think the number of placeholders is limited to 65,535 per query (at least in older MySQL versions).
I really can't discern what this piece of code is generating, but if it's a gigantic query, there's your problem.
You should generate one query per record to import and put those into a transaction.

Joomla Database - How to use LIMIT in getQuery?

I want to build the below query using joomla inbuilt database class.
SELECT *
FROM table_name
ORDER BY id DESC
LIMIT 1
This is the query I have built up to now.
$db =& JFactory::getDBO();
$query = $db->getQuery(true);
$query->select($db->nameQuote('*'));
$query->from($db->nameQuote(TABLE_PREFIX.'table_name'));
$db->setQuery($query);
$rows = $db->loadObjectList();
I don't know how to add the limit(LIMIT 1) to the query. Can someone please tell me how to do it? Thanks
Older than Joomla 3.0
$db = JFactory::getDBO();
$query = $db->getQuery(true);
$query->select('*')
->from($db->nameQuote('#__table_name'))
->order($db->nameQuote('id').' desc');
$db->setQuery($query,0,1);
$rows = $db->loadObjectList();
The $db->setQuery() function takes three parameters: the query, then the offset (start), then the limit. We can limit records as shown above.
Newer than Joomla 3.0
setLimit(integer $limit, integer $offset)
If you want just one row
$query->setLimit(1);
This should work as well:
$query->setLimit(1);
Documentation: http://api.joomla.org/cms-3/classes/JDatabaseQueryLimitable.html
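Putting the pieces from the answers above together for Joomla 3.x, here is a minimal sketch (#__table_name and the id column are placeholders for your own schema):
$db    = JFactory::getDbo();
$query = $db->getQuery(true)
    ->select('*')
    ->from($db->quoteName('#__table_name'))
    ->order($db->quoteName('id') . ' DESC');
$query->setLimit(1);        // adds LIMIT 1

$db->setQuery($query);
$row = $db->loadObject();   // single row; use loadObjectList() if you expect several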
SetLimit doesn't work for me in Joomla 3.4.x, so try:
Within the model:
protected function getListQuery()
{
    // Create a new query object.
    $db = JFactory::getDBO();
    $query = $db->getQuery(true);

    // Select some fields
    $query->select('*');
    $query->from('#__your_table');

    $this->setState('list.limit', 0); // 0 = unlimited

    return $query;
}
David's answer: https://joomla.stackexchange.com/questions/4249/model-getlistquery-fetch-all-rows-with-using-jpagination
Run that before the model calls getItems and it will load all the items for you.
A few caveats with this.
You can also do this outside the model, so if, for instance, you were in your view, you could do the following:
$model = $this->getModel();
$model->setState('list.limit', 0);
Sometimes you can do this too early, before the model's state has been populated, which will cause the model to get rebuilt from the user state after you have set the limit, basically overriding the limit.
To fix this, you can force the model to populate its state first:
$model = $this->getModel();
$model->getState();
$model->setState('list.limit', 0);
The actual populateState method is protected, so outside the model you can't call it directly, but any call to getState will make sure that populateState is called before returning the current settings in the state.
Update: I just had to revisit this answer, and I can confirm that both setLimit and order work if used as below.
$query->order($db->qn($data->sort_column_name) . ' ' . $data->sort_column_order);
$query->setLimit($length, $start);
OLD ANSWER
As of 08/Sept/14, the solutions from @Dasun and @escopecz aren't working for me on J3.x, but this old trick works for me, which is nice:
$query->order($db->qn('id') . ' DESC LIMIT 25');
And about your specific requirement of fetching only one row, you could use:
$rows = $db->loadObject();