When I need to get counts of paid and pending orders, I searched and found a way to clone the query:
$paid = $products->clone()->where('paid', 1)->count();
$pending = $products->clone()->where('paid', 0)->count();
I wonder if this approach saves query time or if we still send two requests to the database server.
Thanks
Laravel added the Illuminate\Support\Benchmark utility class in version 9.32. If you have it available, you can test this for yourself:
use Illuminate\Support\Benchmark;

public function yourControllerMethod()
{
    $products = Product::query();
    $iterations = 100;

    Benchmark::dd(
        [
            'cloning' => function () use ($products) {
                $products->clone()->where('paid', 1)->count();
                $products->clone()->where('paid', 0)->count();
            },
            'not cloning' => function () {
                Product::query()->where('paid', 1)->count();
                Product::query()->where('paid', 0)->count();
            },
        ],
        $iterations
    );
}
Both approaches will execute the same number of SQL queries, though. I think the time saved (if any) will be minimal.
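If you want to avoid the second round trip altogether, one option is a conditional aggregate. This is just a sketch, assuming a 0/1 paid column on the products table as in your example:

// Count paid and pending in a single query instead of two.
$counts = Product::query()
    ->selectRaw('SUM(paid = 1) AS paid_count, SUM(paid = 0) AS pending_count')
    ->first();

$paid = (int) $counts->paid_count;
$pending = (int) $counts->pending_count;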
What I am trying to do is cache all the results of a MySQL table that seldom changes, so as to minimize calls to the database and increase query speed. There are about 100k records in it.
Is there a library that can sync changes made to this table, so that when a record is updated or inserted, the Redis cache is also invalidated and updated?
I have seen one for Elasticsearch, but nothing for Redis.
From this page:
Yii copying data from one model to another
There is this comment:
You can get all models attributes by:
$data = $model->attributes;
and assign them to another model
$anotherModel = new AnotherActiveRecord();
$anotherModel->setAttributes($data);
now another model will extract whatever it can from $data
I'm curious: can a Redis cache also "mirror" the data from a database table in a similar way?
Or is this just a bad idea overall? Am I better off caching each query as it comes along, or is there a better way?
You can enable caching as described at https://www.yiiframework.com/doc/guide/2.0/en/caching-data:
[
    'components' => [
        'cache' => [
            'class' => 'yii\redis\Cache',
            'redis' => [
                'hostname' => 'localhost',
                'port' => 6379,
                'database' => 0,
            ],
        ],
    ],
]
and then use query caching, which is natively defined at the query builder level:
$result = $db->cache(function ($db) {
    // the result of the SQL query will be served from the cache
    // if query caching is enabled and the query result is found in the cache
    // ... perform SQL queries here ...
});
You can also use cache dependencies based on your table (using some criterion such as whether max(updated_at) has changed):
// Create a dependency on the updated_at field
$dependency = new yii\caching\DbDependency(['sql' => 'select max(updated_at) from my_table']);
$duration = 60; // cache query results for 60 seconds

$result = $db->cache(function ($db) {
    // ... perform SQL queries here ...
    return $result;
}, $duration, $dependency);
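For the invalidation part of your question, one approach is tag-based invalidation triggered from the model's save hook. This is only a sketch, assuming the Redis cache component above and a hypothetical Product ActiveRecord class:

use yii\caching\TagDependency;

class Product extends \yii\db\ActiveRecord
{
    public function afterSave($insert, $changedAttributes)
    {
        parent::afterSave($insert, $changedAttributes);
        // Drop every cached entry tagged 'products' whenever a row is inserted or updated.
        TagDependency::invalidate(\Yii::$app->cache, 'products');
    }
}

// Reading side: cache the table and tag it so afterSave() above can invalidate it.
$products = \Yii::$app->cache->getOrSet(
    'products.all',
    function () { return Product::find()->asArray()->all(); },
    3600,
    new TagDependency(['tags' => 'products'])
);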
I'm wondering about performance and thinking about how to cut down on my SQL queries, so I have the following question:
Let's say we have following relations:
public function getOrders() {
    return $this->hasMany(Orders::className(), ['fk_product_id' => 'id']);
}

public function getOrdersByDate() {
    return $this->hasMany(Orders::className(), ['fk_product_id' => 'id'])->orderBy('date');
}
So the question is: is there a way to connect these two relations without making an extra SQL query when I call $model->ordersByDate? I know I could go through the first relation with foreach() and sort it to get the result of the second relation, but that doesn't seem very wise.
You can use ->with() to get all the information at once
Model::find()->with('orders')->with('ordersByDate')->all()
and then reference them with $model->orders and $model->ordersByDate.
Or you can get the orders once with getOrders and sort/find in the array later.
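As a sketch of that second idea (assuming date is a plain attribute on Orders), you could let getOrdersByDate() reuse the already-loaded orders relation and sort in PHP, so accessing $model->ordersByDate doesn't fire a second query:

public function getOrdersByDate()
{
    // $this->orders uses Yii's relation cache, so the query runs at most once
    $orders = $this->orders;
    usort($orders, function ($a, $b) {
        return strcmp($a->date, $b->date); // ascending by date
    });
    return $orders;
}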
Unless you can be more specific about what kind of queries you're running, if you want a solution you can adapt to several different issues you'll just have to read up :)
http://www.yiiframework.com/doc-2.0/yii-helpers-basearrayhelper.html
check the BaseArrayHelper::index()
http://www.w3schools.com/php/php_arrays_sort.asp
http://php.net/manual/en/function.array-search.php
http://php.net/manual/en/function.ksort.php
$fruits = array("d"=>"lemon", "a"=>"orange", "b"=>"banana", "c"=>"apple");
ksort($fruits);
foreach ($fruits as $key => $val) {
echo "$key = $val\n";
}
In Symfony3, I'm using Doctrine's QueryBuilder to iterate up to 500k rows from my 35 million row table:
$query = $this->createQueryBuilder('l')
    ->where('l.foo = :foo')
    ->setParameter('foo', $foo)
    ->getQuery();

$results = $query->iterate();

foreach ($results as $result) {
    $em->clear();
    // My logic using $result[0]
}
The memory usage of this often approaches 512 MB before I even begin to iterate. Is there any further way I can optimise this? Am I correct in reading that hydration is turned off when iterating a query?
I had great results with generators. Perhaps processing results in a separate method helps PHP clean up unused objects. I'm not sure what you're doing to process your records, and I cannot guarantee you'll get the same results, but in my case memory consumption remained constant throughout the whole script execution:
public function getMyResults($foo)
{
    $query = $this->createQueryBuilder('l')
        ->where('l.foo = :foo')
        ->setParameter('foo', $foo)
        ->getQuery();

    foreach ($query->iterate() as $result) {
        yield $result[0];
        // Detach processed entities so memory stays flat (entity manager from the repository).
        $this->getEntityManager()->clear();
    }
}
public function processMyResults($foo)
{
    foreach ($this->getMyResults($foo) as $result) {
        // process each $result here
    }
}
If this doesn't help, consider making a query with DBAL or PDO (both with the fetch() method to avoid fetching all records at once). Doctrine's iterator might leak memory (PDO's resultset shouldn't).
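A rough sketch of the DBAL route, assuming you can reach the connection from the repository and that the table/column names below are placeholders mirroring the question:

$conn = $this->getEntityManager()->getConnection();
$stmt = $conn->executeQuery(
    'SELECT * FROM my_table t WHERE t.foo = :foo',
    ['foo' => $foo]
);
while ($row = $stmt->fetch()) {
    // $row is a plain associative array: no entity hydration, no UnitOfWork bookkeeping
}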
Doctrine will solve 80% of your problems. The remaining 20% is better approached without it.
Am I correct in reading that hydration is turned off when iterating a query?
No, unless you change the hydration mode. You can do it by passing a second argument to the iterate() method.
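For instance, with the query from the question you could iterate with scalar hydration instead of full entities (just a sketch):

foreach ($query->iterate(null, \Doctrine\ORM\Query::HYDRATE_SCALAR) as $row) {
    // $row[0] is a plain array of column values rather than a hydrated entity
}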
Example from Doctrine docs
$batchSize = 20;
$i = 1;
$q = $em->createQuery('select u from MyProject\Model\User u');

foreach ($q->toIterable() as $user) {
    $user->increaseCredit();
    $user->calculateNewBonuses();
    ++$i;
    if (($i % $batchSize) === 0) {
        $em->flush(); // Executes all updates.
        $em->clear(); // Detaches all objects from Doctrine!
    }
}
$em->flush();
Has anyone ever come across this error: General error: 1390 Prepared statement contains too many placeholders
I just did an import via SequelPro of over 50,000 records and now when I go to view these records in my view (Laravel 4) I get General error: 1390 Prepared statement contains too many placeholders.
The below index() method in my AdminNotesController.php file is what is generating the query and rendering the view.
public function index()
{
    $created_at_value = Input::get('created_at_value');
    $note_types_value = Input::get('note_types_value');
    $contact_names_value = Input::get('contact_names_value');
    $user_names_value = Input::get('user_names_value');
    $account_managers_value = Input::get('account_managers_value');

    if (is_null($created_at_value)) $created_at_value = DB::table('notes')->lists('created_at');
    if (is_null($note_types_value)) $note_types_value = DB::table('note_types')->lists('type');
    if (is_null($contact_names_value)) $contact_names_value = DB::table('contacts')->select(DB::raw('CONCAT(first_name," ",last_name) as cname'))->lists('cname');
    if (is_null($user_names_value)) $user_names_value = DB::table('users')->select(DB::raw('CONCAT(first_name," ",last_name) as uname'))->lists('uname');

    // In the view, there is a dropdown box, that allows the user to select the amount of records to show per page. Retrieve that value or set a default.
    $perPage = Input::get('perPage', 10);

    // This code retrieves the order from the session that has been selected by the user by clicking on a table column title. The value is placed in the session via the getOrder() method and is used later in the Eloquent query and joins.
    $order = Session::get('account.order', 'company_name.asc');
    $order = explode('.', $order);

    $notes_query = Note::leftJoin('note_types', 'note_types.id', '=', 'notes.note_type_id')
        ->leftJoin('users', 'users.id', '=', 'notes.user_id')
        ->leftJoin('contacts', 'contacts.id', '=', 'notes.contact_id')
        ->orderBy($order[0], $order[1])
        ->select(array('notes.*', DB::raw('notes.id as nid')));

    if (!empty($created_at_value)) $notes_query = $notes_query->whereIn('notes.created_at', $created_at_value);

    $notes = $notes_query->whereIn('note_types.type', $note_types_value)
        ->whereIn(DB::raw('CONCAT(contacts.first_name," ",contacts.last_name)'), $contact_names_value)
        ->whereIn(DB::raw('CONCAT(users.first_name," ",users.last_name)'), $user_names_value)
        ->paginate($perPage)->appends(array('created_at_value' => Input::get('created_at_value'), 'note_types_value' => Input::get('note_types_value'), 'contact_names_value' => Input::get('contact_names_value'), 'user_names_value' => Input::get('user_names_value')));

    $notes_trash = Note::onlyTrashed()
        ->leftJoin('note_types', 'note_types.id', '=', 'notes.note_type_id')
        ->leftJoin('users', 'users.id', '=', 'notes.user_id')
        ->leftJoin('contacts', 'contacts.id', '=', 'notes.contact_id')
        ->orderBy($order[0], $order[1])
        ->select(array('notes.*', DB::raw('notes.id as nid')))
        ->get();

    $this->layout->content = View::make('admin.notes.index', array(
        'notes' => $notes,
        'created_at' => DB::table('notes')->lists('created_at', 'created_at'),
        'note_types' => DB::table('note_types')->lists('type', 'type'),
        'contacts' => DB::table('contacts')->select(DB::raw('CONCAT(first_name," ",last_name) as cname'))->lists('cname', 'cname'),
        'accounts' => Account::lists('company_name', 'company_name'),
        'users' => DB::table('users')->select(DB::raw('CONCAT(first_name," ",last_name) as uname'))->lists('uname', 'uname'),
        'notes_trash' => $notes_trash,
        'perPage' => $perPage
    ));
}
Any advice would be appreciated. Thanks.
I solved this issue by using the array_chunk function.
Here is the solution below:
foreach (array_chunk($data,1000) as $t)
{
DB::table('table_name')->insert($t);
}
There is a limit of 65,535 (2^16-1) placeholders in MariaDB 5.5, which is supposed to have identical behaviour to MySQL 5.5.
Not sure if relevant, I tested it on PHP 5.5.12 using MySQLi / MySQLND.
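In practice that means rows × columns per statement must stay under 65,535, so you can derive a safe chunk size from the column count. A small sketch, assuming $data is an array of uniform associative rows and Laravel's query builder:

$columns = count(reset($data));             // placeholders needed per row
$chunkSize = (int) floor(65535 / $columns); // stay under the MySQL/MariaDB limit
foreach (array_chunk($data, $chunkSize) as $chunk) {
    DB::table('table_name')->insert($chunk);
}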
This error only happens when both of the following conditions are met:
You are using the MySQL Native Driver (mysqlnd) and not the MySQL client library (libmysqlclient)
You are not emulating prepares.
If you change either one of these factors, this error will not occur. However, keep in mind that doing both of these is recommended, either for performance or for security reasons, so I would not recommend this solution for anything but a one-time or temporary problem. To prevent this error from occurring, the fix is as simple as:
$dbh->setAttribute(PDO::ATTR_EMULATE_PREPARES, true);
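If you're on Laravel, the same attribute can be set through the connection's options array; a sketch, assuming app/config/database.php:

'mysql' => array(
    'driver'  => 'mysql',
    // ... host, database, username, password ...
    'options' => array(
        // Emulated prepares sidestep the 65,535-placeholder limit (with the trade-offs noted above).
        PDO::ATTR_EMULATE_PREPARES => true,
    ),
),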
While I think @The Disintegrator is correct about the placeholders being limited, I would not run one query per record.
I have a query that worked fine until I added one more column; now I have 72k placeholders and I get this error. However, that 72k is made up of 9,000 rows with 8 columns. Running this query one record at a time would take days. (I'm trying to import AdWords data into a DB, and it would literally take more than 24 hours to import a day's worth of data if I did it one record at a time. I tried that first.)
What I would recommend is something of a hack. First, dynamically determine the maximum number of placeholders you want to allow - e.g. 60k to be safe. Use this number to determine, based on the number of columns, how many complete records you can import/return at once. Create the full array of data for your query. Then use array_chunk and a foreach loop to grab everything you want in the minimum number of queries. Like this:
$maxRecords = 1000;
$sql = 'INSERT INTO ... VALUES '; // base statement; the per-row placeholder groups are appended below
$qMarks = array_fill(0, $maxRecords, '(?, ...)');
$tmp = $sql . implode(', ', $qMarks);

foreach (array_chunk($data, $maxRecords) as $junk => $dataArray) {
    if (count($dataArray) < $maxRecords) { break; }
    // Do your PDO stuff here using $tmp as your SQL statement with all those placeholders - the ?s
}
// Now insert all the leftovers with basically the same code as above, except accounting for
// the fact that you have fewer than $maxRecords now.
Using a Laravel model, this copies all 11,000 records from an SQLite database to a MySQL database in a few seconds. It chunks the data array into 500-record pieces:
public function handle(): void
{
    $smodel = new Src_model();
    $smodel->setTable($this->argument('fromtable'));
    $smodel->setConnection('default'); // sqlite database
    $src = $smodel::all()->toArray();

    $dmodel = new Dst_model();
    $dmodel->setTable($this->argument('totable'));
    $dmodel->timestamps = false;
    $stack = $dmodel->getFields();
    $fields = array_shift($stack);

    $condb = DB::connection('mysql');
    $condb->beginTransaction();

    $dmodel::query()->truncate();
    $dmodel->fillable($stack);
    $srcarr = array_chunk($src, 500);

    $isOK = true;
    foreach ($srcarr as $item) {
        if (!$dmodel->query()->insert($item)) $isOK = false;
    }

    if ($isOK) {
        $this->notify("We moved the table from: {$this->argument('fromtable')} to: {$this->argument('totable')}", 'It will be fresher than ever!');
        $condb->commit();
    } else {
        $condb->rollBack();
    }
}
You can do it with the array_chunk function, like this:
foreach (array_chunk($data, 1000) as $key => $smallerArray) {
    $temp = []; // reset for each chunk so a smaller final chunk doesn't keep stale rows
    foreach ($smallerArray as $index => $value) {
        $temp[$index] = $value;
    }
    DB::table('table_name')->insert($temp);
}
My fix for the above issue:
On my side, when I got this error I fixed it by reducing the bulk insertion chunk size from 1000 to 800, and it worked for me.
There were too many fields in my table, and most of them contain long descriptions the size of a complete page of text. When I attempted the bulk insertion, the service crashed and threw the above error.
I think the number of placeholders is limited to 65,535 per query (at least in older MySQL versions).
I really can't discern what this piece of code is generating, but if it's a gigantic query, there's your problem.
You should generate one query per record to import and put those into a transaction.