Does Laravel clone query reduce query time? - mysql

When I need to get both paid and pending orders, I searched and found a way to clone the query:
$paid = $products->clone()->where('paid', 1)->count();
$pending = $products->clone()->where('paid', 0)->count();
I wonder if this approach saves query time or if we still send two requests to the database server.
Thanks

Laravel added the Illuminate\Support\Benchmark utility class in version 9.32. If you have it available, you can test this for yourself.
use Illuminate\Support\Benchmark;

public function yourControllerMethod()
{
    $products = Product::query();
    $iterations = 100;

    Benchmark::dd(
        [
            'cloning' => function () use ($products) {
                $products->clone()->where('paid', 1)->count();
                $products->clone()->where('paid', 0)->count();
            },
            'not cloning' => function () {
                Product::query()->where('paid', 1)->count();
                Product::query()->where('paid', 0)->count();
            },
        ],
        $iterations
    );
}
Both approaches will execute the same number of SQL queries, though, so I think the time saved (if any) will be minimal.
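If the goal is to reduce database round trips rather than PHP overhead, one common alternative (not from the answer above, just a hedged sketch) is to fetch both counts in a single query with conditional aggregation:

// A sketch: one query returning both counts via conditional aggregation.
$counts = Product::query()
    ->selectRaw('SUM(paid = 1) as paid_count, SUM(paid = 0) as pending_count')
    ->first();

$paid    = (int) $counts->paid_count;
$pending = (int) $counts->pending_count;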

Related

Insert query works fine on my local but doesn't execute on my live database (Laravel/MySQL)

I have a very simple function that loops through an array and inserts some data into a results table - this works perfectly fine on my local machine using the very same code. On my local setup (Mac) using Laravel Valet and a MySQL database, it reaches the Result::create($data) call and inserts the data into the database. However, on the live/remote site it never reaches Result::create() within insertUniqueMatches() for some reason.
I have added the DB user in the .env file and it has been granted all privileges, so I cannot understand why this won't insert the entry into the results table. Can anyone explain what I am doing wrong? All migrations have been run to ensure my local and live databases are identical.
P.S. I have tried both the $fillable variable with all the relevant items in the array and $guarded as a blank array, and the problem persists.
class Result extends Model
{
    use HasFactory;

    // protected $fillable = ['match_id', 'home_team_id', 'away_team_id', 'home_team_goals', 'away_team_goals', 'outcome', 'match_date', 'properties', 'platform_id'];
    protected $guarded = [];

    public static function insertUniqueMatches($matches, $platform = null)
    {
        $inserted = 0;

        foreach ($matches as $match) {
            // check if the match already exists in the db; if so, don't re-insert it
            if (Result::where('match_id', '=', $match['matchId'])->doesntExist()) {
                $carbonDate = Carbon::now();
                $carbonDate->timestamp($match['timestamp']);
                $clubs = collect($match['clubs'])->values();

                $data = [
                    'match_id' => $match['matchId'],
                    'home_team_id' => $clubs[0]['id'],
                    'away_team_id' => $clubs[1]['id'],
                    'home_team_goals' => $clubs[0]['goals'],
                    'away_team_goals' => $clubs[1]['goals'],
                    'outcome' => self::getMatchOutcome($clubs[0]),
                    'match_date' => $carbonDate->format('Y-m-d H:i:s'),
                    'properties' => json_encode([
                        'clubs' => $match['clubs'],
                        'players' => $match['players']
                    ]),
                    'platform_id' => $platform
                ];

                dump($data); // this shows valid data in the terminal

                // this if condition is only reached on my local development but never on live, so no inserts happen on the live DB
                if (Result::create($data)) {
                    $inserted++;
                    dump('inserted matchId: ' . $match['matchId']); // never see this on live but always on local
                }
            }
        }

        return $inserted;
    }
}
I think the better solution for now is to find the actual problem. You could wrap the code in a try-catch to get more information.
Replace this code:

dump($data); // this shows valid data in the terminal
// this if condition is only reached on my local development but never on live
if (Result::create($data)) {
    $inserted++;
    dump('inserted matchId: ' . $match['matchId']); // never see this on live but always on local
}

with:

try {
    Result::create($data);
} catch (\Exception $e) {
    dd($e);
}
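If dd() output is not visible on the live server (for example when this code runs from a queued job or scheduled command), a variation of the same idea is to log the exception instead; a minimal sketch, not part of the original answer:

use Illuminate\Support\Facades\Log;

try {
    Result::create($data);
    $inserted++;
} catch (\Throwable $e) {
    // Writes the failure and the offending matchId to storage/logs/laravel.log
    Log::error('Result insert failed', [
        'matchId' => $match['matchId'],
        'error'   => $e->getMessage(),
    ]);
}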

Execute stored procedure in directus headless cms

I just found the Directus headless CMS.
It looks awesome and resolves many use cases for me.
But I am concerned about how to achieve transactions, aggregate functions, or complex queries. I understand that this may be out of scope.
If a custom endpoint or GraphQL allowed me to execute a stored procedure, all my needs would be covered.
Is it possible?
Hi, I finally found out how to use custom endpoints to run plain queries, including stored procedures.
It may be possible to implement a module that adds an admin GUI option for this; I will try to work on that. For the moment, this is an example for a select:
use Directus\Application\Http\Request;
use Directus\Application\Http\Response;

return [
    '' => [
        'method' => 'GET',
        'handler' => function (Request $request, Response $response) {
            $container = \Directus\Application\Application::getInstance()->getContainer();
            $dbConnection = $container->get('database');
            $tableGateway = new \Zend\Db\TableGateway\TableGateway('directus_users', $dbConnection);

            $query = $tableGateway->getAdapter()->query("select * from productos where 1=1");
            $result = $query->execute();

            if ($result->count() > 0) {
                $returnArr = array();

                while ($result->valid()) {
                    $returnArr[] = $result->current();
                    $result->next();
                }

                if (count($returnArr) > 0) {
                    return $response->withJson([
                        'data' => [
                            $returnArr,
                        ],
                    ]);
                }
            }

            return "{}";
        },
    ],
];
Sorry for my bad English.
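Since the question specifically asks about stored procedures, the same adapter should be usable to call one instead of a plain select. A minimal sketch (the procedure name and argument are assumptions, not from the original answer):

// Inside the same handler as above, replacing the plain select
// (procedure name and argument are hypothetical):
$statement = $tableGateway->getAdapter()->query("CALL my_report_procedure(?)");
$result = $statement->execute([2020]);

$rows = [];
while ($result->valid()) {
    $rows[] = $result->current();
    $result->next();
}

return $response->withJson(['data' => $rows]);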

Getting an error when I start multiple select queries inside a loop: TimeoutError: ResourceRequest timed out

I'm using the Node.js Express framework.
I'm using a MySQL database with the Sequelize library and querying it to retrieve data.
I am getting a timeout error when I fire select queries against a table with almost 5,000,000 records.
I have increased the server timeout, but that did not work.
I have configured pooling in Sequelize, but that did not work either.
function fetchNamesData(req, name) {
    return new Promise((resolve, reject) => {
        const names = req.app.locals.models.names_data;

        names.findAll({
            where: {
                name: name
            },
            order: [['date', 'DESC']],
            limit: 50
        })
        .then(function (dbRes) {
            console.log(dbRes.length);
            resolve(dbRes);
        })
        .catch(function (dbErr) {
            console.log(dbErr);
            return reject(dbErr);
        });
    });
}
allNames.forEach(element => {
    //console.log(element.dataValues.name);
    fetchNamesData(req, element.dataValues.name).then((dbRes) => {
        // here I will have all the records
    }).catch((dbErr) => { console.log(dbErr) });
});

allNames is an object holding almost 7,000 names. I iterate over it, and each name has 50 records in the database, so I want to fetch all of them: 50 * 7,000 = 350,000 records.
What happens in your case is this:
You loop through 7,000 names and fire 7,000 queries at MySQL at the same time, so MySQL queues up 7,000 queries to execute at once, which puts a heavy load on the machine. Either update your configuration to handle such a load, or
try this solution: put some delay between the queries; that way you will be able to fetch more records:
allNames.forEach(element => {
    setTimeout(() => { // <----------- HERE -------------
        fetchNamesData(req, element.dataValues.name).then((dbRes) => {
            // here I will have all the records
        }).catch((dbErr) => {
            console.log(dbErr)
        });
    }, 500); // <----------- HERE ------------- (note: a fixed 500 ms delay fires all queries at roughly the same time; scale it by the element index to actually stagger them)
});
I have found a solution:
- Remove the unwanted console.log() calls.
- The timeout error also depends on your hardware configuration.
- While a query is running, do not start or run any other work; it will lead to a timeout error when multiple CRUD operations are going on.
- Also add an index to a table field when that field is going to be used in a WHERE clause.

How can I fetch huge numbers of records using Laravel and MySQL?

I need experts' suggestions and solutions. We are developing a job portal website that handles around 1 million records, and we are facing timeouts when fetching records. How can I handle those records using Laravel and MySQL?
We have tried the following steps:
Increase the PHP execution time
MySQL indexing
Pagination
You should be chunking results when working with large data sets. This lets you process smaller loads, reduces memory consumption, and allows you to return data to the user while the rest is being fetched/processed. See the Laravel documentation on chunking:
https://laravel.com/docs/5.5/eloquent#chunking-results
To further speed things up, you can parallelise the work and spawn concurrent processes that each handle a chunk at a time. Symfony's Symfony\Component\Process\Process class makes this easy to do.
https://symfony.com/doc/current/components/process.html
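A minimal sketch of that idea, assuming a hypothetical artisan command users:process that accepts a --segment option (none of these names come from the answer):

use Symfony\Component\Process\Process;

$processes = [];

// Spawn ten workers, each responsible for one slice of the data.
foreach (range(0, 9) as $segment) {
    $process = new Process(['php', 'artisan', 'users:process', "--segment={$segment}"]);
    $process->start();
    $processes[] = $process;
}

// Wait for every worker to finish before continuing.
foreach ($processes as $process) {
    $process->wait();
}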
From the docs:
If you need to work with thousands of database records, consider using the chunk method. This method retrieves a small chunk of the results at a time and feeds each chunk into a Closure for processing. This method is very useful for writing Artisan commands that process thousands of records.
For example, let's work with the entire users table in chunks of 100 records at a time:
DB::table('users')->orderBy('id')->chunk(100, function ($users) {
    foreach ($users as $user) {
        //
    }
});
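One caveat worth noting (this goes beyond the quoted docs): if the rows are also being updated inside the closure in a way that changes the column you filter on, the chunkById variant is the safer choice, because it pages on the id column. A small sketch; the processed column is hypothetical:

DB::table('users')->where('processed', false)->chunkById(100, function ($users) {
    foreach ($users as $user) {
        // ... do the work, then mark the row so it is not picked up again
        DB::table('users')->where('id', $user->id)->update(['processed' => true]);
    }
});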
Hi, I think this might help:
use Carbon\Carbon;
use Symfony\Component\HttpFoundation\StreamedResponse;

$users = User::groupBy('id')->orderBy('id', 'asc');

$response = new StreamedResponse(function () use ($users) {
    $handle = fopen('php://output', 'w');

    // Add the CSV header row
    fputcsv($handle, [
        'col1', 'Col 2'
    ]);

    $users->chunk(1000, function ($filtered_users) use ($handle) {
        foreach ($filtered_users as $user) {
            // Add a new row with user data
            fputcsv($handle, [
                $user->col1, $user->col2
            ]);
        }
    });

    // Close the output stream
    fclose($handle);
}, 200, [
    'Content-Type' => 'text/csv',
    'Content-Disposition' => 'attachment; filename="Users' . Carbon::now()->toDateTimeString() . '.csv"',
]);

return $response;
Laravel has a lazy feature for this purpose. I tried both chunk and cursor. The cursor makes one query and can keep a lot of data in memory, which is not useful if you have millions of records in the DB. Chunk was also OK, but lazy is much cleaner in the way you write your code.
use App\Models\Flight;

foreach (Flight::lazy() as $flight) {
    //
}
Source: https://laravel.com/docs/9.x/eloquent#chunking-results

Import of 50K+ Records in MySQL Gives General error: 1390 Prepared statement contains too many placeholders

Has anyone ever come across this error: General error: 1390 Prepared statement contains too many placeholders?
I just did an import via Sequel Pro of over 50,000 records, and now when I go to view these records in my view (Laravel 4) I get General error: 1390 Prepared statement contains too many placeholders.
The index() method below, in my AdminNotesController.php file, is what generates the query and renders the view.
public function index()
{
    $created_at_value = Input::get('created_at_value');
    $note_types_value = Input::get('note_types_value');
    $contact_names_value = Input::get('contact_names_value');
    $user_names_value = Input::get('user_names_value');
    $account_managers_value = Input::get('account_managers_value');

    if (is_null($created_at_value)) $created_at_value = DB::table('notes')->lists('created_at');
    if (is_null($note_types_value)) $note_types_value = DB::table('note_types')->lists('type');
    if (is_null($contact_names_value)) $contact_names_value = DB::table('contacts')->select(DB::raw('CONCAT(first_name," ",last_name) as cname'))->lists('cname');
    if (is_null($user_names_value)) $user_names_value = DB::table('users')->select(DB::raw('CONCAT(first_name," ",last_name) as uname'))->lists('uname');

    // In the view, there is a dropdown box that allows the user to select the number of records to show per page. Retrieve that value or set a default.
    $perPage = Input::get('perPage', 10);

    // This code retrieves the order from the session that has been selected by the user by clicking on a table column title. The value is placed in the session via the getOrder() method and is used later in the Eloquent query and joins.
    $order = Session::get('account.order', 'company_name.asc');
    $order = explode('.', $order);

    $notes_query = Note::leftJoin('note_types', 'note_types.id', '=', 'notes.note_type_id')
        ->leftJoin('users', 'users.id', '=', 'notes.user_id')
        ->leftJoin('contacts', 'contacts.id', '=', 'notes.contact_id')
        ->orderBy($order[0], $order[1])
        ->select(array('notes.*', DB::raw('notes.id as nid')));

    if (!empty($created_at_value)) $notes_query = $notes_query->whereIn('notes.created_at', $created_at_value);

    $notes = $notes_query->whereIn('note_types.type', $note_types_value)
        ->whereIn(DB::raw('CONCAT(contacts.first_name," ",contacts.last_name)'), $contact_names_value)
        ->whereIn(DB::raw('CONCAT(users.first_name," ",users.last_name)'), $user_names_value)
        ->paginate($perPage)->appends(array('created_at_value' => Input::get('created_at_value'), 'note_types_value' => Input::get('note_types_value'), 'contact_names_value' => Input::get('contact_names_value'), 'user_names_value' => Input::get('user_names_value')));

    $notes_trash = Note::onlyTrashed()
        ->leftJoin('note_types', 'note_types.id', '=', 'notes.note_type_id')
        ->leftJoin('users', 'users.id', '=', 'notes.user_id')
        ->leftJoin('contacts', 'contacts.id', '=', 'notes.contact_id')
        ->orderBy($order[0], $order[1])
        ->select(array('notes.*', DB::raw('notes.id as nid')))
        ->get();

    $this->layout->content = View::make('admin.notes.index', array(
        'notes' => $notes,
        'created_at' => DB::table('notes')->lists('created_at', 'created_at'),
        'note_types' => DB::table('note_types')->lists('type', 'type'),
        'contacts' => DB::table('contacts')->select(DB::raw('CONCAT(first_name," ",last_name) as cname'))->lists('cname', 'cname'),
        'accounts' => Account::lists('company_name', 'company_name'),
        'users' => DB::table('users')->select(DB::raw('CONCAT(first_name," ",last_name) as uname'))->lists('uname', 'uname'),
        'notes_trash' => $notes_trash,
        'perPage' => $perPage
    ));
}
Any advice would be appreciated. Thanks.
I solved this issue by using the array_chunk function. Here is the solution:

foreach (array_chunk($data, 1000) as $t) {
    DB::table('table_name')->insert($t);
}
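As a hedged follow-up (not part of the original answer): instead of hard-coding 1000, the chunk size can be derived from MySQL's 65,535-placeholder limit and the number of columns per row, for example:

// Hypothetical sizing: keep each INSERT safely under the placeholder limit,
// assuming $data is a non-empty array of rows.
$columns   = count($data[0]);
$chunkSize = max(1, (int) floor(60000 / max(1, $columns)));

foreach (array_chunk($data, $chunkSize) as $chunk) {
    DB::table('table_name')->insert($chunk);
}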
There is a limit of 65,535 (2^16 - 1) placeholders in MariaDB 5.5, which is supposed to behave identically to MySQL 5.5.
Not sure if it is relevant: I tested it on PHP 5.5.12 using MySQLi / mysqlnd.
This error only happens when both of the following conditions are met:
You are using the MySQL Native Driver (mysqlnd) and not the MySQL client library (libmysqlclient).
You are not emulating prepares.
If you change either one of these factors, this error will not occur. However, keep in mind that both of them are recommended, for performance and security reasons respectively, so I would not recommend this solution for anything but a one-time or temporary problem. To prevent this error from occurring, the fix is as simple as:
$dbh->setAttribute(PDO::ATTR_EMULATE_PREPARES, true);
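Since the question is about a Laravel app, the same attribute can presumably be set on the connection via the options array in config/database.php rather than on a raw PDO handle; a sketch, not taken from the answer:

// config/database.php, inside the mysql connection entry (assumed placement)
'mysql' => [
    'driver'  => 'mysql',
    // ... host, database, username, password ...
    'options' => [
        PDO::ATTR_EMULATE_PREPARES => true,
    ],
],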
While I think #The Disintegrator is correct about the placeholders being limited, I would not run one query per record.
I have a query that worked fine until I added one more column, and now I have 72k placeholders and get this error. However, that 72k is made up of 9,000 rows with 8 columns. Running this query one record at a time would take days. (I'm trying to import AdWords data into a DB, and it would literally take more than 24 hours to import a day's worth of data if I did it one record at a time. I tried that first.)
What I would recommend is something of a hack. First, dynamically determine the maximum number of placeholders you want to allow - say 60k to be safe. Use this number to determine, based on the number of columns, how many complete records you can import/return at once. Create the full array of data for your query. Use array_chunk and a foreach loop to grab everything you want in the minimum number of queries. Like this:
$maxRecords = 1000;
$sql = 'SELECT * FROM ...';
$qMarks = array_fill(0, $maxRecords, '(?, ...)');
$tmp = $sql . implode(', ', $qMarks);

foreach (array_chunk($data, $maxRecords) as $junk => $dataArray) {
    if (count($dataArray) < $maxRecords) { break; }
    // Do your PDO stuff here using $tmp as your SQL statement with all those placeholders - the ?s
}

// Now insert all the leftovers with basically the same code as above, except accounting for
// the fact that you now have fewer than $maxRecords.
Using a Laravel model, this copies all 11,000 records from an SQLite database to a MySQL database in a few seconds. The data array is chunked into 500 records:
public function handle(): void
{
    $smodel = new Src_model();
    $smodel->setTable($this->argument('fromtable'));
    $smodel->setConnection('default'); // sqlite database
    $src = $smodel::all()->toArray();

    $dmodel = new Dst_model();
    $dmodel->setTable($this->argument('totable'));
    $dmodel->timestamps = false;
    $stack = $dmodel->getFields();
    $fields = array_shift($stack);

    $condb = DB::connection('mysql');
    $condb->beginTransaction();
    $dmodel::query()->truncate();
    $dmodel->fillable($stack);

    $srcarr = array_chunk($src, 500);
    $isOK = true;

    foreach ($srcarr as $item) {
        if (!$dmodel->query()->insert($item)) $isOK = false;
    }

    if ($isOK) {
        // Notification message translated from Polish
        $this->notify("We moved the data from table: {$this->argument('fromtable')} to table: {$this->argument('totable')}", 'It will be fresher than ever!');
        $condb->commit();
    } else {
        $condb->rollBack();
    }
}
You can do it with the array_chunk function, like this:

foreach (array_chunk($data, 1000) as $key => $smallerArray) {
    $temp = [];
    foreach ($smallerArray as $index => $value) {
        $temp[$index] = $value;
    }
    DB::table('table_name')->insert($temp);
}
My fix for the above issue:
On my side, when I got this error I fixed it by reducing the bulk insertion chunk size from 1000 to 800, and it worked for me.
There were actually too many fields in my table, and most of them contain long detail descriptions, roughly a full page of text each. When I ran the bulk insertion, the service crashed and threw the above error.
I think the number of placeholders is limited to 65,536 per query (at least in older MySQL versions).
I really can't tell what this piece of code is generating, but if it's a gigantic query, there's your problem.
You should generate one query per record to import and wrap those in a transaction.