I am trying to use the onDelete trigger for my Cloud Functions on Firestore. I have two collections, "alerts" and "logs". The Log object has an "alertId" key. What I'm trying to do is, when an Alert is deleted, delete all the corresponding logs using a Cloud Function.
Something like this:
exports.deleteLogs = functions.database.instance('my-app').ref('/alerts/{alertId}')
  .onDelete((snap) => {
    snap.ref('logs', ref => ref.where('alertId', '==', alertId)).delete();
  });
You can trigger a function when a Firestore document is deleted like this:
exports.deleteUser = functions.firestore
  .document('alerts/{alertID}')
  .onDelete((snap, context) => {
    // snap.data() holds the document prior to deletion;
    // the deleted alert's id is available in context.params.
    const alertId = context.params.alertID;
    // From there, delete all logs with that alertId key
    // (assumes firebase-admin has been initialized as `admin`)
    return admin.firestore().collection('logs')
      .where('alertId', '==', alertId)
      .get()
      .then((snapshot) => {
        const batch = admin.firestore().batch();
        snapshot.forEach((doc) => batch.delete(doc.ref));
        return batch.commit();
      });
  });
I am using json-server in my Angular app to create, fetch, and delete posts.
In the following method, I delete a post with a specified id:
deleteConsumer(post: Post): Observable<Post> {
  const url = `${this.apiUrl}/${post.id}`;
  return this.httpClient.delete<Post>(url);
}
I looked at the .delete code and searched for something like a .deleteall but could not find it. Is there really no such method that would delete everything?
If there really isn't, then my attempt at doing it myself is not paying off, because what I have done is not working:
deleteConsumers(): Observable<Post> {
  let i: number = 0;
  this.httpClient.get<Post[]>(this.apiUrl).forEach(
    () => {
      ++i;
      const url = `${this.apiUrl}/${i}`;
      return this.httpClient.delete<Post>(url);
    }
  );
}
Obviously, this is wrong in terms of return type, but I cannot figure out what to do... How can I modify the first method so that it iterates over all the posts in my db.json file and deletes them all?
I encountered this when using json-server with Vue.js, and I realized there is no special function to delete everything at once; I had to work around it.
So, for example in your case, I would first map the posts array to get a new array with only the post ids:
const postsIdsArray = this.posts.map((post) => post.id)
Then, assuming you already have a function to delete one post given the id, I would then execute the function for each of the ids in the array:
postsIdsArray.forEach((id) => this.deletePost(id))
Just combine the two lines in one JavaScript function (in this case I used Vue.js):
deleteAllPosts() {
  const postsIdsArray = this.posts.map((post) => post.id)
  postsIdsArray.forEach((id) => this.deletePost(id))
}
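The same two-step pattern works outside Vue as well. Here is a minimal, framework-free sketch of the idea against an in-memory store; the sample posts and the deletePost helper are stand-ins for your real data and HTTP call:

```javascript
// In-memory stand-in for the json-server data (sample posts are made up).
let posts = [
  { id: 1, title: 'first' },
  { id: 2, title: 'second' },
  { id: 3, title: 'third' },
];

// Stand-in for a single-item delete, e.g. DELETE /posts/:id.
function deletePost(id) {
  posts = posts.filter((post) => post.id !== id);
}

// Delete-all: collect the ids first, then delete each one.
function deleteAllPosts() {
  const postsIdsArray = posts.map((post) => post.id);
  postsIdsArray.forEach((id) => deletePost(id));
}

deleteAllPosts();
console.log(posts.length); // → 0
```

Collecting the ids into a separate array before deleting avoids iterating a collection while mutating it, which is also where the counter-based attempt in the question goes wrong.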
I've got two services in Feathers.
For the sake of example, let's call them:
TodoLists
TodoItems
TodoItems have an N:1 association to TodoLists, and I include the TodoItems model in a find hook of the TodoLists service.
Now - on my frontend I have a listener that listens for 'update' event on TodoLists.
What is the right way to get the 'update' event emitted for TodoLists when any TodoItem is updated?
Try listening for all TodoItems updates and filtering by your TodoList id:
const currentTodoList = 1;

app.service('TodoItems').on('updated', (item) => {
  if (item.todolist === currentTodoList) {
    // update ui
  }
});
If you additionally want to optimize the traffic sent by the backend so that only the needed data is sent, you could use a subscribe pattern (a TodoListSubscription that joins a Feathers channel in create and leaves it in remove).
In Laravel, after inserting a row into a DB table, can I safely access that same data right after inserting it by calling latest()? Since transactions from other users may occur at the same time, it may not really be the last record anymore.
Edit:
For example:
public function StoreNotif($data)
{
    auth()->user()->Notif()->create(store $data here..)
}

public function SendNotif()
{
    $data = "123";
    $this->StoreNotif($data)
    event(new Notification(stored Notif instance?));
}
No, you cannot rely on ->latest() to return the record created by your current script.
The ->latest() method will always sort the records with the most recent created_at date first.
https://laravel.com/docs/6.x/queries#ordering-grouping-limit-and-offset
But you haven't provided any code or explanation as to why this is a concern. If you just created a new record, why do you need to query it again? You should already have access to an instance of the model.
EDIT: I've made a few edits to demonstrate how you would pass the model from a controller to an event as referenced in the comments. Please post your code if you want more specific help.
SomeController.php
function store()
{
    $model = Model::create([
        'some_data' => 1
    ]);

    // fire an event with the newly created model
    event(new SomeEvent($model));

    dd($model);
}
// dd() output:
Model {
    // ...
    attributes: [
        'id' => 101,
        'some_data' => 1,
        'created_at' => '2019-10-06 12:48:01',
        'updated_at' => '2019-10-06 12:48:01',
    ]
    // ...
}
SomeEvent.php
<?php

namespace App\Events;

use App\Model;
use Illuminate\Queue\SerializesModels;

class SomeEvent
{
    use SerializesModels;

    public $model;

    public function __construct(Model $model)
    {
        $this->model = $model;
        // ...
    }
}
EDIT: Per your newly added code, you just need to pass the new model back to the original method. You could do something like this.
public function StoreNotif($data)
{
    // add a return statement
    return auth()->user()->Notif()->create(store $data here..);
}

public function SendNotif()
{
    $data = "123";

    // store the returned model in a variable
    $model = $this->StoreNotif($data);

    // call the event with the model instance
    event(new Notification($model));
}
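To see why re-querying with ->latest() is racy while using the returned instance is not, here is a small simulation in plain JavaScript; the in-memory table and helper functions are made-up stand-ins for the database and ORM. Two users insert at nearly the same time, and "latest" returns the other user's row:

```javascript
// In-memory stand-in for a DB table.
const notifications = [];
let nextId = 1;

// create() returns the inserted row directly, like Eloquent's create().
function create(data) {
  const row = { id: nextId++, ...data };
  notifications.push(row);
  return row;
}

// latest() returns the most recently inserted row, like ->latest()->first().
function latest() {
  return notifications[notifications.length - 1];
}

// User A inserts, then user B inserts before A re-queries.
const mine = create({ user: 'A', data: '123' });
create({ user: 'B', data: '456' });

console.log(latest().user); // → 'B' — not the row user A just created
console.log(mine.user);     // → 'A' — the returned instance is always correct
```

Holding on to the instance returned by create() sidesteps the race entirely, which is exactly what the return statement in StoreNotif achieves.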
I'm not sure what 'latest' is, but I do know that MySQL uses SELECT LAST_INSERT_ID() as the query to get the per-connection id of the last inserted row. Under the covers it's using mysql_insert_id, so if you are in a language that supports it, you could use that too.
I'm considering using LoopBack to build a RESTful API for internal usage. I'm currently prototyping a small portion of the API to evaluate limitations and workload.
I have a huge constraint: I'm allowed to Create/Read/Update, but to Delete, I have to update the DB entry to mark it as 'deleted' (a boolean in the database). I'm not allowed to physically delete the DB entry.
I have a PersistedModel, and some relation between object (dependencies, like one object child from another).
My question is: is there a way to override the DELETE actions done in the background and inject some custom code to:
mark the object as "deleted" (like an UPDATE table SET deleted = 1 WHERE id = XXX)
manually CASCADE to dependent objects
while still using the DELETE API call?
Thanks for your advice.
To override the delete methods you can use the following code. Create a mixin based on it and attach it to every required model:
module.exports = function(Model) {
  Model.removeById = Model.deleteById = Model.destroyById = function(id, options, cb) {
    if (cb === undefined && typeof options === 'function') {
      // called as destroyById(id, cb)
      cb = options;
      options = {};
    }
    // createPromiseCallback comes from loopback-datasource-juggler's utils
    cb = cb || createPromiseCallback();
    Model.update({id: id}, {deleted: true}, options, cb);
    return cb.promise;
  };
};
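Assuming the mixin is registered under a name like SoftDelete, you would then attach it in each model's definition JSON; the model and mixin names here are illustrative:

```json
{
  "name": "TodoItem",
  "base": "PersistedModel",
  "mixins": {
    "SoftDelete": true
  }
}
```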
So the situation is that I am using Doctrine as the ORM for one of my projects.
Now I want to be able to track the changes happening on certain tables of my website without too much extra coding.
For example, my database has many tables, and out of those I have a table users on which I want to track the changes made:
1. users has a column name with the value 'Raman'
2. Using the update SQL below, I modify the row:
update users set name = 'Raman Joshi' where name='Raman'
Is there any built-in feature in Doctrine that allows creating a log table tracking all the data-level changes that were made?
You can use a Doctrine preUpdate event listener to do this. Here's a simple example that will send changes to a logger:
use Psr\Log\LoggerInterface as Logger;
use Doctrine\ORM\Event\PreUpdateEventArgs;

class ChangeLoggerListener
{
    protected $logger;

    public function __construct(Logger $logger)
    {
        $this->logger = $logger;
    }

    public function preUpdate(PreUpdateEventArgs $eventArgs)
    {
        // find out the class and id of the object being updated
        $obj = $eventArgs->getEntity();
        $class = get_class($obj);
        $id = $obj->getId();
        $log = "$class($id) updated: ";

        // find out what has changed...
        $changes = $eventArgs->getEntityChangeSet();
        $separator = '';
        foreach ($changes as $field => $values) {
            $log .= $separator . "$field changed from {$values[0]} to {$values[1]}";
            $separator = ", ";
        }

        // send it to the logger
        $this->logger->info($log);
    }
}
The manual page shows how to register the listener, but if you're using Symfony, you can register the listener as a service with this in your services.yml
my.change_logger:
    class: My\ExampleBundle\Listener\ChangeLoggerListener
    arguments: ['@logger']
    tags:
        - { name: doctrine.event_listener, event: preUpdate }