FeathersJS update trigger on create

I am trying to find the best way to update a value in another service after data is created, essentially how a trigger would work in a database context. For instance, I have these objects:
foo = {
  _id,
  barID,
  otherData
}
bar = {
  _id,
  newestFooID,
  otherData
}
When a new foo object is created I want to update bar to link to the new object. Is the only way to do this through a third service, or is there some way to use an after hook (the id field for foo is not available on create)?
I am trying to avoid using events, avoid rewriting the generated service data, and make it database independent (right now it is using mongoose, but that can/will change in the future).

The id for foo should be available in an after hook for create. Then you can just patch the bar service accordingly:
app.service('foo').after({
  create(hook) {
    const barId = hook.result.barId;
    const newestFooId = hook.result._id;

    // Update the barId with the newestFooId
    return app.service('bar')
      .patch(barId, { newestFooId })
      .then(() => hook);
  }
});

I have figured out the problem! The issue is that hook.result is a Mongoose document rather than a plain JavaScript object, so it didn't have the requisite fields on it. While working on another problem I came across the feathers-mongoose hook toObject(), which when called makes the result a plain JavaScript object with all the data I need in it.
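For reference, a minimal sketch of how that could be wired up, assuming the toObject hook is exposed on the feathers-mongoose package's hooks export (the exact import path may vary by version; field names follow the question's objects):

const mongooseService = require('feathers-mongoose');

app.service('foo').after({
  create: [
    // Convert the Mongoose document in hook.result into a plain JS object
    mongooseService.hooks.toObject(),
    function (hook) {
      // _id and barID are now available on the plain result object
      return app.service('bar')
        .patch(hook.result.barID, { newestFooID: hook.result._id })
        .then(() => hook);
    }
  ]
});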

Related

Displaying data based on type of user authentication

I have made a Laravel authentication application that can log in/register users based on a type, and depending on that type they see different information, in this case data tables. I used the Laratrust package to do this and it works well. Now I want to show the data with a Vue component that uses a special data grid, but the controller is both where the data is collected and where the view the user sees (depending on the type of user) is decided. So, how should I send the JSON data to the Vue component, and what else do I need to consider?
Here is the controller in laravel:
class DashboardController extends Controller
{
    public function index()
    {
        if (Auth::user()->hasRole('user')) {
            $posts = DB::select('select * from office');
            return view('userdash', ['posts' => $posts]);
        } elseif (Auth::user()->hasRole('administrator')) {
            $posts = Post::all();
            return view('administratordash', ['posts' => $posts]);
        } elseif (Auth::user()->hasRole('admin')) {
            $people = DB::select('select * from office');
            $posts = Post::all();
            return view('dashboard', ['posts' => $posts, 'people' => $people]);
        }
    }
}
One of the things I tried was
return response(view('userdash',array('posts'=>$posts)),200,['Content-Type' => 'application/json']);
This way I can just send the JSON data and then render it in the Vue component. But I am not sure it is working, as I get back a bunch of HTML and some of the data from the database, but not all of it. I am also not sure how this can be passed to the Vue component; maybe as a prop?
Any and all help and suggestions are appreciated.
As you are responding from the controller with different views, you can just check for the variables sent with the view, and once they are at your disposal in the view you can serialize them using $myVar = $myVariable->toJson() if they are Laravel collections, or json_encode($myVariable) if they are plain arrays.
Then you can pass the serialized value to the component as a prop:
<my-component :data="{{ $myVar }}"></my-component>
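For instance, a rough sketch (the component and prop names are made up) of the userdash view serializing the collection inline and handing it to the component:

{{-- userdash.blade.php: serialize the posts collection and pass it as a prop --}}
<my-component :posts="{{ $posts->toJson() }}"></my-component>

On the Vue side the component would then declare props: ['posts'] to receive the decoded array.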

Gatling: How to setUp and tearDown scenario

I have a Gatling test which should do the following:
create user once
retrieve the user's data according to a specific load model (the actual load testing)
delete the user when done
Question: how do I emulate this with Gatling? If I chain calls like:
val scn = scenario("Test scenario").exec(_create-user_).exec(_retrive-user_).exec(_delete-user_)
setUp(scn).protocols(httpConf)
then creating and deleting the user will be part of the test.
You can use the before and after hooks to create and delete the user.
class RetrieveUserSimulation extends Simulation {
  before {
    // create user
  }

  setUp(scn).protocols(httpConf)

  after {
    // delete user
  }
}
You'll have to issue the create and delete HTTP requests manually. before and after take => Unit thunks, not Scenarios.
In the before hook we can call a method containing the code below.
val httpClient = HttpClientBuilder.create.build
val httpResponse = httpClient.execute(new HttpPut(urlString))
println("StatusCode - " + httpResponse.getStatusLine.getStatusCode)
httpClient.close()
We can use HttpGet as well. The Apache HttpClient library is used here, for example org.apache.http.impl.client.HttpClientBuilder.
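Putting both parts together, a rough end-to-end sketch might look like this (the URLs, paths, and injection profile are placeholders, and baseUrl is the Gatling 3 spelling; older versions use baseURL):

import io.gatling.core.Predef._
import io.gatling.http.Predef._
import org.apache.http.client.methods.{HttpDelete, HttpPut, HttpUriRequest}
import org.apache.http.impl.client.HttpClientBuilder

class RetrieveUserSimulation extends Simulation {

  val httpConf = http.baseUrl("http://example.com") // placeholder base URL

  // Fire a single request outside of Gatling's engine (not measured).
  private def callOnce(request: HttpUriRequest): Unit = {
    val client = HttpClientBuilder.create.build
    try {
      val response = client.execute(request)
      println("StatusCode - " + response.getStatusLine.getStatusCode)
    } finally client.close()
  }

  before {
    callOnce(new HttpPut("http://example.com/users/test-user")) // create user once
  }

  // Only the retrieval is part of the measured load test.
  val scn = scenario("Retrieve user")
    .exec(http("retrieve-user").get("/users/test-user"))

  setUp(scn.inject(atOnceUsers(10))).protocols(httpConf)

  after {
    callOnce(new HttpDelete("http://example.com/users/test-user")) // delete user when done
  }
}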

How to fetch json data to shared service and use it between two isolated controllers

I am trying to fetch data from the server and assign it to an array in a sharedScope factory. Next I want to simply inject the factory into my two separate controllers and use two-way data binding between them to operate on the same array.
The similar approach I want to achieve is very well described here:
https://stackoverflow.com/a/24913983/5195524
The data field has already been assigned and everything works fine. The problem starts when I want to fetch the data from the server first. What should I do to make the data immediately available in the controllers?
Your version is not working because of JavaScript variable scoping: instead of accessing the prototype chain you are creating a new variable in the success function. The easiest fix is:
app.factory("sharedScope", function($http) {
var self = this;
self.data = {};
init();
function init() {
$http.get('http://jsonplaceholder.typicode.com/posts').
success(function(response) {
self.data.text = response;
});
}
return self;
});
Here is a working plnkr: http://plnkr.co/edit/6gA7nt4cYwOWJGAuoLe5?p=preview
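To then share the data between the two isolated controllers, each controller can reference the same object from the factory (the controller names here are made up). Since self.data keeps its object identity, the fetched posts show up in both scopes once the $http call resolves:

app.controller('FirstCtrl', function($scope, sharedScope) {
  // Both controllers point at the same object instance from the factory
  $scope.shared = sharedScope.data;
});

app.controller('SecondCtrl', function($scope, sharedScope) {
  $scope.shared = sharedScope.data;
});

In the templates, bind to shared.text rather than copying the value, so the binding follows the shared object reference.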

Enable ChangeTracking In Child Objects Using STE

I'm using STE (self-tracking entities) and I want to enable change tracking for an object and its children. What I currently have to do is something like this.
int id = 1;

using (CustomerEntities context = new CustomerEntities())
{
    CustomerSection custSection = context.CustomerSections
        .Include("CustomerSections.Customers")
        .SingleOrDefault(p => p.ID == id);

    custSection.StartTracking();

    foreach (Customer cust in custSection.Customers)
    {
        cust.StartTracking();
    }

    return custSection;
}
What I am looking for is a way to automatically enable change tracking for the child objects too, without having to loop through each one and explicitly tell it to start tracking changes.
Thanks in advance for any insight.
Most probably you are using self-tracking entities in combination with WCF. In that case it's not necessary to enable the change tracking manually; this is already done for you. The T4 template that generates the STEs includes a method decorated with the [OnDeserialized] attribute which starts the tracking once entities are deserialized (which normally happens after they reach the client and are converted back into runtime class instances from the XML that WCF generated for transport). See the exact code example:
[OnDeserialized]
public void OnDeserializedMethod(StreamingContext context)
{
    IsDeserializing = false;
    ChangeTracker.ChangeTrackingEnabled = true;
}
Search your entities or the T4 template and you will find this soon.

Linq to SQL and concurrency with Rob Conery repository pattern

I have implemented a DAL using Rob Conery's spin on the repository pattern (from the MVC Storefront project) where I map database objects to domain objects using Linq and use Linq to SQL to actually get the data.
This is all working wonderfully, giving me the full control over the shape of my domain objects that I want, but I have hit a problem with concurrency that I thought I'd ask about here. I have concurrency working but the solution feels like it might be wrong (just one of those itchy feelings).
The basic pattern is:
private MyDataContext _dataContext;
private Table<Task> _tasks;

public Repository(MyDataContext dataContext)
{
    _dataContext = dataContext;
}

public IQueryable<Domain.Task> GetTasks()
{
    // Keep a reference to the data context's table so the original
    // entities stay tracked for later saves.
    _tasks = _dataContext.Tasks;

    return from t in _tasks
           select new Domain.Task
           {
               Name = t.Name,
               Id = t.TaskId,
               Description = t.Description
           };
}
public void SaveTask(Domain.Task task)
{
    Task dbTask = null;

    // Logic for new tasks omitted...
    dbTask = (from t in _tasks
              where t.TaskId == task.Id
              select t).SingleOrDefault();

    dbTask.Description = task.Description;
    dbTask.Name = task.Name;

    _dataContext.SubmitChanges();
}
So with that implementation I've lost concurrency tracking because of the mapping to the domain task. I get it back by storing the private Table which is my datacontext list of tasks at the time of getting the original task.
I then update the tasks from this stored Table and save what I've updated.
This is working - I get change conflict exceptions raised when there are concurrency violations, just as I want.
However, it just screams to me that I've missed a trick.
Is there a better way of doing this?
I've looked at the .Attach method on the datacontext but that appears to require storing the original version in a similar way to what I'm already doing.
I also know that I could avoid all this by doing away with the domain objects and letting the LINQ to SQL generated objects go all the way up my stack, but I dislike that just as much as I dislike the way I'm handling concurrency.
I worked through this and found the following solution. It works in all the test cases I (and more importantly, my testers!) can think of.
I am using the .Attach() method on the datacontext, and a TimeStamp column. This works fine the first time you save a particular primary key back to the database, but on subsequent saves I found that the datacontext throws a System.Data.Linq.DuplicateKeyException: "Cannot add an entity with a key that is already in use."
The workaround I created was to add a dictionary that stores the item I attach the first time around; every subsequent time I save, I reuse that item.
Example code is below. I do wonder if I've missed any tricks; concurrency is pretty fundamental, so the hoops I'm jumping through seem a little excessive.
Hopefully the below proves useful, or someone can point me towards a better implementation!
private Dictionary<int, Payment> _attachedPayments;

public void SavePayments(IList<Domain.Payment> payments)
{
    Dictionary<Payment, Domain.Payment> savedPayments =
        new Dictionary<Payment, Domain.Payment>();

    // Items with a zero id are new
    foreach (Domain.Payment p in payments.Where(p => p.PaymentId != 0))
    {
        // The list of attached payments that works around the linq datacontext
        // duplicatekey exception
        if (_attachedPayments.ContainsKey(p.PaymentId)) // Already attached
        {
            Payment dbPayment = _attachedPayments[p.PaymentId];

            // Just a method that maps domain to datacontext types
            MapDomainPaymentToDBPayment(p, dbPayment, false);
            savedPayments.Add(dbPayment, p);
        }
        else // Attach this payment to the datacontext
        {
            Payment dbPayment = new Payment();
            MapDomainPaymentToDBPayment(p, dbPayment, true);
            _dataContext.Payments.Attach(dbPayment, true);
            savedPayments.Add(dbPayment, p);
        }
    }

    // There is some code snipped but this is just brand new payments
    foreach (var payment in newPayments)
    {
        Domain.Payment payment1 = payment;
        Payment newPayment = new Payment();
        MapDomainPaymentToDBPayment(payment1, newPayment, false);
        _dataContext.Payments.InsertOnSubmit(newPayment);
        savedPayments.Add(newPayment, payment);
    }

    try
    {
        _dataContext.SubmitChanges();

        // Grab the Timestamp into the domain object
        foreach (Payment p in savedPayments.Keys)
        {
            savedPayments[p].PaymentId = p.PaymentId;
            savedPayments[p].Timestamp = p.Timestamp;
            _attachedPayments[savedPayments[p].PaymentId] = p;
        }
    }
    catch (ChangeConflictException ex)
    {
        foreach (ObjectChangeConflict occ in _dataContext.ChangeConflicts)
        {
            Payment entityInConflict = (Payment) occ.Object;

            // Use the datacontext refresh so that I can display the new values
            _dataContext.Refresh(RefreshMode.OverwriteCurrentValues, entityInConflict);
            _attachedPayments[entityInConflict.PaymentId] = entityInConflict;
        }
        throw;
    }
}
I would look at trying to utilise the .Attach method by passing the 'original' and 'updated' objects, thus achieving true optimistic concurrency checking from LINQ to SQL. In my opinion this would be preferable to using version or datetime stamps either in the DBML objects or your domain objects. I'm not sure how MVC allows for this idea of persisting the 'original' data, however. I've been trying to investigate the validation scaffolding in the hope that it stores the 'original' data, but I suspect it is only as good as the most recent post (and/or failed validation), so that idea may not work.
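For illustration, a rough sketch of what that Attach overload could look like in the repository above (the mapping helper and parameter shapes are assumed, mirroring the earlier naming):

public void SaveTask(Domain.Task updated, Domain.Task original)
{
    // Map both versions back to LINQ to SQL entities with the same primary key.
    Task dbOriginal = MapDomainTaskToDBTask(original); // values as originally loaded
    Task dbUpdated  = MapDomainTaskToDBTask(updated);  // values after the user's edits

    // Attach(updated, original): LINQ to SQL treats the entity as modified and
    // uses the original values for its optimistic concurrency check on update.
    _dataContext.Tasks.Attach(dbUpdated, dbOriginal);

    // Throws ChangeConflictException if the row changed underneath us.
    _dataContext.SubmitChanges();
}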
Another crazy idea I had was this: override GetHashCode() for all of your domain objects so that the hash represents the unique set of data for that object (minus the ID, of course). Then, either manually or with a helper, bury that hash in a hidden field in the HTML POST form and send it back to your service layer with your updated domain object. Do the concurrency checking in your service layer or data layer (by comparing the original hash with a freshly loaded domain object's hash), but be aware that you then need to check for and raise concurrency exceptions yourself. It's nice to use the DBML functions, but the point of abstracting away the data layer is to not depend on a particular implementation's features. So having full control of the optimistic concurrency checking on your domain objects in your service layer (for example) seems like a good approach to me.
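A toy version of that hash idea might look like this (the property names are invented, and the equality override is omitted for brevity); the comparison is then done manually in the service layer:

// In the Domain namespace
public partial class Task
{
    // Hash over the editable data only; the ID is deliberately excluded.
    public override int GetHashCode()
    {
        unchecked
        {
            int hash = 17;
            hash = hash * 23 + (Name == null ? 0 : Name.GetHashCode());
            hash = hash * 23 + (Description == null ? 0 : Description.GetHashCode());
            return hash;
        }
    }
}

// Service layer: compare the hash posted with the form against a fresh load.
// if (postedHash != repository.GetTask(id).GetHashCode())
//     throw new ChangeConflictException("The task was modified by someone else.");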