Is it possible to make use of a hook like restrictToOwner inside another hook? For example, I generally want users to only be able to update their own information, but I also want them to have access to specific properties of other users. So if a specific query parameter exists, let's say a comment about a user, I want the request to pass, and otherwise I want to fall back to the restrictToOwner hook. Of course I could just write my own equivalent of restrictToOwner, but I would like to reuse the existing one if possible. The code below isn't working, but I want something like:
module.exports = function(options) {
  return function(hook) {
    if (typeof hook.data.comment !== 'undefined')
      return hook;
    return auth.restrictToOwner({ ownerField: '_id' });
  };
};
Another similar thing I want to do is to execute a hook only if the request is an external call. My internal script should have unlimited access. Something like:
// user/hooks/index.js
exports.before = {
  patch: [
    globalHooks.ifExternal(auth.restrictToOwner({ ownerField: '_id' }))
  ]
};
// hooks/index.js
exports.ifExternal = function(func) {
  return function(hook) {
    if (typeof hook.params.provider === 'undefined') // internal?
      return hook;
    return func;
  };
};
Thanks in advance!
auth.restrictToOwner is a function that returns another function that consumes a hook object.
So you need to call it like this:
return auth.restrictToOwner({ ownerField: '_id' })(hook);
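Putting that together, your custom hook could look something like this rough sketch (assuming the legacy feathers-authentication hooks are what you have imported as auth):
const auth = require('feathers-authentication').hooks;

module.exports = function(options) {
  return function(hook) {
    // Let the call through if only a comment is being set.
    if (typeof hook.data.comment !== 'undefined') {
      return hook;
    }
    // Otherwise delegate to restrictToOwner by invoking the hook function it returns.
    return auth.restrictToOwner({ ownerField: '_id' })(hook);
  };
};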
To your second question:
if (!hook.params.provider) {
// internal only stuff here.
}
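Applied to your ifExternal wrapper, that check (plus actually invoking the wrapped hook function) might look like this sketch:
// hooks/index.js
exports.ifExternal = function(func) {
  return function(hook) {
    if (!hook.params.provider) {
      // Internal call: skip the wrapped hook entirely.
      return hook;
    }
    // External call: invoke the wrapped hook with the hook object.
    return func(hook);
  };
};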
BEFORE ANYONE MARKS THIS AS DUPLICATE, I AM AWARE OF:
How to import CSV or JSON to firebase cloud firestore
As in the question above, I am trying to send JSON to Firestore using a Google Cloud Function, allowing me to convert my Realtime Database to Firestore. I have been using the script provided as an answer by Maciej Kaputa in the question linked above. The script he provides is as follows:
const admin = require('../functions/node_modules/firebase-admin');
const serviceAccount = require("./service-key.json");

admin.initializeApp({
  credential: admin.credential.cert(serviceAccount),
  databaseURL: "https://<your-database-name>.firebaseio.com"
});

const data = require("./fakedb.json");
/**
 * Data is a collection if
 * - it has an odd depth
 * - it contains only objects or contains no objects.
 */
function isCollection(data, path, depth) {
  if (
    typeof data != 'object' ||
    data == null ||
    data.length === 0 ||
    isEmpty(data)
  ) {
    return false;
  }

  for (const key in data) {
    if (typeof data[key] != 'object' || data[key] == null) {
      // If there is at least one non-object item in the data, then it cannot be a collection.
      return false;
    }
  }

  return true;
}
// Checks if object is empty.
function isEmpty(obj) {
  for (const key in obj) {
    if (obj.hasOwnProperty(key)) {
      return false;
    }
  }
  return true;
}
async function upload(data, path) {
  return await admin.firestore()
    .doc(path.join('/'))
    .set(data)
    .then(() => console.log(`Document ${path.join('/')} uploaded.`))
    .catch(() => console.error(`Could not write document ${path.join('/')}.`));
}
/**
 * Recursively resolves the given data into documents and
 * collections and uploads each document to Firestore.
 */
async function resolve(data, path = []) {
  if (path.length > 0 && path.length % 2 == 0) {
    // A document's path length is always even; however, one of the keys can actually be a collection.

    // Copy an object.
    const documentData = Object.assign({}, data);

    for (const key in data) {
      // Resolve each collection and remove it from the document data.
      if (isCollection(data[key], [...path, key])) {
        // Remove a collection from the document data.
        delete documentData[key];
        // Resolve a collection.
        resolve(data[key], [...path, key]);
      }
    }

    // If the document is empty then it means it only consisted of collections.
    if (!isEmpty(documentData)) {
      // Upload a document free of collections.
      await upload(documentData, path);
    }
  } else {
    // A collection's path length is always odd.
    for (const key in data) {
      // Resolve each collection.
      await resolve(data[key], [...path, key]);
    }
  }
}
resolve(data);
However, this does not seem to work for me. I have set up my Firebase CLI in the usual way and checked that functions can be deployed to my project by running the standard "Hello world" function successfully. It is not a problem with my JSON, as I have run it through an online validator. Through some research, I found that eslint ecmaVersion 6 does not allow async functions, which can be resolved by changing the ecmaVersion in the .eslintrc.json file to "ecmaVersion": 2017. Next, you have to delete some rules from the .eslintrc.json, namely "Require the use of === and !==", "Disallow null comparisons without type-checking operators", and "Disallow await inside of loops".
With these modifications, when I run firebase deploy --only functions from my functions folder in the terminal, I am told that the function was deployed and given a link to my Firebase console. However, my function doesn't appear in the functions tab of Firebase, nor is my data uploaded to Cloud Firestore. Does anyone know what I am doing wrong? Could it be that I am not calling exports on the function? Or is that only used for prebuilt Firebase functions?
That script is not a Cloud Function; it is intended to run as a standalone Node.js script, with data being whatever data from your Realtime Database you want to end up in Cloud Firestore.
If you want to use it as a Cloud Function, you'll need to set some exports and use the data passed to them as the value of data. Note that this will only let you transfer the data that was changed/read. You'll likely want to run it against the entire DB at least once first (hence why it's a standalone script).
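For completeness, a very rough sketch of wrapping it as an HTTP Cloud Function (the importToFirestore name is made up; this assumes the helpers above live in your functions code and that firebase-functions is installed):
const functions = require('firebase-functions');

// Hypothetical HTTP-triggered function; hit its URL once to run the import.
exports.importToFirestore = functions.https.onRequest(async (req, res) => {
  try {
    // Use JSON posted in the request body, or fall back to the bundled file.
    const payload = Object.keys(req.body || {}).length ? req.body : data;
    await resolve(payload);
    res.status(200).send('Import finished.');
  } catch (err) {
    console.error(err);
    res.status(500).send('Import failed.');
  }
});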
I am trying to learn ReactJS with ES6 along with setting up an instance of Fixed-Data-Table. I'm using the ObjectDataExample example from the github repo, but instead of the faker() values fed to the DataListStore, I want to use a DataListStore that gets its cache from a remote JSON resource. This is how I have defined my DataListStore:
class MyDataListStore {
  constructor(/* url string */ url) {
    this.url = url || 'http://localhost:8080/default-json';
    this._cache = [];
    this.pageSize = 1;
    this.size = 0;
    this.getRemoteData(url);
  }

  getRemoteData() {
    /**
     * Fetch remote JSON to be used in the store.
     */
    var that = this;
    fetch(this.url).then(function(response) {
      return response.json();
    }).then(function(j) {
      console.log(j);
      //this.pageSize = j["pages"];
      that.size = j["total"];
      that._cache = j["table"];
      if (that._cache) {
        // do something here?
      }
    });
  }

  getObjectAt(/*number*/ index) /*?object*/ {
    if (index < 0 || index > this.size) {
      return undefined;
    }
    if (this._cache[index] === undefined) {
      //this._cache[index] = this.createFakeRowObjectData(index);
    }
    return this._cache[index];
  }

  getSize() {
    return this.size;
  }
}

module.exports = MyDataListStore;
As you can see, I'm more or less following the FakeObjectDataListStore provided with the fixed-data-table example. The JSON is fetched properly, the _cache is populated with an array of objects, and when you output getSize once getRemoteData has executed, you do get the size of the _cache. However, I haven't figured out how my fixed-data-table Table component should be updated once the data has been fetched. Currently the Table is rendered but is simply blank with no rows.
class ObjectDataExample extends React.Component {
  constructor(props) {
    super(props);

    this.state = {
      dataList: new MyDataListStore()
    };
  }

  render() {
    var {dataList} = this.state;
    return (
      <Table
        rowHeight={70} rowsCount={dataList.getSize()} width={1170} height={500} headerHeight={30}>
        <Column
          header={<Cell>ID</Cell>}
          cell={<TextCell data={dataList} col="id" />}
          width={50}
          fixed={true}
        />
        <Column
          header={<Cell>Email</Cell>}
          cell={<TextCell data={dataList} col="email" />}
          width={300}
          fixed={true}
        />
      </Table>
    );
  }
}

module.exports = ObjectDataExample;
I think the main issue is that I don't have any code meant to populate the table once MyDataListStore is populated with the data from the async call. However, I can't find any help from the examples given in the Fixed-Data-Table github repo or the docs. Any idea how to get this done? I assume I need to set up some sort of event listener, but I'm not sure where/how to do this, as I'm still new to both ReactJS and Fixed-Data-Table.
Edit: I should also add that when the page loads, I get the following error:
Uncaught TypeError: Cannot read property 'id' of undefined
once I set the initial this.size to more than 0. So of course the table doesn't have the available data when it's first loading.
Edit 2: After looking into this further, it looks like if I run the fetch in componentDidMount of my ObjectDataExample and use this.setState(); to reset the dataList object, then I get the table updated. However, this looks a little messy and I'd assume there's a better way to do this directly from my MyDataListStore object.
Thanks,
One design issue with the current implementation of MyDataListStore is that it does not provide a way to notify the caller when the data has been loaded.
One possible way you might do this is to implement some sort of factory function (in the example below, I'm pretending that one exists called MyDataListStore.of) that returns a Promise that eventually resolves the MyDataListStore instance once the data loads:
// In the ObjectDataExample constructor we call the MyDataListStore
// factory function and, once it resolves, we assign the store to our
// state. This will cause our component to re-render.
constructor(props) {
  super(props);
  this.state = { dataList: null }; // render should guard against a null store until the data arrives

  MyDataListStore.of(myDataListStoreUrl).then(store => {
    this.setState({ dataList: store });
  });
}
Now, once the data in the data list store resolves, our template (specified in your render function) will render correctly; until then, render should handle the null dataList (for example by rendering nothing).
The MyDataListStore.of function we used above might look something like this:
class MyDataListStore {
  static of(url) {
    const dataListStore = new MyDataListStore(url);
    return dataListStore.getRemoteData().then(() => dataListStore);
  }

  /* ... other MyDataListStore properties/methods ... */
}
And finally we need to update getRemoteData to return a promise. This is what allows any client of our MyDataListStore class to be notified when the data has loaded:
getRemoteData() {
  /**
   * Fetch remote JSON to be used in the store.
   */
  var that = this;
  // Return the chained promise! This promise will resolve
  // after our last callback is called.
  return fetch(this.url).then(function(response) {
    return response.json();
  }).then(function(j) {
    console.log(j);
    //this.pageSize = j["pages"];
    that.size = j["total"];
    that._cache = j["table"];
    if (that._cache) {
      // do something here?
    }
  });
}
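Since dataList starts out as null in the constructor sketch above, the render method also needs a small guard before calling getSize(); something like this (an illustrative addition, not part of the original code):
render() {
  var {dataList} = this.state;
  if (!dataList) {
    // Data hasn't loaded yet; render nothing (or a spinner).
    return null;
  }
  // ... the existing <Table> markup from the question goes here ...
}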
I know this has something to do with using $q and promises, but I've been at it for hours and still can't quite figure out how it's supposed to work with my example.
I have a .json file with the data I want: a list of people with ids. I want a service or factory I can call with a parameter that will $http.get the JSON file I have, filter it based on the parameter, and send the result back to my controller.
angular
  .module("mainApp")
  .controller('personInfoCtrl', ['$scope', '$stateParams', 'GetPersonData', function($scope, $stateParams, GetPersonData) {
    $scope.personId = $stateParams.id; // this part works great

    $scope.fullObject = GetPersonData($stateParams.id);
    // I'm having trouble getting ^^^ to work.
    // I'm able to do
    //   GetPersonData($stateParams.id).success(function(data)
    //     { $scope.fullObject = data; });
    // and I can filter it inside of that object, but I want to filter it in the factory/service
  }]);
Inside my main.js I have
//angular.module(...
//..a bunch of urlrouterprovider and stateprovider stuff that works
//
}]).service('GetPersonData', ['$http', function($http) {
  return function(id) {
    return $http.get('./data/people.json').then(function(res) {
      // I know the problem lies in it not 'waiting' for the data to get back
      // before it returns an empty json (or empty something or other)
      return res.data.filter(function(el) { return el.id == id)
    });
  }
}]);
The syntax of the filtering and everything works great when it's all in the controller, but I want to use the same code in several controls, so I'm trying to break it out to a service (or factory, I just want the controllers to be 'clean' looking).
I really want to be able to inject "GetPersonData" into a controller, then call GetPersonData(personId) to get back the JSON.
There seems to be a syntax issue in the filter function in your service (a missing closing brace):
.service('GetPersonData', ['$http', function($http) {
  return function(id) {
    return $http.get('./data/people.json').then(function(res) {
      return res.data.filter(function(el) { return el.id == id });
    });
  };
}]);
But regarding the original issue: you cannot access a success property on the $q promise you are returning from your function, because no such property exists there. It exists only on the promise returned directly by $http. So you just need to use then to chain it through in your controller:
GetPersonData($stateParams.id).then(function(data){ $scope.fullObject = data; });
If you were to return $http.get('./data/people.json') directly from your service, then you would see $http's custom promise methods success and error.
I need to get some JSON data from the server with Angular, let's say the user data. I created a service like this:
app.service('User', function($http) {
  return $http({method: 'GET', url: '/current_user'});
});
And in my controller:
app.controller('SomeCtrl', function($scope, User) {
  User.success(function(data) {
    $scope.user = data;
  });
});
That works just fine but what if I want to add some methods to the user? For instance in the view I would like to do {{user.isAdmin()}}. Is it the correct approach? Where can I add those methods?
If you wanted your service to always return an object with this method, do something like this:
app.service('User', function($http) {
  return $http({method: 'GET', url: '/current_user'})
    .then(function(response) {
      response.data.isAdmin = function() { return true; };
      return response.data;
    });
});
Now any future code that references this promise and uses .then() will retrieve the new object. Take a look at the promise documentation for more information.
http://docs.angularjs.org/api/ng.$q
Keep in mind that by using then on an httpPromise, it is converted to a normal promise; you no longer have the convenience methods success and error.
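For illustration (assuming an AngularJS 1.x version where success/error still exist on the $http promise):
var httpPromise = $http.get('/current_user'); // has .success() and .error()
var plainPromise = httpPromise.then(function(response) {
  return response.data;
}); // a regular $q promise: only .then() / .catch() / .finally()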
It may be better practice to create a class for the object you are returning, with a constructor function that takes the data object and assigns the appropriate properties (or extends the instance). This way you can simply do something like
return new User(val);
And you will get all of the methods you want (with a prototype, etc).
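For example, a minimal sketch of that idea; the User constructor and the role field are illustrative, not part of the original code:
// Hypothetical User constructor: data comes from the server,
// methods live on the prototype.
function User(data) {
  angular.extend(this, data); // copy server fields onto the instance
}

User.prototype.isAdmin = function() {
  return this.role === 'admin'; // assumes the server returns a `role` field
};

// Then, inside the service's then() callback shown above:
// return new User(response.data);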
You can do this in a few ways in the service you created:
Start using $resource and use a transform on the response:
http://jsfiddle.net/roadprophet/prtAP/
...
transformResponse: function (data, headers) {
  data = {};
  data.coolThing = 'BOOM-SHAKA-LAKA';
  return data;
}
...
I recommend this method because it scales more cleanly thanks to the use of $resource.
Set up a transformResponse with $http:
http://jsfiddle.net/roadprophet/bPfcz/
Use your own promise that resolves after the GET promise resolves, but with the mapped data. This is probably the most manual way to handle it, since it requires you to manage multiple promises.
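A rough sketch of that last, manual option, reusing the illustrative isAdmin mapping from above:
app.service('User', function($http, $q) {
  var deferred = $q.defer();

  $http({method: 'GET', url: '/current_user'}).then(function(response) {
    // Map the raw data before resolving our own promise.
    response.data.isAdmin = function() { return true; };
    deferred.resolve(response.data);
  }, function(error) {
    deferred.reject(error);
  });

  return deferred.promise;
});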
I returned mongoose docs as JSON in this way:
UserModel.find({}, function (err, users) {
  return res.end(JSON.stringify(users));
});
However, user.__proto__ was also returned. How can I return without it? I tried this but not worked:
UserModel.find({}, function (err, users) {
  return res.end(users.toJSON()); // has no method 'toJSON'
});
You may also try Mongoose's lean():
UserModel.find().lean().exec(function (err, users) {
  return res.end(JSON.stringify(users));
});
Late answer but you can also try this when defining your schema.
/**
 * toJSON implementation
 */
schema.options.toJSON = {
  transform: function(doc, ret, options) {
    ret.id = ret._id;
    delete ret._id;
    delete ret.__v;
    return ret;
  }
};
Note that ret is the JSON'ed object, not an instance of the mongoose model. You'll operate right on the plain object hash, without getters/setters.
And then:
Model
  .findById(modelId)
  .exec(function (dbErr, modelDoc) {
    if (dbErr) return handleErr(dbErr);
    return res.send(modelDoc.toJSON(), 200);
  });
Edit: Feb 2015
Because I didn't provide a solution for the missing toJSON (or toObject) method(s), I will explain the difference between my usage example and the OP's usage example.
OP:
UserModel
  .find({}) // will get all users
  .exec(function(err, users) {
    // supposing that we don't have an error
    // and we had users in our collection,
    // the users variable here is an array
    // of mongoose instances;

    // wrong usage (from OP's example)
    // return res.end(users.toJSON()); // has no method toJSON

    // correct usage
    // to apply the toJSON transformation on instances, you have to
    // iterate through the users array
    var transformedUsers = users.map(function(user) {
      return user.toJSON();
    });

    // finish the request
    res.end(JSON.stringify(transformedUsers));
  });
My Example:
UserModel
  .findById(someId) // will get a single user
  .exec(function(err, user) {
    // handle the error, if any
    if (err) return handleError(err);

    if (null !== user) {
      // user might be null if no user matched
      // the given id (someId)

      // the toJSON method is available here,
      // since the user variable here is a
      // mongoose model instance
      return res.end(JSON.stringify(user.toJSON()));
    }
  });
First of all, try toObject() instead of toJSON() maybe?
Secondly, you'll need to call it on the actual documents and not the array, so maybe try something more annoying like this:
var flatUsers = users.map(function(user) {
  return user.toObject();
});

return res.end(JSON.stringify(flatUsers));
It's a guess, but I hope it helps
model.find({ Branch: branch }, function (err, docs) {
  if (err) return res.send(err);
  res.send(JSON.parse(JSON.stringify(docs)));
});
I found out I made a mistake. There's no need to call toObject() or toJSON() at all. The __proto__ in the question came from jQuery, not mongoose. Here's my test:
UserModel.find({}, function (err, users) {
  console.log(users[0].save); // { [Function] numAsyncPres: 0 }
  var json = JSON.stringify(users);

  users = users.map(function (user) {
    return user.toObject();
  });

  console.log(users[0].save); // undefined
  console.log(json == JSON.stringify(users)); // true
});
doc.toObject() removes doc.prototype from a doc. But it makes no difference in JSON.stringify(doc). And it's not needed in this case.
Maybe a bit of a tangent, but if anyone is looking to do the other way around, you can use Model.hydrate() (since mongoose v4) to convert a plain JavaScript object (JSON) into a mongoose document.
A useful case is when you are using Model.aggregate(...). Because it actually returns plain JS objects, you may want to convert the results into mongoose documents in order to get access to the model's methods (e.g. a virtual property defined in the schema).
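For example, a minimal sketch (the $match stage and handleError are illustrative):
UserModel.aggregate([{ $match: { active: true } }]).exec(function (err, results) {
  if (err) return handleError(err);

  // aggregate() returns plain JS objects, so hydrate them back into
  // mongoose documents to regain schema methods and virtuals.
  var docs = results.map(function (obj) {
    return UserModel.hydrate(obj);
  });

  // docs[0] is now a full mongoose document again.
});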
PS. I expected there to already be a thread like "Convert JSON to Mongoose docs", but there isn't, and since I had already found the answer, it didn't seem right to post and answer my own question.
You can use res.json() to send any object as JSON.
lean() makes the mongoose query return plain JavaScript objects instead of full mongoose documents:
UserModel.find().lean().exec(function (err, users) {
  return res.json(users);
});
It worked for me:
Products.find({}).then(a => console.log(a.map(p => p.toJSON())))
Also, if you want to use getters, you should add that option as well when defining the schema:
new mongoose.Schema({...}, {toJSON: {getters: true}})
Try these options:
UserModel.find({}, function (err, users) {
  // I got errors, so I changed to res.send()
  return res.send(JSON.parse(JSON.stringify(users)));
  // Or:
  // return JSON.parse(JSON.stringify(users));
});
Was kinda laughing at how cumbersome this was for a second, given that this must be extremely common.
Did not bother digging in the docs and hacked this together instead.
const data = await this.model.logs.find({ "case_id": { $regex: /./, $options: 'i' } });
let res = data.map(e => e._doc);

res.forEach(element => {
  // del unwanted data
  delete element._id;
  delete element.__v;
});

return res;
First I get all docs which have any value at all for the case_id field (effectively, all docs in the collection).
Then I get the actual data out of each mongoose document via Array.map.
Finally I remove the unwanted props by mutating each object directly.