FeathersJS: my hook is failing (losing context) once I chain it with another in a before hook - feathersjs

I have two functions (authenticate, restrictAccess) in the before hook (get) and I want to chain them together. But restrictAccess is being executed twice (and by the second run it has lost the context).
When I remove authenticate, restrictAccess works as expected.
Here is my hook:
module.exports = {
  before: {
    all: [],
    get: [authenticate('jwt'), restrictAccess()],
    ....
But when I remove authenticate like this:
module.exports = {
  before: {
    all: [],
    get: [restrictAccess()],
    ....
restrictAccess works as expected.

The only reason I can think of is that you may be using these hooks on the users service.
If that is the case, the authenticate hook probably calls the user service internally to set params.user to the requesting user, which would cause the second call to the restrictAccess hook.
A possible fix is to ignore all internal calls in your restrictAccess hook (internal calls have no params.provider set):
module.exports = context => {
  if (context.params && context.params.provider) {
    // put restriction logic here...
  }
  return context;
};
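One detail worth noting: since the hook is registered as restrictAccess(), the export should really be a factory that returns the hook function. A minimal sketch under that assumption, with a hypothetical "a user may only get their own record" restriction (the field names and the Forbidden error from @feathersjs/errors are illustrative, not from the original post):
const { Forbidden } = require('@feathersjs/errors');

module.exports = () => context => {
  // Internal calls (such as the user lookup made by the authenticate hook)
  // have no params.provider, so they bypass the restriction entirely.
  if (context.params && context.params.provider) {
    // Hypothetical restriction: a user may only get their own record.
    const user = context.params.user;
    if (!user || String(user._id) !== String(context.id)) {
      throw new Forbidden('You are not allowed to access this record');
    }
  }
  return context;
};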

Related

Nextjs/Vercel error: only absolute URLs are supported. Where is the '.json' in my route params coming from?

The Problem:
I'm new to Next.js (1 month) and Vercel (1 day), and between them something appears to be inserting .json into my URLs on the search route, causing them to fail with this error:
[GET] /_next/data/9MJcw6afNEM1L-eax6OWi/search/hand.json?term=hand
10:21:52:87
Function Status:None
Edge Status:500
Duration:292.66 ms
Init Duration: 448.12 ms
Memory Used:88 MB
ID:fra1:fra1::ldzhz-1644484912454-0a30b71b6c90
User Agent:Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:97.0) Gecko/20100101 Firefox/97.0
TypeError: Only absolute URLs are supported
at getNodeRequestOptions (/var/task/node_modules/next/dist/compiled/node-fetch/index.js:1:61917)
at /var/task/node_modules/next/dist/compiled/node-fetch/index.js:1:63448
at new Promise (<anonymous>)
at Function.fetch [as default] (/var/task/node_modules/next/dist/compiled/node-fetch/index.js:1:63382)
at fetchWithAgent (/var/task/node_modules/next/dist/server/node-polyfill-fetch.js:38:39)
at getServerSideProps (/var/task/.next/server/chunks/730.js:238:28)
at Object.renderToHTML (/var/task/node_modules/next/dist/server/render.js:566:26)
at processTicksAndRejections (internal/process/task_queues.js:95:5)
at async doRender (/var/task/node_modules/next/dist/server/base-server.js:855:38)
at async /var/task/node_modules/next/dist/server/base-server.js:950:28
2022-02-10T09:21:53.788Z 994c9544-0bbe-4a68-af83-f0e4c322151e ERROR
Error: Your `getServerSideProps` function did not return an object. Did you forget to add a `return`?
at Object.renderToHTML (/var/task/node_modules/next/dist/server/render.js:592:19)
at processTicksAndRejections (internal/process/task_queues.js:95:5)
at async doRender (/var/task/node_modules/next/dist/server/base-server.js:855:38)
at async /var/task/node_modules/next/dist/server/base-server.js:950:28
at async /var/task/node_modules/next/dist/server/response-cache.js:63:36 { page: '/search/[term]'}
RequestId: 994c9544-0bbe-4a68-af83-f0e4c322151e Error: Runtime exited with error: exit status 1
Runtime.ExitError
The browser, however, shows https://.../search/hand as it should.
Nothing like this happens on my local build, which works perfectly well.
Background/Code Snippets:
The search route is the only route that uses SSR, and it is also the only route with this issue. It is a dynamic route, so it seems that either Next.js in production or Vercel expects some kind of JSON for it (presumably pre-rendered content) and is replacing the route URL with a JSON one.
I also had to use the VERCEL_URL environment variable to build the URL for fetch requests, so this may be what is mangling the URL, but the .json in the error message makes me think otherwise, since the search route should not be pre-rendered.
The page structure for the search route (Index.js imports the component from [term].js and defines its own getServerSideProps to accommodate a search route without a param):
|- Search
   |- [term].js
   |- Index.js
The code for [term].js:
...
export default function Search({ results, currentSearch }) {
  ...
}

export async function getServerSideProps(req) {
  const { criteria, page } = req.query;
  const { term } = req.params || { term: '' };
  try {
    const data = await fetch(`${process.env.VERCEL_URL}/api/search/${term}?criteria=${criteria || 'name'}&page=${page}`);
    const searchRes = await data.json();
    return {
      props: {
        results: searchRes.data,
        currentSearch: searchRes.query
      }
    };
  } catch (e) {
    console.log(e);
  }
}
Index.js is similar:
import Search from "./[term]";

export default Search;

export async function getServerSideProps(req) {
  const { criteria, page } = req.query;
  const { term } = req.params || { term: '' };
  if (!term) {
    return {
      props: {
        results: [],
        currentSearch: {}
      }
    };
  }
  try {
    const data = await fetch(`${process.env.VERCEL_URL}/api/search/${term}?criteria=${criteria || 'name'}&page=${page}`);
    const searchRes = await data.json();
    return {
      props: {
        results: searchRes.data,
        currentSearch: searchRes.query
      }
    };
  } catch (e) {
    console.log(e);
  }
}
The API I'm fetching from is confirmed to be working, so this problem is strictly about the pages, or about .json being passed to the fetch call from the router params.
It turns out that VERCEL_URL is actually not an absolute URL: it does not include a protocol. I had to deploy console.log statements to find this. I'm a little embarrassed that I missed it in the docs.
The .json was not actually in the query or params, and therefore not in the fetch request. The fetch failed because the URL had no protocol.
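A minimal sketch of the fix inside getServerSideProps, assuming the deployment is always served over HTTPS (the localhost fallback is illustrative):
// VERCEL_URL is only a host, e.g. my-app.vercel.app, so prepend a protocol
const baseUrl = process.env.VERCEL_URL
  ? `https://${process.env.VERCEL_URL}`
  : 'http://localhost:3000';

const data = await fetch(`${baseUrl}/api/search/${term}?criteria=${criteria || 'name'}&page=${page}`);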
The .json in the page URL must come from Next's internal operations and does not mean the page is being built ahead of time. Yes, the page is rendered using some JSON, but my assumption that the JSON indicates a pre-rendered page (SSG/ISR) was wrong. Server-side rendering evidently also uses such JSON, but only at runtime, when the request is made.
The .json appended after the params slug in the GET requests for a page has no bearing on the internal flow of your app, provided it is working correctly. If you see it in error messages, know that it comes from Next and examine other parts of the code at the point of failure.
The page structure I attempted ([param].js + index.js in the same directory) is fine, which is why my local build worked properly.
I want to delete this question because the solution is essentially one that a thorough look at the docs would have revealed, but at the same time I think the mistake itself is an easy one to make, and some of the conclusions listed above (particularly the one about JSON being used on all Next routes) could save debugging time for some new users of Next/Vercel.

redux-saga takeLeading action plus additional parameter

I've implemented a redux-saga effect, takeLeading, that ignores subsequent actions while the saga is currently running:
export const takeLeading = (patternOrChannel, saga, ...args) => fork(function* () {
  while (true) {
    const action = yield take(patternOrChannel);
    yield call(saga, ...args.concat(action));
  }
});
I use this for API fetching in my application, where each endpoint in my API has its own action type. For GET methods it's useful to block if the same request has already been dispatched somewhere else in the app. The saga looks like:
return function* () {
  yield all([takeLeading(GET_USER_ID, callApiGen), takeLeading(GET_WIDGET_ID, callApiGen)]);
};
The obvious problem is that if I want to get two different user IDs, the second request will block because it, too, has action type GET_USER_ID. Short of making a different action for each possible parameter, is there a way to implement some takeLeadingForFunc(<action>, (action) => <id>, saga) that keeps the concise format of one effect per request type, but does not block when the <id> is different? I tried wrapping takeLeading with takeEvery to implement something, but couldn't quite get it to work.
EDIT:
I got something like this to work:
export const takeLeadingForFunc = (f) => (patternOrChannel, saga, ...args) => fork(function* () {
  let takeLeadings = {};
  while (true) {
    const action = yield take(patternOrChannel);
    if (!(f(action) in takeLeadings)) {
      yield call(saga, ...args.concat(action));
      takeLeadings[f(action)] = yield takeLeading(
        (ac) => f(ac) === f(action) && ac.type === action.type,
        saga,
        ...args
      );
    }
  }
});
It takes an extractor function f that should return a primitive. This feels kind of hacky, so I was wondering if there's a more idiomatic way to do it.
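For illustration, a hedged usage sketch of the wrapper above, assuming each action carries the requested id as action.payload.id (a hypothetical action shape):
// One watcher per endpoint; calls are deduplicated per id, so
// GET_USER_ID for user 1 and user 2 can run concurrently, while a
// second GET_USER_ID for user 1 is still ignored.
const takeLeadingById = takeLeadingForFunc(action => action.payload.id);

return function* () {
  yield all([
    takeLeadingById(GET_USER_ID, callApiGen),
    takeLeadingById(GET_WIDGET_ID, callApiGen)
  ]);
};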

How to parse or stringify JSON in an asynchronous way in JavaScript

I see that JSON.stringify and JSON.parse are both synchronous.
I would like to know if there is a simple npm library that does this in an asynchronous way.
Thank you.
You can make anything "asynchronous" by using Promises:
function asyncStringify(str) {
  return new Promise((resolve, reject) => {
    resolve(JSON.stringify(str));
  });
}
Then you can use it like any other promise:
asyncStringify(str).then(ajaxSubmit);
Note that because the code is not actually asynchronous, the promise resolves right away: stringifying JSON involves no system call or I/O that could be awaited, so the CPU work still runs synchronously before the promise settles.
You can also use the async/await API if your platform supports it:
async function asyncStringify(str) {
  return JSON.stringify(str);
}
Then you can use it the same way:
asyncStringify(str).then(ajaxSubmit);
// or use the "await" API
const strJson = await asyncStringify(str);
ajaxSubmit(strJson);
Edit: one way of adding truly asynchronous parsing/stringifying (maybe because we're parsing something too complex) is to hand the job to another process (or service) and wait for the response.
You can do this in many ways (such as creating a separate service that exposes a REST API); here I will demonstrate message passing between processes:
First, create a file that takes care of the parsing/stringifying. Call it async-json.js for the sake of the example:
// async-json.js
function stringify(value) {
  return JSON.stringify(value);
}

function parse(value) {
  return JSON.parse(value);
}

process.on('message', function (message) {
  let result;
  if (message.method === 'stringify') {
    result = stringify(message.value);
  } else if (message.method === 'parse') {
    result = parse(message.value);
  }
  process.send({ callerId: message.callerId, returnValue: result });
});
All this process does is wait for a message asking it to stringify or parse some JSON, then respond with the result.
Now, in your code, you can fork this script and send messages back and forth. Whenever a request is sent, you create a new promise; whenever a response to that request comes back, you resolve the promise:
const fork = require('child_process').fork;
const asyncJson = fork(__dirname + '/async-json.js');

const callers = {};

asyncJson.on('message', function (response) {
  callers[response.callerId].resolve(response.returnValue);
});

function callAsyncJson(method, value) {
  // a random integer id pairs each response with the caller that requested it
  const callerId = Math.floor(Math.random() * 1000000);
  const callPromise = new Promise((resolve, reject) => {
    callers[callerId] = { resolve: resolve, reject: reject };
    asyncJson.send({ callerId: callerId, method: method, value: value });
  });
  return callPromise;
}

function JsonStringify(value) {
  return callAsyncJson('stringify', value);
}

function JsonParse(value) {
  return callAsyncJson('parse', value);
}

JsonStringify({ a: 1 }).then(console.log.bind(console));
JsonParse('{ "a": "1" }').then(console.log.bind(console));
Note: this is just one example, but knowing this you can figure out other improvements or other ways to do it. Hope this is helpful.
Check out this other npm package:
async-json is a library that provides an asynchronous version of the standard JSON.stringify.
Install:
npm install async-json
Example:
var asyncJSON = require('async-json');

asyncJSON.stringify({ some: "data" }, function (err, jsonValue) {
  if (err) {
    throw err;
  }
  jsonValue === '{"some":"data"}';
});
Note: I didn't test it; you need to manually check its dependencies and required packages.
By asynchronous I assume you actually mean non-blocking asynchronous, i.e. if you have a large (megabytes-large) JSON string and you stringify it, you don't want your web server to hard-freeze and block newly incoming web requests for 500+ milliseconds while it processes the object.
Option 1
The generic answer is to iterate through your object piece by piece, calling setImmediate whenever a threshold is reached. This allows other functions in the event queue to run for a bit.
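A minimal hand-rolled sketch of that idea, assuming the top-level value is an array whose individual items are small (stringifyChunked is a hypothetical helper, not a library API):
function stringifyChunked(items, chunkSize = 1000) {
  return new Promise((resolve) => {
    const parts = [];
    let i = 0;
    (function next() {
      // stringify one chunk of items synchronously...
      const end = Math.min(i + chunkSize, items.length);
      for (; i < end; i++) {
        parts.push(JSON.stringify(items[i]));
      }
      if (i < items.length) {
        setImmediate(next); // ...then yield so queued events can run
      } else {
        resolve('[' + parts.join(',') + ']');
      }
    })();
  });
}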
For JSON (de)serialization, the yieldable-json library does this very well. It does, however, drastically sacrifice JSON processing speed (which is somewhat intentional).
Usage example from the yieldable-json readme:
const yj = require('yieldable-json')
yj.stringifyAsync({ key: "value" }, (err, data) => {
  if (!err)
    console.log(data)
})
Option 2
If processing speed is extremely important (such as with real-time data), you may want to consider spawning multiple Node processes instead. I've used the PM2 process manager with great success, although the initial setup was quite daunting. Once it works, however, the final result is magic, and it does not require modifying your source code, just your package.json file. It acts as a proxy, load balancer, and monitoring tool for Node applications. It's somewhat analogous to Docker swarm, but bare metal, and does not require a special client on the server.

Component not rendered when calling "replace" in onEnter

I am trying to implement simple authentication with react-router. I think there is an issue with replace and callback. Consider the following code:
1) Routing configuration
function getRoutes() {
  return {
    path: "/",
    indexRoute: require("./Home"),
    component: require("./Shared/Layout"),
    onEnter: handleEnter,
    childRoutes: [
      require("./Login"),
      require("./Secured"),
      require("./Any")
    ]
  }
}

function handleEnter(nextState, replace, callback) {
  let state = store.getState()
  if (!state.hasIn(["shared", "user"])) {
    store.dispatch(fetchUser())
      .then(callback)
  }
}
2) ./Secured route configuration
export = {
  path: "/secured",
  component: require("./Secured"),
  onEnter(nextState, replace) {
    let state = store.getState()
    if (!state.getIn(["shared", "user"])) {
      replace("/login")
    }
  }
}
It should work like this:
1. Fetch the user when entering the root route (an async operation, hence the callback).
2. Go to /secured and check whether the user is authenticated when entering the route.
3. If the user is not authenticated, go to /login.
The problem is that the /login page is not rendered. The URL changes to /login, but nothing is displayed and there are no error messages in the console. When I remove the callback parameter from the root route configuration, it starts working as expected.
Am I doing something wrong?
Well, it was really stupid :) I had forgotten to call callback when the user is already authenticated.
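For completeness, a sketch of the corrected handleEnter based on that fix, calling callback on both paths:
function handleEnter(nextState, replace, callback) {
  let state = store.getState()
  if (!state.hasIn(["shared", "user"])) {
    // wait until the user has been fetched, then continue the transition
    store.dispatch(fetchUser()).then(() => callback())
  } else {
    // without this, react-router never finishes the transition
    callback()
  }
}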

AngularJS - access elements in scope

I have written a service that gets a JSON file from the server with the translated values of the labels of my web app. It seems to work fine:
mobilityApp.service('serveiTraduccions', function($resource) {
  this.getTranslation = function($scope) {
    var languageFilePath = 'traduccions/traduccio_en.json';
    $resource(languageFilePath).get(function (data) {
      $scope.translation = data;
    });
  };
});
What I am trying to do is access that $scope.translation from my controller. I tried everything and nothing worked, even though the object does get saved in my $scope.
How can I get the values of registroBtnRegistro, registroErrorRegistro, etc.?
Thanks in advance!
I tried:
console.log($scope.translation); -> undefined
console.log($scope['translation']); -> undefined
console.log($scope.translation.registroBtnRegistro); -> TypeError: Cannot read property 'registroBtnRegistro' of undefined
console.log($scope.translation['registroBtnRegistro']); -> TypeError: Cannot read property 'registroBtnRegistro' of undefined
Maybe you're trying to access these values from another $scope that does not inherit from the scope where you created your translation model.
Try assigning the model directly to $rootScope, so you can access it from every scope:
mobilityApp.service('serveiTraduccions', function($resource, $rootScope) {
  this.getTranslation = function() {
    var languageFilePath = 'traduccions/traduccio_en.json';
    $resource(languageFilePath).get(function (data) {
      $rootScope.translation = data;
    });
  };
});
This answer is a blind attempt, because your original post lacks basic information such as the call from the controller; we can refine it until we make it work.
First, you should be returning something from your method:
mobilityApp.service('serveiTraduccions', function($resource) {
  this.getTranslation = function() {
    var languageFilePath = 'traduccions/traduccio_en.json';
    return $resource(languageFilePath);
  };
});
You are using $resource, but you might as well use a basic $http.get(); at least it doesn't look like a RESTful API to me (see the $http sketch after the method list below).
In any case, because it's an asynchronous request, it will not return the list of translated strings, but a resource "class" with methods like get, delete, or the more general query():
From the docs, the default methods are:
{ 'get':    {method:'GET'},
  'save':   {method:'POST'},
  'query':  {method:'GET', isArray:true},
  'remove': {method:'DELETE'},
  'delete': {method:'DELETE'} };
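As mentioned above, here is a minimal sketch of the same service using a plain $http.get() instead of $resource ($http.get returns a promise whose response object carries the parsed JSON in .data):
mobilityApp.service('serveiTraduccions', function($http) {
  this.getTranslation = function() {
    // $http.get returns a promise; the caller chains .then() on it
    return $http.get('traduccions/traduccio_en.json');
  };
});

// In the controller:
// serveiTraduccions.getTranslation().then(function (response) {
//   $scope.translation = response.data;
// });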
Side note: injecting $scope into a service doesn't make much sense to me; services are used to encapsulate common logic across components. You can, however, pass a scope instance as a parameter.
Then, the controller that uses this should have the service injected and use a callback to get the results when they arrive (it's an asynchronous operation!):
TraduccioCtrl ... {
  $scope.translation = {}; // avoid undefined when the view first loads
  // note: getTranslation must be called; .get() rather than .query(),
  // since the translation file is a single object, not an array
  ServeiTraduccions.getTranslation().get(function (response) {
    $scope.translation = response; // Angular's two-way data binding will probably do the rest
  });
}
The Angular docs on ngResource have a working example. Other questions on SO have addressed this already too, such as Using AngularJS $resource to get data.