Is there FeathersJs syntax for creating an endpoint with hooks in a single command?

I've found how to add hooks to an existing service:
app.use("/hello", { get: async () => "Hello World" });
app.service("/hello").hooks({
  before: { create: someHookFn }
});
But I'm pretty sure this syntax can be improved, and I'm just failing to find an example. Searching the source code was no help either; it's pretty hairy in terms of its type definitions.
Is there FeathersJs syntax for creating an endpoint with hooks in a single command?
Something like this:
// non-functional code
app.use("/hello", {
  service: { get: async () => "Hello World" },
  hooks: {
    before: { create: someHookFn }
  }
});

You can make a function which does this like so:
function createService(service, hooks) {
  return feathers().use('', service).service('').hooks(hooks);
}
Then use it like so:
app.use("/hello", createService(
  { get: async () => "Hello World" },
  { before: { create: someHookFn } }
));
My reason for doing this was that I wanted a service that was not connected to an endpoint, for use in GraphQL. Also, I don't like connecting things by string id.
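For illustration, the wrap-and-register pattern can also be sketched without Feathers at all (the withHooks name and hook-context shape are hypothetical, not Feathers API):

```javascript
// Hypothetical sketch: wrap a plain service object so a matching
// "before" hook runs ahead of each method call.
function withHooks(service, hooks = {}) {
  const wrapped = {};
  for (const [name, method] of Object.entries(service)) {
    wrapped[name] = async (...args) => {
      const before = hooks.before && hooks.before[name];
      if (before) await before({ method: name, args }); // run the before hook first
      return method(...args);
    };
  }
  return wrapped;
}

// Service and hooks are supplied together, in one call.
const svc = withHooks(
  { get: async () => "Hello World" },
  { before: { get: async (ctx) => console.log(`before ${ctx.method}`) } }
);
```

Feathers does considerably more per hook (context objects, result short-circuiting, error hooks), but the shape of the API is the same idea.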

Related

Mock mysql connection with Jest

Trying to test code that looks like this:
const mysql = require('mysql2/promise');

async function myFunction() {
  const db = await mysql.createConnection(options);
  const results = await db.execute('SELECT `something` from `table`;');
  await db.end();
  // more code ...
}
I need to mock the mysql connection in a way that will allow me to use whatever it returns to mock a call to the execute function.
I have tried mocking the whole mysql2/promise module but of course that did not work, since the mocked createConnection was not returning anything that could make a call to the execute function.
I also tried only mocking these 3 functions that I need instead of mocking the whole module, something like:
jest.mock('mysql2/promise', () => ({
  createConnection: jest.fn(() => ({
    execute: jest.fn(),
    end: jest.fn(),
  })),
}));
But that did not work either.
Any suggestions are highly appreciated.
I would approach this differently. When you feel you need to mock entire third-party libraries for testing, something is off in your application.
As a general best practice, you should always wrap third-party libraries. Check out this discussion for starters.
Basically the idea is to define your own interfaces to the desired functionality, then implement these interfaces using the third-party library. In the rest of your code, you would only work against the interfaces, not against the third-party implementation.
This has a couple of advantages:
You can define the interfaces yourself. They will normally be much smaller than the entire third-party library, as you rarely use all of its functionality, and you can decide what the best interface definition is for your concrete use cases, rather than having to follow exactly what some library author dictates.
If one day you decide you don't want to use MySQL anymore but move to Mongo, you can just write a Mongo implementation of your DB interface.
In your case, most importantly: You can easily create a mock implementation of your DB interface without having to start mocking the entire third-party API.
So how could this work?
First, define an interface as it would be most useful in your code. Perhaps, a DB interface for you could look like this:
interface Database<T> {
  create(object: T): Promise<void>;
  get(id: string): Promise<T>;
  getAll(): Promise<T[]>;
  update(id: string, object: T): Promise<void>;
  delete(id: string): Promise<void>;
}
Now, you can develop your entire code base against this one Database interface. Instead of writing MySQL queries all across your code, when you need to retrieve data from 'table', you can use your Database implementation.
I'll just take an example ResultRetriever here that is pretty primitive, but serves the purpose:
class ResultRetriever {
  constructor(private database: Database<Something>) {}

  getResults(): Promise<Something[]> {
    return this.database.getAll();
  }
}
As you can see, your code does not need to care about which DB implementation delivers the data. Also, we inverted dependencies here: ResultRetriever is injected with its Database instance. It does not know which concrete Database implementation it gets. It doesn't need to. All it cares about is that it is a valid one.
You can now easily implement a MySQL Database class:
class MySqlDatabase<T> implements Database<T> {
  create(object: T): Promise<void> {...}
  get(id: string): Promise<T> {...}
  async getAll(): Promise<T[]> {
    const db = await mysql.createConnection(options);
    const [rows] = await db.execute('SELECT `something` from `table`;');
    await db.end();
    return rows as T[];
  }
  update(id: string, object: T): Promise<void> {...}
  delete(id: string): Promise<void> {...}
}
Now we've fully abstracted the MySQL-specific implementation from your main code base. When it comes to testing, you can write a simple MockDatabase:
export class MockDatabase<T extends { id: string }> implements Database<T> {
  private objects: T[] = [];

  async create(object: T): Promise<void> {
    this.objects.push(object);
  }

  async get(id: string): Promise<T> {
    return this.objects.find(o => o.id === id);
  }

  async getAll(): Promise<T[]> {
    return this.objects;
  }

  update(id: string, object: T): Promise<void> {...}
  delete(id: string): Promise<void> {...}
}
When it comes to testing, you can now test your ResultRetriever using your MockDatabase instead of relying on the MySQL library and therefore on mocking it entirely:
describe('ResultRetriever', () => {
  let retriever: ResultRetriever;
  let db: Database<Something>;

  beforeEach(() => {
    db = new MockDatabase<Something>();
    retriever = new ResultRetriever(db);
  });

  ...
});
I am sorry if I went a bit beyond the scope of the question, but I felt just responding how to mock the MySQL library was not going to solve the underlying architectural issue.
If you are not using or don't want to use TypeScript, the same logic can be applied in JavaScript.
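For instance, a plain-JavaScript sketch of the same in-memory mock (assuming stored objects carry an id field):

```javascript
// In-memory stand-in for the real database; same interface, no MySQL.
class MockDatabase {
  constructor() {
    this.objects = [];
  }
  async create(object) {
    this.objects.push(object);
  }
  async get(id) {
    return this.objects.find((o) => o.id === id);
  }
  async getAll() {
    return [...this.objects]; // return a copy so callers can't mutate the store
  }
  async update(id, object) {
    this.objects = this.objects.map((o) => (o.id === id ? { ...o, ...object } : o));
  }
  async delete(id) {
    this.objects = this.objects.filter((o) => o.id !== id);
  }
}
```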
There is a "brute-force" way if all you are really trying to do is to mock your MySQL calls. The following code is in TypeScript, but should be easily adaptable to regular JavaScript.
import * as mysql from "mysql2/promise";
import { mocked } from "ts-jest/utils";

jest.mock("mysql2/promise");

async function dbMethod(conn: mysql.Pool, field1Value: number): Promise<any> {
  return await (conn.execute("SELECT field_1, field_2 FROM foo WHERE field_1=?",
    [field1Value]) as Promise<mysql.RowDataPacket[]>);
}
describe("dbMethod", () => {
  let mockDB: mysql.Pool;

  beforeEach(() => {
    mockDB = {
      execute: jest.fn()
    } as unknown as mysql.Pool;
  });

  it("should get something from database", async () => {
    const mockExecute = mocked(mockDB.execute);
    const testData = [{
      field_1: 123,
      field_2: 456
    }];
    mockExecute.mockResolvedValue([
      [testData] as mysql.RowDataPacket[],
      []
    ]);

    // Confirm results back from MySQL
    await expect(dbMethod(mockDB, 123)).resolves.toEqual([[testData], []]);

    // If you want, you can confirm MySQL execute was called as expected
    expect(mockExecute).toHaveBeenCalledWith(
      "SELECT field_1, field_2 FROM foo WHERE field_1=?",
      [123]
    );
  });
});

Is it possible to merge two json responses using nginx?

I have an existing express endpoint that looks like this:
app.get(`${route}/:id`, async (req, res) => {
  try {
    const id = req.params.id;
    const result = await dbFn(id);
    res.send(result);
  } catch (err) {
    res.status(500).end();
  }
});
And this is going to return an object that looks like:
{
  "id": 123,
  "name": "Foo"
}
Now, I want to extend this API so that if the request has an Accept: application/vnd.v2 header, it will also fetch some data from a different service and add that on. (See my related question, where using content negotiation is suggested.)
ie. the response will be:
{
  "id": 123,
  "name": "Foo",
  "extraData": {
    "foo": "bar"
  }
}
Now, I can do this with express, here's how I have done it:
app.get(`${route}/:id`, async (req, res, next) => {
  try {
    const id = req.params.id;
    const jobSeeker = await dbFn(id);
    if (req.accepts("application/vnd.v2")) {
      const response = await axios.get(`${integrationApiPath}/connection/${id}`);
      const ssiData = response.data;
      res.send({
        ...jobSeeker,
        ssiData
      });
    } else {
      res.send(jobSeeker);
    }
  } catch (err) {
    res.status(500).end();
  }
});
But it struck me as a bit of a messy way to do API versioning.
What would be much nicer, is if I can have nginx handling this versioning instead.
That way, I don't need to modify my existing API, I can just create the new service, and have nginx examine the headers, and make both microservice calls and join them together.
Is this possible?
"But it struck me as a bit of a messy way to do API versioning."
I don't think that this is a bad way to do API versioning, since it's the common way to do it. In addition, you can serve the new service under a new subdirectory (e.g. yourwebsite.com/yourservice.../v2/yourFunction).
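For example, path-prefix versioning can be handled in nginx with two proxied locations (the upstream names here are hypothetical):

```nginx
# Route each API version to its own backend service
location /v1/ {
    proxy_pass http://service_v1;
}
location /v2/ {
    proxy_pass http://service_v2;
}
```

This keeps the version routing in nginx while each version's logic stays in its own service.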
"What would be much nicer, is if I can have nginx handling this versioning instead."
I also wouldn't say it would be nicer to let nginx do the "logic" of your web service, since nginx's job is to serve your website/webservice, not to implement its logic.
However, if you still want to merge the requests using nginx you may want to have a look at this question/answer. This answer uses openresty. You may have to install this first.
As described, you can call multiple (in your case 2) services using this code:
location /yourServiceV2 {
    content_by_lua_block {
        local respA = ngx.location.capture("/yourService")
        local respB = ngx.location.capture("/theServiceWhichExtendsYourService")
        ngx.say(respA.body .. respB.body)
    }
}
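One caveat: concatenating the two bodies with .. yields {...}{...}, which is not valid JSON. To return a single merged object you would decode and re-encode, e.g. with OpenResty's bundled cjson (a sketch; shallow merge, where the second service wins on key clashes):

```nginx
location /yourServiceV2 {
    content_by_lua_block {
        local cjson = require "cjson"
        local respA = ngx.location.capture("/yourService")
        local respB = ngx.location.capture("/theServiceWhichExtendsYourService")
        -- decode both bodies and copy respB's keys into respA's table
        local merged = cjson.decode(respA.body)
        for k, v in pairs(cjson.decode(respB.body)) do
            merged[k] = v
        end
        ngx.header["Content-Type"] = "application/json"
        ngx.say(cjson.encode(merged))
    }
}
```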
If you only want to run the code above when a specific header is present, you can use an if statement as described in this answer. For example, to reject requests that don't send the v2 header:
if ($http_accept != 'application/vnd.v2') {
    return 405;
}

How to pass a thunk or callback function into a redux action. Serializing functions in a redux store for modals and toast confirm notifications

When using a generic modal or toast with a confirm button, it becomes useful to be able to pass an action into this component so it can be dispatched when you click confirm.
The action may look something like this:
export function showConfirm({modalConfirm}) {
return {
type: 'MODALS/SHOW_MODAL',
payload: {
modalId: getUuid(),
modalType: 'CONFIRM',
modalConfirm : modalConfirm,
},
};
}
Where modalConfirm is another action object such as:
const modalConfirm = {
type: 'MAKE_SOME_CHANGES_AFTER_CONFIRM',
payload: {}
}
The modalConfirm action is dispatched inside the modal component using dispatch(modalConfirm) or even dispatch(Object.assign({}, modalConfirm, someResultFromTheModal)).
Unfortunately, this solution only works if modalConfirm is a simple Redux action object. This system is clearly very limited. Is there any way you can pass a function (such as a thunk) in instead of a simple object?
Ideally, something full-featured like this:
const modalConfirm = (someResultFromTheModal) => {
  return (dispatch, getState) => {
    dispatch({
      type: 'MAKE_SOME_UPDATES',
      payload: someResultFromTheModal
    })
    dispatch({
      type: 'SAVE_SOME_STUFF',
      payload: http({
        method: 'POST',
        url: 'api/v1/save',
        data: getState().stuffToSave
      })
    })
  }
}
Funny, putting an action object in the store and passing it as a prop to a generic dialog is exactly the approach I came up with myself. I've actually got a blog post waiting to be published describing that idea.
The answer to your question is "Yes, but....". Per the Redux FAQ at http://redux.js.org/docs/FAQ.html#organizing-state-non-serializable , it's entirely possible to put non-serializable values such as functions into your actions and the store. However, that generally causes time-travel debugging to not work as expected. If that's not a concern for you, then go right ahead.
Another option would be to break your modal confirmation into two parts. Have the initial modal confirmation still be a plain action object, but use a middleware to watch for that being dispatched, and do the additional work from there. This is a good use case for Redux-Saga.
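A minimal sketch of that middleware idea (the action type, payload fields, and registry shown here are hypothetical, not a Redux API):

```javascript
// Hypothetical registry mapping a modalId to the follow-up work to run on confirm.
const followUps = {};

// Standard Redux middleware signature: store => next => action.
const modalConfirmMiddleware = (store) => (next) => (action) => {
  const result = next(action);
  const followUp =
    action.type === "MODALS/CONFIRM" && followUps[action.payload.modalId];
  if (followUp) {
    // Plain action objects are dispatched as-is; functions are treated as
    // thunk creators and called with the modal's result first.
    store.dispatch(
      typeof followUp === "function" ? followUp(action.payload.result) : followUp
    );
  }
  return result;
};
```

Because only the plain confirm action ever enters the store, time-travel debugging keeps working; the function lives in the middleware's registry, not in state.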
I ended up using string aliases to an actions library that centrally registers the actions.
The modal emitter action contains an object with functionAlias and functionInputs:
export function confirmDeleteProject({projectId}) {
  return ModalActions.showConfirm({
    message: 'Deleting a project is permanent. You will not be able to undo this.',
    modalConfirm: {
      functionAlias: 'ProjectActions.deleteProject',
      functionInputs: { projectId }
    }
  })
}
}
Where 'ProjectActions.deleteProject' is the alias for any type of complicated action such as:
export function deleteProject({projectId}) {
  return (dispatch) => {
    dispatch({
      type: 'PROJECTS/DELETE_PROJECT',
      payload: http({
        method: 'DELETE',
        url: `http://localhost:3000/api/v1/projects/${projectId}`,
      }).then((response) => {
        dispatch(push(`/`))
      }),
      meta: {
        projectId
      }
    });
  }
}
The functions are registered in a library module as follows:
import * as ProjectActions from '../../actions/projects.js';

const library = {
  ProjectActions: ProjectActions,
}

export const addModule = (moduleName, functions) => {
  library[moduleName] = functions
}

export const getFunction = (path) => {
  const [moduleName, functionName] = path.split('.');
  // We are getting the module only
  if (!functionName) {
    if (library[moduleName]) {
      return library[moduleName]
    } else {
      console.error(`Module: ${moduleName} could not be found.`);
    }
  }
  // We are getting a function
  else {
    if (library[moduleName] && library[moduleName][functionName]) {
      return library[moduleName][functionName]
    } else {
      console.error(`Function: ${moduleName}.${functionName} could not be found.`);
    }
  }
}
The modalConfirm object is passed in to the modal by props. The modal component requires the getFunction function in the module above. The modalConfirm object is transformed into a function as follows:
const modalConfirmFunction = (extendObject, modalConfirm) => {
  const functionFromAlias = getFunction(modalConfirm.functionAlias);
  if (functionFromAlias) {
    dispatch(functionFromAlias(Object.assign({}, modalConfirm.functionInputs, extendObject)));
  }
}
As you can see, this function can take in inputs from the modal. It can execute any type of complicated action or thunk. This system does not break time-travel but the centralized library is a bit of a drawback.

Understanding FeathersJS hooks

I'm following the tutorial. In Asynchronous hooks, there's a snippet like this:
todoService.before({
  find(hook) {
    return this.find().then(data => {
      hook.params.message = 'Ran through promise hook';
      hook.data.result = data;
      // Always return the hook object
      return hook;
    });
  }
});
Would you please let me know what this.find() is supposed to do?
find is a Feathers service method and this is the service the hook is running on.
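To illustrate the binding (a plain-JavaScript sketch of the idea, not Feathers internals): Feathers invokes each hook with the service as this, so this.find() calls the find method of the very service the hook is attached to.

```javascript
// A stand-in "service" with a find method, like a Feathers service.
const todoService = {
  todos: [{ text: "learn hooks" }],
  async find() { return this.todos; },
};

// A before-hook that queries its own service through `this`.
async function beforeFindHook(hook) {
  const data = await this.find();
  hook.data = { result: data };
  return hook; // always return the hook object
}

// Feathers binds the hook to the service internally;
// we can emulate that binding with .call():
beforeFindHook.call(todoService, { params: {} });
```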

fake model response in backbone.js

How can I fake a REST response in my model, such that it does not really go to the server but returns a fixed JSON object?
If possible, show me a version that does it by overriding sync() and a version that overrides fetch(). I failed with both, so this will be good education as to the difference between them.
Backbone.Model.extend({
  fetch: function() {
    var model = this;
    model.set({yourStatic: "Json Here"});
  }
});
This should work. From the Backbone documentation:
fetch():
Resets the model's state from the server by delegating to Backbone.sync
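For the sync() variant the question also asked about: Backbone.sync's contract is roughly (method, model, options) with a success callback, so a fake can be sketched without any server (the fixture data here is hypothetical):

```javascript
// Canned response standing in for the server's JSON.
const FIXTURE = { id: 1, name: "canned" };

// Drop-in replacement for Backbone.sync: resolves with the fixture and
// invokes options.success the way a real AJAX response would.
function fakeSync(method, model, options) {
  options = options || {};
  return Promise.resolve(FIXTURE).then(function (data) {
    if (options.success) options.success(data); // Backbone sets attributes here
    return data;
  });
}
```

In a model you would assign this as sync: fakeSync, so that fetch() delegates to it instead of $.ajax; that is the difference between the two overrides: fetch() sits above sync() and normally delegates to it.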
If your question is related to unit testing your code without the need for a live API, have a look at Sinon.JS. It helps mocking entire API server responses for testing purposes.
Here's an example from the Sinon docs that mocks the $.ajax function of jQuery:
{
  setUp: function () {
    sinon.spy(jQuery, "ajax");
  },
  tearDown: function () {
    jQuery.ajax.restore(); // Unwraps the spy
  },
  "test should inspect jQuery.getJSON's usage of jQuery.ajax": function () {
    jQuery.getJSON("/some/resource");
    assert(jQuery.ajax.calledOnce);
    assertEquals("/some/resource", jQuery.ajax.getCall(0).args[0].url);
    assertEquals("json", jQuery.ajax.getCall(0).args[0].dataType);
  }
}
Take a look at backbone-faux-server. It will allow you to handle (and 'fake' a response for) any sync op (fetch, save, etc) per Model (or Collection).
Sinon.js is a good candidate, although if you want to simulate more than a few responses, it might become a lot of work to setup headers, handle write logic, etc.
Building up on Sinon.js, FakeRest goes a step further and simulates a complete REST API based on a JSON object - all client-side.
My code looks like this:
// config
const TEST_JSON = require('./test.json')

const API_MAP = {
  testA: 'someroot'
}
const FAKE_API_MAP = {
  testA: TEST_JSON
}

// here's the model
let BaseModel = Backbone.Model.extend({
  url: function() {
    return `${HOST}${API_MAP[this.resourceName]}/`
  }
})

let FakeModel = Backbone.Model.extend({
  fetch: function(options) {
    return this.sync('', this, _.extend({}, options));
  },
  sync: function(method, model, options) {
    this.set(FAKE_API_MAP[this.resourceName], options)
    this.trigger('sync', this);
  },
});
// now it's easy to switch between them
let modelA = new BaseModel({
  resourceName: 'testA'
})
modelA.fetch()

let fakeModelA = new FakeModel({
  resourceName: 'testA'
})
fakeModelA.fetch()