feathers.js -> Authentication token missing

I have a feathers.js application and now I want to secure the create and update hooks. I use a socket.io client and am currently going for JWT. I have added what I think I needed to add, but I am getting Error: Authentication token missing and Error authenticating. The latter I understand, since that one comes from my own code. I have a backend / frontend situation.
So this is what I've implemented so far.
File: backend\backend.js (called in backend\index.js for the configuration of the app)
'use strict';
const path = require('path');
const serveStatic = require('feathers').static;
const favicon = require('serve-favicon');
const compress = require('compression');
const cors = require('cors');
const feathers = require('feathers');
const configuration = require('feathers-configuration');
const authentication = require('feathers-authentication');
const hooks = require('feathers-hooks');
const rest = require('feathers-rest');
const bodyParser = require('body-parser');
const socketio = require('feathers-socketio');
const middleware = require('./middleware/index');
const services = require('./services/index');
const appFeathers = feathers();
appFeathers.configure(configuration(path.join(__dirname, '..')));
appFeathers.use(compress())
.options('*', cors())
.use(cors())
.use(favicon(path.join(appFeathers.get('public'), 'favicon.ico')))
.use('/', serveStatic(appFeathers.get('public')))
.use(bodyParser.json())
.use(bodyParser.urlencoded({extended: true}))
.configure(hooks())
.configure(rest())
.configure(socketio())
.configure(services)
.configure(middleware)
.configure(authentication());
module.exports = appFeathers;
File: backend\config\default.json
{
  "host": "localhost",
  "port": 3001,
  "mysql_connection": "mysql://CONNECTION_STRING",
  "public": "../public/",
  "auth": {
    "idField": "id",
    "token": {
      "secret": "SECRET_KEY"
    },
    "local": {}
  }
}
In a working component of the frontend:
<template>
<div class="vttIndex">
index.vue
todo: eagle.js slideshow
todo: first info
<ul>
<li v-for="message in listMessages">
{{ message }}
</li>
</ul>
</div>
</template>
<script>
import feathers from 'feathers/client';
import socketio from 'feathers-socketio/client';
import hooks from 'feathers-hooks';
import io from 'socket.io-client';
import authentication from 'feathers-authentication/client';
import * as process from "../nuxt.config";
const vttSocket = io(process.env.backendUrl);
const vttFeathers = feathers()
.configure(socketio(vttSocket))
.configure(hooks())
.configure(authentication());
const serviceMessage = vttFeathers.service('messages');
vttFeathers.authenticate({
type: 'token',
token: 'SECRET_KEY'
}).then(function(result){
console.log('Authenticated!', result);
}).catch(function(error){
console.error('Error authenticating!', error);
});
export default {
layout: 'default',
data: function() {
return {
listMessages: []
}
},
mounted: function() {
serviceMessage.find().then(page => {
this.listMessages = page.data;
});
serviceMessage.on('created', (serviceMessage) => {
this.listMessages.push(serviceMessage);
});
}
}
</script>
As the token, I am using the secret key from the backend config file. As you can see, for now I only log console messages. Something is happening, because my error message comes from there.
Question
Where am I missing what to have this functioning?
Goal
Just in case it's needed: my goal is for all 'public' data to be selectable with a token in my client, and then to have an admin section, maybe with OAuth. So the general 'SELECT' stuff is secured through a token instead of having no authentication at all.
Solution
Okay, I solved it, sort of. I first needed to create a user. Then I needed to do a local login with that user, which returns a token. If I use that token, there is no problem at all.

To use a token, you must first make sure it is generated. I was using the secret key as the token, which isn't correct. When you first authenticate with the 'local' type (by default email and password), it will create a token, and that is what you can then use with the 'token' method.
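For anyone following along, here is a minimal sketch of that flow with the feathers-authentication client configured above (the email and password are placeholders for a user created beforehand, not values from this app):

// 1. authenticate locally with email/password to obtain a JWT
// 2. reuse that JWT for subsequent 'token' authentication
vttFeathers.authenticate({
  type: 'local',
  email: 'user@example.com', // hypothetical user created beforehand
  password: 'secret'
}).then(function (result) {
  // result.token holds the JWT generated by the server
  console.log('Logged in, received token:', result.token);
  return vttFeathers.authenticate({
    type: 'token',
    token: result.token
  });
}).then(function (result) {
  console.log('Authenticated with the generated token!', result);
}).catch(function (error) {
  console.error('Error authenticating!', error);
});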

Related

Firebase Cloud Function with Swagger: Error: could not handle the request

I have this code in a function called docs:
const functions = require("firebase-functions");
const express = require("express");
const swaggerUi = require("swagger-ui-express");
const swaggerJsdoc = require("swagger-jsdoc");
const options = {
definition: {
openapi: "3.0.0",
info: {
title: "API",
version: "1.0.0",
},
},
apis: ["../**/*.function.js"],
};
const openapiSpecification = swaggerJsdoc(options);
console.log("🚀 ", openapiSpecification);
const app = express();
app.use(
"/",
swaggerUi.serve,
swaggerUi.setup(openapiSpecification, {
swaggerOptions: {
supportedSubmitMethods: [], //to disable the "Try it out" button
},
})
);
module.exports = functions.https.onRequest(app);
But every time I hit the URL, this error gets returned:
Error: could not handle the request
URL:
https://REGION-PROJECT.cloudfunctions.net/docs
The deps above are all installed.
Any idea what is causing this issue?
Or better yet, how can I serve an endpoint for Swagger docs?
Please keep in mind that none of the other functions use Express; most are callable. Just this one function uses it, so I'm not sure if mixing them is unsupported.
Here's the folder structure:
package.json
index.js
app
  auth
    login.function.js
  users
    createUser.function.js
Inside index.js, the functions get loaded and added to the exports dynamically. This setup works fine and deploys well, so no issues there.
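For context, a rough sketch of what such a dynamic loader might look like (the glob dependency, the pattern, and the export naming are assumptions based on the folder structure above, not the actual index.js):

// index.js (hypothetical loader, not the original)
const glob = require("glob");
const path = require("path");

// find every *.function.js file under app/ and export it under its base name,
// e.g. app/auth/login.function.js -> exports.login
const files = glob.sync("app/**/*.function.js", { cwd: __dirname });

for (const file of files) {
  const name = path.basename(file, ".function.js");
  exports[name] = require(path.join(__dirname, file));
}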

Ipfs node won't connect to my react app running on local host

I am trying to create a React component that lets you submit a .jpeg and upload it to IPFS. The daemon starts, and I can view the IPFS node in the IPFS Companion add-on with this config:
gateway - http://localhost:8080
API - http://127.0.0.1:5001
It has a toggle button that enables/disables IPFS integration for localhost, but it is disabled, and every time I press it, it disables itself again.
In my app, when I select a file and click the submit button, I get the following error.
fetch.browser.js:101 POST http://127.0.0.1:5001/api/v0/add?stream-channels=true&progress=false 403 (Forbidden)
fetchWith # fetch.browser.js:101
fetch # http.js:130
Client.fetch # core.js:192
post # http.js:174
addAll # add-all.js:36
index.js:1 HTTPError: 403 - Forbidden
at Object.errorHandler [as handleError] (http://localhost:3000/static/js/1.chunk.js:57836:15)
at async Client.fetch (http://localhost:3000/static/js/1.chunk.js:61144:9)
at async addAll (http://localhost:3000/static/js/1.chunk.js:54724:17)
at async last (http://localhost:3000/static/js/1.chunk.js:67287:20)
at async Object.add (http://localhost:3000/static/js/1.chunk.js:54865:14)
at async handleSubmit (http://localhost:3000/static/js/main.chunk.js:1375:20)
I have started the ipfs daemon with
sudo ipfs daemon
My file upload component is
import React, { Component, useEffect, useState } from "react";
import { Form, Col, Button, InputGroup } from 'react-bootstrap';
const ipfsClient = require('ipfs-http-client')
const Buffer = require('buffer').Buffer;
// connect to the default API address http://localhost:5001
const UploadImage = () => {
const [request, setRequest] = useState({});
const [file, setFile] = useState({});
const [img, setImg] = useState('');
const handleSubmit = async (event) => {
event.preventDefault();
// const { cid } = await ipfs.add(img)
//const ipfs = ipfsClient('http://localhost:5001') // (the default in Node.js)
const ipfs = ipfsClient('/ip4/127.0.0.1/tcp/5001')
try {
const file = await ipfs.add(img)
// setFile(file)
console.log(file)
} catch (e) {
console.error(e)
}
}
const convertToBuffer = async(reader) => {
//file is converted to a buffer for upload to IPFS
const buffer = await Buffer.from(reader.result);
setImg(buffer);
};
const captureFile = async(event) => {
event.stopPropagation()
event.preventDefault()
const file = event.target.files[0];
let reader = new window.FileReader();
reader.onload = function(event) {
// once the file has been read, convert the ArrayBuffer result to a Buffer for IPFS
convertToBuffer(reader);
};
reader.readAsArrayBuffer(file);
};
const myChangeHandler = async(event) => {
let nam = event.target.name;
let val = event.target.value;
switch (nam) {
case 'img':
console.log(img)
setImg(val)
break;
}
}
return (
<div>
<form onSubmit={handleSubmit}>
<div>
<label htmlFor="profile_pic">Choose file to upload</label>
<div style={{ padding: '10px' }}><p>Enter your name:</p>
<input
type="file"
name='img'
onChange={captureFile}
accept=".jpg, .jpeg, .png"
required
/></div>
</div>
<div>
<input type='submit' value="Submit" />
</div>
</form>
</div>
)
}
export default UploadImage;
I can add files from the terminal and in the IPFS web UI running on http://127.0.0.1:5001/ when I run the daemon, but I keep getting a 403 error every time I click the submit button. Before that I kept getting an Access-Control-Allow-Origin error, but I installed this add-on for Chrome and now I don't get that error anymore:
https://chrome.google.com/webstore/detail/allow-cors-access-control/lhobafahddgcelffkeicbaginigeejlf
It seems the issue is that CORS is left to the default settings on your IPFS node: block all connections.
To fix this, you must configure your CORS headers.
The official IPFS tutorial to do this is here: https://github.com/ipfs/ipfs-webui#configure-ipfs-api-cors-headers
If you want to do this by editing the config file directly or through the WebUI:
An example all-allowing (potentially unsafe, make sure you're behind a firewall) Node configuration for the API section is:
"API": {
"HTTPHeaders": {
"Access-Control-Allow-Methods": [
"PUT",
"POST"
],
"Access-Control-Allow-Origin": [
"*"
]
}
},
Setting the allow-origin to * will simply allow all origins, but you can also use a list of origin domains to be safer.
If you visit https://webui.ipfs.io or https://dev.webui.ipfs.io they will give you similar instructions on how to change your node's CORS settings so that they may operate.
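For example, assuming a local go-ipfs node, the same headers can be set from the command line and are picked up after restarting the daemon:

ipfs config --json API.HTTPHeaders.Access-Control-Allow-Origin '["http://localhost:3000", "https://webui.ipfs.io"]'
ipfs config --json API.HTTPHeaders.Access-Control-Allow-Methods '["PUT", "POST"]'

Here http://localhost:3000 stands in for the origin your React dev server runs on; adjust it to your setup. With the API CORS headers configured, the Chrome CORS-bypass extension should no longer be needed.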

node discord js export a function

I've been making a discord.js bot following the official guide.
I have all my commands in the /commands folder as advised.
Then I followed the course to create a currency system with sequelize, following this page from the same guide.
I have a balance.js file inside the commands folder, but when I'm calling it, it gives me this error:
TypeError: currency.getBalance is not a function
I've defined the function in my app.js file, but how can I export it (or use it) inside balance.js, which is called by app.js?
This is the function defined in the main file app.js:
Reflect.defineProperty(currency, 'getBalance', {
value: function getBalance(id) {
const user = currency.get(id);
return user ? user.balance : 0;
},
});
This is balance.js:
module.exports = {
name: 'balance',
description: 'Informs you about your balance.',
cooldown : 10,
guildOnly : true,
aliases: ['bal', 'cur', 'gem', 'gems'],
execute(message, args) {
const Discord = require('discord.js');
const { Users, CurrencyShop, UserItems, CardBase, UserCollec } = require('../dbObjects');
const currency = require('../app.js')
async () => { const storedBalances = await Users.findAll();
storedBalances.forEach(b => currency.set(b.user_id, b));
UserCollec.sync(); }
const target = message.author;
return message.channel.send(`${target} has ${currency.getBalance(target.id)}<:Gem:756059891465977886>`);
},
};
EDIT:
I've made some progress. I now understand that I have to import the currency variable, which is declared as a new Discord.Collection() in app.js.
I need to refer to this variable in a module, but the module doesn't seem to see it as a collection. How do I import it?
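One common pattern for this (a sketch; the currency.js filename and the require paths are assumptions, not something from the guide) is to move the collection and its helper into their own module and require that module from both app.js and balance.js, so they share the same Collection instance:

// currency.js (hypothetical shared module)
const Discord = require('discord.js');

const currency = new Discord.Collection();

Reflect.defineProperty(currency, 'getBalance', {
  value: function getBalance(id) {
    const user = currency.get(id);
    return user ? user.balance : 0;
  },
});

module.exports = currency;

app.js and balance.js can then both use const currency = require('./currency') (adjusting the relative path), instead of balance.js requiring app.js, which only exposes whatever app.js puts on module.exports.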

Google Cloud Functions error: "No 'Access-Control-Allow-Origin' header is present on the requested resource"

No 'Access-Control-Allow-Origin' header is present on the requested
resource
Failed to load
https://us-central1-social-post-builder-2.cloudfunctions.net/stripe-test-charge: No 'Access-Control-Allow-Origin' header is present on the requested
resource. Origin 'http://localhost:3001' is therefore not allowed
access. If an opaque response serves your needs, set the request's
mode to 'no-cors' to fetch the resource with CORS disabled.
I had just added the cors module to Express (on Google Cloud Functions), and it seemed to be a partial fix, because a response object now comes back (before, undefined was passed, breaking the page).
Now I get the above error in the console. The page doesn't break. Does my code below not allow all domains?
// https://us-central1-social-post-builder-2.cloudfunctions.net/stripe-test-charge
const app = require("express")();
const stripe = require("stripe")("test_api_key_removed");
const cors = require('cors');
app.use(cors());
app.use(require("body-parser").text());
exports.charge_card = async (req, res) => {
try {
let {status} = await stripe.charges.create({
amount: 2000,
currency: "usd",
description: "An example charge",
source: req.body
});
res.json({status});
} catch (err) {
res.status(500).end();
}
};
The package.json file for the above includes the latest packages as of today.
I don't think the React code needs to be shown, but just in case it is relevant, here is the React component that executes the fetch:
// CheckoutForm.js
import React, {Component} from "react";
import { CardElement, injectStripe } from "react-stripe-elements";
class CheckoutForm extends Component {
constructor(props) {
super(props);
this.submit = this.submit.bind(this);
this.state = {
complete : false
};
}
async submit (event) {
// User clicked submit
let {token} = await this.props.stripe.createToken({name: "Name"});
console.log('CheckoutForm:16, token:', token)
let response = await fetch("https://us-central1-social-post-builder-2.cloudfunctions.net/stripe-test-charge", {
method: "POST",
headers: {"Content-Type": "text/plain"},
body: token.id
});
if (response.ok) console.log("Purchase Complete!")
};
render(props) {
// render variables
if (this.state.complete) return <h1>Purchase Complete</h1>;
return (
<div className='checkout'>
<p>
Would you like to complete the purchase?
</p>
<CardElement />
<button onClick={ this.submit } >
Send
</button>
</div>
);
}
}
export default injectStripe(CheckoutForm);
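One thing that stands out in the function code (just an observation, not a verified fix): cors() and the body parser are registered on the Express app, but the exported handler is the standalone charge_card function, so those app.use() calls never run for the deployed function. A minimal sketch that exports the Express app itself, so the cors middleware actually wraps the route, might look like this (route path and entry-point name are placeholders to adapt to your deployment):

const app = require("express")();
const stripe = require("stripe")("test_api_key_removed");
const cors = require("cors");

app.use(cors());                          // now actually applied to incoming requests
app.use(require("body-parser").text());

app.post("/", async (req, res) => {
  try {
    const { status } = await stripe.charges.create({
      amount: 2000,
      currency: "usd",
      description: "An example charge",
      source: req.body,
    });
    res.json({ status });
  } catch (err) {
    res.status(500).end();
  }
});

// export the app itself as the Cloud Function entry point
exports.charge_card = app;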

How can I have multiple API endpoints for one Google Cloud Function?

I have a Google Cloud Function which contains multiple modules to be invoked on different paths.
I am using the serverless framework to deploy my functions, but it has the limitation of only one path per function.
I want to use multiple paths in one function just like we can in the AWS serverless framework.
Suppose a user cloud function will have two paths /user/add as well as /user/remove; both the paths should invoke the same function.
Something like this:
serverless.yml
functions:
  user:
    handler: handle
    events:
      - http: user/add
      - http: user/remove
How can I have multiple API endpoints for one GCF?
Yes, indeed, there is no actual REST service backing Google Cloud Functions; it just uses out-of-the-box HTTP triggers.
To work around this, I'm using my request payload to determine which action to perform: in the body, I add a key named "path".
For example, consider the Function USER.
To add a user:
{
  "path": "add",
  "body": {
    "first": "Jhon",
    "last": "Doe"
  }
}
To remove a user:
{
  "path": "remove",
  "body": {
    "first": "Jhon",
    "last": "Doe"
  }
}
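For completeness, a minimal sketch of the dispatching side of this approach (the handler functions are hypothetical stubs):

// hypothetical handlers
const addUser = (body, res) => res.json({ message: "user added", user: body });
const removeUser = (body, res) => res.json({ message: "user removed", user: body });

// Cloud Function entry point: dispatch on the "path" key in the JSON body
exports.user = (req, res) => {
  const { path, body } = req.body || {};
  switch (path) {
    case "add":
      return addUser(body, res);
    case "remove":
      return removeUser(body, res);
    default:
      return res.status(400).json({ error: `unknown path: ${path}` });
  }
};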
If your operations are purely CRUD, you can use request.method, which gives you the HTTP verb (GET, POST, PUT, DELETE), to determine the operation.
You could use Firebase Hosting to rewrite URLs.
In your firebase.json file:
"hosting": {
"rewrites": [
{
"source": "/api/v1/your/path/here",
"function": "your_path_here"
}
]
}
Keep in mind this is a workaround, and it has a major drawback: you will pay for a double hit. Consider this if your app has to scale.
You can write your functions in different runtimes. The Node.js runtime uses the Express framework under the hood, so you can use its router to build different routes within a single function.
Add the dependency
npm install express@4.17.1
The following example uses TypeScript. Follow these guidelines to initialize a TypeScript project.
// index.ts
import { HttpFunction } from '@google-cloud/functions-framework';
import * as express from 'express';
import foo from './foo';

const app = express();
const router = express.Router();

app.use('/foo', foo);

const index: HttpFunction = (req, res) => {
  res.send('Hello from the index route...');
};

router.get('', index);
app.use('/*', router);

export const api = app;
// foo.ts
import { HttpFunction } from '@google-cloud/functions-framework';
import * as express from 'express';

const router = express.Router();

const foo: HttpFunction = (req, res) => {
  res.send('Hello from the foo route...');
};

router.get('', foo);

export default router;
To deploy, run:
gcloud functions deploy YOUR_NAME \
--runtime nodejs16 \
--trigger-http \
--entry-point api \
--allow-unauthenticated
Currently, Google Cloud Functions only supports one event definition per function.
Express can be installed with npm i express, then imported and used more or less as normal to handle routing:
const express = require("express");
const app = express();

// enable CORS if desired
app.use((req, res, next) => {
  res.set("Access-Control-Allow-Origin", "*");
  next();
});

app.get("/", (req, res) => {
  res.send("hello world");
});

exports.example = app; // `example` is whatever your GCF entrypoint is
If Express isn't an option for some reason or the use case is very simple, a custom router may suffice.
If parameters or wildcards are involved, consider using route-parser. A deleted answer suggested this app as an example.
The Express request object has a few useful parameters you can take advantage of:
req.method which gives the HTTP verb
req.path which gives the path without the query string
req.query object of the parsed key-value query string
req.body the parsed JSON body
Here's a simple proof-of-concept to illustrate:
const routes = {
  GET: {
    "/": (req, res) => {
      const name = (req.query.name || "world");
      res.send(`<!DOCTYPE html>
<html lang="en"><body><h1>
hello ${name.replace(/[\W\s]/g, "")}
</h1></body></html>
`);
    },
  },
  POST: {
    "/user/add": (req, res) => { // TODO stub
      res.json({
        message: "user added",
        user: req.body.user
      });
    },
    "/user/remove": (req, res) => { // TODO stub
      res.json({message: "user removed"});
    },
  },
};

exports.example = (req, res) => {
  if (routes[req.method] && routes[req.method][req.path]) {
    return routes[req.method][req.path](req, res);
  }
  res.status(404).send({
    error: `${req.method}: '${req.path}' not found`
  });
};
Usage:
$ curl https://us-east1-foo-1234.cloudfunctions.net/example?name=bob
<!DOCTYPE html>
<html lang="en"><body><h1>
hello bob
</h1></body></html>
$ curl -X POST -H "Content-Type: application/json" --data '{"user": "bob"}' \
> https://us-east1-foo-1234.cloudfunctions.net/example/user/add
{"message":"user added","user":"bob"}
If you run into trouble with CORS and/or preflight issues, see Google Cloud Functions enable CORS?
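For reference, a bare-bones sketch of handling CORS and the preflight request manually inside a single HTTP function (the header values are just the permissive example, not taken from any answer above):

exports.example = (req, res) => {
  // allow the calling origin; "*" is the permissive example
  res.set("Access-Control-Allow-Origin", "*");

  if (req.method === "OPTIONS") {
    // answer the CORS preflight request and stop here
    res.set("Access-Control-Allow-Methods", "GET, POST");
    res.set("Access-Control-Allow-Headers", "Content-Type");
    res.set("Access-Control-Max-Age", "3600");
    return res.status(204).send("");
  }

  res.send("hello world");
};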