Apollo Server Subscriptions not working - mysql

I've made a GraphQL backend using Apollo Server, Sequelize (for the ORM), MySQL (DB) and Express (web server).
I have also added subscriptions, and that is where the problem is:
I can't even reach the WS endpoint using a websocket tester.
Can someone review my code and tell me what the problem is? I have looked in the docs and other Stack Overflow questions, and I can't find any solution.
The code: https://github.com/seklyza/graphqlsubscriptions
Thanks, everyone.

I think you have to run two servers: one for the app, which uses Express, and one for the websocket. It could look like this.
GraphQL express server:
...
const graphQLServer = express();
const GRAPHQL_PORT = 4000;

graphQLServer.use('/graphql', bodyParser.json(), graphqlExpress((request) => {
  return {
    schema: executableSchema,
  };
}));

graphQLServer.use('/graphiql', graphiqlExpress({
  endpointURL: '/graphql',
}));

graphQLServer.listen(GRAPHQL_PORT, () => {
  console.log(`GraphQL Server is now running on http://localhost:${GRAPHQL_PORT}/graphql`); // eslint-disable-line no-console
});
...
Websocket server for subscriptions:
...
const WS_PORT = 8080;

const websocketServer = createServer((request, response) => {
  response.writeHead(404);
  response.end();
});

websocketServer.listen(WS_PORT, () => console.log( // eslint-disable-line no-console
  `Websocket Server is now running on http://localhost:${WS_PORT}`
));

const subscriptionManager = new SubscriptionManager({
  schema: executableSchema,
  pubsub: pubsub,
  setupFunctions: { /* your subscription channels */ },
});

const subscriptionServer = new SubscriptionServer({
  subscriptionManager: subscriptionManager,
}, {
  server: websocketServer,
  path: '/',
});
...
And you need some sort of publish/subscribe service; we use PubSub from graphql-subscriptions. It is included in the server file and looks like this:
import { PubSub } from 'graphql-subscriptions';

const pubsub = new PubSub();

export { pubsub };

You can create a web socket server wrapper that implements a start method, which is responsible for creating and running the WSServer, and for creating a SubscriptionServer with the help of the SubscriptionManager.
// in subscription.js
import { PubSub, SubscriptionManager } from 'graphql-subscriptions';

const pubSub = new PubSub();

const subscriptionManagerOptions = {
  schema: schema, // this is your graphql schema
  setupFunctions: {
    // here come your setup functions
  },
  pubsub: pubSub // note: the option key is lowercase "pubsub"
};

const subscriptionManager = new SubscriptionManager(subscriptionManagerOptions);

export { pubSub, subscriptionManager };
With the subscriptionManager created, we can now implement the WSServer:
import { createServer } from 'http';
import { SubscriptionServer } from 'subscriptions-transport-ws';
import { subscriptionManager } from './subscription';

const webSocketServerWrapper = {
  start: function(port) {
    const webSocketServer = createServer((request, response) => {
      response.writeHead(404);
      response.end();
    });

    webSocketServer.listen(port, () => {
      console.log('WSServer listening on port ' + port);
    });

    new SubscriptionServer({
      subscriptionManager,
      onSubscribe: (message, options, request) => {
        return Promise.resolve(Object.assign({}, options, {}));
      }
    }, webSocketServer);
  }
};

export default webSocketServerWrapper;
Now you can import the webSocketServerWrapper in the initialisation file like index.js and simply run webSocketServerWrapper.start(PORT);
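A minimal sketch of that initialisation file (the file path and port value here are placeholders, not part of the answer above):

// in index.js (hypothetical entry point)
import webSocketServerWrapper from './webSocketServerWrapper';

const WS_PORT = 8080; // any free port works

webSocketServerWrapper.start(WS_PORT);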
In my second answer here you can find code responsible for creating an example subscription and how it should be handled.
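As a rough sketch of what such a subscription channel could look like with this SubscriptionManager API (the commentAdded channel, its filter, and the payload are made up for illustration):

// hypothetical setupFunctions entry wiring a "commentAdded" subscription
const subscriptionManager = new SubscriptionManager({
  schema: executableSchema,
  pubsub: pubsub,
  setupFunctions: {
    commentAdded: (options, args) => ({
      commentAdded: {
        filter: comment => comment.repoId === args.repoId, // only push matching comments
      },
    }),
  },
});

// somewhere in a mutation resolver, after saving a comment:
pubsub.publish('commentAdded', savedComment);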


Firebase Emulator Suite - simple pubsub example

I have read MANY docs/blogs/SO articles on using the Firebase Emulator Suite while trying to get a simple pubsub setup working, but I can't seem to get the emulator to receive messages.
I have 2 functions in my functions/index.js:
const functions = require('firebase-functions');

const PROJECT_ID = 'my-example-pubsub-project';
const TOPIC_NAME = 'MY_TEST_TOPIC';

// receive messages on the topic (the export name here is arbitrary)
exports.testTopicReceive = functions.pubsub
  .topic(TOPIC_NAME)
  .onPublish((message, context) => {
    console.log(`got new message!!! ${JSON.stringify(message, null, 2)}`);
    return true;
  });

// publish a message to the topic
exports.httpTestPublish = functions.https.onRequest(async (req, res) => {
  const { v1 } = require('@google-cloud/pubsub');

  const publisherClient = new v1.PublisherClient({
    projectId: process.env.GCLOUD_PROJECT,
  });
  const formattedTopic = publisherClient.projectTopicPath(PROJECT_ID, TOPIC_NAME);

  // Publishes the message as a JSON object
  const data = JSON.stringify({ hello: 'world!' });
  const dataBuffer = Buffer.from(data);
  const messagesElement = {
    data: dataBuffer,
  };
  const messages = [messagesElement];

  // Build the request
  const request = {
    topic: formattedTopic,
    messages: messages,
  };

  return publisherClient
    .publish(request)
    .then(([responses]) => {
      console.log(`published(${responses.messageIds}) `);
      res.send(200);
    })
    .catch((ex) => {
      console.error(`ERROR: ${ex.message}`);
      res.send(555);
      throw ex; // be sure to fail the function
    });
});
When I run firebase emulators:start --only functions,firestore,pubsub and then run the HTTP method with wget -Sv -Ooutput.txt --method=GET http://localhost:5001/my-example-pubsub-project/us-central1/httpTestPublish, the HTTP function runs and I see its console output, but I can't seem to ever get the .onPublish() to run.
I notice that if I mess around with the values for v1.PublisherClient({projectId: PROJECT_ID}), then I will get a message showing up in the GCP cloud instance of the Subscription...but that's exactly what I don't want happening :)

Import csv file and send to backend

I am trying to create a redux-react app where users can import a CSV file that is later stored in a database. Right now I am working on the frontend, where I want the user to be able to choose a CSV file from their computer and have it sent to the backend. I have therefore used csvReader to read the CSV file, but I don't know how to send the data to the backend. I am using NestJS in the backend. I want to send the whole CSV file in one go, but I don't know how to tackle the problem. I am a beginner :))) Do you know how to solve my problem?
I can't help you with the React part, but maybe this NestJS part can help you. You can use multer to configure your API and set a storage path.
Create multer options
// multer.ts
import { HttpException, HttpStatus } from '@nestjs/common';
import { existsSync, mkdirSync } from 'fs';
import { diskStorage } from 'multer';
import { extname } from 'path';

const excelMimeTypes = [
  'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet',
  'application/wps-office.xlsx',
  'application/vnd.ms-excel',
  'text/csv', // add this to also accept CSV uploads, as in the question
];

export const multerOptions = {
  fileFilter: (req: any, file: any, cb: any) => {
    const mimeType = excelMimeTypes.find(im => im === file.mimetype);
    if (mimeType) {
      cb(null, true);
    } else {
      cb(new HttpException(`Unsupported file type ${extname(file.originalname)}`, HttpStatus.BAD_REQUEST), false);
    }
  },
  storage: diskStorage({
    destination: (req: any, file: any, cb: any) => {
      const uploadPath = '/upload'; // use an env var in real code
      if (!existsSync(uploadPath)) {
        mkdirSync(uploadPath); // create the directory if it does not exist
      }
      cb(null, uploadPath);
    },
    filename: (req: any, file: any, cb: any) => {
      cb(null, file.originalname);
    },
  }),
};
Import the multerOptions just created, and use the FileInterceptor and UploadedFile decorators to get the file.
import { Post, UploadedFile, UseInterceptors } from '@nestjs/common';
import { FileInterceptor } from '@nestjs/platform-express';

@Post()
@UseInterceptors(FileInterceptor('file', multerOptions))
uploadFile(@UploadedFile() file) {
  console.log(file); // call a service or whatever should manage the uploaded file, e.g. handleFile in the example below
}
Handle the file (example) using the xlsx library.
// note: uploadLocation and sheetName are assumed to be defined elsewhere
handleFile(file: any): Promise<any> {
  try {
    const workbook = XLSX.readFile(`${uploadLocation}/${file.filename}`);
    return Promise.resolve(workbook.Sheets[sheetName]);
  } catch (error) {
    return Promise.reject(error);
  }
}
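For the React side that the answer skips, here is a minimal sketch of posting the chosen file to such an endpoint with FormData (the /upload URL, port, and handler name are assumptions, not part of the answer above):

// hypothetical change handler for an <input type="file"> element
function onFileSelected(event) {
  const file = event.target.files[0];
  const formData = new FormData();
  formData.append('file', file); // field name must match FileInterceptor('file', ...)

  fetch('http://localhost:3000/upload', { // assumed endpoint URL
    method: 'POST',
    body: formData, // the browser sets the multipart/form-data boundary itself
  })
    .then(response => response.json())
    .then(result => console.log('uploaded:', result))
    .catch(error => console.error('upload failed:', error));
}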
I hope it helps!

Angular HTTPClient (HTTP) requests pending forever

I have recently started working with MySQL as the database for my Angular/NodeJS project (I had been using MongoDB all along). Nonetheless, I'm encountering issues when handling HTTP requests. I have experimented with GET and POST requests so far: GET stays pending forever until it fails, and POST likewise never reaches the backend or the database. I really hadn't changed the backend configuration from the one I used with the MongoDB database, except for the queries, of course.
I have tried debugging the backend to check whether the server is actually running, and everything was okay. But when it came to requests reaching the specified endpoints, they were always pending. I also tried to log to the console when a request arrived at a certain endpoint, but nothing was logged, unfortunately.
server.js
const app = require("./backend/app");
const debug = require("debug")("node-angular");
const http = require("http");

const normalisePort = setPort => {
  const port = parseInt(setPort, 10);
  if (isNaN(port)) return setPort;
  if (port >= 0) return port;
  return false;
};

const port = normalisePort(process.env.PORT || "8000");
const server = http.createServer(app);

const error = error => {
  if (error.syscall !== "listen") {
    throw error;
  }
  const bind = typeof port === "string" ? "pipe " + port : "port " + port;
  switch (error.code) {
    case "EACCES":
      console.error(bind + " requires elevated privileges");
      process.exit(1);
      break;
    case "EADDRINUSE":
      console.error(bind + " is already in use");
      process.exit(1);
      break;
    default:
      throw error;
  }
};

const listening = () => {
  const address = server.address();
  const bind = typeof port === "string" ? "pipe " + address : "port " + port;
  debug.enabled = true;
  debug("Listening on " + bind);
};

app.set("port", port);
server.on("error", error);
server.on("listening", listening);
server.listen(port, "localhost");
app.js
const express = require("express");
const bodyParser = require("body-parser");
const cors = require("cors");
const users = require("./routes/users");

const app = express();

app.use(cors);
app.use(bodyParser.json());
app.use(
  bodyParser.urlencoded({
    extended: false
  })
);
app.use((req, res, next) => {
  res.setHeader("Access-Control-Allow-Origin", "*");
  res.setHeader(
    "Access-Control-Allow-Headers",
    "Origin, X-Requested-With, Authorization, Content-Type, Accept"
  );
  res.setHeader(
    "Access-Control-Allow-Methods",
    "GET, POST, PATCH, DELETE, OPTIONS"
  );
  next();
});
app.get("/api/users", users);

module.exports = app;
users.js
const express = require("express");
const router = express.Router();
const db = require("../sql-connection");

router.get("", (req, res, next) => {
  db.query("select * from users;", (error, results, fields) => {
    if (results.length > 0) {
      return res.status(200).send(results);
    } else {
      return res.status(404).send();
    }
  });
});

module.exports = router;
sql-connection.js
const mysql = require("mysql");

const sqlConnection = mysql.createConnection({
  host: "localhost",
  user: "root",
  password: "",
  database: "payroll"
});

sqlConnection.connect(error => {
  if (error) throw error;
  console.log("connected to database");
});

module.exports = sqlConnection;
auth.service.ts
export class AuthService {
  private _BASE_URL: string = "http://localhost:8000/api";

  constructor(private http: HttpClient) {}

  public get users(): Observable<any> {
    return this.http.get(this._BASE_URL + "/users");
  }
}
signup.component.ts
export class SignUpComponent {
  constructor(private _authService: AuthService) {}

  public onSignUp(): void {
    this._authService
      .users()
      .subscribe(data => (data ? console.log(data) : console.log("no data")));
  }
}
When subscribed to the users observable, data from the backend should be logged to the console if present; otherwise, 'no data' is logged. Unfortunately, this request stays pending forever. However, if I don't subscribe to users, no request is sent/seen under the network tab in dev tools.
I've been using a MySQL database and I would recommend mysql2 over mysql.
mysql2 provides promise-based syntax on top of the conventional callback methods.
Here's the documentation for mysql2 for Node.js.
Coming to the problem, I guess it might be because Node.js is asynchronous while you're using a synchronous approach in setting up the API.
Also, when you're working with asynchronous programming, use try-catch-finally instead of conventional if-else statements to handle and log errors.
So you can use async (req, res, next) => { /* your code here */ } rather than just (req, res, next) => { /* your code here */ }.
You also have to await the SQL query, i.e. await db.query(...); or rather, with mysql2 it is easier to use const [data] = await pool.execute(query, [params]).
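A rough sketch of the question's route rewritten that way, assuming a mysql2 promise pool (the pool options just mirror the question's mysql config):

// sql-connection.js (sketch, using mysql2's promise API)
const mysql = require("mysql2/promise");

const pool = mysql.createPool({
  host: "localhost",
  user: "root",
  password: "",
  database: "payroll"
});

module.exports = pool;

// users.js (sketch)
const express = require("express");
const router = express.Router();
const pool = require("../sql-connection");

router.get("", async (req, res, next) => {
  try {
    const [rows] = await pool.execute("select * from users;");
    if (rows.length > 0) return res.status(200).send(rows);
    return res.status(404).send();
  } catch (error) {
    next(error); // hand errors to the Express error handler
  }
});

module.exports = router;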

Error handler ignored when NODE_ENV=production

I am building a simple REST API with Node/Express, and I'm having a hard time when I deploy it to production. When NODE_ENV=development, everything works as expected. I get back the JSON error and the correct status code. When NODE_ENV=production, I only get back an HTML page with the default error message and nothing else. I can read the status code, but I need to have access to the full JSON payload to identify the errors better. This is my code:
import Promise from 'bluebird'; // eslint-disable-line no-unused-vars
import express from 'express';
import config from './config';
import routes from './routes';
import { errorMiddleware, notFoundMiddleware } from './middlewares/error.middleware';
import mongoose from './config/mongoose.config';

// create app
const app = express();

(async () => {
  // connect to mongoose
  await mongoose.connect();

  // pretty print on dev
  if (process.env.NODE_ENV !== 'production') {
    app.set('json spaces', 2);
  }

  // apply express middlewares
  app.use(express.json());

  // register v1 routes
  app.use('/v1', routes);

  // catch errors
  app.use(notFoundMiddleware);
  app.use(errorMiddleware);

  // start server
  app.listen(config.port, () => console.info(`server started on port ${config.port}`));
})();

export default app;
This is the notFoundMiddleware:
export default (req, res, next) => next(new Error('Not Found'));
This is the errorMiddleware:
const errorMiddleware = (err, req, res, next) => {
  console.log('test'); // this works in development, but not in production

  const error = {
    status: err.status,
    message: err.message
  };

  if (err.errors) {
    error.errors = err.errors;
  }

  if (process.env.NODE_ENV !== 'production' && err.stack) {
    error.stack = err.stack;
  }

  return res.status(error.status || 500).send({ error });
};
If you are running on a production server, try using a logging provider like Papertrail to see the errors occurring in your app.
I've just stumbled upon the same problem. It turned out to be caused by a transpiler optimization applied when building the production bundle - this one: https://babeljs.io/docs/en/babel-plugin-minify-dead-code-elimination
Express error handlers must have the signature (err, req, res, next) => { ... } (i.e. be of arity 4). In your example, next is not used anywhere in the errorMiddleware function body, so it gets eliminated (optimized out) of the function signature in the production code.
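To illustrate, a made-up before/after of what that dead-code elimination does to the signature (the names beforeMinify/afterMinify are just labels for this sketch):

// what you wrote - arity 4, so Express registers it as an error handler
const beforeMinify = (err, req, res, next) => { /* ... */ };

// what the minifier emits - the unused next is dropped, arity 3,
// so Express treats it as a regular middleware and it never receives errors
const afterMinify = (err, req, res) => { /* ... */ };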
Solution:
use the keepFnArgs: true plugin option - possibly through the https://webpack.js.org/plugins/babel-minify-webpack-plugin/ webpack configuration:
var MinifyPlugin = require("babel-minify-webpack-plugin");

module.exports = {
  // ...
  optimization: {
    minimizer: [
      new MinifyPlugin({
        deadcode: {
          keepFnArgs: true,
        },
      }, {}),
    ],
  },
  // ...
};
or alternatively pretend in your code that this argument is used:
const errMiddleware = (err, req, res, _next) => {
  // ... your code ...
  // ...
  // cheat here:
  _next;
};

From ES2018 async/await to ES2015 Promises ... timeout

I am trying to convert an ES2018 async function into an ES2015 (ES6) function, but I get a timeout. I guess my ES2015 version is wrong... but where?
ES2018 version
async function connectGoogleAPI () {
  // Create a new JWT client using the key file downloaded from the Google Developer Console
  const client = await google.auth.getClient({
    keyFile: path.join(__dirname, 'service-key.json'),
    scopes: 'https://www.googleapis.com/auth/drive.readonly'
  });

  // Obtain a new drive client, making sure you pass along the auth client
  const drive = google.drive({
    version: 'v2',
    auth: client
  });

  // Make an authorized request to list Drive files.
  const res = await drive.files.list();
  console.log(res.data);

  return res.data;
}
ES2015 version w/Promise
function connectGoogleAPI () {
  return new Promise((resolve, reject) => {
    const authClient = google.auth.getClient({
      keyFile: path.join(__dirname, 'service-key.json'),
      scopes: 'https://www.googleapis.com/auth/drive.readonly'
    });
    google.drive({
      version: 'v2',
      auth: authClient
    }), (err, response) => {
      if (err) {
        reject(err);
      } else {
        resolve(response);
      }
    }
  });
}
You haven't translated the await of getClient. Remember, await = then (roughly). You're also falling prey to the promise creation anti-pattern: When you already have a promise (from getClient), you almost never need to use new Promise. Just use then.
Here's an example with each await converted into a then, using the chain for the subsequent operations:
function connectGoogleAPI () {
  // Create a new JWT client using the key file downloaded from the Google Developer Console
  return google.auth.getClient({
    keyFile: path.join(__dirname, 'service-key.json'),
    scopes: 'https://www.googleapis.com/auth/drive.readonly'
  }).then(client => {
    // Obtain a new drive client, making sure you pass along the auth client
    const drive = google.drive({
      version: 'v2',
      auth: client
    });
    // Make an authorized request to list Drive files.
    return drive.files.list();
  }).then(res => {
    console.log(res.data);
    return res.data;
  });
}
That last part can be just
}).then(res => res.data);
...if you remove the console.log. (Or we could abuse the comma operator.)
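(A sketch of that comma-operator abuse, purely for illustration: the comma operator evaluates the log, then yields res.data.)

}).then(res => (console.log(res.data), res.data));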
Notes:
Each await needs to become a then handler (there were two in the original, awaiting getClient and drive.files.list)
In a then handler, if you have to wait for another promise (such as the one from drive.files.list) you typically return it from the handler, and then use another handler to handle that result (which is why I have return drive.files.list() and then a separate handler for converting res to res.data)
Re that second point: Sometimes nesting is appropriate, such as when you need to combine the result with some intermediate value you only have within your then handler. (For instance, if we wanted to combine res.data with client.) But generally, prefer not to nest.
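For instance, a sketch of that nested case, combining client with res.data (illustration only, building on the function above):

function connectGoogleAPI () {
  return google.auth.getClient({
    keyFile: path.join(__dirname, 'service-key.json'),
    scopes: 'https://www.googleapis.com/auth/drive.readonly'
  }).then(client => {
    const drive = google.drive({ version: 'v2', auth: client });
    // Nested then: we still need `client` alongside the listing result
    return drive.files.list().then(res => ({ client, files: res.data }));
  });
}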