After two weeks of trying to run my site, I'm asking for your help.
Has anyone hosted Sails.JS on PlanetHoster?
My queries don't work because the connection to the database doesn't seem to be established.
Here's an example of some very simple queries:
await User.findOne({ email: email });
Here's what's displayed in the browser error console:
Uncaught (in promise) Error: Request failed with status code 500
I've tried to handle the errors but nothing is displayed...
try { await User.findOne({ email: email }); } catch (err) { console.log(err); // nothing is logged }
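For context, the query runs inside a Sails action that looks roughly like this (a simplified sketch, not my exact code; the file and action names are placeholders):

// api/controllers/user/login.js (simplified sketch; file and action names are placeholders)
module.exports = async function login(req, res) {
  try {
    var user = await User.findOne({ email: req.param('email') });
    return res.json({ user: user });
  } catch (err) {
    // These would go to the server logs, which I can't read on PlanetHoster
    sails.log.error(err);
    return res.serverError(err);
  }
};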
So I've deduced that it's a problem with the database call.
Unfortunately, I have no way to read the error logs ...
Yet I've set up the production config file (config/env/production.js), and when I run NODE_ENV=production node app.js, the app still reports that it's running in development. In fact, PlanetHoster doesn't require running the command sails lift; it just starts the app on its own...
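For reference, the datastore section of a Sails v1 production config generally looks something like this (the adapter and connection URL below are placeholders, not my real values):

// config/env/production.js (sketch; adapter and URL are placeholders)
module.exports = {
  datastores: {
    default: {
      adapter: 'sails-mysql',
      url: 'mysql://user:password@host:3306/database',
    },
  },
  models: {
    migrate: 'safe',
  },
};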
I'm currently at a complete loss as to where to go from here, so if you have any suggestions, I'll take them gladly.
Thank you
Environment: Sails v1.0.2
Related
I am trying to integrate Deepnote and Notion. I already connected the database with the integration in Notion and added the environment variables correctly, but when executing node index.js as suggested in Notion's documentation for integrations, I get this error:
@notionhq/client warn: request fail {
code: 'object_not_found',
message: 'Could not find database with ID: ********************************. Make sure the relevant pages and databases are shared with your integration.'
}
The database is correctly shared with the Deepnote integration: check this screenshot.
Any clues? Thanks so much <3
I am trying to connect Notion as a database for Deepnote graphics so they can then be iframed back into Notion.
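For reference, the failing call in index.js is essentially a database query along these lines (a sketch, not my exact code; NOTION_TOKEN and NOTION_DATABASE_ID are placeholder environment variable names):

// index.js (sketch; environment variable names are placeholders)
const { Client } = require("@notionhq/client");

const notion = new Client({ auth: process.env.NOTION_TOKEN });

(async () => {
  // This is the call that fails with object_not_found unless the database
  // is explicitly shared with the integration.
  const response = await notion.databases.query({
    database_id: process.env.NOTION_DATABASE_ID,
  });
  console.log(response.results);
})();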
I'm trying to deploy a Google Cloud Function, using the code at the GitHub link below, to back up Datastore.
https://github.com/portsoc/cloud-simple-datastore-backup/blob/master/index.js
After updating the variable BUCKET_NAME with my Cloud Storage bucket name, I can run it in Cloud Shell with the command node index.js and it backs up Datastore successfully.
But when I then run the command below to deploy it:
gcloud functions deploy main \
  --runtime nodejs12 --trigger-http --allow-unauthenticated \
  --region=asia-southeast2
After a while, it fails with the error below:
Deploying function (may take a while - up to 2 minutes)...failed.
ERROR: (gcloud.functions.deploy) OperationError: code=3, message=Function failed on loading user code. This is likely due to a bug in the user code. Error message: Error: please examine your function logs to see the error cause: https://cloud.google.com/functions/docs/monitoring/logging#viewing_logs. Additional troubleshooting documentation can be found at https://cloud.google.com/functions/docs/troubleshooting#logging. Please visit https://cloud.google.com/functions/docs/troubleshooting for in-depth troubleshooting documentation.
Any suggestions on this?
Cloud Functions have a specific set of signatures that must be used.
I'm less familiar with JavaScript/Node.js, but I think the function you reference is intended to be invoked directly (as you do with node index.js or similar), and this is incompatible with Cloud Functions.
Please review Write Cloud Functions to understand the signature type that you will need. You will probably have to tweak the authentication in the example to better meet your needs too.
Almost certainly you don't want --allow-unauthenticated either.
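For an HTTP trigger, the shape is roughly this (a minimal sketch; backupDatastore is just a placeholder name). Also note that, as far as I recall, gcloud uses the name you pass to functions deploy as the entry point unless you override it with --entry-point, so the exported name has to line up:

// index.js (minimal sketch; the exported name must match the deployed entry point)
exports.backupDatastore = async (req, res) => {
  // ...do the export/backup work here...
  res.status(200).send("Backup started");
};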
After changing my code to the version below, I can deploy it.
const { GoogleAuth } = require("google-auth-library");

// Fill in your bucket name here:
const BUCKET_NAME = "gs://testinbbk10";

// HTTP-triggered Cloud Function entry point
exports.myfunction = async (req, res) => {
  try {
    // Authenticate with the runtime's default credentials
    const auth = new GoogleAuth({
      scopes: "https://www.googleapis.com/auth/cloud-platform",
    });
    const client = await auth.getClient();
    const projectId = await auth.getProjectId();
    console.log(`Project ID is ${projectId}`);

    // Ask the Datastore admin API to export entities to the bucket
    const res2 = await client.request({
      method: "POST",
      url: `https://datastore.googleapis.com/v1/projects/${projectId}:export`,
      data: {
        outputUrlPrefix: BUCKET_NAME,
      },
    });
    console.error("RESPONSE:");
    console.log(res2.data);
  } catch (error) {
    console.error("ERROR");
    console.error(error);
  }
};
But when I try to access the link provided after deployment, it shows me the error: could not handle the request.
I am confused about how to properly deploy this as a Google Cloud Function. I just want to deploy one simple Cloud Function to back up Datastore in Google Cloud.
I'm using PouchDB 7.0.0 in an Ionic project (Ionic 4.0.5).
Within a provider, I define both a local and a remote database:
import { Injectable } from '@angular/core';
import PouchDB from 'pouchdb';

@Injectable()
export class DatabaseProvider {
  db: any;
  remote: any;

  constructor() {
    this.db = new PouchDB("mydb");
    this.remote = new PouchDB("http://<my_server_running_couchdb>/<remote_db_name>");
  }
The local database lives in the Chrome browser as an IndexedDB instance. However, the problem also occurs in Firefox, so it does not look like the browser is to blame.
The remote database is initially empty and runs on CouchDB 2.1.2. It has already been created on my server with no admin or member set up, so it should be public and allow unauthenticated requests. CORS is enabled as well.
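One way to sanity-check that (a sketch; host and database name are placeholders) is to hit the database URL directly and see whether CouchDB answers with its info document:

// Quick check from the browser console (host and db name are placeholders)
fetch("http://<my_server_running_couchdb>/<remote_db_name>")
  .then(r => r.json())
  .then(info => console.log(info))    // expecting something like { db_name: ..., doc_count: ..., ... }
  .catch(err => console.error(err));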
In the same provider I also define a method that triggers a replication from the local db to the remote node:
replicateLocalDBToRemote() {
  console.log("Replicating database...");
  this.db.replicate.to(this.remote).then(() => {
    console.log("Celebrate");
  }).catch(error => {
    console.error(error);
  });
}
And here is what the call to replicateLocalDBToRemote throws at me:
CustomPouchError {__zone_symbol__currentTask: e, result: {…}}
result:
doc_write_failures: 0
docs_read: 0
docs_written: 0
end_time: "2018-11-21T16:23:36.974Z"
errors: []
last_seq: 0
ok: false
start_time: "2018-11-21T16:23:36.874Z"
status: "aborting"
and I am afraid I can't call this a self-explanatory message.
Any guess on what might be the root cause of the issue?
EDIT: After crawling through the PouchDB repo on GitHub, I found this entry, which might refer to the same problem.
I fixed the problem by allowing traffic through port 5984 on my remote CouchDB server.
The thing is, sending requests on port 80 (i.e. GET http://<my_server>.com/mydb) does send back some data, so I never bothered to try port 5984 in the first place because I thought the API was also exposed on port 80...
So at least my issue had nothing to do with PouchDB but I wish the error message was a bit more specific.
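In practice that also means the remote handle should point at CouchDB's port explicitly, something like this (host and database name are placeholders):

// Remote DB on CouchDB's default port 5984 (host and db name are placeholders)
this.remote = new PouchDB("http://<my_server_running_couchdb>:5984/<remote_db_name>");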
So, I've developed a website (HTML) that has an embedded payment form from Stripe called Checkout. When you visit the website, it prompts you to enter your credit card information, so the checkout form is working correctly.
The issue I'm having is processing the token once it's created.
I'm extremely new to web development and I've never written server code before, so please bear with me.
I've been following guides (Process payments with Node, Vue, Stripe & How to set up Stripe payments with Node.js) and Stripe's documentation on tokenization to create charges using server-side code (Stripe Checkout).
I understand that I have to have Heroku set up to process the charges so I created an account and set up an app from my terminal. I made a new directory that has the modules required (stripe, express, and bodyParser) and I have this code in my server.js file:
It deploys to Heroku successfully but crashes. This is what is being returned in the console:
What am I doing wrong? Any assistance would be a great help.
You are missing a vital piece:
// Start the server
app.listen(port, function () {
  console.log('Server listening on port ' + port);
});
You don't seem to start the server in your application. This should go at the bottom of server.js. You also have to remember to set the port:
var port = process.env.PORT || 3000;
It goes above app.listen of course.
I can't tell for sure if that will fix all your errors, but you have to start by getting the server running first.
Also, remember to check for errors in callbacks. In the callback for create, you are not doing that. For example:
if (err) {
  console.error(err);
  res.json({ error: err, charge: false });
} else {
  // send response with charge data
  res.json({ error: false, charge: charge });
}
You are doing res.send() whether or not there are errors. I doubt that this has anything to do with the Heroku error though.
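To tie the pieces together, here is a minimal sketch of the overall shape server.js could take (just a sketch under my assumptions about your setup: the /charge route, the sk_test_... placeholder key, the stripeToken field, and the amount/currency values are made up and need to match your own code):

// server.js (minimal sketch; route, key, token field, amount and currency are placeholders)
var express = require('express');
var bodyParser = require('body-parser');
var stripe = require('stripe')('sk_test_your_secret_key_here');

var app = express();
var port = process.env.PORT || 3000;

app.use(bodyParser.json());
app.use(bodyParser.urlencoded({ extended: true }));

// Checkout posts the token here
app.post('/charge', function (req, res) {
  stripe.charges.create({
    amount: 2000,              // in the smallest currency unit, e.g. cents
    currency: 'usd',
    source: req.body.stripeToken,
    description: 'Example charge'
  }, function (err, charge) {
    if (err) {
      console.error(err);
      res.json({ error: err, charge: false });
    } else {
      res.json({ error: false, charge: charge });
    }
  });
});

// Start the server
app.listen(port, function () {
  console.log('Server listening on port ' + port);
});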
In my Node.js app, I am trying to connect to a MySQL database hosted on Amazon.
$ npm install mysql
My code looks something like this:
var mysql = require('mysql');
var connection = mysql.createConnection({
  host     : 'my amazon sql db',
  user     : 'me',
  password : 'secret',
  database : 'my_db'
});
connection.connect();
connection.query('SELECT 1 + 1 AS solution', function (err, rows, fields) {
  if (err) throw err;
  console.log('The solution is: ', rows[0].solution);
});
connection.end();
I can connect to my MySQL DB using Workbench, so I am pretty sure my credentials are okay.
When I attempt to connect I get the following error:
Connection.js:91 Uncaught TypeError: Net.createConnection is not a function
Debugging the code from the npm library, this is where the error is thrown in connection.js:
this._socket = (this.config.socketPath)
  ? Net.createConnection(this.config.socketPath)
  : Net.createConnection(this.config.port, this.config.host);
connection.js has this dependency:
var Net = require('net');
I am running Node.js locally on my Windows computer.
Can anyone tell me what could be causing this error?
Created a separate ticket:
Error thrown calling Node.js net.createConnection
The net module required and used in the MySQL node module is a core part of Node.js itself. The error you're getting about Net.createConnection not being a function means it's coming up as an empty object, and the cause is related to one of your comments on the question:
I am testing my code within a browser.
You must run this particular module on Node.js only; you can't run it in a web browser.
You might think a possibility would be to run your code through a bundler like Browserify or webpack so you can easily require('mysql') in your browser, but it won't work. The net module, which is a core dependency of the mysql module, will be transformed into an empty object {}.
That's not a bug; it's how it's supposed to work. Browsers don't have a generic TCP implementation, so it can't be emulated. The empty object is intended to prevent require('net') from failing in modules that otherwise work in the browser.
To avoid this error, you need to run this code in a pure Node.js environment, not in a browser. A simple server could serve this purpose: this code can't work in the client's browser, and even if it could, it would add a security hole, since everything client-side can be manipulated and is therefore not secure. You don't want to expose your database to the client side; you only want to consume it from there.
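For example, a tiny Express endpoint that runs the query on the server and returns JSON to the browser could look like this (a sketch; the route name and the credentials shown are placeholders taken from your snippet):

// server.js (sketch; route and credentials are placeholders)
var express = require('express');
var mysql = require('mysql');

var app = express();

var connection = mysql.createConnection({
  host     : 'my amazon sql db',
  user     : 'me',
  password : 'secret',
  database : 'my_db'
});

app.get('/solution', function (req, res) {
  connection.query('SELECT 1 + 1 AS solution', function (err, rows) {
    if (err) {
      return res.status(500).json({ error: err.message });
    }
    res.json({ solution: rows[0].solution });
  });
});

app.listen(process.env.PORT || 3000);

The browser then calls GET /solution over HTTP instead of requiring the mysql module directly.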