I have three different files:
app.js, DataBase.js, and client.html.
In app.js I start my server with Node.js / Express and open a websocket (see below). In DataBase.js I create a connection to my MySQL database.
I was able to send the data from the database to my client page like this:
db = require('./mysql');

io.on('connection', function (WebSocket) {
    WebSocket.on('new_data', function (name) {
        WebSocket.emit('data_from_the_database', db.DataexportFromDataBase);
    });
});
But now: the user should send app.js the variable "name", which should be used in the query from dataexportFromDataBase.
How can I send the variable to the database file? With socket.io?
i tried:
Including the database connection in app.js worked; I can handle that, but I want three different files (server / database with the queries / client). I guess my problem is here:
WebSocket.emit('data_from_the_database', 'db.DataExportFromDataBase');
Need help :)
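One common way to pass the variable along would be for DataBase.js to export a function that takes `name`, rather than a precomputed value. A hypothetical sketch (the table and column names are made up; the connection is injected so the file stays separate from app.js, where in the real app it would come from mysql.createConnection):

```javascript
// DataBase.js -- sketch: export a factory that wraps a connection and
// exposes a query function taking `name` plus a callback.
function makeDb(connection) {
    return {
        dataExportFromDataBase: function (name, cb) {
            // Parameterized query: the mysql driver escapes `name` for you.
            connection.query(
                'SELECT * FROM my_table WHERE name = ?',
                [name],
                cb
            );
        }
    };
}

module.exports = makeDb;
```

app.js could then call `db.dataExportFromDataBase(name, function (err, rows) { WebSocket.emit('data_from_the_database', rows); })` inside the 'new_data' handler, so the value from the client flows straight into the query.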
Related
I'm new to using Svelte and would like to create an ordering website with it. I know that I will need a database to keep track of the orders, customer names, prices, etc. I have used MySQL before, but I haven't learned how to connect a database to a website.
Is there a specific database you have to use if you are using Svelte?
Or is there a way to connect MySQL to Svelte?
I have searched for this on YouTube and Google, but I'm not sure whether it's different when you are using Svelte, so I wanted to make sure.
Note: I have not started this project yet, so I do not have any code to show; I just want to know how you can connect a database if you're using Svelte.
Svelte is a front-end JavaScript framework that runs in the browser.
Traditionally, in order to use a database like MySQL from a front-end project such as a Svelte app (which contains only HTML, CSS, and JS), you would have to do it through a separate backend project. The Svelte app and the backend then communicate with the help of a REST API. The same applies to other front-end libraries/frameworks like React, Angular, Vue, etc.
There are still many ways to achieve this. Since you are focusing on Svelte, here are a few options:
1. Sapper
Sapper is an application framework powered by Svelte. You can also write backend code using Express or Polka, so you can connect to the database of your choice (MySQL / MongoDB).
2. Use a serverless database
If you want to keep your app simple and just focus on the Svelte side, you can use a cloud-based database such as Firebase. Svelte can talk to it directly via its JavaScript SDK.
3. Monolithic architecture
To connect to MySQL in the backend, you would need a server-side language such as Node.js (Express), PHP, Python, or whatever you are familiar with. You can then embed the Svelte app, or use an API to pass data to it.
I can give an example with MongoDB.
You have to install the library:
npm install mongodb
or add it in package.json.
Then you have to make a connection file that you call every time you need to use the db:
import mongo from "mongodb";

let client = null;
let db = null;

export async function init() {
    if (!client) {
        client = await mongo.MongoClient.connect("mongodb://localhost");
        db = client.db("name-of-your-db");
    }
    return { client, db };
}
for a complete example with insert you can see this video
https://www.youtube.com/watch?v=Mey2KZDog_A
You can use PouchDB, which gives you direct access to IndexedDB in the browser. No backend is needed for this.
The client-side PouchDB can then be replicated/synced with a remote CouchDB. This can all be done inside your Svelte app, from the client side.
It is pretty easy to set up:
var db = new PouchDB('dbname');

db.put({
    _id: 'dave#gmail.com',
    name: 'David',
    age: 69
});

db.changes().on('change', function() {
    console.log('Ch-Ch-Changes');
});

db.replicate.to('http://example.com/mydb');
More on pouchdb.com.
The client can also save the data offline first and connect to a remote database later.
As I read it, the question is mostly about the connection to a backend, not a database. It is a pity, but the Svelte app template has no way to connect to a backend "out of the box".
For my part, I'm using Express middleware in front of the Rollup server. This way you are able to proxy some requests to a backend server. Check the code below:
const proxy = require('express-http-proxy');
const app = require('express')();

app.use('/data/', proxy(
    'http://backend/data',
    {
        proxyReqPathResolver: req => {
            return '/data' + req.url;
        }
    }
));

app.use('/', proxy('http://127.0.0.1:5000'));
app.listen(5001);
app.listen(5001);
This script opens port 5001, where any /data/ URL is proxied to the backend server, while port 5000 is still served by the Rollup server. So at http://localhost:5001/ you have the Svelte instance connected to the backend via the /data/ URL, and there you can send requests to fetch data from the database.
I am a beginner GCP administrator. I have several applications running on one instance, and each application has its own database. I set up automatic instance backups via the GCP GUI.
I would like to prepare for a possible failure of one of the applications, i.e. of one database. I would like to prepare a procedure for restoring such a database, but in the GCP GUI there is no option to restore a single database; I would have to restore the entire instance, which I cannot do because the other applications run on the same instance.
I also read in the documentation that a backup cannot be exported.
Is there any way to restore only one database from the entire instance backup?
Will I have to write a MySQL script that backs up each database separately and saves it to Cloud Storage?
As Daniel mentioned, you can use gcloud sql export/import to do this. You'll also need a Google Cloud Storage bucket.
First export a database to a file
gcloud sql export sql [instance-name] [gs://path-to-export-file.gz] --database=[database-name]
Create an empty database
gcloud sql databases create [new-database-name] --instance=[instance-name]
Use the export file to populate your fresh, empty database.
gcloud sql import sql [instance-name] [gs://path-to-export-file.gz] --database=[database-name]
I'm also a beginner here, but as an alternative, I think you could do the following:
Create a new instance with the same configuration
Restore the original backup into the new instance (this is possible)
Create a dump of the one database that you are interested in
Finally, import that dump into the production instance
In this way, you avoid messing around with data exports, limit the dump operation to the unlikely case of a restore, and save money on database instances.
Curious what people think about this approach?
As of now there is no way to restore only one database from an entire instance backup. As you can check in the documentation, the rest of the applications will also experience downtime (since the target instance will be unavailable for connections and existing connections will be lost).
Since there is no built-in method to restore only one database from an entire instance backup, you are correct: writing a MySQL script to back up each database separately, using import and export operations, is the way to go (here is the relevant documentation regarding import and export operations in the Cloud SQL MySQL context).
But from an implementation point of view I would recommend using a separate Cloud SQL instance for each application; then you could restore the database of one failing application without causing downtime or issues for the rest.
I see that the topic has been raised again. Below is a description of how I solved the problem of backing up individual databases from one instance, without using the built-in instance backup mechanism in GCP, and uploading them to Cloud Storage.
To solve the problem, I used Google Cloud Functions written in Node.js 8.
Here is step by step solution:
Create a Cloud Storage Bucket.
Create Cloud Function using Node.js 8.
Edit the code below to match your instance and database parameters:
const { google } = require("googleapis");
const { auth } = require("google-auth-library");

var sqladmin = google.sqladmin("v1beta4");

exports.exportDatabase = (_req, res) => {
  async function doBackup() {
    const authRes = await auth.getApplicationDefault();
    let authClient = authRes.credential;
    var request = {
      // Project ID
      project: "",
      // Cloud SQL instance ID
      instance: "",
      resource: {
        // Contains details about the export operation.
        exportContext: {
          // This is always sql#exportContext.
          kind: "sql#exportContext",
          // The file type for the specified uri (e.g. SQL or CSV)
          fileType: "SQL",
          /**
           * The path to the file in GCS where the export will be stored.
           * The URI is in the form gs://bucketName/fileName.
           * If the file already exists, the operation fails.
           * If fileType is SQL and the filename ends with .gz, the contents are compressed.
           */
          uri: ``,
          /**
           * Databases from which the export is made.
           * If fileType is SQL and no database is specified, all databases are exported.
           * If fileType is CSV, you can optionally specify at most one database to export.
           * If csvExportOptions.selectQuery also specifies the database, this field will be ignored.
           */
          databases: [""]
        }
      },
      // Auth client
      auth: authClient
    };

    // Kick off the export with the requested arguments.
    sqladmin.instances.export(request, function (err, result) {
      if (err) {
        console.log(err);
      } else {
        console.log(result);
      }
      res.status(200).send("Command completed");
    });
  }

  doBackup();
};
Save and deploy this Cloud Function
Copy the Trigger URL from configuration page of Cloud function.
In order for the function to run automatically at a specified frequency, use Cloud Scheduler: Description: "", Frequency: use unix-cron format, Time zone: choose yours, Target: HTTP, URL: paste the trigger URL copied before, HTTP method: POST.
That's all, it should work fine.
I am trying to create a Node.js app to automatically update a webpage every few seconds with new data from a MySQL database. I have followed the information on this site: http://www.gianlucaguarini.com/blog/push-notification-server-streaming-on-a-mysql-database/
The code on this site does indeed work, but upon further testing it keeps running the "handler" function, and therefore executing the readFile call, for each row of the database processed.
I am in the process of learning Node.js, but cannot understand why the handler function keeps getting called. I would only like it to be called once per connection. Constantly reading the index.html file like this seems very inefficient.
The reason I know the handler function keeps getting called is that I placed a console.log("Hello"); statement in the handler function and it keeps outputting that line to the console.
Do you provide the image URLs that the client.html is looking for? Here's what I think is happening:
The client connects to your server via Socket.IO and retrieves the user information (user_name, user_description, and user_img). The client then immediately tries to load an image using the user_img URL. The author's server code however, doesn't appear to support serving these pictures. Instead it just returns the same client.html file for every request. This would be why it appears to be calling handler over and over again - it's trying to load a picture for every user.
I would recommend using the express module in node to serve static files instead of trying to do it by hand. Your code would look something like this:
var express = require('express');
var app = express();
var http = require('http').Server(app);
var io = require('socket.io')(http);

app.use(express.static(__dirname + "/public"));
That essentially says to serve any static files they request from the public folder. In that folder you will put client.html as well as the user photos.
I am using express 4.x, and the latest MySQL package for node.
The pattern for a PHP application (which I am most familiar with) is to have some sort of common database-connection file that gets included, with the connection automatically closed when the script completes. Implemented in an Express app, it might look something like this:
// includes and such
// ...
var db = require('./lib/db');

app.use(db({
    host: 'localhost',
    user: 'root',
    pass: '',
    dbname: 'testdb'
}));

app.get('/', function (req, res) {
    req.db.query('SELECT * FROM users', function (err, users) {
        res.render('home', {
            users: users
        });
    });
});
Excuse the lack of error handling; this is a primitive example. In any case, my db() function returns middleware that connects to the database and stores the connection object on req.db, effectively giving a new connection to each request. There are a few problems with this method:
This does not scale at all; database connections (which are expensive) grow linearly with fairly inexpensive requests.
Database connections are not closed automatically and will kill the application if an uncaught error trickles up. You have to either catch it and reconnect (feels like an antipattern) or write more middleware that EVERYTHING must call prior to output to ensure the connection is closed (anti-DRY, arguably).
The next pattern I've seen is to simply open one connection as the app starts.
var mysql = require('mysql');
var connection = mysql.createConnection(config);

connection.on('connect', function () {
    // start app.js here
});
Problems with this:
Still does not scale. One connection easily gets clogged with more than just 10-20 requests on my production boxes (1gb-2gb RAM, 3.0ghz quad CPU).
Connections will still time out after a while, so I have to provide an error handler to catch that and reconnect: very kludgy.
My question is, what kind of approach should be taken to handling database connections in an Express app? It needs to scale (not infinitely, just within reason), I should not have to manually close connections in the route or include extra middleware for every path, and I would prefer not to have to catch timeout errors and reopen connections.
Since you're talking about MySQL in Node.js, I have to point you to KnexJS! You'll find writing queries is much more fun. The other thing it uses is connection pooling, which should solve your problem. It uses a little package called generic-pool-redux, which manages things like DB connections.
The idea is that you have one place in your Express app that accesses the DB through code. That code, as it turns out, is using a connection pool to share the load among connections. I initialize mine something like this:
var Knex = require('knex');
Knex.knex = Knex({...}); //set options for DB
In other files
var knex = require('knex').knex;
Now all files that could access the DB are using the same connection pool (set up once at start).
I'm sure there are other connection pool packages out there for Node and MySQL, but I personally recommend KnexJS if you're doing any dynamic or complex SQL queries. Good luck!
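To illustrate what such a pool is doing for you under the hood, here is a deliberately tiny, hypothetical pool in plain JS (not the actual generic-pool-redux code): it hands out up to `max` connections, reuses idle ones, and queues callers beyond the limit.

```javascript
// Toy connection pool: createConn makes a new "connection", max caps how
// many may be live at once; extra acquirers wait for a release.
function Pool(createConn, max) {
    this.createConn = createConn;
    this.max = max;
    this.active = 0;   // connections ever handed out
    this.idle = [];    // released connections ready for reuse
    this.waiting = []; // callbacks queued past the limit
}

Pool.prototype.acquire = function (cb) {
    if (this.idle.length) return cb(this.idle.pop()); // reuse first
    if (this.active < this.max) {
        this.active++;
        return cb(this.createConn());                 // grow up to max
    }
    this.waiting.push(cb);                            // over limit: queue
};

Pool.prototype.release = function (conn) {
    if (this.waiting.length) return this.waiting.shift()(conn);
    this.idle.push(conn);
};
```

A real pool adds connection validation, timeouts, and disposal on error, which is exactly the kludgy part you no longer have to write yourself once a library manages it.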
I have a login system in my Node.js app using mysql-node.
The problem I have, however, is how to keep the user logged in: if they refresh the page they have to log in again, because I do not know how to store the session.
My login system is like this:
socket.on('login', function (data, callBack) {
    var username = sanitize(data['login']).escape(),
        pass = sanitize(data['password']).escape();

    var query = connection.query('SELECT uid FROM users WHERE name = ? AND pass = ?', [username, pass],
        function (err, results) {
            if (err) {
                console.log('Oh No! ' + err);
            } else if (results.length == 1) {
                // somehow set a session here
            } else if (!results.length) {
                console.log('No rows found!');
            }
        });
});
I'm having difficulty understanding how to set up a session for each client that connects. Is this possible with Node.js?
Reading around, they assign express to var app, but I already have var app = http.createServer( ... so how can I also assign express to it? A bit confusing.
You need to understand the difference between an Express server and a native Node.js server; here is my comparison link: nodejs server vs express server.
So you can do:
var app = express();
var server = http.createServer(app);
This enables you to still have the low-level functionality of Node.js.
So, if you don't want to use existing modules or frameworks, you can build your own session manager:
using cookie
using IP/UA
using socket
The best way would be to implement it with the socket first, for example:
server.on('connection', function (socket) {
    socket.id = id;
});
or
server.on('request', function (req, res) {
    req.connection.id = id; // the socket can also be accessed at request.connection
});
So, you just need to implement middleware that checks the id.
If you want to protect against session prediction, session sidejacking, etc., you need to combine cookies, IP, the socket, and your own ideas to make it more secure for your app.
Once you've written your session manager, you can choose where to store the sessions: in a simple object, in Redis, in MongoDB, in MySQL... (Express used MemoryStore by default, though that may have changed.)
I don't know whether Node.js has a core feature for saving sessions; you need to use a database along with it. Using Express will help you utilize a database to persist user sessions. It's worth studying:
http://expressjs.com/
http://blog.modulus.io/nodejs-and-express-sessions
I don't think there is any session mechanism within Node.js' core. However, there are plenty of libraries that allow you to do it. The first that comes to mind is Connect's session, which is middleware for Node.js. Have a look at this question for more details on how to use it.
Have a look at this tutorial from DailyJS, which adds Express's session to a notepad webapp. The source code is available here. (Note that Express's session is based on Connect's, and is practically the same.)
EDIT: Here is a more complete example of Node authentication, using Mongoose. They do show their schemas, however, so I assume you can easily make the transition to MySQL.