how to decode chinese character from mysql with nodejs - mysql

I am trying to query a comments table in a MySQL database by language.
Whenever I query by language to fetch Chinese comments, it returns garbled, wrongly encoded characters; but whenever I run the same query from Python, it works.
Cloud Platform: Google Cloud SQL
Database location: Google Cloud SQL
Programming Language: Nodejs
Below is my code
// Require process, so we can mock environment variables
const process = require('process');
const Knex = require('knex');
const express = require('express');

const app = express();

const config = {
  user: process.env.SQL_USER,
  password: process.env.SQL_PASSWORD,
  database: process.env.SQL_DATABASE,
  socketPath: `/cloudsql/${process.env.INSTANCE_CONNECTION_NAME}`
};

var knex = Knex({
  client: 'mysql',
  connection: config
});

app.get('/', (req, res) => {
  knex.select('post')
    .from('comment')
    .where({ 'language': 'zh' })
    .limit(1)
    .then((rows) => {
      res.send(rows);
    })
    .catch((err) => {
      res.send(err);
    });
});
This is my query result:
"post": "最白痴的部长ï¼æœ€åŸºæœ¬çš„常识和逻辑都没有。真丢人ï¼"
please help.....

The text "最白痴的部长ï¼æœ€åŸºæœ¬çš„常识和逻辑都没有。真丢人ï¼" is what you get if "最白痴的部长！最基本的常识和逻辑都没有。真丢人！" is sent encoded as UTF-8 but is then read and decoded using the windows-1252 character set.
There are several different places this mis-decoding could happen:
(1) From the client to the application writing to the database, when the data was first added
(2) Between the application and MySQL, when adding the data
(3) Across a configuration change in MySQL that wasn't applied correctly
(4) Between MySQL and the application reading the data
(5) Between the application and the end client displaying the data to you
To investigate, I suggest being systematic. Start by accessing the data using other tools, e.g. phpMyAdmin or the mysql command line in Cloud Shell. If you see the right data there, you know the issue is (4) or (5). If the database definitely has the wrong data in it, then it's (1), (2) or (3).
The most common place for this error to happen is (5), so I'll go into that a bit more. It occurs because websites often set the character set to something wrong, or not at all. To fix this, make the character set explicit. You can do this in Express by adding:
res.set('Content-Type', 'text/plain; charset=utf-8')
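If the problem turns out to be (4) instead, the connection itself can be asked to use UTF-8. The mysql driver that knex uses here accepts a charset option in the connection config; a minimal sketch based on the config above (the charset value is an assumption about what the comment column actually stores):

const config = {
  user: process.env.SQL_USER,
  password: process.env.SQL_PASSWORD,
  database: process.env.SQL_DATABASE,
  socketPath: `/cloudsql/${process.env.INSTANCE_CONNECTION_NAME}`,
  charset: 'utf8mb4' // assumption: the comment table is stored as utf8/utf8mb4
};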


Syntax error in SQL when inserting data via nodejs

I have a simple Node.js script in Pipedream that sends the email from the request body to a MySQL database.
I have checked the connection to the database and it's working.
Here is my code
const mysql = require('mysql2/promise');
const { host, port, username, password, database } = auths.mysql;

const connection = await mysql.createConnection({
  host,
  port, // 3306
  user: "u648845344_demo",
  password,
  database,
});

const [rows, fields] = await connection.execute(
  "INSERT INTO Testing (Email) VALUES (${JSON.stringify(steps.trigger.event.body.email)})"
);
console.log(rows);
// console.log(${JSON.stringify(steps.trigger.event.body.email)})
The error I am getting:
Error: You have an error in your SQL syntax; check the manual that corresponds to your MariaDB server version for the right syntax to use near '{JSON.stringify(steps.trigger.event.body.email)})' at line 1
    at PromiseConnection.execute (/tmp/ee/node_modules/mysql2/promise.js:110:22)
    at Object.module.exports (/steps/mysql.js:14:41)
    at process._tickCallback (internal/process/next_tick.js:68:7)
I tried logging the email to the console, but then the error I get is:
TypeError [ERR_INVALID_ARG_TYPE]: The first argument must be one of type string, Buffer, ArrayBuffer, Array, or Array-like Object. Received type undefined
This is a classic SQL injection bug. (The immediate syntax error is because the query string uses double quotes, so the ${...} template placeholder is never interpolated; but even with backticks, interpolating user input into SQL would be an injection hole.) It's easily fixed by using prepared statements:
const [rows, fields] = await connection.execute(
  "INSERT INTO Testing (Email) VALUES (?)",
  [ steps.trigger.event.body.email ]
);
If you write your queries with only placeholders where the data goes, and let the driver add the data to the query like this, you will not create any SQL injection bugs. These are an extremely serious class of bug, because a single one, if discovered, could lead to a catastrophic outcome for you, your project and any business you're working for.
Using JSON.stringify for SQL protection is, and I cannot stress this enough, completely and wildly inappropriate. It escapes JSON and only JSON. If the need for manual escaping ever arises, you must use SQL-specific escaping functions, but use prepared statements with placeholder values whenever possible.
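For completeness, the SQL-specific escaping mentioned above is something the mysql2 driver itself exposes through connection.escape; a minimal sketch reusing the question's table and column, for the rare case where a placeholder can't be used:

// Driver-side escaping as a fallback; placeholders remain the preferred approach.
const email = steps.trigger.event.body.email;
const sql = 'INSERT INTO Testing (Email) VALUES (' + connection.escape(email) + ')';
const [rows] = await connection.execute(sql);
console.log(rows);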

Sequelize Run Script File

I have a project which is using Sequelize to manage a set of MySQL databases. So far I've been able to run simple queries to create new databases, insert parameters into a table, and select data. However, I have a very long .sql file (1,700+ lines) which, when executed, will set up a database with a specific schema (i.e. tables, views, etc.). The problem is that I cannot figure out how to execute a script like this using Sequelize. I know the script works on a new database because I can execute the .sql file from MySQL Workbench, but I do not know how to execute it from a JavaScript file using Sequelize. I've searched forums but can't seem to find any resources on this either. Can this be done?
You can run a raw query in Sequelize using sequelize.query(sql_string),
and you can use fs or fs-extra to read the SQL file.
Just note that you need to set the multipleStatements option to true in order to run a multi-statement SQL script:
const fs = require('fs');
const Sequelize = require('sequelize');

var sql_string = fs.readFileSync('path to file', 'utf8');

const sequelize = new Sequelize('database', 'username', 'password', {
  host: 'localhost',
  dialect: 'mysql', // or 'mariadb' | 'postgres' | 'mssql'
  dialectOptions: {
    multipleStatements: true
  }
});

sequelize.query(sql_string);
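sequelize.query returns a promise, so a minimal usage sketch that waits for the script to finish and surfaces any error (the log messages are just illustrative) would be:

sequelize.query(sql_string)
  .then(() => console.log('Schema script executed'))
  .catch((err) => console.error('Failed to run script:', err));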
Edit 1:
For a better understanding of the Sequelize class, take a look at this.

Connecting to an existing database - node.js

I want to develop an API in Node.js with only one endpoint, taking two parameters: a number and a datetime.
This endpoint will return the result of a query against a MySQL database, as JSON.
But my problem is: I don't know if I need to define models in my code. My database already exists, I am connected to it, and I only need to return the result of one SQL query with the two parameters.
It seems to me there should be a way to just call the database and directly return the result.
Is that possible?
Thank you in advance!
Not sure if I understand the issue correctly, but from the looks of it, you may be assuming you are bound to some kind of ORM. In any case, most ORMs and the underlying database drivers allow you to send raw SQL queries to the MySQL server without the need for any models or schemas.
For instance, using the mysql package from npm (sample taken from the official repo):
var mysql = require('mysql');
var connection = mysql.createConnection({
  host: 'localhost',
  user: 'me',
  password: 'secret',
  database: 'my_db'
});

connection.connect();

connection.query('SELECT 1 + 1 AS solution', function (error, results, fields) {
  if (error) throw error;
  console.log('The solution is: ', results[0].solution);
});

connection.end();
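Applied to the question's single endpoint, a minimal sketch might look like the following. The route, table, and column names (events, device_id, created_at) and the query parameters (id, after) are assumptions; the ? placeholders let the driver escape the number and the datetime safely:

var express = require('express');
var mysql = require('mysql');

var app = express();
var connection = mysql.createConnection({
  host: 'localhost',
  user: 'me',
  password: 'secret',
  database: 'my_db'
});

// e.g. GET /events?id=42&after=2020-01-01T00:00:00Z
app.get('/events', function (req, res) {
  connection.query(
    'SELECT * FROM events WHERE device_id = ? AND created_at > ?',
    [Number(req.query.id), new Date(req.query.after)],
    function (error, results) {
      if (error) return res.status(500).json({ error: error.message });
      res.json(results); // Express serializes the rows to JSON
    }
  );
});

app.listen(3000);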

Socket.io and MySQL Connections

I'm working on my first node.js / socket.io project. Until now I have coded only in PHP. In PHP it is common to close the MySQL connection when it is no longer needed.
My question: does it make sense to keep just one MySQL connection open while the server is running, or should I handle this as I would in PHP?
Info: during peak hours I will have about 5 requests/second from socket clients, and for almost all of them I have to perform a MySQL CRUD operation.
Which one would you prefer?
io = require('socket.io').listen(3000);
var mysql = require('mysql');

var connection = mysql.createConnection({
  host: 'localhost',
  user: 'root',
  password: 'pass',
  database: 'myDB'
});

connection.connect(); // and never 'end' or 'destroy'
// ...
// ...
or
var app = {};

app.set_geolocation = function (driver_id, driver_location) {
  connection.connect();
  connection.query('UPDATE drivers set ....', function (err) {
    /* do something */
  });
  connection.end();
};
...
The whole idea of Node.js is async I/O, and that includes database queries.
The rule with a MySQL connection is that it can only run one query at a time. So you either keep a single connection and let queries queue up on it, as in the first option, or create a connection for each operation, as in the second.
I would personally go with option 2, as opening and closing connections is not such a big overhead.
Here are some code samples to help you out:
https://codeforgeek.com/2015/01/nodejs-mysql-tutorial/
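A common middle ground between the two options is a connection pool, which the same mysql package supports via mysql.createPool: the pool hands a connection to each query and takes it back afterwards, so you get concurrency without reconnecting for every request. A minimal sketch (the pool size and the UPDATE statement are illustrative):

var mysql = require('mysql');

var pool = mysql.createPool({
  connectionLimit: 10, // illustrative pool size
  host: 'localhost',
  user: 'root',
  password: 'pass',
  database: 'myDB'
});

app.set_geolocation = function (driver_id, driver_location) {
  // pool.query acquires a connection, runs the query and releases it automatically
  pool.query(
    'UPDATE drivers SET location = ? WHERE id = ?',
    [driver_location, driver_id],
    function (err) {
      if (err) { /* handle the error */ }
    }
  );
};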

converting database from mysql to mongoDb

Is there any easy way to change the database from MySQL to MongoDB?
Or, even better, can anyone suggest a good tutorial for doing it?
Is there any easy way to change the database from MySQL to MongoDB?
Method #1: export from MySQL in CSV format and then use the mongoimport tool. However, this does not always work well in terms of handling dates or binary data.
Method #2: script the transfer in your language of choice. Basically, you write a program that reads everything from MySQL one element at a time and then inserts it into MongoDB.
Method #2 is better than #1, but it is still not quite adequate on its own.
MongoDB uses collections instead of tables, and it does not support joins. In every migration I've seen, this means that your data structure in MongoDB ends up different from the structure in MySQL.
Because of this, there is no "universal tool" for porting SQL to MongoDB. Your data will need to be transformed before it reaches MongoDB.
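To illustrate the kind of transformation involved, here is a minimal sketch that folds a one-to-many MySQL relation into embedded MongoDB documents; the table, column, collection, and database names are invented for the example:

var mysql = require('mysql2/promise');
var { MongoClient } = require('mongodb');

async function migrateUsers() {
  const mysqlConn = await mysql.createConnection({
    host: 'localhost', user: 'root', password: 'root', database: 'shop'
  });
  const mongoClient = await MongoClient.connect('mongodb://localhost:27017');
  const db = mongoClient.db('shop');

  // One nested document per user instead of two joined tables.
  const [users] = await mysqlConn.query('SELECT id, name FROM users');
  for (const user of users) {
    const [orders] = await mysqlConn.query(
      'SELECT total, created_at FROM orders WHERE user_id = ?', [user.id]);
    await db.collection('users').insertOne({
      name: user.name,
      orders: orders.map(o => ({ total: o.total, createdAt: o.created_at }))
    });
  }

  await mysqlConn.end();
  await mongoClient.close();
}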
If you're using Ruby, you can also try Mongify.
It's a super simple way to transform your data from an RDBMS to MongoDB without losing anything.
Mongify will read your MySQL database and build a translation file for you; all you have to do is map how you want your data transformed.
It supports:
Auto-updating IDs (to BSON ObjectIDs)
Updating referencing IDs
Type casting values
Embedding tables into other documents
Before-save filters (to allow manual changes to the data)
and much, much more...
Read more about it at: http://mongify.com/getting_started.html
There is also a short 5-minute video on the homepage that shows you how easy it is.
Here's how I did it with Node.js for this purpose:
var mysql = require('mysql');
var MongoClient = require('mongodb').MongoClient;

function getMysqlTables(mysqlConnection, callback) {
  mysqlConnection.query("show full tables where Table_Type = 'BASE TABLE';", function (error, results, fields) {
    if (error) {
      callback(error);
    } else {
      var tables = [];
      results.forEach(function (row) {
        for (var key in row) {
          if (row.hasOwnProperty(key)) {
            if (key.startsWith('Tables_in')) {
              tables.push(row[key]);
            }
          }
        }
      });
      callback(null, tables);
    }
  });
}

function tableToCollection(mysqlConnection, tableName, mongoCollection, callback) {
  var sql = 'SELECT * FROM ' + tableName + ';';
  mysqlConnection.query(sql, function (error, results, fields) {
    if (error) {
      callback(error);
    } else {
      if (results.length > 0) {
        mongoCollection.insertMany(results, {}, function (error) {
          if (error) {
            callback(error);
          } else {
            callback(null);
          }
        });
      } else {
        callback(null);
      }
    }
  });
}

MongoClient.connect("mongodb://localhost:27017/importedDb", function (error, db) {
  if (error) throw error;

  var MysqlCon = mysql.createConnection({
    host: 'localhost',
    user: 'root',
    password: 'root',
    port: 8889,
    database: 'dbToExport'
  });
  MysqlCon.connect();

  var jobs = 0;
  getMysqlTables(MysqlCon, function (error, tables) {
    tables.forEach(function (table) {
      var collection = db.collection(table);
      ++jobs;
      tableToCollection(MysqlCon, table, collection, function (error) {
        if (error) throw error;
        --jobs;
      });
    });
  });

  // Waiting for all jobs to complete before closing database connections.
  var interval = setInterval(function () {
    if (jobs <= 0) {
      clearInterval(interval);
      console.log('done!');
      db.close();
      MysqlCon.end();
    }
  }, 300);
});
MongoVUE's free version can do this automatically for you.
It can connect to both databases and perform the import
I think one of the easiest ways is to export the MySQL database to JSON and then use mongoimport to import it into a MongoDB database.
Step 1: Export the MySQL database to JSON
Load the mysql dump file into a MySQL database if necessary
Open MySQL Workbench and connect to the MySQL database
Go to the Schema viewer > Select database > Tables > right-click on the name of the table to export
Select 'Table Data Export Wizard'
Set the file format to .json and type in a filename such as tablename.json
Note: All tables will need to be exported individually
Step 2: Import the JSON files into MongoDB using the mongoimport command
The mongoimport command should be run from the server command line (not the mongo shell)
Note that you may need to provide the authentication details as well as the --jsonArray option; see the mongoimport docs for more information
mongoimport -d dbname -u ${MONGO_USERNAME} -p ${MONGO_PASSWORD} --authenticationDatabase admin -c collectionname --jsonArray --file tablename.json
Note: This method will not work if the original MySQL database has BLOBs/binary data.
I am kind of partial to Talend Open Studio for this kind of migration job. It is an Eclipse-based solution for creating data migration "scripts" in a visual way. I do not like visual programming, but this is a problem domain where I make an exception.
Adrien Mogenet has created a MongoDBConnection plugin for MongoDB.
It is probably overkill for a "simple" migration, but it is a cool tool.
Mind, however, that the suggestion of Nix will probably save you time if it is a one-off migration.
You can use the QCubed (http://qcu.be) framework for that. The procedure would be something like this:
Install QCubed (http://www.thetrozone.com/qcubed-installation)
Run the code generation on your database (http://www.thetrozone.com/php-code-generation-qcubed-eliminating-sql-hassle)
Take your database offline from the rest of the world so that only one operation runs at a time.
Now write a script which reads all rows from all tables of the database and uses getJson on all objects to get the JSON. You can then convert the data to an array and push it into MongoDB!
If anyone's still looking for a solution, I found that the easiest way is to write a PHP script to connect to your SQL database, retrieve the information you want using the usual SELECT statements, transform the information into JSON using PHP's JSON encode functions, and simply output your results to a file or directly to MongoDB. It's actually pretty simple and straightforward; the only thing to do is to double-check your output against a JSON validator. You may have to use functions such as explode to replace certain characters and symbols to make it valid. I have done this before; I don't have the script at hand right now, but from what I can remember it was literally half a page of code.
Also remember that Mongo is a document store, so some data mapping is required to make the result acceptable to Mongo.
For those coming to this with the same problem, you can check out this GitHub project. It is an ongoing development that will help you migrate data from a MySQL database to MongoDB by running a single command.
It will generate MongoDB schemas in TypeScript so you can use them later in your project. Each MySQL table becomes a MongoDB collection, and datatypes are converted to their MongoDB equivalents.
The documentation can be found in the project's README.md. Feel free to come in and contribute; I would be glad to help if need be.
If you are looking for a tool to do it for you, good luck.
My suggestion is to just pick your language of choice, then read from one and write to the other.
If I may quote Matt Briggs (it solved my problem one time):
The driver way is by FAR the most straightforward. The import/export tools are fantastic, but only if you are using them as a pair. You are in for a wild ride if your table includes dates and you try to export from the db and import into mongo.
You are lucky too, being in C#. We are using Ruby, and have a 32-million-row table we migrated to mongo. Our ending solution was to craft an insane SQL statement in Postgres that output JSON (including some pretty kludgy things to get dates going properly) and pipe the output of that query on the command line into mongoimport. It took an incredibly frustrating day to write, and is not the sort of thing that can ever really be changed.
So if you can get away with it, use ADO.NET with the mongo driver. If not, I wish you well :-)
(note that this is coming from a total mongo fanboi)
MySQL is very similar to other SQL databases, so I'll point you to this topic:
Convert SQL table to mongoDB document
You can use the following project. It requires a Solr-like configuration file to be written; it's very simple and straightforward.
http://code.google.com/p/sql-to-mongo-importer/
Try this:
Automated conversion of MySQL dump to Mongo updates using simple r2n mappings.
https://github.com/virtimus/mysql2mongo