Saving variable value in database and using later - mysql

The json2json library converts one JSON format to another using a template variable.
var template = {
  "path": ".",
  "as": {
    "skus": {
      "path": "students,student",
      "choose": ["name", "subject"],
      "format": function(node, value, key) {
        return { value: value };
      },
      "as": {
        "StudentName": "name",
        "StudentSubject": "subject"
      }
    }
  }
};
var transformedJson = new json2json.ObjectTemplate(template).transform(oldJson);
I want to save this template variable in a database and later retrieve it to transform JSON by querying the database. How can this be done?

Judging from your tag, you would like to insert this data into a MySQL database; you just need a client, such as the mysql package. Its documentation provides a very basic example, which I've quickly and slightly adapted to your question:
var mysql = require('mysql');
var connection = mysql.createConnection({
  host     : 'localhost',
  user     : 'me',
  password : 'secret',
  database : 'my_db'
});
connection.connect();
// With the mysql package, `SET ?` expands an object into column = value pairs.
connection.query('INSERT INTO your_table SET ?', transformedJson, function(err, res) {
  if (err) throw err;
  console.log(res);
});
connection.end();
This is, of course, assuming that you have a MySQL database, a MySQL server, etc. Generally speaking, it's not a great idea to insert JSON into a MySQL database in the sense of just dumping a big object into a single field. However, MySQL 5.7.8 does support a native JSON data type.
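One caveat before storing the template itself (an observation about JavaScript serialization, not about any particular library): the template above contains a format function, and JSON.stringify silently drops function-valued properties, so the template cannot round-trip through a database column as plain JSON. A hedged sketch of one workaround, with all helper names invented for the example:

```javascript
// Functions do not survive JSON serialization, so a template's "format"
// callback is silently dropped by JSON.stringify.
var template = {
  path: ".",
  as: {
    skus: {
      path: "students,student",
      choose: ["name", "subject"],
      format: function (node, value, key) {
        return { value: value };
      },
      as: { StudentName: "name", StudentSubject: "subject" }
    }
  }
};

console.log('format' in JSON.parse(JSON.stringify(template)).as.skus); // false

// One workaround: keep a registry of named formatters in code, store only
// the formatter's name in the database, and re-attach it after loading.
var formatters = {
  wrapValue: function (node, value, key) {
    return { value: value };
  }
};

template.as.skus.formatName = "wrapValue"; // serializable stand-in
delete template.as.skus.format;

var loaded = JSON.parse(JSON.stringify(template)); // stand-in for a DB round trip
loaded.as.skus.format = formatters[loaded.as.skus.formatName];

console.log(typeof loaded.as.skus.format); // function
```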

Related

Nodejs Crypto encrypted string not decrypting correctly after storing in MySQL

I have a simple Node.js application that stores tokens from an external service. It's unsafe to store them as plain strings in the event of a database compromise, so I need to encrypt them.
After copying and pasting the following GitHub gist into my application (simple-nodejs-iv-encrypt-decrypt.js),
I am able to successfully encrypt and decrypt my strings. However, once the encrypted string is saved to MySQL, it no longer decrypts into a matching string.
My MySQL column is declared as encryptedToken VARCHAR(255).
// before storing to database
{
  encryptedToken: 'OKWLlYEsCtddWQOL8ezQBI+whtU30gVs67nGiRLxxca10Y4AELjMZN3afVzuys17leE9U9Ski+fByaEXFTXnefDUdyR4PUwJBi6poY1RHOY=',
  decryptedToken: 'Z4XkR0vkrbAO6LzmaYGYa0dnaaxvlkIme27L-GlPB7l6M4gkikz1S_vTfJyCUJMx'
}
// after storing to database
{
  encryptedToken: 'OKWLlYEsCtddWQOL8ezQBI+whtU30gVs67nGiRLxxca10Y4AELjMZN3afVzuys17leE9U9Ski+fByaEXFTXnefDUdyR4PUwJBi6poY1RHOY=',
  decryptedToken: ':D�\b�O3Qlס��,,\u0017aYGYa0dnaaxvlkIme27L-GlPB7l6M4gkikz1S_vTfJyCUJMx'
}
The algorithm used is aes256 and the encoding is base64.
I believe this is happening because you're using a different IV (Initialization Vector) each time.
The encryptionHelper function getKeyAndIV creates a random IV each time you call it, so decryption will not be deterministic.
If you ensure you're using the same IV each time, the decrypted token should be the same as well.
I've tested this out like so:
SQL
create table tokens (encryptedToken varchar(255))
Javascript / Node.js
const encryptionHelper = require("./simple-nodejs-iv-encrypt-decrypt.js")
// This could be anything
const token = "abcdefghijklmonp";
// We're using fixed values here. In reality you could use a different IV for each row; it's OK to store the IV in the database.
const key = Buffer.from("MTIzNDU2Nzg5MGFiY2RlZmdoaWprbG1ub3BxcnN0dXY=", "base64");
const iv = Buffer.from("26vFZGhH66xFszo59pEaWA==", "base64");
const encryptedToken = encryptionHelper.encryptText(encryptionHelper.CIPHERS.AES_256, key, iv, token, "base64");
// con should be initialized with a connection to the relevant db.
con.query("insert into tokens (encryptedToken) values (?)", [encryptedToken], (error, results) => {
  if (error) {
    console.error("Insert query failed: ", error);
  } else {
    console.log("Token insert successful!");
  }
});
con.query("select * from tokens", (error, results) => {
  if (error) {
    console.error("Select query failed: ", error);
    return;
  }
  console.log("Tokens (encrypted):", results.map(r => r.encryptedToken));
  console.log("Tokens (decrypted):", results.map(r => encryptionHelper.decryptText(encryptionHelper.CIPHERS.AES_256, key, iv, r.encryptedToken, "base64").toString("base64")));
});
// Let's just ensure con is closed
setTimeout(() => {
  con.end();
}, 100);

NodeJS JSON to SQL and SQL to JSON Libraries?

So basically I will be getting a feed of a few huge JSON files. I want to convert them into SQL and store them in a MySQL database.
The catch is that later on I will need to get the data back out of the database and convert it into JSON objects.
Something like https://sqlizer.io/#/, which converts JSON to SQL, but vice versa as well.
So I was wondering if there are any NodeJS modules/libraries with this type of capability.
Thank you.
I don't see where the problem is. With most SQL libraries in Node, when you run a query you get plain objects back, or at least data that you can convert to JSON with JSON.stringify.
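To make that concrete: the rows a query returns are plain JavaScript objects, so the "conversion" in both directions is ordinary (de)serialization. The row shape below is invented for the example:

```javascript
// Hypothetical rows, shaped like what a SELECT on the example table returns.
const rows = [
  { name: 'foo', last_name: 'bar' },
  { name: 'baz', last_name: 'qux' }
];

// "SQL to JSON" is just serialization...
const asJson = JSON.stringify(rows);

// ...and "JSON to SQL input" is just parsing it back into objects
// that can be handed to an insert.
const backToRows = JSON.parse(asJson);

console.log(backToRows.length); // 2
console.log(backToRows[0].last_name); // bar
```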
Let's say we are doing it with knex and Postgres:
const db = require('knex')({
  client: 'pg',
  connection: {
    host: '127.0.0.1',
    user: 'your_database_user',
    password: 'your_database_password',
    database: 'myapp_test'
  }
});

/*
Let's assume the content of the json files is this way:
[
  {
    "name": "foo",
    "last_name": "bar"
  },
  {
    "name": "foo",
    "last_name": "bar"
  }
]

Schema of the table should be (
  name TEXT NOT NULL,
  last_name TEXT NOT NULL
)
*/

// these are the files
const myFilesToRead = ['./file1.json', './file2.json'];

// we loop through myFilesToRead, read each file,
// and run an insert query for each object
Promise.all(
  myFilesToRead.map((file) => {
    // yes, you can require json files :)
    const fContent = require(file);
    return Promise.all(fContent.map((obj) => {
      // knex is a query builder; it converts the code below to an SQL statement
      return db('table_name')
        .insert(obj)
        .returning('*')
        .then((result) => {
          console.log('inserted', result);
          return result;
        });
    }));
  })
)
  .then((result) => {
    // let's now get these objects back
    return db('table_name').select('*');
  })
  .then((result) => {
    // that's it
    console.log(JSON.stringify(result));
  });
If you want to read about knex, here is the doc:
http://knexjs.org/

Exception using a naming convention w/ Breeze Angular mySql Node Express stack

I'm able to successfully connect to and query data from a MySQL db via a Breeze/Angular client, following the todo-angular example. I switched out the db table and the GUI and was still OK. The problem starts when I try to use a naming convention. (I don't have control over the db that I have to connect to, and I really don't want to use Uppercase_Underscored_Words in my client!)
I'm getting the following exception:
/Users/Sherri/Sites/awdb-web/node_modules/breeze-sequelize/node_modules/breeze-client/breeze.debug.js:1852
throw new Error("Unable to locate a registered object by the name: " + k
^
Error: Unable to locate a registered object by the name: NamingConvention.underscoreCamelCase
at Object.__config._fetchObject (/Users/Sherri/Sites/awdb-web/node_modules/breeze-sequelize/node_modules/breeze-client/breeze.debug.js:1852:13)
at MetadataStore.proto.importMetadata (/Users/Sherri/Sites/awdb-web/node_modules/breeze-sequelize/node_modules/breeze-client/breeze.debug.js:6517:40)
at new module.exports.MetadataMapper (/Users/Sherri/Sites/awdb-web/node_modules/breeze-sequelize/MetadataMapper.js:19:8)
at SequelizeManager.importMetadata (/Users/Sherri/Sites/awdb-web/node_modules/breeze-sequelize/SequelizeManager.js:46:24)
at createSequelizeManager (/Users/Sherri/Sites/awdb-web/server/routes.js:114:8)
at /Users/Sherri/Sites/awdb-web/server/routes.js:23:27
When I take the "namingConvention": "camelCase" line out of the metadata.json file, the error goes away, but of course, the database property is not able to be correctly converted.
Here is the relevant code I use to set up the Entity Manager: (EDIT: I'm pretty sure my problem is server side and has nothing to do with this code, though)
var namingConvention = new UnderscoreCamelCaseConvention();
namingConvention.setAsDefault();
breeze.core.config.initializeAdapterInstance("uriBuilder", "json");
var serviceName = 'breeze/awdb';
var manager = new breeze.EntityManager(serviceName);
// Take any server property name and make it camelCase for the client to use.
// also, save it so that we can convert from the client back to the server's name
function UnderscoreCamelCaseConvention() {
  var serverNames = {
    netPoints: 'netPoints',
    netPointsSpent: 'netPointsSpent'
  }; // every translated server name

  return new breeze.NamingConvention({
    name: 'underscoreCamelCase',
    clientPropertyNameToServer: clientPropertyNameToServer,
    serverPropertyNameToClient: serverPropertyNameToClient
  });

  function clientPropertyNameToServer(clientPropertyName) {
    return serverNames[clientPropertyName];
  }

  function serverPropertyNameToClient(serverPropertyName) {
    var clientName = _.camelCase(serverPropertyName);
    serverNames[clientName] = serverPropertyName;
    return clientName;
  }
}
And here is a snippet of my metadata.json file:
{
  "metadataVersion": "1.0.5",
  "namingConvention": "underscoreCamelCase",
  "localQueryComparisonOptions": "caseInsensitiveSQL",
  "dataServices": [
    {
      "serviceName": "breeze/awdb/",
      "hasServerMetadata": true,
      "jsonResultsAdapter": "webApi_default",
      "useJsonp": false
    }
  ],
  "structuralTypes": [
    {
      "shortName": "person",
      "namespace": "AWdb.Models",
      "autoGeneratedKeyType": "Identity",
      "defaultResourceName": "people",
      "dataProperties": [
        {
          "name": "Person_ID",
          "dataType": "Int32",
          "isNullable": false,
          "defaultValue": 0,
          "isPartOfKey": true,
          "validators": [
            { "name": "required" },
            { "min": -2147483648, "max": 2147483647, "name": "int32" }
          ]
        },
        {
          "name": "Household_ID",
          "dataType": "Int32",
          "validators": [
            { "min": -2147483648, "max": 2147483647, "name": "int32" }
          ]
        },
        ....
      ]
    }
  ],
  "resourceEntityTypeMap": { "people": "person:#AWdb.Models" }
}
EDIT:
Here is code from my routes.js file that gets the metadata.
var fs = require('fs');
var breezeSequelize = require('breeze-sequelize');
var SequelizeManager = breezeSequelize.SequelizeManager;
var SequelizeQuery = breezeSequelize.SequelizeQuery;
var SequelizeSaveHandler = breezeSequelize.SequelizeSaveHandler;
var breeze = breezeSequelize.breeze;
var EntityQuery = breeze.EntityQuery;
var dbConfig = {
  host: 'localhost',
  user: 'xx',
  password: 'xx',
  dbName: 'xx'
};
var _sequelizeManager = createSequelizeManager();
// _sequelizeManager.sync(true).then(seed).then(function () {
//   console.log('db init successful');
// });
exports.init = init;
function init(app) {
  app.get('/breeze/awdb/Metadata', function (req, res, next) {
    try {
      var metadata = readMetadata();
      res.send(metadata);
    } catch (e) {
      next(e);
    }
  });
}

function createSequelizeManager() {
  var metadata = readMetadata();
  var sm = new SequelizeManager(dbConfig);
  sm.importMetadata(metadata);
  return sm;
}

function readMetadata() {
  var filename = "server/AWdbMetadata.json";
  if (!fs.existsSync(filename)) {
    filename = "AWdbMetadata.json";
    if (!fs.existsSync(filename)) {
      throw new Error("Unable to locate file: " + filename);
    }
  }
  var metadata = fs.readFileSync(filename, 'utf8');
  return JSON.parse(metadata);
}
Any ideas? Should I be able to use a custom naming convention when I'm on a Node.js server, using a metadata.json file instead of a .NET Entity Framework backend?
If I'm looking at this correctly, then I think your issue is the metadata on the server. If I understand correctly, your table and column names follow the Uppercase_Underscored_Word pattern. The Breeze/Sequelize stack on the server currently doesn't have the ability to convert names, so you must use the names of entities and properties exactly as they are in the DB schema. Otherwise, the Breeze to Sequelize translation will fail. You can still use a naming convention on the client to turn the underscored server names into whatever you want them to be on the client.
So, you need two metadata files. One for the server that is used by the Breeze/Sequelize stack and that uses names exactly as they are in the DB and then a separate metadata file for the client, where you can do the translation.
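The client-side half of that split, turning Uppercase_Underscored server names into camelCase client names and back, can be sketched without Breeze at all. This mirrors what lodash's _.camelCase does for simple underscored names; the helper and map names are made up for the example:

```javascript
// Every translated server name, keyed by its client name, so the
// client-to-server direction is a simple lookup.
var serverNames = {};

function serverPropertyNameToClient(serverPropertyName) {
  var clientName = serverPropertyName
    .toLowerCase()
    .replace(/_([a-z0-9])/g, function (m, c) { return c.toUpperCase(); });
  serverNames[clientName] = serverPropertyName; // remember for the reverse trip
  return clientName;
}

function clientPropertyNameToServer(clientPropertyName) {
  return serverNames[clientPropertyName];
}

console.log(serverPropertyNameToClient('Person_ID')); // personId
console.log(clientPropertyNameToServer('personId')); // Person_ID
```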

Mongoose Routing Specific Query Output

Mongoose newb question. I'm trying to build a MEAN application that queries an existing db based on user entered parameters. The database is being populated with JSONs by an outside program.
So far, I have my router successfully showing all of the Mongo records when I use the below router:
// Pulls all JSONs
router.get('/jsons', function(req, res, next) {
  Json.find(function(err, jsons) {
    if (err) { return next(err); }
    res.json(jsons);
  });
});
Now I am trying to create a separate route that returns only the records in that database where TestLocation = "New York".
// Pulls a JSON with City New York
router.get('/jsons/NewYork', function(req, res, next) {
  var queryNYC = Json.where({ TestLocation: "New York" });
  queryNYC.findOne(function(err, jsons) {
    if (err) { return next(err); }
    res.json(jsons);
  });
});
This is returning null to me, though the original route shows that one JSON record does indeed have a TestLocation of New York.
My schema looks like:
var mongoose = require('mongoose');

// Base schema for the query results table
var JsonSchema = new mongoose.Schema({
  uploadID: String,
  uploadDate: Date,
  testDate: Date,
  username: String,
  type: String,
  markers: String,
  image: String,
  TestLocation: String
}, {
  collection: 'data'
});

mongoose.model('Json', JsonSchema);
So my questions:
A) Am I setting this up correctly?
B) Does my Mongoose schema need to match what's in the DB exactly? My schema has a TestLocation field that matches the TestLocation field in the MongoDB documents, but the documents also contain a lot of fields that aren't included in the schema. Do I need to include those for this to work?
Thanks!
What you're doing looks fine, and no, your Mongoose schema does not have to match the documents in your database exactly; fields not listed in the schema are simply ignored. Use the following:
Json.findOne({ TestLocation: "New York" }, function(err, jsons) {
  if (err) { return next(err); }
  res.json(jsons);
});
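This can also be generalized: instead of one hard-coded route per city, a parameterized route can serve any location. A hedged sketch follows (the /jsons/location/:location path and the use of find rather than findOne are assumptions); the Express part is shown as comments, and the exact-match filtering is exercised against a plain-array stand-in for the collection:

```javascript
// The Express/Mongoose version would look roughly like:
//
//   router.get('/jsons/location/:location', function(req, res, next) {
//     Json.find({ TestLocation: req.params.location }, function(err, jsons) {
//       if (err) { return next(err); }
//       res.json(jsons);
//     });
//   });
//
// The filtering itself is an exact string match, demonstrated here with a
// plain array standing in for the collection.
const data = [
  { TestLocation: 'New York', username: 'alice' },
  { TestLocation: 'Boston', username: 'bob' }
];

// Stand-in for Json.find({ TestLocation: location }, callback)
function findByLocation(location, callback) {
  callback(null, data.filter(d => d.TestLocation === location));
}

findByLocation('New York', (err, jsons) => {
  console.log(jsons.length); // 1
});

// Note the match is case-sensitive: 'new york' returns no records.
findByLocation('new york', (err, jsons) => {
  console.log(jsons.length); // 0
});
```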

How to update data in cloudant using nodejs cloudant module?

Below is a sample JSON document. It contains two data fields.
{
  "_id": "daef4a0e39c0c7a00feb721f6c4ce8b9",
  "_rev": "2-8c7ef28df59ecbdaa23b536e58691416",
  "name": "sukil",
  "skills": "java"
}
In server.js
var express = require('express');
var app = express();
var cloudant = require('cloudant');

cloudant({ account: "test", password: "test" }, function(err, cloudant) {
  var alice = cloudant.use('opti-update');
  alice.atomic("_design/sample", "inplace", "daef4a0e39c0c7a00feb721f6c4ce8b9", { field: "name", value: "bar" }, function(error, response) {
    console.log(error + " " + response);
  });
});
Here _design/sample is the design document name, inplace is the update function name, and the next argument is the document id. It returns a "document update conflict" error, and response is undefined.
The design document is shown below:
{
  "_id": "_design/sample",
  "_rev": "9-94393ee4665bdfd6fb283e3419a53f24",
  "updates": {
    "inplace": "function(doc,req){var field = req.body.field;var value = req.body.value;doc[field] = value;return [doc,''];}"
  }
}
I want to update the name field of this JSON document in Cloudant using the Node.js cloudant module. The method above is what I tried, but it shows a "document update conflict" error. How can this be resolved?
The atomic method already assumes its first parameter is a design document name, so you should not include the "_design/" prefix explicitly:
alice.atomic("sample", "inplace", "daef4a0e39c0c7a00feb721f6c4ce8b9", { field: "name", value: "bar" }, function(error, response) {
  console.log(error + " " + response);
});
This may be causing the problem.
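For reference, the inplace handler embedded in the design document can be unpacked into an ordinary function and exercised locally; the doc and req shapes below mimic what Cloudant passes to an update handler:

```javascript
// The "inplace" update handler from the design document, written out as a
// normal function so its behavior can be checked without a server.
function inplace(doc, req) {
  var field = req.body.field;
  var value = req.body.value;
  doc[field] = value;
  return [doc, ''];
}

// Simulate updating the sample document's "name" field.
var doc = { _id: 'daef4a0e39c0c7a00feb721f6c4ce8b9', name: 'sukil', skills: 'java' };
var result = inplace(doc, { body: { field: 'name', value: 'bar' } });

console.log(result[0].name); // bar
console.log(result[0].skills); // java
```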