I'm working on a project with Node.js and MySQL, and I'm having a problem with queries that contain an apostrophe. I fetch all of the data from the GitHub API and it normally works fine, but if the data contains a single apostrophe ('), the query fails with a syntax error.
The error looks like this:
ER_PARSE_ERROR: You have an error in your SQL syntax; check the manual that corresponds to your MariaDB server version for the right syntax to use near 's computer engineering classes.', '2017-08-28T04:05:02Z', '2018-10-21T12:04:50Z'' at line 1
At first I thought about using a regular expression to strip out every ', but mangling the original data like that is not a good fix.
I searched Google for a solution but had a hard time finding anything on this problem. Are there any ideas or solutions?
// User Repository Information API Process
request(repositoryOptions, function (error, response, data) {
  if (error) {
    throw error;
  }
  let result = JSON.parse(data);
  for (let i = 0; i < result.length; i++) {
    // console.log(result[i]);
    let sid = shortid.generate();
    let githubid = result[i].owner.login;
    let name = result[i].name;
    let githuburl = result[i].html_url;
    let explanation = result[i].description;
    let created_at = result[i].created_at;
    let updated_at = result[i].updated_at;
    // interpolating the values directly into the SQL string is what breaks on apostrophes
    let sqlData = `('${sid}', '${githubid}', '${name}', '${githuburl}', '${explanation}', '${created_at}', '${updated_at}')`;
    console.log(sqlData);
    let sql = `INSERT INTO Personal_Data (id, githubid, name, githuburl, explanation, pjdate1, pjdate2) VALUES ${sqlData}`;
    db.query(sql);
  }
})
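To see why this fails: the error above shows a description ending in 's computer engineering classes.'. When a value like that is interpolated directly, the statement becomes

...VALUES ('...', '…'s computer engineering classes.', '2017-08-28T04:05:02Z', ...)

and the apostrophe inside the value terminates the string literal early, so MariaDB tries to parse the rest of the value as SQL and raises ER_PARSE_ERROR.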
I solved the problem myself. #scetiner helped me find the right keyword.
I created an array and put all the variables into it, and to protect against SQL injection attacks I used placeholders (I wasn't sure what they are called; they look like ?,?,?,?,?).
Anyway, here's my modified code:
request(repositoryOptions, function (error, response, data) {
  if (error) {
    throw error;
  }
  let result = JSON.parse(data);
  for (let i = 0; i < result.length; i++) {
    // console.log(result[i]);
    let sid = shortid.generate();
    let githubid = result[i].owner.login;
    let name = result[i].name;
    let githuburl = result[i].html_url;
    let explanation = result[i].description;
    let created_at = result[i].created_at;
    let updated_at = result[i].updated_at;
    // values go into an array; the driver fills the ? placeholders and escapes them safely
    let sqlData = [sid, githubid, name, githuburl, explanation, created_at, updated_at];
    console.log(sqlData);
    let sql = `INSERT INTO Personal_Data (id, githubid, name, githuburl, explanation, pjdate1, pjdate2) VALUES (?,?,?,?,?,?,?)`;
    db.query(sql, sqlData);
  }
})
Then it works fine, with console output like this:
[ 'FUuBdByBV',
'sangumee',
'Blueinno2',
'https://github.com/sangumee/Blueinno2',
'This repository uses BlueInno 2 and shares the source code written.',
'2018-06-11T04:08:17Z',
'2018-12-04T07:48:08Z' ]
[ 'HiRsc7IjNc',
'sangumee',
'CSS-Grid',
'https://github.com/sangumee/CSS-Grid',
'CSS Grid Study',
'2018-07-13T07:49:57Z',
'2018-07-19T05:17:14Z' ]
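As a side note, not part of my fix: the mysql package also ships escaping helpers for the rare case where you must build a string by hand; placeholders remain the safer default. A minimal sketch:

const mysql = require('mysql');

// mysql.escape() quotes a value and escapes any embedded apostrophes.
let sql = 'INSERT INTO Personal_Data (explanation) VALUES (' + mysql.escape(explanation) + ')';

// mysql.format() performs the same ? substitution that db.query(sql, values) does.
let formatted = mysql.format('INSERT INTO Personal_Data (explanation) VALUES (?)', [explanation]);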
I am developing middleware using Express.js with MySQL (I'm new to MySQL). I have built this PATCH method to update the table, but the issue is that I don't want to pass the entire field set just to update a few specific fields. What is the preferred way to do this, so that only the fields I send in the request body get updated?
const updateCompany = (req, res, next) => {
  const cid = req.params.cid;
  const {
    company_id,
    company_name,
    company_address_line1,
    company_address_line2,
    company_email,
    company_phone,
    longitude,
    latitude
  } = req.body;
  var myquery = `UPDATE Company_Master SET company_name="${company_name}",company_address_line1="${company_address_line1}",company_address_line2="${company_address_line2}",company_email="${company_email}",company_phone="${company_phone}",longitude=${longitude},latitude=${latitude} WHERE company_id = "${cid}"`
  conn.query(myquery, (err, result) => {
    if (err) {
      console.log("err" + err);
    } else {
      res.status(201).json(req.body);
    }
  })
}
You can do it as follows:
const updateCompany = (req, res, next) => {
  const cid = req.params.cid;
  let
    allowedcolumns = ["company_name", "company_address_line1", ... ], // all columns that can be updated
    stmts = [],
    values = [];
  for (let c of allowedcolumns) {
    if (c in req.body) { // check if there is a value for that column in the request body
      stmts.push(`${c} = ?`);
      values.push(req.body[c]);
    }
  }
  if (stmts.length == 0) {
    return res.sendStatus(204); // nothing to do
  }
  values.push(cid);
  conn.query(`UPDATE Company_Master SET ${stmts.join(", ")} WHERE company_id = ?`, values, (err, result) => {
    if (err) {
      console.log("err" + err);
      res.sendStatus(400);
    } else {
      res.status(200).json(req.body);
    }
  })
}
allowedcolumns contains all columns that may be updated via this request. For each of them, check whether the request body has a value for that column. If yes, add it to the update statements; if not, ignore it (this assumes the properties in req.body and the columns in the table have the same names). To keep the query parameterized, also push the respective value onto a values array that you then pass to the query.
If there are no values at all, there is nothing to do, so you can return immediately.
Otherwise execute the query (don't forget to also push the cid onto the values array) and return the appropriate status, depending on whether there was an error.
BTW: 201 is status CREATED. You shouldn't use that if you are updating an already existing entity ...
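For illustration (made-up values): if the PATCH body is { "company_name": "Acme", "longitude": 12.5 }, the loop builds

stmts  = ['company_name = ?', 'longitude = ?']
values = ['Acme', 12.5]
// after values.push(cid), the executed query is:
// UPDATE Company_Master SET company_name = ?, longitude = ? WHERE company_id = ?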
On the front end of my app I want to parse some data from a CSV the user uploads. Through the file upload control I first get a FileList object and then pull the single file out of it.
I want to turn it into a JSON object that I can then iterate over. I was thinking of using csv-parser from Node, but I don't see a way to feed it a File object held in memory.
How can I accomplish this?
At first I was doing:
let f = fileList.item(0);
let decoder = new window.TextDecoder('utf-8');
f.arrayBuffer().then( data => {
  let _data = decoder.decode(data)
  console.log("Dataset", data, _data)
});
That passes the array buffer along and decodes it to a string. While I could write a generic tool that processes this string based on \n and ',', I wanted something a bit easier to read.
I wanted to do something like:
let json = csvParser(f)
Is there a way to use csv-parser from Node (3.0.0), or is there another tool I should leverage? I was thinking that relying on browser globals (new window.TextDecoder(...)) is poor form, since it has the opportunity to fail.
Is there a tool that does this? I'm trying to create some sample data, and given a File picked from an input type="file" I want this to be simple and straightforward.
The example below works, but the window dependency and a gut feeling make me think this is naive.
const f : File = fileList.item(0)
console.log("[FOO] File", f)
let decoder = new window.TextDecoder('utf-8');
f.arrayBuffer().then( data => {
  let _data = decoder.decode(data)
  console.log("Dataset", data, _data)
  let lines = _data.split("\n")
  let headers = lines[0].split(',')
  let results = []
  for ( let i = 1; i < lines.length; i++) {
    let line = lines[i]
    let row = {}
    line.split(",").forEach( (item, idx) => {
      row[headers[idx]] = item;
    })
    results.push(row)
  }
  console.log("JSON ARRAY", results)
})
The issue I run into when I stop and do ng serve is that the build does not like the arrayBuffer function or accessing TextDecoder from window, since those functions/classes are not part of File and window respectively during the build.
Any thoughts?
This is what I ended up doing, given the file input passed into this function:
updateTranscoders(project: Project, fileList: FileList, choice: string = 'replace') {
  const f: File = fileList.item(0)

  // Reads a File into a string.
  function readToString(file) : Promise<any> {
    const reader = new FileReader();
    const future = new Promise( (resolve, reject) => {
      reader.addEventListener("load", () => {
        resolve(reader.result);
      }, false)
      reader.addEventListener("error", (event) => {
        console.error("ERROR", event)
        reject(event)
      }, false)
      reader.readAsText(file)
    });
    return future;
  }

  readToString(f).then( data => {
    let lines = data.split("\n")
    let headers = lines[0].split(',')
    let results = []
    for (let i = 1; i < lines.length; i++) {
      let line = lines[i]
      let row = {}
      line.split(",").forEach((item, idx) => {
        row[headers[idx]] = item;
      })
      results.push(row)
    }
    if (choice.toLowerCase() === 'replace') {
      let rows = project.csvListContents.toJson().rows.filter( row => row.isDeployed)
      rows.push( ...results)
      project.csvListContents = CsvDataset.fromJson({ rows: rows })
    } else if (choice.toLowerCase() === 'append') {
      let r = project.csvListContents.toJson();
      r.rows.push(...results);
      project.csvListContents = CsvDataset.fromJson(r);
    } else {
      alert("Invalid option for Choice.")
    }
    this.saveProject(project)
  })
}
Now the choice portion of the code is where I have a binary option: either do a hard replace of the CSV contents or just append to them. I then save the project accordingly. This also assumes the first row contains the column headers.
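For illustration (sample data, not from the project), a file like

name,age
alice,30
bob,25

parses into [ { name: 'alice', age: '30' }, { name: 'bob', age: '25' } ]. Keep in mind this naive split does not handle quoted fields that contain commas or newlines, which is acceptable for simple sample data.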
I have the following query to select a list of vocabulary from a Japanese dictionary.
SELECT * FROM dictionary
WHERE from_local = 1
AND (word like '%後%' or reading like '%後%')
Running this query in HeidiSQL, it works as expected. I feel like it could be a charset issue, but I don't think the query would work at all if that were the case. (See screenshot)
My problem occurs when I try to run this query in my Node.js app. The results come back empty.
I am using npm's mysql library. The dbQuery method is a helper function I made (Pastebin link).
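The actual helper is behind the Pastebin link; for context, this is the typical shape of such a wrapper (a sketch of my assumption, not the author's exact code):

// db.js (hypothetical sketch)
import mysql from 'mysql'

const pool = mysql.createPool({ /* connection settings */ })

// Wraps pool.query in a Promise so callers can use async/await.
export const dbQuery = (sql, params) =>
  new Promise((resolve, reject) => {
    pool.query(sql, params, (err, results) => {
      if (err) reject(err)
      else resolve(results)
    })
  })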
import { dbQuery } from '../db'

const search = async(query) => {
  try {
    let sql = 'SELECT * FROM dictionary WHERE'
    sql += ' from_local = ? AND'
    sql += ' (word = ? OR reading = ?)'
    const params = [1, '%'+query+'%', '%'+query+'%']
    console.log('dictionary DB', {query, sql, params})
    return await dbQuery(sql, params)
  }
  catch(err) {
    console.log('search Error', err)
  }
}
Solution: I was being stupid.
I had forgotten to change the = to LIKE in my query.
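The corrected clause, matching the working HeidiSQL query:

sql += ' (word LIKE ? OR reading LIKE ?)'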
I am migrating the database of my Node.js/TypeScript project from Oracle to MySQL.
My queries/DML in Oracle are all bound in this style:
conn.execute(
  'select date, name from table where id = :ID and field = :VAR',
  {ID: variable1, VAR: variable2});
When using MySQL I found this:
connection.query(
  'select date, name from table where id = ? and field = ?',
  [variable1, variable2]);
The second approach is worse for me for the following reasons:
i- I would have to rewrite a lot of SQL calls in my code.
ii- I think the first approach is much more reliable, as you are not at risk of unpredictable results when the SQL changes.
Although I found some mention of the first style here, I couldn't make it work.
Any tips?
As I didn't find anything ready-made that solved the issue, I tried to solve the problem myself. Maybe this will be helpful.
First, this code takes an Oracle-style bind object like {ID: 105, DEPT: 'MKT'} and a query like 'select * from emp where id = :ID and deptName = :DEPT', and translates them into [105, 'MKT'] and 'select * from emp where id = ? and deptName = ?'.
Here is the code:
const endBindCharacters: string = ' )=';

function prepareSQL(sql: string, binders: Object = null, valueArray: TBindArray): string {
  let ich: number = 0;
  let bindVariable: string;
  if (!binders) {
    if (sql.indexOf(':') > 0) {
      throw new CustomError(errorCodes.connection.sqlBoundWithNoBinders,
        'No binders {} in a bound SQL ' + sql);
    };
    return sql;
  };
  while ((ich = sql.indexOf(':')) > 0) {
    bindVariable = '';
    // collect characters until a terminator (space, ')' or '=') or the end of the string
    while (!endBindCharacters.includes(sql[++ich]) && ich < sql.length) {
      bindVariable += sql[ich];
    };
    if (bindVariable in binders) { // 'in' rather than a truthiness test, so falsy values like 0 or '' still bind
      valueArray.push(binders[bindVariable]);
    } else {
      throw new CustomError(errorCodes.connection.bindVariableNotInBinders, ' Bind variable ' + bindVariable +
        ' not found in the binders {} of the expression:\n' + sql)
    };
    sql = sql.replace(':' + bindVariable, ' ? ');
  };
  return sql;
};
This is the wrapper. It turns the callback-style query into a Promise.
export async function executeSQL(conn: TConnection, sql: string,
    binders: Object = {}): Promise<TReturn> {
  let bindArray: TBindArray = [];
  sql = prepareSQL(sql, binders, bindArray);
  console.log(sql, binders, bindArray);
  return new Promise<TReturn>(function(resolve, reject) {
    conn.query(sql, bindArray, function(err: db.IError, results: TReturn) {
      if (err) {reject(err)}
      else {resolve(results)};
    });
  });
};
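A quick usage sketch (the emp table and its columns are made up):

const rows = await executeSQL(conn,
  'select * from emp where id = :ID and deptName = :DEPT',
  {ID: 105, DEPT: 'MKT'});
// runs the translated query with the positional array [105, 'MKT']

As an alternative, the mysql package's README ("Custom format" section) documents overriding connection.config.queryFormat to support :name placeholders directly, which avoids a separate translation pass:

connection.config.queryFormat = function (query, values) {
  if (!values) return query;
  return query.replace(/\:(\w+)/g, function (txt, key) {
    if (values.hasOwnProperty(key)) {
      return this.escape(values[key]);
    }
    return txt;
  }.bind(this));
};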
How do I format my JSON data and/or change my function so that it gets stored as columns in Azure Table Storage?
I am sending a JSON string to the IoT Hub:
{"ts":"2017-03-31T02:14:36.426Z","timeToConnect":"78","batLevel":"83.52","vbat":"3.94"}
I run the sample function (in the Azure Function App module) to transfer the data from the IoT Hub into my storage account:
'use strict';
// This function is triggered each time a message is received in the IoT Hub.
// The message payload is persisted in an Azure Storage Table.
var moment = require('moment');

module.exports = function (context, iotHubMessage) {
  context.log('Message received: ' + JSON.stringify(iotHubMessage));
  context.bindings.deviceData = {
    "partitionKey": moment.utc().format('YYYYMMDD'),
    "rowKey": moment.utc().format('hhmmss') + process.hrtime()[1] + '',
    "message": JSON.stringify(iotHubMessage)
  };
  context.done();
};
But in my storage table it shows up as a single string rather than getting split into columns (as seen in Storage Explorer).
How do I get it into columns for ts, timeToConnect, batLevel, and vbat?
In case anyone is looking for a solution in C#:
private static async Task ProcessMessage(string message, DateTime enqueuedTime)
{
    var deviceData = JsonConvert.DeserializeObject<JObject>(message);
    var dynamicTableEntity = new DynamicTableEntity();
    dynamicTableEntity.RowKey = enqueuedTime.ToString("yyyy-MM-dd HH:mm:ss.fff");

    foreach (KeyValuePair<string, JToken> keyValuePair in deviceData)
    {
        if (keyValuePair.Key.Equals("MyPartitionKey"))
        {
            dynamicTableEntity.PartitionKey = keyValuePair.Value.ToString();
        }
        else if (keyValuePair.Key.Equals("Timestamp")) // a "Timestamp" property has to be stored under a different column name, because the "Timestamp" column is filled automatically when a row is added to table storage
        {
            dynamicTableEntity.Properties.Add("MyTimestamp", EntityProperty.CreateEntityPropertyFromObject(keyValuePair.Value));
        }
        else
        {
            dynamicTableEntity.Properties.Add(keyValuePair.Key, EntityProperty.CreateEntityPropertyFromObject(keyValuePair.Value));
        }
    }

    CloudStorageAccount storageAccount = CloudStorageAccount.Parse("myStorageConnectionString");
    CloudTableClient tableClient = storageAccount.CreateCloudTableClient();
    CloudTable table = tableClient.GetTableReference("myTableName");
    table.CreateIfNotExists();
    var tableOperation = TableOperation.Insert(dynamicTableEntity);
    await table.ExecuteAsync(tableOperation);
}
How do I get it into columns for ts, timeToConnect, batLevel, and vbat?
To get these attributes as separate columns in the table, you need to flatten the object and store each property separately (currently you are converting the entire object into one string and storing that string).
Please try the following code:
var moment = require('moment'); // needed for the partition/row keys, as in the original function

module.exports = function (context, iotHubMessage) {
  context.log('Message received: ' + JSON.stringify(iotHubMessage));
  var deviceData = {
    "partitionKey": moment.utc().format('YYYYMMDD'),
    "rowKey": moment.utc().format('hhmmss') + process.hrtime()[1] + '',
  };
  // copy each property of the message into its own column
  Object.keys(iotHubMessage).forEach(function(key) {
    deviceData[key] = iotHubMessage[key];
  });
  context.bindings.deviceData = deviceData;
  context.done();
};
Please note that I have not tried to execute this code, so it may contain some errors.
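With the sample message from above, the entity handed to the deviceData binding would look roughly like this (rowKey shown schematically):

{
  partitionKey: '20170331',
  rowKey: '021436<hrtime nanoseconds>',
  ts: '2017-03-31T02:14:36.426Z',
  timeToConnect: '78',
  batLevel: '83.52',
  vbat: '3.94'
}

so ts, timeToConnect, batLevel, and vbat each land in their own column.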