I have the following query to select a list of vocabulary from a Japanese dictionary.
SELECT * FROM dictionary
WHERE from_local = 1
AND (word like '%後%' or reading like '%後%')
Running this query in HeidiSQL, it works as expected. I feel like it could be a charset issue, but I don't think it would work at all if that were the case. (See screenshot)
My problem occurs when I try to run this query in my Node.js app: the results come back empty.
I am using npm's mysql library. The dbQuery method is a helper function I made (Pastebin link).
import { dbQuery } from '../db'
const search = async(query) => {
try {
let sql = 'SELECT * FROM dictionary WHERE'
sql += ' from_local = ? AND'
sql += ' (word = ? OR reading = ?)'
const params = [1, '%'+query+'%', '%'+query+'%']
console.log('dictionary DB', {query, sql, params})
return await dbQuery(sql, params)
}
catch(err) {
console.log('search Error', err)
}
}
Solution: I was being stupid.
I had forgotten to change the = to LIKE in my query.
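For reference, the corrected part of the search function (same placeholders, only = changed to LIKE):
let sql = 'SELECT * FROM dictionary WHERE'
sql += ' from_local = ? AND'
sql += ' (word LIKE ? OR reading LIKE ?)'
const params = [1, '%' + query + '%', '%' + query + '%']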
I'm inserting an array of rows with TypeORM using INSERT.
I'm going to do another task using the IDs that come back as a result of the insert.
Are the IDs that come back from the insert in the same order as the array that was inserted?
// customRepository.ts
insertArr(nameArr : {name : string}[]){
return this.createQueryBuilder()
.insert()
.into(customTable)
.values(nameArr)
.execute()
}
// service.ts
const connection = getConnection();
const repository = connection.getCustomRepository('customRepository')
const arr = [{name : 'first'},{name : 'second'}]
const result = await repository.insertArr(arr);
console.log('result : ', result);
//Does this result come out in the order of insert?
Thank you!!!
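For what it's worth, here is a minimal sketch of what the InsertResult returned by execute() exposes (assuming customTable has an auto-generated id column); whether identifiers lines up with the order of the input array is exactly what is being asked, so treat this as illustration only:
// service.ts (sketch)
const result = await repository.insertArr([{ name: 'first' }, { name: 'second' }]);
console.log(result.identifiers);   // e.g. [ { id: 1 }, { id: 2 } ] - one identifier object per inserted row
console.log(result.generatedMaps); // generated column values per row (auto ids, defaults, ...)
console.log(result.raw);           // the raw response from the underlying driver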
I am executing this SELECT request in a MySQL database with Node.js. The expected result is just an array of strings containing the links: ['http://hgj','http://jfhd'], but with the code below it shows me: [{link:'http://hgj'},{link:'http://jfhd'}]. How can I remove the {link} objects and keep in the table only the strings 'http...'?
query = "select link from weblist";
var res= await con.query(query, (error, response) => {
console.log('link from database', error || response);
var table = JSON.parse(JSON.stringify(response));
return table ;
});
There are a few simple steps involved in this solution:
Stringify each object using JSON.stringify. This is now a string.
The string contains characters you don't want, so use String.prototype.replace() with a regex literal that strips the unwanted wrapper characters, replacing them with "" (i.e. nothing).
Return the new strings, which gives you a representation of your URLs inside an array.
One-liner solution:
// links is the array of row objects returned by the query, e.g. [ { link: 'http://hgj' }, { link: 'http://jfhd' } ]
let oneliner = links.map((elem) => {
    // strip the surrounding {"link":"..."} wrapper, leaving only the URL itself
    return JSON.stringify(elem).replace(/^\{"link":"|"\}$/g, "");
})
console.log(oneliner);
// [ 'http://hgj', 'http://jfhd' ]
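As a simpler alternative (a sketch, assuming links is the array of row objects returned by the query), you can skip the string juggling and map each row object to its link property:
const onlyUrls = links.map((row) => row.link);
console.log(onlyUrls); // [ 'http://hgj', 'http://jfhd' ]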
I'm working on a project with Node.js and MySQL and I have a problem with queries that contain an apostrophe. I get all of the data from the GitHub API and it normally works fine, but if the data contains a single apostrophe ('), I get a syntax error.
The error looks like this:
ER_PARSE_ERROR: You have an error in your SQL syntax; check the manual that corresponds to your MariaDB server version for the right syntax to use near 's computer engineering classes.', '2017-08-28T04:05:02Z', '2018-10-21T12:04:50Z'' at line 1
At first, I was thinking about using regular expressions to get rid of all ', but that is not a good way because it alters the original data itself.
I searched for solutions on Google, but it's really hard to find anything for this problem. Are there any ideas or solutions?
// User Repository Information API Process
request(repositoryOptions, function (error, response, data) {
if (error) {
throw error;
}
let result = JSON.parse(data);
for (i = 0; i < result.length; i++) {
// console.log(result[i]);
let sid = shortid.generate();
let githubid = result[i].owner.login;
let name = result[i].name;
let githuburl = result[i].html_url;
let explanation = result[i].description;
let created_at = result[i].created_at;
let updated_at = result[i].updated_at;
let sqlData = `('${sid}', '${githubid}', '${name}', '${githuburl}', '${explanation}', '${created_at}', '${updated_at}')`;
console.log(sqlData);
let sql = `INSERT INTO Personal_Data (id, githubid, name, githuburl, explanation, pjdate1, pjdate2) VALUES ${sqlData}`;
db.query(sql);
}
})
I solved the problem myself. @scetiner helped me with the right keyword.
I created an array and put all the variables in it, and to protect against SQL injection attacks I have to use placeholders (I'm not sure what they are called; they look like ?,?,?,?,?).
Anyway, here's the code that I modified.
request(repositoryOptions, function (error, response, data) {
if (error) {
throw error;
}
let result = JSON.parse(data);
for (i = 0; i < result.length; i++) {
// console.log(result[i]);
let sid = shortid.generate();
let githubid = result[i].owner.login;
let name = result[i].name;
let githuburl = result[i].html_url;
let explanation = result[i].description;
let created_at = result[i].created_at;
let updated_at = result[i].updated_at;
let sqlData = [sid, githubid, name, githuburl, explanation, created_at, updated_at];
console.log(sqlData);
let sql = `INSERT INTO Personal_Data (id, githubid, name, githuburl, explanation, pjdate1, pjdate2) VALUES (?,?,?,?,?,?,?)`;
db.query(sql, sqlData);
}
})
and then it works fine, like this (console output):
[ 'FUuBdByBV',
'sangumee',
'Blueinno2',
'https://github.com/sangumee/Blueinno2',
'This repository uses BlueInno 2 and shares the source code written.',
'2018-06-11T04:08:17Z',
'2018-12-04T07:48:08Z' ]
[ 'HiRsc7IjNc',
'sangumee',
'CSS-Grid',
'https://github.com/sangumee/CSS-Grid',
'CSS Grid Study',
'2018-07-13T07:49:57Z',
'2018-07-19T05:17:14Z' ]
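As an optional refinement (not part of the original answer), the mysql package can also do a bulk insert by expanding a nested array for a single ? placeholder, so the loop of per-row queries is not needed. A rough sketch using the same GitHub API result:
// one inner array per row, in the same order as the column list below
const rows = result.map((repo) => [
    shortid.generate(),
    repo.owner.login,
    repo.name,
    repo.html_url,
    repo.description,
    repo.created_at,
    repo.updated_at
]);
const bulkSql = 'INSERT INTO Personal_Data (id, githubid, name, githuburl, explanation, pjdate1, pjdate2) VALUES ?';
db.query(bulkSql, [rows]); // the nested array expands to ('a','b',...), ('c','d',...)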
I am migrating the database of my Node.js/TypeScript project from Oracle to MySQL.
My queries/DML in Oracle are all bound in this style:
conn.execute(`select date, name from table
    where id = :ID and field = :VAR`,
    {ID: variable1, VAR: variable2});
When using MYSQL I found this:
connection.query(`select date, name from table
    where id = ? and field = ?`,
    [variable1, variable2]);
The second approach is worse for me for the following reasons:
i- I would have to rewrite a lot of SQL calls in my code
ii- I think the first approach is much more reliable, as you don't have to worry about unpredictable results when the SQL changes
Although I found some mention of the first style here, I couldn't make it work.
Any tips?
As I didn't find anything ready-made that could solve the issue, I tried to solve the problem myself. Maybe this could be helpful.
First, this code takes an Oracle-style bind object like {ID: 105, DEPT: 'MKT'} and a query like 'select * from emp where id = :ID and deptName = :DEPT', and translates them into [105, 'MKT'] and 'select * from emp where id = ? and deptName = ?'.
Here is the code:
const endBindCharacters: string = ' )=';
function prepareSQL(sql: string, binders: Object = null, valueArray: TBindArray): string {
let ich: number = 0;
let bindVariable: string;
if (! binders) {
if (sql.indexOf(':') > 0) {
throw new CustomError(errorCodes.connection.sqlBoundWithNoBinders,
'No binders {} in a bound SQL ' + sql);
};
return sql;
};
while ((ich = sql.indexOf(':')) > 0) {
bindVariable = '';
while (!endBindCharacters.includes(sql[++ich]) && ich < sql.length) {
bindVariable += sql[ich];
};
if (binders[bindVariable] !== undefined) { // allow falsy bind values such as 0 or ''
valueArray.push(binders[bindVariable]);
} else {
throw new CustomError(errorCodes.connection.bindVariableNotInBinders, ' Bind variable ' + bindVariable +
' not found in the binders {} of the expression:\n' + sql)
};
sql = sql.replace(':' + bindVariable, ' ? ');
};
return sql;
};
This is the wrapper. It wraps the callback-style query in a Promise.
export async function executeSQL (conn: TConnection, sql: string,
binders: Object = {}): Promise<TReturn> {
let bindArray: TBindArray = [];
sql = prepareSQL(sql, binders, bindArray);
console.log(sql, binders, bindArray);
return new Promise<TReturn>(function(resolve, reject) {
conn.query(sql, bindArray , function(err: db.IError, results: TReturn) {
if(err) {reject(err)}
else {resolve(results)};
});
});
};
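A quick usage sketch (assuming conn is a TConnection obtained elsewhere in the project): the Oracle-style call from the question can then stay almost unchanged:
// the named binders are translated internally into '?' placeholders by prepareSQL
const rows = await executeSQL(conn,
    'select date, name from table where id = :ID and field = :VAR',
    {ID: variable1, VAR: variable2});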
I have a text file. I need to read the file inside a function and return it as a JSON object. The following throws the error "Unexpected token V in JSON at position 0".
Server.js
fs.readFile('result.txt', 'utf8', function(err, data) {
if(err) throw err;
obj = JSON.parse(data);
console.log(obj);
});
result.txt looks like the following
VO1: 10 5 2
VO2: 5 3 2
I think I cannot use JSON.parse directly. How do I proceed?
Assuming the following:
Every line is separated by a newline character (\n)
Every line contains a : where the part before it is the key and the part after it is a space-separated string holding that key's values.
Below should work for your format:
fs.readFile('result.txt', 'utf8', function(err, data) {
if(err) throw err;
let obj = {};
let splitted = data.toString().split("\n");
for (let i = 0; i < splitted.length; i++) {
if (!splitted[i].trim()) continue; // skip empty lines such as a trailing newline
let splitLine = splitted[i].split(":");
obj[splitLine[0]] = splitLine[1].trim().split(" ").map(Number); // e.g. "10 5 2" -> [10, 5, 2]
}
console.log(obj);
});
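With the sample result.txt above, this should log something like { VO1: [ 10, 5, 2 ], VO2: [ 5, 3, 2 ] }.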
It could be an issue with the UTF-8 string format. I tried the code below and it works:
const resultBuffer = fs.readFileSync('result.txt');
const resultData = JSON.parse(resultBuffer.toString().trim());
Thanks to Baao for providing that answer.
As another flavor of solution, if you don't have any ":" (for example, just a plain list of file names) you could always code in a key like so:
var data = fs.readFileSync(pathAndFilename);
var testData = {};
var splitList = data.toString().split('\r\n');
for (var i = 0; i < splitList.length; i++) {
testData['fileNumber' + i.toString()] = splitList[i];
}
You need to parse the text file yourself. You can use a RegExp or some other means to extract the values, create an object out of that, and then JSON.stringify it.
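A rough sketch of that approach (assuming the result.txt format shown above, with a key before the colon and space-separated numbers after it):
const fs = require('fs');

fs.readFile('result.txt', 'utf8', function (err, data) {
    if (err) throw err;
    const obj = {};
    // match lines like "VO1: 10 5 2" - the key before the colon, the numbers after it
    const lineRegex = /^(\w+):\s*(.+)$/gm;
    let match;
    while ((match = lineRegex.exec(data)) !== null) {
        obj[match[1]] = match[2].trim().split(/\s+/).map(Number);
    }
    console.log(JSON.stringify(obj)); // {"VO1":[10,5,2],"VO2":[5,3,2]}
});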
Improving upon @baao's answer:
const fs = require("fs")
fs.readFile('.czrc', 'utf8', function (err, data) {
if (err) {
console.error(err)
throw "unable to read .czrc file.";
}
const obj = JSON.parse(data)
});
Your result.txt is not valid JSON. Valid JSON would look like this:
{
"VO1": [10, 5, 2],
"VO2": [5, 3, 2]
}