I am using the https://github.com/felixge/node-mysql module with node.js.
The MySQL table has a field of type POINT. The module requires an array of arrays to bulk-insert records, but it doesn't seem to have an option to specify a data type.
So naturally, the following gets enclosed in quotes:
var loc = "GeomFromText('POINT(" + lat + "," + lon + ")')";
Has anybody tried this? How can I convince the query builder to treat this as an SQL function?
Or do I have to write my own query builder?
There is a pull request from kevinhikaruevans that does it. You can do something like this to convert objects to points:
if (typeof val === 'object') {
    if (val.hasOwnProperty('lat') && val.hasOwnProperty('long')) {
        return 'POINT(' + [val.lat, val.long].map(parseFloat).join(',') + ')';
    }
}
Supposing you have a table mytable with only the field point of type POINT, you would insert them like this:
var points = [
    [{ lat: 1, long: 4 }],
    [{ lat: 23, long: -8.345 }]
];
var query = connection.query('INSERT INTO mytable(point) VALUES ?', [points], your_callback_func);
console.log("Query: " + query.sql);
This will generate a query similar to:
INSERT INTO mytable(point)
VALUES (POINT(1,4)), (POINT(23,-8.345))
This would convert any object with both lat and long fields to a MySQL point. If this is not an intended behavior, you could create a Point class and use it instead of plain objects, and in lib/protocol/SqlString.js check if the value is an instance of Point.
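For illustration, a minimal sketch of that idea (the names Point and toSql here are hypothetical, not part of node-mysql):

```javascript
// Hypothetical Point wrapper; not part of node-mysql itself.
function Point(lat, long) {
    this.lat = parseFloat(lat);
    this.long = parseFloat(long);
}

// Sketch of the check you would add in lib/protocol/SqlString.js:
// only explicit Point instances become POINT(...) literals.
function toSql(val) {
    if (val instanceof Point) {
        return 'POINT(' + val.lat + ',' + val.long + ')';
    }
    return null; // fall through to the module's normal escaping
}
```

This way a plain {lat, long} object is still escaped normally, and only values you deliberately wrap in Point are emitted as geometry.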
Try constructing a query to handle POINT() and the batch, where site is an object with the properties and values shown below. This approach works for me.
pool.query('INSERT INTO table SET geometryField = POINT(?,?), ?', [coords.lat, coords.lng, site], function(err, response) {

{ sitename: 'A Site',
  customer: 'A Customer',
  country: 'AL',
  timezone: 'America/Los_Angeles',
  address1: '123 My Street',
  city: 'MyCity',
  state: 'WA',
  postalcode: '98110' }
I have tables in one DB (MySQL) that I need to sync with corresponding tables in another DB (MSSQL), but the field names are different. I was wondering what an efficient way would be to convert the field names after fetching the rows, so that I could insert them into the other tables.
I was thinking of doing something like this: make an object where the keys are the original table's column names and the values are the destination table's column names:
{
    name: 'UNAME',
    id: 'CID',
    location: 'LOC'
}
And the rows that I fetched and need to insert would look something like this:
{
    name: 'Ethan',
    id: 1234,
    location: 'somewhere1'
},
{
    name: 'Jhon',
    id: 5678,
    location: 'somewhere2'
}
and then iterate over these objects and change their key names according to the conversion table, so that I can insert them into the destination table properly.
I can't just insert without field names, as the fields are not in the same order.
How can I do what I've described efficiently? Do you have ideas for better strategies to accomplish this?
thanks a lot!
Sounds about right, how about this:
const converter = {
    name: 'UNAME',
    id: 'CID',
    location: 'LOC'
};
let newData = [];
dbResults.forEach(row => {
    newData.push({
        [converter['name']]: row['name'],
        [converter['id']]: row['id'],
        [converter['location']]: row['location']
    });
});
EDIT:
Actually, looking at the above code, there is no need for the converter object:
let newData = [];
dbResults.forEach(row => {
    newData.push({
        UNAME: row['name'],
        CID: row['id'],
        LOC: row['location']
    });
});
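If the column set changes often, the renaming can also be done generically from the mapping in the question (original column name → destination column name); a sketch, with sample rows standing in for the real dbResults:

```javascript
// Mapping from the question: original column name → destination column name
const converter = {
    name: 'UNAME',
    id: 'CID',
    location: 'LOC'
};

// Sample rows standing in for the rows fetched from the source table
const dbResults = [
    { name: 'Ethan', id: 1234, location: 'somewhere1' },
    { name: 'Jhon', id: 5678, location: 'somewhere2' }
];

// Build each destination row by renaming every mapped column
const newData = dbResults.map(row =>
    Object.fromEntries(
        Object.entries(converter).map(([srcKey, destKey]) => [destKey, row[srcKey]])
    )
);
```

This needs no per-column code, so adding or removing a column only means editing the converter object (Object.fromEntries requires Node 12+).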
Okay, so I've extracted some values from Excel using the npm package 'xlsx', and I want to run a MySQL query with the formatted result.
Excel extraction
let wb = xlsx.readFile(filePath); // GET WORKBOOK
let ws = wb.Sheets[wb.SheetNames[0]]; // SELECT THE FIRST SHEET IN THE ARRAY
let data = xlsx.utils.sheet_to_json(ws); // CONVERT DATA TO JSON OBJECT
let s = ''; // CREATE VARIABLE TO HOLD FORMATTED STRING
for (let i = 0; i < data.length; i++) {
    s += "'" + data[i].id + "',"; // FORMAT OBJECT TO STRING
}
let fullString = s.substr(0, s.length - 1); // STORE FORMATTED STRING, REMOVING FINAL COMMA (,)
The formatted string looks like this:
'2019-0027178','2019-0027179','2019-0027180','2019-0027181','2019-0027182','2019-0027183'
The MySQL query is like so:
SELECT name, email, phone FROM persons WHERE id IN (?),
[fullString]
What's expected:
A json object containing the requested information like so:
[{name: "John", email: "john.doe@email.com", phone: "123456789"}, ... ]
Actual result:
An empty array like so:
[]
Investigation and Thoughts
I found out backticks were being added to the query string like so:
... WHERE id IN (`'2019-0027178','2019-0027179','2019-0027180','2019-0027181','2019-0027182','2019-0027183'`)
Actual actual main question:
Is there something I'm doing wrong or is there a proper way to do this?
EDIT!
So for a single question mark, i.e. ... WHERE id IN (?), I get the empty array as stated above. But for two question marks, i.e. ... WHERE id IN (??), I get this error:
{
"code": "ER_BAD_FIELD_ERROR",
"errno": 1054,
"sqlMessage": "Unknown column ''2019-0027178','2019-0027179','2019-0027180','2019-0027181','2019-0027182','2019-0027183' in 'where clause'",
"sqlState": "42S22",
"index": 0,
"sql": "SELECT name, email, phone FROM persons WHERE id IN (`'2019-0027178','2019-0027179','2019-0027180','2019-0027181','2019-0027182','2019-0027183'`)"
}
Let's say you had a table called personfilter with the columns key and personid.
You then generate a unique key, for example 27. Then the following would do what you want:
INSERT INTO personfilter(key,personid)
values
(27, '2019-0027178'),
(27, '2019-0027179'),
(27, '2019-0027180'),
(27, '2019-0027181'),
(27, '2019-0027182'),
(27, '2019-0027183')
Then you could do the following select
SELECT name, email, phone
FROM persons
JOIN personfilter on personfilter.key = 27 and personfilter.personid = persons.id
Okay so I figured I could just 'prepare' the query before executing it.
~SOLUTION~
Instead of doing this: let fullString = s.substr(0, s.length-1); and then passing fullString as a parameter to the query string, I just did this:
let fullString= `SELECT name, email, phone FROM persons WHERE id IN (${s.substr(0, s.length-1)})`;
Then passed fullString in place of where the actual query was.
Thanks.
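For completeness, the mysql module can also expand a plain array bound to a single ? in IN (...), which avoids hand-building the quoted string (and the injection risk that comes with concatenating values into SQL); a sketch, with sample rows standing in for the sheet data:

```javascript
// Sample rows standing in for the objects returned by sheet_to_json
const data = [{ id: '2019-0027178' }, { id: '2019-0027179' }];

// Collect the plain, unquoted values; the driver does the quoting
const ids = data.map(row => row.id);

// With the node mysql driver, a nested array bound to one ? expands
// to a comma-separated, escaped list:
// connection.query('SELECT name, email, phone FROM persons WHERE id IN (?)',
//                  [ids], callback);
// → ... WHERE id IN ('2019-0027178', '2019-0027179')
```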
I've been trying, and failing, to generate a Google BigQuery table using a schema that I build from text.
I have no problem defining the schema in script like this:
var fl = {fields: [{name: 'issue_id', type: 'STRING'},.....}
then assigning it as schema: fl
What I want to do is use an array as input to the field list (e.g. name, type) and dynamically build this list into a table schema. I can do this in text (simple string building) but I can't use a text string as a schema - it appears as a null.
There's probably a wildly simple solution but I've not found it yet. It needs to avoid any add-ons if at all possible.
Specific code information
This is the table definition, which requires a schema.
var table = {
    tableReference: {
        projectId: projectId,
        datasetId: datasetId,
        tableId: tableId
    },
    schema: fl
};
If I define fl as below, I don't have a problem. I'm using the expected syntax and it all works.
var fl = {fields: [{name: 'issue_id', type: 'STRING'},{name: 'internal_issue_id', type: 'STRING'}] };
However, if I define fl as below (fs is an array and I'm concatenating text from this array), I end up with fl as a string, which doesn't work here.
var fl = "{fields: [";
while (countRow < numRows) {
    fl = fl + "{name: '" + fs[countRow][0] + "', type: '" + fs[countRow][1] + "'},";
    countRow = countRow + 1;
}
fl = fl.substring(0, fl.length - 1) + "] }";
The string looks exactly like the originally defined variable, but of course it is a string, so I didn't really expect it to work without some sort of conversion - just like a date string usually needs conversion before it can be used in date calculations. Currently it appears as null to the table definition.
I'm sure I'm not the first person to want to do this, and hoping there's a simple solution.
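One way to avoid the string problem entirely is to build the schema as an object rather than as text; a sketch, assuming fs is an array of [name, type] pairs as in the question:

```javascript
// Sample field list standing in for the fs array built elsewhere
var fs = [
    ['issue_id', 'STRING'],
    ['internal_issue_id', 'STRING']
];

// Build the schema object directly from the array; no string parsing needed
var fl = {
    fields: fs.map(function(row) {
        return { name: row[0], type: row[1] };
    })
};
```

fl can then be passed as schema: fl in the table definition, exactly like the hand-written version.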
This should be a fairly simple one.
myObject has various properties: _id, name, createdBy, date, etc.
In my find query I want to only return specific fields from within myObject. So for example, what would I need to do to modify the find query below so that only name was returned?
myCollection.find({createdBy: someId}, {fields: {myObject: 1}}).fetch();
Currently this will return everything in myObject which it should do, I just want one field within myObject returned.
Here is a way to do it within the query:
myCollection.find({createdBy: someId}, {fields: {'myObject.name': 1}}).fetch();
Note the quotes around 'myObject.name'.
Let's assume we are talking about posts, and a post document looks like this:
{
    _id: 'abc123',
    title: 'All about meteor',
    author: {
        firstName: 'David',
        lastName: 'Weldon'
    }
}
You can then extract all of the last names from all of the authors with this:
var lastNames = Posts.find().map(function(post) {
    return post.author.lastName;
});
Modify the selector and options as needed for your collection. Using fields in this case may be a small optimization if you are running this on the server and fetching the data directly from the DB.
I have a form that I need to populate with JSON data. The form contains select, textarea, and input elements that need to be populated. The JSON data is complex / hierarchical (i.e. many nested objects).
I am aware of http://www.keyframesandcode.com/code/development/javascript/jquery-populate-plugin/ but it uses square bracket notation to map to field names, e.g.
<input name="person[name][last]" ...
I need to use dot notation instead, e.g.
<input name="person.name.last" ...
I'm using jQuery so a jQuery solution is fine. Thanks.
Here's a hacked together alternative to populate using a recursive function:
function populator(json, nodes) {
    $.each(json, function(key, value) {
        var newNodes = nodes ? nodes.slice() : [];
        newNodes.push(key);
        if (typeof value === "object") {
            populator(value, newNodes);
        } else {
            $('[name="' + newNodes.join('.') + '"]').val(value);
        }
    });
}
With this you can do:
populator({
    person: {
        name: {
            last: 'Doe',
            first: 'John'
        },
        address: {
            street: '123 Main Street',
            city: 'Montgomery',
            state: 'AL'
        }
    }
});