EDIT 3: The problem below exists in ColdFusion 9.0; updating to 9.0.1 does indeed fix it.
I have an application that is using SerializeJSON to encode query results:
#SerializeJSON('Ok works fine')#
Unfortunately it trims the trailing zeroes from numbers:
#SerializeJSON(12345.50)#
Even if I manually make the same value a string, the same thing occurs:
#SerializeJSON('12345.50')#
How can I prevent this from happening?
EDIT - my scenario specifics
The database (Oracle) has these example values stored in a row:
benefactor_id : 0000729789 varchar2(10)
life_gift_credit_amt : 12345.50 number(14,2)
When I query using ColdFusion 9.0.1 (via cfscript, if it matters), here is an RC dump. Notice the id string retains its leading zeroes, but the number column has dropped the trailing zero.
While that is interesting, it doesn't matter to the original issue: even if I create a query manually that retains that trailing zero (like the one further below), it still gets lost in SerializeJSON.
I take the query results and encode the values using SerializeJSON. The JSON is consumed by jQuery DataTables via AJAX. Notice the id string has become a number and has gained a '.0', as Miguel-F mentioned.
<cfscript>
...
rc.sql = q.setsql;
rc.qResult = q.execute().getresult();
savecontent variable="rc.aaData" {
for (i=1; i <= rc.qResult.RecordCount; i++) {
writeOutput('{');
for (col=1; col <= iColumnsLen; col++) {
// the following line contains a conditional specific to this example
writeOutput('"#aColumns[col]#":#SerializeJSON(rc.qResult[aColumns[col]][i])#');
//former statement, discarded because it could not handle apostrophes etc. ... writeOutput('"#jsStringFormat(rc.qResult[aColumns[col]][i])#"');
writeOutput((col NEQ iColumnsLen) ? ',' : '');
}
writeOutput('}');
writeOutput((i NEQ rc.qResult.RecordCount) ? ',' : '');
}
};
</cfscript>
I was originally using jsStringFormat instead of SerializeJSON, but that would return invalid JSON because the comments text area contains apostrophes, etc.
{
"sEcho": 1,
"iTotalRecords": 65970,
"iTotalDisplayRecords": 7657,
"aaData": [
{
"nd_event_id": 525,
"benefactor_id": 729789.0,
"seq_number": 182163,
"life_gift_credit_amt": 12345.5,
"qty_requested": 2,
"b_a_comment": "#swap",
"pref_mail_name": "Jay P. Rizzi"
}
]
}
EDIT 2
A quick side note: if I change my serialization line to
writeOutput('"#aColumns[col]#": "#SerializeJSON(rc.qResult[aColumns[col]][i])#"');
then my result set changes so that every value is wrapped in double quotes, but strings end up double-double-quoted, while the trailing zero is still removed. It leads me to believe SerializeJSON is casting the value to a type?
"aaData": [
{
"nd_event_id": "525",
"benefactor_id": "729789.0",
"seq_number": "182163",
"life_gift_credit_amt": "12345.5",
"qty_requested": "2",
"b_a_comment": ""#swap"",
"pref_mail_name": ""JayP.Rizzi""
},
This is a bit baffling... I tested in CF 9 as well. Not really knowing what you are doing with the serialized data (passing it to a service, outputting it on a page, etc.), I put together some test patterns. One possible solution, if you are only trying to serialize a single value: don't. You can actually run DeserializeJSON against your numeric value without serializing it first, and all it does is strip the trailing 0. Otherwise, if you must serialize a single value and don't want the trailing 0 stripped, set the variable so it contains the quotation marks:
<cfset manualserial = '"111.10"'>
<cfdump var="#DeSerializeJson(manualserial)#">
At this point you can use DeserializeJSON and see that it maintains the 0, with an output of 111.10.
Below is some additional testing so you can see what happens when serializing an array while trying to keep the trailing 0... no luck. However, when I forwent the built-in CF serialization and just created the serialized string myself, the trailing 0 was maintained (refer to the variables customarr and d_customarr in the WriteDump example below).
Hope that helps a little.
<cfscript>
/*initial testing*/
string = SerializeJSON('Ok works fine');
numericstring = SerializeJSON('12345.50');
numeric = SerializeJSON(12345.50);
arr = SerializeJSON([12345.50,12345.10,'12345.20']);
arrFormat = SerializeJSON([NumberFormat(12345.50,'.00') & ' ',12345.10,'12345.20']);
d_string = DeSerializeJSON(string);
d_numericstring = DeSerializeJSON(numericstring);
d_numeric = DeSerializeJSON(numeric);
d_arr = DeSerializeJSON(arr);
d_arrFormat = DeSerializeJSON(arrFormat);
/*technically, there is no need to serialize a single string value, as running through DeSerialize just trims the trailing 0
if you need to do so, you would want to pass in as a string with quotation marks*/
customstring = '"12345.50"';
d_customstring = DeSerializeJSON(customstring);
customarr = '["12345.50","12345.10","12345.20"]'; //--you can format your own array instead of using CF to serialize
d_customarr = DeSerializeJSON(customarr);
WriteDump(variables);
</cfscript>
=======appended possible solution b========
I think that manually serializing your records may be the most stable option. Try this example; if it works, you should be able to add the function to a CFC or create a UDF for re-use. Hope it helps.
<cfscript>
q = QueryNew('nd_event_id,benefactor_id,seq_number,life_gift_credit_amt,qty_requested,b_a_comment,pref_mail_name',
'Integer,VarChar,Integer,Decimal,Integer,VarChar,VarChar');
r = queryaddrow(q,2);
querysetcell(q, 'nd_event_id', 525, 1);
querysetcell(q, 'benefactor_id', 0000729789, 1);
querysetcell(q, 'seq_number', 182163, 1);
querysetcell(q, 'life_gift_credit_amt', 12345.50, 1);
querysetcell(q, 'qty_requested', 2, 1);
querysetcell(q, 'b_a_comment', '##swap', 1);
querysetcell(q, 'pref_mail_name', 'Jay P. Rizzi', 1);
querysetcell(q, 'nd_event_id', 525, 2);
querysetcell(q, 'benefactor_id', 0000729790, 2);
querysetcell(q, 'seq_number', 182164, 2);
querysetcell(q, 'life_gift_credit_amt', 12345.90, 2);
querysetcell(q, 'qty_requested', 10, 2);
querysetcell(q, 'b_a_comment', '##swap', 2);
querysetcell(q, 'pref_mail_name', 'Jay P. Rizzi', 2);
WriteDump(q);
s = membershipManualSerializer(q);
public string function membershipManualSerializer(required query q){
var jsonString = '{"aaData":[';
var cols = listtoarray(q.columnList,',');
for(var i=1; i lte q.recordcount; i++){
jsonString &= "{";
for(var c=1;c lte arraylen(cols);c++){
jsonString &= '"' & cols[c] & '":"' & q[cols[c]][i] & '"';
jsonString &= (c lt arraylen(cols))? ",":"";
}
jsonString &= (i lt q.recordcount)? "},":"}]";
}
jsonString &="}";
return jsonString;
}
WriteOutput(s);
WriteDump(DeserializeJson(s));
</cfscript>
Taken from the comments
The original poster (OP) of this question initially reported that they were having this issue with ColdFusion 9.0.1. As it turned out they were actually running ColdFusion 9.0.0. This is significant because Adobe had made changes to how the SerializeJSON() function treats numbers in version 9.0.1. When the server was upgraded to version 9.0.1 these issues were resolved.
This blog post by Raymond Camden discusses the changes made in 9.0.1 - Not happy with the CF901 JSON Changes?
In that blog post he references bug 83638 that had been entered and then fixed in HotFix 1 for version 9.0.1 - Cumulative Hotfix 1 (CHF1) for ColdFusion 9.0.1
If you search the BugBase for JSON under version 9.0.1, there are several bugs reporting the same issue as the OP.
Those reported bugs also mentioned another issue that the OP had not initially reported: a .0 was being appended to integers as well. Later in the discussion the OP confirmed that they were seeing this behavior too. This led them to verify the ColdFusion version being used, and they found that it was not in fact 9.0.1.
Related
I need a better understanding of stringify, escaping, and storing JSON in a MySQL database. The task looked easy, but with escaping I ran into some trouble, so I would be happy for a general explanation of the following questions:
What I am trying to do is store a JavaScript object in a MySQL DB. It works fine if I stringify it prior to sending; getting it back from the DB, I just parse it and everything is fine.
let myObj = {
name: 'Paul',
age: '24'
}
Now, I have additionally a message in my object, which can have special characters:
let myObj = {
name: 'Paul',
age: '24',
message: 'message with special characters: ',`´"~'
}
Also no problem; I started escaping. The result:
let myObj = {
name: 'Paul',
age: '24',
message: 'message with special characters: \'\,\`´\"\~'
}
If I do stringify the object, I get following result:
{
"name": "Paul",
"age": "24",
"message": "message with special characters: \\'\\,\\`´\\\"\\~"
}
Sending it to the MySQL DB gives the following error:
(node:13077) UnhandledPromiseRejectionWarning: Error: ER_PARSE_ERROR: You have an error in your SQL syntax; check the manual that corresponds to your MariaDB server version for the right syntax to use near '\,\`´\"\~"}
Due to the error, I manipulated the special characters and removed the additional '\', which gives the following result:
obj.message = obj.message.replace(/\\\\/g,'\\');
output:
{
"name": "Paul",
"age": "24",
"message": "message with special characters: \'\,\`´\\"\~"
}
Everything is fine, the data is transferred to the DB, and my MySQL update query no longer fails.
Questions:
Is there a better way of dealing with escaping inside an object which will be stringified and sent to a MySQL DB?
If yes, how is it done? Or is there no other way than removing the additional backslashes inserted by the stringify?
One step further: the message now also includes a newline (\n).
Output, stringified:
{
"name": "Paul",
"age": "24",
"message": "message with special characters: \'\,\`´\n\\"\~"
}
Sending it to the DB, I get the following entry (where the \n produces a new line):
{"name":"Paul","age":"24","message":"message with special characters: ',`´
\"~"}
This results in an error when parsing it back. Here is the log (server-side) prior to parsing (the error makes sense):
{"name":"Paul","age":"24","message":"\',`´\n' +
'\\"~"}
Question:
Regarding the part above, what do I have to do to get the \n escaped as well, so that the DB entry is correct and the DB doesn't interpret the \n as the start of a new line?
Happy for any explanation / help!
I don't know the correct way or the easy way, but this is how I did it when I needed to insert a user-generated field as JSON into a MySQL database:
function string_cleanJSON_preStringify(str)
{
    if (!str.replace) return str;
    str = str.replace(/'/g, "\\'");   // escape every ' at least once
    str = str.replace(/"/g, '\\"');   // escape every " at least once
    str = str.replace(/[\t\r\n\f]/g, ''); // remove problematic escape characters
    if (str.charAt(str.length - 1) == '\\') str += ' '; // add a blank space at the end if \ is the last character - for example: {"var":"\"} would be problematic
    return str;
}
function string_cleanJSON_to_query(str)
{
    str = str.replace(/(\\)+\\/g, '\\');     // collapse runs of more than one \ into just one ( \\ gets escaped again when processed, back to \ )
    str = str.replace(/(\\)+"/g, '\\\\\\"'); // collapse runs of \ before " (e.g. \\\") - I don't know why \\\\\\ works here; it may need altering based on the str manipulations done before insert
    str = str.replace(/(\\)+'/g, "\\'");     // I don't know why \\ works here; it may need altering based on the str manipulations done before insert
    str = str.replace(/(\\)+t/g, "t");       // same process as above, but for the problematic escape characters
    str = str.replace(/(\\)+r/g, "r");
    str = str.replace(/(\\)+n/g, "n");
    str = str.replace(/(\\)+f/g, "f");
    return str;
}
How I use this to get a query:
let o = {field_data:string_cleanJSON_preStringify(user_gen_field_data)}
let j = string_cleanJSON_to_query(JSON.stringify(o));
let q = `INSERT INTO blabla (json) VALUES('${j}')`;
I'm doing a SQL query in Node-Red to output a load of time/value data. This data is then passed to a web page for display in a graph.
Previously I used PHP to do the SQL query, which I'm trying to replace. However, SQL query results in PHP are delivered in a different format.
With Node-Red, I get:
[
{
"Watts": 1018,
"Time": 1453825454
},
{
"Watts": 1018,
"Time": 1453825448
},
{
"Watts": 1010,
"Time": 1453825442
}]
With PHP, I get:
[
[1453819620000,962],
[1453819614000,950],
[1453819608000,967],
[1453819602000,947]
]
I think I'm getting an array of arrays from PHP and an array of JSON objects from Node-RED. How do I convert the Node-RED output so it is delivered in the same format as the PHP output? (I.e., I want to handle the processing at the server rather than the client.)
A function node can be used to generate something in the same format.
var array = msg.payload;
var phpFormat = "[";
for (var i=0; i<array.length; i++) {
phpFormat += "[" +
// time format differ, NodeJS is in seconds
// php is in milliseconds
(array[i].Time * 1000 ) +
"," +
array[i].Watts + "],";
}
//take the last "," off
phpFormat = phpFormat.substring(0, phpFormat.length - 1);
phpFormat += "]";
msg.payload = phpFormat;
return msg;
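As a possible alternative (a minimal sketch, not part of the answer above), the same function node could build an array of [time, watts] pairs and let JSON.stringify handle the brackets and commas:
var pairs = msg.payload.map(function (row) {
    // convert seconds to milliseconds to match the PHP output
    return [row.Time * 1000, row.Watts];
});
msg.payload = JSON.stringify(pairs);
return msg;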
I've had a bit of help from a chap at work and here is what he's come up with, modified for node-red by me:
var outputArray = [];
for (var i in msg.payload) {
    var entryData = [msg.payload[i]['Time']];
    for (var attr in msg.payload[i]) {
        if (attr != 'Time') {
            entryData.push(msg.payload[i][attr]);
        }
    }
    outputArray.push(entryData);
}
var returnMsg = { "payload": outputArray };
return returnMsg;
I know, I know, this question is over 2 years old... however, for the next 500 people seeking an answer to a similar problem, I'd like to highlight the new JSONata expression feature built into the change node. Using this simple expression:
payload.[Time, Watts]
transforms your JS objects into the requested output of an array of arrays. In fact, much of my old repetitive looping through arrays has been replaced with some simpler (to me) expressions like this.
The magic of the lambda syntax evaluator is documented on the JSONata site. There you will also find the online exerciser where you can build an expression against your own data and immediately see the resulting structure.
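If you also need the timestamps converted to milliseconds to match the PHP output, JSONata supports arithmetic inside the array constructor, so an expression along the lines of payload.[Time * 1000, Watts] should do it (an untested sketch of the same idea, not taken from the original answer).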
Note: in order to use a jsonata expression in your change node, be sure to select the J: pulldown next to the input field (not the {} JSON option)... two totally different things!
I'm trying to work out how to append a zero to a specific JSON-decoded array value for multiple records stored in a MySQL table, according to some conditions.
For example, in the table 'menu', the column 'params' (text) has records containing JSON data of this format:
{"categories":["190"],"singleCatOrdering":"","menu-anchor_title":""}
and column 'id' has a numeric value of 90.
My goal is to add a zero to the 'categories' value in menu.params whenever (for example) menu.id is under 100.
For such records, the result would be:
{"categories":["1900"],"singleCatOrdering":"","menu-anchor_title":""}
So I'm looking for an SQL query that will find the occurrences of "categories":["999"] in the database and update each record by adding a zero to the end of the value.
This answer is partially helpful by offering to use mysql-udf-regexp, but it refers to replacing a value rather than updating it.
Perhaps the REGEXP_REPLACE function will do the trick? I have never used this library and am not familiar with it; perhaps there is an easier way to achieve what I need?
Thanks
If I understand your question correctly, you want code that does something like this:
var data = {
"menu": {
"id": 90,
"params": {
"categories": ["190"],
"singleCatOrdering": "",
"menu-anchor_title": ""
}
}
};
var keys = Object.keys(data);
for (var ii = 0, key; key = keys[ii]; ii++) {
  var value = data[key];
if (value.id < 100) {
value.params.categories[0] += "0";
alert(value.params.categories[0]);
}
}
jsFiddle
However, I am not using a regular expression at all. Perhaps if you reword the question, the necessity of a regex will become clearer.
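For completeness, if a regular expression really is wanted, a hypothetical client-side sketch (using assumed example data, and operating on the raw params JSON string rather than doing the SQL-side UPDATE the question asks about) could look like this:
var params = '{"categories":["190"],"singleCatOrdering":"","menu-anchor_title":""}';
// append a "0" to the numeric categories entry
var updated = params.replace(/("categories":\[")(\d+)("\])/, function (match, prefix, num, suffix) {
    return prefix + num + '0' + suffix;
});
// updated is now {"categories":["1900"],"singleCatOrdering":"","menu-anchor_title":""}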
I am using Jackson to parse JSON from an input stream which looks like the following:
[
[ 36,
100,
"The 3n + 1 problem",
56717,
0,
1000000000,
0,
6316,
0,
0,
88834,
0,
45930,
0,
46527,
5209,
200860,
3597,
149256,
3000,
1
],
[
........
],
[
........
],
.....// and almost 5000 arrays like above
]
This is the original feed link: http://uhunt.felix-halim.net/api/p
I want to parse it, keep only the first 4 elements of every array, and skip the other 18 elements.
36
100
The 3n + 1 problem
56717
Code structure I have tried so far:
while (jsonParser.nextToken() != JsonToken.END_ARRAY) {
jsonParser.nextToken(); // '['
while (jsonParser.nextToken() != JsonToken.END_ARRAY) {
// I tried many approaches here but not found appropriate one
}
}
As this feed is pretty big, I need to do this efficiently, with little overhead and memory use.
Also, there are three models for processing JSON: the Streaming API, Data Binding, and the Tree Model. Which one is appropriate for my purpose?
How can I parse this JSON efficiently with Jackson? How can I skip those 18 elements and jump to the next array for better performance?
Edit: (Solution)
Jackson and Gson both work in almost the same way (incremental mode, since content is read and written incrementally). I am switching to Gson as it has a function skipValue() (a pretty appropriate name). Although Jackson's nextToken() works much like skipValue(), Gson seems more flexible to me. Thanks @Kowser for the recommendation; I had come across Gson before but somehow ignored it. This is my working code:
reader.beginArray();
while (reader.hasNext()) {
reader.beginArray();
int a = reader.nextInt();
int b = reader.nextInt();
String c = reader.nextString();
int d = reader.nextInt();
System.out.println(a + " " + b + " " + c + " " + d);
while (reader.hasNext())
reader.skipValue();
reader.endArray();
}
reader.endArray();
reader.close();
This is for Jackson
Follow this tutorial.
Judicious use of jsonParser.nextToken() should help you.
while (jsonParser.nextToken() != JsonToken.END_ARRAY) { // might be JsonToken.START_ARRAY?
The pseudo-code is
find next array
read values
skip other values
skip next end token
This is for Gson.
Take a look at this tutorial, and consider following the second example from it.
Judicious use of reader.begin*, reader.end*, and reader.skipValue() should do the job for you.
And here is the documentation for JsonReader
I'd like to format my query results as a single JSON object containing an array for each record. I need help writing the script, though: the JSON.stringify function is building an array of objects (my JSON is inside out!).
I can always write a function to build the JSON manually but I have a feeling there's already a function to do what I'm looking for. I just can't find it.
The JSON string I want to get:
{["id":1,"info":"Ipsum 0"], ["id":2,"info":"Ipsum 1"],
["id":3,"info":"Ipsum 2"], ["id":4,"info":"Ipsum 3"] (and so on) }
Actual Results
[{"id":1,"info":"Ipsum 0"},{"id":2,"info":"Ipsum 1"},
{"id":3,"info":"Ipsum 2"},{"id":4,"info":"Ipsum 3"},
{"id":5,"info":"Ipsum 4"},{"id":6,"info":"Ipsum 5"},
{"id":7,"info":"Ipsum 6"},{"id":8,"info":"Ipsum 7"},
{"id":9,"info":"Ipsum 8"},{"id":10,"info":"Ipsum 9"}]
My code so far (based on this example)
var sqlite3 = require('sqlite3').verbose();
var db = new sqlite3.Database(':memory:');
db.serialize(function() {
db.run("CREATE TABLE lorem (info TEXT)");
var stmt = db.prepare("INSERT INTO lorem VALUES (?)");
for (var i = 0; i < 10; i++) {
stmt.run("Ipsum " + i);
}
stmt.finalize();
var sql = "SELECT rowid AS id, info FROM lorem";
// Print the records as JSON
db.all(sql, function(err, rows) {
console.log(JSON.stringify(rows));
});
});
db.close();
Based on what I know of JSON, I was expecting the whole recordset to be enclosed in curly brackets and each record to be enclosed in square brackets. However, I'm seeing the opposite.
Nope, you have it backward. Database results are modeled as an array of objects: one array represents the results of the entire query, and each object in that array represents a single result record. In JSON, arrays use square brackets and objects use curly braces (same as in actual JavaScript code).
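As a minimal illustration (with made-up data, not taken from the code above), this is exactly what JSON.stringify produces for a typical result set:
var rows = [
    { id: 1, info: "Ipsum 0" },
    { id: 2, info: "Ipsum 1" }
];
// one array (square brackets) containing one object (curly braces) per record
console.log(JSON.stringify(rows));
// -> [{"id":1,"info":"Ipsum 0"},{"id":2,"info":"Ipsum 1"}]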