I'm trying to convert JSON data into a number so I can perform calculations on it, e.g. multiply by a number or a percentage (or whichever method is best recommended).
I tried performing a calculation on the value (addition, for example), but it only appended the numbers to the end of the resulting string. I have seen suggestions about using JSON.parse with a reviver, but I can't get my head around extracting the one specific value I need rather than multiple items from the JSON feed.
var xmlhttp = new XMLHttpRequest();
var url = "https://api.coindesk.com/v1/bpi/currentprice.json";

xmlhttp.onreadystatechange = function() {
  if (this.readyState == 4 && this.status == 200) {
    var json = JSON.parse(this.responseText);
    parseJson(json);
  }
};
xmlhttp.open("GET", url, true);
xmlhttp.send();

function parseJson(json) {
  var gbpValue = "1 BTC equals to £" + json["bpi"]["GBP"]["rate"];
  document.getElementById("data").innerHTML = gbpValue;
}
As mentioned, I have tried performing calculations on the result, but it only appends the numbers to the end of the string. Thanks for any advice or help.
What I can see in your code is that you are concatenating your JSON value with a string, which will always produce a string, on the line below:
var gbpValue = "1 BTC equals to £" + json["bpi"]["GBP"]["rate"];
First, check whether the value from your JSON object is a number or a string.
You can use the Number() or parseInt() function to convert a string to a number.
Example:
var num = Number("23");
or
var num = parseInt("23");
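Applied to the value in the question, a minimal sketch could look like this. Note that the rate string from this API may contain thousands separators (e.g. "6,587.65"), which Number() and parseInt() can't parse directly, so the commas are stripped first; if your response also includes a numeric rate_float field, you can use that instead:
// Minimal sketch: convert the rate string, then do the maths on a number
function parseJson(json) {
  // Strip any thousands separators before converting, e.g. "6,587.65" -> 6587.65
  var rate = Number(json["bpi"]["GBP"]["rate"].replace(/,/g, ""));

  // Arithmetic now works as expected instead of concatenating strings
  var tenPercent = rate * 0.10;

  document.getElementById("data").innerHTML =
    "1 BTC equals to £" + rate + " (10% is £" + tenPercent.toFixed(2) + ")";
}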
Hope it helps. :)
I am now using WebSocket in HTML5 to build my web app.
My previous work was based on raw TCP, where I used a CRC16 algorithm to wrap the content that is transferred to the server side.
The code that constructs a message with a CRC16 value is below:
public static byte[] ConstructMessageWithCRC(string message)
{
    // Message terminated with a |
    var messageToConstruct = message + "|";
    // Calculate CRC value
    var crcCode = CRC16.CalculateCrc16(messageToConstruct);
    var crcCodeShort = ushort.Parse(crcCode);
    // CRC high byte
    var crcHigh = byte.Parse((crcCodeShort / 256).ToString());
    // CRC low byte
    var crcLow = byte.Parse((crcCodeShort % 256).ToString());

    var messageBytes = Encoding.Default.GetBytes(messageToConstruct);
    var messageLength = messageBytes.Length;
    var messageBytesWithCRC = new byte[messageLength + 2];
    Array.Copy(messageBytes, 0, messageBytesWithCRC, 0, messageLength);

    // Append the CRC value to the message
    messageBytesWithCRC[messageLength] = crcHigh;
    messageBytesWithCRC[messageLength + 1] = crcLow;
    return messageBytesWithCRC;
}
After the server side receives the data, it also runs the CRC16 algorithm over the content to verify that the data is correct.
The code that checks the CRC is below:
public static bool CheckMessageCRC(byte[] message, out string messageReceived)
{
    // Length of the received message
    var messageLength = message.Length;
    // Received message without the CRC value
    var messageReceivedStrBytes = new byte[messageLength - 2];
    Array.Copy(message, 0, messageReceivedStrBytes, 0, messageLength - 2);
    // Received CRC value
    var messageReceivedCrcBytes = new byte[2];
    Array.Copy(message, messageLength - 2, messageReceivedCrcBytes, 0, 2);

    // Decode the received message
    var messageCalculatedString = Encoding.Default.GetString(messageReceivedStrBytes);
    messageReceived = messageCalculatedString;

    // Reassemble the received CRC value
    var currentCRC = messageReceivedCrcBytes[0] * 256 + messageReceivedCrcBytes[1];
    // Recalculate the CRC value
    var result = ushort.Parse(CRC16.CalculateCrc16(messageCalculatedString));
    // Comparison
    return currentCRC == result;
}
From the code above, you can see what I did.
But some articles about WebSockets (e.g. Working with Websockets) use the code below to handle the message directly:
// When the server is sending data to this socket, this method is called
ws.onmessage = function (evt) {
  // Received data is a string; we parse it to a JSON object using jQuery
  // http://api.jquery.com/jQuery.parseJSON/
  var jsonObject = $.parseJSON(evt.data);
  // Do something with the JSON object
};

// Creates an object that will be sent to the server
var myObject = {
  Property: "Value",
  AnotherProperty: "AnotherValue"
};

// We need to stringify it through JSON before sending it to the server
ws.send(JSON.stringify(myObject));
This sends the data out directly, without any integrity check. So if the data is intercepted and modified by someone, we won't know it has been changed. Also, on a poor network the data may arrive out of order, in which case the code below won't work correctly; maybe we wanted the data as ABC but we got BCA:
var jsonObject = $.parseJSON(evt.data);
So for evt.data, I wonder how we should check that the transfer is complete, the order is correct, the content is correct, and so on.
I have googled a lot and didn't find any info about this; I just want to use this question to establish the right way to transfer data via WebSocket. Thanks.
EDIT:
My friend sent me this: https://www.rfc-editor.org/rfc/rfc6455, section 5.2.
I think the protocol already seems to do this work. I'd welcome more discussion on it. Thanks.
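For discussion, here is a minimal application-level sketch of what such a check could look like on the browser side. The names (sendChecked, the envelope fields) and the CRC variant are made up for illustration and won't match CRC16.CalculateCrc16 on the server; also note that, per RFC 6455, WebSocket messages already arrive whole and in order over TCP, so this only adds an end-to-end content check and a sequence number on top:
// One common CRC-16 variant (CCITT-FALSE), computed over the low byte of each character
function crc16(str) {
  var crc = 0xFFFF;
  for (var i = 0; i < str.length; i++) {
    crc ^= (str.charCodeAt(i) & 0xFF) << 8;
    for (var j = 0; j < 8; j++) {
      crc = (crc & 0x8000) ? ((crc << 1) ^ 0x1021) & 0xFFFF : (crc << 1) & 0xFFFF;
    }
  }
  return crc;
}

var sendSeq = 0;
function sendChecked(ws, payload) {
  var body = JSON.stringify(payload);
  // Wrap the payload in an envelope carrying a sequence number and a checksum
  ws.send(JSON.stringify({ seq: sendSeq++, crc: crc16(body), body: body }));
}

var lastSeq = -1;
ws.onmessage = function (evt) {
  var envelope = JSON.parse(evt.data);
  if (crc16(envelope.body) !== envelope.crc) {
    // Content check failed: e.g. ask the peer to resend
    return;
  }
  if (envelope.seq !== lastSeq + 1) {
    // Gap or reordering detected at the application level
  }
  lastSeq = envelope.seq;
  var payload = JSON.parse(envelope.body);
  // Do something with payload
};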
So, I've parsed a response like this:
var g_rgListingInfo = JSON.parse( response );
The response looks like this:
{"321242653847396921":{"listingid":"321242653847396921","price":28338,"fee":4249,"publisher_fee_app":730,"publisher_fee_percent":"0.10000000149011612","currencyid":"2003","steam_fee":1416,"publisher_fee":2833,"asset":{"currency":0,"appid":730,"contextid":"2","id":"3038615825","amount":"1","market_actions":[{"link":"steam://rungame/730/76561202255233023/+csgo_econ_action_preview%20M%listingid%A%assetid%D1030942533801731526","name":"Inspect in Game..."}]}},"321242653843485871":{"listingid":"321242653843485871","price":30175,"fee":4525,"publisher_fee_app":730,"publisher_fee_percent":"0.10000000149011612","currencyid":"2003","steam_fee":1508,"publisher_fee":3017,"asset":{"currency":0,"appid":730,"contextid":"2","id":"1730491611","amount":"1","market_actions":[{"link":"steam://rungame/730/76561202255233023/+csgo_econ_action_preview%20M%listingid%A%assetid%D1030942533801731526","name":"Inspect in Game..."}]}},"782860982384213986":{"listingid":"782860982384213986","price":31305,"fee":4695,"publisher_fee_app":730,"publisher_fee_percent":"0.10000000149011612","currencyid":"2003","steam_fee":1565,"publisher_fee":3130,"asset":{"currency":0,"appid":730,"contextid":"2","id":"2815962367","amount":"1","market_actions":[{"link":"steam://rungame/730/76561202255233023/+csgo_econ_action_preview%20M%listingid%A%assetid%D1030942533801731526","name":"Inspect in Game..."}]}},"783987515556891867":{"listingid":"783987515556891867","price":31305,"fee":4695,"publisher_fee_app":730,"publisher_fee_percent":"0.10000000149011612","currencyid":"2003","steam_fee":1565,"publisher_fee":3130,"asset":{"currency":0,"appid":730,"contextid":"2","id":"3708699202","amount":"1","market_actions":[{"link":"steam://rungame/730/76561202255233023/+csgo_econ_action_preview%20M%listingid%A%assetid%D1030942533801731526","name":"Inspect in Game..."}]}},"783987515558623437":{"listingid":"783987515558623437","price":30957,"fee":4642,"publisher_fee_app":730,"publisher_fee_percent":"0.10000000149011612","currencyid":"2003","steam_fee":1547,"publisher_fee":3095,"asset":{"currency":0,"appid":730,"contextid":"2","id":"4462433815","amount":"1","market_actions":[{"link":"steam://rungame/730/76561202255233023/+csgo_econ_action_preview%20M%listingid%A%assetid%D1030942533801731526","name":"Inspect in Game..."}]}},"718685320959305952":{"listingid":"718685320959305952","price":34000,"fee":5100,"publisher_fee_app":730,"publisher_fee_percent":"0.10000000149011612","currencyid":"2001","steam_fee":1700,"publisher_fee":3400,"asset":{"currency":0,"appid":730,"contextid":"2","id":"4450043953","amount":"1","market_actions":[{"link":"steam://rungame/730/76561202255233023/+csgo_econ_action_preview%20M%listingid%A%assetid%D1030942533801731526","name":"Inspect in Game..."}]}},"796369492002647568":{"listingid":"796369492002647568","price":34500,"fee":5175,"publisher_fee_app":730,"publisher_fee_percent":"0.10000000149011612","currencyid":"2001","steam_fee":1725,"publisher_fee":3450,"asset":{"currency":0,"appid":730,"contextid":"2","id":"4024113558","amount":"1","market_actions":[{"link":"steam://rungame/730/76561202255233023/+csgo_econ_action_preview%20M%listingid%A%assetid%D3082226233578562378","name":"Inspect in 
Game..."}]}},"718684619833530742":{"listingid":"718684619833530742","price":22958,"fee":3442,"publisher_fee_app":730,"publisher_fee_percent":"0.10000000149011612","currencyid":"2002","steam_fee":1147,"publisher_fee":2295,"asset":{"currency":0,"appid":730,"contextid":"2","id":"4331886445","amount":"1","market_actions":[{"link":"steam://rungame/730/76561202255233023/+csgo_econ_action_preview%20M%listingid%A%assetid%D1030942533801731526","name":"Inspect in Game..."}]}},"788487401257494747":{"listingid":"788487401257494747","price":34783,"fee":5217,"publisher_fee_app":730,"publisher_fee_percent":"0.10000000149011612","currencyid":"2001","steam_fee":1739,"publisher_fee":3478,"asset":{"currency":0,"appid":730,"contextid":"2","id":"2315637005","amount":"1","market_actions":[{"link":"steam://rungame/730/76561202255233023/+csgo_econ_action_preview%20M%listingid%A%assetid%D1030942533801731526","name":"Inspect in Game..."}]}},"321242020664839911":{"listingid":"321242020664839911","price":34783,"fee":5217,"publisher_fee_app":730,"publisher_fee_percent":"0.10000000149011612","currencyid":"2001","steam_fee":1739,"publisher_fee":3478,"asset":{"currency":0,"appid":730,"contextid":"2","id":"4283078084","amount":"1","market_actions":[{"link":"steam://rungame/730/76561202255233023/+csgo_econ_action_preview%20M%listingid%A%assetid%D6944696178921031564","name":"Inspect in Game..."}]}}}
I put it into http://json.parser.online.fr/ to inspect the result.
The problem I'm having is that I can't loop through the items: g_rgListingInfo.length is NaN. I tried to use forEach, but that failed too.
I want to loop through all of these keys ("321242653847396921", "321242653843485871", ...), which are always changing, and obtain their listingid, price, fee, etc.
I am pretty new to node.js so I'm sorry if this is a stupid question.
You have an object, not an array. So, to iterate over the results, you have to either convert your object into an array or iterate over it as an object.
Converting into an array
Depending on what you need, this could be more convenient:
// Note: the keys are 18-digit IDs, too large to convert to Number without
// losing precision, so keep them as strings
var myData = Object.keys(g_rgListingInfo).sort().map(function (key) {
  return g_rgListingInfo[key];
});

// Then you can just use the `myData` array
myData.forEach(function (current, index) {
  /* do something */
});

// ...or use a for loop
for (var i = 0; i < myData.length; ++i) {
  var current = myData[i];
  /* do something */
}
Iterating the object
You have to get the keys of the object and iterate over them (optionally, you may want to sort them first).
// ["3124...", ...]
var numbers = Object.keys(g_rgListingInfo);
// Optional sort
numbers = numbers.map(Number).sort();
// Iterate the object keys
numbers.forEach(function (currentKey) {
var currentObject = g_rgListingInfo[currentKey];
});
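For example, based on the structure of the response posted in the question, pulling out the fields you mentioned could look like this:
Object.keys(g_rgListingInfo).forEach(function (key) {
  var listing = g_rgListingInfo[key];
  // listingid, price and fee come straight from each listing object
  console.log(listing.listingid, listing.price, listing.fee);
});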
I'm trying to store IndexReferences per user. I've found that when I store one (or more) directly in a map, it works fine. However, when they are stored in a plain object (or a custom Realtime object), the Realtime API generates circular JSON errors.
This works fine:
function doRegisterTypes() {
  gapi.drive.realtime.custom.registerType(MyCustomType, "MyCustomType");
  MyCustomType.prototype.startPoints = gapi.drive.realtime.custom.collaborativeField('startPoints');
  MyCustomType.prototype.endPoints = gapi.drive.realtime.custom.collaborativeField('endPoints');
  MyCustomType.prototype.elements = gapi.drive.realtime.custom.collaborativeField('elements');
  gapi.drive.realtime.custom.setInitializer(MyCustomType, initializeMyCustomType);
}

function initializeMyCustomType() {
  var model = gapi.drive.realtime.custom.getModel(this);
  this.startPoints = model.createMap();
  this.endPoints = model.createMap();
  this.elements = model.createList();
}

function initializeModel(model) {
  var o = model.create("MyCustomType");
  o.elements.pushAll(["foo", "bar"]);
  var startIndex = o.elements.registerReference(0, false);
  var endIndex = o.elements.registerReference(0, false);
  o.startPoints.set(UserId, startIndex);
  o.endPoints.set(UserId, endIndex);
  model.getRoot().set("MyCustomObject", o);
}
But this doesn't, failing with circular JSON errors when storing the range object in the map:
function doRegisterTypes() {
  gapi.drive.realtime.custom.registerType(MyCustomType, "MyCustomType");
  MyCustomType.prototype.ranges = gapi.drive.realtime.custom.collaborativeField('ranges');
  MyCustomType.prototype.elements = gapi.drive.realtime.custom.collaborativeField('elements');
  gapi.drive.realtime.custom.setInitializer(MyCustomType, initializeMyCustomType);
}

function initializeMyCustomType() {
  var model = gapi.drive.realtime.custom.getModel(this);
  this.ranges = model.createMap();
  this.elements = model.createList();
}

function initializeModel(model) {
  var o = model.create("MyCustomType");
  o.elements.pushAll(["foo", "bar"]);
  var startIndex = o.elements.registerReference(0, false);
  var endIndex = o.elements.registerReference(0, false);
  // FAILS:
  o.ranges.set(UserId, {start: startIndex, end: endIndex});
  model.getRoot().set("MyCustomObject", o);
}
I should stress that the error appears even for a single IndexReference, whether the wrapping object is a specific custom type or not, and whenever the value is set into the map: while initializing the model or later. It's as if IndexReferences cannot be stored anywhere but at a "top level", though that makes little sense.
Feature? Bug? User stoopidity?
You can't store CollaborativeObjects within arbitrary JSON within a CollaborativeObject. CollaborativeObjects (including IndexReferences) must be stored directly in other CollaborativeObjects.
(There are a few reasons for this, mostly having to do with how the collaboration works: JSON objects are treated as arbitrary blobs whose contents are ignored.)
In this case, you could create a Range custom object type that has a start and an end CollaborativeField. (Or a CollaborativeList with two elements.)
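A minimal sketch of that suggestion, reusing the registration pattern from the question (the Range name and its start/end fields are just illustrative; they are not something the Realtime API defines for you):
function Range() {}

function doRegisterTypes() {
  gapi.drive.realtime.custom.registerType(MyCustomType, "MyCustomType");
  MyCustomType.prototype.ranges = gapi.drive.realtime.custom.collaborativeField('ranges');
  MyCustomType.prototype.elements = gapi.drive.realtime.custom.collaborativeField('elements');
  gapi.drive.realtime.custom.setInitializer(MyCustomType, initializeMyCustomType);

  // Register a Range custom type whose start/end are collaborative fields
  gapi.drive.realtime.custom.registerType(Range, "Range");
  Range.prototype.start = gapi.drive.realtime.custom.collaborativeField('start');
  Range.prototype.end = gapi.drive.realtime.custom.collaborativeField('end');
}

function initializeModel(model) {
  var o = model.create("MyCustomType");
  o.elements.pushAll(["foo", "bar"]);

  // Hold the IndexReferences in a CollaborativeObject (Range), not in raw JSON
  var range = model.create("Range");
  range.start = o.elements.registerReference(0, false);
  range.end = o.elements.registerReference(0, false);

  o.ranges.set(UserId, range);
  model.getRoot().set("MyCustomObject", o);
}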
I'm experimenting with JSON streaming through HTTP with Oboe.js, MongoDB and Express.js.
The point is to run a query in MongoDB (with Node.js's native mongodb driver), pipe the result (a JavaScript array) through Express.js, and parse it in the browser with Oboe.js.
The benchmarks I did compare streaming vs. blocking for both the MongoDB query on the server side and the JSON parsing on the client side.
Here is the source code for the two benchmarks. The first number is the number of milliseconds for 1000 queries of 100 items (pagination) on a collection of 10 million documents, and the second number, in parentheses, is the number of milliseconds before the very first item of the MongoDB result array is parsed.
The streaming benchmark server-side:
// Oboe.js - 20238 (16.887)
// Native - 16703 (16.69)
collection
  .find()
  .skip(+req.query.offset)
  .limit(+req.query.limit)
  .stream()
  .pipe(JSONStream.stringify())
  .pipe(res);
The blocking benchmark server-side:
// Oboe.js - 17418 (14.267)
// Native - 13706 (13.698)
collection
  .find()
  .skip(+req.query.offset)
  .limit(+req.query.limit)
  .toArray(function (e, docs) {
    res.json(docs);
  });
These results really surprise me because I would have thought that:
Streaming would be quicker than blocking every single time.
Oboe.js would be quicker to parse the entire JSON array compared to the native JSON.parse method.
Oboe.js would be quicker to parse the first element in the array compared to the native JSON.parse method.
Does anyone have an explanation?
What am I doing wrong?
Here is the source code for the two client-side benchmarks, too.
The streaming benchmark client-side:
var limit = 100;
var max = 1000;
var oboeFirstTimes = [];
var oboeStart = Date.now();

function paginate (i, offset, limit) {
  if (i === max) {
    console.log('> OBOE.js time:', (Date.now() - oboeStart));
    console.log('> OBOE.js avg. first time:', (
      oboeFirstTimes.reduce(function (total, time) {
        return total + time;
      }, 0) / max
    ));
    return true;
  }

  var parseStart = Date.now();
  var first = true;

  oboe('/api/spdy-stream?offset=' + offset + '&limit=' + limit)
    .node('![*]', function () {
      if (first) {
        first = false;
        oboeFirstTimes.push(Date.now() - parseStart);
      }
    })
    .done(function () {
      paginate(i + 1, offset + limit, limit);
    });
}

paginate(0, 0, limit);
The blocking benchmark client-side:
var limit = 100;
var max = 1000;
var nativeFirstTimes = [];
var nativeStart = Date.now();

function paginate (i, offset, limit) {
  if (i === max) {
    console.log('> NATIVE time:', (Date.now() - nativeStart));
    console.log('> NATIVE avg. first time:', (
      nativeFirstTimes.reduce(function (total, time) {
        return total + time;
      }, 0) / max
    ));
    return true;
  }

  var parseStart = Date.now();
  var first = true;

  var req = new XMLHttpRequest();
  req.open('GET', '/api/spdy-stream?offset=' + offset + '&limit=' + limit, true);
  req.onload = function () {
    var json = JSON.parse(req.responseText);
    json.forEach(function () {
      if (first) {
        first = false;
        nativeFirstTimes.push(Date.now() - parseStart);
      }
    });
    paginate(i + 1, offset + limit, limit);
  };
  req.send();
}

paginate(0, 0, limit);
Thanks in advance!
I found these comments in the Oboe docs, at the end of the "Why Oboe?" section:
Because it is a pure Javascript parser, Oboe.js requires more CPU time than JSON.parse. Oboe.js works marginally more slowly for small messages that load very quickly but for most real-world cases using i/o effectively beats optimising CPU time.
SAX parsers require less memory than Oboe’s pattern-based parsing model because they do not build up a parse tree. See Oboe.js vs SAX vs DOM.
If in doubt, benchmark, but don’t forget to use the real internet, including mobile, and think about perceptual performance.
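To make the "perceptual performance" point concrete, here is a minimal sketch (reusing the endpoint from your benchmark and assuming a hypothetical <ul id="results"> element) that renders each item as soon as Oboe emits it, instead of waiting for the whole array to download and parse:
var list = document.getElementById('results');

oboe('/api/spdy-stream?offset=0&limit=100')
  .node('![*]', function (item) {
    // Render each element as soon as it arrives: the user sees the first
    // results before the full response has finished downloading
    var li = document.createElement('li');
    li.textContent = JSON.stringify(item);
    list.appendChild(li);
  })
  .done(function (items) {
    console.log('full response received:', items.length, 'items');
  });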