JSON parse unexpected token with Node

I am getting a strange unexpected token error when trying to parse a JSON file with Node. I have tested the code with two files that look identical; a code compare tool confirms they appear identical. However, when I try to parse them, one throws the error and the other does not. The file that fails is generated by a PowerShell script; the one that works was created manually. I am baffled. One difference I noticed when writing the JSON out to the console: the file that fails has a ? at the beginning.
The json from the file that does not work:
data = ?{ "stack_name": "perf-a", "parameters": { "StackSet": "b", "MonitoringEnableAutoscalingAlarm": "True", "MachineConfigEnvironment": "Perf", "AppEnvironmentType": "perf", "StackInRotation": "True", "MonitoringEnableNotificationOnlyAlarms": "False", "AMIImage": "ami-123456789" }, "tags": { "CostCenter": "12345", "Owner": "test@test.com" }, "cft_file": "cft/cft.json"}
The json from the file that does work:
data = { "stack_name": "perf-a", "parameters": { "StackSet": "a", "MonitoringEnableAutoscalingAlarm": "True", "MachineConfigEnvironment": "Perf", "AppEnvironmentType": "perf", "StackInRotation": "True", "MonitoringEnableNotificationOnlyAlarms": "False", "AMIImage": "ami-123456789" }, "tags": { "CostCenter": "45229", "Owner": "test@test.com" }, "cft_file": "cft/cft.json"}
The code I am using for testing is:
var envFile = "perf2.json";
var fs = require('fs');

console.log('envFile = ' + envFile);

fs.readFile(envFile, 'utf8', function (err, data) {
    if (err) {
        console.log('error reading variables file');
        throw err;
    }
    try {
        var JsonData = JSON.stringify(data);
        console.log('JsonData = ' + JsonData);
        data = data.replace(/\\n/g, "\\n")
            .replace(/\\'/g, "\\'")
            .replace(/\\"/g, '\\"')
            .replace(/\\&/g, "\\&")
            .replace(/\\r/g, "\\r")
            .replace(/\\t/g, "\\t")
            .replace(/\\b/g, "\\b")
            .replace(/\\f/g, "\\f")
            .replace(/\\0/g, "")
            .replace(/\\v/g, "")
            .replace(/\\e/g, "\\e");
        data = data.replace(/[\u0000-\u001F]+/g, "");
        console.log('data = ' + data);
        var cftVariables = JSON.parse(data);
        console.log('cftVariables = ' + cftVariables);
        console.log('cftVariables stack name = ' + cftVariables.stack_name);
    } catch (e) {
        console.log('error parsing variables file');
        throw e;
    }
});
As you can see, I have also tried JSON.stringify, but then I lose the properties and cftVariables.stack_name becomes undefined.
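(As an aside, JSON.stringify goes the opposite direction from parsing: applied to a string that already contains JSON, it only escapes that string, which is why the properties disappear. A minimal illustration, using a made-up one-property document:)

// JSON.stringify does not parse; on a string input it just escapes it.
var data = '{ "stack_name": "perf-a" }';   // raw file contents: a string
console.log(JSON.stringify(data));         // "{ \"stack_name\": \"perf-a\" }" -- still a string
console.log(JSON.parse(data).stack_name);  // perf-a -- an actual object property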
This problem has been plaguing me for several days and I am now at a loss as to how to fix it.
For reference, here is the snippet of PowerShell that creates the file. The problem might be there.
$savePath = "envs/" + $filetouse + ".json"
$parameters = @{
    AppEnvironmentType = $AppEnvironmentType;
    StackSet = $StackSet;
    StackInRotation = $StackInRotation;
    AMIImage = $amiid;
    MonitoringEnableAutoscalingAlarm = $MonitoringEnableAutoscalingAlarm;
    MonitoringEnableNotificationOnlyAlarms = $MonitoringEnableNotificationOnlyAlarms;
    MachineConfigEnvironment = $MachineConfigEnvironment;
}
$tags = @{
    Owner = "test@test.com";
    CostCenter = "45229";
}
$envcft = @{
    stack_name = $stack_name;
    cft_file = "cft/cft.json";
    parameters = $parameters;
    tags = $tags;
} | ConvertTo-Json
Write-Host("Saving the env file with the new amiId... ")
$envcft | Out-File $savePath -Encoding UTF8 -force

Assuming you are reading the data in from a file, once you have the string you can use replace to get rid of that "?".
Something like this:
s = s.replace('?','');
EDIT
Since the replace did not work, try either 1) not specifying an encoding when you save the file, or 2) specifying UTF16.
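That stray character is almost certainly a byte order mark (U+FEFF): Windows PowerShell's Out-File -Encoding UTF8 writes a UTF-8 BOM, and JSON.parse rejects it as an unexpected token. A minimal sketch of stripping it on the Node side (file name as in the question):

var fs = require('fs');

fs.readFile('perf2.json', 'utf8', function (err, data) {
    if (err) throw err;
    // A UTF-8 BOM, once decoded, shows up as the single character U+FEFF.
    if (data.charCodeAt(0) === 0xFEFF) {
        data = data.slice(1);
    }
    var cftVariables = JSON.parse(data);
    console.log('stack name = ' + cftVariables.stack_name);
});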


JSON.Lua json.encode returns nil

I'm new to Lua and am learning the language with Garry's Mod.
I want to take messages from the Garry's Mod chat and send them to a Discord channel with a webhook.
It works, but I tried to extend the project with embedded messages. I need JSON for this and used json.lua as a library.
But as soon as I send a message, I get the following error:
attempt to index global 'json' (a nil value)
The code that causes the error is the following:
json.encode({ {
    ["embeds"] = {
        ["description"] = text,
        ["author"] = {
            ["name"] = ply:Nick()
        },
    },
} }),
The complete code:
AddCSLuaFile()

json = require("json")
webhookURL = "https://discordapp.com/api/webhooks/XXX"

local DiscordWebhook = DiscordWebhook or {}

hook.Add( "PlayerSay", "SendMsg", function( ply, text )
    t_post = {
        content = json.encode({ {
            ["embeds"] = {
                ["description"] = text,
                ["author"] = {
                    ["name"] = ply:Nick()
                },
            },
        } }),
        username = "Log",
    }
    http.Post(webhookURL, t_post)
end )
I hope somebody can help me
Garry's Mod provides two functions for working with JSON.
They are:
util.TableToJSON( table table, boolean prettyPrint=false )
and
util.JSONToTable( string json )
There is no need to import json, and if I recall correctly it isn't even possible.
For what you want to do you need to build your arguments as a table like this:
local arguments = {
    ["key"] = "Some value",
    ["42"] = "Not always the answer",
    ["deeper"] = {
        ["my_age"] = 22,
        ["my_name"] = getMyName()
    },
    ["even more"] = from_some_variable
}
and then call
local args_as_json = util.TableToJSON(arguments)
Now you can pass args_as_json to your
http.Post( string url, table parameters, function onSuccess=nil, function onFailure=nil, table headers={} )

DocumentDB: bulkImport Stored Proc - Getting 400 Error on Array/JSON issue

I'm simply trying to execute the standard example bulkImport sproc for the DocumentDB API, and I can't seem to pass it an array of objects. I always get 400 errors despite the documentation giving clear direction to send an array of objects, which is very frustrating.
Additional details: even if I wrap the array in an object with the array under a property of 'items' and handle that in my sproc, it still errors out with the same bad request, saying the body needs to be an object or JSON-serialized. When I try JSON.stringify(docs) before sending, it fails to parse on the other side.
Bad Request: The document body must be an object or a string representing a JSON-serialized object.
bulkInsert.js:
https://github.com/Azure/azure-documentdb-js-server/blob/master/samples/stored-procedures/BulkImport.js
My Code (using documentdb-util for async):
execProc(docs, insertProc);

async function execProc(docs, insertProc) {
    let database = await dbUtil.database('test');
    let collection = await dbUtil.collection(database, 'test');
    let procInstance = await dbUtil.storedProcedure(collection, insertProc);
    try {
        let result = await dbUtil.executeStoredProcedure(procInstance, docs);
        console.log(result);
    } catch (e) {
        console.log(e.body);
    }
}
Header
Object {Cache-Control: "no-cache", x-ms-version: "2017-11-15",
User-Agent: "win32/10.0.16299 Nodejs/v8.9.0 documentdb-nodejs-s…",
x-ms-date: "Mon, 11 Dec 2017 07:32:29 GMT",
Accept:"application/json"
authorization: myauth
Cache-Control:"no-cache"
Content-Type:"application/json"
User-Agent:"win32/10.0.16299 Nodejs/v8.9.0 documentdb-nodejs-sdk/1.14.1"
x-ms-date:"Mon, 11 Dec 2017 07:32:29 GMT"
x-ms-version:"2017-11-15"
Path
"/dbs/myDB/colls/myColl/sprocs/myBulkInsert"
Params
Array(3) [Object, Object, Object]
length:3
0:Object {id: "0001", type: "donut", name: "Cake", …}
1:Object {id: "0002", type: "donut", name: "Raised", …}
2:Object {id: "0003", type: "donut", name: "Old Fashioned", …}
[{
"id": "0001",
"type": "donut",
"name": "Cake",
"ppu": 0.55
},
{
"id": "0002",
"type": "donut",
"name": "Raised",
"ppu": 0.35
},
{
"id": "0003",
"type": "donut",
"name": "Old Fashioned",
"ppu": 0.25
}]
The "docs" must be wrapped in another array; otherwise the procedure executor treats the array elements as multiple parameters of the procedure, not as a single array parameter.
The following code works when calling the stored procedure with an array-typed argument.
JS:
var docs = [{'id':1},{'id':2}];
executeStoredProcedure(proc, [docs])
C#
var docs = new[] { new MyDoc { id = 1, source = "abc" }, new MyDoc { id = 2, source = "abc" } };
dynamic[] args = new dynamic[] { docs };
ExecuteStoredProcedureAsync<int>(
    procLink,
    new RequestOptions { PartitionKey = new PartitionKey("abc") },
    args);
NOTE: you must ensure the 'docs' all have the same partition key, and pass the partition key in RequestOptions.
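For concreteness, here is a minimal sketch of the wrapped call with the documentdb Node SDK; the endpoint, key, and sproc link below are placeholders:

var DocumentClient = require('documentdb').DocumentClient;

// Placeholder account values.
var client = new DocumentClient('https://myaccount.documents.azure.com:443/', {
    masterKey: 'myKey'
});

var docs = [{ id: '1', source: 'abc' }, { id: '2', source: 'abc' }];
var sprocLink = 'dbs/myDB/colls/myColl/sprocs/myBulkInsert';

// The extra array around docs matters: [docs] arrives inside the sproc as a
// single array parameter, while a bare docs would be spread across parameters.
client.executeStoredProcedure(sprocLink, [docs], { partitionKey: 'abc' },
    function (err, result) {
        if (err) throw err;
        console.log(result);
    });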
I had the same problem. I was able to get it to work by stringifying the array and parsing it in the stored procedure. I opened an issue on the GitHub where that code originated as well. Below is what worked for me. Good luck.
---- Stringify Array
var testArr = []
for (var i = 0; i < 50; i++) {
    testArr.push({
        "id": "test" + i
    })
}
var testArrStr = JSON.stringify(testArr)
// pass testArrStr to stored procedure and parse in stored procedure
---- Slightly altered original BulkImport
exports.storedProcedure = {
    id: "bulkImportArray",
    serverScript: function bulkImportArray(docs) {
        var context = getContext();
        var collection = context.getCollection();
        var docsToCreate = JSON.parse(docs)
        var count = 0;
        var docsLength = docsToCreate.length;
        if (docsLength == 0) {
            getContext().getResponse().setBody(0);
        }
        var totals = ""
        function insertDoc() {
            var msg = " count=" + count + " docsLength=" + docsLength +
                " typeof docsToCreate[]=" + typeof docsToCreate +
                " length=" + docsToCreate.length
            if (typeof docsToCreate[count] != 'undefined') {
                collection.createDocument(collection.getSelfLink(),
                    docsToCreate[count],
                    function (err, documentCreated) {
                        if (err) {
                            // throw new Error('Error' + err.message);
                            getContext().getResponse().setBody(count + " : " + err);
                        } else {
                            if (count < docsLength - 1) {
                                count++;
                                insertDoc();
                                getContext().getResponse().setBody(msg);
                            } else {
                                getContext().getResponse().setBody(msg);
                            }
                        }
                    });
            } else {
                getContext().getResponse().setBody(msg);
            }
        }
        insertDoc()
    }
}
To test it in the portal Script Explorer, I had to create an escaped string, i.e.
var testArr = []
for (var i = 200; i < 250; i++) {
    testArr.push({ "id": "test" + i })
}
var testArrStr = JSON.stringify(testArr)
console.log('"' + testArrStr.replace(/\"/g, '\\"') + '"')

Using MongoJs and Node.js to store data

I have successfully connected to MongoDB and used a form to send some data. I would now like to store the data as a nested JSON document within a collection. So my current output is:
{
    "_id": "53be957007d2c838083046a6",
    "subscriberinfo": "X",
    "grouporpolicynumber": "X",
    "title": "X",
    "clientnumber": "X",
    "xnumber": "X",
    "postedOn": "2014-07-10T13:30:24.499Z"
}
I would like it to look like:
{
    "_id": "53be957007d2c838083046a6",
    "ReferenceInfo": {
        "subscriberinfo": "00003",
        "grouporpolicynumber": "Direct",
        "title": "SNP",
        "clientnumber": "VG00003M",
        "HICnumber": "264134187C"
    },
    "postedOn": "2014-07-10T13:30:24.499Z"
}
Current code looks like this:
function postNewJob(req, res, next) {
    var job = {};
    job.subscriberinfo = req.params.subscriberinfo;
    job.grouporpolicynumber = req.params.grouporpolicynumber;
    job.title = req.params.clientreportingcategory;
    job.clientnumber = req.params.clientnumber;
    job.HICnumber = req.params.hicnumber;
    job.postedOn = new Date();
    res.setHeader('Access-Control-Allow-Origin', '*');
    memberInfo.save(job, function (err, success) {
        console.log('Response success ' + success);
        console.log('Response error ' + err);
        if (success) {
            res.send(201, job);
            return next();
        } else {
            return next(err);
        }
    });
}
Have you tried something like this?
function postNewJob(req, res, next) {
    var job = {
        referenceInfo: {
            subscriberinfo: req.params.subscriberinfo,
            grouporpolicynumber: req.params.grouporpolicynumber,
            title: req.params.clientreportingcategory,
            clientnumber: req.params.clientnumber,
            HICnumber: req.params.hicnumber
        },
        postedOn: new Date()
    };
    res.setHeader('Access-Control-Allow-Origin', '*');
    memberInfo.save(job, function (err, job) {
        if (err) {
            console.error('Response error ' + err);
            next(err);
        } else {
            console.log('Response success ' + job);
            res.send(201, job);
            next(req, res);
        }
    });
}
On another note, I changed around the callback function that's passed in to save. Mongo will call that function with an error and the saved data (the data in this case being job), which lets you conditionally log the error or the success depending on the status of the save. Also, because those functions are asynchronous, returning the call to next() will not actually affect anything. Finally, when calling next(), it is advisable to pass along the req and res from before, so that further actions can be taken with that data.
Hope that helps!

Read CSV into an object of objects for d3 [datamaps]

I'm using datamaps and would like to be able to read the data from a csv file.
The data format that datamaps is expecting is the following:
var loadeddata = {
    "JPN": { Rate: 17.5, fillKey: "firstCat" },
    "DNK": { Rate: 16.6, fillKey: "secondCat" }
};
I would like to read a csv file of the following structure and transform it into the format that datamaps is expecting:
ISO, Rate, fillKey
JPN, 17.5, firstCat
DNK, 16.6, secondCat
My 'best attempt' was using the following code:
var csvloadeddata;

d3.csv("simpledata.csv", function (error, csv) {
    if (error) return console.log("there was an error loading the csv: " + error);
    console.log("there are " + csv.length + " elements in my csv set");
    var nestFunction = d3.nest().key(function (d) { return d.ISO; });
    csvloadeddata = nestFunction.entries(
        csv.map(function (d) {
            d.Rate = +d.Rate;
            d.fillKey = d.fillKey;
            return d;
        })
    );
    console.log("there are " + csvloadeddata.length + " elements in my data");
});
But this code generates a variable 'csvloadeddata' that looks like this:
var csvloadeddata = [
    { "key": "JPN", "values": { 0: { Rate: 17.5, fillKey: "firstCat" } } },
    { "key": "DNK", "values": { 1: { Rate: 16.6, fillKey: "secondCat" } } }
];
What am I doing wrong?
Found the answer myself. If somebody is interested – this is what I ended up using:
<script>
d3.csv("simpledata.csv", function (error, csvdata1) {
    globalcsvdata1 = csvdata1;
    for (var i = 0; i < csvdata1.length; i++) {
        globalcsvdata1[globalcsvdata1[i].ISO] = globalcsvdata1[i];
        //console.log(globalcsvdata1[i]);
        delete globalcsvdata1[i].ISO;
        delete globalcsvdata1[i];
    }
    myMap.updateChoropleth(globalcsvdata1);
});

var myMap = new Datamap({
    element: document.getElementById('map'),
    scope: 'world',
    geographyConfig: {
        popupOnHover: true,
        highlightOnHover: false
    },
    fills: {
        'AA': '#1f77b4',
        'BB': '#9467bd',
        defaultFill: 'grey'
    }
});
</script>
</body>
The csv has the following structure:
ISO,fillKey
RUS,AA
USA,BB
Here is a working example: http://www.explainingprogress.com/wp-content/uploads/datamaps/uploaded_gdpPerCapita2011_PWTrgdpe/gdpPerCapita2011_PWTrgdpe.html
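As an aside, the same re-keying can be done without mutating and deleting entries in place; a small sketch assuming the same ISO/fillKey columns:

// Build a fresh object keyed by ISO instead of reshaping the loaded array.
d3.csv("simpledata.csv", function (error, rows) {
    if (error) return console.log(error);
    var byIso = rows.reduce(function (acc, row) {
        acc[row.ISO] = { fillKey: row.fillKey };
        return acc;
    }, {});
    myMap.updateChoropleth(byIso);
});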

ServiceStack.Text.JsonObject.Parse vs. NewtonSoft.Json.Linq.JObject.Parse for nested tree of 'dynamic' instances?

I'd like to try ServiceStack's JSON parsing, but I've already figured out how to do what I need via Newtonsoft. Can the same thing be done via ServiceStack?
I've tried with the commented-out code below, but it throws exceptions; see below for the exception details.
Thanks!
Josh
[Test]
public void TranslateFromGitHubToCommitMessage()
{
    const string json =
        @"
        {
            'commits':
            [
                {
                    'author': {
                        'email': 'dev@null.org',
                        'name': 'The Null Developer'
                    },
                    'message': 'okay i give in'
                },
                {
                    'author': {
                        'email': 'author@github.com',
                        'name': 'Doc U. Mentation'
                    },
                    'message': 'Updating the docs, that\'s my job'
                },
                {
                    'author': {
                        'email': 'author@github.com',
                        'name': 'Doc U. Mentation'
                    },
                    'message': 'Oops, typos'
                }
            ]
        }
        ";

    dynamic root = JObject.Parse(json);
    //dynamic root = ServiceStack.Text.JsonSerializer.DeserializeFromString<JsonObject>(json);
    //dynamic root = ServiceStack.Text.JsonObject.Parse(json);

    var summaries = new List<string>();
    foreach (var commit in root.commits)
    {
        var author = commit.author;
        var message = commit.message;
        summaries.Add(string.Format("{0} <{1}>: {2}", author.name, author.email, message));
    }

    const string expected1 = "The Null Developer <dev@null.org>: okay i give in";
    const string expected2 = "Doc U. Mentation <author@github.com>: Updating the docs, that's my job";
    const string expected3 = "Doc U. Mentation <author@github.com>: Oops, typos";

    Assert.AreEqual(3, summaries.Count);
    Assert.AreEqual(expected1, summaries[0]);
    Assert.AreEqual(expected2, summaries[1]);
    Assert.AreEqual(expected3, summaries[2]);
}
Exception details
When using the first commented-out line:
dynamic root = ServiceStack.Text.JsonSerializer.DeserializeFromString<JsonObject>(json);
This exception occurs when the method is called.
NullReferenceException:
at ServiceStack.Text.Common.DeserializeListWithElements`2.ParseGenericList(String value, Type createListType, ParseStringDelegate parseFn)
at ServiceStack.Text.Common.DeserializeEnumerable`2.<>c__DisplayClass3.<GetParseFn>b__0(String value)
at ServiceStack.Text.Common.DeserializeSpecializedCollections`2.<>c__DisplayClass7.<GetGenericEnumerableParseFn>b__6(String x)
at ServiceStack.Text.Json.JsonReader`1.Parse(String value)
at ServiceStack.Text.JsonSerializer.DeserializeFromString[T](String value)
at GitHubCommitAttemptTranslator.Tests.GitHubCommitAttemptTranslatorTests.TranslateFromGitHubToCommitMessage()
And, the second:
dynamic root = ServiceStack.Text.JsonObject.Parse(json);
var summaries = new List<string>();
foreach (var commit in root.commits) // <-- Happens here
'ServiceStack.Text.JsonObject' does not contain a definition for 'commits'
Note: the message is 'string' does not contain a definition for 'commits' if I use the code from line one but change the type.
at CallSite.Target(Closure , CallSite , Object )
at System.Dynamic.UpdateDelegates.UpdateAndExecute1[T0,TRet](CallSite site, T0 arg0)
at GitHubCommitAttemptTranslator.Tests.GitHubCommitAttemptTranslatorTests.TranslateFromGitHubToCommitMessage()
After using DynamicJson from .NET 4.0 ServiceStack
Referring to mythz's comment:
This test case works, but if I modify it like below:
var dog = new { Name = "Spot", Parts = new { Part1 = "black", Part2 = "gray" }, Arr = new [] { "one", "two", "three"} };
var json = DynamicJson.Serialize(dog);
var deserialized = DynamicJson.Deserialize(json);
Then, deserialized.Name and Parts are fine, but Arr is of type string.
Also:
If I use ' quotes, it doesn't appear to work. Is that normal? json2 works (to the degree that Arr is also still a string), but json3 does not work at all; it just returns an empty object.
Immediate Window:
deserialized = DynamicJson.Deserialize(json3);
{}
base {System.Dynamic.DynamicObject}: {}
_hash: Count = 1
----- code: -----
var json2 =
    @"
    {
        ""Name"": ""Spot"",
        ""Parts"": {
            ""Part1"": ""black"",
            ""Part2"": ""gray""
        },
        ""Arr"": [
            ""one"",
            ""two"",
            ""three""
        ]
    }";

var json3 =
    @"
    {
        'Name': 'Spot',
        'Parts': {
            'Part1': 'black',
            'Part2': 'gray'
        },
        'Arr': [
            'one',
            'two',
            'three'
        ]
    }";
var deserialized = DynamicJson.Deserialize(json1);
ServiceStack's JSON Serializer also supports dynamic parsing; see examples of how to parse GitHub's JSON in the Dynamic JSON section of the wiki page.