I'm trying to load a UTF-8 JSON file from disk using Node.js (0.10.29) on Windows 8.1. The following is the code that runs:
var http = require('http');
var utils = require('util');
var path = require('path');
var fs = require('fs');
var myconfig;
fs.readFile('./myconfig.json', 'utf8', function (err, data) {
    if (err) {
        console.log("ERROR: Configuration load - " + err);
        throw err;
    } else {
        try {
            myconfig = JSON.parse(data);
            console.log("Configuration loaded successfully");
        } catch (ex) {
            console.log("ERROR: Configuration parse - " + ex);
        }
    }
});
I get the following error when I run this:
SyntaxError: Unexpected token ´╗┐
at Object.parse (native)
...
Now, when I change the file encoding (using Notepad++) to ANSI, it works without a problem.
Any ideas why this is the case? Whilst development is being done on Windows, the final solution will be deployed to a variety of non-Windows servers; I'm worried that I'll run into issues on the server end if I deploy an ANSI file to Linux, for example.
According to my searches here and via Google the code should work on Windows as I am specifically telling it to expect a UTF-8 file.
Sample config I am reading:
{
    "ListenIP4": "10.10.1.1",
    "ListenPort": 8080
}
Per "fs.readFileSync(filename, 'utf8') doesn't strip BOM markers #1918", fs.readFile is working as designed: the BOM is not stripped from the header of the UTF-8 file, if it exists. It is at the discretion of the developer to handle this.
Possible workarounds:
data = data.replace(/^\uFEFF/, ''); per https://github.com/joyent/node/issues/1918#issuecomment-2480359
Transform the incoming stream to remove the BOM header with the npm module bomstrip, per https://github.com/joyent/node/issues/1918#issuecomment-38491548
What you are getting is the byte order mark (BOM) at the head of the UTF-8 file. When JSON.parse sees this, it throws a syntax error (read: "unexpected character" error). You must strip the byte order mark from the file contents before passing them to JSON.parse:
fs.readFile('./myconfig.json', 'utf8', function (err, data) {
    // note: with the 'utf8' encoding argument, data is already a string
    myconfig = JSON.parse(data.replace(/^\uFEFF/, ''));
});
To get this to work without stripping the BOM in code, I had to change the encoding from "UTF-8" to "UTF-8 without BOM" using Notepad++ (I assume any decent text editor - not Notepad - can choose this encoding type).
This solution meant that the deployment guys could deploy to Unix without a hassle, and I could develop without errors during the reading of the file.
In terms of reading the file, the other result I sometimes got in my travels was a question mark prepended to the start of the file contents when trying various encoding options. Naturally, with a question mark or ANSI characters prepended, JSON.parse fails.
Hope this helps someone!
New answer
As I had the same problem with several different formats, I went ahead and made an npm module that tries to read text files and parse them as text, no matter the original encoding. (As the original question was about reading a .json file, it fits perfectly.) Files without a BOM, or with an unknown BOM, are handled as ASCII/latin1.
https://www.npmjs.com/package/textfilereader
So change the code to
var http = require('http');
var utils = require('util');
var path = require('path');
var fs = require('textfilereader');
var myconfig;
fs.readFile('./myconfig.json', 'utf8', function (err, data) {
    if (err) {
        console.log("ERROR: Configuration load - " + err);
        throw err;
    } else {
        try {
            myconfig = JSON.parse(data);
            console.log("Configuration loaded successfully");
        } catch (ex) {
            console.log("ERROR: Configuration parse - " + ex);
        }
    }
});
Old answer
I ran into this problem today and created a function to take care of it. It should have a very small footprint; I assume it's better than the accepted replace solution.
function removeBom(input) {
    // BOM values taken from https://en.wikipedia.org/wiki/Byte_order_mark
    // Note: charCodeAt(0) returns a single UTF-16 code unit, so for files
    // Node has already decoded the BOM arrives as U+FEFF, and the 'feff'
    // case is the one that fires in practice.
    const fc = input.charCodeAt(0).toString(16);
    switch (fc) {
        case 'efbbbf':   // UTF-8
        case 'feff':     // UTF-16 (BE) + UTF-32 (BE)
        case 'fffe':     // UTF-16 (LE)
        case 'fffe0000': // UTF-32 (LE)
        case '2b2f76':   // UTF-7
        case 'f7644c':   // UTF-1
        case 'dd736673': // UTF-EBCDIC
        case 'efeff':    // SCSU
        case 'fbee28':   // BOCU-1
        case '84319533': // GB-18030
            return input.slice(1);
        default:
            return input;
    }
}
const fileBuffer = removeBom(fs.readFileSync(filePath, "utf8"));
Related
I am having an odd issue, whereby my app is writing JSON into a file, but in some cases, it is leaving spurious characters at the end.
It does not happen all the time, making it difficult to work around.
The JSON files are created inside the app, to be used later inside the app (though some are sent to an API, and I occasionally see the same issue there). What I see after the final closing brace is part of previously saved data. It is as though writeAsString is not truncating the file, just writing over the top, and if what it writes is shorter, the remainder is left in the file.
An example...
// Working with my Map<String, dynamic>, adding or modifying fields
// In this case, my map is called sFormData
await jFile.writeFile("Submitted.json", json.encode(sFormData));
the writeFile routine is...
Future<File> writeFile(String fileName, String content) async {
  final path = await localPath;
  File file = File('$path/' + fileName
      .split('/')
      .last);
  // Write the file.
  return await file.writeAsString(content);
}
which, without a FileMode, should default to FileMode.write, which should truncate the original file during writing.
Mostly, this is fine. However, when it breaks, then either when it is sent to the API, or re-used again inside the app... then the issues start. Inside the app, I am getting errors like...
FormatException: Unexpected character (at line x, character y)
when I try
String filejson = await file.readFile(fileName);
// When I look at filejson, I can see the extra characters, which causes the jsonDecode below to break
List<InProgressOrSubmittedItems> formsList = InProgressOrSubmittedList.fromJson(jsonDecode(filejson)).pForms as List<InProgressOrSubmittedItems>;
This leads me to believe that it is something in the writeAsString method that is not clearing down the file before writing.
=== EDIT ===
After trying things, this appears to work, but I think it is more of a hack. Can anyone see any potential issues with this?
Future<File> writeJsonFile(String fileName, Map jsonData) async {
  final path = await localPath;
  File file = File('$path/' + fileName
      .split('/')
      .last);
  String encodedJson = json.encode(jsonData);
  await file.writeAsString(encodedJson);
  try {
    // Test the written JSON...
    String jsonContent = await file.readAsString();
    jsonData = json.decode(jsonContent);
  } catch (e) {
    // Uh-oh... if we got here, the JSON did not save properly.
    // Let's try again, deleting the file first this time.
    if (await file.exists()) {
      await file.delete();
    }
    await file.writeAsString(encodedJson);
  }
  return file;
}
I'm trying to send special/French characters like ÆÇÈ-1 Çâfé's Çôrp-Ltd in a JSON request, but it is failing with the below error:
{ "error":{ "code":"4000", "message":"Invalid Request From Consumer. Error Description :0x00c30025 Unable to parse JSON and generate JSONx: illegal character 's' at offset 8951
whereas when I try to pass the same name, ÆÇÈ-1 Çâfé's Çôrp-Ltd, in an XML request, it is processed with a success response.
Please help me resolve the issue with the JSON request.
The first thing to check is that your service (an MPGW, I assume) is set to Request data: JSON and not XML.
If JSON as request data isn't working either, try with Non-XML.
DataPower runs different parsers/validators depending on the content data type, but as long as the JSON itself is valid it should work; otherwise, open a PMR!
If you still can't figure it out, test whether the incoming data is detected as UTF-8 or as Latin1. Add a GWS action, feed the INPUT to it, and try something like:
// This is where we start, grab the INPUT as a buffer
session.input.readAsBuffers(function(readAsBuffersError, data) {
if (readAsBuffersError) {
console.error('Error on readAsBuffers:', readAsBuffersError);
} else {
let content = data.toString();
if (content.length === 0) {
console.error('Empty message found for Request message!');
} else {
try {
// This seems like an overkill solution but we need to know if data fetched is UTF-8 or Latin1
// If Latin1 we need to wobble it around to not wreck "double-bytes", e.g. Å, Ä or Ö
const util = require('util');
const td = new util.TextDecoder('utf8', { fatal: true });
td.decode(Buffer.from(content));
console.log('Buffer data is UTF-8!');
} catch (err) {
// It is not UTF-8, since it failed so we'll assume it is Latin1. If not it will throw it's own Buffer error or transformation will fail...
if (err.message === 'The encoded data was not valid for encoding utf-8') {
const l1Buffer = Buffer.from(content, 'latin1');
const l1String = l1Buffer.toString('latin1');
content = Buffer.from(l1String, 'utf8');
console.log('Buffer data was Latin1, now converted into UTF-8 successfully!');
}
}
session.output.write(content);
}
}
});
I'm writing a method that uses async/await and promises to write some JSON to a file and then render a pug template. But for some reason the code that writes the JSON conflicts with the res.render() method resulting in the browser not being able to connect to the server.
The weird thing is that I don't get any errors in the console, and the JSON file is generated as expected — the page just won't render.
I'm using the fs-extra module to write to disk.
const fse = require('fs-extra');
exports.testJSON = async (req, res) => {
    await fse.writeJson('./data/foo.json', {Key: '123'})
        .then(function () {
            console.log('JSON updated.')
        })
        .catch(function (err) {
            console.error(err);
        });
    res.render('frontpage', {
        title: 'JSON Updated...',
    });
}
I'm starting to think that there is something fundamental I'm not getting that conflicts with promises, writing to disk and/or express' res.render method. It's worth noting that res.send() works fine.
I've also tried a different NPM module to write the file (write-json-file). It gave me the exact same issue.
UPDATE:
So I'm an idiot. The problem has nothing to do with Express or the JSON file. It has to do with the fact that I'm running nodemon to automatically restart the server when files are changed. So as soon as the JSON file was saved, the server would restart, stopping the process of rendering the page. Apologies to the awesome people trying to help me anyway. You still helped me get to the problem, so I really appreciate it!
Here's the actual problem:
The OP is running nodemon to restart the server whenever it sees file changes, and this is what stops the code from running, because as soon as the JSON file is generated the server restarts.
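If you hit the same situation, nodemon can be told to leave generated files alone via its ignore setting. A minimal nodemon.json sketch (the data/ path comes from the question's code; adjust it to your layout):

```json
{
  "ignore": ["data/*.json"]
}
```

The same can be done on the command line with nodemon --ignore 'data/*.json'.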
Efforts to troubleshoot:
It's going to take some troubleshooting to figure this out, and since I need to show you code, I will put it in an answer even though I don't yet know what is causing the problem. I'd suggest you fully instrument things with this code:
const fse = require('fs-extra');

exports.testJSON = async (req, res) => {
    try {
        console.log(`1:cwd - ${process.cwd()}`);
        await fse.writeJson('./data/foo.json', {Key: '123'})
            .then(function () {
                console.log('JSON updated.')
            }).catch(function (err) {
                console.error(err);
            });
        console.log(`2:cwd - ${process.cwd()}`);
        console.log("about to call res.render()");
        res.render('frontpage', {title: 'JSON Updated...'}, (err, html) => {
            if (err) {
                console.log(`res.render() error: ${err}`);
                res.status(500).send("render error");
            } else {
                console.log("res.render() success 1");
                console.log(`render length: ${html.length}`);
                console.log(`render string (first part): ${html.slice(0, 20)}`);
                res.send(html);
                console.log("res.render() success 2");
            }
        });
        console.log("after calling res.render()");
    } catch (e) {
        console.log(`exception caught: ${e}`);
        res.status(500).send("unknown exception");
    }
}
I'm writing a node.js module which imports a JSON file:
const distDirPath = "c:/temp/dist/";
const targetPagePath = "c:/temp/index.html";
const cliJsonPath = "C:/CODE/MyApp/.angular-cli.json";
const fs = require('fs');

function deployAot() {
    var version = JSON.parse(fs.readFileSync(cliJsonPath, 'utf8')).version;
}

// export the module
module.exports = {
    DeployAot: deployAot
};
I validated the contents of the JSON file above at https://jsonlint.com/ and it's valid JSON, but the first line of code in deployAot() produces the following error when I execute the module:
"Unexpected token in JSON at position 0"
Here's the specific json:
https://jsonblob.com/cd6753d2-9e51-11e7-aa97-2f95b001b178
Any idea what the problem might be here?
As @cartant already mentioned in the comments on the question, most probably you have a special character (a byte order mark) at the beginning of the file.
I would try to replace this
fs.readFileSync(cliJsonPath, 'utf8')
with this
fs.readFileSync(cliJsonPath, 'utf8').substring(1)
to get rid of the very first character of the string, and see what happens. Note that substring(1) removes the first character unconditionally, so it is only a diagnostic step; the replace workaround below is safer when you are not sure a BOM is present.
GitHub issue fs.readFileSync(filename, 'utf8') doesn't strip BOM markers
Recommendation from the issue:
Workaround:
body = body.replace(/^\uFEFF/, '');
Apply this after reading a UTF-8 file when you are uncertain whether it may have a BOM marker in it.
Hello, I just started learning Node.js and made a local server as a start.
Then I saw that most Node.js apps have config and package files. I couldn't find any info on how to make a simple one or use JSON files, so I tried myself; this is what I've got so far.
this is the server file
var http = require('http');
var json = require('./package');
var fs = require('fs');

var server = http.createServer(function (req, res) {
    res.writeHead(200, {'Content-Type': 'text/plain'});
    res.end('Hello World\n');
}).listen(addr.port);

console.log('server listening at', addr.address + ':' + addr.port);
and this is the json file
{
    "addr": {
        "address": "http://127.0.0.1",
        "port": "8081"
    }
}
I know that it will work with json.address and json.port, but when I added "addr" I thought it would simplify things with addr.port.
So, in short, an explanation would be generously accepted of why it won't/shouldn't work, or what I'm doing wrong.
First off, you should have a look at some tutorials or introduction sites like:
https://www.w3schools.com/nodejs/default.asp
Second:
The package.json file is the main configuration file of your Node.js application. That's the config file that defines the start point of your application as well as all included modules. Simply use npm init to create a default package.json file with basic information.
Third:
If you require a JSON file into your application as you did in your example, the JSON is included hierarchically, which means the object you required has an attribute addr, which itself is an object with an attribute address.
So the correct way to access your information is json.addr.address, based on your object description.
you could also do something like this:
var network = require('./settings').addr;
console.log("ip => " + network.address);
console.log("port => " + network.port);
You need to include the parent object. You have put addr.address and addr.port, which means you are trying to access an addr object directly, but no such variable exists. Try json.addr.address and json.addr.port and it should work.
var http = require('http');
var json = require('./package');
var fs = require('fs');

var server = http.createServer(function (req, res) {
    res.writeHead(200, {'Content-Type': 'text/plain'});
    res.end('Hello World\n');
}).listen(json.addr.port);

console.log('server listening at', json.addr.address + ':' + json.addr.port);