I have this fs.writeFile code that is supposed to update the localdata.json file when a newWorkout is POSTed to the database; it takes local state data and attempts to write it to the file.
It won't work, though, and throws a TypeError: fs.writeFile is not a function error. I'm working on the fix now, but if anyone sees anything, help is appreciated.
fs.writeFile(
  "./localdata.json",
  JSON.stringify(newWorkout.eventDateTime),
  "utf-8",
  function(err) {
    if (err) throw err
    console.log("Done!")
  }
)
Given that you are working in a Node.js environment, it seems like fs is not a proper Node.js File System object. Copied from the Node.js 10.x documentation:
To use this module:
const fs = require('fs');
Node's File System cannot be used "browser side"; these calls are meant to happen on the server side of things.
Did you include:
var fs = require("fs");
...the rest of your code...
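For reference, a minimal sketch of the corrected call with the import in place (assuming newWorkout is in scope, as in the question):

var fs = require("fs");

fs.writeFile(
  "./localdata.json",
  JSON.stringify(newWorkout.eventDateTime),
  "utf-8",
  function(err) {
    if (err) throw err;
    console.log("Done!");
  }
);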
So I am attempting to learn how to use the Google Sheets API with Node.js. In order to get an understanding, I followed along with the Node.js quickstart guide supplied by Google. I attempted to run it, nearly a line-for-line copy of the guide, just without documentation. I wind up encountering this: cmd console output that definitely didn't work.
Just in case anyone wants to see if I am not matching the guide, which is entirely possible since I am fairly new to this, here is a link to the Google page and my code.
https://developers.google.com/sheets/api/quickstart/nodejs
var fs = require('fs');
var readline = require('readline');
var google = require('googleapis');
var googleAuth = require('google-auth-library');
var SCOPES = ['https://www.googleapis.com/auth/spreadsheets.readonly'];
var TOKEN_DIR = (process.env.HOME || process.env.HOMEPATH ||
    process.env.USERPROFILE) + '/.credentials/';
var TOKEN_PATH = TOKEN_DIR + 'sheets.googleapis.com-nodejs-quickstart.json';

fs.readFile('client_secret.json', function processClientSecrets(err, content) {
  if (err) {
    console.log('Error loading client secret file: ' + err);
  }
  authorize(JSON.parse(content), listMajors);
});
I have tried placing the JSON file in every part of the directory, but it still won't see it. I've been pulling my hair out all day, and a poke in the right direction would be immensely appreciated.
From your command output:
Error loading client secret file
So your if (err) branch is being triggered. But since you don't throw the error, the script continues anyway (which is dangerous in general).
SyntaxError: Unexpected token u in JSON at position 0
This means that the data you are passing to JSON.parse() is undefined, not a valid JSON string: undefined gets coerced to the string "undefined", whose first character u is the unexpected token at position 0.
You could use load-json-file (or the module it uses internally, parse-json) to get more helpful error messages. But the root cause is that your content variable has nothing in it, since the client_secret.json you tried to read could not be found.
As for why the file could not be found: there could be a typo in either the script or the filename you saved the JSON under, or it may have to do with the current working directory. You may want to use something like this to ensure you end up with the same path regardless of the current working directory:
path.join(__dirname, 'client_secret.json')
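Putting it together, a minimal sketch of the fixed read (the throw here is added so a missing file stops the script instead of letting it parse undefined):

var fs = require('fs');
var path = require('path');

fs.readFile(path.join(__dirname, 'client_secret.json'), function processClientSecrets(err, content) {
  if (err) throw err; // stop instead of parsing undefined
  authorize(JSON.parse(content), listMajors);
});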
Resources
path.join()
__dirname
I've built my first Node app, in which I need to use 5-10 global variables a lot. The thing is, I would like to be able to change those values without restarting the server.
So what I thought was to set up an interval and update those values either from a (JSON?) file or through a couple of queries to the database.
Now, what would be the better option here? Both mysql and file-reading modules are used in the app.
Security-wise, wouldn't it be best to place the JSON file outside the public folder and read from that? Although, with SQL injection not being possible, I think the DB should be pretty safe too.
What do you guys think? Still a newbie in Node.js.
Thanks
With yamljs, the overhead is that you will need to install it. From your question, it seems you are already using a bunch of 3rd party modules in your app.
Why not use something that is a part of node.js itself?
Use the fs module.
Example:
var fs = require('fs');
var obj;

fs.readFile('myData.json', 'utf8', function (err, data) {
  if (err) throw err;
  obj = JSON.parse(data); // note: obj is only set once this callback runs
});
Docs: https://nodejs.org/api/fs.html#fs_fs_readfile_file_options_callback
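If the values are needed before the callback has had a chance to run, a synchronous read at startup (a reasonable variation, not from the original answer) avoids that race:

var fs = require('fs');
var obj = JSON.parse(fs.readFileSync('myData.json', 'utf8'));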
A common technique for passing configuration variables to a server is via a YAML file. Normally this file is read once when the server starts, but you could periodically query the file to see when it was last updated and, if the file has changed, update the configuration variables currently in use.
yamljs
var YAML = require('yamljs');
var config = YAML.load('myfile.yml');
Then you can periodically check the last time the file was modified using the mtime property from fs.stat:
fs.stat(path, [callback])
If you find that the last-modified time has changed, you can re-read the YAML file and update your config with the new values. (You will probably want to do a sanity check to make sure the file wasn't corrupted, etc.)
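A minimal sketch of that polling loop, assuming the yamljs setup above (the 5-second interval is an arbitrary choice):

var fs = require('fs');
var YAML = require('yamljs');

var config = YAML.load('myfile.yml');
var lastModified = 0;

setInterval(function () {
  fs.stat('myfile.yml', function (err, stats) {
    if (err) return console.error(err);
    if (stats.mtime.getTime() > lastModified) {
      lastModified = stats.mtime.getTime();
      config = YAML.load('myfile.yml'); // re-read the changed file
    }
  });
}, 5000);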
If you don't want to write the file watching logic yourself I recommend checking out chokidar
// Initialize watcher.
var watcher = chokidar.watch('myfile.yml', {
  ignored: /[\/\\]\./,
  persistent: true
});

// Add event listeners.
watcher.on('change', function(path) {
  // Update config
});
I have been getting this error: FATAL ERROR: JS Allocation failed - process out of memory, and I have pinpointed the problem: I am sending a really, really large JSON object to res.json (or JSON.stringify).
To give you some context, I am basically sending around 30,000 config files (each config file has around 10,000 lines) as one JSON object.
My question is: is there a way to send such a huge JSON object, or is there a better way to stream it (like using socket.io)?
I am using: node v0.10.33, express#4.10.2
UPDATE: Sample code
var app = express();

app.route('/events')
  .get(function(req, res, next) {
    var configdata = [{config:<10,000 lines of config>}, ... 10,000 configs];
    res.json(configdata); // The out of memory error comes here
  });
After a lot of trying, I finally decided to go with socket.io to send each config file one at a time rather than all config files at once. This solved the out-of-memory problem that was crashing my server. Thanks for all your help.
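For anyone curious, a minimal sketch of that per-file approach (the server variable and the 'config' / 'done' event names are illustrative, not from the original post):

var io = require('socket.io')(server);

io.on('connection', function (socket) {
  configdata.forEach(function (cfg) {
    socket.emit('config', cfg); // one config file per event
  });
  socket.emit('done'); // tell the client everything has been sent
});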
Try to use streams. What you need is a readable stream that produces data on demand. I'll write simplified code here:
var Readable = require('stream').Readable;

var rs = new Readable();
rs._read = function () {
  // assuming 10,000 lines of config fits in memory;
  // push a string, since the stream is not in object mode
  rs.push(JSON.stringify({config: <10,000 lines of config>}));
  rs.push(null); // signal end-of-stream so _read is not called again
};
rs.pipe(res);
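Extending that idea, a sketch of streaming a large JSON array one element at a time so only one config is held in the outgoing buffer at once (function and variable names here are illustrative):

var Readable = require('stream').Readable;

function configStream(configs) {
  var i = 0;
  var rs = new Readable();
  rs._read = function () {
    if (i < configs.length) {
      // emit one array element per _read call, with JSON punctuation
      rs.push((i === 0 ? '[' : ',') + JSON.stringify(configs[i]));
      i++;
    } else {
      rs.push(i === 0 ? '[]' : ']'); // close the array
      rs.push(null); // end the stream
    }
  };
  return rs;
}

// usage: configStream(configdata).pipe(res);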
You can try increasing the memory Node has available with the --max_old_space_size flag on the command line (for example, node --max_old_space_size=4096 app.js for 4 GB).
There may be a more elegant solution. My first reaction was to suggest using res.json() with a Buffer object rather than trying to send the entire object all in one shot, but then I realized that whatever is converting to JSON will probably want to use the entire object all at once anyway. So you would run out of memory even though you had switched to a stream. Or at least that's what I would expect.
EDIT
The problem seems to be related to WebStorm itself; it seems that it doesn't want to work with objects containing a huge number of nested objects. Nor does it want to show the object contents inside the Watches window. The problem is kind of strange because I'm able to inspect the string, which loads blazingly fast. Seems like a WebStorm issue.
I have a relatively big (4.9 MB) JSON file that I need to process in Node.js. The file is stored in the file system and is loaded using the following lines of code:
var path = require('path');
var filename = path.join(__dirname, 'db_asci.json');
var fs = require('fs');
var content = fs.readFileSync(filename);
debugger;
var decycledObj = JSON.parse(content);
debugger;
The problem is that after the first debugger; breakpoint is hit, the second one is not. I've been waiting for more than 20 minutes and nothing; one processor core is loaded at 100%. I'm unable to step into the function because it's native.
Here is the ASCII version of the JSON
Here is the UTF-8 version of the JSON
What am I doing wrong?
The problem you are running into is not JSON parsing taking too long. Indeed, try this:
var start = Date.now();
var obj = JSON.parse(fs.readFileSync(filename));
console.log('Took', Date.now() - start, 'ms');
You'll probably see that it took less than a second or so.
What you are running into is an issue with the debugger itself – the observer effect. The act of observing a system changes that system.
I assume you're using node-inspector. Whenever you have an extremely large, complex object, it is extremely expensive to load the object into the inspector. While it is doing so, your node process will peg the CPU and the event loop is paused.
My guess is that the JSON is parsed and a huge (given that we're dealing with 5 MB) object is created. Node then hits the second debugger; statement, and the inspector needs to load the locals. The excruciatingly slow process begins, and the inspector won't show that you've hit a breakpoint until it finishes. So to you it just looks frozen.
Try replacing your JSON file with something small (like {a:1}). It should load quickly.
Do you really need to visually inspect the entire object? There are tools better suited for viewing JSON files.
+1 for Pradeep Mahdevu's solution; here is another way to do the same thing (edited with the async version):
var fs = require('fs');
var options = { encoding: 'utf8' };

fs.readFile('db_asci.json', options, function (err, data) {
  if (err) throw err;
  var object = JSON.parse(data); // the parsed data is only available here
});
You can require .json files, so there is no need to parse manually:
var content = require('./db_asci.json');
That should do it!
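One caveat worth knowing (general Node behavior, not from the original answer): require caches the result, so re-requiring returns the same object. To force a fresh read:

// drop the cached copy, then require again
delete require.cache[require.resolve('./db_asci.json')];
var fresh = require('./db_asci.json');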
I'm using Node.exe in the following file structure:
Node/
    node.exe
    index.js
    view/
        index.html
When running the following code:
var html;
fs.readFileSync('/view/index.html', function(err, data) {
  if (err) {
    throw err;
  }
  html = data;
});
I get the following error:
Error: ENOENT, The system cannot find the file specified. '/view/index.html'
Can you see what's causing the error? I'm quite new to node.js.
Additional information:
I'm running Windows 7 64 bit, up to date version of node.exe.
I've found the solution!
When node.exe is run through cmd, the default working directory for node.exe is user.... That's where I was going wrong; it was using a different directory to the one where node.exe is located.
A few things:
You should resolve the relative path to a real path first, and then try reading the file.
Read the file asynchronously so you get the callback.
The relative path should be corrected. The "./view/index.html" in my code is the path relative to where you start your Node.js engine.
Rough code will be:
var fs = require('fs');
var html;

// get real path from relative path
fs.realpath('./view/index.html', function(err, resolvedPath) {
  if (err) throw err;
  // pass the resolved path to read asynchronously
  fs.readFile(resolvedPath, function(err, data) {
    if (err) throw err;
    // assign the variable per your use case
    html = data;
  });
});
Note that I am using version 4.11 (latest stable version)
You might want to lose the Sync part; only use readFile when you have a callback. (readFileSync does not take one; it returns the contents directly.)
Also: ./path, not /path.
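For completeness, a minimal sketch of both corrected forms (using the file layout from the question):

var fs = require('fs');

// async: the contents arrive in the callback
fs.readFile('./view/index.html', 'utf8', function (err, data) {
  if (err) throw err;
  var html = data;
});

// sync: no callback, the contents are returned directly
var html = fs.readFileSync('./view/index.html', 'utf8');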