I am trying to use my own extension for .json files, but when I require them, Node doesn't recognize them.
For example when I do:
var users = require('./users.json');
users is now an object like:
{ name: 'somebody', age: 27 }
When I require the same file with a different extension:
require('./users.myextension')
users is now empty:
=> {}
Is there a way to fix this? Otherwise I'll have to stick with .json as the extension.
You can look at globals in the Node.js documentation.
In short, you can instruct require how to load a file this way:
require.extensions['.extension'] = require.extensions['.json'];
Please mind that this is deprecated, but as stated in the note, the feature probably will not be removed, since the module system is locked.
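For example, a minimal sketch, assuming a users.myextension file containing valid JSON:

// Register the handler before requiring files with the custom extension.
require.extensions['.myextension'] = require.extensions['.json'];

var users = require('./users.myextension');
console.log(users.name); // 'somebody'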
I am trying to read a CSV file in a Firebase function so that I can process the file and do the rest of the operations using the data.
import * as csv from "csvtojson";

const csvFilePath = "<gdrive shared link>";

try {
  console.log("First Method...");
  csv()
    .fromFile(csvFilePath)
    .then((jsonObj: any) => {
      console.log("jsonObj....", JSON.stringify(jsonObj));
    });

  console.log("Second Method...");
  const jsonArray = await csv().fromFile(csvFilePath);
  console.log("jsonArray...", JSON.stringify(jsonArray));
} catch (e) {
  console.log("error", JSON.stringify(e));
}
The above are the two methods I have tried for reading the CSV, but both show the Firebase error:
'Error: File does not exist. Check to make sure the file path to your csv is correct.'
For csvFilePath I have tried two approaches.
Added the CSV file to the same folder as the function and referenced it like:
const csvFilePath = "./student.csv"
Added the same file to Google Drive, changed the access permissions so that anyone with the link can read and edit, and used that link as the path:
const csvFilePath = "<gdrive shared link>"
Both show the same error. In the Google Drive case I don't want to use any sort of Google credential, because I only intend to read a simple CSV file in a Firebase function.
I will start by proposing that you convert your CSV to JSON locally, without the function, and see if it works. I see you are using ES6 imports, which might be causing an issue, since all the documentation uses require. You can also try CSV Parse or some of the solutions provided in this question as an alternative, trying them without the function to check whether they actually work, so you can rule that out. You could even upload the JSON once you have converted it from the CSV, but that depends on what you are trying to do.
I think the best way to achieve this is to follow the approach given in this question: first upload the file into Cloud Storage, then use onFinalize() to trigger the conversion.
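A minimal sketch of that approach, assuming the CSV lands in the default bucket and using require() as the documentation does (the function and file names here are illustrative, not from the question):

const functions = require('firebase-functions');
const admin = require('firebase-admin');
const csv = require('csvtojson');
const os = require('os');
const path = require('path');

admin.initializeApp();

exports.onCsvUpload = functions.storage.object().onFinalize(async (object) => {
  // Only react to CSV uploads.
  if (!object.name || !object.name.endsWith('.csv')) return null;

  // Download the uploaded object to the function's writable temp directory...
  const tempPath = path.join(os.tmpdir(), path.basename(object.name));
  await admin.storage().bucket(object.bucket).file(object.name)
    .download({ destination: tempPath });

  // ...then parse the local copy.
  const rows = await csv().fromFile(tempPath);
  console.log('parsed rows:', JSON.stringify(rows));
  return null;
});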
I will also point to these three questions that went through similar issues with the path. They were able to fix it by adding __dirname (see the sketch after this list). Each one has some extra useful information.
Context for "relative paths" seems to change to the calling module if a module is imported
The csvtojson converter ignores my file name and just puts undefined
How to avoid the error that csvtojson throws
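A sketch of that fix, assuming the CSV is deployed alongside the function code (student.csv is the file name from the question):

const path = require('path');

// Resolve the CSV relative to the deployed module rather than
// the process working directory.
const csvFilePath = path.join(__dirname, 'student.csv');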
I'm building a simple stand-alone Angular application in which you can submit information.
This information needs to be saved in a local JSON file in the assets folder. I know Angular runs in a web browser; that's why I use Electron to build it. The problem is that I can't figure out a way to edit the JSON files in Angular 5 using Electron (locally).
I have tried the solutions mentioned in this post, but they didn't work for me. Any other solutions?
After having this problem for quite some time, I finally figured out how to solve it:
You need to put this in a script tag in your index.html:
var remote = require('electron').remote
var fs = remote.require('fs');
and in every component where you want to use it, you need to declare it globally:
declare var fs: any;
Then you can use it!
It was quite a struggle to figure out...
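For example, writing submitted data back to a JSON file might look roughly like this (the asset path and data shape are assumptions, not from the question):

// Somewhere in a component, after fs has been declared as above.
var data = { name: 'example' };
fs.writeFile('src/assets/users.json', JSON.stringify(data, null, 2), 'utf8', function (err) {
  if (err) console.error('Failed to save JSON:', err);
});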
Because it's just JSON data, might I suggest using localStorage instead? Then, you can do something like:
... // Code that creates the JSON object
var theJSONdata = JSON.stringify(jsonObj); // convert the object to a string
window.localStorage.setItem('mysavedJSON', theJSONdata);
Later, when you need to load the JSON to edit it or read it, just use:
jsonObj = JSON.parse(window.localStorage.getItem('mysavedJSON'));
I've built my first Node app, in which I need to use 5-10 global variables a lot. The thing is, I would like to be able to change those values without restarting the server.
So what I thought was to set up an interval and update those values either from a (JSON?) file or through a couple of queries to the database.
Now what would be my better option here? Both mysql and file-reading modules are already used in the app.
Security-wise, wouldn't it be best to place the JSON file outside the public folder and read from that? Although with SQL injection not being possible, I think the DB should be pretty safe too.
What do you guys think? Still a newbie in Node.js.
Thanks
With yamljs, the overhead is that you will need to install it. From your question, it seems you are already using a bunch of third-party modules in your app.
Why not use something that is part of Node.js itself?
Use the fs module.
Example:
var fs = require('fs');
var obj;

fs.readFile('myData.json', 'utf8', function (err, data) {
  if (err) throw err;
  obj = JSON.parse(data); // obj is only populated once the callback fires
});
Docs: https://nodejs.org/api/fs.html#fs_fs_readfile_file_options_callback
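Since the goal is to pick up changes without restarting the server, here is a rough sketch of the interval-based reload described in the question (file name and interval are placeholders):

var fs = require('fs');
var config = {};

function reloadConfig() {
  fs.readFile('myData.json', 'utf8', function (err, data) {
    if (err) return console.error('config reload failed:', err);
    try {
      config = JSON.parse(data);
    } catch (parseErr) {
      console.error('invalid JSON, keeping previous config:', parseErr);
    }
  });
}

reloadConfig();
setInterval(reloadConfig, 30 * 1000); // re-read every 30 seconds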
A common technique for passing configuration variables to a server is via a YAML file. Normally this file is read once when the server starts, but you could periodically check when the file was last updated and, if it has changed, reload the configuration variables currently in use.
yamljs
var YAML = require('yamljs');
var config = YAML.load('myfile.yml');
Then you can periodically check the last time the file was modified using the mtime property from fs.stat:
fs.stat(path, [callback])
If you find that the last modified time has changed, you can re-read the YAML file and update your config with the new values. (You will probably want to do a sanity check to make sure the file wasn't corrupted, etc.)
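A sketch of that polling logic, reusing the same file name (the interval and error handling are illustrative):

var fs = require('fs');
var YAML = require('yamljs');

var config = YAML.load('myfile.yml');
var lastModified = 0;

setInterval(function () {
  fs.stat('myfile.yml', function (err, stats) {
    if (err) return console.error(err);
    if (stats.mtime.getTime() > lastModified) {
      lastModified = stats.mtime.getTime();
      config = YAML.load('myfile.yml'); // sanity-check the contents in real code
    }
  });
}, 10 * 1000);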
If you don't want to write the file-watching logic yourself, I recommend checking out chokidar:
var chokidar = require('chokidar');

// Initialize watcher.
var watcher = chokidar.watch('myfile.yml', {
  ignored: /[\/\\]\./,
  persistent: true
});

// Add event listeners.
watcher.on('change', function (path) {
  config = YAML.load(path); // update config
});
I'm trying to import a local .json file using d3.json().
The file filename.json is stored in the same folder as my HTML file.
Yet the json parameter in the callback is null.
d3.json("filename.json", function(json) {
root = json;
root.x0 = h / 2;
root.y0 = 0;});
. . .
}
My code is basically the same as in this d3.js example
If you're running in a browser, you cannot load local files.
But it's fairly easy to run a dev server: on the command line, simply cd into the directory with your files, then:
python -m SimpleHTTPServer
(or python -m http.server using Python 3)
Now in your browser, go to localhost:8000 (or whatever port is shown on the command line).
The following used to work in older versions of d3:
var json = {"my": "json"};
d3.json(json, function(json) {
  root = json;
  root.x0 = h / 2;
  root.y0 = 0;
});
In d3 v5, you should do it as:
d3.json("file.json").then(function(data) {
  console.log(data);
});
Similarly, with csv and other file formats.
You can find more details at https://github.com/d3/d3/blob/master/CHANGES.md
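Equivalently, since d3.json() now returns a promise, a sketch with async/await and basic error handling:

async function load() {
  try {
    const data = await d3.json("file.json");
    console.log(data);
  } catch (err) {
    console.error("failed to load JSON:", err);
  }
}

load();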
Adding to the previous answers, it's simpler to use the HTTP server provided by most Linux/Mac machines (just by having Python installed).
Run the following command in the root of your project
python -m SimpleHTTPServer
Then, instead of accessing file://.....index.html, open your browser at http://localhost:8000 or the port reported when you start the server. This way the browser will fetch all the files in your project without being blocked.
Refer to this code, which reads from a file and creates a graph: http://bl.ocks.org/eyaler/10586116
I also had the same problem, but later I figured out that the problem was in the JSON file I was using (an extra comma). If you are getting null here, try printing the error, like this:
d3.json("filename.json", function(error, graph) {
alert(error)
})
This works in Firefox; in Chrome, somehow, it's not printing the error.
Loading a local CSV or JSON file with (d3)js is not considered safe, so browsers prevent you from doing it. There are some solutions to get it working, though. The following line basically does not work (csv or json) because it is a local import:
d3.csv("path_to_your_csv", function(data) {console.log(data) });
Solution 1:
Disable the security in your browser
Different browsers have different security settings that you can disable. This solution can work, and you will be able to load your files. Disabling is, however, not advisable: it will make you vulnerable to all kinds of threats. On the other hand, who is going to use your software if you tell them to manually disable their security?
Disable the security in Chrome:
--disable-web-security
--allow-file-access-from-files
Solution 2:
Load your csv/json file from a website.
This may seem like a weird solution, but it will work. It is an easy fix but can be impractical. See here for an example and check out the page source. This is the idea:
d3.csv("https://path_to_your_csv", function(data) {console.log(data) });
Solution 3:
Start your own server, e.g. with Python.
Serving your files over HTTP this way avoids the browser's restrictions on local files. This may be a solution while you experiment with your code on your own machine; in many cases it is not the solution once you have users. This example will serve HTTP on port 8888 unless it is already taken:
python -m http.server 8888
python -m SimpleHTTPServer 8888 &
Open the (Chrome) browser address bar and type the address below. This will open index.html. If your page has a different name, type the path to that local HTML page.
localhost:8888
Solution 4:
Use localhost and CORS
You may be able to use localhost and CORS, but the approach is not user-friendly, because setting it up may not be so straightforward.
Solution 5:
Embed your data in the HTML file
I like this solution the most. Instead of loading your CSV, you can write a script that embeds your data directly in the HTML. This lets users use their favorite browser, and there are no security issues. This solution may not be so elegant, because your HTML file can grow very large depending on your data, but it will work. See here for an example and check out the page source.
Remove this line:
d3.csv("path_to_your_csv", function(data) { })
Replace with this:
var data = [
  $DATA_COMES_HERE$
];
You can't readily read local files, at least not in Chrome, and possibly not in other browsers either.
The simplest workaround is to include your JSON data in your script file, get rid of the d3.json call, and keep the code from the callback you passed to it.
Your code would then look like this:
json = { ... };
root = json;
root.x0 = h / 2;
root.y0 = 0;
...
I have used this:
d3.json("graph.json", function(error, xyz) {
if (error) throw error;
// the rest of my d3 graph code here
}
So you can refer to your JSON data by using the variable xyz, and graph.json is the name of my local JSON file.
Use the resource as a local variable:
var filename = {x0: 0, y0: 0};

// you can choose a different name for the function than json
d3.json = (x, cb) => cb.call(null, x);

d3.json(filename, function(json) {
  root = json;
  root.x0 = h / 2;
  root.y0 = 0;
});
// ...
}
After turning on Google Drive API access from the management console and getting my Client ID keys, I followed the sample code (using Python 2.7), and I am able to insert a folder, set the appropriate permissions (type=anyone, role=reader), and insert a text/html file into the new folder.
However, the JSON file resource objects I receive from executing insert on the drive service have no 'webViewLink' field! There are 'webContentLink' and 'selfLink' fields, but 'webViewLink', which is necessary for static HTML publishing, seems to be missing.
Most perplexing. If this feature hasn't been turned on yet, or if I need to configure my account settings to allow HTML publishing, please let me know. Any other help would be most appreciated ;)
The webViewLink property is only returned for public folders, not for the individual files inside such folders. You can use it as the base URL to construct links to your files.
The webViewLink file property can be retrieved by doing something like this:
$file = $service->files->get($file_id, array('fields' => 'webViewLink'));
$web_link_view = $file->getWebViewLink();
OR
$sheetsList = $drive_service->files->listFiles([
'fields' => 'files(id, name, webViewLink, webContentLink)',
]);
$web_link_view = $sheetsList->current()->getWebViewLink();
Pay attention: you should load the file specifying which fields you want it to bring (in this case, webViewLink). If you don't do that, only id and name will be available.
If you also need to configure file permissions, you can do something like:
$permissions = new \Google_Service_Drive_Permission();
$permissions->setRole('writer');
$permissions->setType('anyone');
$drive_service->permissions->create($file_id, $permissions);
Possible values for setRole() and setType() can be found here: https://developers.google.com/drive/api/v3/reference/permissions/create