NativeScript cannot read JSON file

I have the following json:
.../src/app/assets/i18n/en.json
{
"TEST": "This is a some test data in a json file"
}
I have the following code:
const folderName = "assets/i18n/en.json";
knownFolders.currentApp().getFile(folderName).readText().then(a => console.log("json file: " + JSON.parse(a)));
It gives me the following error:
ERROR Error: Uncaught (in promise): SyntaxError: Unexpected end of JSON input
JS: SyntaxError: Unexpected end of JSON input
JS: at JSON.parse (<anonymous>)
I have tried:
Setting folderName to "/assets/i18n/en.json"
Rebuilding and reconnecting my testing phone
Using HTTP: this.http.get("~/app/assets/i18n/en.json").toPromise().then(res => console.log("http???", res)).catch(err => console.log("err:", err));
Printing the file object without parsing it (it's empty...)
But the error stays the same.
update1
It seems that the file, sadly, does not exist.
this code:
const folderName = "assets/i18n";
const fileName = "en.json";
console.log("exists?", File.exists(folderName + "/" + fileName));
returns false.
Even though the file is clearly there in the project files (see the picture provided).
(The code provided is in app.component.ts, in the constructor of AppComponent.)
What can be the problem here?
update2:
I updated my webpack.config.js to copy .json files:
new CopyWebpackPlugin([
    { from: { glob: "fonts/**" } },
    { from: { glob: "**/*.jpg" } },
    { from: { glob: "**/*.json" } },
    { from: { glob: "**/*.png" } },
], { ignore: [`${relative(appPath, appResourcesFullPath)}/**`] }),
But still no luck. The file still does not exist...
Update3:
This is getting ridiculous... the file can be imported as a standard JSON file, but the NativeScript lib still does not see it.
import { File } from "tns-core-modules/file-system";
import config from "./assets/i18n/hu.json";
....
const folderName = "./assets/i18n";
const fileName = "hu.json";
console.log("config:", config.test)
console.log("exists?", File.exists(folderName + "/" + fileName));
this produces the following output:
JS: config: This is a translated line
JS: exists? false

AFAIK, the path must be split; you can't request a file with a relative path directly.
const folderName = "assets/i18n";
const fileName = "en.json";
console.log(
knownFolders.currentApp().getFolder(folderName).getFile(fileName).readTextSync(),
);

I faced a similar situation while working on an app; the issue was that the file permissions were not granted.
Another way around it, which worked perfectly without needing any permission, was to fetch the JSON from a URL and work through it.
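A minimal sketch of that approach with the http module from tns-core-modules (the URL is a placeholder; point it at wherever you host the file):
import { getJSON } from "tns-core-modules/http";

// Placeholder URL -- replace with wherever the JSON is hosted.
getJSON("https://example.com/i18n/en.json")
    .then((json: any) => console.log("TEST:", json["TEST"]))
    .catch(err => console.log("err:", err));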

Related

CSV File Processing with NestJS and Papa Parse

I am trying to process a CSV file in NestJS using Multer and Papa Parse. I do not want to store the file locally. I just want to parse CSV files to extract some information.
However, I am unable to process it; I have tried two different ways. In the first one, I passed the file buffer to the Papa.parse function. However, I get the error: ReferenceError: FileReaderSync is not defined
@Post('1')
@UseInterceptors(
    FileInterceptor('file', {})
)
async uploadFile(@UploadedFile() file: Express.Multer.File) {
    const csvData = papa.parse(file.buffer, {
        header: false,
        worker: true,
        delimiter: ",",
        step: function (row) {
            console.log("Row: ", row.data);
        }
    });
}
So I tried calling readFileSync() as shown below, but this time I got the error: ERROR [ExceptionsHandler] ENAMETOOLONG: name too long, open
@Post('2')
@UseInterceptors(
    FileInterceptor('file', {})
)
async uploadFile(@UploadedFile() file: Express.Multer.File) {
    const $file = readFileSync(file.buffer);
    const csvData = papa.parse($file, {
        header: false,
        worker: true,
        delimiter: ",",
        step: function (row) {
            console.log("Row: ", row.data);
        }
    });
}
I will appreciate any help to resolve this issue.
As pointed out by @skink, the file buffer needs to be converted to a stream before it can be used by Papa Parse.
const { Readable } = require('stream');
And I updated the function, converting file.buffer to a stream before calling parse():
@Post('1')
@UseInterceptors(
    FileInterceptor('file', {})
)
async uploadFile(@UploadedFile() file: Express.Multer.File) {
    const stream = Readable.from(file.buffer);
    const csvData = papa.parse(stream, {
        header: false,
        worker: true,
        delimiter: ",",
        step: function (row) {
            console.log("Row: ", row.data);
        }
    });
}

Bulk upload/import to Firebase with auto generated keys

I need to upload bulk rows of data using the Firebase console's 'Import JSON' utility, but I could not find a way to enable auto-generated keys like -Kop90... for each row. Instead, it created 0, 1, ... keys.
This is my sample data
[
    {"date":1448323200,"description":"test data1","amount":1273},
    {"date":1448323200,"description":"25mm pipes","amount":2662}
]
I would like to generate something like this
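Roughly this shape, keyed by Firebase push IDs (the IDs below are made up for illustration):
{
    "-Kop9AbCdEfGhIjKlMn": {"date":1448323200,"description":"test data1","amount":1273},
    "-Kop9OpQrStUvWxYz01": {"date":1448323200,"description":"25mm pipes","amount":2662}
}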
I had a similar problem to yours, and I ended up finding a nice solution for Firestore using JS & Node: https://www.youtube.com/watch?v=Qg2_VFFcAI8&ab_channel=RetroPortalStudio
But since I needed it for the Realtime Database, I just altered it slightly to suit my needs:
Pre-requisites:
JS SDK installation: https://firebase.google.com/docs/web/setup?authuser=0#add-sdk-and-initialize
Using SDK version 8 (namespaced) [Could be easily altered for use in v9]
Steps:
Create a folder called files in the root project directory:
Add all your json files as shown below:
Add the following code to a file (in this example the file is called "uploader.js"):
NOTE: The only thing missing from the below code is the firebaseConfig obj, you can get this obj by following this guide: https://support.google.com/firebase/answer/7015592#zippy=%2Cin-this-article
var firebase = require("firebase/app");
require("firebase/database");

const firebaseConfig = {
    // your config details...
};

firebase.initializeApp(firebaseConfig);
const database = firebase.database();

// File directory details:
const path = require("path");
const fs = require("fs");
const directoryPath = path.join(__dirname, "files");

fs.readdir(directoryPath, function(err, files) {
    if (err) {
        return console.log("Unable to scan directory: " + err);
    }
    files.forEach(function(file) {
        var lastDotIndex = file.lastIndexOf(".");
        var items = require("./files/" + file);
        var listRef = database.ref(`${file.substring(0, lastDotIndex)}/`);
        items.forEach(function(obj) {
            var postRef = listRef.push();
            postRef.set(obj)
                .then(function() {
                    console.log("Document written");
                })
                .catch(function(error) {
                    console.error("Error adding document: ", error);
                });
        });
    });
});
Lastly, open the terminal to the directory where uploader.js can be found & run:
node uploader.js
After running the operation, each file becomes a collection, and all the contents of each file are listed with a unique pushId:

Error while running stubby4node using Gulp

I am trying to set up Stubby Server in my JavaScript environment and I am getting the error below.
The relevant part of my Gulpfile:
gulp.task('stubby', function(cb) {
    var options = {
        callback: function (server, options) {
            server.get(1, function (err, endpoint) {
                if (!err)
                    console.log(endpoint);
            });
        },
        stubs: 8000,
        tls: 8443,
        admin: 8010,
        files: [
            '*.*'
        ]
    };
    stubby(options, cb);
});
The error:
[12:15:03] Starting 'stubby'...
[12:15:03] 'stubby' errored after 17 ms
[12:15:03] Error: Missing error message
at new PluginError (C:\Users\admin\IdeaProjects\myproject\node_modules\gulp-util\lib\PluginError.js:73:28)
at readJSON (C:\Users\admin\IdeaProjects\myproject\node_modules\gulp-stubby-server\index.js:90:15)
at C:\Users\admin\IdeaProjects\myproject\node_modules\gulp-stubby-server\index.js:149:24
at Array.map (native)
at stubbyPlugin (C:\Users\admin\IdeaProjects\myproject\node_modules\gulp-stubby-server\index.js:136:12)
at Gulp.<anonymous> (C:\Users\admin\IdeaProjects\myproject\gulpfile.js:54:5)
at module.exports (C:\Users\admin\IdeaProjects\myproject\node_modules\orchestrator\lib\runTask.js:34:7)
at Gulp.Orchestrator._runTask (C:\Users\admin\IdeaProjects\myproject\node_modules\orchestrator\index.js:273:3)
at Gulp.Orchestrator._runStep (C:\Users\admin\IdeaProjects\myproject\node_modules\orchestrator\index.js:214:10)
at Gulp.Orchestrator.start (C:\Users\admin\IdeaProjects\myproject\node_modules\orchestrator\index.js:134:8)
Searching the gulp-stubby-server codebase for PluginError yields the following snippet:
function readJSON(filepath, options) {
    var src = fs.readFileSync(filepath, options),
        result;
    if (!options.mute) {
        gutil.log(gutil.colors.yellow('Parsing ' + filepath + '...'));
    }
    try {
        result = JSON.parse(src);
        return result;
    } catch (e) {
        throw new gutil.PluginError(PLUGIN_NAME, 'Unable to parse "' + filepath + '" file (' + e.message + ').', e);
    }
}
— Source on GitHub
You can tell this is the likely culprit because of the stack trace you see, where the PluginError is coming from readJSON.
The issue
Take note of the catch block. This is caused by one of the files matching your glob (*.*) not being a valid JSON file.
To fix
Ensure you are using the newest version of gulp-stubby-server
Ensure that you are using the correct glob (that is, do you really mean *.*; see the sketch after this list)
Ensure that all the files in the current working directory are valid JSON files
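For instance, a minimal sketch that narrows the glob so only JSON stub files are matched (the mocks/ directory name is an assumption for illustration):
gulp.task('stubby', function(cb) {
    var options = {
        stubs: 8000,
        tls: 8443,
        admin: 8010,
        // Match only the JSON stub definitions, not every file in the project.
        files: [
            'mocks/*.json'
        ]
    };
    stubby(options, cb);
});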

xls-to-json not working in Node.js

I want to convert a .xls file to JSON format, and I have used the xls-to-json module for this.
When I used the xlsx-to-json module it worked fine, but I don't want to read .xlsx files. xls-to-json gives me an error:
TypeError: Cannot set property length of [object Object] which has only a getter.
I am unable to find the error. Is there any other module to convert a .xls file to JSON?
Here is my code:
var node_xj = require("xls-to-json");
app.get('/file',function(req,res){
node_xj({
input: 'file.xls', // input xls
output: "output.json", // output json
sheet: "sheetname", // specific sheetname
}, function(err, result) {
if(err) {
console.error(err);
} else {
console.log(result);
}
});
});
That package supports only the newer xlsx format used by MS Excel.
The easiest option would be to save the file as a comma-delimited CSV file (the format is available in most software) and use a CSV-to-JSON converter.
There's a nice one here - https://www.npmjs.com/package/csv
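A minimal sketch of that route, assuming the sheet was saved as file.csv with a header row (the csv package re-exports csv-parse):
var csv = require('csv');
var fs = require('fs');

fs.readFile('file.csv', 'utf8', function(err, data) {
    if (err) throw err;
    // columns: true turns each row into an object keyed by the header row.
    csv.parse(data, { columns: true }, function(err, records) {
        if (err) throw err;
        console.log(JSON.stringify(records, null, 2));
    });
});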
You can use the path module to get the file extension, and if the extension matches, execute the parsing code:
var node_xj = require("xls-to-json");
var path = require('path');

app.get('/file', function(req, res) {
    // Give file name with extension, e.g. file.xls
    if (path.extname('file.xls') === '.xls') {
        node_xj({
            input: 'file.xls',      // input xls
            output: "output.json",  // output json
            sheet: "sheetname",     // specific sheetname
        }, function(err, result) {
            if (err) {
                console.error(err);
            } else {
                console.log(result);
            }
        });
    }
});
Update
This is not an issue with this module; the issue comes from the third-party dependency xlsjs.
Here is the opened issue where you can see the updates.
You can use this module; note that I am the author.
The xls-to-json-lc package has been merged into the xlsx-to-json-lc package. Just use xlsx-to-json-lc for both types of extensions and it will work fine.
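Assuming the merged package keeps the same callback API as xls-to-json shown above (an assumption; check the package README), usage would look like:
var node_xj = require("xlsx-to-json-lc");

node_xj({
    input: 'file.xls',      // assumed to work for both .xls and .xlsx
    output: "output.json",
    sheet: "sheetname",
}, function(err, result) {
    if (err) {
        console.error(err);
    } else {
        console.log(result);
    }
});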

Get a local json file on NativeScript

How can I read a big local JSON file?
I have tried this, but had no success:
var sa = require("./shared/resources/sa.json");
var array = new observableArrayModule.ObservableArray(sa);
Use the file-system module to read the file and then parse it with JSON.parse():
var fs = require('file-system');

var documents = fs.knownFolders.currentApp();
var jsonFile = documents.getFile('shared/resources/sa.json');
var array;
var jsonData;

jsonFile.readText()
    .then(function (content) {
        try {
            jsonData = JSON.parse(content);
            array = new observableArrayModule.ObservableArray(jsonData);
        } catch (err) {
            throw new Error('Could not parse JSON file');
        }
    }, function (error) {
        throw new Error('Could not read JSON file');
    });
Here's a real-life example of how I'm doing it in a NativeScript app to read a 75 kB / 250,000-character JSON file.
TypeScript:
import { knownFolders } from "tns-core-modules/file-system";

export class Something {
    loadFile() {
        let appFolder = knownFolders.currentApp();
        let cfgFile = appFolder.getFile("config/config.json");
        console.log(cfgFile.readTextSync());
    }
}
As of TypeScript version 2.9.x and above (NativeScript 5.x.x uses version 3.1.1 and above), we can use the resolveJsonModule option in tsconfig.json. With this option, JSON files can be imported just as modules, and the code is simpler to use, read and maintain.
For example, we can do:
import config from "./config.json";
console.log(config.count); // 42
console.log(config.env); // "debug"
All we need to do is use TypeScript 2.9.x or above and enable the property in tsconfig.json:
// tsconfig.json
{
    "compilerOptions": {
        "module": "commonjs",
        "resolveJsonModule": true,
        "esModuleInterop": true
    }
}
A sample project demonstrating the above can be found here
I just wanted to add one more thing, which might be even easier: you can simply write the content of your JSON file into a data.js file (or whatever name you would like to use) and export it as an array. Then you can just require the data.js module.
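A minimal sketch of that idea (the file name and contents are just for illustration):
// data.js
module.exports = [
    { id: 1, name: "first item" },
    { id: 2, name: "second item" }
];

// elsewhere in the app
var data = require("./data.js");
var array = new observableArrayModule.ObservableArray(data);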