How do I auto-generate .ts files in VSCode?

I have a .json file that changes quite often, and I would like to auto-generate a .ts file every time the .json file is saved.
I want to write the logic of the .json-to-.ts converter myself.
How can this be achieved with VSCode?

Depending on your use case this may not be the best approach, but there is a plugin called Run on Save that executes a command when you save a file. If you have a script in TypeScript or JavaScript that converts the .json to .ts, you can execute it with the Node.js runtime.
A sample config could look like:
"emeraldwalk.runonsave": {
"commands": [
{
"match": "file.json$",
"cmd": "ts-node export.ts ${file}"
},
]
}
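Here the match value is a regular expression tested against the saved file's path (the dot is escaped so it matches literally), and ${file} expands to the path of the file that was just saved, which is how the export script can pick it up via process.argv.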
Your export.ts script runs in the Node.js runtime, so you have access to the fs module, which lets you read and write files on disk. A sample could look like:
var fs = require('fs');

// process.argv[2] is the path passed in by Run on Save (${file})
fs.readFile(process.argv[2], function (err, data) {
  if (err) throw err;
  // conversion would be here
  let jsonPayload = data;
  // writeFile replaces the output on every save; appendFile would keep growing the file
  fs.writeFile('newTsFile.ts', jsonPayload, function (err) {
    if (err) throw err;
    console.log('Updated!');
  });
});
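As a minimal sketch of the conversion step itself (the generated.ts file name and the data export name are placeholders, not part of the original answer):

var fs = require('fs');

// Read the saved JSON file passed in by Run on Save
var json = fs.readFileSync(process.argv[2], 'utf8');

// Wrap the JSON payload in an export statement so it becomes a valid .ts module
var body = 'export const data = ' + json.trim() + ';\n';
fs.writeFileSync('generated.ts', body);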

fs.readFileSync cannot find file when deploying with lambda

In my code I am reading a JSON file from my lambda function:
let featured_json_data = JSON.parse(fs.readFileSync('data/jsons/featured.json'))
This works locally because featured.json is in the directory I am reading from. However, when I deploy with serverless, the zip it generates doesn't include those files, and I get:
ENOENT: no such file or directory, open...
I tried packaging by adding
package:
  include:
    - data/jsons/featured.json
but it just doesn't work. The only way I get this to work is manually adding the json file and then changing my compiled handler.js code to read from the json file in the root directory.
In this screenshot I have to add the jsons, manually upload again, and change the directory in the compiled handler.js code to exclude the data/jsons prefix.
I want to actually handle this in my serverless.yml.
You can load JSON files using require().
const featured_json_data = require('./featured.json')
Or better yet, convert your JSON into JS!
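For instance, a hypothetical featured.js module exporting the same data can be required like any other source file, and bundlers such as webpack follow require() calls, so it always ends up in the deployment package:

// featured.js - the same payload as featured.json, as a plain JS module
module.exports = {
  featured: [] // ...your data here
};

// in the handler
const featured_json_data = require('./featured');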
For working with non-JSON files, I found that process.cwd() works for me in most cases. For example:
const fs = require('fs');
const path = require('path');

export default async (event, context, callback) => {
  try {
    console.log('cwd path', process.cwd());
    const html = fs.readFileSync(
      path.resolve(process.cwd(), './html/index.html'),
      'utf-8'
    );
    const response = {
      statusCode: 200,
      headers: {
        'Content-Type': 'text/html'
      },
      body: html
    };
    callback(null, response);
  } catch (err) {
    console.log(err);
    callback(err); // report the failure instead of silently swallowing it
  }
};
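Note: in the Lambda runtime, process.cwd() is typically the task root (/var/task), where the deployment zip is unpacked, while __dirname points at the directory of the compiled file; with a bundler in play the two can differ, which is why cwd-relative paths tend to be more predictable here.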
I recommend looking at copy-webpack-plugin: https://github.com/webpack-contrib/copy-webpack-plugin
You can use it to package other files to include with your Lambda deployment.
In my project, I had a bunch of files in a /templates directory. The webpack.config.js that packages up these templates looks like this for me:
const CopyWebpackPlugin = require('copy-webpack-plugin');

module.exports = {
  plugins: [
    new CopyWebpackPlugin([
      './templates/*'
    ])
  ]
};
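Note that this array form is the pre-v6 API; in copy-webpack-plugin v6 and later the options object takes a patterns array instead, i.e. new CopyWebpackPlugin({ patterns: ['./templates/*'] }).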
Check what the current working directory is and what the target directory contains in the deployed environment, and add code to your program/script that logs this information.

Json File Creation in nodeJS

I need to create a JSON file in the format below in Node.js, and I also need to traverse to a particular position and add values, if any. I have done it in Java, but have no idea how in Node.js. It would be helpful if someone could help me. Thanks.
If my understanding is correct, you are using Protractor to automate AngularJS tests and writing the results as JSON for reporting / for parsing back once done?
If that's the case, you could simply store the results as an object in JavaScript and write it out using the fs node package.
Writing the JSON Report file
var fs = require('fs');

//.. Protractor .. and other logic ..
var results = { Project: "xxx", .... };
//... more of that assignment ...

// You could use JSON.parse() or JSON.stringify() to read/convert this object to string and vice versa, and the fs package to read/write it to file.
fs.writeFile("/tmp/testreport.json", JSON.stringify(results), function (err) {
  if (err) {
    return console.log(err);
  }
  console.log("The test report file was saved as JSON file!");
});
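If you also need to read the report back in and add values at a particular position, a sketch (file name reused from above; the runs key is just an example, not part of the original answer):

var fs = require('fs');

// Read the report back and parse it into a plain JS object
var report = JSON.parse(fs.readFileSync('/tmp/testreport.json', 'utf8'));

// Traverse to a position and add a value
report.runs = report.runs || [];
report.runs.push({ status: 'passed' });

// The third argument pretty-prints with 2-space indentation
fs.writeFileSync('/tmp/testreport.json', JSON.stringify(report, null, 2));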

Parse and convert xls file (received from GET request to URL) to JSON without writing to disk

The title says everything.
I want to get an xls file from a third-party server (the service keeps fueling records, and they do not expose any kind of API, only the Excel file).
Then I want to parse that file with a library like node-excel-to-json and convert it into JSON I can use to import the data into Mongo.
I want to manipulate the file in memory, without writing it to disk.
So, say I am getting the file with this code,
parseFuelingReport() {
  let http = require('http');
  let fs = require('fs');
  // let excel2Json = require('node-excel-to-json');
  let file = fs.createWriteStream("document.xls");
  let request = http.get("http://www.everydayexcel.com/files/Excel_Test_Basic_1_cumulative_sum.xls", function (response) {
  });
},
I want to load the response in memory and parse it with something like
excel2Json(/* this is supposed to be the path to the xls file */, {
  'convert_all_sheet': false,
  'return_type': 'File',
  'sheetName': 'survey'
}, function (err, output) {
  console.log('err, res', err, output);
});
I assume you are using https://github.com/kashifeqbal/node-excel-to-json, which is available as node package.
If you take a look at its source, you can see two things:
It calls XLSX.readFile(filePath), which loads a file from disk; that is hard to call with an in-memory object.
Internally it uses the XLSX package, most likely this one: https://www.npmjs.com/package/xlsx
The XLSX API is not as convenient as excel2Json, but it provides a read() function that takes a JavaScript object:
/* Call XLSX */
var workbook = XLSX.read(bstr, {type:"binary"});
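Putting the pieces together, here is a sketch of fetching the workbook into memory and parsing it directly with the xlsx package (URL taken from the question; the first-sheet choice and the sheet_to_json output shape are my assumptions):

let http = require('http');
let XLSX = require('xlsx');

http.get("http://www.everydayexcel.com/files/Excel_Test_Basic_1_cumulative_sum.xls", function (response) {
  let chunks = [];
  response.on('data', function (chunk) { chunks.push(chunk); });
  response.on('end', function () {
    // Parse the workbook straight from the in-memory buffer - no file on disk
    let workbook = XLSX.read(Buffer.concat(chunks), { type: 'buffer' });
    let sheet = workbook.Sheets[workbook.SheetNames[0]];
    // Convert the sheet to an array of row objects, ready for Mongo
    console.log(XLSX.utils.sheet_to_json(sheet));
  });
});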
Hope this helps

Running the json-2-csv npm module using a json file saved to desktop as input

I am trying to use the json-2-csv package from npm in order to convert a JSON file I have saved to my desktop.
However, I can't understand how to do it, because I am not originally a programmer.
I have installed the package with
npm install json-2-csv
but after this step I can't understand how to import my JSON file.
If anyone could help me with the following steps, or with screenshots, it would be very useful for me.
Thank you in advance.
Put your JSON file in the root directory of your Node.js application and rename it to data.json.
Then create an app.js file in the same directory with this code:
var converter = require('json-2-csv');
var fs = require('fs');

var jsonData = require('./data.json');

var json2csvCallback = function (err, csv) {
  if (err) throw err;
  fs.writeFile("./data.csv", csv, function (err) {
    if (err) throw err;
    console.log("data.csv file has been saved.");
  });
};

converter.json2csv(jsonData, json2csvCallback);
It requires the data.json file from the root app directory (this will work only with valid JSON; otherwise you will find an error description in the console). If all is ok, it will convert the JSON to CSV using the json-2-csv module and save a data.csv file with the result, still in the root app directory.
Then just run it with: node app.js
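Note that the callback-style json2csv shown above matches older releases of json-2-csv; newer releases of the package are promise-based, so check the README of the version you actually installed before copying this verbatim.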

node.js read file in, parse to JSON and output

I'm very new to anything much code-related, but I'm on a slow and sometimes painful learning curve.
I have a file containing some JSON, which I read into node.js, parse, and push to a web socket. The script works fine, but I only ever get one JSON object returned.
devices.json (complete file): not every element has the same data contents, and there can be any number of element objects within a data source.
{
  "element":"SENS01",
  "data":{
    "type":"SEN",
    "descr":"T100"
  },
  "element":"SENS02",
  "data":{
    "type":"SEN",
    "descr":"T088",
    "uid":"X1A1AA",
    "check_on":"2014/06/29"
  },
  "element":"SENS03",
  "data":{
    "type":"SEN",
    "descr":"T000:",
    "uid":"X1A1AB",
    "check_on":"2014/06/29"
  },
  "element":"LED1",
  "data":{
    "type":"LED",
    "state":"0"
  }
}
The code which does the work is below.
server.js:
var app = require('http').createServer(handler),
    io = require('socket.io').listen(app),
    fs = require('fs');

// creating the server ( localhost:8000 )
app.listen(8000);

// Server started - load page.
function handler(req, res) {
  fs.readFile('/var/www/html/dashboard.html', function (err, data) {
    if (err) {
      console.log(err);
      res.writeHead(500);
      return res.end('Error loading web page');
    }
    res.writeHead(200);
    res.end(data);
  });
}

// creating a new websocket.
io.sockets.on('connection', function (socket) {
  console.log();
  // 1st READ of json state file.
  fs.readFile('devices.json', 'utf8', function (err, data) {
    if (err) throw err;
    // Parse/check it's valid json.
    var dump = JSON.parse(data);
    socket.volatile.emit('MAP.room1', dump);
  });
});
When I connect to the socket the following is sent (as logged from the server console)
debug - websocket writing 5:::{"name":"MAP.room1","args":[{"element":"LED1","data":{"type":"LED","state":"0"}}]}
I never get any of the other objects, only this one. I've had a look round the net for how to iterate over objects, but it was all largely meaningless to me :(
What I am trying to achieve is that when you connect to the web socket, every object from the devices.json file is pushed out, one object at a time. So once this is working I would expect to see:
debug - websocket writing 5:::{"name":"MAP.room1","args":[{"element":"LED1","data":{"type":"LED","state":"0"}}]}
debug - websocket writing 5:::{"name":"MAP.room1","args":[{"element":"SENS03","data":{"type":"SEN","descr":"T000:","uid":"X1A1AB","check_on":"2014/06/29"}}]} etc...
If I put a console.log(data) line in my server.js then I see the entire file as expected. It's only once it's been parsed that I am left with the one entry.
Can anyone please explain what's going on, and how I can best overcome this? It needs to be explained in a really simple way, ideally using my own code/dataset as examples, so I can understand 'what this means for me'. A lot of the web examples and stuff I read tend to use different examples, which just confuses me. I know the basics of declaring variables etc., and have extremely limited experience with Ruby from a simple script to parse some push data received from an API, but that's about it.
If you need any more context etc then please let me know, otherwise any help gratefully received.
I think your problem is that you're using the same keys in your JSON. When the parser reads that JSON in, it continuously overwrites previous values of element and data, and since those are the only unique key names, those are the only two values you see.
If you modified your JSON so that the same key names are not used on the same "level," then you would see all of the data you are expecting. Here's an example that makes it easy to iterate through each element:
[
  {
    "element":"SENS01",
    "data":{
      "type":"SEN",
      "descr":"T100"
    }
  },
  {
    "element":"SENS02",
    "data":{
      "type":"SEN",
      "descr":"T088",
      "uid":"X1A1AA",
      "check_on":"2014/06/29"
    }
  },
  {
    "element":"SENS03",
    "data":{
      "type":"SEN",
      "descr":"T000:",
      "uid":"X1A1AB",
      "check_on":"2014/06/29"
    }
  },
  {
    "element":"LED1",
    "data":{
      "type":"LED",
      "state":"0"
    }
  }
]
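With the array form, the connection handler from server.js above could then emit one message per element; a sketch reusing the names from the question:

fs.readFile('devices.json', 'utf8', function (err, data) {
  if (err) throw err;
  var devices = JSON.parse(data); // now an array of {element, data} objects
  devices.forEach(function (device) {
    // one websocket message per device
    socket.volatile.emit('MAP.room1', device);
  });
});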
Or if you can guarantee that the element values are always unique, then perhaps you could also do this:
{
  "SENS01":{
    "type":"SEN",
    "descr":"T100"
  },
  "SENS02":{
    "type":"SEN",
    "descr":"T088",
    "uid":"X1A1AA",
    "check_on":"2014/06/29"
  },
  "SENS03":{
    "type":"SEN",
    "descr":"T000:",
    "uid":"X1A1AB",
    "check_on":"2014/06/29"
  },
  "LED1":{
    "type":"LED",
    "state":"0"
  }
}
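With this keyed form you would iterate over the keys instead; a sketch (again reusing the question's names, and rebuilding the {element, data} shape for the emit):

var dump = JSON.parse(data);
Object.keys(dump).forEach(function (name) {
  socket.volatile.emit('MAP.room1', { element: name, data: dump[name] });
});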
Ok, so I found out my data was actually JS objects represented as below, in a flat format with every object separated by a linefeed.
{"SENS01":{"type":"SEN","descr":"T100"}}
{"element":"LED1","data":{"type":"LED","state":"0"}}
Using linereader (from npm) I was able to read the file in by doing;
lineReader.eachLine('simple.txt', function (line) {
  var dump = JSON.parse(line);
  socket.emit('MAP.room1', dump);
});
That then output the required data from the web socket.
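For reference, this one-object-per-line layout is often called NDJSON (newline-delimited JSON); each line has to be a complete, valid JSON document on its own for JSON.parse(line) to succeed.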