Can't get xlsx-to-json converter to work properly in Node/Express

I am using the package below to try to convert uploaded excel files (.xlsx) to JSON files on my Express web application:
https://www.npmjs.com/package/xlsx-to-json
So here is my form for the user to upload:
form(id="form1", action="/upload", method="post", enctype="multipart/form-data")
    input(type="file", id="control", name="XLupload")
    br
    input(type="submit", value="Upload", name="Submit")
and here is my routing for the upload back in my main express (app.js) file:
var multer = require('multer');
var xlsxj = require('xlsx-to-json');

var upload = multer({ dest: './uploads' });
var excel_upload = upload.single('XLupload');

app.post('/upload', excel_upload, function(req, res) {
    var fileObject = req.file;
    var filePath = fileObject.path;
    /*** This is what the file Object looks like when uploaded:
    { fieldname: 'XLupload',
      originalname: 'testing.xlsx',
      encoding: '7bit',
      mimetype: 'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet',
      destination: './uploads',
      filename: 'c1d55ea7d1f6fccc7e3d3d2764db8881',
      path: 'uploads\\c1d55ea7d1f6fccc7e3d3d2764db8881',
      size: 8013 }
    ***/
    xlsxj({
        input: String(filePath),
        output: "output.json"
    }, function(err, result) {
        if (err) {
            console.log(err);
        } else {
            console.log(result);
        }
    });
});
Anyway, to put it shortly, the uploads themselves seem to work fine; the files land in the /uploads folder in the directory. However, the JSON file that I get back from the xlsxj converter is empty and I'm not sure why. I made a small test xlsx file with some words in random cells and it still gave me back an empty
[]
in output.json. Can anybody let me know what I am doing wrong?

You can try the XLSX library (https://github.com/SheetJS/js-xlsx) and add this code after getting the worksheet:
var roa = XLSX.utils.sheet_to_row_object_array(worksheet);

Related

How to use a Javascript file to refresh/reload a div from an HTML file?

I am using Node JS and have a JS file which opens a connection to an API, works with the receiving API data and then saves the changed data into a JSON file. Next I have an HTML file, which takes the data from the JSON file and puts it into a table. At the end I open the HTML file in my browser to look at the visualized table and its data.
What I would like to happen is that the table (or, more specifically, a DIV with an ID inside the table) in the HTML file refreshes itself when the JSON data gets updated by the JS file. Kinda like a "live table/website" that I can watch change over time without the need to press F5.
Instead of just opening the HTML locally, I have tried it by using the JS file and creating a connection with the file like this:
const http = require('http');
const path = require('path');
const fs = require('fs');

const browser = http.createServer(function (request, response) {
    var filePath = '.' + request.url;
    if (filePath == './') {
        filePath = './Table.html';
    }
    var extname = String(path.extname(filePath)).toLowerCase();
    var mimeTypes = {
        '.html': 'text/html',
        '.css': 'text/css',
        '.png': 'image/png',
        '.js': 'text/javascript',
        '.json': 'application/json'
    };
    var contentType = mimeTypes[extname] || 'application/octet-stream';
    fs.readFile(filePath, function(error, content) {
        response.writeHead(200, { 'Content-Type': contentType });
        response.end(content, 'utf-8');
    });
}).listen(3000);
This creates a working connection and I am able to see it in the browser, but sadly it doesn't update itself like I wish. I thought about some kind of function, which gets called right after the JSON file got saved and tells the div to reload itself.
I also read about something like window.onload, location.load() or getElementById(), but I am not able to figure out the right way.
What can I do?
Thank you.
Websockets!
Though they might sound scary, it's very easy to get started with websockets in NodeJS, especially if you use Socket.io.
You will need two dependencies in your node application:
"socket.io": "^4.1.3",
"socketio-wildcard": "^2.0.0"
your HTML File:
<script type="module" src="https://cdnjs.cloudflare.com/ajax/libs/socket.io/4.0.0/socket.io.js"></script>
Your CLIENT SIDE JavaScript file:
var socket = io();
socket.on("update", function (data) { // "update" can be any string; treat it like an event name
    console.log(data);
    // the rest of the code to update the html
});
your NODE JS file:
import { Server } from "socket.io";
// other code...
let io = new Server(server);
let activeConnections = {};
io.sockets.on("connection", function (socket) {
    // 'connection' is a "magic" key
    // track the active connections
    activeConnections[socket.id] = socket;
    socket.on("disconnect", function () {
        /* Not required, but you can add special handling here to prevent errors */
        delete activeConnections[socket.id];
    });
    socket.on("update", (data) => {
        // "update" is any sort of key
        console.log(data);
    });
});
// Example with Express
app.get('/some/api/call', function (req, res) {
    var data = {}; // your API processing here
    // look each socket up by its id; the keys themselves can't emit
    Object.keys(activeConnections).forEach(function (id) {
        activeConnections[id].emit('update', data);
    });
    res.send(data);
});
Finally, some shameless self-promotion: here's one of my "dead" side projects using websockets, because I'm sure I forgot some small detail, and this might help. https://github.com/Nhawdge/robert-quest

Bulk upload/import to Firebase with auto generated keys

I need to upload bulk rows of data using the Firebase console's 'Import JSON' utility, but I could not find a way to enable auto-generated keys like -Kop90... for each row. Instead it created 0, 1, ... keys.
This is my sample data
[
{"date":1448323200,"description":"test data1","amount":1273},
{"date":1448323200,"description":"25mm pipes","amount":2662}
]
I would like to generate something like this
I had a similar problem to yours,
I ended up finding a nice solution for the firestore using JS & Node: https://www.youtube.com/watch?v=Qg2_VFFcAI8&ab_channel=RetroPortalStudio
But since I needed it for the Realtime Database I just altered it slightly to suit my needs:
Pre-requisites:
JS SDK installation: https://firebase.google.com/docs/web/setup?authuser=0#add-sdk-and-initialize
Using SDK version 8 (namespaced) [Could be easily altered for use in v9]
Steps:
Create a folder called files in the root project directory and add all your JSON files to it.
Add the following code to a file (in this example the file is called "uploader.js"):
NOTE: The only thing missing from the below code is the firebaseConfig obj, you can get this obj by following this guide: https://support.google.com/firebase/answer/7015592#zippy=%2Cin-this-article
var firebase = require("firebase/app");
require("firebase/database");
const firebaseConfig = {
// your config details...
};
firebase.initializeApp(firebaseConfig);
const database = firebase.database();
// File directory details:
const path = require("path");
const fs = require("fs");
const directoryPath = path.join(__dirname, "files");
fs.readdir(directoryPath, function(err, files) {
    if (err) {
        return console.log("Unable to scan directory: " + err);
    }
    files.forEach(function(file) {
        var lastDotIndex = file.lastIndexOf(".");
        var items = require("./files/" + file);
        // one list per file, named after the file (minus its extension)
        var listRef = database.ref(`${file.substring(0, lastDotIndex)}/`);
        items.forEach(function(obj) {
            var postRef = listRef.push(); // push() generates the unique key
            postRef.set(obj)
                .then(function() {
                    console.log("Document written");
                })
                .catch(function(error) {
                    console.error("Error adding document: ", error);
                });
        });
    });
});
Lastly, open the terminal to the directory where uploader.js can be found & run:
node uploader.js
After running the operation, each file becomes a collection, and all the contents of each file are listed with a unique pushId:

CSV to JSON using NodeJS

I'd like to convert a CSV file to a JSON object using NodeJS. The problem is that my CSV file is hosted on a special URL.
URL : My CSV here
var fs = require("fs");
var Converter = require("csvtojson").Converter;

var fileStream = fs.createReadStream("myurl");
var converter = new Converter({ constructResult: true });
converter.on("end_parsed", function (jsonObj) {
    console.log(jsonObj);
});
fileStream.pipe(converter);
Issue :
Error: ENOENT, open 'C:\xampp\htdocs\GestionDettes\http:\www.banque-france.fr\fileadmin\user_upload\banque_de_france\Economie_et_Statistiques\Changes_et_Taux\page3_quot.csv'
at Error (native)
Edit #1 :
var Request = require('request');

Request.get(myurl, function (error, response, body) {
    var converter = new Converter({
        constructResult: true,
        delimiter: ';'
    });
    converter.fromString(body, function(err, taux) {
        console.log(taux); // it works
    });
});
I did just that in a module that reads and writes different protocols and data formats; I used request to get HTTP resources.
If you want, take a look at alinex-datastore. With this module it should work like:
const DataStore = require('@alinex/datastore').default;

async function transform() {
    const ds = new DataStore('http://any-server.org/my.csv');
    await ds.load();
    await ds.save('file:/etc/my-app.json');
}
That should do it.

AngularJS File Upload to Backend Express Server

I am trying to do a file upload using angularjs, using angular-file-upload library (https://github.com/danialfarid/angular-file-upload)
Here is my code
// ===============================My HTML File===========================
<input type="file" ng-file-select="onFileSelect($files)">
// ===============================My Controller==========================
$scope.formObj = {
    name: "Test"
};
var fileToUpload;
$scope.onFileSelect = function (file) {
    fileToUpload = file[0];
};

// POST request to /api/items
$scope.addItem = function() {
    console.log($scope.formObj);
    $scope.upload = $upload.upload({
        url: '/api/items',
        method: 'POST',
        data: { myObj: $scope.formObj },
        file: fileToUpload
    }).success(function(data, status, headers, config) {
        console.log("success");
    });
};
// ================================My Backend=============================
// This is the function that will receive POST request to /api/items
exports.create = function(req, res) {
    console.log(req.body); // req.body is just an empty object. ==> {}
    // apparently, I found all the data to be in req._readableState.buffer[0]
    // in the form of a buffer
    var buffer = req._readableState.buffer[0];
    // trying to console.log the buffer.toString, resulting in something similar to this
    // { name: "Test", image: Object }
    console.log(buffer.toString());
    return res.send(200);
};
So my backend received the formObj with all its properties and values; however, the actual file data itself, whether in the form of a buffer, or base64, or whatever, never gets received.
I wonder why. This is my first time working with file uploading, so I don't understand the concept.
Please point me in the right direction
If you are using the latest version of Express, you'll notice that
app.use(express.multipart()); is no longer bundled with Express.
So make the following configuration changes in express.js:
var multer = require('multer');
app.use(multer({ dest: './uploads/' }));
After doing this, you will find the form data and the file in req.body and req.file respectively.
Hope it helps

ravendb upload documents error

I am trying to save attachments in ravenDb. I am getting a file not found error.
MVC View:
<input type="file" name="file" id="Ids2" style="float:right"/>
Over an Ajax call, I am passing the value of the file name selected in the above control to the controller method, which in turn sends the file name to a custom method called "Upload".
public virtual string Upload(string fileName)
{
    IDocumentSession session = GetCurrentDocumentSession();
    var id = "upload/" + randomGen();
    session.Advanced.DatabaseCommands.PutAttachment(id, null,
        File.ReadAllBytes(fileName), optionalMetaData);
    return id;
}
I am getting C:\ProgramFiles (x86)....does not have the file specified.
Let's say in the view I browsed to C:/Doc1.txt and clicked the Add button, which saves a bunch of other fields on the view and also picks up the file name/path from the file upload control.
I get an error at the session.Advanced.DatabaseCommands... line:
Could not find file 'C:\Program Files (x86)\Common Files\Microsoft Shared\DevServer\10.0\Doc1.txt'.
If I manually move the Doc1.txt file to the above location, RavenDB saves the attachment and I can see it from localhost:8080/static/upload/keyvalue.
How can I make RavenDB take the file from the location the user selects, and not from what looks like a default location of C:\Program Files...?
EDIT:
function () {
    var iFile = iContainer.find('#Ids2').val();
    var DataToSave = {
        'Attachment': iFile
    };
    var encodedData = $.toJSON(DataToSave);
    $.ajax({
        type: 'POST',
        url: '/AttController/Attach',
        data: encodedData,
        contentType: 'application/json; charset=utf-8',
        success: function (rc) {
            if (rc.Success) {
                // more javascript reroutes..business logic
            }
            else {
                alert(rc.Message);
            }
        },
        error: function (xhr, ajaxOptions, thrownError) {
            alert('Error attaching \n' + xhr.response);
        }
    });
};
Depending on the browser, the HTML file control does not store the full path to the file. If you use Chrome and debug the script,
var iFile = iContainer.find('#Ids2').val();
will return something like C:\fakepath\yourfile.txt, whereas with IE the full path is returned.
Also, in your Ajax you are not pushing the bytes of the file but only the filename, which means that unless you are only ever going to run this website in a browser on the web server, the chances of the file being in the same place as the web server are slim.
If you are trying to upload a file via Ajax to an MVC controller, I would suggest uploadify.
$("#Ids2").uploadify({
    uploader: '/AttController/Attach',
    swf: 'your/path/to/uploadify.swf',
    cancelImg: 'your/path/to/cancel.jpg',
    buttonText: 'Select File',
    fileSizeLimit: '300KB',
    fileTypeDesc: 'Image Files',
    fileTypeExts: '*.gif; *.jpg; *.png',
    auto: 'true',
    multiple: 'false',
    onError: function(type, info) {
    },
    onUploadSuccess: function(file, data, response) {
    }
});
Then just change your controller action to
public virtual ActionResult Upload(HttpPostedFileBase FileData)
FileData will have properties like FileName and also exposes the uploaded file's bytes as an input stream.