I need to upload rows of data in bulk using the Firebase console's 'Import JSON' utility, but I could not find a way to get auto-generated keys like -Kop90... for each row. Instead it created the keys 0, 1, and so on.
This is my sample data
[
{"date":1448323200,"description":"test data1","amount":1273},
{"date":1448323200,"description":"25mm pipes","amount":2662}
]
I would like to generate something like this
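Something along these lines, where each row sits under an auto-generated push key (the node name and keys below are made up for illustration):
"expenses": {
    "-Kop90abc123XyZ": { "date": 1448323200, "description": "test data1", "amount": 1273 },
    "-Kop91def456UvW": { "date": 1448323200, "description": "25mm pipes", "amount": 2662 }
}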
I had a similar problem to yours, and I ended up finding a nice solution for Firestore using JS & Node: https://www.youtube.com/watch?v=Qg2_VFFcAI8&ab_channel=RetroPortalStudio
But since I needed it for the Realtime Database, I just altered it slightly to suit my needs:
Pre-requisites:
JS SDK installation: https://firebase.google.com/docs/web/setup?authuser=0#add-sdk-and-initialize
Using SDK version 8 (namespaced) [Could be easily altered for use in v9]
Steps:
Create a folder called files in the root project directory:
Add all your JSON files to it, as shown below:
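For illustration, the layout might end up looking something like this (the JSON file names are just examples):
project-root/
├── files/
│   ├── expenses.json
│   └── products.json
└── uploader.js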
Add the following code to a file (in this example the file is called "uploader.js"):
NOTE: The only thing missing from the code below is the firebaseConfig object; you can get it by following this guide: https://support.google.com/firebase/answer/7015592#zippy=%2Cin-this-article
var firebase = require("firebase/app");
require("firebase/database");

const firebaseConfig = {
  // your config details...
};

firebase.initializeApp(firebaseConfig);
const database = firebase.database();

// File directory details:
const path = require("path");
const fs = require("fs");
const directoryPath = path.join(__dirname, "files");

fs.readdir(directoryPath, function (err, files) {
  if (err) {
    return console.log("Unable to scan directory: " + err);
  }
  files.forEach(function (file) {
    // Use the file name (without its extension) as the top-level node
    var lastDotIndex = file.lastIndexOf(".");
    var items = require("./files/" + file);
    var listRef = database.ref(`${file.substring(0, lastDotIndex)}/`);
    // Push each object so it gets an auto-generated key
    items.forEach(function (obj) {
      var postRef = listRef.push();
      postRef.set(obj)
        .then(function () {
          console.log("Document written");
        })
        .catch(function (error) {
          console.error("Error adding document: ", error);
        });
    });
  });
});
Lastly, open the terminal to the directory where uploader.js can be found & run:
node uploader.js
After running the operation, each file becomes a collection, and all the contents of each file are listed with a unique push ID.
I am using Node JS and have a JS file which opens a connection to an API, works with the received API data and then saves the changed data into a JSON file. Next, I have an HTML file which takes the data from the JSON file and puts it into a table. At the end I open the HTML file in my browser to look at the visualized table and its data.
What I would like to happen is that the table (or, more specifically, a DIV with an ID inside the table) in the HTML file refreshes itself when the JSON data gets updated by the JS file, kind of like a "live table/website" that I can watch change over time without having to press F5.
Instead of just opening the HTML file locally, I have tried serving it from the JS file like this:
const http = require('http');
const path = require('path');
const fs = require('fs'); // needed for fs.readFile below

const browser = http.createServer(function (request, response) {
    var filePath = '.' + request.url;
    if (filePath == './') {
        filePath = './Table.html';
    }
    var extname = String(path.extname(filePath)).toLowerCase();
    var mimeTypes = {
        '.html': 'text/html',
        '.css': 'text/css',
        '.png': 'image/png',
        '.js': 'text/javascript',
        '.json': 'application/json'
    };
    var contentType = mimeTypes[extname] || 'application/octet-stream';
    fs.readFile(filePath, function (error, content) {
        response.writeHead(200, { 'Content-Type': contentType });
        response.end(content, 'utf-8');
    });
}).listen(3000);
This creates a working connection and I can see the page in the browser, but sadly it doesn't update itself the way I want. I thought about some kind of function that gets called right after the JSON file is saved and tells the div to reload itself.
I also read about things like window.onload, location.reload() or getElementById(), but I can't figure out the right way.
What can I do?
Thank you.
Websockets!
Though they might sound scary, it's very easy to get started with websockets in NodeJS, especially if you use Socket.io.
You will need two dependencies in your node application:
"socket.io": "^4.1.3",
"socketio-wildcard": "^2.0.0"
your HTML File:
<script type="module" src="https://cdnjs.cloudflare.com/ajax/libs/socket.io/4.0.0/socket.io.js"></script>
Your CLIENT SIDE JavaScript file:
var socket = io();
socket.on("update", function (data) { //update can be any sort of string, treat it like an event name
    console.log(data);
    // the rest of the code to update the html
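    // For example (hypothetical element id and data shape, not part of the original answer):
    // document.getElementById("liveTable").textContent = JSON.stringify(data);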
})
your NODE JS file:
import { Server } from "socket.io";
// other code...

let io = new Server(server);
let activeConnections = {};

io.sockets.on("connection", function (socket) {
    // 'connection' is a "magic" key
    // track the active connections
    activeConnections[socket.id] = socket;

    socket.on("disconnect", function () {
        /* Not required, but you can add special handling here to prevent errors */
        delete activeConnections[socket.id];
    })

    socket.on("update", (data) => {
        // Update is any sort of key
        console.log(data)
    })
})

// Example with Express
app.get('/some/api/call', function (req, res) {
    var data = // your API Processing here
    // Each value in activeConnections is a socket, so emit on the value, not the key
    Object.values(activeConnections).forEach((conn) => {
        conn.emit('update', data)
    })
    res.send(data);
})
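In your case, the same emit could be triggered right after your JS file writes the JSON file, so the browser table updates without a refresh. A rough sketch, assuming the data is written with Node's fs module and that io/activeConnections come from the server code above (the file name and function name are placeholders):
const fs = require("fs");

function saveAndBroadcast(updatedData) {
    // Write the refreshed API data to disk as before...
    fs.writeFile("table-data.json", JSON.stringify(updatedData), (err) => {
        if (err) return console.error(err);
        // ...then tell every connected page to re-render its table
        Object.values(activeConnections).forEach((conn) => {
            conn.emit("update", updatedData);
        });
    });
}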
Finally, shameless self-promotion: here's one of my "dead" side projects using websockets, because I'm sure I forgot some small detail and this might help. https://github.com/Nhawdge/robert-quest
Is there a recommended way to handle data from a data file in Protractor scripts?
If I want to keep all the test data (like login details and user input values) in a separate data file, what type of file should I use and how should I import it into my Protractor scripts?
If you need to work with JSON, then:
Suppose your JSON for the username and password of a login page looks like this:
Example of JSON:
[
    {
        "username": "kishan",
        "password": "patel"
    }
]
Then you can simply import this to your code and access it as below.
describe('Login Page Data Driven', function() {
    browser.ignoreSynchronization = true;

    beforeEach(function() {
        browser.get('your url');
        browser.driver.manage().window().maximize();
    });

    it('To verify Login, using Data Driven Technique from Json file', function() {
        var testData = require('D:/json path'); // this is the path where your json is stored
        var user = element(by.id("username"));
        var password = element(by.id("password"));
        user.sendKeys(testData[0].username);
        password.sendKeys(testData[0].password);
    });
});
This is just an example; I hope you can relate it to your case and apply it.
Try it at your end and let me know if you have any more concerns.
I typically create a separate data file and require it as needed in my specs. I have a working example on my github protractor-examples repo. Here's the gist:
// userData.js
var UserData = {
    testUser : {'username': 'test', 'password': 'test'},
};
module.exports = UserData;
then in my spec...
// nonAngularLoginSpec.js
var userData = require('./userData.js'); // pull in the shared test data

it('should goto friend pages on successful login', function() {
    loginPage.loginAs(userData.testUser);
    expect(friendPage.at()).toBeTruthy();
});
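For context, loginPage and friendPage are page objects from that repo; a minimal sketch of the pattern (the selectors and file name here are assumptions, not the repo's actual code) could look like:
// loginPage.js (illustrative only)
var LoginPage = function () {
    this.loginAs = function (user) {
        element(by.id('username')).sendKeys(user.username);
        element(by.id('password')).sendKeys(user.password);
        element(by.css('[type="submit"]')).click();
    };
};
module.exports = new LoginPage();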
I'd like to convert a CSV file to a JSON object using NodeJS. The problem is that my CSV file is hosted on a special URL.
URL : My CSV here
var fs = require("fs");
var Converter = require("csvtojson").Converter;
var fileStream = fs.createReadStream("myurl");
var converter = new Converter({constructResult:true});
converter.on("end_parsed", function (jsonObj) {
console.log(jsonObj);
});
fileStream.pipe(converter);
Issue :
Error: ENOENT, open 'C:\xampp\htdocs\GestionDettes\http:\www.banque-france.fr\fileadmin\user_upload\banque_de_france\Economie_et_Statistiques\Changes_et_Taux\page3_quot.csv'
at Error (native)
Edit #1 :
var Request = require('request'); // HTTP client used to fetch the remote CSV

Request.get(myurl, function (error, Response, body) {
    var converter = new Converter({
        constructResult: true,
        delimiter: ';'
    });
    converter.fromString(body, function (err, taux) {
        console.log(taux); // it works
    });
});
I did just that in a module that reads and writes over different protocols in different data formats. I used request to get HTTP resources.
If you want, take a look at alinex-datastore. With this module it should work like:
const DataStore = require('@alinex/datastore').default;

async function transform() {
    const ds = new DataStore('http://any-server.org/my.csv');
    await ds.load();
    await ds.save('file:/etc/my-app.json');
}
That should do it.
How can I load a big local JSON data file?
I have tried this, but I had no success:
var sa = require("./shared/resources/sa.json");
var array = new observableArrayModule.ObservableArray(sa);
Use the file-system module to read the file and then parse it with JSON.parse():
var fs = require('file-system');
var documents = fs.knownFolders.currentApp();
var jsonFile = documents.getFile('shared/resources/sa.json');
var array;
var jsonData;

jsonFile.readText()
    .then(function (content) {
        try {
            jsonData = JSON.parse(content);
            array = new observableArrayModule.ObservableArray(jsonData);
        } catch (err) {
            throw new Error('Could not parse JSON file');
        }
    }, function (error) {
        throw new Error('Could not read JSON file');
    });
Here's a real-life example of how I'm doing it in a NativeScript app to read a JSON file of about 75 KB / 250,000 characters.
TypeScript:
import { knownFolders } from "tns-core-modules/file-system";

export class Something {
    loadFile() {
        let appFolder = knownFolders.currentApp();
        let cfgFile = appFolder.getFile("config/config.json");
        console.log(cfgFile.readTextSync());
    }
}
As of TypeScript version 2.9.x and above (NativeScript 5.x.x uses TypeScript 3.1.1 and above), we can use the resolveJsonModule option in tsconfig.json. With this option, JSON files can be imported just like modules, and the code is simpler to use, read and maintain.
For example, we can do:
import config from "./config.json";
console.log(config.count); // 42
console.log(config.env); // "debug"
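This assumes a config.json along these lines, matching the values shown in the comments above:
{
    "count": 42,
    "env": "debug"
}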
All we need to do is use TypeScript 2.9.x or above and enable the property in tsconfig.json:
// tsconfig.json
{
    "compilerOptions": {
        "module": "commonjs",
        "resolveJsonModule": true,
        "esModuleInterop": true
    }
}
A sample project demonstrating the above can be found here
I just wanted to add one more thing, which might be even easier. You can simply write the content of your JSON file in a data.js file, or whatever name you would like to use, and export it as an array. Then you can just require the data.js module.
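A minimal sketch of that idea (the file name, location and contents are just examples):
// shared/resources/data.js
module.exports = [
    { "id": 1, "name": "first item" },
    { "id": 2, "name": "second item" }
];

// somewhere else in the app (path relative to the calling file)
var data = require("./shared/resources/data.js");
console.log(data.length); // 2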
I am using the package below to try to convert uploaded Excel files (.xlsx) to JSON files in my Express web application:
https://www.npmjs.com/package/xlsx-to-json
So here is my form for the user to upload:
form(id = "form1", action="/upload", method="post", enctype="multipart/form-data")
input(type="file", id="control", name="XLupload")
br
input(type="submit" value="Upload" name="Submit")
and here is my routing for the upload back in my main express (app.js) file:
var multer = require('multer');
var xlsxj = require("xlsx-to-json"); // the converter used below
var upload = multer({dest: './uploads'});
var excel_upload = upload.single('XLupload');

app.post('/upload', excel_upload, function(req, res) {
    var fileObject = req.file;
    var filePath = fileObject.path;
    /*** This is what the file Object looks like when uploaded:
    { fieldname: 'XLupload',
      originalname: 'testing.xlsx',
      encoding: '7bit',
      mimetype: 'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet',
      destination: './uploads',
      filename: 'c1d55ea7d1f6fccc7e3d3d2764db8881',
      path: 'uploads\\c1d55ea7d1f6fccc7e3d3d2764db8881',
      size: 8013 }
    ***/
    xlsxj({
        input: String(filePath),
        output: "output.json"
    }, function(err, result) {
        if (err) {
            console.log(err);
        } else {
            console.log(result);
        }
    });
});
Anyways, to put it shortly, the uploads seem to work fine; that is, the files are uploaded to the /uploads folder in the directory. However, the JSON file that I get back from the xlsxj converter is empty and I'm not sure why. I made a small test xlsx file with some words in random cells and it still gave me back an empty
[]
in output.json. Can anybody tell me what I am doing wrong?
You can try to use the XLSX library (https://github.com/SheetJS/js-xlsx) and add this code after you get the worksheet:
var roa = XLSX.utils.sheet_to_row_object_array(worksheet);
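For context, a rough sketch of how you might get that worksheet from the uploaded file (using the first sheet and the filePath variable from the question above; both choices are assumptions):
var XLSX = require("xlsx");

// Read the uploaded workbook, take its first sheet,
// and convert the rows to an array of objects
var workbook = XLSX.readFile(String(filePath));
var firstSheetName = workbook.SheetNames[0];
var worksheet = workbook.Sheets[firstSheetName];
var roa = XLSX.utils.sheet_to_row_object_array(worksheet);
console.log(roa);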