Exposing a JavaScript object as a module in webpack - configuration

Inside webpack.config.js I have computed a JavaScript map that I'd like to import as a module in the browser. I could write the data to disk and then read it back with e.g. https://github.com/webpack/json-loader, but are there any more elegant ways to do this in-memory?
webpack.config.js:
var config = {
  key: 'data'
}
// do something with webpack loaders

some.file.that.is.executed.in.browser.js:
var config = require('config')
window.alert('Config is', config.key)

Webpack loaders are file preprocessors, so there needs to be a file that you import. So create a dummy file and use json-string-loader to override its contents.
First, create an empty config.json in your project.
In webpack.config.js:
var appConfig = {
  key: 'data'
}
...
loaders: [
  {
    test: /config\.json$/,
    loader: 'json-string-loader?json=' + JSON.stringify(appConfig)
  }
]
In browser:
var appConfig = require("./config.json");
// => returns {key: 'data'}

"I could write the data to disk and then read it back with e.g. https://github.com/webpack/json-loader, but are there any more elegant ways to do this in-memory?"
Why? You can just do whatever you need in config.js (instead of a static JSON file) and return the result:
var data = 'foo' + 'bar';
module.exports = {
  key: data
}
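For completeness, a sketch of another in-memory approach (an addition, not from the original answers): webpack's built-in DefinePlugin can inline the computed object at compile time, so no file on disk is needed at all.

// webpack.config.js -- a minimal sketch using DefinePlugin
var webpack = require('webpack');

var appConfig = {
  key: 'data'
};

module.exports = {
  // ...entry, output, loaders, etc.
  plugins: [
    // APP_CONFIG is a hypothetical global name; DefinePlugin performs a
    // compile-time text substitution, so the value must be pre-stringified.
    new webpack.DefinePlugin({
      APP_CONFIG: JSON.stringify(appConfig)
    })
  ]
};

Any occurrence of APP_CONFIG in browser code is then replaced with the stringified object at build time, so APP_CONFIG.key evaluates to 'data'.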

Related

node discord js export a function

I've been making a discord.js bot following the official guide.
I have all my commands in the /commands folder as advised.
Then I followed along to create a currency system with Sequelize, using this page from the same guide.
I have a balance.js file inside the commands folder, but when I'm calling it, it gives me this error:
TypeError: currency.getBalance is not a function
I've defined the function in my app.js file, but how can I export it (or use it) inside balance.js, which is called by app.js?
This is the function defined in the main file app.js:
Reflect.defineProperty(currency, 'getBalance', {
  value: function getBalance(id) {
    const user = currency.get(id);
    return user ? user.balance : 0;
  },
});
This is balance.js:
module.exports = {
  name: 'balance',
  description: 'Informs you about your balance.',
  cooldown: 10,
  guildOnly: true,
  aliases: ['bal', 'cur', 'gem', 'gems'],
  execute(message, args) {
    const Discord = require('discord.js');
    const { Users, CurrencyShop, UserItems, CardBase, UserCollec } = require('../dbObjects');
    const currency = require('../app.js')
    async () => {
      const storedBalances = await Users.findAll();
      storedBalances.forEach(b => currency.set(b.user_id, b));
      UserCollec.sync();
    }
    const target = message.author;
    return message.channel.send(`${target} has ${currency.getBalance(target.id)}<:Gem:756059891465977886>`);
  },
};
EDIT:
I've made some progress. I now understand that I have to import the currency variable, which is declared as a new Discord.Collection() in app.js.
I need to refer to this variable in a module, but the module doesn't seem to see it as a Collection. How do I import it?
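The thread leaves this unanswered, but one common pattern (a sketch, not from the original thread) is to move the collection into its own module, so both app.js and the command files can require it without a circular dependency:

// currency.js -- hypothetical shared module (assumes discord.js v12,
// where Collection is exported from the main package)
const Discord = require('discord.js');
const currency = new Discord.Collection();

Reflect.defineProperty(currency, 'getBalance', {
  value: function getBalance(id) {
    const user = currency.get(id);
    return user ? user.balance : 0;
  },
});

module.exports = currency;

Both app.js and balance.js can then use const currency = require('./currency.js') (adjusting the relative path), instead of balance.js requiring app.js back.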

Angular 8 - copy to clipboard a JSON Object

I have a JSON response from the backend which I'm displaying as {{ response | json }}. There's a copy-to-clipboard option where I need to copy the contents of response. I have the following code:
copy(response) {
  let val = response;
  const selBox = document.createElement('textarea');
  selBox.style.position = 'fixed';
  selBox.style.left = '0';
  selBox.style.top = '0';
  selBox.style.opacity = '0';
  selBox.value = val;
  document.body.appendChild(selBox);
  selBox.focus();
  selBox.select();
  document.execCommand('copy');
  document.body.removeChild(selBox);
}
This copies as [object Object], since response is an object. I can copy it by converting the response to a string with let val = JSON.stringify(response), but that does not copy it in the formatted way I display it; instead it copies the JSON on a single line, like a string. So how do I copy a JSON object to the clipboard in a properly formatted way?
There is a built-in Clipboard class in the Angular CDK that makes this a little easier to do. You should also use the space parameter of JSON.stringify.
First npm install the @angular/cdk package if you don't have it already.
In your @NgModule import ClipboardModule:
import { ClipboardModule } from '@angular/cdk/clipboard';

@NgModule({
  imports: [
    ClipboardModule
  ],
})
In your component's TypeScript file, import the Clipboard class:
import { Clipboard } from '@angular/cdk/clipboard';

@Component({ /* ... */ })
export class MyComponent {
  constructor(private clipboard: Clipboard) { }

  public copy() {
    // replace this object with your data
    const object = { abc: 'abc', xy: { x: '1', y: '2' } };
    // Note the parameters
    this.clipboard.copy(JSON.stringify(object, null, 2));
  }
}
In your component's template:
<button (click)="copy()">
  Copy Data
</button>
The result of stringifying { abc: 'abc', xy: { x: '1', y: '2' } } when pasted:
{
  "abc": "abc",
  "xy": {
    "x": "1",
    "y": "2"
  }
}
With reference to the answer linked by x4rf41, you can make JSON.stringify whitespace your JSON with let val = JSON.stringify(response, null, 2). If you want syntax highlighting, you can use user123444555621's function.
A much neater way to copy text is to add an event listener for the copy event and set the clipboardData dataTransfer object (note the copying flag, which must be declared in the enclosing scope):

let copying = false;

window.addEventListener('copy', (event) => {
  if (copying) {
    let val = JSON.stringify(response, null, 2);
    event.preventDefault(); // stop the browser overwriting the string
    event.clipboardData.setData("text/plain", val); // set the string with MIME type "text/plain"
    copying = false;
  }
});

copy = function () {
  copying = true;
  document.execCommand('copy');
};
If you are using the aforementioned syntax-highlighting function, you probably want to specify MIME type "text/html" instead. Hopefully the formatting options in the linked answer suit your needs.
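As an aside (an addition, not from the original answers): document.execCommand('copy') is deprecated, and in a secure context the asynchronous Clipboard API is simpler. A minimal sketch:

// a minimal sketch, assuming a secure context (HTTPS) and clipboard permission
navigator.clipboard.writeText(JSON.stringify(response, null, 2))
  .then(() => console.log('JSON copied to clipboard'))
  .catch((err) => console.error('Copy failed:', err));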

Resolving an ES6 module imported from a URL with Rollup

It is perfectly valid to import from a URL inside an ES6 module and as such I've been using this technique to reuse modules between microservices that sit on different hosts/ports:
import { authInstance } from "http://auth-microservice/js/authInstance.js"
I'm approaching a release cycle and have started down my usual path of bundling to IIFEs using Rollup. Rollup doesn't appear to support ES6 module imports from URLs; I think it should, as this is allowed in the spec :(
module-name
The module to import from. This is often a relative or absolute path name to the .js file containing the module. Certain bundlers may permit or require the use of the extension; check your environment. Only single quotes and double quotes Strings are allowed. (https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Statements/import)
I've dug through the interwebs for an hour now and have come up with nothing. Has anybody seen a resolver similar to rollup-plugin-node-resolve for resolving modules from URLs?
I had to move on from this quickly so ended up just writing a skeleton of a rollup plugin. I still feel that resolving absolute paths should be a core feature of rollup.
Updated snippet
We have been using this to transpile production code for several of our apps for a considerable amount of time now.
const fs = require('fs'),
  path = require('path'),
  axios = require("axios")

const createDir = dir => !fs.existsSync(dir) && fs.mkdirSync(dir)

const mirrorDirectoryPaths = async ({ cacheLocation, url }) => {
  createDir(cacheLocation)
  const dirs = [], scriptPath = url.replace(/:\/\/|:/g, "-")
  let currentDir = path.dirname(scriptPath)
  while (currentDir !== '.') {
    dirs.unshift(currentDir)
    currentDir = path.dirname(currentDir)
  }
  dirs.forEach(d => createDir(`${cacheLocation}${d}`))
  return `${cacheLocation}${scriptPath}`
}

const cacheIndex = {}
const writeToDiskCache = async ({ cacheLocation, url }) => {
  // Write a file to the local disk cache for rollup to pick up.
  // If the file already exists, use it instead of writing a new one.
  const cached = cacheIndex[url]
  if (cached) return cached
  const cacheFile = await mirrorDirectoryPaths({ cacheLocation, url }),
    data = (await axios.get(url).catch((e) => { console.log(url, e) })).data
  fs.writeFileSync(cacheFile, data)
  cacheIndex[url] = cacheFile
  return cacheFile
}

const urlPlugin = (options = {}) => {
  return {
    async resolveId(importee, importer) {
      // We are importing from a URL
      if (/^https?:\/\//.test(importee)) {
        return await writeToDiskCache({ cacheLocation: options.cacheLocation, url: importee })
      }
      // We are importing from a file within the cacheLocation (originally from a URL)
      // and need to continue the cached import chain.
      if (importer && importer.startsWith(options.cacheLocation) && /^\.\.?\//.test(importee)) {
        const importerUrl = Object.keys(cacheIndex).find(key => cacheIndex[key] === importer),
          importerPath = path.dirname(importerUrl),
          importeeUrl = path.normalize(`${importerPath}/${importee}`).replace(":\\", "://").replace(/\\/g, "/")
        return await writeToDiskCache({ cacheLocation: options.cacheLocation, url: importeeUrl })
      }
    }
  }
}
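For reference, a hypothetical rollup.config.js consuming the plugin above (assuming it is exported from a local file such as ./url-plugin.js; both names are made up):

// rollup.config.js -- hypothetical usage of the plugin sketched above
import urlPlugin from './url-plugin.js'; // assumed local path

export default {
  input: 'src/main.js',
  output: { file: 'dist/bundle.js', format: 'iife', name: 'app' },
  plugins: [
    urlPlugin({ cacheLocation: './.url-cache' }) // cache directory is an arbitrary choice
  ]
};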
https://github.com/mjackson/rollup-plugin-url-resolve
This plugin, together with the following config, works for me:
import typescript from "@rollup/plugin-typescript";
import urlResolve from "rollup-plugin-url-resolve";

export default {
  output: {
    format: "esm",
  },
  plugins: [
    typescript({ lib: ["es5", "es6", "dom"], target: "es5" }),
    urlResolve(),
  ],
};
You can obviously remove the TypeScript plugin if you don't need it.

Bulk upload/import to Firebase with auto generated keys

I need to upload bulk rows of data using the Firebase console's 'Import JSON' utility, but I could not find a way to enable auto-generated keys like -Kop90... for each row. Instead it created 0, 1, ... as keys.
This is my sample data
[
  {"date":1448323200,"description":"test data1","amount":1273},
  {"date":1448323200,"description":"25mm pipes","amount":2662}
]
I would like each row to get an auto-generated push key instead.
I had a similar problem to yours. I ended up finding a nice solution for Firestore using JS & Node: https://www.youtube.com/watch?v=Qg2_VFFcAI8&ab_channel=RetroPortalStudio
But since I needed it for the Realtime Database, I altered it slightly to suit my needs:
Pre-requisites:
JS SDK installation: https://firebase.google.com/docs/web/setup?authuser=0#add-sdk-and-initialize
Using SDK version 8 (namespaced) [Could be easily altered for use in v9]
Steps:
Create a folder called files in the root project directory.
Add all your JSON files to it.
Add the following code to a file (in this example the file is called "uploader.js"):
NOTE: The only thing missing from the below code is the firebaseConfig obj, you can get this obj by following this guide: https://support.google.com/firebase/answer/7015592#zippy=%2Cin-this-article
var firebase = require("firebase/app");
require("firebase/database");

const firebaseConfig = {
  // your config details...
};

firebase.initializeApp(firebaseConfig);
const database = firebase.database();

// File directory details:
const path = require("path");
const fs = require("fs");
const directoryPath = path.join(__dirname, "files");

fs.readdir(directoryPath, function(err, files) {
  if (err) {
    return console.log("Unable to scan directory: " + err);
  }
  files.forEach(function(file) {
    var lastDotIndex = file.lastIndexOf(".");
    var items = require("./files/" + file);
    var listRef = database.ref(`${file.substring(0, lastDotIndex)}/`);
    items.forEach(function(obj) {
      var postRef = listRef.push();
      postRef.set(obj)
        .then(function() {
          console.log("Document written");
        })
        .catch(function(error) {
          console.error("Error adding document: ", error);
        });
    });
  });
});
Lastly, open the terminal to the directory where uploader.js can be found & run:
node uploader.js
After running the operation, each file becomes a top-level node (named after the file), and all the contents of each file are listed under it with a unique push ID.
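Since the answer notes the code "could be easily altered for use in v9", here is a rough sketch of the equivalent write loop using the v9 modular API (an assumption on my part, not from the original answer):

// v9 modular sketch -- assumes firebase@9+; firebaseConfig, file and items
// refer to the same values as in the v8 version above
const { initializeApp } = require("firebase/app");
const { getDatabase, ref, push, set } = require("firebase/database");

const app = initializeApp(firebaseConfig);
const db = getDatabase(app);

// inside the per-file loop, replacing the v8 push/set calls:
const listRef = ref(db, file.substring(0, file.lastIndexOf(".")));
items.forEach((obj) => {
  set(push(listRef), obj) // push() mints the unique key, set() writes the row
    .then(() => console.log("Document written"))
    .catch((error) => console.error("Error adding document: ", error));
});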

Get a local json file on NativeScript

How do I load a big local JSON file?
I have tried this, but had no success:
var sa = require("./shared/resources/sa.json");
var array = new observableArrayModule.ObservableArray(sa);
Use the file-system module to read the file and then parse it with JSON.parse():
var fs = require('file-system');
var documents = fs.knownFolders.currentApp();
var jsonFile = documents.getFile('shared/resources/sa.json');
var array;
var jsonData;

jsonFile.readText()
  .then(function (content) {
    try {
      jsonData = JSON.parse(content);
      array = new observableArrayModule.ObservableArray(jsonData);
    } catch (err) {
      throw new Error('Could not parse JSON file');
    }
  }, function (error) {
    throw new Error('Could not read JSON file');
  });
Here's a real-life example of how I'm doing it in a NativeScript app, reading a JSON file that is 75 KB / 250,000 characters big.
TypeScript:
import { knownFolders } from "tns-core-modules/file-system";

export class Something {
  loadFile() {
    let appFolder = knownFolders.currentApp();
    let cfgFile = appFolder.getFile("config/config.json");
    console.log(cfgFile.readTextSync());
  }
}
As of TypeScript 2.9.x and above (NativeScript 5.x.x uses 3.1.1 and above), we can use the resolveJsonModule option in tsconfig.json. With this option, JSON files can be imported just like modules, and the code is simpler to use, read and maintain.
For example, we can do:
import config from "./config.json";
console.log(config.count); // 42
console.log(config.env); // "debug"
All we need to do is use TypeScript 2.9.x or above and enable the property in tsconfig.json:
// tsconfig.json
{
  "compilerOptions": {
    "module": "commonjs",
    "resolveJsonModule": true,
    "esModuleInterop": true
  }
}
A sample project demonstrating the above can be found here
I just wanted to add one more thing, which might be even easier. You can simply write the content of your JSON file into a data.js file (or whatever name you like) and export it as an array. Then you can just require the data.js module, as sketched below.
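A minimal sketch of that approach (file name and contents are hypothetical):

// data.js -- plain CommonJS module wrapping the JSON content
module.exports = [
  { id: 1, name: "first item" },
  { id: 2, name: "second item" }
];

// elsewhere in the app:
var data = require("./data.js");
var array = new observableArrayModule.ObservableArray(data);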