How to add multiple files to the IPFS Mutable File System without revealing the content to the IPFS network

I need to be able to add a bunch of files to a common directory using the mutable file system from ipfs.files.* without revealing this content to the IPFS network until it is needed.
This is the code I have been using to test:
const onDrop = useCallback(async (acceptedFiles) => {
  const test = JSON.stringify({ test: "test" });
  const testFile = new File([test], "test.json");
  try {
    for (let i = 0; i < acceptedFiles.length; i++) {
      console.log(acceptedFiles[i].path);
      // write each dropped file into MFS under its original path
      await ipfs.files.write(acceptedFiles[i].path, acceptedFiles[i], { create: true, parents: true });
    }
    await ipfs.files.write("/test.json", [testFile], { create: true });
  } catch (e) {
    console.log(e);
  }
  console.log((await ipfs.files.stat("/")).cid.toV1().toString());
}, []);
This throws me the error:
ReferenceError: global is not defined
at toAsyncIterator (to-async-iterator.js?6924:44:1)
at eval (write.js?db08:99:37)
at eval (create-lock.js?5096:33:1)
at async mfsWrite (write.js?db08:98:1)
caused by the ipfs.files.write() statement in the loop.
I initially added content through the ipfs.addAll() function, but this did not seem to work for adding it to the global MFS directory.
I think I am following the documentation correctly, but I might be trying to use MFS for something it was not intended for. If anyone knows the best way to combine many files into a directory without revealing the content to the IPFS network, please let me know.
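As an aside on the ReferenceError itself: "global is not defined" typically appears when a browser bundle references Node's global, which webpack 5 no longer shims automatically. A minimal sketch of one common workaround, assuming a webpack 5 based build (the shim must run before any ipfs code is imported):
// Entry-point shim (sketch): alias Node's `global` to the browser's
// global object so bundled references like `global.X` resolve.
if (typeof global === 'undefined') {
  window.global = window;
}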

Related

Beginner problem with chrome navigator.serial

I am working on use of serial port access with chrome browser, using "navigator.serial".
My initial experiment is based on a prior posting to stackoverflow:
Is there an example site that uses navigator.serial?
I have duplicated the code example referenced above, and have made the required configuration change #enable-experimental-web-platform-features, again as described above.
I am doing this all on Ubuntu 18.04. There are two USB serial ports attached to the machine, and I have verified using gtkterm that I can send and receive data between the two ports.
From the example given (code duplicated below), I find that I can open the serial port and establish a "reader", and the step await reader.read() does wait until an incoming character appears on the serial port, but at this point the variable/object "data" remains undefined.
Two questions/issues:
What am I doing wrong that leaves "data" undefined? I added an alert() dialog box that pops up once const {done, data} = await reader.read(); proceeds; however, the dialog box says that "data" is at that point undefined. Is data a promise that I am failing to wait to be fulfilled?
I have not been able to find a (hopefully self-contained) reference on the methods and members of the classes involved (i.e., reader.read() and reader.write() are methods available to my object "reader"). Where can I find a list of the available methods and their properties?
Here is a copy of the code (small web page) that was obtained from the year-ago posting above:
<html>
<script>
  var port;
  var buffy = new ArrayBuffer(1);
  var writer;
  buffy[0] = 10;

  const test = async function () {
    const requestOptions = {
      // Filter on devices with the Arduino USB vendor ID.
      //filters: [{ vendorId: 0x2341 }],
    };
    // Request an Arduino from the user.
    port = await navigator.serial.requestPort(requestOptions);
    // Open and begin reading.
    await port.open({ baudrate: 115200 });
    //const reader = port.in.getReader();
    const reader = port.readable.getReader();
    writer = port.writable.getWriter();
    //const writer = port.writable.getWriter();
    //writer.write(buffy);
    while (true) {
      const {done, data} = await reader.read();
      if (done) break;
      console.log(data);
    }
  } // end of function
</script>
<button onclick="test()">Click It</button>
</html>
Thank you for any assistance!
I was having the exact same problem and managed to solve it.
Change
const {done, data} = await reader.read();
To
const {value, done} = await reader.read();
The example where you got this from (and a few others) was wrong: the destructured names were the wrong way around.
Also note that const {data, done} = await reader.read(); does not work either: destructuring picks properties by name, not by position, and the result object only has value and done properties, so data is always undefined no matter where you put it.
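For completeness, here is a minimal corrected read loop (a sketch; the TextDecoder step assumes the device sends text):
// read() resolves with { value, done }; value is a Uint8Array of raw bytes.
const decoder = new TextDecoder();
while (true) {
  const { value, done } = await reader.read();
  if (done) break; // the stream has been closed or the lock released
  console.log(decoder.decode(value, { stream: true }));
}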
Documentation on navigator.serial is not great. Here are some links to help
The API (note this is draft and does not exactly match the Chrome implementation)
https://wicg.github.io/serial/
port.readable is a ReadableStream
https://streams.spec.whatwg.org/#readablestream
and its getReader() returns a ReadableStreamDefaultReader, whose read() result is defined as
dictionary ReadableStreamDefaultReadResult {
  any value;
  boolean done;
};
https://streams.spec.whatwg.org/#readablestreamdefaultreader
An explainer
https://github.com/WICG/serial/blob/gh-pages/EXPLAINER.md
A tutorial
https://codelabs.developers.google.com/codelabs/web-serial
Chromium tracker
https://goo.gle/fugu-api-tracker
The Web Serial API work item
https://bugs.chromium.org/p/chromium/issues/detail?id=884928

Problem with Firebase Image Resize extension [duplicate]

I am following a tutorial to resize images via Cloud Functions on upload and am experiencing two major issues which I can't figure out:
1) If a PNG is uploaded, it generates the correctly sized thumbnails, but their preview won't load in Firebase Storage (the loading spinner shows indefinitely). It only shows the image after I click on "Generate new access token" (none of the generated thumbnails have an access token initially).
2) If a JPEG or any other format is uploaded, the MIME type shows as "application/octet-stream". I'm not sure how to extract the extension correctly to put into the filename of the newly generated thumbnails?
// imports assumed from the standard setup for this tutorial
import * as functions from 'firebase-functions';
import { Storage } from '@google-cloud/storage';
import { tmpdir } from 'os';
import { dirname, join } from 'path';
import * as fs from 'fs-extra';
import * as sharp from 'sharp';

const gcs = new Storage();

export const generateThumbs = functions.storage
  .object()
  .onFinalize(async object => {
    const bucket = gcs.bucket(object.bucket);
    const filePath = object.name;
    const fileName = filePath.split('/').pop();
    const bucketDir = dirname(filePath);
    const workingDir = join(tmpdir(), 'thumbs');
    const tmpFilePath = join(workingDir, 'source.png');

    if (fileName.includes('thumb#') || !object.contentType.includes('image')) {
      console.log('exiting function');
      return false;
    }

    // 1. Ensure thumbnail dir exists
    await fs.ensureDir(workingDir);

    // 2. Download Source File
    await bucket.file(filePath).download({
      destination: tmpFilePath
    });

    // 3. Resize the images and define an array of upload promises
    const sizes = [64, 128, 256];
    const uploadPromises = sizes.map(async size => {
      const thumbName = `thumb#${size}_${fileName}`;
      const thumbPath = join(workingDir, thumbName);

      // Resize source image
      await sharp(tmpFilePath)
        .resize(size, size)
        .toFile(thumbPath);

      // Upload to GCS
      return bucket.upload(thumbPath, {
        destination: join(bucketDir, thumbName)
      });
    });

    // 4. Run the upload operations
    await Promise.all(uploadPromises);

    // 5. Cleanup remove the tmp/thumbs from the filesystem
    return fs.remove(workingDir);
  });
Would greatly appreciate any feedback!
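On issue 2, one hedged sketch (not from the answers below): derive the extension from the file name rather than the MIME type, and pass contentType through on upload so GCS doesn't store the thumbnails as application/octet-stream:
import { extname, basename } from 'path';

// Hypothetical tweak to the upload step above: fileName already carries
// the extension (e.g. 'photo.jpg'), so take it from the name.
const ext = extname(fileName);                   // '.jpg'
const base = basename(fileName, ext);            // 'photo'
const thumbName = `thumb#${size}_${base}${ext}`; // 'thumb#64_photo.jpg'

// Pass contentType through so GCS keeps the source MIME type.
return bucket.upload(thumbPath, {
  destination: join(bucketDir, thumbName),
  metadata: { contentType: object.contentType },
});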
I just had the same problem. For some reason, Firebase's Resize Images extension purposely removes the download token from the resized image.
To disable deleting Download Access Tokens:
go to https://console.cloud.google.com
select Cloud Functions from the left menu
select ext-storage-resize-images-generateResizedImage
click EDIT
in the Inline Editor, go to the file functions/lib/index.js
add // before this line: delete metadata.metadata.firebaseStorageDownloadTokens;
comment out the same line in functions/src/index.ts too
press DEPLOY and wait until it finishes
Note: both the original and the resized image will have the same token.
I just started using the extension myself. I noticed that I can't access the image preview from the Firebase console until I click on "create access token".
I guess that you have to create this token programmatically before the image is available.
I hope it helps
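A hedged sketch of minting such a token yourself (assumes the uuid package; bucket and thumbFilePath are hypothetical context variables):
const { v4: uuidv4 } = require('uuid');

// Attach a fresh download token to the resized file's metadata.
await bucket.file(thumbFilePath).setMetadata({
  metadata: { firebaseStorageDownloadTokens: uuidv4() },
});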
November 2020
In connection to the answer above: I can't seem to find ext-storage-resize-images-generateResizedImage in GCP Cloud Functions.
A better way to do it is to reuse the original file's firebaseStorageDownloadTokens.
This is how I did mine:
functions
  .storage
  .object()
  .onFinalize((object) => {
    // some image optimization code here

    // get the original file's access token
    const downloadtoken = object.metadata?.firebaseStorageDownloadTokens;

    return bucket.upload(tempLocalFile, {
      destination: file,
      metadata: {
        metadata: {
          optimized: true, // other custom flags
          firebaseStorageDownloadTokens: downloadtoken, // access token
        },
      },
    });
  });

Passing JSON Config into multiple instances of the same bundled React Application

I have several instances of a react-slick carousel. Each of them requires a different set of config options.
Currently, I have the carousel component bundled up via webpack and then deployed to multiple locations. Unfortunately, the bundle is slightly different in each case, because baking in the config file changes the overall bundle.
I feel like I can think of the following solutions:
1) Load the config file asynchronously. Seems like a lazy solution, because making an extra round trip is overkill.
2) Try to use require.ensure to split out the config file into its own chunk.
What's the right approach for this solution?
Thanks!
To reply to point 1: I've managed to accomplish runtime loading of the config this way:
import xhr from 'xhr'

class Config {
  load_external_config = (cb) => {
    xhr.get("config.json", {
      sync: true, // blocking request, so config is ready before first use
      timeout: 3000
    }, (error, response, body) => {
      if (response.statusCode === 200) {
        try {
          const conf = JSON.parse(body);
          // copy every config entry onto the singleton instance
          for (var i in conf) {
            this[i] = conf[i];
          }
        } catch (e) {
          /* Manage error */
        }
      } else {
        /* Manage error */
      }
    })
  }
}

export let config = new Config();
The class above has two key traits. On the one hand, it is a "singleton": every time you import it in a file of your project, the instance remains the same and will not be duplicated. On the other hand, through an XHR package it loads (synchronously) an external JSON file and puts every config entry on its instance as a first-level attribute. Later, you will be able to do this:
import { config } from './config'
config.load_external_config();
config.MY_VAR
For point 2 I would like to see some examples, and I will stay tuned to this post for someone more skilled than me.
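For what it's worth, a sketch of point 2 using webpack's dynamic import() (the modern replacement for require.ensure); the magic comment puts config.json in its own chunk, and initCarousel is a hypothetical consumer:
// Loads the config as a separate chunk, fetched at runtime,
// so the main bundle stays identical across deployments.
import(/* webpackChunkName: "config" */ './config.json')
  .then(({ default: config }) => {
    initCarousel(config); // hypothetical: wire the config into react-slick
  });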

How to load and use environment-related values in config phase

I would like to deploy my web application to several environments. Using Continuous Integration I can run a task to generate a config.json for a particular environment. This file will contain, among others, the particular URLs to use for it.
{
  "baseUrl": "http://www.myapp.es/",
  "baseApiUrl": "http://api.myapp.es/",
  "baseAuthUrl": "http://api.myapp.es/auth/"
}
The issue comes up when I try to set up my different services through providers in the config phase. Of course, services are not available yet in that phase, so I cannot use $http to load that JSON file and set my providers correctly.
Basically I would like to do something like:
function config($authProvider) {
  $authProvider.baseUrl = config.baseAuthUrl;
}
Is there a way to load those values at runtime from a file? The only thing I can think of is having that mentioned task alter this file directly. However, I have several modules, so I would have to do that in all of them, which doesn't seem right.
You can create constants in the config of your main module:
Add $provide as a dependency in your config method
Use the provider method to add all constants, like this:
$provide.provider('BASE_API_URL', {
  $get: function () {
    return 'https://myexample.net/api/';
  }
});
You can use BASE_API_URL as a dependency in your services.
I hope this helps
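For instance, consuming the constant in a service might look like this (a sketch; the module name, apiClient, and the endpoint are hypothetical):
angular.module('app').factory('apiClient', ['$http', 'BASE_API_URL',
  function ($http, BASE_API_URL) {
    return {
      getUsers: function () {
        // BASE_API_URL is injected like any other provider-backed value
        return $http.get(BASE_API_URL + 'users');
      }
    };
  }
]);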
Optionally, you can set the URL depending on your environment:
$provide.provider('BASE_API_URL', {
  $get: function () {
    if (window.location.hostname.toLowerCase() === 'myapp.myexample.net') {
      return 'https://myexample.net/api/'; // pre-production
    } else {
      return 'http://localhost:61132/'; // local
    }
  }
});
Regards!
Finally, the solution was generating an Angular constants file using templating (gulp-template) through a gulp task. In the end, I am using a YAML file instead of a JSON one (it is the file generated by my CI engine with the proper values for the environment I want to deploy to).
Basically:
config.yml
baseUrl: 'http://www.myapp.es/'
baseApiUrl: 'http://api.myapp.es/'
auth:
  url: 'auth/'
config.module.constants.template
(function () {
  'use strict';

  angular
    .module('app.config')
    .constant('env_variables', {
      baseUrl: '<%=baseUrl%>',
      baseApiUrl: '<%=baseApiUrl%>',
      authUrl: '<%=auth.url%>'
    });
}());
gulpfile.js
// requires assumed at the top of the gulpfile:
var gulp = require('gulp');
var path = require('path');
var fs = require('fs');
var yaml = require('js-yaml');
var $ = require('gulp-load-plugins')();

gulp.task('splicing', function () {
  var yml = path.join(conf.paths.src, '../config/config.yml');
  var json = yaml.safeLoad(fs.readFileSync(yml, 'utf8'));
  var template = path.join(conf.paths.src, '../config/config.module.constants.template');
  var targetFile = path.join(conf.paths.src, '/app/config');
  // returning the stream lets gulp know when the task finishes
  return gulp.src(template)
    .pipe($.template(json))
    .pipe($.rename("config.module.constants.js"))
    .pipe(gulp.dest(targetFile));
});
Then you just inject it in the config phase you need:
function config($authProvider, env_variables) {
  $authProvider.baseUrl = env_variables.baseApiUrl + env_variables.authUrl;
}
One more benefit of using gulp for this is that you can integrate the generation of these constants with your build, serve or watch tasks and, literally, forget about making any manual change from now on. Hope it helps!
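For example (a sketch; 'build' is a hypothetical task name):
// Regenerate the constants file before every build.
gulp.task('build', ['splicing'], function () {
  // ...the rest of the build pipeline...
});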

How to zip up zip files using Gulp Zip

I've started using Gulp JS and must admit I'm finding it really useful.
One of the tasks I need to perform is zipping up a collection of folders into individual zip files, one for each folder, and then zipping all these zip files up into one single zip file. Using gulp-zip I've managed to get this far:
var modelFolders = [
  'ELFH_Check',
  'ELFH_DDP',
  'ELFH_Free'
];

gulp.task('zipModels', function () {
  for (var i = 0; i < modelFolders.length; i++) {
    var model = modelFolders[i];
    gulp.src('**/*', { cwd: path.join(process.cwd(), '/built_templates/' + model) })
      .pipe(zip(model + '.zip'))
      .pipe(gulp.dest('./built_templates'));
  }
});
This works and outputs ELFH_Check.zip, ELFH_DDP.zip and ELFH_Free.zip. However, I then need to zip up these zip files into one zip file called "Templates.zip" and I've not managed to get this task to work:
// zip up model files
gulp.task('zipTemplate', ['zipModels'], function () {
  gulp.src('*.zip', { cwd: path.join(process.cwd(), './built_templates/') })
    .pipe(zip('Templates_.zip'))
    .pipe(gulp.dest('./built_templates'));
});
Does anyone know if this is possible or what I'm doing wrong?
I saw the problem as well, and it seems to be related to the cwd option somehow. I'll investigate further.
After @OverZealous' comment, I investigated further and found two issues:
As he said, you need to hint gulp to wait until the end of the dependency task (zipModels) by returning a stream from it. Since you have multiple streams, you can use event-stream.merge to return a bundled stream.
The reason the bundle zip wouldn't work is that your cwd points to /built_templates/, and the second slash causes a problem. To work properly, you need to remove the trailing slash, so it should be path.join(process.cwd(), '/built_templates').
IMPORTANT
Anyway, you should avoid temporary files. The gulp philosophy is to use pipes to avoid IO. In that direction, what you want to do is cut the intermediary dest steps, merge the streams, zip them, and finally output them.
Something like this:
// assumed requires, in addition to the ones already in the gulpfile:
var gulp = require('gulp');
var path = require('path');
var zip = require('gulp-zip');
var es = require('event-stream');

var modelFolders = [
  'ELFH_Check',
  'ELFH_DDP',
  'ELFH_Free'
];

gulp.task('zipModels', function () {
  var zips = [],
      modelZip;

  for (var i = 0; i < modelFolders.length; i++) {
    var model = modelFolders[i];
    modelZip = gulp.src('**/*', { cwd: path.join(process.cwd(), '/built_templates/' + model) })
      .pipe(zip(model + '.zip'));
    // notice we removed the dest step and store the zip stream (still in memory)
    zips.push(modelZip);
  }

  // we finally merge them (the zips), zip them again, and output
  return es.merge.apply(null, zips)
    .pipe(zip('templates.zip'))
    .pipe(gulp.dest('./'));
});
By the name of your folder (built_templates), it seems you have some other task that generates the temporary built files. Preferably, you don't want those either: you should pipe their streams directly into the zip stream and, finally, into the bundle-zip stream. By doing that, you would have a single stream flow, with one disk read and one disk write at the end, and no temporary files.
If you need them to be different tasks, consider having a function that generates the stream up to the step before the gulp.dest pipe, and use this function in all subtasks, as sketched below.
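A sketch of that factory approach (the task names are hypothetical):
// Builds a model's zip stream, stopping right before gulp.dest.
function modelZipStream(model) {
  return gulp.src('**/*', { cwd: path.join(process.cwd(), '/built_templates/' + model) })
    .pipe(zip(model + '.zip'));
}

// Subtask: write a single model's zip to disk.
gulp.task('zipCheck', function () {
  return modelZipStream('ELFH_Check').pipe(gulp.dest('./built_templates'));
});

// Bundle task: merge every model stream, zip again, and output.
gulp.task('zipAll', function () {
  return es.merge.apply(null, modelFolders.map(modelZipStream))
    .pipe(zip('Templates.zip'))
    .pipe(gulp.dest('./'));
});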
Additionally, always try to hint your async tasks by returning a stream or a promise, or by receiving a callback function, to advise gulp of the end of the task.