The use of Buffer before saving to IPFS

I am following an IPFS github example to save to IPFS:
'use strict'
const IPFS = require('ipfs')
const all = require('it-all')

async function main () {
  const node = await IPFS.create()
  const version = await node.version()
  console.log('Version:', version.version)

  for await (const file of await node.add({
    path: 'hello.txt',
    content: Buffer.from('Hello World 101') // <<<=== why Buffer before assigned to content?
  })) {
    console.log('Added file:', file.path, file.cid.toString())
    const data = Buffer.concat(await all(node.cat(file.cid)))
    console.log('Added file contents:', data.toString())
  }
}

main()
I notice the string is converted to binary with Buffer before saving. Can someone explain the use of Buffer here? What about saving an image or video file?

Node.js APIs that deal with I/O (files, sockets, streams) work with Buffers by default. JavaScript strings are Unicode text (UTF-8), and treating arbitrary binary data (an image, a video file, etc.) as a string can corrupt it, which is why binary content is passed around as a Buffer instead.
An easy way to see the difference between strings and Buffers is to compare the size of a UTF-8 string as Unicode text (counted in characters) and as a buffer (counted in bytes):
> const str = 'Hello δΈ–η•Œ';
undefined
> str.length
8
> const buf = Buffer.from(str, 'utf8');
undefined
> buf.length
12
> buf.toString('hex');
'48656c6c6f20e4b896e7958c'
> buf.toString('utf8');
'Hello δΈ–η•Œ'
In summary, working with Buffers is the standard for binary-capable APIs like fs, sockets, and IPFS itself.
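For the second part of the question (an image or video file): you don't convert it to a string at all; you read the raw bytes into a Buffer and pass that Buffer as content. A minimal sketch, assuming a local ./photo.jpg exists and the same node as in the question's example:
const fs = require('fs')

async function addImage (node) {
  // fs.readFileSync returns a Buffer of raw bytes; it can be used as content directly
  const imageBuffer = fs.readFileSync('./photo.jpg')
  for await (const file of await node.add({
    path: 'photo.jpg',
    content: imageBuffer
  })) {
    console.log('Added image:', file.path, file.cid.toString())
  }
}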

How to save imported JSON file with Expo Filesystem

I have been working on a React Native project with Expo that uses a json file to store local data. I am importing the data like so
import data from '../database.json'
I am making changes (adding and removing) to the imported JSON by using data.push(new_data). These changes are not persistent when I close the app because I cannot figure out how to save them. I have looked at using the expo-file-system library like so:
import * as FileSystem from 'expo-file-system';
...
FileSystem.writeAsStringAsync(FileSystem.documentDirectory + 'database.json', data);
This is based on the examples in the API documentation. However, it always ends in a rejected promise and the file never gets written. Can you point me in the right direction?
Also, should I import the database.json in a different way so I will already have the uri to save it to?
The documentation doesn't give an example of the props returned in its promises, so I overlooked them for longer than I care to admit πŸ˜…. I was determined to figure this out with the Expo solution and completely missed that createFileAsync returns a promise resolving to the new file's URI, so hopefully this saves someone a significant amount of time in the future.
import * as FileSystem from 'expo-file-system';
const { StorageAccessFramework } = FileSystem;
const saveFile = async () => {
  const permissions = await StorageAccessFramework.requestDirectoryPermissionsAsync();
  // Check if permission granted
  if (permissions.granted) {
    // Get the directory uri that was approved
    let directoryUri = permissions.directoryUri;
    let data = "Hello World";
    // Create file and pass its SAF URI
    await StorageAccessFramework.createFileAsync(directoryUri, "filename", "application/json")
      .then(async (fileUri) => {
        // Save data to newly created file
        await FileSystem.writeAsStringAsync(fileUri, data, { encoding: FileSystem.EncodingType.UTF8 });
      })
      .catch((e) => {
        console.log(e);
      });
  } else {
    alert("You must allow permission to save.");
  }
};
Use AsyncStorage instead. The built-in React Native module is deprecated but still works, or you can use @react-native-community/async-storage. Convert the JSON to a string first, because AsyncStorage can only store strings.
Set item
import AsyncStorage from '@react-native-community/async-storage';
...
await AsyncStorage.setItem('myData', JSON.stringify(data))
Get item
const data = await AsyncStorage.getItem('myData')
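Since getItem returns a string (or null when the key is missing), parse it back into JSON before using it, for example:
const parsed = data != null ? JSON.parse(data) : null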
I found @JayMax's answer very helpful, however it only works on Android.
On iOS all you need to do is use Sharing.shareAsync and then you can save the data to a file. Check this example:
import * as FileSystem from 'expo-file-system';
import * as Sharing from 'expo-sharing';

const fileUri = FileSystem.documentDirectory + 'data.txt';
FileSystem.writeAsStringAsync(fileUri, 'here goes your data from JSON. You can stringify it :)', {
  encoding: FileSystem.EncodingType.UTF8,
});
const UTI = 'public.text';
Sharing.shareAsync(fileUri, { UTI }).catch((error) => {
  console.log(error);
});
If you use AsyncStorage, note that it is only suitable for small amounts of data (maybe 6 MB or 10 MB).
You can use expo-file-system instead:
import * as FileSystem from 'expo-file-system';
...
FileSystem.writeAsStringAsync(FileSystem.documentDirectory + 'database.json', data);
Convert your data from JSON to a string first, such as this:
writeData = async () => {
  var persons = ''
  await axios.get(`http://192.168.0.48:4000/api/sql/student`)
    .then(res => {
      persons = res.data
    })
  await FileSystem.writeAsStringAsync(FileSystem.documentDirectory + `offline_queue_stored.json`, JSON.stringify(persons));
}
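To load it back later (instead of the static import from the question), read the file and parse it; a small sketch assuming the same offline_queue_stored.json path:
readData = async () => {
  const uri = FileSystem.documentDirectory + 'offline_queue_stored.json';
  const contents = await FileSystem.readAsStringAsync(uri);
  return JSON.parse(contents);
}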
#1. If the JSON file is in your project folder (PC/laptop):
import data from './database.json';
#2. If the JSON file is on your phone:
import * as FileSystem from 'expo-file-system';
import * as DocumentPicker from 'expo-document-picker';
this.state = {
  fileURI: null,
};

componentDidMount = () => {
  this._pickDocument();
}

_pickDocument = async () => {
  let result = await DocumentPicker.getDocumentAsync({});
  this.setState({
    fileURI: result.uri
  });
  // Read from result.uri directly; setState is asynchronous,
  // so this.state.fileURI may not be updated yet at this point.
  let fileData = await FileSystem.readAsStringAsync(result.uri);
  console.log(fileData);
};

How to save & retrieve an image to/from MySQL?

Two-part question.
Part 1:
I'm uploading an image to my server and want to save it to my database.
So far:
table:
resolver:
registerPhoto: inSequence([
  async (obj, { file }) => {
    const { filename, mimetype, createReadStream } = await file;
    const stream = createReadStream();
    const t = await db.images.create({
      Name: 'test',
      imageData: stream,
    });
  },
])
executing query:
Executing (default): INSERT INTO `images` (`Id`,`imageData`,`Name`) VALUES (DEFAULT,?,?);
But nothing is saved.
I'm new to this and I'm probably missing something, but I don't know what.
Part 2:
This follows from part 1: let's say I manage to save the image, how do I read it and send it back to my FE?
Edit: I've read a lot of guides that save only the image name to the DB and the actual image in a folder. This is NOT what I'm after; I want to save the image itself to the DB and then be able to fetch it from the DB and present it.
This took me some time, but I finally figured it out.
First step (saving to the DB):
You have to read the entire stream's data, like this:
export const readStream = async (stream, encoding = 'utf8') => {
  stream.setEncoding('base64');
  return new Promise((resolve, reject) => {
    let data = '';
    // eslint-disable-next-line no-return-assign
    stream.on('data', chunk => (data += chunk));
    stream.on('end', () => resolve(data));
    stream.on('error', error => reject(error));
  });
};
Use it like this:
const streamData = await readStream(stream);
Before saving, I turn the stream data into a buffer:
const buff = Buffer.from(streamData);
Finally, the save part:
db.images.create(
  {
    Name: filename,
    imageData: buff,
    Length: stream.bytesRead,
    Type: mimetype,
  },
  { transaction: param }
);
Note that I added the Length and Type columns; these are needed if you want to return a stream when you return the image.
Step 2 (retrieving the image):
As @xadm said multiple times, you cannot return an image from GraphQL, and after some time I had to accept that fact; hopefully GraphQL will remedy this in the future.
What I needed to do was set up a route on my Fastify backend, send an image ID to this route, fetch the image, and return it.
I tried a few different approaches, but in the end I simply returned the binary data and encoded it to base64 on the frontend.
Backend part:
const handler = async (req, reply) => {
  const p: postParams = req.params;
  const parser = uuIdParserT();
  const img = await db.images.findByPk(parser.setValueAsBIN(p.id));
  const binary = img.dataValues.imageData.toString('binary');
  const myStream = new Readable({
    read() {
      this.push(Buffer.from(binary));
      this.push(null);
    },
  });
  reply.send(myStream);
};

export default (server: FastifyInstance) =>
  server.get<null, any>('/:id', opts, handler);
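If you want to make use of the Length and Type stored earlier, you could replace the plain reply.send(myStream) inside the handler with something like the following sketch (using Fastify's reply.header, assuming the same img record as above):
reply
  .header('Content-Type', img.dataValues.Type)
  .header('Content-Length', img.dataValues.Length)
  .send(myStream);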
Frontend part:
useEffect(() => {
  // axiosState is the obj that holds the image
  if (!axiosState.loading && axiosState.data) {
    // @ts-ignore
    const b64toBlob = (b64Data, contentType = '', sliceSize = 512) => {
      const byteCharacters = atob(b64Data);
      const byteArrays = [];
      for (let offset = 0; offset < byteCharacters.length; offset += sliceSize) {
        const slice = byteCharacters.slice(offset, offset + sliceSize);
        const byteNumbers = new Array(slice.length);
        // @ts-ignore
        // eslint-disable-next-line no-plusplus
        for (let i = 0; i < slice.length; i++) {
          byteNumbers[i] = slice.charCodeAt(i);
        }
        const byteArray = new Uint8Array(byteNumbers);
        byteArrays.push(byteArray);
      }
      const blob = new Blob(byteArrays, { type: contentType });
      return blob;
    };
    const blob = b64toBlob(axiosState.data, 'image/jpg');
    const urlCreator = window.URL || window.webkitURL;
    const imageUrl = urlCreator.createObjectURL(blob);
    setimgUpl(imageUrl);
  }
}, [axiosState]);
And finally, in the HTML:
<img src={imgUpl} alt="NO" className="imageUpload" />
OTHER:
For anyone attempting the same: NOTE that this is not considered best practice.
Almost every article I found saved the images on the server and stored only an image ID and other metadata in the database. For the exact pros and cons, I found the following helpful:
Storing Images in DB - Yea or Nay?
I was focused on finding out how to do it if, for some reason, you do want to save an image in the database, and finally solved it.
There are two ways to store images with a SQL database: you either store the actual image file on your server and save its path in your MySQL DB, or you turn the image into a BLOB and store it in the DB itself.
Here is a handy read: https://www.technicalkeeda.com/nodejs-tutorials/nodejs-store-image-into-mysql-database
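For the BLOB route with Sequelize (which the question's db.images.create suggests), the model needs a BLOB-typed column. A minimal sketch, assuming an existing sequelize instance; the model and column names are illustrative, not the question's actual schema:
const { DataTypes } = require('sequelize');

// Hypothetical model; adjust names to your own schema
const Image = sequelize.define('images', {
  Name: DataTypes.STRING,
  Type: DataTypes.STRING,            // MIME type, e.g. 'image/png'
  Length: DataTypes.INTEGER,         // size in bytes
  imageData: DataTypes.BLOB('long'), // maps to LONGBLOB in MySQL
});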
you should save the image in a directory and save the link of this image in the database

How to upload a multer file as a blob in MySQL?

I'm trying to store a file uploaded into a nodejs server into a mysql database. The file is uploaded and processed using a Multer middleware, thus storing all the file information in req.files. I believe, correct me if I am mistaken, that this data/parameters must be somehow converted to a blob format yet I do not know how to do so effectively. How could this be done in order to upload it into a MySQL database?
I am using the latest version of XAMPP with MySQL InnoDB tables. The file parameters/data generated by Multer are shown in the snippet below. I've attempted converting the entire file, that is req.files[0], into a blob, yet the blobs stored in the MySQL database are identical in size despite using different files.
[ { fieldname: 'files',
originalname: 'Event Information 2018-1_2084.pdf',
encoding: '7bit',
mimetype: 'application/pdf',
destination:
'/home/millana/Desktop/Brandink/Code/server/middleware/../designs/',
filename: 'Event Information 2018-1_2084_1559843587360.pdf',
path:
'/home/millana/Desktop/Brandink/Code/server/designs/Event Information 2018-1_2084_1559843587360.pdf',
size: 1125992 } ]
Below is the REST API route that executes the query on the already existing Designs table and database.
router.post('/order', upload.array('files'), (req, res) => {
  console.log(req.files[0].buffer);
  const order_id = 1;
  const position = 'front';
  const sql = `INSERT INTO Designs VALUES ('${order_id}', '${position}', '${req.files[0].filename}', '${req.files[0].mimetype}', '${req.files[0].buffer}')`;
  db.query(sql, (err, result) => {
    if (err) res.status(400).send(err);
    res.status(200).send(result);
  });
});
The middleware function includes:
const storage = multer.diskStorage({
  destination: function (req, file, cb) {
    cb(null, __dirname + '/../designs/');
  },
  filename: function (req, file, cb) {
    let format = '';
    if (file.mimetype === 'image/jpeg') format = '.jpg';
    else if (file.mimetype === 'image/png') format = '.png';
    else if (file.mimetype === 'application/pdf') format = '.pdf';
    let fileName = _.split(file.originalname, '.')[0] + '_' + Date.now() + format;
    cb(null, fileName);
  }
});

const upload = multer({
  storage: storage,
  limits: {
    fileSize: MAX_SIZE
  },
});
I expect the blob size to vary from one file to another in proportion to the file size. However, the stored size for every uploaded file is 9 bytes and the buffers are identical, indicating that the same value is being stored each time. How can I store each file correctly?
How about using fileFilter? With it you can manage which files are accepted, their size, etc.
https://github.com/expressjs/multer
As you can see in the link, multer has options such as fileFilter and limits.
fileFilter is similar to your filename function: it lets you accept or reject files by type.
limits caps the upload, for example:
limits: {
  fileSize: 5 * 1024 * 1024
}
And as far as I've seen, storing a file such as an image means converting its binary content into a buffer.
If you console.log the image file, you can see output such as:
Binary {
  _bsontype: 'Binary',
  sub_type: 0,
  position: 260547,
  buffer: ... }
Then convert the buffer to a base64 string; the result is something like:
data:image/jpeg;base64,v9emlk3SGF45fg...
This is different from the filename or originalname, so each file is saved as its own distinct content.
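Regarding the 9-byte blobs in the question: with multer.diskStorage the file is written to disk and req.files[0].buffer is undefined, so the template literal stores the string 'undefined' (9 bytes) for every upload. A possible sketch of getting a real buffer and inserting it safely, assuming multer.memoryStorage and a parameterized query (the column list is illustrative, not the question's actual schema):
const multer = require('multer');
const upload = multer({ storage: multer.memoryStorage() }); // file contents stay in req.files[n].buffer

router.post('/order', upload.array('files'), (req, res) => {
  const file = req.files[0];
  // Placeholders keep the binary intact and avoid SQL injection
  const sql = 'INSERT INTO Designs (order_id, position, filename, mimetype, data) VALUES (?, ?, ?, ?, ?)';
  db.query(sql, [1, 'front', file.originalname, file.mimetype, file.buffer], (err, result) => {
    if (err) return res.status(400).send(err);
    res.status(200).send(result);
  });
});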

Angular - upload file as base64

I am trying to upload files from an Angular 4 app to a JSON API service that accepts base64 strings as file content.
So what I do is - read the file with FileReader.readAsDataURL, then when user confirms the upload I will create a JSON request to the API and send the base64 string of the file I got earlier.
This is where the problem starts: as soon as I do something with the "content" (log it, send it, whatever), the request does get sent, but it is insanely slow, e.g. 20 seconds for a 2 MB file.
I have tried:
using ArrayBuffer and manually converting it to base64
storing the base64 string in HTML and retrieving it later
reading the files after user clicks on upload button
using the old client from @angular/common
using plain XHR request
but everything leads to the same result.
I know where the problem lies. But why does it happen? Is it something browser specific or angular specific? Is there a more preferred approach (keep in mind it has to be base64 string)?
Notes:
changing anything in the API is beyond my control
the API is fine; sending any file through Postman finishes immediately
Code:
This method runs when user adds file to the dropzone:
public onFileChange(files: File[]) : void {
  files.forEach((file: File, index: number) => {
    const reader = new FileReader;
    // UploadedFile is just a simple model that contains filename, size, type and later base64 content
    this.uploadedFiles[index] = new UploadedFile(file);

    //region reader.onprogress
    reader.onprogress = (event: ProgressEvent) => {
      if (event.lengthComputable) {
        this.uploadedFiles[index].updateProgress(
          Math.round((event.loaded * 100) / event.total)
        );
      }
    };
    //endregion

    //region reader.onloadend
    reader.onloadend = (event: ProgressEvent) => {
      const target: FileReader = <FileReader>event.target;
      const content = target.result.split(',')[1];
      this.uploadedFiles[index].contentLoaded(content);
    };
    //endregion

    reader.readAsDataURL(file);
  });
}
This method runs when the user clicks the save button:
public upload(uploadedFiles: UploadedFile[]) : Observable<null> {
  const body: object = {
    files: uploadedFiles.map((uploadedFile) => {
      return {
        filename: uploadedFile.name,
        // SLOWDOWN HAPPENS HERE
        content: uploadedFile.content
      };
    })
  };
  return this.http.post('file', body)
}
For sending big files to the server you should use FormData, so the data is sent as multipart form data instead of one single big payload.
Something like:
// import {Http, RequestOptions} from '@angular/http';
uploadFileToUrl(files, uploadUrl): Observable<any> {
  // Note that setting a content-type header
  // for multipart forms breaks some built in
  // request parsers like multer in express.
  const options = new RequestOptions();
  const formData = new FormData();

  // Append files to the virtual form.
  for (const file of files) {
    formData.append(file.name, file);
  }

  // Send it (Http.post returns an Observable).
  return this.http.post(uploadUrl, formData, options);
}
Also, don't forget to set the header 'Content-Type': undefined; I scratched my head over this for hours.
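A possible call site (the endpoint path and the files argument are placeholders; since Http.post returns an Observable, you subscribe to actually fire the request):
this.uploadFileToUrl(files, '/api/upload').subscribe(
  (response) => console.log('Upload finished', response),
  (error) => console.error(error)
);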

How to create an IPFS compatible multihash

I'm trying to create an IPFS-compatible multihash, but it does not match. I am asking here because I have not yet found an example that takes this from hashing to the end result.
echo -n multihash > multihash.txt
ipfs add multihash.txt
added QmZLXzjiZU39eN8QirMZ2CGXjMLiuEkQriRu7a7FeSB4fg multihash.txt
sha256sum multihash.txt
9cbc07c3f991725836a3aa2a581ca2029198aa420b9d99bc0e131d9f3e2cbe47 multihash.txt
node
> var bs58=require('bs58')
> bs58.encode(new Buffer('9cbc07c3f991725836a3aa2a581ca2029198aa420b9d99bc0e131d9f3e2cbe47','hex'))
'BYptxaTgpcBrqZx9tghNCWFfUuYBcGfLydEvDjXqBV7k'
> var mh=require('multihashes')
> mh.toB58String(mh.encode(new Buffer('9cbc07c3f991725836a3aa2a581ca2029198aa420b9d99bc0e131d9f3e2cbe47','hex'), 'sha2-256'))
'QmYtUc4iTCbbfVSDNKvtQqrfyezPPnFvE33wFmutw9PBBk'
The intent is to re-create the IPFS path QmZLXzjiZU39eN8QirMZ2CGXjMLiuEkQriRu7a7FeSB4fg using the multihashes package.
I'm able to create the same hash QmYtUc...9PBBk as shown in the example here: https://github.com/multiformats/multihash#example
IPFS uses multihash where the format is the following:
base58(<varint hash function code><varint digest size in bytes><hash function output>)
The list of hash function codes can be found in this table.
Here's some pseudocode of the process using SHA2-256 as the hashing function.
hash function code (sha2-256): 0x12
digest size: 0x20 (32 bytes)
sha2-256("hello world"): 0xb94d27b9934d3e08a52e52d7da7dabfac484efe37a5380ee9088f7ace2efcde9
Concatenating those three items will produce
1220b94d27b9934d3e08a52e52d7da7dabfac484efe37a5380ee9088f7ace2efcde9
which you then encode with base58:
QmaozNR7DZHQK1ZcU9p7QdrshMvXqWK6gpu5rmrkPdT3L4
Here's an example of how to essentially implement multihash in JavaScript:
const crypto = require('crypto')
const bs58 = require('bs58')
const data = 'hello world'
const hashFunction = Buffer.from('12', 'hex') // 0x12 = code for sha2-256
const digest = crypto.createHash('sha256').update(data).digest()
console.log(digest.toString('hex')) // b94d27b9934d3e08a52e52d7da7dabfac484efe37a5380ee9088f7ace2efcde9
const digestSize = Buffer.from(digest.byteLength.toString(16), 'hex')
console.log(digestSize.toString('hex')) // 20
const combined = Buffer.concat([hashFunction, digestSize, digest])
console.log(combined.toString('hex')) // 1220b94d27b9934d3e08a52e52d7da7dabfac484efe37a5380ee9088f7ace2efcde9
const multihash = bs58.encode(combined)
console.log(multihash.toString()) // QmaozNR7DZHQK1ZcU9p7QdrshMvXqWK6gpu5rmrkPdT3L4
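For comparison, the multihashes package used in the question gives the same string when fed the same digest:
const mh = require('multihashes')
console.log(mh.toB58String(mh.encode(digest, 'sha2-256'))) // QmaozNR7DZHQK1ZcU9p7QdrshMvXqWK6gpu5rmrkPdT3L4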
There's a CLI you can use to generate multihashes:
$ go get github.com/multiformats/go-multihash/multihash
$ echo -n "hello world" | multihash -a sha2-256
QmaozNR7DZHQK1ZcU9p7QdrshMvXqWK6gpu5rmrkPdT3L4
As @David stated, a file in IPFS is "transformed" into a UnixFS "file", which is a representation of files in a DAG. So when you use add to upload a file to IPFS, the data gets a metadata wrapper, which gives you a different result than multihashing the raw bytes.
For example:
$ echo -n "hello world" | ipfs add -Q
Qmf412jQZiuVUtdgnB36FXFX7xg5V6KEbSJ4dpQuhkLyfD
Here's an example in Node.js of how to generate the exact same multihash as ipfs add:
const Unixfs = require('ipfs-unixfs')
const { DAGNode } = require('ipld-dag-pb')

const data = Buffer.from('hello world', 'ascii')
const unixFs = new Unixfs('file', data)

DAGNode.create(unixFs.marshal(), (err, dagNode) => {
  if (err) return console.error(err)
  console.log(dagNode.toJSON().multihash) // Qmf412jQZiuVUtdgnB36FXFX7xg5V6KEbSJ4dpQuhkLyfD
})
A file in IPFS is 'transformed' into a UnixFS file, which is a representation of files in a DAG. In your example you are hashing multihash.txt directly with sha2-256, but what happens inside IPFS is:
the file gets chunked into 256 KiB pieces
each chunk goes into a DAG node inside a UnixFS protobuf https://github.com/ipfs/js-ipfs-unixfs
a DAG is created with links to all the chunks.
Some of the other answers are outdated. This is what worked for me:
const { UnixFS } = require('ipfs-unixfs') // @4.0.3
const { DAGNode } = require('ipld-dag-pb') // @0.22.2
const data = Buffer.from("Hello World\n")
const file = new UnixFS({data})
const node = new DAGNode(file.marshal())
const link = await node.toDAGLink()
const cid = link.Hash
console.log(cid.toV0().toString())
// Qmf412jQZiuVUtdgnB36FXFX7xg5V6KEbSJ4dpQuhkLyfD
// Encodes 32 random bytes as a base58 multihash (a valid multihash, but not derived from any real content)
const { randomBytes } = require('crypto')
const multihash = require('multihashes')

const buffer = randomBytes(32) // randomBytes already returns a Buffer
const encoded = multihash.encode(buffer, 'sha2-256')
const hash = multihash.toB58String(encoded)
console.log(hash)
The simplest way currently is to use ipfs-http-client. The earlier solutions in this thread no longer work for me.
import ipfsClient from "ipfs-http-client";
const ipfs = ipfsClient(IPFS_PROVIDER, "5001", { protocol: "https" });
// onlyHash: true only generates the hash and doesn't upload the file
const hash = (await ipfs.add(Buffer.from(vuln), { onlyHash: true }))[0].hash