Cannot append to formData object on file upload in React

I am new to React and I am attempting to upload a file to my Node backend, but I've spent a while on this and cannot get it to work. The file appears to reach my handler function correctly, but after that I cannot append it to my FormData object.
Here is the function that I call from the submit button in my upload HTML.
uploadAction() {
  var self = this;
  console.log('inside uploadAction');
  var data = new FormData();
  // var filedata = {};
  var filedata = document.querySelector('input[type="file"]').files[0];
  data.append('file', filedata);
  console.log('this is the value of data in uploadAction ', data);
  console.log('this is the value of filedata in uploadAction ', filedata);
  const config = { headers: { 'Content-Type': 'multipart/form-data' } };
  axios.post('http://localhost:5000/upload', {
    filedata: data
  }, config)
    .then((response) => {
      console.log('back again success from upload');
    })
    .catch((err) => {
      console.error('error from upload ', err);
    });
}
When I console.log the data and filedata objects I get the following result.
this is the value of data in uploadAction FormData {}
this is the value of filedata in uploadAction File {name: "suck.jpg"....
So somehow it appears that my filedata is being read in correctly, but there's a disconnect in how it is being appended to the data object. This confuses me, as this seems to be the standard way to append to a FormData object from the examples I've found online. I think my axios call is correct, but I am of course hitting the error before that.
Does anyone have any suggestions on how I can fix this? Is there an easier way to do this in react that doesn't involve using querySelector? I probably don't want to use dropzone or a similar library/package, I just want to learn how to do a simple upload file.
If anyone has any suggestions on this I would really appreciate it. I've spent some time on this and I seem to be going in circles.
EDIT: As per the suggestion in the first comment below, I have added
for (var pair of data.entries()) {
  console.log(pair[0] + ', ' + pair[1]);
}
to my code to console.log my data value (i.e. the FormData object). The result was:
file, [object File]
While this is true, it doesn't help fix my problem.

Change your call to the following and it should work (data is your FormData). When you wrap the FormData in a plain object ({ filedata: data }), axios serializes that object as JSON and the file is lost; pass the FormData instance itself as the request body:
axios.post('http://localhost:5000/upload',data,config)
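Putting it together, the corrected handler might look like this (a minimal sketch that keeps your endpoint and config):
uploadAction() {
  const data = new FormData();
  const filedata = document.querySelector('input[type="file"]').files[0];
  data.append('file', filedata);
  const config = { headers: { 'Content-Type': 'multipart/form-data' } };
  // Pass the FormData instance itself as the request body
  axios.post('http://localhost:5000/upload', data, config)
    .then((response) => console.log('back again success from upload'))
    .catch((err) => console.error('error from upload ', err));
}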
In addition, as you are using React, you could use the onChange event of your file input instead of querySelector.
Just as an example:
import React, { Component } from 'react'

class UploadComponent extends Component {
  constructor(props, context) {
    super(props, context);
    this.state = { file: null };
    this.handleFileChange = this.handleFileChange.bind(this);
    this.sendFile = this.sendFile.bind(this);
  }
  handleFileChange(event) {
    // Keep the selected file in component state
    this.setState({ file: event.target.files[0] });
  }
  sendFile() {
    // In here you can get the file using this.state.file and send it
  }
  render() {
    return (
      <div>
        <input className="fileInput" type="file" onChange={this.handleFileChange} />
        <button type="submit" onClick={this.sendFile}>Upload</button>
      </div>
    );
  }
}
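If it helps, sendFile could then build a FormData from the file kept in state and post it, along the same lines as the corrected call above (a sketch, assuming axios is imported and the same /upload endpoint):
sendFile() {
  const data = new FormData();
  data.append('file', this.state.file);
  axios.post('http://localhost:5000/upload', data, {
    headers: { 'Content-Type': 'multipart/form-data' }
  })
    .then((response) => console.log('upload succeeded'))
    .catch((err) => console.error('upload failed ', err));
}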

Related

How to use a JavaScript file to refresh/reload a div from an HTML file?

I am using Node.js and have a JS file which opens a connection to an API, works with the received API data, and then saves the changed data into a JSON file. Next I have an HTML file, which takes the data from the JSON file and puts it into a table. Finally I open the HTML file in my browser to look at the visualized table and its data.
What I would like to happen is that the table (or, more specifically, a div with an ID inside the table) in the HTML file refreshes itself when the JSON data gets updated by the JS file. Kind of like a "live table/website" that I can watch change over time without the need to press F5.
Instead of just opening the HTML file locally, I have tried serving it from the JS file like this:
const http = require('http');
const path = require('path');
const fs = require('fs');

const browser = http.createServer(function (request, response) {
  var filePath = '.' + request.url;
  if (filePath == './') {
    filePath = './Table.html';
  }
  var extname = String(path.extname(filePath)).toLowerCase();
  var mimeTypes = {
    '.html': 'text/html',
    '.css': 'text/css',
    '.png': 'image/png',
    '.js': 'text/javascript',
    '.json': 'application/json'
  };
  var contentType = mimeTypes[extname] || 'application/octet-stream';
  fs.readFile(filePath, function (error, content) {
    if (error) {
      response.writeHead(404);
      response.end();
      return;
    }
    response.writeHead(200, { 'Content-Type': contentType });
    response.end(content, 'utf-8');
  });
}).listen(3000);
This creates a working server and I am able to see the page in the browser, but sadly it doesn't update itself as I'd like. I thought about some kind of function that gets called right after the JSON file is saved and tells the div to reload itself.
I also read about things like window.onload, location.reload() and getElementById(), but I am not able to figure out the right way.
What can I do?
Thank you.
Websockets!
Though they might sound scary, it's very easy to get started with websockets in NodeJS, especially if you use Socket.io.
You will need two dependencies in your node application:
"socket.io": "^4.1.3",
"socketio-wildcard": "^2.0.0"
Your HTML file (the standalone socket.io client build is a classic script that exposes a global io):
<script src="https://cdnjs.cloudflare.com/ajax/libs/socket.io/4.0.0/socket.io.js"></script>
Your client-side JavaScript file:
var socket = io();
socket.on("update", function (data) {
  // "update" can be any string; treat it like an event name
  console.log(data);
  // the rest of the code to update the HTML
});
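For the table use case, the update handler is where you would rewrite the div. A minimal sketch, assuming Table.html contains an element like <div id="table-data"> (a hypothetical id) and the server emits the parsed JSON:
socket.on("update", function (data) {
  // Replace "table-data" with the real id of the div inside your table
  var cell = document.getElementById("table-data");
  cell.textContent = JSON.stringify(data);
});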
Your Node.js file:
import { Server } from "socket.io";
// other code...
let io = new Server(server);
let activeConnections = {};

io.sockets.on("connection", function (socket) {
  // 'connection' is a "magic" key
  // track the active connections
  activeConnections[socket.id] = socket;
  socket.on("disconnect", function () {
    /* Not required, but you can add special handling here to prevent errors */
    delete activeConnections[socket.id];
  });
  socket.on("update", (data) => {
    // "update" is any sort of key
    console.log(data);
  });
});

// Example with Express
app.get('/some/api/call', function (req, res) {
  var data = {}; // your API processing here
  // Iterate the sockets themselves (not the keys) and emit to each one
  Object.values(activeConnections).forEach((conn) => {
    conn.emit('update', data);
  });
  res.send(data);
});
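In your case the trigger doesn't have to be an HTTP route: if the same process that owns io is the one writing the JSON file, you can emit right after the file is saved. A sketch, with a hypothetical writeTable helper and file name:
import fs from "fs";

// Hypothetical helper: write the JSON file, then notify every open page
function writeTable(jsonData) {
  fs.writeFile("./table-data.json", JSON.stringify(jsonData), (err) => {
    if (err) return console.error(err);
    // io.emit broadcasts to all connected sockets
    io.emit("update", jsonData);
  });
}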
Finally, shameful self-promotion: here's one of my "dead" side projects using websockets, because I'm sure I forgot some small detail and this might help. https://github.com/Nhawdge/robert-quest

How to save imported JSON file with Expo Filesystem

I have been working on a React Native project with Expo that uses a JSON file to store local data. I am importing the data like so:
import data from '../database.json'
I am making changes (adding and removing) to the imported JSON using data.push(new_data). These changes do not persist when I close the app because I cannot figure out how to save them. I have looked at using the expo-file-system library like so:
import * as FileSystem from 'expo-file-system';
...
FileSystem.writeAsStringAsync(FileSystem.documentDirectory + 'database.json', data);
This is from looking at the examples in the API documentation. However, it always throws promise rejections and never ends up writing the file. Can you point me in the right direction?
Also, should I import the database.json in a different way so I will already have the uri to save it to?
The documentation doesn't give an example of its returned props in promises, so I was overlooking it for longer than I care to admit 😅. I was really dedicated to figuring this out so I could use the Expo solution, and totally missed the returned Promise from createFileAsync, so hopefully this saves someone a significant amount of time in the future.
import * as FileSystem from 'expo-file-system';
const { StorageAccessFramework } = FileSystem;

const saveFile = async () => {
  const permissions = await StorageAccessFramework.requestDirectoryPermissionsAsync();
  // Check if permission granted
  if (permissions.granted) {
    // Get the directory uri that was approved
    let directoryUri = permissions.directoryUri;
    let data = "Hello World";
    // Create file and pass its SAF URI
    await StorageAccessFramework.createFileAsync(directoryUri, "filename", "application/json")
      .then(async (fileUri) => {
        // Save data to newly created file
        await FileSystem.writeAsStringAsync(fileUri, data, { encoding: FileSystem.EncodingType.UTF8 });
      })
      .catch((e) => {
        console.log(e);
      });
  } else {
    alert("You must allow permission to save.");
  }
};
Use AsyncStorage instead. The React Native package is deprecated but working; alternatively, use @react-native-community/async-storage. Convert the JSON to a string (AsyncStorage can only store strings).
Set item
import AsyncStorage from '@react-native-community/async-storage';
...
await AsyncStorage.setItem('myData', JSON.stringify(data))
Get item
const data = await AsyncStorage.getItem('myData')
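Since AsyncStorage can only store strings, remember to parse the value back when reading it:
const raw = await AsyncStorage.getItem('myData');
const data = raw != null ? JSON.parse(raw) : null; // getItem resolves to null if the key is missing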
I found @JayMax's answer very helpful, however it's only for Android.
On iOS all you need to do is use Sharing.shareAsync and then you can save the data to a file. Check this example:
import * as FileSystem from 'expo-file-system';
import * as Sharing from 'expo-sharing';

const fileUri = FileSystem.documentDirectory + 'data.txt';
// Wait for the write to finish before sharing the file
FileSystem.writeAsStringAsync(fileUri, 'here goes your data from JSON. You can stringify it :)', {
  encoding: FileSystem.EncodingType.UTF8,
}).then(() => {
  const UTI = 'public.text';
  return Sharing.shareAsync(fileUri, { UTI });
}).catch((error) => {
  console.log(error);
});
If you use AsyncStorage, keep in mind it only stores small amounts of data, maybe 6 MB or 10 MB. You can use the Expo FileSystem instead:
import * as FileSystem from 'expo-file-system';
...
FileSystem.writeAsStringAsync(FileSystem.documentDirectory + 'database.json', data);
Convert your data from JSON to a string, such as this:
import axios from 'axios';

writeData = async () => {
  var persons = '';
  await axios.get(`http://192.168.0.48:4000/api/sql/student`)
    .then(res => {
      persons = res.data;
    });
  await FileSystem.writeAsStringAsync(
    FileSystem.documentDirectory + `offline_queue_stored.json`,
    JSON.stringify(persons)
  );
};
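To read it back later, the matching call would be something like this (a sketch reusing the same file name):
readData = async () => {
  const raw = await FileSystem.readAsStringAsync(
    FileSystem.documentDirectory + `offline_queue_stored.json`
  );
  return JSON.parse(raw); // string back to JSON
};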
#1. If the JSON file is in your project folder (PC/laptop):
import data from './database.json';
#2. If the JSON file is on your phone:
import * as FileSystem from 'expo-file-system';
import * as DocumentPicker from 'expo-document-picker';

this.state = {
  fileURI: null,
};

componentDidMount = () => {
  this._pickDocument();
}

_pickDocument = async () => {
  let result = await DocumentPicker.getDocumentAsync({});
  this.setState({
    fileURI: result.uri
  });
  // Read from result.uri directly; this.state.fileURI may not be set yet
  // because setState is asynchronous
  let fileData = await FileSystem.readAsStringAsync(result.uri);
  console.log(fileData);
};

Angular - upload file as base64

I am trying to upload files from an Angular 4 app to a JSON API service that accepts base64 strings as file content.
So what I do is read the file with FileReader.readAsDataURL, then when the user confirms the upload I create a JSON request to the API and send the base64 string of the file I got earlier.
This is where the problem starts: as soon as I do something with the content (log it, send it, whatever) the request will be sent, but it's insanely slow, e.g. 20 seconds for a 2 MB file.
I have tried:
using ArrayBuffer and manually converting it to base64
storing the base64 string in HTML and retrieving it later
reading the files after user clicks on upload button
using the old client from @angular/common
using plain XHR request
but everything leads to the same result.
I know where the problem lies, but why does it happen? Is it something browser-specific or Angular-specific? Is there a better approach (keep in mind it has to be a base64 string)?
Notes:
changing anything in the API is beyond my control
the API is fine; sending any file through Postman finishes immediately
Code:
This method runs when the user adds a file to the dropzone:
public onFileChange(files: File[]): void {
  files.forEach((file: File, index: number) => {
    const reader = new FileReader();
    // UploadedFile is just a simple model that contains filename, size, type and later base64 content
    this.uploadedFiles[index] = new UploadedFile(file);
    //region reader.onprogress
    reader.onprogress = (event: ProgressEvent) => {
      if (event.lengthComputable) {
        this.uploadedFiles[index].updateProgress(
          Math.round((event.loaded * 100) / event.total)
        );
      }
    };
    //endregion
    //region reader.onloadend
    reader.onloadend = (event: ProgressEvent) => {
      const target: FileReader = <FileReader>event.target;
      const content = target.result.split(',')[1];
      this.uploadedFiles[index].contentLoaded(content);
    };
    //endregion
    reader.readAsDataURL(file);
  });
}
This method runs when the user clicks the save button:
public upload(uploadedFiles: UploadedFile[]): Observable<null> {
  const body: object = {
    files: uploadedFiles.map((uploadedFile) => {
      return {
        filename: uploadedFile.name,
        // SLOWDOWN HAPPENS HERE
        content: uploadedFile.content
      };
    })
  };
  return this.http.post('file', body);
}
For sending big files to the server you should use FormData, so the request goes out as multipart form data instead of one huge JSON body.
Something like:
// import { Http, RequestOptions } from '@angular/http';
uploadFileToUrl(files, uploadUrl): Promise<any> {
  // Note that setting a content-type header
  // for multipart forms breaks some built-in
  // request parsers like multer in express.
  const options = new RequestOptions();
  const formData = new FormData();
  // Append files to the virtual form.
  for (const file of files) {
    formData.append(file.name, file);
  }
  // Send it. toPromise() so the declared Promise<any> return type actually holds.
  return this.http.post(uploadUrl, formData, options).toPromise();
}
Also don't forget to set the header 'Content-Type': undefined; I scratched my head over this for hours.
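For completeness, calling it from a component might look like this (a sketch; onFilesSelected and the /upload URL are hypothetical, and it assumes a template input like <input type="file" multiple (change)="onFilesSelected($event)">):
onFilesSelected(event: Event) {
  const input = event.target as HTMLInputElement;
  // Hand every selected file to the multipart upload helper above
  this.uploadFileToUrl(Array.from(input.files || []), '/upload')
    .then((res) => console.log('upload finished', res))
    .catch((err) => console.error('upload failed', err));
}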

Understanding the streaming concept in node.js

I am trying to convert CSV data to JSON while streaming from an HTTP URL, using the "csvtojson" package.
const csv = require("csvtojson");
const request = require('request');
let options = {
  uri: '',
  ****
};
let tempArr = [];
csv()
  .fromStream(request(options))
  .on("json", (jsonObj) => {
    if (JSON.parse(jsonObj.Response).intents[0].intent == "None")
      tempArr.push(JSON.parse(jsonObj.Response));
  })
  .on('done', (error) => {
    callback(null, tempArr);
  });
This is called from an API endpoint. When I start the server and call this API to convert CSV to JSON, it works perfectly.
But if I call the same API again, the "json" event is not triggered; instead the "done" event fires directly.
That is, the streaming does not happen from the second call onward. Why is it behaving like this, and what should I do to solve the problem?

LocomotiveJS access response JSON in controller's after filter

I'm looking for a way to access the JSON being sent back to the requestor in the "after" filter for a controller.
var locomotive = require('locomotive');
var myController = new locomotive.Controller();

myController.after('myAction', function (next) {
  var response = {}; // I want to access the JSON being sent back in myAction: {'hello':'world'}
  console.log(response); // this should log "{'hello':'world'}"
  next();
});

myController.myAction = function myAction() {
  this.res.json({'hello':'world'});
};

module.exports = myController;
If anyone has any way of doing this, it would be much appreciated.
In your main action, assign your JSON to a property on this (res is reserved):
myController.myAction = function myAction() {
  this.model = {'hello':'world'};
  this.res.json(this.model);
};
Then you can access it in your after filter:
myController.after('myAction', function (next) {
  var model = this.model;
  console.log(model);
  next();
});
I found a "hack" solution... It's not the cleanest, and requires changing the code within the express response.js file in "node_modules"...
If anyone has a better option where you can access the json being sent in response to the request within the controller action (or controller filter) itself, I'd greatly appreciate it.
Thanks.
In the ~/node_modules/locomotive/node_modules/express/lib/response.js file, I altered the res.json function (line 174 for me) to include the following line after the declaration of the body variable (which is passed to the send function):
this.responseJSON = body;
This allows you to access this.responseJSON within a controller's after filter, as follows:
myController.after('myAction', function (next) {
  var response = this.res.responseJSON; // ACCESS RESPONSE JSON HERE!!!!!
  console.log(response); // Now logs "{'hello':'world'}"
  next();
});
Like I said, not the most elegant, but gets the job done in a pinch. Any more elegant solutions welcome...