Hi, I want to show the download progress of a large CSV file.
Can anyone suggest some code or logic for when the file comes from a backend API response?
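One common pattern (a sketch, not from the original thread) is to stream the response with fetch and report progress from the Content-Length header; the URL and callback below are placeholders:

async function downloadCsvWithProgress(url, onProgress) {
  const response = await fetch(url);
  // Content-Length may be absent or misleading when the response is compressed.
  const total = Number(response.headers.get('Content-Length')) || 0;
  const reader = response.body.getReader();
  const chunks = [];
  let received = 0;

  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    chunks.push(value);
    received += value.length;
    if (total) onProgress(Math.round((received / total) * 100)); // percent complete
  }

  return new Blob(chunks, { type: 'text/csv' });
}

// Example usage:
// downloadCsvWithProgress('/api/report.csv', p => console.log(p + '%'));

If you are already using axios, its onDownloadProgress option gives you a similar hook without hand-rolling the stream reading.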
Related
I'm currently working on a new design for a mobile app (frontend only). In the project I need to use state to change the CSS of a button. If the button has changed, the next time the app is refreshed the state should be as you left it.
That is why I have a locally stored JSON file that is structured the same way as the app's current database. Reading it is no issue, but I can't get the writing to work.
How I read the JSON:
const jsonData = require('../data.json')

function GetBaseState(id){
  console.log(jsonData.bases[id].state)
}
How would I go about changing that state in the JSON file?
For both reading and writing a JSON file, you can use react-native-fs: the readFile method reads the JSON file and the writeFile method writes it to local storage.
react-native-fs has good documentation that you can read for more information and usage.
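A minimal sketch of that, assuming the data lives in a data.json copy in the app's document directory (the path and data shape are assumptions, not from the original post). Note that a file pulled in with require('../data.json') is part of the app bundle and can't be modified at runtime, which is why a writable copy is kept on disk:

import RNFS from 'react-native-fs';

// Writable copy of the JSON in the app's document directory (placeholder path).
const path = RNFS.DocumentDirectoryPath + '/data.json';

async function setBaseState(id, newState) {
  const raw = await RNFS.readFile(path, 'utf8');            // read the stored JSON
  const data = JSON.parse(raw);
  data.bases[id].state = newState;                          // update the button state
  await RNFS.writeFile(path, JSON.stringify(data), 'utf8'); // write it back
}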
I have a 1 GB JSON file to upload to Firebase RTDB, but when I press Import it loads for a while and then I get this error:
There was a problem contacting the server. Try uploading your file again.
I have tried uploading a 30 MB file and everything is OK.
It sounds like your file is too big to upload to Firebase in one go. There are no parameters to tweak here, and you'll have to use another means of getting the data into the database.
You might want to give the Firebase-Import library a go, try the Firebase CLI's database:set command, or write your own import for your file format using the Firebase API.
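The last option (your own import with the Firebase API) could look roughly like this Node.js sketch using the firebase-admin SDK. The file name, service account path and database URL are placeholders, and a genuinely 1 GB file would still need a streaming JSON parser rather than require():

const admin = require('firebase-admin');

// Rough sketch: split the import into one write per top-level child so no
// single request carries the whole payload.
admin.initializeApp({
  credential: admin.credential.cert(require('./service-account.json')),
  databaseURL: 'https://your-project.firebaseio.com'
});

const data = require('./export.json'); // caution: still loads the whole file into memory

async function importByChild() {
  for (const [key, value] of Object.entries(data)) {
    await admin.database().ref(key).set(value); // one request per top-level child
    console.log('imported', key);
  }
}

importByChild().then(() => process.exit(0)).catch(err => { console.error(err); process.exit(1); });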
I want to handle a requirement in Polymer web components where the user can upload a CSV file from the UI; the CSV should be parsed to JSON and sent to the server. I searched and found vaadin-upload and looked over its API, but I'm not sure how to receive the CSV file, convert it to JSON and send it to the server. Can anyone show a JSFiddle of vaadin-upload, or of any other web component, that handles this scenario?
First of all, I am wondering why you would not simply do the conversion on the server side.
In that case, you would indeed be able to use vaadin-upload directly.
Here is a snippet that would upload all files to the example.com server, and only allow CSV files.
<vaadin-upload target="https://example.com/upload" method="POST" accept="text/csv">
</vaadin-upload>
There are plenty of resources on how to convert CSV files to JSON, ranging from small snippets to full node libraries.
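A minimal conversion sketch, assuming a comma-separated file with a header row and no quoted or escaped fields (a real parser such as PapaParse is more robust):

function csvToJson(csvText) {
  const [headerLine, ...lines] = csvText.trim().split(/\r?\n/);
  const headers = headerLine.split(',');
  return lines.map(line => {
    const values = line.split(',');
    return Object.fromEntries(headers.map((h, i) => [h, values[i]]));
  });
}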
If you really wanted to do the conversion client side, then I would suggest creating an element that embeds a vaadin-upload and converts the files array to JSON before manually calling the uploadFiles method, as sketched below.
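A rough, untested sketch of that client-side approach, reusing the csvToJson helper above; the element id and target URL are placeholders, and whether replacing the files array with a plain File object works may depend on the vaadin-upload version:

<vaadin-upload id="csvUpload" target="https://example.com/upload" method="POST" no-auto accept="text/csv">
</vaadin-upload>

<script>
  const upload = document.querySelector('#csvUpload');
  let converting = false; // guard against the files-changed event we trigger ourselves

  upload.addEventListener('files-changed', () => {
    if (converting || !upload.files.length) return;
    const file = upload.files[0];
    if (file.type === 'application/json') return; // already converted

    converting = true;
    const reader = new FileReader();
    reader.onload = () => {
      const json = csvToJson(reader.result); // helper from the snippet above
      const jsonFile = new File([JSON.stringify(json)],
                                file.name.replace(/\.csv$/i, '.json'),
                                { type: 'application/json' });
      upload.files = [jsonFile]; // swap the CSV for the JSON payload
      upload.uploadFiles();      // start the upload manually
      converting = false;
    };
    reader.readAsText(file);
  });
</script>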
I'm new to this area. I have a file, originally in Excel (data from a sensor), that I want to upload into Azure and process with Stream Analytics. Since the supported data format is CSV, I'm thinking of saving the Excel file as CSV and uploading it to Blob Storage (or should I send it to an Event Hub instead?). However, Stream Analytics shows nothing in the output. The original file looks like the one below; does anyone know anything about this?
I'm busy building a website for a client using classic ASP (it will reside on an old server) which is going to be used internally only.
The admin is able to view a paginated table of data and export it to CSV. This works fine when I save the CSV data to a CSV file, but I have now been asked to avoid creating the file if possible and to build the CSV in memory instead.
I have my doubts that this is possible, but I might be completely wrong. Is there any way to send the CSV data to the browser so that it opens in Excel, rather than having to create a CSV file and link to it as I am currently doing?
TIA
John
Response.ContentType = "text/csv" will help you here. In the past I've paired that with a rewrite rule so that the URL is something like foo.com/example.csv, but there are plenty of other ideas to be found in the following question: Response Content type as CSV
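For completeness, a rough classic ASP sketch of that idea (untested, with placeholder headers and data); in the real page the Response.Write calls would loop over the same recordset that feeds the paginated table:

<%
' Stream CSV straight to the browser so Excel opens it, without writing a file to disk.
Response.ContentType = "text/csv"
Response.AddHeader "Content-Disposition", "attachment; filename=export.csv"

Response.Write "Name,Email" & vbCrLf
Response.Write "John,john@example.com" & vbCrLf

Response.End
%>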