React Native: Read CSV File

I want to upload records of students of a university using a CSV file. I have uploaded the CSV file using react-native-document-picker. The problem now is that I am unable to read the CSV data. My main goal is to upload the CSV data to Firebase. How do I read CSV data in React Native, or convert CSV to JSON?

You need to convert the CSV to JSON before pushing the data to Firebase. There are numerous utility libraries for that. You can try https://www.npmjs.com/package/csvtojson
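If you end up doing the conversion outside the app (for example in a small backend script), the conversion itself is only a few lines. A minimal Python sketch for illustration; the file names are placeholders:
import csv
import json

# Read each CSV row as a dict keyed by the header row.
with open("students.csv", newline="") as f:
    records = list(csv.DictReader(f))

# Write the records out as a JSON array, ready to push to Firebase.
with open("students.json", "w") as f:
    json.dump(records, f, indent=2)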

Related

How would I save a doc/docx/docm file into a directory or S3 bucket using PySpark

I am trying to save a DataFrame as a document, but it fails with the error below:
java.lang.ClassNotFoundException: Failed to find data source: docx. Please find packages at http://spark.apache.org/third-party-projects.html
My code is below:
#f_data is my dataframe with data
f_data.write.format("docx").save("dbfs:/FileStore/test/test.csv")
display(f_data)
Note that I can save files in CSV, text, and JSON formats, but is there any way to save a DOCX file using PySpark?
My question: is there support for saving data in DOC/DOCX format?
If not, is there any way to store the file, for example by writing a file stream object into a particular folder or S3 bucket?
In short: no, Spark does not support the DOCX format out of the box. You can still collect the data onto the driver node (e.g. into a pandas DataFrame) and work from there.
Long answer:
A document format like DOCX is meant for presenting information in small tables with style metadata. Spark focuses on processing large amounts of data at scale and does not support the DOCX format out of the box.
If you want to write DOCX files programmatically, you can:
Collect the data into a pandas DataFrame: pd_f_data = f_data.toPandas()
Use a Python package such as python-docx to create the DOCX document and save it into a stream (a sketch follows below). See this question: Writing a Python Pandas DataFrame to Word document
Upload the stream to a S3 blob using for example boto: Can you upload to S3 using a stream rather than a local file?
Note: if your data has more than a hundred rows, ask the recipients how they are going to use the data. Use DOCX for reporting only, not as a file transfer format.
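Putting those steps together, a minimal sketch, assuming python-docx and boto3 are installed; the bucket and object names (my-bucket, report.docx) are placeholders:
import io
import boto3
from docx import Document

# 1. Collect the Spark DataFrame onto the driver as a pandas DataFrame.
pd_f_data = f_data.toPandas()

# 2. Build a DOCX table with a header row plus one row per record.
doc = Document()
table = doc.add_table(rows=1, cols=len(pd_f_data.columns))
for cell, col in zip(table.rows[0].cells, pd_f_data.columns):
    cell.text = str(col)
for _, row in pd_f_data.iterrows():
    cells = table.add_row().cells
    for cell, value in zip(cells, row):
        cell.text = str(value)

# 3. Save the document into an in-memory stream and upload it to S3.
buffer = io.BytesIO()
doc.save(buffer)
buffer.seek(0)
boto3.client("s3").upload_fileobj(buffer, "my-bucket", "report.docx")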

Is there a way to export a collection in a Firestore database to a JSON or CSV file?

I have looked at the import/export documentation here: https://cloud.google.com/firestore/docs/manage-data/export-import but that only seems to export for use in other databases and in BigQuery. If I want to use the data in, say, Excel, I would need a CSV file.
There is nothing built into the Firestore UI or API for exporting to CSV or Excel, but you can of course use the API to read the data and write the CSV/XLS file yourself (a sketch follows below).
There are also some promising links in the results of searching for firestore export to csv, like this tutorial on Exporting Firestore Collection as CSV into Cloud Storage on Demand, the easy way and this tutorial on CSV Exports from Firestore.
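A minimal sketch of the read-it-yourself approach using the google-cloud-firestore Python client; the collection name students and the credentials setup are assumptions:
import csv
from google.cloud import firestore

# Assumes GOOGLE_APPLICATION_CREDENTIALS points at a service account key.
db = firestore.Client()

# Hypothetical collection name; replace with your own.
rows = [{**doc.to_dict(), "id": doc.id} for doc in db.collection("students").stream()]

# Derive the CSV header from the union of all field names.
fieldnames = sorted({key for row in rows for key in row})
with open("students.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=fieldnames)
    writer.writeheader()
    writer.writerows(rows)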

Moving a JSON file from Databricks to Blob Storage

I have created a mount in Databricks which connects to my Blob Storage, and I am able to read files from the blob into Databricks using a notebook.
I then converted a .txt file to JSON format using PySpark, and now I would like to load it back into Blob Storage. Does anyone know how I would do that?
Here are a few things I have tried:
my_json.write.option("header", "true").json("mnt/my_mount/file_name.json")
write.json(my_json, mnt/my_mount)
Neither works. I can load a CSV file from Databricks to blob storage using:
my_data_frame.write.option("header", "true").csv("mnt/my_mount_name/file name.csv")
This works fine, but I can't find a solution for moving a JSON file.
Any ideas?
Disclaimer: I am new to PySpark, but this is what I did after referencing the docs for pyspark.sql.DataFrameWriter.json:
# JSON (note the leading slash on the mount path; writes a directory of part files)
my_dataframe.write.json("/mnt/my_mount/my_json_file_name.json")
# For a single JSON part file, repartition to one partition first
my_dataframe.repartition(1).write.json("/mnt/my_mount/my_json_file_name.json")
# Parquet, overwriting any existing output and partitioning by a column
my_dataframe.write.mode("overwrite").partitionBy("myCol").parquet("/mnt/my_mount/my_parquet_file_name.parquet")
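One caveat: even with repartition(1), Spark writes a directory containing a single part-XXXX file rather than a file with the exact name you asked for. If you need a literal file_name.json, a sketch of one common workaround in a Databricks notebook (where dbutils is available; the paths are placeholders):
# Write one partition into a temporary directory, then copy the part file out.
out_dir = "/mnt/my_mount/tmp_json_out"
my_dataframe.repartition(1).write.mode("overwrite").json(out_dir)

# Locate the single part file Spark produced and copy it to a stable name.
part_file = [f.path for f in dbutils.fs.ls(out_dir) if f.name.startswith("part-")][0]
dbutils.fs.cp(part_file, "/mnt/my_mount/file_name.json")
dbutils.fs.rm(out_dir, True)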

Read a CSV file and store data in JSON form in Django

I have a CSV file in the media folder inside my Django project. I want to read the data from the CSV file and store it in JSON format, either directly in the database or by converting the CSV to JSON first and then storing it, and then be able to view it on an HTML page in my Django web application.
Since I don't know the data format in your CSV file, I assumed it has two columns. This reads your CSV file into a list of dictionaries, so you can do anything you want with it:
import pandas as pd

# Read the CSV; iterrows() yields (index, row) pairs.
# Unpacking the row into (item1, item2) assumes exactly two columns.
csv = pd.read_csv('./your_file.csv')
dataset = []
for i, (item1, item2) in csv.iterrows():
    dataset.append({"item1": item1, "item2": item2})
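To store and later display the result, a minimal sketch using Django's JSONField (available since Django 3.1); the model name StudentRecord is an assumption:
from django.db import models

class StudentRecord(models.Model):
    # One CSV row stored as a JSON document.
    data = models.JSONField()

# Somewhere in a view or management command:
# for row in dataset:
#     StudentRecord.objects.create(data=row)
# A template can then iterate over StudentRecord.objects.all() to render the rows.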

How to import CSV files into Firebase

I see that we can import JSON files into Firebase.
What I would like to know is whether there is a way to import CSV files (I have files that could have about 50K records or more, with about 10 columns).
Does it even make sense to have such files in Firebase?
I can't answer whether it makes sense to have such files in Firebase; only you can answer that.
I also had to upload CSV files to Firebase, and I finally transformed my CSV into JSON and used firebase-import to add my JSON to Firebase.
There are a lot of CSV-to-JSON converters (even online ones). You can pick the one you like most (I personally used node-csvtojson).
I've uploaded many files (tab-separated files, 40 MB each) into Firebase.
Here are the steps:
I wrote Java code to translate the TSV files into JSON.
I used firebase-import to upload them. To install it, just type in cmd:
npm install firebase-import
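If you would rather skip the Node tooling, an alternative is to push the converted JSON with the firebase_admin Python SDK. This is only a sketch; the service account key path, database URL, and file name are placeholders:
import json
import firebase_admin
from firebase_admin import credentials, db

# Hypothetical service account key and database URL; replace with your own.
cred = credentials.Certificate("service-account.json")
firebase_admin.initialize_app(cred, {"databaseURL": "https://your-project.firebaseio.com"})

with open("students.json") as f:
    data = json.load(f)

# Write the whole document under /students in the Realtime Database.
db.reference("students").set(data)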
One trick I used, on top of all the ones already mentioned, is to synchronize a Google spreadsheet with Firebase.
You create a script that uploads directly to the Firebase DB based on rows and columns. It worked quite well, and it can be more visual for fine-tuning the raw data compared to working with the CSV/JSON format directly.
Ref: https://www.sohamkamani.com/blog/2017/03/09/sync-data-between-google-sheets-and-firebase/
Here is the fastest way to import your CSV into Firestore:
Create an account in Jet Admin
Connect Firebase as a DataSource
Import CSV to Firestore
Ref: https://blog.jetadmin.io/how-to-import-csv-to-firestore-database-without-code/