Read a CSV file and store data in JSON form in Django - json

I have a CSV file in the media folder inside my Django project. I want to read the data from the CSV file, store it in JSON form (either directly in the database, or by converting the CSV to JSON first and then storing it), and view it on an HTML page in my Django web application.

Since I don't know the data format in your CSV file, I have assumed it has two columns.
The snippet below reads your CSV file into a list of dictionaries, which you can then use however you like:
import pandas as pd

csv = pd.read_csv('./your_file.csv')

# iterrows() yields (index, row); unpacking the row works because
# we assumed exactly two columns.
dataset = []
for i, (item1, item2) in csv.iterrows():
    dataset.append({"item1": item1, "item2": item2})
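If you would rather not depend on pandas, the standard library does the same CSV-to-list-of-dicts conversion. A minimal sketch, with an inline string standing in for the file under media/ (the column names are invented for the example):

```python
import csv
import io
import json

# Stands in for open('media/your_file.csv', newline='').
csv_text = "item1,item2\napple,10\nbanana,20\n"

# DictReader keys every row by the header line, so the whole
# file becomes a JSON-ready list of dictionaries.
dataset = list(csv.DictReader(io.StringIO(csv_text)))
payload = json.dumps(dataset)
```

From a Django view you could then store payload in a model's JSONField, or pass dataset straight to a template for rendering on your HTML page.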

Related

How would I save a doc/docx/docm file into a directory or S3 bucket using Pyspark

I am trying to save a data frame as a document, but it fails with the error below:
java.lang.ClassNotFoundException: Failed to find data source: docx. Please find packages at http://spark.apache.org/third-party-projects.html
My code is below:
#f_data is my dataframe with data
f_data.write.format("docx").save("dbfs:/FileStore/test/test.csv")
display(f_data)
Note that I can save files in CSV, text, and JSON format, but is there any way to save a DOCX file using PySpark?
My question here: do we have support for saving data in doc/docx format?
If not, is there any way to store the file, such as writing a file stream object into a particular folder/S3 bucket?
In short: no, Spark does not support the DOCX format out of the box. You can still collect the data onto the driver node (e.g. as a pandas DataFrame) and work from there.
Long answer:
A document format like DOCX is meant for presenting information in small tables with style metadata. Spark focuses on processing large amounts of data at scale and does not support the DOCX format out of the box.
If you want to write DOCX files programmatically, you can:
Collect the data into a pandas DataFrame: pd_f_data = f_data.toPandas()
Import a Python package to create the DOCX document and save it into a stream. See this question: Writing a Python Pandas DataFrame to Word document
Upload the stream to an S3 blob, for example with boto: Can you upload to S3 using a stream rather than a local file?
Note: if your data has more than a hundred rows, ask the recipients how they are going to use it. Use DOCX for reporting, not as a file-transfer format.
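The build-in-memory-then-upload step can be sketched without Spark at all: assemble the document in an in-memory stream and hand that stream to boto3. The CSV content below is a stand-in for whatever you generate; python-docx's Document.save() also accepts a file-like object, so the same pattern works for DOCX. The boto3 call is left commented out because it needs credentials, and the bucket name is made up:

```python
import csv
import io

# Build the document in memory instead of on local disk.
text_buffer = io.StringIO()
writer = csv.writer(text_buffer)
writer.writerow(["name", "score"])
writer.writerow(["alice", 1])

# boto3 expects a binary stream, so re-wrap the text as bytes.
payload = io.BytesIO(text_buffer.getvalue().encode("utf-8"))

# With boto3 installed and credentials configured:
# import boto3
# boto3.client("s3").upload_fileobj(payload, "my-bucket", "reports/report.csv")
```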

To extract table as a pandas dataframe or a csv from Json output of Azure form recognizer

This is my problem: I have a huge JSON extract as output from Azure Form Recognizer. What I need is to extract the two tables shown in the screenshot.
The JSON output file contains both objects extracted by Azure Form Recognizer (the JSON file and both PDFs are attached for your kind reference). I need to extract both tables into a pandas DataFrame, append them as one table, and then save the output as CSV. Could anyone please help in this regard?
JSON and PDF file link here (since there is no way to attach the file directly) --> https://drive.google.com/drive/folders/18gAPDuXsp8Td9WysoNcH_l1HoijOf8BK?usp=sharing
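A sketch of the usual approach, assuming the v2 result schema where tables live under analyzeResult.pageResults[].tables[] as a flat list of cells with rowIndex/columnIndex (verify against your actual JSON; the tiny result dict below is invented to stand in for your file):

```python
import json

# Hypothetical, minimal slice of a Form Recognizer result.
result = {
    "analyzeResult": {"pageResults": [
        {"tables": [{"rows": 2, "columns": 2, "cells": [
            {"rowIndex": 0, "columnIndex": 0, "text": "Item"},
            {"rowIndex": 0, "columnIndex": 1, "text": "Qty"},
            {"rowIndex": 1, "columnIndex": 0, "text": "Widget"},
            {"rowIndex": 1, "columnIndex": 1, "text": "3"},
        ]}]}
    ]}
}

def table_to_rows(table):
    """Place each cell at (rowIndex, columnIndex) in a 2-D grid."""
    grid = [[""] * table["columns"] for _ in range(table["rows"])]
    for cell in table["cells"]:
        grid[cell["rowIndex"]][cell["columnIndex"]] = cell["text"]
    return grid

# Appending every table's rows into one list merges the two tables.
all_rows = []
for page in result["analyzeResult"]["pageResults"]:
    for table in page["tables"]:
        all_rows.extend(table_to_rows(table))

# With pandas: pd.DataFrame(all_rows[1:], columns=all_rows[0]).to_csv("out.csv")
```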

React Native: Read CSV File

I want to upload records of students of a university using a CSV file. I have uploaded the CSV file using react-native-document-picker. Now the problem is that I am unable to read the CSV data. My main motive is to upload the CSV data to Firebase. How do I read CSV data in React Native, or convert CSV to JSON?
You need to convert the CSV to JSON before pushing data to Firebase. There are numerous utility libraries for that; you can try https://www.npmjs.com/package/csvtojson

How to save Django output in a JSON file?

I want to save Django output in a .json file. How should I do this?
I have GeoJSON data in my Django output, and I want to save that output in a .json file.
My output is already in JSON format; the only problem is that in Django the output is in array form (i.e. the output is enclosed in square brackets).
So I can't use that output directly; I think I should first save it to a JSON file and then use that.
Any idea how to do that?
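The square brackets are not actually a problem: a JSON array is valid JSON on its own, so the list can be dumped to a file as-is, or wrapped in a FeatureCollection if a GeoJSON consumer expects one. A minimal sketch (the feature below is made up; substitute your actual Django output):

```python
import json

# Hypothetical output: a list (JSON array) of GeoJSON features,
# standing in for what the Django view produces.
output = [
    {"type": "Feature",
     "geometry": {"type": "Point", "coordinates": [73.85, 18.52]},
     "properties": {"name": "Pune"}},
]

# Wrapping the array in a FeatureCollection makes it a complete
# GeoJSON document; json.dump writes it straight to a .json file.
geojson = {"type": "FeatureCollection", "features": output}

with open("output.json", "w") as fh:
    json.dump(geojson, fh, indent=2)
```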

Writing JSON data to a JSON file

Hello, in my application I am currently trying to create my own custom log files in .json format. The reason for this is that I want a well-structured, accurate log file that can be easily read and does not depend on special code in my application to read the data.
I have been able to create a json file: activities.json
I have been able to write and append to that file using File::append($path, $json)
This is a sample of the file contents:
{"controller":"TestController","function":"testFunction()","model":"Test","user":"Kayla","description":"Something happened!! Saving some JSON data here!","date":"2016-06-15"}
{"controller":"TestController","function":"testFunction()","model":"Test","user":"Jason","description":"Something happened!! Saving some JSON data here!","date":"2016-06-15"}
{"controller":"UserController","function":"userFunction()","model":"User","user":"Jason","description":"Another event occurred","date":"2016-06-15"}
Now my issue is that the above is not valid JSON. How do I get it into this format:
[
{"controller":"TestController","function":"testFunction()","model":"Test","user":"Kayla","description":"Something happened!! Saving some JSON data here!","date":"2016-06-15"},
{"controller":"TestController","function":"testFunction()","model":"Test","user":"Jason","description":"Something happened!! Saving some JSON data here!","date":"2016-06-15"},
{"controller":"UserController","function":"userFunction()","model":"User","user":"Jason","description":"Another event occurred","date":"2016-06-15"}
]
Is there a way of writing and appending to a JSON file in Laravel? As much as possible I want to avoid reading the entire file before appending, and avoid doing a search and replace, since the file may contain hundreds to thousands of records.
I will not use the default Laravel Log function.
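One pattern that avoids re-reading the file: keep it as a JSON array and, on each append, seek to the trailing ] and overwrite it with ,<record>]. The idea is language-agnostic; it is sketched below in Python (the file name and records are illustrative), and in PHP the same seek is fseek($fh, -1, SEEK_END) before fwrite:

```python
import json
import os

PATH = "activities.json"  # illustrative log path
if os.path.exists(PATH):
    os.remove(PATH)       # start fresh for the demo

def append_record(path, record):
    """Append one record to a JSON-array file without reading it back:
    overwrite the trailing ']' with ',<record>]' (or create the file)."""
    line = json.dumps(record).encode("utf-8")
    if not os.path.exists(path) or os.path.getsize(path) == 0:
        with open(path, "wb") as fh:
            fh.write(b"[" + line + b"]")
        return
    with open(path, "rb+") as fh:       # binary mode allows end-relative seeks
        fh.seek(-1, os.SEEK_END)        # position of the closing ']'
        fh.write(b"," + line + b"]")    # overwrite it and re-close the array

append_record(PATH, {"user": "Kayla", "description": "Something happened"})
append_record(PATH, {"user": "Jason", "description": "Another event occurred"})

with open(PATH) as fh:
    log = json.load(fh)  # the file stays valid JSON after every append
```

Because the file ends in a valid array after every write, any consumer can parse it at any time without the search-and-replace pass you want to avoid.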