I have a rather huge JSON file which I want to convert to an Excel file so that I can use it for feature selection.
Any suggestions on steps to import the JSON file and convert it to an Excel file with column names?
Thanks a lot!
I think you need to parse the JSON into key-value pairs using some programming language and then write those pairs to an Excel file programmatically; in my view that would be the easiest way.
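As a rough sketch, here's how that could look in Python with pandas (assuming a JSON array of flat objects, the openpyxl package for .xlsx output, and placeholder file names):

import pandas as pd

# Load a JSON array of objects; the keys become the column names.
df = pd.read_json("data.json")

# For nested JSON, pandas.json_normalize can flatten the records first.

# Write to Excel for feature selection (writing .xlsx needs openpyxl).
df.to_excel("data.xlsx", index=False)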
I have added a CSV file to a SharePoint Documents library.
I need to read that CSV file using Power Automate / Flow.
I have created a Power Automate flow. Below is a screenshot of the same.
Which CSV parser do I need to use to read the data from the Get file content action?
Can anyone help me with the same?
Thanks
If you want to retrieve the content of the CSV without a premium connector, you could use an expression to convert the $content property of the Get file content action into a string value. You can use the base64tostring function for this.
Below is an example
base64tostring(outputs('Get_file_content')?['body']['$content'])
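From there you can split the resulting string on line breaks (for example with the split function, using decodeUriComponent('%0A') to express a newline) and process each row yourself. Note that this is manual parsing rather than a dedicated CSV parser, so it assumes your fields don't contain embedded commas or line breaks.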
I need some guidance on how to proceed with a problem.
Our integration team receives XML files which are converted to JSON and sent to Pub/Sub. We then ingest the JSON files (or are supposed to) into BigQuery.
The problem is that the XML files do not include all possible objects or values all the time, so I can't create a correct schema in BQ to receive the JSON files. I have the XSD file, along with an extension file, which gives me all possible objects, but I don't know how to convert this into a correct BQ schema.
Do you have any suggestions on how to create a BQ schema from XSD files? I was thinking that if I create an XML file with dummy data (including all objects, and more than one element for repeated objects) with the help of the XSD, maybe that XML file could be converted to JSON and then run through BQ's schema auto-detection.
Any suggestions?
Thanks,
Cris
If you have the XSD schema files, you can convert these to a valid JSON schema. There are a few tools that can help you to accomplish this.
Keep in mind that these tools are general-purpose, not built for the particular case of BigQuery, so you'll have to tune the result to get a valid schema. For this, check the components of a BigQuery schema and, for quick reference, the sample provided in the documentation.
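As a rough sketch of the dummy-data idea from the question: in Python, the third-party xmlschema package can validate and decode an XML sample against the XSD and hand you a dict to dump as JSON for BQ's auto-detection (file names here are placeholders):

import json
import xmlschema  # third-party: pip install xmlschema

# Load the XSD and decode a dummy-data XML sample (covering all
# objects, including repeated ones) into a plain Python dict.
schema = xmlschema.XMLSchema("messages.xsd")
data = schema.to_dict("sample_all_fields.xml")

# BigQuery loads newline-delimited JSON, one object per line.
with open("sample.json", "w") as f:
    f.write(json.dumps(data) + "\n")

You could then load sample.json with bq load --autodetect and inspect the schema BigQuery infers.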
I want to convert JSON files to CSV in NiFi. We can achieve this in Python and other programming languages, and there are multiple articles on it. I have multiple JSON files, and each file has a different schema (one specific file will have one schema only). I can see there are templates to convert CSV to JSON and for other conversions, but I didn't see any template to convert JSON data to CSV. I have gone through the article https://community.hortonworks.com/articles/64069/converting-a-large-json-file-into-csv.html; however, there the schema is hard-coded. As I have multiple files and each file has a different schema, I can't hard-code the schema. Any suggestions, please?
Conversion between formats is typically done through ConvertRecord by plugging in the appropriate record reader and record writer, in this case a JSON reader and CSV writer.
To make use of the record processors you need to define Avro schemas for your data and put them in a schema registry; NiFi provides a local one.
There are lots of examples and posts out there about the record processors. This slide deck shows an example of CSV to JSON, but it would be easy to reverse for your scenario:
https://www.slideshare.net/BryanBende/apache-nifi-record-processing
This post has some other info:
https://bryanbende.com/development/2017/06/20/apache-nifi-records-and-schema-registries
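If you want to prototype the conversion outside NiFi first (the question mentions Python), a minimal sketch with pandas can flatten each JSON file to CSV without hard-coding a schema. The directory layout and the assumption that each file holds a JSON array or a single object are mine:

import glob
import json
import pandas as pd

# Flatten every JSON file in a folder to CSV; columns are inferred
# per file, so no schema is hard-coded anywhere.
for path in glob.glob("input/*.json"):
    with open(path) as f:
        data = json.load(f)
    records = data if isinstance(data, list) else [data]
    pd.json_normalize(records).to_csv(path.replace(".json", ".csv"), index=False)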
I have a 1.9GB tab-delimited file that is in the form of an XLSX file. I could write a script to convert it to CSV and then convert THAT to JSON, but I'm just curious if there is a more direct way to do this. Thanks! :)
Not sure if this is an option, but you could import it into some database (for example, Mongo) and then export it relatively easily from there.
I came across a similar problem recently, and what I found really easy was to go directly from XLSX to JSON using MATLAB.
IMPORTING XLSX: https://www.mathworks.com/help/matlab/ref/xlsread.html
EXPORTING JSON: https://www.mathworks.com/help/matlab/ref/jsonencode.html
It might take a little bit of time for such a large file, but I did it on a file about 400MB in size with no problem.
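If MATLAB isn't at hand, the same one-step route is possible in Python with pandas. A minimal sketch, assuming the openpyxl engine is installed and the sheet fits in memory (a real concern at 1.9GB), with placeholder file names:

import pandas as pd

# Read the XLSX sheet directly (reading .xlsx needs openpyxl)...
df = pd.read_excel("big_file.xlsx")

# ...and write it straight to JSON, one object per row.
df.to_json("big_file.json", orient="records")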
I'm able to parse JSON files in MFC but am having a hard time modifying the values. Is there an easier way of writing new values, other than converting the JSON to native types, modifying the contents, and converting it back to JSON again?
I thought it would be as easy as changing values in an XML file, where you just look for the tag and change its value.
Thanks...
You can use the JSON Spirit library. It traverses the JSON file through its keys and values, which are treated as a "pair". All you have to do is loop through the objects and search for the pair you want to replace. That's it...
The details aren't shown here, but this pretty much gives you the basics -> http://www.codeproject.com/KB/recipes/JSON_Spirit.aspx. It has a bunch of methods you could use for whatever operation you want.
:)
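For comparison, the same read-modify-write pattern looks like this with Python's standard json module (just an illustration of the general approach, not JSON Spirit itself; the file and key names are made up):

import json

# Read the document into native objects.
with open("config.json") as f:
    doc = json.load(f)

# Loop through the key/value pairs and replace the one you want.
for key in doc:
    if key == "timeout":  # made-up key for illustration
        doc[key] = 30

# Serialize back to JSON.
with open("config.json", "w") as f:
    json.dump(doc, f, indent=2)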