I have a 1.9GB tab-delimited file saved as an .xlsx file. I could write a script to convert it to CSV and then convert THAT to JSON, but I'm curious whether there is a more direct way to do this. Thanks! :)
Not sure if this is an option, but you could import it into a database (MongoDB, for example) and then export to JSON fairly easily from there.
I came across a similar problem recently, and what I found really easy was to go directly from XLSX to JSON using MATLAB.
IMPORTING XLSX: https://www.mathworks.com/help/matlab/ref/xlsread.html
EXPORTING JSON: https://www.mathworks.com/help/matlab/ref/jsonencode.html
It might take a little bit of time for such a large file, but I did it on a file about 400MB in size with no problem.
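If you'd rather not use MATLAB, the same direct XLSX-to-JSON conversion works in Python. A minimal sketch, assuming pandas (with openpyxl) is installed; the file names are placeholders:

    import pandas as pd

    # Read the first sheet of the workbook. For a 1.9GB file this
    # needs a machine with plenty of RAM, since pandas loads the
    # whole sheet into memory at once.
    df = pd.read_excel("data.xlsx")

    # Write one JSON object per row ("records" orientation).
    df.to_json("data.json", orient="records")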
I've exported a dataframe from R into a .csv file and then tried to open it in Tableau. What is the correct way to import these files? I've gone Connect to Data Source > To a File > Text File, then simply clicked on the CSV.
However, the columns and rows are all mixed up and I'm not sure what's gone wrong, as the files open in Numbers and Excel just fine!
Please see the incorrectly imported data (rem_posts.csv) and the correctly imported data (kylie_posts.csv).
It turned out to be an issue with some columns containing string data with commas in it, which was messing up the import in CSV format. I resolved it by exporting to Excel format instead.
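For anyone else who hits this: the root cause is unquoted commas inside string fields. Quoting every field on export avoids it entirely. A quick illustration in Python (the file name and data are made up):

    import csv

    rows = [
        ["id", "caption"],
        [1, "brunch, but make it fashion"],  # embedded comma
    ]

    with open("rem_posts.csv", "w", newline="") as f:
        # QUOTE_ALL wraps every field in quotes, so embedded commas
        # can no longer shift values into the wrong columns on import.
        writer = csv.writer(f, quoting=csv.QUOTE_ALL)
        writer.writerows(rows)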
Try using the Data Interpreter
You may also try using Tableau Prep to do some cleanup.
I am trying to convert XLSB files to CSV in NiFi. I am using the ConvertExcelToCSVProcessor at the moment, but it gives me an error and does not work. XLSB files are Excel binary files. I have googled a lot and tried to make this work, but in vain. Please help in this regard.
I just looked through our code base and checked up on POI. The long and short of it is that XLSB support in POI is fairly limited at this point, and the APIs that NiFi calls don't appear to support it. What you can try as a workaround for now is to find a Python library that supports XLSB, write a Python script that generates XLSX or CSV from it, and call that script with ExecuteStreamCommand.
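As a rough sketch of that workaround, assuming the pyxlsb library can read your files, the script could look something like this (called from ExecuteStreamCommand with the input path as an argument):

    import csv
    import sys

    from pyxlsb import open_workbook

    # Usage: python xlsb_to_csv.py input.xlsb > output.csv
    with open_workbook(sys.argv[1]) as wb:
        writer = csv.writer(sys.stdout)
        # Sheet indices in pyxlsb are 1-based.
        with wb.get_sheet(1) as sheet:
            for row in sheet.rows():
                # Each cell is a (row, col, value) tuple; keep the value.
                writer.writerow([cell.v for cell in row])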
I'm currently in the process of trying to export a Parse app into a MySQL database. The tables have a similar setup, so to make the process quicker I was wondering if anyone has figured out an easy way to export Parse data in a file format that phpMyAdmin will accept for importing data (CSV, XLS, etc.).
I know Parse exports to JSON, and I have found several posts about exporting to other file formats, but most are fairly old (a few years at least), so I was just wondering if anyone has found a way to do this since?
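One route that doesn't depend on old tooling is converting the JSON export yourself and importing the resulting CSV through phpMyAdmin. A hedged sketch in Python; it assumes each exported class is a JSON file with its objects under a "results" key (check your dump), and nested Parse types like pointers and dates would still need flattening first:

    import csv
    import json

    # Load one exported class (adjust the file name to your dump).
    with open("MyClass.json") as f:
        rows = json.load(f)["results"]

    # Take the union of keys, since not every object has every field.
    fieldnames = sorted({key for row in rows for key in row})

    with open("MyClass.csv", "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=fieldnames)
        writer.writeheader()
        writer.writerows(rows)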
I want to create .csv files with the Report Generation Toolkit in LabVIEW.
They must actually be .csv files which can be opened with Notepad or something similar.
Creating a .csv is not that hard; it's just a matter of adding the extension to the file name that's going to be created.
If I create a .csv file this way it opens nicely in Excel just the way it should, but if I open it in Notepad it shows all kinds of characters and doesn't even come close to the data I wrote to the file.
I create the files with the LabVIEW code below:
Link to image (can't post the image yet because I've got too few points)
I know .csv files can be created with the Write to Spreadsheet VI but I would like to use the Report Generation Toolkit because it's pretty easy to add columns and rows to the file and that is something I really need.
You can use the Robust CSV package on the lavag.org forum to read and write 2D arrays to CSV files.
http://lavag.org/files/file/239-robust-csv/
Calling a file "csv" does not make it a CSV file. I never used the toolkit to generate an Excel file, but I'm assuming it creates an XLS or XLSX file, regardless of what extension you give it, which is why you're seeing gibberish (probably XLS, since it's been around for a while and I believe XLSX is XML, not binary).
I'm not sure what your problem is with the Write to Spreadsheet VI. It has an append input, so I assume you can use that to at least add rows directly to a file, although I can't say I've ever tried it. I would prefer handling all the data in memory explicitly, where you can easily use the array functions to add rows or columns to the array and then overwrite the entire file.
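One way to confirm what the toolkit actually wrote is to look at the first few bytes of the file: legacy XLS files start with the OLE2 signature D0 CF 11 E0, while XLSX files are ZIP archives that start with "PK". A quick check in Python (the file name is a placeholder):

    # Print what the file's magic bytes say about its real format.
    with open("report.csv", "rb") as f:
        magic = f.read(4)

    if magic == b"\xd0\xcf\x11\xe0":
        print("OLE2 container - a legacy XLS file")
    elif magic[:2] == b"PK":
        print("ZIP container - an XLSX file")
    else:
        print("Possibly plain text; first bytes:", magic)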
I've been trying to import a couple of .json files into LibreOffice Calc.
Although I can get the raw data in, it isn't being split up as I would expect (with different pieces of info placed in separate cells).
Does LibreOffice provide support for importing JSON files and sorting the data out into cells (in other words, import + parse)?
If there doesn't seem to be direct support for this, would converting to CSV be the next logical step in order to get the data into Calc?
Had the same problem myself (that's how I found this question).
So, for the next person finding this - the answer is no - LibreOffice Calc does not support direct import of JSON.
And the next logical step indeed is converting to CSV. There are free online JSON to CSV converters, and using one of them (http://www.convertcsv.com/json-to-csv.htm), I was easily able to make a correct CSV which Calc imports without a problem.
One possible caveat is if you have complex objects represented in JSON - that may not be convertible to CSV, but then again, if it doesn't fit into CSV, it probably doesn't fit into spreadsheet format either.
There's a LibreOffice GetRest plugin (its documentation is in broken English) that has a "parseJSON" formula. It won't convert JSON to CSV (without a lot of grunt work), but it might help your use case.
If you can run Python scripts in LibreOffice Calc, it should be possible using the approach described here: http://blog.appliedinformaticsinc.com/how-to-parse-and-convert-json-to-csv-using-python/
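For the common case of a flat JSON array of objects, that conversion boils down to a few lines. A minimal sketch with placeholder file names; the resulting CSV then opens in Calc through the normal text import dialog:

    import csv
    import json

    # Expects a list of flat objects, e.g. [{"a": 1, "b": 2}, ...]
    with open("data.json") as f:
        records = json.load(f)

    with open("data.csv", "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=list(records[0].keys()))
        writer.writeheader()
        writer.writerows(records)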