How to read a CSV file using Power Automate?

I have added a CSV file to the SharePoint Documents library.
I need to read that CSV file using Power Automate / Flow.
I have created a Power Automate flow; below is a screenshot of the same.
Which CSV parser do I need to use to read the data from the file content action?
Can anyone help me with this?
Thanks

If you want to retrieve the content of the CSV without a premium connector, you could use an expression to convert the $content property of the Get File Content action into a string value. You can use the base64tostring function for this; from there, the split function can be used on line breaks and commas to break the text down into rows and values.
Below is an example:
base64tostring(outputs('Get_file_content')?['body']['$content'])

Related

Best data processing software to parse CSV file and make API call per row

I'm looking for ideas for an open-source ETL or data processing tool that can monitor a folder for CSV files, then open and parse the CSV.
For each CSV row, the software will transform the row into JSON and make an API call to start a Camunda BPM process, passing the cell data as variables into the process.
Looking for ideas,
Thanks
You can use a Java WatchService or Spring FileSystemWatcher as discussed here with examples:
How to monitor folder/directory in spring?
referencing also:
https://www.baeldung.com/java-nio2-watchservice
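For illustration, a minimal WatchService sketch (the inbox folder path and the handling of the detected file are my own placeholders, not part of the linked answers) could look like this:

import java.nio.file.*;

public class CsvFolderWatcher {
    public static void main(String[] args) throws Exception {
        Path folder = Paths.get("/data/csv-inbox");   // assumed inbox folder
        WatchService watchService = FileSystems.getDefault().newWatchService();
        folder.register(watchService, StandardWatchEventKinds.ENTRY_CREATE);

        while (true) {
            WatchKey key = watchService.take();        // blocks until an event arrives
            for (WatchEvent<?> event : key.pollEvents()) {
                Path created = folder.resolve((Path) event.context());
                if (created.toString().endsWith(".csv")) {
                    // hand the new file over to the CSV parsing / process-start logic
                    System.out.println("New CSV detected: " + created);
                }
            }
            key.reset();
        }
    }
}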
Once you have picked up the CSV you can use my example here as inspiration or extend it: https://github.com/rob2universe/csv-process-starter specifically
https://github.com/rob2universe/csv-process-starter/blob/main/src/main/java/com/camunda/example/service/CsvConverter.java#L48
The example starts a configurable process for every row in the CSV and includes the content of the row as JSON process data.
I wanted to limit the dependencies of this example, so the CSV parsing logic applied is very simple: commas inside field values may break the example, and special characters may not be handled correctly. A more robust implementation could replace the simple Java String.split(",") with an existing CSV parser library such as OpenCSV, as in the sketch below.
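A rough OpenCSV-based sketch (my own illustration, not part of the linked project; the file name is a placeholder and you would hand each row to the process-start logic instead of printing it) might look like this:

import com.opencsv.CSVReader;
import java.io.FileReader;

public class OpenCsvRowReader {
    public static void main(String[] args) throws Exception {
        try (CSVReader reader = new CSVReader(new FileReader("orders.csv"))) {
            String[] row;
            while ((row = reader.readNext()) != null) {
                // each row arrives as an array of cell values; quoting and embedded commas are handled
                System.out.println(String.join(" | ", row));
            }
        }
    }
}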
The file watcher would actually be a nice extension to the example. I may add it when I get around to it, but would also accept a pull request in case you fork my project.

Convert JSON to CSV using Microsoft Flow

I am trying to parse JSON data from an API into Flow, convert it into a CSV and then output the CSV to my Google Drive.
The API I am trying to work with is located here:
https://www.binance.com/api/v1/klines?symbol=BNBBTC&interval=1h&limit=24
Is this possible using Microsoft Flow? I have tried various things without much success.
Thanks in advance.
I'd say it is possible. What have you tried so far?
First you have to get the response body. Then extract the "meat" from each element, which has to be done with flow expressions such as "body(response_body)[0]", depending on the format. Then feed all of these data parts into a newly created Excel file.

JMeter read the second sheet of CSV

How can I make JMeter read the second sheet of my CSV?
I want to use CSV Data Set Config.
Normally, it reads the first line of the first sheet, but is there any way to be a bit more flexible?
The CSV file format doesn't have "sheets"; it is a normal plain-text file that uses delimiters to represent structured data.
If you are trying to get data from, for example, a Microsoft Excel file, unfortunately you won't be able to do it using the CSV Data Set Config. The easiest option would be exporting the data as separate plain-text CSV files.
If you don't have the possibility to do the export, you can still access the data in Excel files, but it will be a little bit more tricky, as you will have to use JSR223 Test Elements, the Groovy language and the Apache POI libraries (see the sketch after the links below).
More information:
Busy Developers' Guide to HSSF and XSSF Features
How to Extract Data From Files With JMeter
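As an illustration only (not taken from the linked guides), a minimal Java-style JSR223 script using Apache POI to read the second sheet might look like the following; it assumes the poi and poi-ooxml jars are in JMeter's lib folder, and the file path and variable name are placeholders:

import org.apache.poi.ss.usermodel.Cell;
import org.apache.poi.ss.usermodel.Row;
import org.apache.poi.ss.usermodel.Sheet;
import org.apache.poi.ss.usermodel.Workbook;
import org.apache.poi.ss.usermodel.WorkbookFactory;
import java.io.FileInputStream;

FileInputStream fis = new FileInputStream("path/to/testdata.xlsx"); // placeholder path
Workbook workbook = WorkbookFactory.create(fis);
Sheet secondSheet = workbook.getSheetAt(1);    // sheets are zero-indexed, so 1 = the second sheet
Row firstRow = secondSheet.getRow(0);
Cell firstCell = firstRow.getCell(0);
vars.put("cellValue", firstCell.toString());   // expose the value as a JMeter variable
workbook.close();
fis.close();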
Currently you can't use the CSV Data Set Config for that; you should add external code, for example using Apache Commons CSV.
Download the jar file and place it in the JMETER_HOME/lib folder, and then write the code in a JSR223 element.
Examples exist; for instance, code to get the second record:
import org.apache.commons.csv.CSVFormat;
import org.apache.commons.csv.CSVRecord;
import java.io.FileReader;
import java.util.Iterator;

Iterator<CSVRecord> records = CSVFormat.RFC4180.parse(new FileReader("path/to/file.csv")).iterator();
records.next();                          // skip the first record
CSVRecord secondRecord = records.next(); // the second record
String columnOne = secondRecord.get(0);  // first column of the second record

Pentaho Data Integration - Multiple CSV File Inputs

I've been using Pentaho Data Integration lately and currently I intend to use it for a project I'm in. The help I'm looking for is the following:
There can be a variable number of CSV file inputs in a folder.
Is there a way to get all the .csv files (which step or series of steps) using Pentaho?
After this step I believe what I have to do is pretty simple, as I only have to merge those files together.
Thanks
Use the Text File Input step. It allows you to read files from a folder using a regular expression and can handle CSV files.
Add the "Get File Names" step before the "CSV file input" step. When the CSV step has input, a field appears in the configuration dialog allowing you to get the filename from the incoming stream.

How do I download the contents of an HTML table generated by a Play 1.2.7 Java backend as XLS?

I've generated a table using Play's #{list} tag and get pretty decent results. Now I need to be able to generate and download an XLS version of the table and have no idea what to do. Any pointers at all will be much appreciated.
Well, you have various options.
Excel will open HTML files, so instead of rendering your table as HTML in the page you can stream it to the browser and set the content type to XLS.
While Excel will open it, it will still be an HTML file rather than an XLS(X) document.
You can generate a CSV from your data model and stream this to the browser. Again, this will be a CSV rather than a proper XLS(X) document.
There also seem to be some solutions around which can do it using JavaScript. See as a starting point: Generate excel sheet from html tables using jquery
Finally, you can use something like Apache POI or JXLS to generate a 'proper' XLS(X) document and stream this to the browser. I have some code here that will export HTML to a 'proper' XLSX file if this is the route you wish to go. The workflow is then to create some HTML from your data model and use this to convert to Excel rather than having to programmatically build the Excel document using POI. https://github.com/alanhay/html-exporter
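To give a rough idea of the POI route, here is a minimal sketch (my own illustration, not from the linked project; the sheet name, data structure and the Play-side streaming are assumptions) that builds an .xlsx in memory so a controller can stream the bytes to the browser:

import org.apache.poi.ss.usermodel.Row;
import org.apache.poi.ss.usermodel.Sheet;
import org.apache.poi.ss.usermodel.Workbook;
import org.apache.poi.xssf.usermodel.XSSFWorkbook;
import java.io.ByteArrayOutputStream;

public class TableXlsxExport {
    // Builds a workbook from a simple 2D string array and returns it as bytes.
    public static byte[] export(String[][] tableData) throws Exception {
        try (Workbook workbook = new XSSFWorkbook();
             ByteArrayOutputStream out = new ByteArrayOutputStream()) {
            Sheet sheet = workbook.createSheet("Export");    // assumed sheet name
            for (int r = 0; r < tableData.length; r++) {
                Row row = sheet.createRow(r);
                for (int c = 0; c < tableData[r].length; c++) {
                    row.createCell(c).setCellValue(tableData[r][c]);
                }
            }
            workbook.write(out);
            return out.toByteArray();
        }
    }
}

In a Play 1.x controller you could then, if I remember the API correctly, set response.contentType to the XLSX MIME type and pass the bytes to renderBinary() together with a file name such as "table.xlsx".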
Finally you can can use something like Apache POI or JXLS to generate a 'proper' xls(x) document and stream this to the browser. I have some code here that will export HTML to 'proper' xlsx file if this is the route you wish to go. Workflow is then to create some HTML from your data model and use this to convert to Excel rather than having to programmatically build the Excel document using POI. https://github.com/alanhay/html-exporter