I am creating a UI that allows users to input data in batches.
Similar to an Excel spreadsheet but the columns are already fixed.
I'm already using PrimeNG DataTable for other parts of my web app, and I would like to continue using it for the aforementioned feature if possible.
Thanks.
I figured it out.
My initial code was correct, but there was an issue when the backing array of the data table was updated via push.
I had to create a separate array with the empty objects and assign it to the backing array.
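A minimal sketch of that fix, assuming an Angular component with a PrimeNG table bound to a `rows` array (the component, row shape, and field names here are illustrative, not from the original code):

```typescript
import { Component } from '@angular/core';

// Illustrative row shape for the batch-entry table.
interface BatchRow {
  name: string;
  qty: number | null;
}

@Component({
  selector: 'app-batch-entry',
  template: `<p-table [value]="rows"><!-- column templates --></p-table>`,
})
export class BatchEntryComponent {
  rows: BatchRow[] = [];

  addEmptyRows(count: number): void {
    const empty = Array.from({ length: count }, (): BatchRow => ({ name: '', qty: null }));
    // Assign a new array instead of calling this.rows.push(...), so the
    // table sees a changed reference and renders the new empty rows.
    this.rows = [...this.rows, ...empty];
  }
}
```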
I have an SFTP trigger in a logic app which fires when a file is added to a certain file area. It is a CSV-formatted file and I want the rows to be parsed and converted into JSON. What is the best way to convert CSV data into JSON without using any custom connectors?
I cannot find any built-in connectors that do this job. And as far as I know, there are no Logic Apps functions that do it either.
Right now, there is no connector/action in Logic Apps that provides an out-of-the-box solution for this requirement. You could loop through the array and build the result yourself, but I would not suggest using loop and variable actions, as they can be slow and cost more.
The alternative is to use the Inline Code action (JavaScript) to do the conversion. Please note that you will need an Integration Account to run inline code.
Please refer to the JavaScript code below and modify it according to your needs. I have used '_' to differentiate nested objects. For more details you can refer to the previous discussion here.
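A minimal sketch of what such inline code could look like, assuming a flat comma-separated file with a header row and the '_' convention for nesting (the function and column handling are illustrative; the Inline Code action runs plain JavaScript, so the type annotations would be dropped there):

```typescript
// Convert CSV text into an array of objects, turning underscore-separated
// header names (e.g. "address_city") into nested objects.
function csvToJson(csv: string): Record<string, any>[] {
  const [headerLine, ...lines] = csv.trim().split(/\r?\n/);
  const headers = headerLine.split(',');

  return lines.map(line => {
    const cells = line.split(',');
    const row: Record<string, any> = {};
    headers.forEach((header, i) => {
      // "address_city" becomes { address: { city: <value> } }
      const path = header.trim().split('_');
      let target = row;
      for (const key of path.slice(0, -1)) {
        target = target[key] = target[key] ?? {};
      }
      target[path[path.length - 1]] = cells[i]?.trim();
    });
    return row;
  });
}
```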
For complex transformations you can offload this functionality to an Azure Function, write your code in one of the supported languages, and call the function from the logic app.
1. Created the logic app.
2. Created a container in a storage account and uploaded a CSV file to the container.
3. Next, used a Compose action to split the contents of the CSV file on every new line into an array.
a. Here is the expression used in the SplitLines compose action:
split(body('Get_blob_content_(V2)'),decodeUriComponent('%0D%0A'))
b. Refer to the MS Docs on writing expressions (see the reference at the end).
4. Removed the last (empty) line from the previous output using another compose action:
take(outputs('SplitLines'),add(length(outputs('SplitLines')),-1))
5. Separated the field names using a compose action:
split(first(outputs('SplitLines')), ',')
6. Formed the JSON as shown below using a Select action:
**From**: `skip(outputs('RemoveLastLine'), 1)`
**Map**:
`outputs('SplitFieldName')[0]` -> `split(item(), ',')?[0]`
`outputs('SplitFieldName')[1]` -> `split(item(), ',')?[1]`
Tested the logic app and it runs successfully. The content of the CSV file and the resulting JSON output are illustrated below.
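For example, with hypothetical file contents like this (standing in for the screenshots in the original post):

```
Name,Age
John,30
Jane,25
```

the Select action above produces JSON of the form:

```json
[
  { "Name": "John", "Age": "30" },
  { "Name": "Jane", "Age": "25" }
]
```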
Reference: Use data operations in Power Automate (contains video) — Power Automate | Microsoft Docs
Credit: #Iason Koulas
I am working on a GUI (PyQt6) where the user loads some files and adds metadata for each. So for each file, I would like to have a window (QDialog) which is essentially a form with fields which are specified by a json schema.
I am new to PyQt, and I can code the mechanism to open a form (a QDialog with QLabels and QLineEdits, which essentially looks like a form; see image). However, I would like to automate this step using the schema that's stored in a JSON file (consider any example JSON schema).
Is there already a Python library for this and I'm absolutely oblivious to it?
How would you go about coding this? Thanks!
I have a JSON file with more than 200000 records in it.
I'm using the Angular CLI and accessing the JSON data locally through a json-server API.
I'm using an Angular HTTP request to fetch the JSON; it gets displayed in the browser but not in the Angular table component.
I checked whether it works with a smaller number of records, and then it does get displayed; I have verified up to 1000 records, but I don't know the maximum it can handle.
So I just want a solution for displaying large JSON data in an Angular view efficiently.
Please suggest any solution or alternative for the same.
This may be an opinionated answer, but you have many options.
Try lazy loading table data.
Implement infinite scroll by loading 50 rows initially and, when the user reaches the end of the table, loading the next 50, and so on. Use the following plugin to minimize your code (a sketch follows the link).
ngInfiniteScroll
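A minimal sketch of the infinite-scroll option, assuming the ngx-infinite-scroll package (the Angular version of ngInfiniteScroll) with its module imported, and an `allRecords` array already fetched from json-server; all names here are illustrative:

```typescript
import { Component, OnInit } from '@angular/core';

@Component({
  selector: 'app-records',
  template: `
    <div class="viewport" infiniteScroll [infiniteScrollDistance]="2" (scrolled)="loadNextChunk()">
      <table>
        <tr *ngFor="let row of visibleRecords"><td>{{ row.name }}</td></tr>
      </table>
    </div>
  `,
})
export class RecordsComponent implements OnInit {
  allRecords: any[] = [];     // the full payload, fetched once over HTTP
  visibleRecords: any[] = []; // only the rows actually rendered
  private readonly chunkSize = 50;

  ngOnInit(): void {
    this.loadNextChunk(); // render the first 50 rows
  }

  loadNextChunk(): void {
    const start = this.visibleRecords.length;
    const next = this.allRecords.slice(start, start + this.chunkSize);
    this.visibleRecords = [...this.visibleRecords, ...next];
  }
}
```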
Add pagination to the table by using a custom filter on ngFor to slice elements by their index number. You can use the following awesome plugin library (see the sketch after the link).
ngx-pagination
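A minimal sketch of the pagination option, assuming ngx-pagination's `NgxPaginationModule` is imported in your module; the `records` array and page size are illustrative:

```typescript
import { Component } from '@angular/core';

@Component({
  selector: 'app-paged-records',
  template: `
    <table>
      <tr *ngFor="let row of records | paginate: { itemsPerPage: 50, currentPage: page }">
        <td>{{ row.name }}</td>
      </tr>
    </table>
    <pagination-controls (pageChange)="page = $event"></pagination-controls>
  `,
})
export class PagedRecordsComponent {
  records: any[] = []; // the full dataset; the paginate pipe renders only one page
  page = 1;            // current page, updated by the pagination controls
}
```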
Make use of grids. This is the best-performing, most reliable solution. There are many grids, like ag-grid, SlickGrid, Angular DataTables, etc.; pick one according to your requirements. ag-grid has Angular support, and SlickGrid is a pure JS grid, but you can certainly configure it inside an Angular project. You can also try the data table from PrimeNG (a lazy-loading sketch follows the links below).
With grids, searching, filtering, and exporting table data become very efficient.
ag-grid
Angular DataTables
SlickGrid
PrimeNG Table
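Since the question already involves json-server, a lazy PrimeNG table pairs well with it: json-server supports `_start` and `_limit` query parameters, so each page can be fetched on demand instead of loading all 200000 records at once. A minimal sketch, with the URL, row shape, and total count as illustrative assumptions:

```typescript
import { Component } from '@angular/core';
import { HttpClient } from '@angular/common/http';
import { LazyLoadEvent } from 'primeng/api';

@Component({
  selector: 'app-lazy-records',
  template: `
    <p-table [value]="rows" [lazy]="true" [paginator]="true" [rows]="50"
             [totalRecords]="totalRecords" (onLazyLoad)="loadPage($event)">
      <!-- column templates -->
    </p-table>
  `,
})
export class LazyRecordsComponent {
  rows: any[] = [];
  totalRecords = 200000; // illustrative; ideally read from the server

  constructor(private http: HttpClient) {}

  loadPage(event: LazyLoadEvent): void {
    const first = event.first ?? 0;
    const limit = event.rows ?? 50;
    // Only one page of records travels over HTTP per interaction.
    this.http
      .get<any[]>(`http://localhost:3000/records?_start=${first}&_limit=${limit}`)
      .subscribe(page => (this.rows = page));
  }
}
```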
There is a custom implementation in dhtmlx gantt for upload from MPP/XML, which goes to their servlet and renders the gantt. Has anyone tried to build a custom CSV upload, or are there any third parties available to load CSV into the gantt?
https://dhtmlx.com/blog/export-import-ms-project-dhtmlx-gantt-chart/
There is no such solution from DHTMLX (FYI, I work for DHTMLX), and I'm not aware of any third-party service or ready-to-use solution that could be used for development.
At the code level, importing CSV into gantt breaks down into three steps:
parsing the CSV into an array of objects
mapping the CSV columns to properties of those objects (the mandatory properties of gantt tasks: text/start_date/duration/parent)
and inserting the result into the database.
The first step is trivial. Mapping columns may require implementing some sort of UI so the user can specify which CSV columns map to which gantt fields.
For inspiration, you can check how it's done at https://app.ganttpro.com/ (requires registration, but you can create a free account using a Google or Facebook account): create a new project ("+ CREATE NEW" in the left-hand menu), select "Import from", and try uploading a CSV file to see what the UI looks like.
As for the last step, inserting the parsed records into the database, you'll need to do some coding to insert tasks without losing the project hierarchy (task.parent -> task.id relations, given that the database ids of your items will likely change after inserting), but overall it shouldn't be very difficult. A sketch of the whole flow follows.
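A minimal sketch of those three steps, assuming a simple comma-separated file with a header row, columns already named after the gantt task properties, and a hypothetical `insertTask` callback that performs the database insert and returns the new id:

```typescript
// Shape of one parsed CSV row; ids are the values as they appear in the file.
interface CsvTask {
  id: string;
  text: string;
  start_date: string;
  duration: number;
  parent: string;
}

// Steps 1 and 2: parse the CSV and map columns to task properties.
function parseCsv(csv: string): CsvTask[] {
  const [header, ...lines] = csv.trim().split(/\r?\n/);
  const cols = header.split(',').map(c => c.trim());
  return lines.map(line => {
    const cells = line.split(',');
    const rec: Record<string, any> = {};
    cols.forEach((col, i) => (rec[col] = cells[i]?.trim()));
    rec.duration = Number(rec.duration);
    return rec as CsvTask;
  });
}

// Step 3: insert rows while remapping csv ids to the ids the database assigns,
// so task.parent -> task.id relations survive. Assumes parents precede children.
async function importTasks(
  tasks: CsvTask[],
  insertTask: (t: CsvTask) => Promise<number> // hypothetical backend call
): Promise<void> {
  const idMap = new Map<string, number>(); // csv id -> database id
  for (const task of tasks) {
    const dbParent = idMap.get(task.parent) ?? 0; // 0 = root-level task in gantt
    const newId = await insertTask({ ...task, parent: String(dbParent) });
    idMap.set(task.id, newId);
  }
}
```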
If you're looking for something more specific, please update your question.
I currently have a chat bot that has an entity for each stock symbol. There are over 3,000. For my dialog I want to be able to detect questions like #get #price #stockSymbol. Is there a way to deal with a large number of entities without writing an if statement for each one?
You are only allowed to have 100 entities in a single workspace. However, those entities can have 100,000 values.
So you could create an entity called #StockSymbol, and then each value would be the stock identifier (e.g. IBM).
So you would only need one IF statement to determine it is a stock, then pass back the entity information to your calling application to take action on the value.
To put this in programmatically, if it is a one-time thing, you can create a CSV file like the following:
StockSymbol,IBM
StockSymbol,MSFT
StockSymbol,AAPL
and so on. Then import that entity file. Alternatively you can use the workspace API to update an already deployed workspace.
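If the symbol list already exists somewhere, a short script can generate that CSV rather than writing 3,000 lines by hand; a minimal sketch, with the symbol list and file name as illustrative assumptions:

```typescript
import { writeFileSync } from 'fs';

// Illustrative symbol list; in practice this would come from your own data source.
const symbols = ['IBM', 'MSFT', 'AAPL'];

// One "EntityName,value" line per symbol, matching the import format above.
const csv = symbols.map(s => `StockSymbol,${s}`).join('\n');
writeFileSync('stock-entities.csv', csv);
```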
I am sorry to say there is no process within the Conversation Service UI that has an automatic dialog creation method. In cases like this, many teams create an external script that can read a file with your entities in it and then create a workspace JSON file with the required dialog nodes. The workspace JSON file is a relatively simple format, and I have found you can easily merge any new JSON file into an already created workspace. In fact, with the new APIs it's even possible to load the new elements into a running workspace. (Although if you're new to this, create a duplicate workspace and merge into that one, or download and merge via a good editor.) A sketch of a generated node follows.
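As a rough illustration of what such a script might emit, here is one node in the shape used by the workspace JSON's dialog_nodes array; the node name, condition, and response text are assumptions for this example, not the exact schema:

```typescript
// One generated dialog node: fires when the intents and the stock entity match.
const stockPriceNode = {
  dialog_node: 'get_stock_price',                   // unique node id
  conditions: '#get and #price and @StockSymbol',   // matches "get price for <symbol>"
  output: { text: 'Fetching the price for @StockSymbol...' },
  parent: null,
  previous_sibling: null,
};
```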