Watson Conversation dialog for a large number of entities?

I currently have a chat bot that has an entity for each stock symbol. There are over 3,000. For my dialog I want to be able to detect questions like #get #price #stockSymbol. Is there a way to deal with a large number of entities without writing an if statement for each one?

You are only allowed to have 100 entities in a single workspace. However, each of those entities can have up to 100,000 values.
So you could create an entity called @StockSymbol, where each value is a stock identifier (e.g. IBM).
You would then only need one IF statement to determine that it is a stock, and you can pass the entity information back to your calling application to act on the value.
To do this programmatically, if it is a one-time thing you can create a CSV file like the following:
StockSymbol,IBM
StockSymbol,MSFT
StockSymbol,AAPL
and so on, then import that entity file. Alternatively, you can use the workspace API to update an already-deployed workspace; for example:
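Below is a minimal Node.js sketch of that API route; the endpoint URL, version date, and Basic-auth credentials are assumptions that you should verify against your own service instance:

```javascript
// Hedged sketch: create the @StockSymbol entity with its values on a deployed
// workspace via the Watson Conversation v1 API (Node.js 18+, built-in fetch).
const WORKSPACE_ID = 'your-workspace-id';  // assumption: fill in your own
const USERNAME = 'service-username';       // assumption
const PASSWORD = 'service-password';       // assumption
const symbols = ['IBM', 'MSFT', 'AAPL'];   // in practice, read all 3,000+ from a file

const url = `https://gateway.watsonplatform.net/conversation/api/v1/workspaces/${WORKSPACE_ID}/entities?version=2017-05-26`;

fetch(url, {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    'Authorization': 'Basic ' + Buffer.from(`${USERNAME}:${PASSWORD}`).toString('base64')
  },
  body: JSON.stringify({
    entity: 'StockSymbol',
    values: symbols.map(s => ({ value: s }))
  })
})
  .then(res => res.json())
  .then(console.log)
  .catch(console.error);
```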

I am sorry to say there is no automatic dialog creation feature in the Conversation Service UI. In cases like this, many teams create an external script that reads a file containing your entities and generates a workspace JSON file with the required dialog nodes. The workspace JSON file is a relatively simple format, and I have found you can easily merge a new JSON file into an already-created workspace. In fact, with the new APIs it is even possible to load the new elements into a running workspace (although if you are new to this, create a duplicate workspace and merge into that one, or download the workspace and merge via a good editor).
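As a rough illustration, a script along these lines could generate the nodes and merge them into an exported workspace file; the file names are hypothetical, and the node shape should be checked against a workspace you have exported yourself:

```javascript
// Hedged sketch: generate dialog nodes from a list of entity values and merge
// them into an exported workspace JSON file (Node.js).
const fs = require('fs');

// symbols.txt: one stock symbol per line (hypothetical input file)
const symbols = fs.readFileSync('symbols.txt', 'utf8').split(/\r?\n/).filter(Boolean);

const dialogNodes = symbols.map((symbol, i) => ({
  dialog_node: `stock_${symbol}`,                      // unique node id
  conditions: `@StockSymbol:${symbol}`,                // fires when this value is detected
  output: { text: `Looking up the price for ${symbol}...` },
  parent: null,
  previous_sibling: i > 0 ? `stock_${symbols[i - 1]}` : null
}));

// merge into a previously exported workspace and write a new file to import
const workspace = JSON.parse(fs.readFileSync('workspace.json', 'utf8'));
workspace.dialog_nodes = (workspace.dialog_nodes || []).concat(dialogNodes);
fs.writeFileSync('workspace-merged.json', JSON.stringify(workspace, null, 2));
```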

Related

What is the best way of parsing CSV data in a logic app without using a custom connector?

I have an SFTP trigger in a logic app which fires when a file is added to a certain file area. It is a CSV-formatted file and I want the rows to be parsed and converted into JSON. What is the best way to convert CSV data into JSON without using any custom connectors?
I cannot find any built-in connectors doing this job, and as far as I know there are no Logic Apps functions that do it either.
Right now, there is no connector/action in Logic Apps that provides an out-of-the-box solution for this requirement. You could loop through the array and perform the conversion with Loop and Variables actions, but I would not suggest that, as it takes time and will cost you more.
The alternative is to leverage the inline code (JavaScript) action to do the conversion. Please note that you will need an Integration Account to run inline code.
Please refer to the JavaScript code and modify it as needed. I have used '_' to differentiate the nested objects. For more details you can refer to the previous discussion here.
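A minimal sketch of such an inline code action is below; it handles flat objects only (the '_' convention for nested objects is not shown), and the name of the prior action it reads from is an assumption:

```javascript
// Hedged sketch for an "Execute JavaScript Code" (inline code) action.
// Assumes a prior action named 'Get_blob_content_(V2)' whose output body is the raw CSV text.
var csv = workflowContext.actions['Get_blob_content_(V2)'].outputs.body;

var lines = csv.split(/\r?\n/).filter(function (l) { return l.trim().length > 0; });
var headers = lines[0].split(',');

var rows = lines.slice(1).map(function (line) {
  var cells = line.split(',');
  var obj = {};
  headers.forEach(function (h, i) { obj[h.trim()] = (cells[i] || '').trim(); });
  return obj;
});

return rows; // the action's output is the resulting JSON array
```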
For complex transformations you can offload this functionality to an Azure Function, write your code in one of the supported languages, and call the function from the logic app.
1. Created a logic app.
2. Created a container in a storage account and uploaded a CSV file to the container.
3. Used a Compose action to split the contents of the CSV file on every new line into an array.
a. Here is the expression used in the SplitLines compose action (decodeUriComponent('%0D%0A') yields the CRLF line break):
split(body('Get_blob_content_(V2)'),decodeUriComponent('%0D%0A'))
4. Removed the last (empty) line from the previous output using another compose action:
take(outputs('SplitLines'),add(length(outputs('SplitLines')),-1))
5. Separated the field names using another compose action:
split(first(outputs('SplitLines')), ',')
6. Formed the JSON using a Select action:
**From:** `skip(outputs('RemoveLastLine'), 1)`
**Map:** `outputs('SplitFieldName')[0]` → `split(item(), ',')?[0]` and `outputs('SplitFieldName')[1]` → `split(item(), ',')?[1]`
7. Tested the logic app: it runs successfully and the CSV content comes out formatted as JSON.
Reference: Use data operations in Power Automate (contains video) — Power Automate | Microsoft Docs
Credit: #Iason Koulas

Import Folder of Documents into Container field via script

I'm running FileMaker Server 19 and FileMaker 19 client.
I have an app that manages Document Templates, i.e. Word documents.
Each Document Template master record contains information about that template, plus connects to a Document Samples table that contains one Word document (the template) and one PDF document (an output of the template file when used, a bit like mail merge).
We've recently had an update to the system I support (not a Filemaker system) and many hundreds of the Word templates have been updated. If I have to do one or two in a session it's pretty simple to drag the latest version of each template to the Container field; but recently I've been having to do 10s/100s of Document updates very quickly to get them into our system. I've put copies of all the Word documents into a folder on my Mac. They all have a unique name and a unique template id in the format of 1234doctemplate01. So, the 1234 equates to my Document template ID in my database and the filename is also mirrored in a 'Template Name' field in the database.
So, my question is how to build a script that goes through the database one record at a time and, for each record, checks the Mac folder for a matching document name and, if there is one, copies/pastes it into the appropriate Container field.
Can anyone advise, please? Further detail available if this isn't too clear
I would do something slightly different:
Import all the files in the folder into a new table (see https://help.claris.com/en/pro-help/content/importing-folder.html) and create a relationship between this table and your existing table, based on matching template ID or name.
Then you can loop over the records in either table and if a match is found, replace the data in the existing container. Or perhaps you could make this a permanent structure, where the file is stored in a related record.
Doing it the way you have proposed would be awkward using only FileMaker's built-in tools (and outright impossible if the folder is not somewhere within the Documents folder). It might be somewhat easier if you use a plugin (e.g. BaseElements) to read the folder's contents.
You can use the BaseElements plugin to select multiple files in a folder with BE_FileSelectDialog, or check whether a file exists with BE_FileExists, and then loop over the list, inserting each file into the container via:
Set Field [ "your_container" ; BE_FileImport ( $file ) ]

Upload from CSV file to DHTMLX GANTT

There is a custom implementation in dhtmlx gantt for importing from MPP/XML which goes to their servlet and renders the gantt. Has anyone tried to build a custom CSV upload, or is there any third party available to load the CSV into the gantt?
https://dhtmlx.com/blog/export-import-ms-project-dhtmlx-gantt-chart/
There is no such solution from DHTMLX (FYI, I work for DHTMLX), and I'm not aware of any third-party service or ready-to-use solution that could be used for development.
At the code level, importing CSV into gantt breaks down into three steps (a sketch follows below):
parsing the CSV into an array of objects
mapping the columns of the CSV to the properties of those objects (the mandatory properties of gantt tasks: text/start_date/duration/parent)
and inserting the result into the database.
The first step is trivial. Mapping columns may require implementing some sort of UI so the user can specify which columns of the CSV mean what in gantt.
For inspiration, you can check how it's done in this app: https://app.ganttpro.com/ - it requires registration, but you can create a free account using a Google or Facebook account. Create a new project ("+ CREATE NEW" in the left-hand menu), select "Import from" and try uploading a CSV file to see how the UI handles the column mapping.
As for the last step - inserting parsed records into the db - you'll need to do some coding in order to insert tasks without losing the project hierarchy (task.parent -> task.id relations, given that the database ids of your items will likely change after inserting), but overall it shouldn't be very difficult.
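To make those steps concrete, here is a minimal client-side sketch; it assumes the CSV columns are already known (text, start_date, duration, parent) rather than mapped through a UI, and that dhtmlxGantt is loaded on the page:

```javascript
// Hedged sketch: parse a simple CSV, map its columns onto gantt task properties,
// and render the result. Persisting to the database (and remapping task.parent to
// the ids the database assigns) would still happen server-side.
function importCsv(csvText) {
  var lines = csvText.split(/\r?\n/).filter(function (l) { return l.trim(); });
  var headers = lines[0].split(',');

  var tasks = lines.slice(1).map(function (line, i) {
    var cells = line.split(',');
    var row = {};
    headers.forEach(function (h, j) { row[h.trim()] = (cells[j] || '').trim(); });
    return {
      id: i + 1,                    // temporary client-side id
      text: row.text,
      start_date: row.start_date,   // must match gantt.config.date_format (xml_date in older versions)
      duration: Number(row.duration),
      parent: Number(row.parent) || 0
    };
  });

  gantt.parse({ data: tasks, links: [] });
}
```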
If you are looking for something more specific, please update your question.

Sharepoint mail enabled library receives csv file, import into list

I am wondering what the best approach is (if any) to handle this scenario:
I will set up a mail enabled library to which an external party can send emails with a csv file attached.
What I want to do next is, for each file received into this library (or even once a day, processing them all), take the CSV file and import it into a list for easier viewing.
I was thinking to create a custom workflow on item added that opens the csv file and imports it into another list.
Are there better/easier setups?
I'd say an event receiver may be more suitable for this scenario. That way you can capture the CSV file, extract its contents, and add each line as an item. I use event receivers for one-hop events, and workflows for multiple hops (approvals, feedback, etc.).
Here is more on event receivers: http://msdn.microsoft.com/en-us/library/ee231563.aspx

CSV file upload

In my Grails app, I would like admin users to be able to upload a CSV file that contains data such as:
List of users to be added to system
List of groups to be added to system
Assignment of users to groups
I have no idea how the user will generate these CSV files - most likely from Excel, Access or similar, and therefore I've no way of knowing which column will contain which data. So I'm planning to allow the user to specify which column contains users, groups, etc.
I'm wondering if there's a JavaScript component that could help with this. Ideally I'd like to implement the following:
User uploads file
In browser, user is shown first N lines of uploaded file and prompted to select the column that contains the users, groups, etc.
Column information is uploaded to server
Is there a client/server side component that could help with this, or an entirely different approach which would be superior to that outlined above?
I should emphasise that the users of this system will not be technically gifted, so expecting them to provide an XML/JSON file instead is out of the question (and you can definitely forget about asking them to call a Web Service instead of uploading a file).
Thanks,
Don
I like your solution so far, given that the users are non-technical, and that you want to be able to accept this data as a file upload, rather than have the users enter it directly into your application.
I would simply suggest that when the user uploads the file, the server returns the first five (or so) lines back to the client as an HTML table. Then you can have <select> drop-downs as the headers for each column, with the pre-set options you're looking for. You can validate that the user has assigned all available options to each column (use JS to remove options from the selects as they use them, but be sure to provide a method to undo and change selections), and allow some columns not to be labeled (which the server will just ignore when parsing the file).
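A minimal sketch of that preview, assuming the server returns the first rows as a parsed array and using made-up field labels:

```javascript
// Hedged sketch: show the first rows of the uploaded CSV as an HTML table whose
// header cells are <select> drop-downs for labelling each column.
var FIELD_OPTIONS = ['(ignore)', 'user', 'group', 'assignment']; // assumed labels

function renderPreview(rows, container) {
  var table = document.createElement('table');

  // header row of <select> drop-downs, one per column
  var headerRow = table.insertRow();
  rows[0].forEach(function (_cell, colIndex) {
    var th = document.createElement('th');
    var select = document.createElement('select');
    FIELD_OPTIONS.forEach(function (opt) { select.add(new Option(opt, opt)); });
    select.name = 'column_' + colIndex; // posted back so the server knows the mapping
    th.appendChild(select);
    headerRow.appendChild(th);
  });

  // the first N data rows, for context
  rows.forEach(function (row) {
    var tr = table.insertRow();
    row.forEach(function (cell) { tr.insertCell().textContent = cell; });
  });

  container.appendChild(table);
}

// usage (hypothetical data): renderPreview([['don', 'admins'], ['jane', 'users']], document.body);
```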
If possible, also illustrate (perhaps in a graph format or just an example sentence, if applicable) how their label choices will apply to the relationships. For example, "New user ABC will be a member of new group XYZ." If ABC and XYZ are unexpectedly backwards, the user will recognize they made a mistake.
Also, some users will inevitably upload a file where they used rows as columns and columns as rows. Either provide a GUI function to reverse this ("rotate" the table), or let them choose which axis to label.
I would also suggest providing your users with a collection of example files in various formats (Excel, Access, etc), and give them explicit instructions for how to enter the data they want, and step by step instructions to export as CSV and upload.
I have no idea how the user will generate these CSV files - most likely from Excel, Access or similar, and therefore I've no way of knowing which column will contain which data.
I should emphasize that the users of this system will not be technically gifted
With these two things in mind, are you sure that CSV import is the best way to handle bulk user creation? It's a great technical solution, but the question is, will your users be able to take advantage of it?
It may be worth implementing an alternative bulk create option for those who don't get CSV or are scared off by Excel. Perhaps a JS grid that has the required fields where they could manually enter the data for each field and enter as many as they need at once, with a link to upload a CSV file as an option for those who would use it.
For the CSV option, since your users are not technically minded, it would be better to give them instructions on how to create the CSV files, specifying the order the fields should be in, along with a screenshot and a sample file.
Another option is to require that the field names be the first row of the document, and that they use specific labels for the fields. If you do that, you can figure out from the first row what order the data is in. You could also put in a check that looks for the titles in the first row and, if they're not found, tells the user they need to add the field names to the CSV and re-upload.
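For example, a small header check along these lines (the required labels are assumptions):

```javascript
// Hedged sketch: verify that the first row of the CSV contains the required
// field names before accepting the upload.
var REQUIRED = ['username', 'group']; // assumed labels -- use whatever you document

function validateHeader(firstLine) {
  var headers = firstLine.split(',').map(function (h) { return h.trim().toLowerCase(); });
  var missing = REQUIRED.filter(function (r) { return headers.indexOf(r) === -1; });
  if (missing.length === 0) {
    return { ok: true, order: headers }; // column order recovered from the header row
  }
  return {
    ok: false,
    message: 'Missing required columns: ' + missing.join(', ') +
      '. Please add the field names to the CSV and re-upload.'
  };
}
```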