I am writing an SSIS package that reads XML files and loads their data into the CRM 2011 database. As it stands, the package reads the title of a client record, compares it to hard-coded values in my script, and if there is a match, assigns the corresponding value of the title option set.
However, my client now wants the option set to be expandable. This would require the SSIS package to retrieve the entire option set from CRM, to compare the current possible values with the value in the record. But the package does not use the Microsoft.Xrm.Sdk assembly; it uses conditional expressions and invokes the CRM API to get and set entities.
So I cannot use a RetrieveAttributeRequest (the normal way when using the Microsoft.Xrm.Sdk assembly) to retrieve the option set. Is there another way?
You can get this information from the SQL view "FilteredStringMap". You need to know the entity name, the attribute name, and the language code you want to look at.
This approach is supported by Microsoft because it goes through a filtered view.
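For example, a query along these lines pulls the labels and values of one option set. The snippet below wraps it in Python with pyodbc purely for illustration; the connection string is a placeholder, and the column names (Value, AttributeValue, AttributeName, ObjectTypeCodeName, LangId) are from memory, so verify them against the FilteredStringMap view in your organization database.

```python
# Minimal sketch: read an option set's labels and values from FilteredStringMap.
# The connection string and column names are assumptions - check them against
# your organization database before relying on this.
import pyodbc

CONN_STR = (
    "Driver={SQL Server};"
    "Server=CRMSQLSERVER;"       # hypothetical server name
    "Database=MyOrg_MSCRM;"      # hypothetical organization database
    "Trusted_Connection=yes;"
)

def get_option_set(entity_name, attribute_name, lang_code=1033):
    """Return {label: value} for one option set, e.g. the contact 'title' attribute."""
    sql = """
        SELECT Value, AttributeValue
        FROM   FilteredStringMap
        WHERE  ObjectTypeCodeName = ?
          AND  AttributeName      = ?
          AND  LangId             = ?
    """
    with pyodbc.connect(CONN_STR) as conn:
        rows = conn.cursor().execute(sql, entity_name, attribute_name, lang_code).fetchall()
    return {label: value for label, value in rows}

# Example: print(get_option_set("contact", "title"))
```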
I have an SFTP trigger in a logic app which fires when a file is added to a certain file area. The file is CSV-formatted and I want the rows to be parsed and converted into JSON. What is the best way to convert CSV data into JSON without using any custom connectors?
I cannot find any built-in connector that does this job, and as far as I know there are no Logic Apps functions that do it either.
Right now there is no connector or action in Logic Apps that provides an out-of-the-box solution for this requirement. You could loop through the array and build the JSON yourself, but I would not suggest using the loop and variable actions, as that takes time and will cost you more.
The alternative is to use the Inline Code (JavaScript) action to do the conversion. Please note that you will need an Integration Account to run inline code.
Adapt the JavaScript code to your needs; I have used '_' to differentiate the nested objects. For more details you can refer to the previous discussion here.
For complex calculations you can offload this functionality to an Azure Function, write the code in one of the supported languages, and call the function from the logic app.
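As a rough sketch of that last option, here is what an HTTP-triggered Azure Function in Python could look like, assuming the logic app posts the raw CSV text in the request body; the function name and trigger wiring are assumptions, not part of the original answer.

```python
# Sketch of an HTTP-triggered Azure Function (Python) that converts CSV text
# posted in the request body into a JSON array of row objects.
# The route/trigger setup is assumed; configure function.json (or the v2
# decorator model) as appropriate for your Function App.
import csv
import io
import json

import azure.functions as func


def main(req: func.HttpRequest) -> func.HttpResponse:
    csv_text = req.get_body().decode("utf-8-sig")    # tolerate a BOM
    reader = csv.DictReader(io.StringIO(csv_text))   # first row supplies field names
    rows = [dict(row) for row in reader]
    return func.HttpResponse(json.dumps(rows), mimetype="application/json")
```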
Here is how I implemented it with the data-operation actions:
1. Created the logic app.
2. Created a container in a storage account and uploaded a CSV file to the container.
3. Used a Compose action (SplitLines) to split the contents of the CSV file on every new line into an array.
   a. Here is the expression used in the SplitLines compose action:
      `split(body('Get_blob_content_(V2)'),decodeUriComponent('%0D%0A'))`
   b. See the MS Docs on writing expressions for reference.
4. Removed the last (empty) line from the previous output using another Compose action (RemoveLastLine):
   `take(outputs('SplitLines'),add(length(outputs('SplitLines')),-1))`
5. Separated the field names using a Compose action (SplitFieldName):
   `split(first(outputs('SplitLines')), ',')`
6. Formed the JSON using a Select action:
   **From**: `skip(outputs('RemoveLastLine'), 1)`
   **Map**:
   - `outputs('SplitFieldName')[0]` → `split(item(), ',')?[0]`
   - `outputs('SplitFieldName')[1]` → `split(item(), ',')?[1]`
Tested the logic app and it ran successfully; the content of the CSV file is converted and formatted as JSON.
Reference: Use data operations in Power Automate (contains video) - Power Automate | Microsoft Docs
Credit: Iason Koulas
I am working on a Microsoft Integration Services project, and I have a flat source file (Product.txt) that contains some data that I save into a SQL Server database when I run the project.
The data is saved successfully, but when I change some values in my source Product.txt and re-run the project, the data in SQL Server is not updated.
Is there anything that must be done to enable the update? Thank you.
There are several things you can do, but you haven't provided enough info. I am guessing here based on the words "changed file"; to me that means an update.
That generally means that in your data flow you should start with the source, then use a Lookup based on your destination to see if your "key" already exists. Configure the lookup to redirect non-matching rows.
Map the no-match output to your inserts, and map the matches to an UPDATE SQL statement.
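To make the pattern concrete outside of SSIS, here is a small Python sketch of the same lookup / insert / update logic against a hypothetical dbo.Product table (the table, columns and connection string are made up for illustration). In the package itself this corresponds to a Lookup transformation, a destination for the no-match rows, and an UPDATE statement for the matches.

```python
# Sketch of the lookup / insert / update pattern the data flow implements.
# Table name, column names and connection string are hypothetical.
import csv
import pyodbc

CONN_STR = "Driver={SQL Server};Server=localhost;Database=Demo;Trusted_Connection=yes;"

def upsert_products(path="Product.txt"):
    with pyodbc.connect(CONN_STR) as conn, open(path, newline="") as f:
        cur = conn.cursor()
        for row in csv.DictReader(f):                        # source rows
            cur.execute("SELECT 1 FROM dbo.Product WHERE ProductKey = ?",
                        row["ProductKey"])                   # lookup on the key
            if cur.fetchone():                               # match -> update
                cur.execute("UPDATE dbo.Product SET Name = ?, Price = ? WHERE ProductKey = ?",
                            row["Name"], row["Price"], row["ProductKey"])
            else:                                            # no match -> insert
                cur.execute("INSERT INTO dbo.Product (ProductKey, Name, Price) VALUES (?, ?, ?)",
                            row["ProductKey"], row["Name"], row["Price"])
        conn.commit()
```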
I currently have a chat bot that has an entity for each stock symbol. There are over 3,000. For my dialog I want to be able to detect questions like #get #price #stockSymbol. Is there a way to deal with a large number of entities without writing an if statement for each one?
You are only allowed to have 100 entities in a single workspace. However those entities can have 100,000 values.
So you could create an entity called #StockSymbol and then each value would be the stock identifier (e.g. IBM).
So you would only need one IF statement to determine it is a stock, then pass back the entity information to your calling application to take action on the value.
To load this programmatically, if it is a one-time thing you can create a CSV file like the following:
StockSymbol,IBM
StockSymbol,MSFT
StockSymbol,APPL
and so on. Then import that entity file. Alternatively you can use the workspace API to update an already deployed workspace.
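If you need to build that CSV from a list, a few lines of Python will do it (the symbol list and file name below are placeholders):

```python
# Generate the entity-values CSV for import into the workspace.
# The symbol list and output file name are placeholders.
import csv

symbols = ["IBM", "MSFT", "APPL"]   # in practice, read your ~3,000 symbols from a file

with open("stock_symbol_entity.csv", "w", newline="") as f:
    writer = csv.writer(f)
    for symbol in symbols:
        writer.writerow(["StockSymbol", symbol])
```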
I am sorry to say there is no process within the Conversation Service UI that has an automatic dialog creation method. In cases like this, many teams create an external script that reads a file containing your entities and then creates a workspace JSON file with the required dialog nodes. The workspace JSON file is a relatively simple format, and I have found you can easily merge a new JSON file into an already created workspace. In fact, with the new APIs it is even possible to load the new elements into a running workspace (although if you are new to this, create a duplicate workspace and merge into that one, or download and merge via a good editor).
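As a rough sketch of such an external script, the snippet below reads symbols from a text file and merges them into a downloaded workspace JSON as values of a single StockSymbol entity; the field names follow the workspace export format as I remember it, so check them against an export of your own workspace before uploading.

```python
# Sketch: read stock symbols from a text file and merge them into a downloaded
# workspace JSON as values of one StockSymbol entity.
# Field names follow my recollection of the workspace export format - verify
# against an export of your own workspace.
import json

def merge_stock_symbols(workspace_path, symbols_path, out_path):
    with open(workspace_path) as f:
        workspace = json.load(f)
    with open(symbols_path) as f:
        symbols = [line.strip() for line in f if line.strip()]

    entity = {
        "entity": "StockSymbol",
        "values": [{"value": s, "synonyms": []} for s in symbols],
    }
    # Drop any existing StockSymbol definition, then append the new one.
    workspace["entities"] = [e for e in workspace.get("entities", [])
                             if e.get("entity") != "StockSymbol"] + [entity]

    with open(out_path, "w") as f:
        json.dump(workspace, f, indent=2)

# merge_stock_symbols("workspace.json", "symbols.txt", "workspace_merged.json")
```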
I'm looking to write a program to update cells in LibreOffice Calc in real time (or at least on some fixed tick) with data pulled from a MySQL database. Ideally, when the values in the database are updated, the corresponding cells in the spreadsheet would be updated such that any formulas or calculations existing in Calc would continue to operate on the new values. So far, I have yet to find a way to dynamically and programmatically insert data in this manner. Is it possible?
The LibreOffice component Base is a database front-end that handles queries, forms, and reports. While by default it uses an embedded version of HyperSQL database to manage the tables, it comes with drivers for any number of other back-end programs, including MySQL.
I think the easiest way to approach this would be to create a Base file with your MySQL database as its back-end (note Base will only be able to see tables and views from MySQL - it won't import queries; although you can save queries in the Base file if you want). Make sure to 'register' the Base file so the rest of LibreOffice can 'see' it. Once the file is registered, any open LibreOffice component can access the data from MySQL (Base file can be closed).
Now you can import any tables or views (from the MySQL component) or queries (from the Base file) into Calc: [Tutorial] Using registered datasources in Calc
Refreshing the imported data can be done through an API call. Here is an example in StarBasic code:
Sub refresh_DBRanges
    Dim oDBRangesEnum
    Dim oNext
    ' Enumerate every database (import) range in the current document
    oDBRangesEnum = thisComponent.DatabaseRanges.createEnumeration()
    While oDBRangesEnum.hasMoreElements()
        oNext = oDBRangesEnum.nextElement()
        oNext.refresh()   ' re-run the import for this range
    Wend
End Sub
Note that the second posting in the 'registered data sources' tutorial gives the API call to put the import ranges on a refresh timer.
Just a note that the Registered DataSources tutorial is updated further down its page. It says the list of registered data sources can be accessed by hitting F4; that was true once but was changed with version 5. It is now Ctrl+Shift+F4.
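If you would rather drive the refresh from Python than StarBasic, an equivalent macro using the scripting framework (PyUNO) looks roughly like this; the macro name is arbitrary:

```python
# Python-UNO equivalent of the StarBasic refresh above: re-imports every
# database range in the current Calc document. Run it as a Python macro;
# XSCRIPTCONTEXT is provided by LibreOffice's scripting framework.
def refresh_db_ranges(*args):
    doc = XSCRIPTCONTEXT.getDocument()
    ranges = doc.DatabaseRanges.createEnumeration()
    while ranges.hasMoreElements():
        ranges.nextElement().refresh()

g_exportedScripts = (refresh_db_ranges,)
```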
I have created a transformation component; basically it accepts data from a source and does various transformations before it saves the information in a database.
But I want to map between the source columns and the database columns that are called up from my transformation component.
I would like it to look like the mapping in Lookup Transformation Editor.
How do I create an editor that will have the same functionality for mapping, so that it allows a user to draw a line from a source column to a destination column?
I am using SQL Server 2008 and Visual Studio 2008 for this project.
I realize I will need to create a custom UI for this component, replacing the Advanced Editor that comes up by default.
Use my wrapper for the DtsMappingSurfaceControl: http://toddmcdermid.blogspot.com/2009/07/using-dtsmappingsurfacecontrol-in-your.html