Palantir Foundry - Is it possible to retrieve the user that last updated an uploaded table?

I want to create a historical dataset from a snapshot-uploaded table in Palantir Foundry, and I want to add to that table the user who performed the upload. Is that possible?
I can't find anything about this in the docs.

The TransformInput object doesn't contain that information.
You could use an API call to the Foundry build service to get information about the latest update performed on the input dataset, but that might give you the wrong answer if there were multiple updates from different people since the transform last ran. This also requires Palantir to allow calling Foundry APIs from your repository, if that isn't already enabled.
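If API calls are enabled, a lookup along these lines is one way to picture it. To be clear, this is a hypothetical sketch: the endpoint path, query parameter, and response fields below are illustrative assumptions, not a documented Foundry API.

```python
# Hypothetical sketch only: the endpoint path, query parameter, and response
# fields below are illustrative assumptions, not a documented Foundry API.
import requests

FOUNDRY_HOST = "https://your-stack.palantirfoundry.com"  # placeholder
TOKEN = "..."  # a token authorized to call Foundry APIs

def last_updated_by(dataset_rid: str) -> str:
    # Assumed endpoint listing a dataset's transactions, newest first.
    resp = requests.get(
        f"{FOUNDRY_HOST}/api/build/datasets/{dataset_rid}/transactions",
        headers={"Authorization": f"Bearer {TOKEN}"},
        params={"pageSize": 1},
    )
    resp.raise_for_status()
    latest = resp.json()["data"][0]
    # Caveat from above: this is whoever committed the *latest* transaction,
    # which may not be the upload your transform actually read.
    return latest["createdBy"]
```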

Related

How do I get the backing dataset name with a Phonograph ID?

I have a Phonograph table ID. How can I find out which dataset backs it?
If you're using the Phonograph service API through Slate, you should be able to use the Get Table Metadata endpoint from the Table Registry Service. This endpoint takes in the Table RID and returns the schema and sync metadata, including the RID of the source dataset.
If you're not using these endpoints through Slate, but rather through a CLI or 3rd-party application, you can find the full Phonograph API definition in the platform documentation under the Developer tab.
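For the CLI or third-party route, the call might look roughly like the sketch below. The service path and response field names are assumptions for illustration only; take the real route from the Phonograph API definition mentioned above.

```python
# Hypothetical sketch: the service path and response field names are
# assumptions; consult the Phonograph API definition in the platform
# documentation's Developer tab for the real route.
import requests

HOST = "https://your-stack.palantirfoundry.com"  # placeholder
TOKEN = "..."  # a token authorized to call the Table Registry Service

def backing_dataset_rid(table_rid: str) -> str:
    resp = requests.get(
        f"{HOST}/phonograph/api/table-registry/tables/{table_rid}/metadata",
        headers={"Authorization": f"Bearer {TOKEN}"},
    )
    resp.raise_for_status()
    # Assumed shape: the sync metadata carries the source dataset's RID.
    return resp.json()["syncMetadata"]["datasetRid"]
```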
Assuming this Phonograph table is used to write to an ontology object type, you can go into the Ontology Management Application and paste the ID into the object-type search field at the top left. This will automatically find the related object type. If you select that object type and then click on the Datasources tab, you will see the corresponding backing dataset.

Import custom field options from one Infusionsoft app to another

I am trying to find a way to automatically update a custom field in multiple Partner Infusionsoft apps whenever I manually update that field with new options in our main Infusionsoft app.
The idea is to avoid logging into every Partner Infusionsoft app we manage, one by one, to bring its custom field options back in line with the main app's whenever they change. The custom field is constantly being updated with new options that need to be mirrored across all of our partner apps.
The process would not need to be entirely automatic; we could use a trigger to update the rest of the apps whenever we have manually updated the custom field in our main app.
Can anyone please steer me in the right direction or tell me if this isn't even possible?
Yes, this is possible with the API. Unfortunately there isn't a REST webhook for when a custom field is modified, so it would require constantly polling what you consider the master application: repeatedly fetch the custom field and check whether its values have changed. This is the REST documentation for that:
https://developer.infusionsoft.com/docs/rest/#!/Contact/retrieveContactModelUsingGET
That will include all the possible options for the field.
Unfortunately you can't modify a custom field through the newer REST API, so you will have to use XML-RPC; according to the documentation, the REST API only lets you create custom fields.
Here are the docs for the XML-RPC endpoint you would use to update all the other apps to match the master application:
https://developer.infusionsoft.com/docs/xml-rpc/#data
https://developer.infusionsoft.com/docs/table-schema/
DataFormField is the table that holds the custom field information; you would use the Data service endpoint to modify its values.
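Putting those two pieces together, a rough polling sketch might look like the following. The REST response field names ("custom_fields", "options") and the idea that dropdown options live in the DataFormField Values column are assumptions to verify against the docs above.

```python
# Rough sketch under stated assumptions: REST response field names and the
# DataFormField "Values" format should be verified against the linked docs.
import time
import requests
import xmlrpc.client

MASTER_TOKEN = "..."          # OAuth access token for the master app
FIELD_LABEL = "Partner Tier"  # hypothetical custom field to mirror
PARTNER_APPS = {"partnerapp1": "api-key-1"}  # app name -> legacy API key

def master_field_options():
    # Poll the master app's contact model, which includes custom fields.
    resp = requests.get(
        "https://api.infusionsoft.com/crm/rest/v1/contacts/model",
        headers={"Authorization": f"Bearer {MASTER_TOKEN}"},
    )
    resp.raise_for_status()
    for field in resp.json()["custom_fields"]:  # assumed response shape
        if field["label"] == FIELD_LABEL:
            return [opt["label"] for opt in field["options"]]
    raise KeyError(FIELD_LABEL)

def push_options(app, api_key, options):
    client = xmlrpc.client.ServerProxy(f"https://{app}.infusionsoft.com/api/xmlrpc")
    # Find the matching DataFormField record in the partner app.
    rows = client.DataService.query(api_key, "DataFormField", 1, 0,
                                    {"Name": FIELD_LABEL}, ["Id"])
    # Assumption: dropdown options live in the Values column, one per line.
    client.DataService.update(api_key, "DataFormField", rows[0]["Id"],
                              {"Values": "\n".join(options)})

last_seen = None
while True:  # keep the polling interval well under API usage limits
    options = master_field_options()
    if options != last_seen:
        for app, key in PARTNER_APPS.items():
            push_options(app, key, options)
        last_seen = options
    time.sleep(300)  # check every 5 minutes
```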
Be careful not to go over usage limits when polling the master application, and apply best practices.

Salesforce Development and configuration

I want to merge 6 different profiles into one, consolidating FLS, record types, permission sets, and page layouts in Salesforce. What's the best and easiest way to do it?
Thanks
Use the Salesforce Metadata API to retrieve the 6 profiles in XML form. You can then view all the permissions each profile has and consolidate them into one. Then use the Metadata API's deploy() call to push the merged profile to your Salesforce org.
Salesforce gives examples of how to retrieve profiles, and the required package.xml to include in the request, here: https://developer.salesforce.com/docs/atlas.en-us.api_meta.meta/api_meta/meta_profile.htm
Note: the retrieved content of a profile is relative to what else you request; i.e., to see the profile permissions for the Account object, you must include the Account object in your retrieve request (see the example manifest below).
Salesforce documentation:
https://developer.salesforce.com/docs/atlas.en-us.api_meta.meta/api_meta/file_based.htm
https://developer.salesforce.com/docs/atlas.en-us.api_meta.meta/api_meta/meta_retrieve.htm
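For illustration, a retrieve manifest along these lines (profile names, object names, and the API version are placeholders) makes the note above concrete: an object must be listed for its field permissions to come back inside the retrieved profiles.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<Package xmlns="http://soap.sforce.com/2006/04/metadata">
    <types>
        <!-- Include the objects whose FLS you want reflected in the profiles -->
        <members>Account</members>
        <members>Contact</members>
        <name>CustomObject</name>
    </types>
    <types>
        <!-- List the six profiles to retrieve (or use * for all) -->
        <members>Profile A</members>
        <members>Profile B</members>
        <name>Profile</name>
    </types>
    <version>52.0</version>
</Package>
```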

Implementing IoT PowerBI table schema

I'm currently implementing an IoT solution that has a bunch of sensors sending information in JSON format through a gateway.
I was reading about doing this on Azure, but I couldn't quite figure out how the JSON schema and Event Hubs work together to display the info in PowerBI.
Can I create a schema and upload it to PowerBI then connect it to my device?
There are multiple sides to this. To start with, IoT ingestion in Azure is done through Event Hubs, as you mentioned. If your gateway is able to make a RESTful call to the Event Hubs entry point, Event Hubs will receive the data and store it temporarily for the specified retention period. Stream Analytics will then consume the data from Event Hubs, letting you do further processing and divert the data to different outputs. In your case, you can set one of the outputs to be a PowerBI dashboard, which you authorize with an organizational account (more on that later), and the output will automatically be tied to PowerBI.
The data schema part is interesting: the JSON itself defines the data table schema to be used on the PowerBI side, and it propagates from Event Hubs to Stream Analytics to PowerBI with the first JSON package sent. Once the schema is there it is fixed, and the rest of the streamed data should arrive in the same format.
If you don't have an organizational account at hand to use with PowerBI, you can register your domain under Azure Active Directory and use that account since it is considered within your org.
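To make the schema point concrete, here is a minimal sketch of a sender using the azure-eventhub Python SDK; the connection string, hub name, and payload fields are placeholders. The first payload sent fixes the table schema downstream, so every subsequent reading should keep the same shape.

```python
# Minimal sketch, assuming the gateway can run Python and you have an
# Event Hubs connection string; names and fields below are placeholders.
import json
from azure.eventhub import EventHubProducerClient, EventData

producer = EventHubProducerClient.from_connection_string(
    "Endpoint=sb://...;SharedAccessKeyName=...;SharedAccessKey=...",
    eventhub_name="sensors",
)

# The first JSON payload sent defines the table schema downstream, so keep
# every subsequent reading in exactly this shape.
reading = {"deviceId": "sensor-01", "temperature": 21.7,
           "ts": "2016-01-01T00:00:00Z"}

batch = producer.create_batch()
batch.add(EventData(json.dumps(reading)))
producer.send_batch(batch)
producer.close()
```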
There may be a way of altering the schema afterwards using the PowerBI REST API; the links are below, though I haven't tried it myself.
https://msdn.microsoft.com/en-us/library/mt203557.aspx
Stream analytics with powerbi
Hope this helps, let me know if you need further info.
One way to achieve this is to send your data to Azure Event Hubs, read it, and send it to PowerBI with Stream Analytics. Listing all the steps here would be too long, so I suggest you take a look at a series of blog posts I wrote describing how I built a demo similar to what you're trying to achieve. That should give you enough info to get started.
http://guyb.ca/IoTAzureDemo

MySQL data into Google Analytics

I have a client who wants to retrospectively add data into GA from their back-end MySQL database. The data will be transactional, for example when an existing customer makes a recurring payment via BACS/bank transfer.
Is it possible to do this, if so a) how and b) can it be automated?
I'm not sure if there is some script we can implement or whether we have to manually export the data into GA.
Thanks!
This is quite possible, depending on what sort of data is in your database. Essentially, to upload anything to Google Analytics you'll need some sort of key to join on, e.g. transactionId. Then you can use Data Import (located under the Property settings of the account) to upload your data.
To automate it, you can use the Management API.
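As a sketch of what that automation could look like using the Management API's Data Import upload call (the account, property, and custom data source IDs are placeholders you would look up once):

```python
# Sketch of an automated upload via the GA Management API's Data Import
# call; IDs and file names below are placeholders.
from googleapiclient.discovery import build
from googleapiclient.http import MediaFileUpload
from google.oauth2 import service_account

creds = service_account.Credentials.from_service_account_file(
    "service-account.json",
    scopes=["https://www.googleapis.com/auth/analytics.edit"],
)
analytics = build("analytics", "v3", credentials=creds)

# CSV exported from MySQL, keyed on the same field (e.g. ga:transactionId)
# that the Data Import data set was configured with.
media = MediaFileUpload("transactions.csv",
                        mimetype="application/octet-stream")

analytics.management().uploads().uploadData(
    accountId="123456",
    webPropertyId="UA-123456-1",
    customDataSourceId="abcdef",
    media_body=media,
).execute()
```

Run on a schedule (cron, for example), this covers the "can it be automated" part: export the new transactions from MySQL to CSV, then let the script push them into the Data Import data set.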