How do I get the backing dataset name from a Phonograph table ID? - palantir-foundry

I have a Phonograph table ID. How can I find out which dataset backs it?

If you're using the Phonograph service API through Slate, you should be able to use the Get Table Metadata endpoint from the Table Registry Service. This endpoint takes in the Table RID and returns the schema and sync metadata, including the RID of the source dataset.
If you're not using these endpoints through Slate, but rather through a CLI or 3rd-party application, you can find the full Phonograph API definition in the platform documentation under the Developer tab.
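
For illustration, a minimal Python sketch of what that call could look like from outside Slate. The host, path, and response key below are hypothetical placeholders, not the documented contract; check the Phonograph API definition under the Developer tab for the real spec:

```python
import requests

# Hypothetical sketch only: host, path, and response keys are placeholders,
# not the documented Phonograph API. See the Developer tab for the real spec.
FOUNDRY_HOST = "https://your-foundry-instance.example.com"  # placeholder
TABLE_RID = "ri.phonograph2.main.table.xxxx"  # your Phonograph table RID

resp = requests.get(
    f"{FOUNDRY_HOST}/phonograph/api/tables/{TABLE_RID}/metadata",  # hypothetical path
    headers={"Authorization": "Bearer <your-token>"},
)
resp.raise_for_status()
metadata = resp.json()

# The metadata includes schema and sync information; the source dataset RID
# should be among the sync fields (the exact key name may differ).
print(metadata.get("sourceDatasetRid"))
```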

Assuming this Phonograph table is used to write to an ontology object type, you can go into the Ontology Management Application and paste the ID into the object type search field on the top left.
This will automatically find the related object type. If you select that object type and then click on the Datasources tab, you will see the corresponding backing dataset.

Related

Palantir Foundry - Is it possible to retrieve the user that last updated an uploaded table?

I want to create a historical dataset from a snapshot uploaded table in Palantir Foundry. I want to add into that table the user who did the upload. Is that possible?
I can't find any info in the docs about doing this.
The TransformInput object doesn't contain that information.
You could use an API call to the Foundry build service to get information about the latest update performed on the input dataset, but that might give you wrong information if there have been multiple updates from different people since you last ran the transform. This also requires Palantir to allow calling Foundry APIs from your repository, if that's not already enabled.
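
As a rough illustration only, a polling sketch in Python. The endpoint path and response keys are hypothetical placeholders, since the exact build service API isn't shown here; treat this as the shape of the approach, not the real contract:

```python
import requests

# Hypothetical sketch only: the endpoint path and response keys are
# placeholders, since the build service API details aren't public here.
FOUNDRY_HOST = "https://your-foundry-instance.example.com"  # placeholder
DATASET_RID = "ri.foundry.main.dataset.xxxx"  # the input dataset

resp = requests.get(
    f"{FOUNDRY_HOST}/api/builds/datasets/{DATASET_RID}/latest",  # hypothetical path
    headers={"Authorization": "Bearer <your-token>"},
)
resp.raise_for_status()
latest = resp.json()

# Caveat from above: if several people uploaded since the transform last ran,
# this only reflects the most recent update.
print(latest.get("createdBy"))
```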

How do I filter out step counts that were entered manually from the Google Fitness REST API

I can now retrieve step count data from Google Fitness REST API.
https://www.googleapis.com/fitness/v1/users/me/dataset:aggregate
However, I can't tell which data is reliable (data not generated by user input).
After some research I found there is an originDataSourceId field in the documentation:
https://developers.google.com/fit/rest/v1/reference/users/dataSources/datasets#resource
But the description says:
WARNING: do not rely on this field for anything other than debugging. The value of this field, if it is set at all, is an implementation detail and is not guaranteed to remain consistent.
So I really don't know what to do. How do I filter out step counts that the user entered manually from the Google Fitness REST API?
I found a solution now. The originDataSourceId of a dataset is not reliable when you fetch it via aggregate (you may get data merged from different sources).
So get the data source list first:
https://developers.google.com/fit/rest/v1/reference/users/dataSources/list
You can filter the data sources by dataTypeName, device, etc.
Then use the dataStreamId of the data source as the dataSourceId in aggregateBy to aggregate the data:
https://developers.google.com/fit/rest/v1/reference/users/dataset/aggregate
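
Here is a minimal Python sketch of that two-step flow, assuming you already have an OAuth2 access token with a fitness scope. The user_input filter is a heuristic based on manual entries coming from a ...user_input data stream:

```python
import requests

TOKEN = "<oauth2-access-token>"  # needs a fitness scope
HEADERS = {"Authorization": f"Bearer {TOKEN}"}
BASE = "https://www.googleapis.com/fitness/v1/users/me"

# 1. List all data sources and keep step-count streams that are not manual
#    entries (manual entries come from a ...user_input data stream).
sources = requests.get(f"{BASE}/dataSources", headers=HEADERS).json()
step_sources = [
    s["dataStreamId"]
    for s in sources.get("dataSource", [])
    if s["dataType"]["name"] == "com.google.step_count.delta"
    and "user_input" not in s["dataStreamId"]
]

# 2. Aggregate the remaining streams by passing each dataStreamId as the
#    dataSourceId in aggregateBy.
body = {
    "aggregateBy": [{"dataSourceId": sid} for sid in step_sources],
    "bucketByTime": {"durationMillis": 86400000},  # daily buckets
    "startTimeMillis": 1690000000000,  # example range in epoch millis
    "endTimeMillis": 1690604800000,
}
agg = requests.post(f"{BASE}/dataset:aggregate", headers=HEADERS, json=body).json()
print(agg)
```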

Import custom field options from one Infusionsoft app to another

I am trying to find a way to update a custom field automatically within multiple Partner Infusionsoft apps when I manually update the field with new options within our main Infusionsoft app.
The idea is to avoid manually logging into every Partner Infusionsoft app that we manage, individually, to update the custom field options so that they match our main app custom field options when it is updated. The custom field is constantly being updated with new options that need to be mirrored within all of our partner apps.
The process would not need to be entirely automatic. We could manage using a trigger to update the rest of the apps whenever we have manually updated the custom field in our main app.
Can anyone please steer me in the right direction or tell me if this isn't even possible?
Yes, this is possible with the API. Unfortunately there isn't a REST webhook for when a custom field is modified, so it would require constantly polling what you consider the master application: check the custom field and see if its values have changed. Here is the REST documentation for that:
https://developer.infusionsoft.com/docs/rest/#!/Contact/retrieveContactModelUsingGET
That will include all the possible options for the field.
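
As a rough sketch, a minimal Python poll against that endpoint could look like the following; the custom_fields response key is an assumption, so verify it against your app's actual payload:

```python
import time
import requests

MASTER_TOKEN = "<master-app-access-token>"
MODEL_URL = "https://api.infusionsoft.com/crm/rest/v1/contacts/model"

last_seen = None
while True:
    resp = requests.get(MODEL_URL, headers={"Authorization": f"Bearer {MASTER_TOKEN}"})
    resp.raise_for_status()
    # "custom_fields" is the assumed response key; inspect your payload.
    custom_fields = resp.json().get("custom_fields", [])
    if custom_fields != last_seen:
        print("Custom field options changed; push them to the partner apps here.")
        last_seen = custom_fields
    time.sleep(300)  # poll gently to stay under usage limits
```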
Unfortunately you can't modify a custom field with the newer REST API, so you will have to use XML-RPC; according to the documentation, the REST API only lets you create them.
Here are the docs for the endpoint you would use with XML-RPC to update all the other apps to match the master application:
https://developer.infusionsoft.com/docs/xml-rpc/#data
https://developer.infusionsoft.com/docs/table-schema/
DataFormField is the custom field information. You would use the data endpoint to modify the values.
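
A minimal Python sketch of that update for one partner app, assuming the option list lives in the DataFormField Values field as a newline-separated string (verify against the table schema docs):

```python
import xmlrpc.client

PARTNER_API_KEY = "<partner-app-api-key>"
# Per-app XML-RPC endpoint for one partner application.
server = xmlrpc.client.ServerProxy("https://partnerapp.infusionsoft.com/api/xmlrpc")

# Find the matching DataFormField record in the partner app by name.
fields = server.DataService.query(
    PARTNER_API_KEY, "DataFormField", 10, 0,
    {"Name": "MyCustomField"}, ["Id", "Name", "Values"],
)
if fields:
    # Overwrite the option values with the master app's list. The
    # newline-separated "Values" format is an assumption to verify.
    master_options = "Option A\nOption B\nOption C"  # pulled from the master app
    server.DataService.update(
        PARTNER_API_KEY, "DataFormField", fields[0]["Id"],
        {"Values": master_options},
    )
```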
Be careful not to go over usage limits when polling the master application, and apply best practices.

Implementing IoT PowerBI table schema

I'm currently implementing an IoT solution that has a bunch of sensors sending information in JSON format through a gateway.
I was reading about doing this on Azure but couldn't quite figure out how the JSON schema and Event Hubs work together to display the info in PowerBI.
Can I create a schema and upload it to PowerBI then connect it to my device?
There are multiple sides to this. To start with, IoT ingestion in Azure is done through Event Hubs, as you mentioned. If your gateway can make a RESTful call to the Event Hubs endpoint, Event Hubs will receive the data and store it temporarily for the specified retention period. Stream Analytics then consumes the data from Event Hubs, lets you do further processing, and diverts it to different outputs. In your case, you can set one of the outputs to be a PowerBI dashboard, which you authorize with an organizational account (more on that later), and the output will automatically be tied to PowerBI.

The data schema part is interesting: the JSON itself defines the data table schema to be used on the PowerBI side, and it propagates from Event Hubs to Stream Analytics to PowerBI with the first JSON package sent. Once the schema is there it is fixed, and the rest of the data being streamed in should be in the same format.
If you don't have an organizational account at hand to use with PowerBI, you can register your domain under Azure Active Directory and use that account since it is considered within your org.
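
To illustrate the gateway-side RESTful call, here is a minimal Python sketch that signs a standard Service Bus SAS token and posts one JSON event; the namespace, hub name, and policy key are placeholders:

```python
import base64
import hashlib
import hmac
import time
import urllib.parse
import requests

NAMESPACE = "your-namespace"  # placeholder
HUB = "your-eventhub"         # placeholder
KEY_NAME = "send-policy"      # shared access policy with Send rights
KEY = "<policy-key>"

# Build a standard Service Bus SAS token for the hub's messages endpoint.
uri = f"https://{NAMESPACE}.servicebus.windows.net/{HUB}/messages"
expiry = str(int(time.time()) + 3600)
encoded_uri = urllib.parse.quote_plus(uri)
sig = base64.b64encode(
    hmac.new(KEY.encode(), f"{encoded_uri}\n{expiry}".encode(), hashlib.sha256).digest()
).decode()
sas = (
    f"SharedAccessSignature sr={encoded_uri}"
    f"&sig={urllib.parse.quote_plus(sig)}&se={expiry}&skn={KEY_NAME}"
)

# The first JSON payload sent defines the table schema PowerBI ends up with,
# so keep later events in the same shape.
event = {"deviceId": "sensor-01", "temperature": 21.5, "ts": int(time.time())}
resp = requests.post(uri, headers={"Authorization": sas}, json=event)
resp.raise_for_status()
```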
There may be a way of altering the schema afterwards using the PowerBI REST API. Kindly find the links below; I haven't tried it myself, though.
https://msdn.microsoft.com/en-us/library/mt203557.aspx
Stream analytics with powerbi
Hope this helps, let me know if you need further info.
One way to achieve this is to send your data to Azure Event Hubs, then read it and send it to PowerBI with Stream Analytics. Listing all the steps here would be too long. I suggest you take a look at a series of blog posts I wrote describing how I built a demo similar to what you are trying to achieve. That should give you enough info to get started.
http://guyb.ca/IoTAzureDemo

MySQL data into Google Analytics

I have a client who wants to retrospectively add data into GA from their back-end MySQL database. The data will be transactional, for example when an existing customer makes a recurring payment via BACS/bank transfer.
Is it possible to do this, if so a) how and b) can it be automated?
I'm not sure if there is some script we can implement or whether we have to manually export the data into GA.
Thanks!
This is quite possible, depending on what sort of data is in your database. Essentially, to upload anything to Google Analytics you'll need some sort of key to key off of, e.g. transactionId. Then you can use Data Import (located under the Property settings of the account) to upload your data.
To automate it, you can use the Management API.
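
For the automation part, a minimal Python sketch using the Management API's uploads endpoint; the account, property, and Data Set IDs are placeholders, and the CSV must match the Data Set definition in the GA property:

```python
from google.oauth2 import service_account
from googleapiclient.discovery import build
from googleapiclient.http import MediaFileUpload

creds = service_account.Credentials.from_service_account_file(
    "service-account.json",
    scopes=["https://www.googleapis.com/auth/analytics.edit"],
)
analytics = build("analytics", "v3", credentials=creds)

# Upload a CSV keyed on a dimension (e.g. ga:transactionId) that matches
# the Data Set definition configured under the GA property.
media = MediaFileUpload("bacs_payments.csv", mimetype="application/octet-stream")
analytics.management().uploads().uploadData(
    accountId="12345678",           # placeholder
    webPropertyId="UA-12345678-1",  # placeholder
    customDataSourceId="abcdefgh",  # the Data Set id from GA admin
    media_body=media,
).execute()
```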