I have a client who wants to retrospectively add data into GA from their back-end MySQL database. The data will be transactional, for example when an existing customer makes a recurring payment via BACS/bank transfer.
Is it possible to do this? If so, a) how, and b) can it be automated?
I'm not sure whether there is a script we can implement or whether we have to manually export the data and import it into GA.
Thanks!
This is quite possible depending on what sort of data is in your database. Essentially, to upload anything to Google Analytics, you'll need some sort of key to key off of, e.g. a transactionId. Then you can use Data Import (located under the Property settings of the account) to upload your data.
To automate it, you can use the Management API.
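For illustration, a minimal sketch of what an automated upload could look like, assuming Universal Analytics, the Python client library, and a service account; the account ID, property ID, data-source ID, and CSV file are placeholders you'd replace with your own:

# Sketch: push a CSV of transactions (exported from MySQL) to a GA Data Import
# data source via the Management API v3. All IDs below are placeholders.
from googleapiclient.discovery import build
from googleapiclient.http import MediaFileUpload
from google.oauth2 import service_account

SCOPES = ["https://www.googleapis.com/auth/analytics.edit"]
creds = service_account.Credentials.from_service_account_file(
    "service-account.json", scopes=SCOPES)

analytics = build("analytics", "v3", credentials=creds)

# CSV keyed off ga:transactionId, matching the schema of the Data Import data source
media = MediaFileUpload("transactions.csv",
                        mimetype="application/octet-stream")

analytics.management().uploads().uploadData(
    accountId="12345678",            # placeholder account ID
    webPropertyId="UA-12345678-1",   # placeholder property ID
    customDataSourceId="abcdEFGH",   # placeholder Data Import data-source ID
    media_body=media,
).execute()

Scheduling that script with cron (or any job runner) against a nightly MySQL export would give you the automation.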
Related
I want to create a historical dataset from a snapshot uploaded table in Palantir Foundry. I want to add into that table the user who did the upload. Is that possible?
Can't find on the docs info about doing this.
TransformInput object doesn't contain that information.
You could use an API call to the Foundry build service to get information about the latest update performed on the input dataset, but that might give you the wrong information if there were multiple updates from different people since you last ran the transform. This also requires Palantir to enable calling Foundry APIs from your repository, if that isn't already allowed.
I have a HubSpot account where my team updates my customers' details in three sheets, such as Contacts, Activities, etc. Now I want to integrate the data available in these sheets into my MySQL database.
I want the changes done in the HubSpot sheets to be automatically transferred to my MySQL database.
Is there any way to do so?
Unfortunately, there is no direct connector between HubSpot and MySQL out of the box. You have to create all the relevant tables yourself in your MySQL database and then populate them with the desired data via the HubSpot APIs; for instance, use https://legacydocs.hubspot.com/docs/methods/lists/get_lists to retrieve all contact lists.
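As a rough illustration, a script along these lines could pull the contact lists from that legacy endpoint and write them into a table you've created; the API key, database credentials, and table/column names are all assumptions:

# Sketch: copy HubSpot contact lists into a MySQL table you created yourself.
# The API key, connection details, and hubspot_lists table are placeholders.
import requests
import pymysql

HAPIKEY = "your-hubspot-api-key"  # legacy API-key auth; newer accounts use private-app tokens

resp = requests.get(
    "https://api.hubapi.com/contacts/v1/lists",
    params={"hapikey": HAPIKEY, "count": 250},
)
resp.raise_for_status()
lists = resp.json().get("lists", [])

conn = pymysql.connect(host="localhost", user="app", password="secret", database="crm")
with conn.cursor() as cur:
    for lst in lists:
        cur.execute(
            "REPLACE INTO hubspot_lists (list_id, name, size) VALUES (%s, %s, %s)",
            (lst["listId"], lst["name"], lst.get("metaData", {}).get("size", 0)),
        )
conn.commit()
conn.close()

You would repeat the same pattern for contacts, activities, and any other objects you need, running it on a schedule to keep MySQL in step.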
Alternatively, you can look at commercial solutions such as Zapier, Skyvia, Hull, etc.
For example, configuring a Zapier connector is straightforward:
Sign up for free here; the trial gives you two weeks to explore Zapier's features;
Authenticate MySQL in Zapier -> MyApps -> Connect a new account (select MySQL) to allow Zapier to access your MySQL account;
Authenticate HubSpot similar to the above step for MySQL;
Pick the HubSpot app as the trigger to kick off your data sync, for example, "Contact Recently Created or Updated";
Choose a resulting action from the MySQL app, for example, "Update Row";
Select the data you want to send from HubSpot to MySQL;
That is all, enjoy!
You can trial Zapier for two weeks, which should be enough to validate the integration before signing up for a commercial plan; once logged in to Zapier you can see all available plans here.
Sequin recently launched a HubSpot/Postgres integration along the lines you're describing. There's a technical writeup describing how it works, which should be cross-applicable to MySQL.
The API strategy:
Use list endpoints to establish a base state in SQL.
Use the /search endpoints with a stored cursor: fetch new changes and apply them to the base state (see the sketch below).
I wrote a longer post on why we don't use the Webhooks API for this.
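For illustration, a minimal sketch of the cursor step, assuming HubSpot's CRM v3 /search endpoint and a private-app token; the property name, file-based cursor, and upsert step are simplifications:

# Sketch: incremental sync via the CRM v3 search endpoint and a stored cursor
# (here just a timestamp kept in a local file). Token is a placeholder.
import json, time, requests

TOKEN = "your-private-app-token"
CURSOR_FILE = "last_sync_ms.json"

def load_cursor():
    try:
        return json.load(open(CURSOR_FILE))["last_ms"]
    except FileNotFoundError:
        return 0  # first run: fall back to the full list-endpoint backfill

def save_cursor(ms):
    json.dump({"last_ms": ms}, open(CURSOR_FILE, "w"))

since = load_cursor()
resp = requests.post(
    "https://api.hubapi.com/crm/v3/objects/contacts/search",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={
        "filterGroups": [{"filters": [{
            "propertyName": "lastmodifieddate",  # contacts; other objects use hs_lastmodifieddate
            "operator": "GT",
            "value": str(since),
        }]}],
        "sorts": ["lastmodifieddate"],
        "limit": 100,
    },
)
resp.raise_for_status()
for contact in resp.json()["results"]:
    pass  # upsert the changed contact into your SQL base state here

save_cursor(int(time.time() * 1000))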
I have an external URL that returns JSON data. Is there any way within Firebase to schedule my database to import the data at the URL?
I'd like to schedule the import daily; however, I can't find any tutorials on how to do this.
Thanks
The dashboard isn't the right way to do this. If you want to put data into Realtime Database on a schedule, use an external scheduling system, such as the cron system found in App Engine, to trigger some code that you write that adds the data you want. You can use the Admin SDK to write that code or simply use the REST API to put JSON into the place you want.
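As an example, a minimal sketch of the code the scheduler would trigger, using the Admin SDK for Python; the source URL, credentials file, and database path are placeholders:

# Sketch: fetch JSON from an external URL and write it into Realtime Database.
# The feed URL, service-account file, and database path are placeholders.
import requests
import firebase_admin
from firebase_admin import credentials, db

cred = credentials.Certificate("service-account.json")
firebase_admin.initialize_app(cred, {
    "databaseURL": "https://your-project-id.firebaseio.com",
})

data = requests.get("https://example.com/feed.json", timeout=30).json()
db.reference("imports/daily").set(data)  # overwrite the node with today's payload

The same write could be done with a plain HTTP PUT to https://your-project-id.firebaseio.com/imports/daily.json if you prefer the REST API over the Admin SDK.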
I'm currently implementing an IoT solution that has a bunch of sensors sending information in JSON format through a gateway.
I was reading about doing this on Azure but couldn't quite figure out how the JSON schema and Event Hubs work together to display the info in Power BI.
Can I create a schema and upload it to PowerBI then connect it to my device?
There are multiple sides to this. To start with, IoT ingestion in Azure is done through Event Hubs, as you mentioned. If your gateway is able to make a RESTful call to the Event Hubs endpoint, Event Hubs will receive the data and store it temporarily for the specified retention period. Then Stream Analytics will consume the data from Event Hubs and enable you to do further processing and divert the data to different outputs. In your case, you can set one of the outputs to be a Power BI dashboard, which you authorize with an organizational account (more on that later), and the output will automatically be tied to Power BI.
The data schema part is interesting: the JSON itself defines the data table schema used on the Power BI side, and it propagates from Event Hubs to Stream Analytics to Power BI with the first JSON package sent. Once the schema is there it is fixed, and the rest of the data being streamed in should be in the same format.
If you don't have an organizational account at hand to use with Power BI, you can register your domain under Azure Active Directory and use that account, since it is considered within your org.
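For illustration, if the gateway can run Python, sending a JSON reading into Event Hubs with the azure-eventhub SDK might look like this; the connection string, hub name, and payload fields are placeholders:

# Sketch: send one JSON-encoded sensor reading to an Event Hub.
# Connection string, hub name, and reading fields are placeholders.
import json
from azure.eventhub import EventHubProducerClient, EventData

producer = EventHubProducerClient.from_connection_string(
    conn_str="Endpoint=sb://<namespace>.servicebus.windows.net/;SharedAccessKeyName=...;SharedAccessKey=...",
    eventhub_name="sensor-readings",
)

reading = {"deviceId": "sensor-01", "temperature": 21.4, "timestamp": "2016-05-01T12:00:00Z"}

with producer:
    batch = producer.create_batch()
    # Keep every message in the same JSON shape, since the first message fixes
    # the table schema that flows through Stream Analytics into Power BI.
    batch.add(EventData(json.dumps(reading)))
    producer.send_batch(batch)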
There may be a way of altering the schema afterwards using the Power BI REST API. Kindly find the links below; I haven't tried it myself, though.
https://msdn.microsoft.com/en-us/library/mt203557.aspx
Stream Analytics with Power BI
Hope this helps, let me know if you need further info.
One way to achieve this is to send your data to Azure Event Hubs, then read it and send it to Power BI with Stream Analytics. Listing all the steps here would be too long. I suggest that you take a look at a series of blog posts I wrote describing how I built a demo similar to what you're trying to achieve. That should give you enough info to get started.
http://guyb.ca/IoTAzureDemo
I have a web app with a MySQL database we maintain in the cloud that we are trying to integrate with our QuickBooks Online account. We want to sync data between our web app's database and QuickBooks Online, such as customer names and addresses. If they update their address in our web app, it's easy to then update it in QuickBooks Online using the QuickBooks Online API. However, if they tell us their new address over the phone and we change it directly in QuickBooks Online, we have no idea how to have that trigger something so that it automatically updates our MySQL web app. How do we go about doing this or learning about this process?
Intuit/QuickBooks has an API that's specifically geared towards this use-case. From the docs:
The change data capture (CDC) operation returns a list of entities that have changed since a specified time. This operation is for an app that periodically polls Data Services and then refreshes its local copy of entity data.
Docs are here:
https://developer.intuit.com/docs/0100_accounting/0300_developer_guides/change_data_capture
Basically you make an OAuth signed HTTP GET request like this:
https://quickbooks.api.intuit.com/v3/company/1234/cdc?entities=Class,Item,Invoice&changedSince=2012-07-20T22:25:51-07:00
And you get back a list of objects that have changed since the given date/time.
Your application can remember the last time you called this, and periodically call this API to get things that have changed since the last time you called it.
You get back something like this:
<IntuitResponse xmlns="http://schema.intuit.com/finance/v3" time="2013-04-03T10:36:19.393Z">
  <CDCResponse>
    <QueryResponse>
      <Customer>...
      </Customer>
      <Customer>...
      </Customer>
    </QueryResponse>
    <QueryResponse>
      <Invoice>...
      </Invoice>
      <Invoice>...
      </Invoice>
    </QueryResponse>
  </CDCResponse>
</IntuitResponse>
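For illustration, a rough polling sketch against the CDC endpoint; it assumes an OAuth 2.0 bearer token, asks for JSON instead of XML (the response shape is assumed to mirror the XML above), and the realm ID, entity list, and state file are placeholders:

# Sketch: poll the QuickBooks Online CDC endpoint for entities changed since the
# last run. The access token, realm (company) ID, and state file are placeholders.
import json
import requests
from datetime import datetime, timezone

ACCESS_TOKEN = "your-oauth2-access-token"
REALM_ID = "1234"
STATE_FILE = "last_cdc_run.json"

def load_changed_since():
    try:
        return json.load(open(STATE_FILE))["changed_since"]
    except FileNotFoundError:
        return "2012-07-20T22:25:51-07:00"  # initial backfill point

changed_since = load_changed_since()
resp = requests.get(
    f"https://quickbooks.api.intuit.com/v3/company/{REALM_ID}/cdc",
    params={"entities": "Customer,Invoice", "changedSince": changed_since},
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}", "Accept": "application/json"},
)
resp.raise_for_status()

for block in resp.json()["CDCResponse"][0]["QueryResponse"]:
    for customer in block.get("Customer", []):
        pass  # update the matching row in your MySQL customers table here
    for invoice in block.get("Invoice", []):
        pass  # handle changed invoices here

json.dump({"changed_since": datetime.now(timezone.utc).isoformat()},
          open(STATE_FILE, "w"))

Run this on a schedule (cron, a queue worker, etc.), and each pass applies whatever changed in QuickBooks Online since the previous pass to your MySQL database.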