Publish Data via Kafka with Palantir Foundry

We would like to publish data (records of a dataset) via Kafka to our enterprise service bus. We are on-prem. I know that it is possible to consume data via Kafka, but I have not found any documentation on how to publish it. Is this possible?

Related

How to recover recently deleted API from Azure APIM?

Is there any way to restore a deleted API on Azure APIM? The API was deleted recently (3 days ago).
We take a daily backup of our Azure APIM instance, but we want to avoid a complete instance restore because the APIM service has 100+ APIs. Can we restore a single API from the backup?
You cannot recover a single deleted API from an Azure API Management instance.
As per the current Azure documentation, you can recover a soft-deleted API Management instance; this feature is currently in preview.
It depends on how you delete an API Management instance: the instance is either soft-deleted and recoverable during a retention period, or it is permanently deleted:
When you use the Azure portal or REST API version 2020-06-01-preview or later to delete an API Management instance, it is soft-deleted.
An API Management instance deleted using a REST API version earlier than 2020-06-01-preview is permanently deleted.
An API Management instance deleted using API Management commands in Azure PowerShell or Azure CLI is permanently deleted.

How to publish AWS SNS data to MySql database

I am new to AWS and databases.
Since I am a complete beginner, any suggestions will be appreciated.
Currently the plan for the project is that data from an AWS database will be pushed to an external MySQL database using SNS HTTP fanout.
NOTE:
1. The data will be pushed by the client using AWS SNS.
2. We have no access to the AWS account, nor are we planning to have an AWS account.
3. The external MySQL database is a private database running on a Linux server.
I have gone through the official documentation of AWS SNS and also some websites. This is all I found:
Use external applications like Zapier to map the data.
Develop some application to map the data.
Is it like using a servlet application on the receiver side to update the table, or are there other methods?
AWS DB -----> SNS -----> _________ -----> External MySql DB
Thanks
If you cannot have an AWS account, you can have your own web server consume the SNS messages. SNS can deliver messages to an HTTP/HTTPS endpoint in a predefined structure. You can enable such an endpoint on your own server and share your server URL with the AWS account owner. They can then create a subscription from their SNS topic to your endpoint.
For setting up this endpoint there are many options; ExpressJS is one popular framework for quickly implementing HTTP APIs.
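As a rough illustration of what such an endpoint has to do (confirm the subscription, then handle notifications), here is a minimal sketch using Python and Flask instead of Express; the endpoint path and the way records are stored are assumptions.

```python
# Minimal sketch of an HTTP endpoint that can receive SNS messages.
# Assumes Flask is installed; how each record is stored is left as a stub.
import json
import urllib.request

from flask import Flask, request

app = Flask(__name__)

def save_record(record):
    # Placeholder: insert the record into the external MySQL database here.
    print("received:", record)

@app.route("/sns", methods=["POST"])
def sns_endpoint():
    # SNS posts JSON with a text/plain content type, so parse the raw body.
    body = json.loads(request.data)
    msg_type = request.headers.get("x-amz-sns-message-type")

    if msg_type == "SubscriptionConfirmation":
        # Visit the SubscribeURL once so SNS marks the subscription as confirmed.
        urllib.request.urlopen(body["SubscribeURL"])
    elif msg_type == "Notification":
        save_record(json.loads(body["Message"]))  # assumes the publisher sends JSON
    return "", 200

if __name__ == "__main__":
    app.run(port=8080)
```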
Probably option two would be more suitable, or at least the first to consider. For that option you would have to develop a Lambda function that receives data from SNS, reformats it if needed, and uploads it to MySQL. So your architecture would look like:
Data ---> SNS ---> Lambda function ---> MySQL
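A minimal sketch of such a Lambda handler in Python (the SNS event shape is standard; the environment variables and table schema are assumptions):

```python
# Sketch of a Lambda handler that takes SNS records and writes them to MySQL.
# Assumes pymysql is packaged with the function and the database is reachable
# from the Lambda (e.g. via VPC configuration); the table schema is made up.
import json
import os

import pymysql

def lambda_handler(event, context):
    conn = pymysql.connect(
        host=os.environ["DB_HOST"],
        user=os.environ["DB_USER"],
        password=os.environ["DB_PASSWORD"],
        database=os.environ["DB_NAME"],
    )
    try:
        with conn.cursor() as cur:
            for record in event["Records"]:
                payload = json.loads(record["Sns"]["Message"])  # assumes JSON messages
                cur.execute(
                    "INSERT INTO events (id, payload) VALUES (%s, %s)",
                    (payload.get("id"), json.dumps(payload)),
                )
        conn.commit()
    finally:
        conn.close()
    return {"inserted": len(event["Records"])}
```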
Depending on the amount of data coming into SNS, you may also add SQS queues to the mix to buffer the records and enable a fan-out architecture. For example:
              /---> SQS queue 1 ---> Lambda function 1 ---> MySQL
Data ---> SNS
              \---> SQS queue 2 ---> Lambda function 2 / EC2 instance / container ---> other destination
Other solutions are possible. But I would first consider the above, before looking into other ways.

Kafka Integration with Salesforce

We have to read a CSV file from an SFTP server/folder and, with the help of Kafka, push this data into a Salesforce SObject. Similarly, we have to read SObject data from Salesforce, convert it into a CSV file, and send it to the SFTP server.
Can you please give me some idea about this? How can I achieve it?
I did not check this very carefully, but maybe these are worth a read; if you can query data from Salesforce, you can convert it to any format.
Salesforce Connector (Source and Sink) for Confluent Platform — Confluent Platform
Salesforce Platform Events Sink Connector for Confluent Platform — Confluent Platform
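The connectors linked above cover the Kafka-to-Salesforce side. A rough Python sketch of the other leg, reading the CSV from SFTP and producing one Kafka message per row, could look like the following; the host, credentials, path and topic are assumptions, and a sink connector would then consume the topic.

```python
# Sketch of the SFTP -> Kafka leg: fetch a CSV over SFTP and produce one Kafka
# message per row. Host, credentials, path and topic names are assumptions.
import csv
import io
import json

import paramiko
from confluent_kafka import Producer

def publish_csv(host, user, password, remote_path, topic):
    # Fetch the CSV file from the SFTP server.
    transport = paramiko.Transport((host, 22))
    transport.connect(username=user, password=password)
    sftp = paramiko.SFTPClient.from_transport(transport)
    with sftp.open(remote_path) as f:
        rows = list(csv.DictReader(io.StringIO(f.read().decode("utf-8"))))
    transport.close()

    # Produce one JSON message per CSV row; a Salesforce sink connector can
    # then pick the records up from this topic.
    producer = Producer({"bootstrap.servers": "localhost:9092"})
    for row in rows:
        producer.produce(topic, value=json.dumps(row).encode("utf-8"))
    producer.flush()

if __name__ == "__main__":
    publish_csv("sftp.example.com", "etl", "secret", "/inbound/accounts.csv", "sfdc-accounts")
```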

Sync RDBMS with Apache Directory Ldap

Currently, I have a requirement to sync data from Apache Directory LDAP to one of the RDBMS databases (MySQL, PostgreSQL). The directory holds approximately a few million records for now and may grow in the future. The LDAP directory is the primary data source for now, but the motive is to have real-time data in both LDAP and the RDBMS, since we plan to use the RDBMS for real-time analytics.
Option1:
Thinking of using Spring Cloud Data Flow. A source Spring Boot app reads the LDAP data that changed after the last sync run. The source app pushes the data to a queue (RabbitMQ for now). The sink would be another Spring Boot app that collects the data directly from the queue and persists it into the RDBMS. We would be able to better track and manage the sync jobs using the Spring Cloud Data Flow dashboard.
Option2:
Spring LdapTemplate helps us talk to the LDAP directory in our application. One approach would be to intercept the LdapTemplate calls wherever applicable, push the data to a queue, and then have an intermediate app read the data from the queue (RabbitMQ) and convert the LDAP response into the format required to update the RDBMS.
I am new to LDAP and Spring Cloud Data Flow. So far I have come up with only these two approaches, considering my project's existing technology and system landscape. Any other suggestions/approaches are really appreciated. Thanks in advance.
Another approach, if the LDAP server is Microsoft AD: create a Windows service in C# that connects to your LDAP server, fetches data every day, and sends it to your RDBMS over a socket connection. This is reliable and consistent.
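As an illustration of the scheduled-fetch idea (independent of the Spring options above), here is a minimal Python sketch that pulls entries modified since the last run and upserts them into PostgreSQL; the server, base DN, attributes and table layout are all assumptions.

```python
# Sketch of a scheduled LDAP -> RDBMS sync: read entries changed since the last
# run and upsert them into PostgreSQL. Server, base DN, attributes and the
# table layout are assumptions.
import psycopg2
from ldap3 import Connection, Server

def sync(last_run_ts):  # e.g. "20240101000000Z" (LDAP generalized time)
    server = Server("ldap://ldap.example.com")
    conn = Connection(server, user="cn=admin,dc=example,dc=com",
                      password="secret", auto_bind=True)
    conn.search(
        search_base="ou=people,dc=example,dc=com",
        search_filter=f"(modifyTimestamp>={last_run_ts})",
        attributes=["uid", "cn", "mail"],
    )

    db = psycopg2.connect(host="localhost", dbname="analytics", user="etl", password="secret")
    with db, db.cursor() as cur:
        for entry in conn.entries:
            cur.execute(
                """
                INSERT INTO people (uid, cn, mail) VALUES (%s, %s, %s)
                ON CONFLICT (uid) DO UPDATE SET cn = EXCLUDED.cn, mail = EXCLUDED.mail
                """,
                (str(entry.uid), str(entry.cn), str(entry.mail)),
            )
    db.close()
```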

Azure VM Web Access

I am constructing a scaling ETL process and am considering using small Azure VMs to do this, but can't tell from reading the documentation if these VMs have web access.
Example: one ETL process reads a web API and imports data into MySQL and another reads from an email account and imports the data.
Is Azure compatible with reading web APIs/email APIs inside these VMs - both for Linux and Windows?
VMs in Azure have internet access, so grabbing data from an external API is possible.
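For example, a minimal Python sketch of the first ETL step run from inside such a VM, pulling JSON from a web API and loading it into MySQL (the API URL and the table schema are made up):

```python
# Sketch of an ETL step run inside the VM: pull JSON from a web API and load
# it into MySQL. The API URL and the table schema are made up.
import json

import pymysql
import requests

def load_api_into_mysql():
    resp = requests.get("https://api.example.com/v1/orders", timeout=30)
    resp.raise_for_status()
    orders = resp.json()

    conn = pymysql.connect(host="localhost", user="etl", password="secret", database="warehouse")
    try:
        with conn.cursor() as cur:
            for order in orders:
                cur.execute(
                    "INSERT INTO orders (id, payload) VALUES (%s, %s) "
                    "ON DUPLICATE KEY UPDATE payload = VALUES(payload)",
                    (order["id"], json.dumps(order)),
                )
        conn.commit()
    finally:
        conn.close()

if __name__ == "__main__":
    load_api_into_mysql()
```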