Kafka Integration with Salesforce - CSV

We have to read a CSV file from an SFTP server/folder and, with the help of Kafka, push this data into a Salesforce SObject. Similarly, we have to read any SObject data from Salesforce, convert it into a CSV file, and send it to the SFTP server.
Can you please give me some idea of how I can achieve this?

I did not check it very carefully, but maybe you can have a read of these; if you can query data from Salesforce, then you can change it into any format.
Salesforce Connector (Source and Sink) for Confluent Platform — Confluent Platform
Salesforce Platform Events Sink Connector for Confluent Platform — Confluent Platform
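
For the SFTP-to-Salesforce direction, one way to wire this up is to run both connectors on a Kafka Connect cluster and create them through the Connect REST API. Below is a minimal sketch; the connector class names and config keys are illustrative, written from memory of the Confluent docs linked above, so verify them against those pages before use (hosts, credentials, topics, and paths are placeholders):

```python
import json
import requests

CONNECT_URL = "http://localhost:8083/connectors"  # Kafka Connect REST API (assumed host/port)

# Source: read CSV files from SFTP and publish each row to a Kafka topic.
# Config keys follow the Confluent SFTP CSV Source connector docs (verify there).
sftp_source = {
    "name": "sftp-csv-source",
    "config": {
        "connector.class": "io.confluent.connect.sftp.SftpCsvSourceConnector",
        "kafka.topic": "sfdc-contacts",
        "sftp.host": "sftp.example.com",
        "sftp.port": "22",
        "sftp.username": "user",
        "sftp.password": "secret",
        "input.path": "/inbox",
        "finished.path": "/done",
        "error.path": "/error",
        "input.file.pattern": ".*\\.csv",
        "schema.generation.enabled": "true",
    },
}

# Sink: consume the topic and write records into a Salesforce SObject.
# Config keys follow the Confluent Salesforce SObject Sink connector docs (verify there).
salesforce_sink = {
    "name": "salesforce-sobject-sink",
    "config": {
        "connector.class": "io.confluent.salesforce.SalesforceSObjectSinkConnector",
        "topics": "sfdc-contacts",
        "salesforce.instance": "https://login.salesforce.com",
        "salesforce.username": "user@example.com",
        "salesforce.password": "secret",
        "salesforce.password.token": "security-token",
        "salesforce.object": "Contact",
    },
}

for connector in (sftp_source, salesforce_sink):
    resp = requests.post(CONNECT_URL, json=connector)
    resp.raise_for_status()
    print(json.dumps(resp.json(), indent=2))
```

The reverse direction (Salesforce to CSV on SFTP) is the mirror image: a Salesforce source connector feeding a topic, and an SFTP sink connector writing CSV files from that topic.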

Related

Publish Data via Kafka with Palantir Foundry

We would like to publish data (records of a dataset) via Kafka to our enterprise service bus. We are on-prem. I know that it is possible to consume data via Kafka, but I have not found any documentation on how to publish it. Is this possible?

Kafka JDBC Sink connector with json messages without schema

I am trying to load JSON messages into a Postgres database using the Postgres sink connector. I have been reading online and have only found the option of having the schema in the JSON message; ideally, however, I would like not to include the schema in the message. Is there a way to register the JSON schema in the Schema Registry and use that, like it's done with Avro?
Also, I'm currently running Kafka by downloading the bin, as I had several problems running Kafka Connect with Docker due to ARM compatibility issues. Is there a similar install for Schema Registry? I'm only finding the option of downloading it through Confluent and running it on Docker. Is it possible to run only Schema Registry with Docker, keeping my current setup?
Thanks
JSON without schema
The JDBC sink requires a schema
Is there a way to register the JSON schema in the schema registry and use that like it's done with Avro?
Yes, the Registry supports JSONSchema (as well as Protobuf), in addition to Avro. This requires you to use a specific serializer; you cannot just send plain JSON to the topic.
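For example, with the confluent-kafka Python client you can register and use a JSON Schema via its JSONSerializer. A minimal sketch, assuming a local broker and Schema Registry (the schema, topic, and URLs are made up):

```python
from confluent_kafka import SerializingProducer
from confluent_kafka.schema_registry import SchemaRegistryClient
from confluent_kafka.schema_registry.json_schema import JSONSerializer
from confluent_kafka.serialization import StringSerializer

schema_str = """
{
  "$schema": "http://json-schema.org/draft-07/schema#",
  "title": "Reading",
  "type": "object",
  "properties": {
    "id": {"type": "integer"},
    "name": {"type": "string"}
  },
  "required": ["id"]
}
"""

sr = SchemaRegistryClient({"url": "http://localhost:8081"})  # assumed registry URL

producer = SerializingProducer({
    "bootstrap.servers": "localhost:9092",
    "key.serializer": StringSerializer("utf_8"),
    "value.serializer": JSONSerializer(schema_str, sr),
})

# The serializer registers the schema under "readings-value" and prepends the
# magic byte + schema ID, so the message on the wire is not plain JSON.
producer.produce(topic="readings", key="1", value={"id": 1, "name": "sensor-a"})
producer.flush()
```

On the Connect side, the JDBC sink would then use value.converter=io.confluent.connect.json.JsonSchemaConverter (plus value.converter.schema.registry.url) so it can fetch the schema from the registry instead of expecting it inline.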
currently running kafka by downloading the bin... Is there a similar install for schema registry?
The Confluent Schema Registry is not distributed as a standalone package outside of Docker. You'd have to download Confluent Platform in place of Kafka, copy your existing zookeeper.properties and server.properties into it, and then run Schema Registry. Otherwise, compile it from source and build a standalone distribution with mvn -Pstandalone package.
There are other registries that exist, such as Apicurio

Does Foundry Data Connection support SFTP sources?

When using Data Connection, can I ingest files into Foundry from an SFTP (SSH File Transfer Protocol, not FTPS) source?
Data Connection doesn't support SFTP out of the box. However, there is a custom source.
Ask your Palantir team to install it.

Can we use rollback(transactions) in Azure blob while uploading multiple files using SSIS package?

I am using SSIS to upload multiple files to Azure Blob Storage. The requirement is that when any file upload fails, we need to roll back the transaction. I have tried the transaction option in SSIS, but so far I am not able to roll back data from Blob Storage.
Has anyone tried the rollback option in Azure Blob Storage? Please let me know your thoughts on this.
Thanks
Vidya
There are two options. If you are importing .csv or AVRO files, you can use the SQL Server Integration Services Feature Pack for Azure, where:
The Azure Blob Source component enables an SSIS package to read data from an Azure blob. The supported file formats are: CSV and AVRO.
If you are not working with these file formats, then use the solution offered in this SO thread, How to transaction rollback in SSIS?, which relies on Sequence Containers.
The SSIS Feature Pack for Azure contains connectors for various other data services and might be useful if you are going to be consuming additional services in the future.
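
Worth noting either way: Azure Blob Storage has no multi-file transactions, so any "rollback" has to be a compensating action that deletes the blobs already uploaded. As a rough illustration of that pattern outside SSIS (a sketch, assuming the azure-storage-blob Python SDK; the connection string and container name are placeholders):

```python
import os
from azure.storage.blob import BlobServiceClient

# Placeholder connection string and container; replace with your own.
service = BlobServiceClient.from_connection_string(
    os.environ["AZURE_STORAGE_CONNECTION_STRING"]
)
container = service.get_container_client("uploads")

def upload_all_or_nothing(paths):
    """Upload every file; delete already-uploaded blobs if any upload fails."""
    uploaded = []
    try:
        for path in paths:
            name = os.path.basename(path)
            with open(path, "rb") as data:
                container.upload_blob(name=name, data=data)
            uploaded.append(name)
    except Exception:
        # Compensating "rollback": remove the blobs that made it up.
        for name in uploaded:
            container.delete_blob(name)
        raise
```

A Sequence Container in SSIS plays the same role: a failure path that runs cleanup tasks against the blobs written so far.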

How can I connect my MySQL database with Cloud JIRA Database?

I have an ASP.NET application that uses MySQL as its database. One of the MySQL tables holds a list of issues. I want to access the database of our Cloud JIRA and insert the issues from the MySQL DB into the Cloud JIRA DB directly from that ASP.NET application.
Is there any API available for us to do so? What would be the best approach to fulfill the above requirement?
If you are using JIRA Cloud, you are not able to access the database at all; that is one of the limitations of Cloud. You might be able to use the REST API, depending on what exactly you are looking to achieve.
However, I would recommend finding a way to export the issues from your ASP.NET application to a CSV file and then importing that CSV file into your Cloud JIRA, since JIRA has the ability to create issues from a CSV file.
Note that you may need to contact Atlassian Support when restoring JIRA from a CSV file, since they have full control over instances.
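
If you go the REST API route, creating an issue is a single authenticated POST per record. A minimal sketch, assuming JIRA Cloud basic auth with an API token (the site URL, project key, and credentials are placeholders):

```python
import requests

JIRA_URL = "https://your-domain.atlassian.net"   # placeholder site
AUTH = ("you@example.com", "your-api-token")     # email + API token

def create_issue(summary, description, project_key="PROJ"):
    """Create one issue via the JIRA Cloud REST API (v2) and return its key."""
    payload = {
        "fields": {
            "project": {"key": project_key},
            "summary": summary,
            "description": description,
            "issuetype": {"name": "Bug"},
        }
    }
    resp = requests.post(
        f"{JIRA_URL}/rest/api/2/issue",
        json=payload,
        auth=AUTH,
        headers={"Content-Type": "application/json"},
    )
    resp.raise_for_status()
    return resp.json()["key"]

# e.g. loop over rows read from the MySQL issues table and call create_issue(...)
```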