Kafka JDBC Sink connector with JSON messages without a schema

I am trying to load JSON messages into a Postgres database using the Postgres sink connector. From what I have read online, the only option seems to be embedding the schema in the JSON message itself; ideally, I would like not to include the schema in the message. Is there a way to register the JSON schema in the Schema Registry and use that like it's done with Avro?
Also, I'm currently running Kafka by downloading the bin, as I had several problems running Kafka Connect with Docker due to ARM compatibility issues. Is there a similar install for the Schema Registry? So far I'm only finding the option of downloading it through Confluent and running it on Docker. Is it possible to run only the Schema Registry with Docker, keeping my current setup?
Thanks

JSON without schema
The JDBC sink connector requires a schema for every record, so plain schemaless JSON will not work on its own.
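For context, "schema in the JSON message" refers to the envelope the plain JsonConverter expects when schemas are enabled: each record carries both a schema and a payload. A minimal sketch (the field names here are made up for illustration):
{
  "schema": {
    "type": "struct",
    "name": "customer",
    "optional": false,
    "fields": [
      { "field": "id", "type": "int32", "optional": false },
      { "field": "name", "type": "string", "optional": true }
    ]
  },
  "payload": { "id": 1, "name": "Alice" }
}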
Is there a way to register the JSON schema in the Schema Registry and use that like it's done with Avro?
Yes, the Schema Registry supports JSON Schema (as well as Protobuf), in addition to Avro. This requires you to use a specific serializer (and the matching converter on the Connect side); you cannot just send plain JSON to the topic.
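As a rough sketch of what that looks like on the Connect side, the JDBC sink would be pointed at the Registry through the JSON Schema converter, while producers would use the matching io.confluent.kafka.serializers.json.KafkaJsonSchemaSerializer. The connection details, topic, and credentials below are placeholders, not values from the question:
# jdbc-sink.properties (sketch)
name=postgres-jsonschema-sink
connector.class=io.confluent.connect.jdbc.JdbcSinkConnector
topics=my-topic
connection.url=jdbc:postgresql://localhost:5432/mydb
connection.user=postgres
connection.password=postgres
auto.create=true
# Fetch the registered JSON Schema from the Registry instead of expecting it inside each message
value.converter=io.confluent.connect.json.JsonSchemaConverter
value.converter.schema.registry.url=http://localhost:8081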
currently running Kafka by downloading the bin... Is there a similar install for the Schema Registry?
The Confluent Schema Registry is not distributed as a standalone package outside of Docker. You'd have to download Confluent Platform in place of Kafka, copy over your existing zookeeper.properties and server.properties, and then run the Schema Registry. Otherwise, compile it from source and build a standalone distribution with mvn -Pstandalone package.
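If you go the Confluent Platform route, the Registry only needs a small properties file pointing at your existing broker before you start it with bin/schema-registry-start. A minimal sketch, assuming default ports (adjust for your Kafka version and host names):
# etc/schema-registry/schema-registry.properties (sketch)
listeners=http://0.0.0.0:8081
kafkastore.bootstrap.servers=PLAINTEXT://localhost:9092
kafkastore.topic=_schemas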
There are also other registries, such as Apicurio.

Related

Kafka Integration with Salesforce

We have to read a CSV file from an SFTP server/folder and, with the help of Kafka, push this data into a Salesforce SObject. Similarly, we have to read SObject data from Salesforce, convert it into a CSV file, and send it to the SFTP server.
Can you please give me some idea about how I can achieve this?
I did not check it very carefully, but these may be worth reading; if you can query data from Salesforce, you can convert it to any format:
Salesforce Connector (Source and Sink) for Confluent Platform
Salesforce Platform Events Sink Connector for Confluent Platform

How to deploy an HDP cluster without HDFS?

How do I deploy an HDP cluster without HDFS? I don't want HDFS for storage and will be using an in-house in-memory storage system. How can this be done?
HDFS and MapReduce are core parts of Hadoop; they come built into the Hadoop package. You cannot exclude HDFS during HDP cluster deployment. You can exclude other services, but not HDFS.
HDFS is an implementation of the Hadoop FileSystem API, which models POSIX file system behavior.
You are probably thinking of object stores (often called blob stores). Hadoop does provide FileSystem client classes for some of these, even though they violate many of the requirements of a file system. This is why, although Hadoop can read and write data in an object store, the two that Hadoop ships with direct support for, Amazon S3 and OpenStack Swift, cannot be used as direct replacements for HDFS.
For further reading, see Object Stores vs Filesystems in the Hadoop documentation.

Config Server using Database as repository

Currently we maintain all our properties in a database, and applications reference them through their Spring profile name. We are now transitioning to Cloud Foundry. With this in mind, how can we build a Spring Cloud Config Server that reads application properties from our existing database? So far the documentation only references a Git repository:
http://cloud.spring.io/spring-cloud-config/spring-cloud-config.html#_spring_cloud_config_server
Not currently; we are limited to Git and SVN. There is a pull request for MongoDB as an example, though.
This is no longer true; support is now available for a JDBC backend: http://cloud.spring.io/spring-cloud-config/single/spring-cloud-config.html#_jdbc_backend
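Roughly, per the linked documentation, the JDBC backend is enabled by activating the jdbc profile and pointing a regular Spring datasource at the existing database; the default query reads from a PROPERTIES table with APPLICATION, PROFILE, LABEL, KEY, and VALUE columns. A sketch of the Config Server's application.properties, with placeholder datasource values:
spring.profiles.active=jdbc
spring.datasource.url=jdbc:mysql://localhost:3306/config_db
spring.datasource.username=config
spring.datasource.password=secret
# Optional: override the default lookup query if your existing table layout differs
# spring.cloud.config.server.jdbc.sql=SELECT KEY, VALUE from PROPERTIES where APPLICATION=? and PROFILE=? and LABEL=?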

JMeter not connecting to AWS RDS

I am trying to perform some testing on my RDS instance using JMeter.
I have successfully created a test plan with the help of this tutorial, but when I execute it, I get a 100% error rate in the Summary Report.
When I checked jmeter.log, it shows the errors captured here:
https://docs.google.com/document/d/1XxBkAq21_k3lj27uqd0GDxQ3kPJD37ey9Gy3aJsT6Ag/edit?usp=sharing
I think JMeter is not connecting to the RDS instance, but I am not able to work out the reason.
I have copied the mysql-connector JAR to JMETER_HOME/lib.
Also, I have not edited jmeter.properties.
You need to restart JMeter after copying mysql-connector.jar to the /lib folder of your JMeter installation. It can also be done at the Test Plan level via the "Add directory or jar to classpath" input.
You also need to provide the appropriate "JDBC Driver Class" value in the JDBC Connection Configuration element.
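For a MySQL RDS instance, the JDBC Connection Configuration fields would look roughly like the following; the endpoint, schema name, and credentials are placeholders (use com.mysql.cj.jdbc.Driver if you are on Connector/J 8+):
Database URL:       jdbc:mysql://your-instance.xxxxxxxx.us-east-1.rds.amazonaws.com:3306/your_db
JDBC Driver class:  com.mysql.jdbc.Driver
Username:           your_user
Password:           your_password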
See The Real Secret to Building a Database Test Plan With JMeter guide for an end-to-end walkthrough.

How to connect Lift and MySQL?

I want to create a simple Lift web application which reads input through a form and inserts it into a database.
You need to add the MySQL JDBC driver to your project (as a Maven artifact, via sbt, or by just putting the JAR on the CLASSPATH) and edit the properties file (props/default.props):
db.driver=com.mysql.jdbc.Driver
db.url=jdbc:mysql://localhost/your_db_name
db.user=user
db.password=password
Also, you can set up a DB context in the app container (Tomcat, etc.) instead. After that you can use Lift's ORM (Mapper).