Need to parse protocol-buffer data - MySQL

I have never used protocol buffers before. I am receiving data in protocol-buffer format and need to parse it, then query the data from a MySQL database to show in a view using Node.js.
Can anyone help me with links or a solution for retrieving and parsing protocol-buffer data?
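For context on what "parsing protocol-buffer data" involves: each field on the wire is a varint tag (field number and wire type) followed by its payload. In practice you would use a library such as protobufjs driven by your .proto files; the pure-Node sketch below only illustrates the wire format such a library decodes for you, using a made-up two-field message.

```javascript
// Minimal illustration of the protobuf wire format. Real projects should
// use a library such as protobufjs; this only shows what happens underneath.

// Read one varint starting at pos; returns [value, nextPos].
function readVarint(buf, pos) {
  let result = 0n, shift = 0n;
  while (true) {
    const byte = buf[pos++];
    result |= BigInt(byte & 0x7f) << shift;
    if ((byte & 0x80) === 0) break;
    shift += 7n;
  }
  return [result, pos];
}

// Decode a flat message into { fieldNumber: value }.
function decodeMessage(buf) {
  const fields = {};
  let pos = 0;
  while (pos < buf.length) {
    let tag;
    [tag, pos] = readVarint(buf, pos);
    const fieldNumber = Number(tag >> 3n);
    const wireType = Number(tag & 7n);
    if (wireType === 0) {            // varint (int32, int64, bool, enum)
      let value;
      [value, pos] = readVarint(buf, pos);
      fields[fieldNumber] = Number(value);
    } else if (wireType === 2) {     // length-delimited (string, bytes, nested)
      let len;
      [len, pos] = readVarint(buf, pos);
      fields[fieldNumber] = buf.slice(pos, pos + Number(len)).toString('utf8');
      pos += Number(len);
    } else {
      throw new Error(`unsupported wire type ${wireType}`);
    }
  }
  return fields;
}

// Hypothetical sample: field 1 = 150 (varint), field 2 = "abc" (string).
const buf = Buffer.from([0x08, 0x96, 0x01, 0x12, 0x03, 0x61, 0x62, 0x63]);
console.log(decodeMessage(buf)); // { '1': 150, '2': 'abc' }
```

With a real schema you would instead load the .proto file in protobufjs and call its generated decode method, then pass the resulting plain object on to your MySQL query layer.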

Related

From Kafka json message to Snowflake table

I am trying to implement a Snowflake Sink connector so that I can load messages arriving on a Kafka topic directly into an appropriate Snowflake table. So far, I have only got to the point of loading the raw JSON into a table with two columns (RECORD_METADATA and RECORD_CONTENT). My goal is to load the JSON messages directly into an appropriate table by flattening them. I have the structure of what the table should be, so I could create the table and load directly into it. But I need a way for the load process to flatten the messages.
I have been looking online and through the documentation, but haven't found a clear way to do this.
Is it possible, or do I have to load the raw JSON first and then do transformations to get the table that I want?
Thanks
You have to load the raw JSON first; then you can do transformations.
Each Kafka message is passed to Snowflake in JSON format or Avro format. The Kafka connector stores that formatted information in a single column of type VARIANT. The data is not parsed, and the data is not split into multiple columns in the Snowflake table.
For more information, see the Snowflake documentation for the Kafka connector.
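As a sketch of the transformation step, assuming the raw table is called RAW_EVENTS and the target table and columns are hypothetical: the query you would schedule (e.g. via a Snowflake task) is an INSERT..SELECT that pulls typed values out of the VARIANT column with Snowflake's colon path syntax. The helper below just builds that statement, casting everything to string for simplicity; real columns would use their proper types.

```javascript
// Build a Snowflake INSERT..SELECT that flattens RECORD_CONTENT (a VARIANT
// column) into named columns. Table/column names are hypothetical examples.
function buildFlattenInsert(targetTable, columns) {
  // columns maps target column name -> JSON path inside RECORD_CONTENT.
  const selects = Object.entries(columns)
    .map(([col, path]) => `RECORD_CONTENT:${path}::string AS ${col}`)
    .join(',\n  ');
  return `INSERT INTO ${targetTable}\nSELECT\n  ${selects}\nFROM RAW_EVENTS;`;
}

console.log(buildFlattenInsert('ORDERS', { order_id: 'id', city: 'address.city' }));
```

Running the generated statement on a schedule (or off a stream on RAW_EVENTS) gives you the "load raw, then transform" pipeline the answer describes.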

Azure Data Factory - copy task using Rest API is only returning first row upon execution

I have a copy task in ADF that is pulling data from a REST API into an Azure SQL Database. I've created the mappings, and pulled in a collection reference as follows:
(screenshots: preview of the JSON data, source, sink, mappings, output)
You will notice it's only outputting one row (the first row) when running the copy task. I know this is usually because you are pulling from a nested JSON array, in which case setting the collection reference to the array should resolve it, but I can't for the life of me get it to pull multiple records even after setting the collection.
There's a trick to this: import the schemas, put the name of the array in the collection reference, then import the schemas again. Then it works.
(screenshot from Azure Data Factory)
Because of an Azure Data Factory design limitation, pulling JSON data and inserting it into an Azure SQL Database isn't a good approach. Even after using the "Collection reference" you might not get the desired results.
The recommended approach is to store the output of the REST API as a JSON file in Azure Blob Storage using a Copy Data activity. Then you can use that file as the source and do the transformation in a Data Flow. Alternatively, you can use a Lookup activity to get the JSON data and invoke a stored procedure to store the data in the Azure SQL Database (this way is cheaper and its performance is better).
Use the flatten transformation to take array values inside hierarchical structures such as JSON and unroll them into individual rows. This process is known as denormalization.
Refer to this third-party tutorial for more details.
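For intuition, this is what the flatten transformation does, sketched in plain JavaScript with hypothetical field names: each element of the nested array becomes its own output row, carrying along the parent's scalar fields.

```javascript
// Unroll an array nested inside each document into one output row per
// element - the same denormalization the Data Flow flatten transformation
// performs. Field names here are hypothetical.
function flatten(docs, arrayField) {
  return docs.flatMap(doc =>
    (doc[arrayField] || []).map(item => {
      const { [arrayField]: _, ...rest } = doc;  // parent's scalar fields
      return { ...rest, ...item };               // merged with one element
    })
  );
}

const docs = [{ id: 1, orders: [{ sku: 'a' }, { sku: 'b' }] }];
console.log(flatten(docs, 'orders'));
// [ { id: 1, sku: 'a' }, { id: 1, sku: 'b' } ]
```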
Hey, I had this issue, and I noticed that the default column names for the JSON branches were really long; in my target CSV the header row got truncated after a bit, and I was able to get ADF working just by renaming them in the mapping section.
For example, I had:
['hours']['monday']['openIntervals'][0]['endTime'] in the source and changed it to MondayCloseTime in the destination.
It just started working. You can also turn off the header on the output for a quick test before rewriting all the column names, as that also got it working for me.
I assume it writes out the truncated header row at the same time as the first row of data and then tries to use that header row afterwards, but since it doesn't match what it's expecting, it just ends. It's a bit annoying that it doesn't give an error or anything, but anyway, this worked for me.

Decode json data while querying data from mysql database

Is it possible to retrieve decoded data from a MySQL column in which the data is saved as JSON-encoded text? That is, instead of fetching the JSON-encoded text from the database and decoding it separately, is there any way to fetch the decoded data from the SELECT query itself?
Any help?
Thanks in advance!
The answer linked by jyoti mishra is quite good; MySQL handles JSON data internally now, and JSON_EXTRACT is definitely the preferred solution if you're on MySQL 5.7 or later.
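For the JSON_EXTRACT route on 5.7+, the query looks like the comment below (table and column names are hypothetical); the small function re-implements the semantics of a simple $.a.b path in JavaScript just to show what the server returns for it.

```javascript
// The MySQL-side query would look like (names hypothetical):
//   SELECT JSON_EXTRACT(payload, '$.address.city') AS city FROM orders;
// or, with the 5.7+ shorthand operator:  payload->'$.address.city'
//
// A tiny JS re-implementation of a simple '$.a.b' path, for illustration
// only - it ignores array indexes, quoting, and wildcards.
function jsonExtract(jsonText, path) {
  let value = JSON.parse(jsonText);
  for (const key of path.replace(/^\$\.?/, '').split('.').filter(Boolean)) {
    if (value == null) return null;
    value = value[key];
  }
  return value === undefined ? null : value;   // MySQL returns NULL for misses
}

console.log(jsonExtract('{"address":{"city":"Pune"}}', '$.address.city')); // prints Pune
```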
However, if you're not, you can use the phpMyAdmin "Transformation" feature to modify the displayed output when you're viewing through phpMyAdmin.
To start, you have to have at least partially configured the "phpMyAdmin configuration storage". From the table's Structure tab, "Change" the column containing the JSON data, then look for the "Browser display transformation" dropdown and select JSON. See this photo for an example:

Can data from a sql query be inserted into elasticsearch?

I'm new to Elasticsearch. I have learnt how to run different queries and get search results, with the understanding that each document is stored in JSON format. Is it possible to insert records that were obtained from an SQL query on a relational database? If it is possible, how is it done? By converting each record into JSON format?
You need to build an index in Elasticsearch similar to the way you've got your tables in the RDBMS. This can be done in a lot of ways, and it really depends on what data you would need to access via Elasticsearch. You shouldn't just dump your complete RDBMS data into ES.
If you search around you may find bulk data importers/synchronisers/rivers (deprecated) for your RDBMS-to-ES setup; some of these can run in the background and keep the indexes in ES up to date with your RDBMS.
You can also write your own code which updates ES whenever any data is changed in your RDBMS. Look into the Elasticsearch client API for your platform: https://www.elastic.co/guide/en/elasticsearch/client/index.html
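As a concrete sketch of the "write your own code" route, with hypothetical row data: each row from an SQL query is a plain object, and Elasticsearch's _bulk endpoint accepts them as newline-delimited action/document pairs. The helper below builds such a body without any client library.

```javascript
// Convert SQL result rows into an Elasticsearch _bulk request body:
// one NDJSON action line plus one document line per row.
// The index name and row shape are hypothetical examples.
function toBulkBody(indexName, rows, idField) {
  return rows
    .flatMap(row => [
      JSON.stringify({ index: { _index: indexName, _id: row[idField] } }),
      JSON.stringify(row),
    ])
    .join('\n') + '\n';          // the bulk body must end with a newline
}

const rows = [{ id: 1, name: 'alice' }, { id: 2, name: 'bob' }];
console.log(toBulkBody('users', rows, 'id'));
// POST this string to http://localhost:9200/_bulk with
// Content-Type: application/x-ndjson, or pass the rows to the official
// JS client's bulk helper instead.
```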

How to retrieve information after specific strings in my sql

I am making a CSV file with the sql2excel component for Joomla. It fetches the data via SQL queries, so I retrieve all the information through SQL queries. The problem is that some fields have extra information which I don't want to display to the end user.
For example, the fields show the data below when I retrieve by a query:
a:3:{s:4:"city";s:3:"000";s:3:"ext";s:3:"000";s:3:"tel";s:4:"0000";}
I only want the values that are in quotes, like this: 0000000000
Is there any way I can get this by an SQL query?
No.
These values are serialized (PHP's serialize() format), and MySQL doesn't have functions to deserialize them. You would have to build a bridge between MySQL and sql2excel that decodes it, but I'm not familiar with this tool, so I can't be of much help.