Convert JSON to CSV in Azure Cosmos DB or Azure Blob

I need to move JSON data from Azure Cosmos DB to Azure Blob storage, and the data eventually needs to be in CSV format.
I found a feature that converts CSV data to JSON, but I can't find one for the other way around.
It doesn't really matter where the conversion from JSON to CSV happens, whether in Azure Cosmos DB or in Azure Blob storage. How could I do this? Thanks!

Based on your requirements, I think Azure Data Factory is your best option.
You could follow this tutorial to configure Azure Cosmos DB as the source (input dataset) and Azure Blob Storage as the sink (output dataset), then use a copy pipeline to move and convert the data.
(The original answer included screenshots of the input dataset, output dataset, copy pipeline, and result.)
Hope it helps you.
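If you would rather do the conversion in code instead of Data Factory, here is a minimal Python sketch of the same idea, assuming the azure-cosmos and azure-storage-blob SDKs; the account, database, container, and blob names are placeholders, not anything from the question.

```python
# Minimal sketch: export Cosmos DB documents to a CSV blob.
# Assumes azure-cosmos and azure-storage-blob are installed; all names/keys are placeholders.
import csv
import io
from azure.cosmos import CosmosClient
from azure.storage.blob import BlobServiceClient

cosmos = CosmosClient('https://myaccount.documents.azure.com:443/', credential='<cosmos-key>')
container = cosmos.get_database_client('mydb').get_container_client('mycontainer')

# Read all documents (for large containers you would page or filter the query instead)
docs = list(container.query_items('SELECT * FROM c', enable_cross_partition_query=True))

# Write them as CSV into an in-memory buffer, using the first document's keys as columns
buffer = io.StringIO()
fieldnames = [k for k in docs[0].keys() if not k.startswith('_')]  # skip Cosmos system fields
writer = csv.DictWriter(buffer, fieldnames=fieldnames, extrasaction='ignore')
writer.writeheader()
writer.writerows(docs)

# Upload the CSV to Blob storage
blob_service = BlobServiceClient.from_connection_string('<storage-connection-string>')
blob_service.get_blob_client('exports', 'cosmos-export.csv').upload_blob(
    buffer.getvalue(), overwrite=True)
```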

Related

What is an alternative to CSV data set config in JMeter?

We want to use 100 credentials from a .csv file, but I would like to know whether there is any alternative to this available in JMeter.
If you have the credentials in the CSV file there are no better ways of "feeding" them to JMeter than CSV Data Set Config.
Just in case if you're still looking for alternatives:
__CSVRead() function. The disadvantage is that the function reads the whole file into memory, which might be a problem for large CSV files. The advantage is that you can choose or change the name of the CSV file dynamically at runtime, while with the CSV Data Set Config the file name cannot be changed once the element is initialized.
JDBC Test Elements - allow fetching data (i.e. credentials) from a database rather than from a file
Redis Data Set - allows fetching data from a Redis data store
HTTP Simple Table Server - exposes a simple HTTP API for fetching data from CSV files (useful in a distributed architecture when you want to ensure that different JMeter slaves use different data); this way you don't have to copy the .csv file to the slave machines and split it
There are a few alternatives:
JMeter plugin for reading random CSV data: Random CSV Data Set Config
JMeter function: __CSVRead
Reading CSV file data from a JSR223 PreProcessor (a minimal sketch follows below)
CSV Data Set Config is simple, easier to use, and available out of the box.
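For the JSR223 option, here is a minimal sketch in Python (Jython), assuming the Jython script engine has been added to JMeter's classpath; Groovy is the more usual choice, and the file and variable names below are illustrative, not from the question.

```python
# JSR223 PreProcessor sketch (script language: jython).
# Assumes jython-standalone.jar is on JMeter's classpath and credentials.csv
# (rows of "username,password") sits next to the test plan. Names are placeholders.
import csv
import random

with open('credentials.csv') as f:
    rows = list(csv.reader(f))

user, password = random.choice(rows)   # pick one credential pair per iteration
vars.put('username', user)             # available later as ${username}
vars.put('password', password)         # available later as ${password}
```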

Read CSV file using Prometheus

I have data in CSV format and I want to get that data into Prometheus. Can anyone help me?
Does an exporter or API for this exist?
You need an exporter that exposes the data from the CSV files over HTTP in Prometheus format, so that Prometheus can scrape it.
See, for example, https://github.com/stohrendorf/csv-prometheus-exporter if it does what you want.
Alternatively, you can expose the raw data directly from your application with a Prometheus client library, or use a client library to create your own customised exporter.
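For the client-library route, a minimal sketch of a custom CSV exporter using the official Python client (prometheus_client) could look like this; the file layout, metric name, port, and refresh interval are assumptions, not anything from the question.

```python
# Minimal sketch of a custom CSV exporter using prometheus_client.
# Assumes a file like metrics.csv with rows "name,value"; all names are placeholders.
import csv
import time
from prometheus_client import Gauge, start_http_server

gauge = Gauge('csv_value', 'Value read from the CSV file', ['name'])

def refresh(path='metrics.csv'):
    """Re-read the CSV and update one gauge time series per row."""
    with open(path, newline='') as f:
        for name, value in csv.reader(f):
            gauge.labels(name=name).set(float(value))

if __name__ == '__main__':
    start_http_server(8000)   # Prometheus scrapes http://localhost:8000/metrics
    while True:
        refresh()
        time.sleep(30)        # re-read the CSV every 30 seconds
```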

Azure Data Factory v2 Data Transformation

I am new to Azure Data Factory. I have a requirement to move data from an on-premises Oracle database and an on-premises SQL Server to Blob storage. The data needs to be transformed into JSON format, with each row as one JSON file, and then moved to an Event Hub. How can I achieve this? Any suggestions?
You could use a Lookup activity + ForEach activity, with a Copy activity inside the ForEach. Please reference this post: How to copy CosmosDb docs to Blob storage (each doc in single json file) with Azure Data Factory
The Copy Data tool that ships with Azure Data Factory is an option for copying on-premises data to Azure.
The tool comes with a configuration wizard that walks you through all the required steps, like configuring the source, the sink, the integration runtime, and so on.
In the source you need to write a custom query that fetches the data from the required tables in JSON format.
For SQL Server you can use FOR JSON AUTO to convert the rows to JSON (OPENJSON is for the reverse direction, parsing JSON into rows); this is supported from SQL Server 2016, and for older versions you need to explore other options. Worst case, you can write a simple console app in C#/Java to fetch the rows, convert them to JSON files, and upload the files to Azure Blob storage (a Python sketch of that fallback follows below). If this is a one-time activity, that option should work and you may not need a data factory at all.
For Oracle you can use the JSON_OBJECT function.
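As an illustration of that fallback, here is a minimal Python sketch (instead of C#/Java) that reads rows from SQL Server and uploads each one as a single JSON blob; the connection strings, table, and container names are placeholders.

```python
# Minimal sketch: export each SQL Server row as one JSON blob.
# Assumes pyodbc and azure-storage-blob; connection strings and names are placeholders.
import json
import pyodbc
from azure.storage.blob import BlobServiceClient

SQL_CONN = 'DRIVER={ODBC Driver 17 for SQL Server};SERVER=myserver;DATABASE=mydb;UID=user;PWD=pass'
BLOB_CONN = '<storage-account-connection-string>'

blob_service = BlobServiceClient.from_connection_string(BLOB_CONN)
container = blob_service.get_container_client('rows-as-json')

with pyodbc.connect(SQL_CONN) as conn:
    cursor = conn.cursor()
    cursor.execute('SELECT Id, Name, Amount FROM dbo.Orders')   # hypothetical table
    columns = [col[0] for col in cursor.description]
    for row in cursor.fetchall():
        doc = dict(zip(columns, row))
        blob_name = f"orders/{doc['Id']}.json"                  # one file per row
        container.upload_blob(blob_name, json.dumps(doc, default=str), overwrite=True)
```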

Can I use AWS Glue to convert json data on S3 to columnar format and push it to Redshift?

I have nested JSON data on S3 that keeps getting updated. After conversion, I want to periodically push this data to a Redshift cluster. Can AWS Glue be helpful in configuring periodic runs that convert the data to a columnar format and push it to Redshift?
AWS Glue ETL can run a Python script that transforms the data any way you want. Here's an example they put together where they flatten JSON via the Relationalize transform.
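For reference, a minimal Glue ETL sketch following that Relationalize approach might look like this; the catalog database, table, connection, and S3 paths are illustrative assumptions, not taken from the question.

```python
# Minimal Glue ETL sketch: flatten nested JSON with Relationalize and load it into Redshift.
# Catalog database, table, connection, and S3 path names are placeholders.
from pyspark.context import SparkContext
from awsglue.context import GlueContext
from awsglue.transforms import Relationalize

glue_context = GlueContext(SparkContext.getOrCreate())

# Read the nested JSON that a Glue crawler catalogued from S3
nested = glue_context.create_dynamic_frame.from_catalog(
    database='my_catalog_db', table_name='nested_json_table')

# Flatten the nested structure into a collection of flat tables
flat = Relationalize.apply(
    frame=nested, staging_path='s3://my-bucket/tmp/', name='root')

# Write the root (flattened) table to Redshift via a catalogued JDBC connection
glue_context.write_dynamic_frame.from_jdbc_conf(
    frame=flat.select('root'),
    catalog_connection='my_redshift_connection',
    connection_options={'dbtable': 'flattened_events', 'database': 'analytics'},
    redshift_tmp_dir='s3://my-bucket/redshift-tmp/')
```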

Using blob as a JSON data feed

As Azure Blob storage is a very cheap data storage solution, would it make sense to store JSON data in blobs so clients can access it like a data API? That way we don't need to spin up Web Apps/APIs just to serve JSON data.
That could work, depending on the scenario. The blobs will be updated on-the-fly when you push a new version of the JSON files.
I demonstrated this a while ago with a simple app that uploads and updates a file. Clients could target the blob URL, and to them it seemed like they were accessing a JSON data feed that kept being updated.
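As an illustration of the pattern, here is a minimal Python sketch that publishes and refreshes such a JSON feed blob, assuming the azure-storage-blob SDK; the container, blob, and payload are placeholders. Clients would then simply GET the blob's public URL.

```python
# Minimal sketch: publish/refresh a JSON "feed" as a blob.
# Assumes azure-storage-blob is installed; connection string and names are placeholders.
import json
from azure.storage.blob import BlobServiceClient, ContentSettings

blob_service = BlobServiceClient.from_connection_string('<storage-connection-string>')
feed_blob = blob_service.get_blob_client('public-feed', 'latest.json')

def publish(data: dict) -> None:
    """Overwrite the feed blob with a new JSON payload."""
    feed_blob.upload_blob(
        json.dumps(data),
        overwrite=True,
        content_settings=ContentSettings(content_type='application/json'))

# Clients just GET https://<account>.blob.core.windows.net/public-feed/latest.json
publish({'version': 1, 'items': ['a', 'b', 'c']})
```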