Configuring a linked service between Zoho (CRM) and Azure Data Factory - json

I am trying to configure a linked service in Azure Data Factory (ADF) in order to load Zoho data into my SQL database. I am using a REST API linked service for this, which successfully connects to Zoho. I am mainly struggling to select the proper relative URL. Currently I use the following settings:
Base URL: https://www.zohoapis.eu/crm/v2/
Relative URL: /orgxxxxxxxxxx
When I try to preview the data, this results in the following error:
Error occurred when deserializing source JSON file ''. Check if the data is in valid JSON object format.
Unexpected character encountered while parsing value: <. Path '', line 0, position 0.
Activity ID: 8d32386a-eee0-4d2a-920a-5c70dc15ef06
Does anyone know what I have to do to make this work such that I can load all required Zoho tables into ADF?

As per the official Microsoft documentation:
The Zoho connector is supported for the following activities:
1. Copy activity with supported source/sink matrix
2. Lookup activity
The REST linked service does not support Zoho, so it is not the right connector for this scenario.
You are expecting a JSON file as the source, but your sink is Azure SQL Database. You need to change your approach.
You can directly use the Zoho linked service in ADF to fetch the data from Zoho (CRM).
You can copy data from Zoho to any supported sink data store. This connector supports Zoho access token authentication and OAuth 2.0 authentication.
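If you want to script it instead of using the portal UI, a minimal sketch of creating that Zoho linked service with the ADF Python SDK (azure-mgmt-datafactory) could look like the following; every name, endpoint and secret is a placeholder, and constructor signatures may differ slightly between SDK versions:

# Hypothetical sketch only; all names, IDs and secrets below are placeholders.
from azure.identity import ClientSecretCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    LinkedServiceResource, ZohoLinkedService, SecureString)

credential = ClientSecretCredential(
    tenant_id="<tenant-id>", client_id="<client-id>", client_secret="<client-secret>")
adf_client = DataFactoryManagementClient(credential, "<subscription-id>")

# The Zoho connector needs the Zoho endpoint and an access token.
zoho_ls = LinkedServiceResource(
    properties=ZohoLinkedService(
        endpoint="crm.zoho.eu",                                # assumption: EU data centre
        access_token=SecureString(value="<zoho-access-token>")))

adf_client.linked_services.create_or_update(
    "<resource-group>", "<data-factory-name>", "ZohoLinkedService", zoho_ls)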
You need to use a Copy activity to fetch the data from Zoho (CRM) and store it in an Azure Storage account. Once the data has landed, you need to denormalize the JSON before storing it in Azure SQL Database: use a Data Flow activity with a flatten transformation.
Use the flatten transformation in mapping data flow to take array values inside hierarchical structures such as JSON and unroll them into individual rows. This process is known as denormalization.
Learn more about the flatten transformation in mapping data flow in the Microsoft documentation.
Once the flatten is done, you can store the data in Azure SQL Database.
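As a rough illustration of the Copy activity step only (the flatten data flow is easier to build in the ADF UI), here is a hedged sketch continuing with the adf_client from the previous snippet; the module name "Accounts" and the blob output dataset "ZohoRawJsonBlob" are assumptions:

from azure.mgmt.datafactory.models import (
    DatasetResource, ZohoObjectDataset, LinkedServiceReference, DatasetReference,
    CopyActivity, ZohoSource, BlobSink, PipelineResource)

# Source dataset pointing at one Zoho CRM module (placeholder: Accounts).
zoho_ls_ref = LinkedServiceReference(reference_name="ZohoLinkedService")
ds_zoho = DatasetResource(
    properties=ZohoObjectDataset(linked_service_name=zoho_ls_ref, table_name="Accounts"))
adf_client.datasets.create_or_update(
    "<resource-group>", "<data-factory-name>", "ZohoAccounts", ds_zoho)

# Copy Zoho -> blob storage; the denormalization to Azure SQL happens afterwards
# in a mapping data flow with a flatten transformation.
copy = CopyActivity(
    name="CopyZohoToBlob",
    inputs=[DatasetReference(reference_name="ZohoAccounts")],
    outputs=[DatasetReference(reference_name="ZohoRawJsonBlob")],  # assumed existing dataset
    source=ZohoSource(),
    sink=BlobSink())
pipeline = PipelineResource(activities=[copy])
adf_client.pipelines.create_or_update(
    "<resource-group>", "<data-factory-name>", "ZohoToBlobPipeline", pipeline)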

Related

Azure Data Factory Copy Data from REST API to CSV - JSON object value isn't output

The values for data.CustomerNumber do not get written to the file; the other elements do.
The CustomerNumber is Pascal case because it is a custom value in the system I am pulling the data out of.
I have tried the map complex values option and the advanced editor, deleted everything and started over. I was originally trying to save this to an Azure SQL table but gave up and just tried to write to a file instead. I have been able to do this type of mapping in other pipelines, so I am really at a loss as to why it doesn't work in this case.
The actual JSON content for the mapping:

Apache Beam for data validation, GCP

I have a requirement to validate incoming CSV data against metadata coming from a BigQuery table. I am running Dataflow locally to do this, but I cannot figure out how to implement this logic with an Apache Beam transform.
Note: I have seen some code where a ParDo is given a side input. Is this the right approach?
Validation required: before loading the data into a BigQuery table, I have to check it against the metadata; only data that passes the validation should be inserted into the BQ table.
Sample code:
"Process Data into TableRows" >> beam.ParDo(ProcessDF(),
beam.pvalue.AsList(metadata_for_validation))
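The ParDo-with-side-input pattern you mention is a standard way to do this kind of validation in Beam. A minimal sketch, assuming a placeholder ProcessDF, metadata query, CSV parsing step and validation rule (adjust all of these to your actual schema):

import apache_beam as beam

class ProcessDF(beam.DoFn):
    def process(self, row, metadata):
        # `metadata` is the side input: a list of dicts read from the BigQuery metadata table.
        expected_columns = {m["column_name"] for m in metadata}
        # Example validation rule (placeholder): keep only rows whose columns are all known.
        if set(row.keys()) <= expected_columns:
            yield row

with beam.Pipeline() as p:
    metadata_for_validation = (
        p | "Read metadata" >> beam.io.ReadFromBigQuery(
            query="SELECT column_name FROM `project.dataset.metadata`",  # placeholder query
            use_standard_sql=True))

    (p
     | "Read CSV" >> beam.io.ReadFromText("gs://bucket/input.csv")        # placeholder path
     | "Parse" >> beam.Map(lambda line: dict(zip(["col_a", "col_b"], line.split(","))))
     | "Process Data into TableRows" >> beam.ParDo(
         ProcessDF(), beam.pvalue.AsList(metadata_for_validation))
     | "Write to BQ" >> beam.io.WriteToBigQuery(
         "project:dataset.table",                                         # placeholder table
         create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER))    # table assumed to exist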

Retrieving forecast data from OpenWeatherMap in FIWARE ORION

I am trying to get weather forecast data from OpenWeatherMap and integrate it into Orion by performing a registration request.
I was able to register and get the API key from OpenWeatherMap; however, it returns a JSON file with all the data inside, which is not supported by Orion.
I have followed the step-by-step tutorial https://fiware-tutorials.readthedocs.io/en/latest/context-providers/index.html#context-provider-ngsi-proxy where the data is acquired from OpenWeatherMap via the NGSI proxy and the API key is set as an environment variable in the docker-compose file. However, the data acquired there is the "current" weather, not the forecast, and it is specific to Berlin.
I have tried to access the files inside the container "fiware/tutorials.context-provider" and modify the parameters to match my needs, but that feels like a dead end.
I don't think that is even considered good practice, but I have run out of ideas :(
Can anyone suggest how I could bring the forecast data to Orion and register it as a context provider?
Thank you in advance.
I imagine you aim to implement a context provider, able to speak NGSI with Orion.
OpenWeatherMap surely doesn't implement NGSI ...
If you have the data from OpenWeatherMap as a JSON string, perhaps you should parse the JSON and create your entities using a few selected key-values from the parsed OpenWeatherMap data? Save the entity (or entities) locally and then register those keys in Orion.
Alternatively (easier but I wouldn't recommend it), create local entities with the entire OpenWeatherMap data as the value of an attribute of the entity:
{
  "id": "id-from-OpenWeatherMap",
  "type": "OpenWeatherMap",
  "weatherData": {
    "value": ...
  },
  ...
}
Then you register id/weatherData in Orion.
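A rough sketch of that registration request against the NGSI-v2 API, assuming a local Orion broker and a placeholder context-provider URL:

import requests

# Register the entity/attribute in Orion so that queries for weatherData are forwarded
# to your context provider. Broker URL, provider URL and entity id are placeholders.
ORION_REGISTRATIONS = "http://localhost:1026/v2/registrations"   # assumption: local Orion

registration = {
    "description": "OpenWeatherMap forecast data",
    "dataProvided": {
        "entities": [{"id": "id-from-OpenWeatherMap", "type": "OpenWeatherMap"}],
        "attrs": ["weatherData"],
    },
    "provider": {
        "http": {"url": "http://my-context-provider:3000/v2"}    # placeholder provider URL
    },
}

resp = requests.post(ORION_REGISTRATIONS, json=registration)
resp.raise_for_status()
print("Registration:", resp.headers.get("Location"))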

How to copy MySQL data to Azure Blob using Python in Azure Data Factory

I have followed this link https://learn.microsoft.com/en-us/azure/data-factory/quickstart-create-data-factory-python and created the pipeline to copy data from one Azure blob to another.
Then I tried to extend this code to copy data from a MySQL table and store it in an Azure blob.
I'm creating the linked services by using:
ls_mysql = AzureMySqlLinkedService(connection_string=mysql_string)
and creating dataset like this:
ds_azure_mysql = AzureMySqlTableDataset(ds_ls,table_name=table1)
And I'm getting the following exception:
"Errors: 'AzureMySqlTable' is not a supported data set type for Azure Storage linked service. Supported types are 'AzureBlob' and 'AzureTable'"
Can you help identify the correct method to use here?
Any API reference related to MySQL-to-blob copy would also really help.
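One thing that stands out from the error text: the AzureMySqlTable dataset is being attached to the Storage linked service (ds_ls from the quickstart) rather than to the MySQL linked service. A hedged sketch of the likely fix, reusing adf_client, rg_name, df_name, mysql_string and table1 from the quickstart-style code; the blob output dataset name is an assumption, and reference-type keyword arguments may differ slightly between SDK versions:

from azure.mgmt.datafactory.models import (
    AzureMySqlLinkedService, AzureMySqlTableDataset, LinkedServiceResource,
    DatasetResource, LinkedServiceReference, DatasetReference,
    CopyActivity, AzureMySqlSource, BlobSink, PipelineResource)

# 1. Azure Database for MySQL linked service.
ls_mysql = LinkedServiceResource(
    properties=AzureMySqlLinkedService(connection_string=mysql_string))
adf_client.linked_services.create_or_update(rg_name, df_name, "ls_mysql", ls_mysql)

# 2. The source dataset must reference the *MySQL* linked service, not the Storage one.
mysql_ls_ref = LinkedServiceReference(reference_name="ls_mysql")
ds_azure_mysql = DatasetResource(
    properties=AzureMySqlTableDataset(linked_service_name=mysql_ls_ref, table_name=table1))
adf_client.datasets.create_or_update(rg_name, df_name, "ds_mysql", ds_azure_mysql)

# 3. Copy activity: MySQL source -> Blob sink ("ds_blob_out" is assumed to be the
#    blob output dataset created as in the quickstart).
copy = CopyActivity(
    name="CopyMySqlToBlob",
    inputs=[DatasetReference(reference_name="ds_mysql")],
    outputs=[DatasetReference(reference_name="ds_blob_out")],
    source=AzureMySqlSource(),
    sink=BlobSink())
pipeline = PipelineResource(activities=[copy])
adf_client.pipelines.create_or_update(rg_name, df_name, "CopyMySqlToBlobPipeline", pipeline)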

Load data into MySQL from RESTful web services

I want to load some data into MySQL from a RESTful web service.
The API accepts HTTP requests and returns XML output.
I want to load the XML output into MySQL.
How can I do this?
As I see it, you would like to do two steps in one ;) First, create a Java object from the XML (unmarshalling); second, save the object to the DB.
For the first step, you can use JAXB or another free tool. For the second, use Hibernate ;)
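If Java is not a hard requirement, the same two steps (fetch and parse the XML, then persist the rows) can also be sketched in Python instead of the JAXB/Hibernate route above; the endpoint, XML element names and table layout are all placeholders:

import requests
import xml.etree.ElementTree as ET
import mysql.connector  # pip install mysql-connector-python

# Step 1: call the REST API and parse the XML response.
response = requests.get("https://api.example.com/items")          # placeholder endpoint
root = ET.fromstring(response.text)

# Step 2: insert the parsed values into MySQL.
conn = mysql.connector.connect(
    host="localhost", user="app", password="secret", database="appdb")
cursor = conn.cursor()

# Assumes XML of the form <items><item><name>..</name><price>..</price></item>...</items>
for item in root.findall("item"):
    cursor.execute(
        "INSERT INTO items (name, price) VALUES (%s, %s)",
        (item.findtext("name"), item.findtext("price")))

conn.commit()
cursor.close()
conn.close()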