Convert JSON data from Firebase to a structured format to run BigQuery

I am new to databases. I have data on Firebase in the form of JSON and I have to run BigQuery on it. For this reason I want to convert it to a structured database. What is the best way to do that? I want to do it in real time.

If you're looking to send your Firebase data to BigQuery, you should take a look at this. There you can find how to link your Firebase project to BigQuery.
As you can see in the link above:
You can export Google Analytics, Crashlytics, Predictions, Cloud Messaging, and Performance Monitoring data to the BigQuery sandbox free of charge (Sandbox limits apply).
With the BigQuery sandbox integration, you have access to:
- Data from Google Analytics App and App + Web properties
- Data from Google Analytics for Firebase
- Details from Crashlytics fatal and non-fatal crash events and stack traces
- Raw Predictions data
- Detailed Cloud Messaging data
- Details of each captured Performance Monitoring event
Learn more about upgrading from the Sandbox and BigQuery.
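Once the link is active, you can query the exported tables like any other BigQuery dataset. As a minimal sketch with the BigQuery Python client, assuming a placeholder project ID and the documented analytics_<property_id>.events_YYYYMMDD export layout:

from google.cloud import bigquery

client = bigquery.Client(project="my-firebase-project")  # placeholder project ID

# The Firebase link exports daily events_YYYYMMDD tables into an
# analytics_<property_id> dataset; the IDs below are placeholders.
query = """
    SELECT event_name, COUNT(*) AS occurrences
    FROM `my-firebase-project.analytics_123456789.events_20200101`
    GROUP BY event_name
    ORDER BY occurrences DESC
"""
for row in client.query(query).result():
    print(row.event_name, row.occurrences)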
I hope it helps

Related

Writing code to receive messages from IoT Hub and store them in a container in Blob Storage

I would like to write a function using a Function App in Microsoft Azure to receive messages from an IoT Hub, convert them from Base64 format to string, and store them in a container in Blob Storage. Could you please help me to do this?
Thanks in advance.
Br
Masoud
Create an Azure Function with the IoT Hub Trigger to receive the messages; this trigger template is available by default when you create the function.
The Connection value should be provided in the local.settings.json file. It is the IoT Hub connection string, which you can get from the Azure Portal > IoT Hub resource > Built-in endpoints under Hub settings.
Run the function and you will see your messages flowing from IoT Hub to your Azure Function (assuming you have devices or simulators connected that are sending data).
Refer to this article for step-by-step information on receiving IoT Hub messages in an Azure Function.
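As a rough sketch of that flow in Python (the Functions v2 programming model): the IoT Hub trigger binds to the hub's Event Hub-compatible endpoint, decodes the Base64 payload, and writes it to a blob. The hub name, connection setting names, and container path below are assumptions:

import base64
import azure.functions as func

app = func.FunctionApp()

# IoT Hub messages arrive through the hub's Event Hub-compatible endpoint.
@app.event_hub_message_trigger(arg_name="event",
                               event_hub_name="my-iothub",     # assumed hub name
                               connection="IoTHubConnection")  # assumed app setting
@app.blob_output(arg_name="outblob",
                 path="messages/{rand-guid}.txt",              # assumed container
                 connection="AzureWebJobsStorage")
def iothub_to_blob(event: func.EventHubEvent, outblob: func.Out[str]) -> None:
    # Decode the Base64 message body to a plain string and store it as a blob.
    decoded = base64.b64decode(event.get_body()).decode("utf-8")
    outblob.set(decoded)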
To save the messages coming from sensors in Azure, you can use an Azure Storage Account, either a Blob Container or Table Storage.
Step 1: Go to the IoT Hub resource > Message routing > Custom endpoints, and add a new endpoint for the storage service.
Step 2: Give the endpoint a name, pick a container created for storing the data (messages), and choose the encoding format (either JSON or AVRO), the file name format, the authentication type, etc.
Now we have added a Storage Account endpoint to route messages received by IoT Hub to Azure Blob Container storage.
The final step is to add a route that sends the data to the storage account endpoint.
Please visit these sources for detailed information:
Saving IoT Hub messages to Azure Blob Storage.
Saving IoT Hub Sensor Data to Azure Table Storage
Official Documentation of Azure Functions IoT Hub Trigger
Note: keep storage costs in mind when choosing between these options; this variant is basically recommended only for proof-of-concepts or very small, simple projects. Please refer to this article for more information on which storage account to use with the IoT Hub Trigger to optimize cost.

Is there a simple way to host a JSON document you can read and update in Google Cloud Platform?

What I'm trying to do is host a JSON document that will then, essentially, serve as a hosted version of json-server. I'm aware I can do something similar with My JSON Server, but I plan to move my entire architecture to GCP so want to get more familiar with it.
At first I looked into the Storage JSON API, but it seems like that's just for getting data about buckets rather than the items in the buckets themselves. I created a bucket called test-json-api and added a test-data.json, but there's seemingly no way to access the data in the JSON file via this API.
I'm trying to keep it as simple as possible for testing purposes. In time, I'll probably use a Firestore allocation, but for now I'd like to avoid all that complexity and instead have a simple GET and a PUT/PATCH to a JSON file.
The Storage JSON API you are talking about is only for getting and updating the metadata, not for getting and updating the data inside the object. Objects inside a Google Cloud Storage bucket are immutable; one way to update them is to get the object data from the bucket within your code, update it, then upload it again to the bucket.
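A minimal sketch of that read-modify-write cycle with the google-cloud-storage Python client, reusing the bucket and file names from the question:

import json
from google.cloud import storage

client = storage.Client()
blob = client.bucket("test-json-api").blob("test-data.json")

# GET: download the object and parse it as JSON
doc = json.loads(blob.download_as_text())

# Modify in memory, then overwrite the (immutable) object with a new version
doc["updated"] = True
blob.upload_from_string(json.dumps(doc), content_type="application/json")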
As you want to deal with JSON files, you may explore using Cloud Datastore or Cloud Firestore. Also, if you wish to use Firebase, then you may explore the Firebase Realtime Database.
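Firestore, for instance, maps quite directly onto the GET and PUT/PATCH semantics you mention; a sketch with the Python client, using hypothetical collection and document names:

from google.cloud import firestore

db = firestore.Client()
ref = db.collection("documents").document("test-data")  # hypothetical names

ref.set({"id": 3, "name": "my name"})  # PUT-style full replace
ref.update({"name": "new name"})       # PATCH-style partial update
doc = ref.get().to_dict()              # GET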
A very quick and dirty way to make some of the information in the JSON doc easy to read is to use that information in the blob name. For example, the key information in
doc = {'id':3, 'name':'my name', ... }
could be stored in an object called "doc_3_my name", so that it can be read while browsing the bucket. You can then download the right doc if you need to see all the original data. An object name can be up to 1024 bytes of UTF-8 (with some exclusions), which is normally sufficient for surfacing basic information.
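A short sketch of that naming trick, assuming the same test-json-api bucket as above:

import json
from google.cloud import storage

client = storage.Client()
bucket = client.bucket("test-json-api")  # assumed bucket name

doc = {"id": 3, "name": "my name"}

# Surface the key fields in the object name itself
bucket.blob(f"doc_{doc['id']}_{doc['name']}").upload_from_string(
    json.dumps(doc), content_type="application/json")

# Browsing the bucket now shows the key info without downloading anything
for blob in client.list_blobs("test-json-api", prefix="doc_"):
    print(blob.name)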
Note that you should never store PII like this.
https://cloud.google.com/storage/docs/objects

Is it possible for Firebase to update itself?

I need to provide every user with data from different sites. Each of the sites provides data in JSON format, but some of them have a restriction on the maximal number of requests.
My idea for a solution is to download the data to Firebase periodically; then users will access just the Firebase database.
From the docs it seems to me that Firebase can somehow use HTTP requests.
Can I use Firebase to periodically update itself with an HTTP request?
Or should I set up a server which will do the task?
I am pretty new to these topics, so any tip on where to look for information will be appreciated.
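To make the idea concrete, the periodic update could be sketched as a small scheduled job that mirrors each source into the Realtime Database through its REST API. The URLs below are placeholders and authentication is left out:

import requests

SOURCE_URL = "https://api.example.com/data"  # hypothetical rate-limited source
FIREBASE_URL = "https://my-project.firebaseio.com/cache/data.json"  # hypothetical DB path

def refresh_cache():
    # Fetch the rate-limited source once per scheduled run...
    data = requests.get(SOURCE_URL, timeout=10).json()
    # ...and mirror it into Firebase with a REST PUT, so users read only Firebase.
    requests.put(FIREBASE_URL, json=data, timeout=10).raise_for_status()

# Run refresh_cache() on a schedule (cron, a scheduled Cloud Function, etc.).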

Retrieving WSO2 DAS API usage information

I have a custom front-end set up and running in Grails. This is linked to WSO2 user creation and subscription, and subsequent clicks where users can invoke the published APIs. I also have DAS configured with a custom database (MySQL). I want to keep a graph in my front-end which will show API usage. How do I get this information? Is it possible by some REST call, or perhaps directly by querying the database? I have great scaling needs, so querying might incur high latency costs.
The API usage data is collected, analysed, and exposed by DAS. DAS has a REST API as well as a JavaScript API to retrieve the summarised data.
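A rough sketch of pulling rows from a DAS summary table over the REST API with Python; the host, credentials, table name, and exact endpoint path are assumptions to verify against the DAS Analytics REST API docs for your version:

import requests

DAS = "https://localhost:9443"  # hypothetical DAS host
TABLE = "API_REQUEST_SUMMARY"   # hypothetical summary table name

# Assumed endpoint shape; check the DAS Analytics REST API for the exact route
resp = requests.get(f"{DAS}/analytics/tables/{TABLE}",
                    auth=("admin", "admin"),  # default dev credentials
                    verify=False)             # DAS ships with a self-signed cert
resp.raise_for_status()
for record in resp.json():
    print(record)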

Implementing IoT PowerBI table schema

I'm currently implementing an IoT solution that has a bunch of sensors sending information in JSON format through a gateway.
I was reading about doing this on Azure but couldn't quite figure out how the JSON schema and Event Hubs work together to display the info on Power BI.
Can I create a schema and upload it to Power BI, then connect it to my device?
There are multiple sides to this. To start with, IoT ingestion in Azure is done through Event Hubs, as you've mentioned. If your gateway is able to make a RESTful call to the Event Hubs entry point, Event Hubs will get this data and store it temporarily for the retention period specified. Then Stream Analytics will consume the data from Event Hubs and will enable you to do further processing and divert the data to different outputs. In your case, you can set one of the outputs to be a Power BI dashboard, which you can authorize with an organizational account (more on that later), and the output will automatically be tied to Power BI.
The data schema part is interesting: the JSON itself defines the data table schema to be used on the Power BI side, and it propagates from Event Hubs to Stream Analytics to Power BI with the first JSON package sent. Once the schema is there it is fixed, and the rest of the data being streamed in should be in the same format.
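To illustrate the schema point, here is a sketch of sending a sensor reading to Event Hubs with the azure-eventhub Python SDK. The connection string and hub name are placeholders; what matters is that the first event's field names and types fix the table Power BI ends up with:

import json
from azure.eventhub import EventHubProducerClient, EventData

producer = EventHubProducerClient.from_connection_string(
    "Endpoint=sb://...", eventhub_name="sensors")  # placeholder connection

# The first JSON payload defines the downstream table schema, so keep
# field names and types identical in every subsequent event.
reading = {"deviceId": "sensor-01", "temperature": 21.5,
           "ts": "2021-01-01T00:00:00Z"}

batch = producer.create_batch()
batch.add(EventData(json.dumps(reading)))
producer.send_batch(batch)
producer.close()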
If you don't have an organizational account at hand to use with Power BI, you can register your domain under Azure Active Directory and use that account, since it is considered within your org.
There may be a way of altering the schema afterwards using the Power BI REST API. Kindly find the links below. I haven't tried it myself, though.
https://msdn.microsoft.com/en-us/library/mt203557.aspx
Stream analytics with powerbi
Hope this helps, let me know if you need further info.
One way to achieve this is to send your data to Azure Event Hubs, then read it and send it to Power BI with Stream Analytics. Listing all the steps here would be too long. I suggest that you take a look at a series of blog posts I wrote describing how I built a demo similar to what you're trying to achieve. That should give you enough info to get you started.
http://guyb.ca/IoTAzureDemo