Implementing an IoT Power BI table schema - JSON

I'm currently implementing an IoT solution that has a bunch of sensors sending information in JSON format through a gateway.
I was reading about doing this on Azure, but I couldn't quite figure out how the JSON schema and Event Hubs work together to display the info in Power BI.
Can I create a schema, upload it to Power BI, and then connect it to my device?

There are multiple sides to this. To start with, IoT ingestion in Azure is done through Event Hubs, as you mentioned. If your gateway can make a RESTful call to the Event Hubs endpoint, Event Hubs will receive the data and store it temporarily for the specified retention period. Stream Analytics then consumes the data from Event Hubs and enables you to do further processing and divert the data to different outputs. In your case, you can set one of the outputs to be a Power BI dashboard, which you can authorize with an organizational account (more on that later), and the output will automatically be tied to Power BI.
The data schema part is interesting: the JSON itself defines the table schema to be used on the Power BI side, and it propagates from Event Hubs to Stream Analytics to Power BI with the first JSON package sent. Once the schema is there, it is fixed, and the rest of the data being streamed in should follow the same format.
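To make the schema behaviour concrete, here is a minimal sketch of the kind of RESTful send a gateway could make. This is C# with placeholder values for the namespace, hub, and SAS token; the JSON field names are made up, and whatever fields the first event carries become the Power BI table columns:

    using System;
    using System.Net.Http;
    using System.Text;

    // Sketch: a gateway posting one JSON event to the Event Hubs REST endpoint.
    // The namespace, hub name, and SAS token below are placeholders.
    var http = new HttpClient();
    http.DefaultRequestHeaders.TryAddWithoutValidation(
        "Authorization", "SharedAccessSignature sr=...&sig=...&se=...&skn=..."); // placeholder token

    // Example payload: these field names become the table schema downstream.
    var payload = "{\"deviceId\": \"sensor-01\", \"temperature\": 21.4, \"ts\": \"2016-01-01T12:00:00Z\"}";

    var response = await http.PostAsync(
        "https://mynamespace.servicebus.windows.net/myhub/messages", // placeholder namespace/hub
        new StringContent(payload, Encoding.UTF8, "application/json"));

    Console.WriteLine(response.StatusCode); // Event Hubs returns 201 Created on success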
If you don't have an organizational account at hand to use with Power BI, you can register your domain under Azure Active Directory and use that account, since it is considered part of your organization.
There may be a way of altering the schema afterwards using the Power BI REST API. Kindly find the links below; I haven't tried it myself, though.
https://msdn.microsoft.com/en-us/library/mt203557.aspx
Stream Analytics with Power BI
Hope this helps, let me know if you need further info.

One way to achieve this is to send your data to Azure Event Hubs, then read it and send it to Power BI with Stream Analytics. Listing all the steps here would be too long, so I suggest you take a look at a series of blog posts I wrote describing how I built a demo similar to what you're trying to achieve. That should give you enough info to get started.
http://guyb.ca/IoTAzureDemo

Related

Is there a simple way to host a JSON document you can read and update in Google Cloud Platform?

What I'm trying to do is host a JSON document that will then, essentially, serve as a hosted version of json-server. I'm aware I can do something similar with My JSON Server, but I plan to move my entire architecture to GCP, so I want to get more familiar with it.
At first I looked into the Storage JSON API, but it seems like that's just for getting data about the buckets rather than the items in the buckets themselves. I created a bucket called test-json-api and added a test-data.json, but there's seemingly no way to access the data in the JSON file via this API.
I'm trying to keep it as simple as possible for testing purposes. In time, I'll probably use Firestore, but for now I'd like to avoid all that complexity and instead have a simple GET and a PUT/PATCH to a JSON file.
The Storage JSON API you are talking about is only for getting and updating the metadata, not for getting and updating the data inside the object. Objects inside a Google Cloud Storage bucket are immutable; one way to update them is to download the object data from the bucket in your code, modify it, and then upload it to the bucket again.
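A minimal sketch of that read-modify-write cycle, assuming C# and the Google.Cloud.Storage.V1 client library (the bucket and object names are the ones from the question):

    using System.IO;
    using System.Text;
    using Google.Cloud.Storage.V1;

    var client = StorageClient.Create();

    // Download the current JSON document into memory.
    using var download = new MemoryStream();
    client.DownloadObject("test-json-api", "test-data.json", download);
    string json = Encoding.UTF8.GetString(download.ToArray());

    // ... modify the JSON here (e.g. with System.Text.Json) ...

    // Upload the modified document; this replaces the old object as a whole.
    using var upload = new MemoryStream(Encoding.UTF8.GetBytes(json));
    client.UploadObject("test-json-api", "test-data.json", "application/json", upload);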
As you want to deal with JSON files, you may explore using Cloud Datastore or Cloud Firestore. Also, if you wish to use Firebase, then you may explore the Firebase Realtime Database.
A very quick and dirty way to make it easy to read some of the information in the JSON doc is to use that information in the blob name. For example, the key information in
doc = {'id':3, 'name':'my name', ... }
could be stored in an object called "doc_3_my name", so that it can be read while browsing the bucket. You can then download the right doc if you need to see all the original data. An object name can be up to 1024 bytes of UTF-8 (with some exclusions), which is normally sufficient for surfacing basic information.
Note that you should never store PII like this.
https://cloud.google.com/storage/docs/objects
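A short sketch of this naming trick, again assuming C# and Google.Cloud.Storage.V1; the bucket name and fields are illustrative:

    using System.IO;
    using System.Text;
    using System.Text.Json;
    using Google.Cloud.Storage.V1;

    var doc = new { id = 3, name = "my name" };

    // Surface the key fields in the object name so they are visible
    // when browsing the bucket, e.g. "doc_3_my name".
    string objectName = $"doc_{doc.id}_{doc.name}";

    var client = StorageClient.Create();
    using var stream = new MemoryStream(Encoding.UTF8.GetBytes(JsonSerializer.Serialize(doc)));
    client.UploadObject("test-json-api", objectName, "application/json", stream);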

How to test WebHooks without an on-premise external system?

I'm trying to teach myself about integrating systems via WebHooks.
In a free/hosted GIS system, I can create a WebHook that would, in theory, POST a JSON object to an external system.
The problem is, I don't have an external system that's available right now for receiving the POST.
I think I need some sort of publicly available sample server that would:
Receive the POST requests
Do something with the requests (i.e., create some sort of record)
...so that I could determine if the WebHook worked correctly or not.
How can I test my WebHooks without having an on-premise external system?
I've poked around websites like Postman Echo and AWS Lambda, but to my untrained eye, it seems like they're not quite designed for what I need.
You could use any of these options depending on your requirements:
You could use webhook modules in services like Integromat or Zapier to receive webhook data and then apply transformations.
You could deploy a script on Heroku and point your webhook at the URL generated there (a minimal receiver sketch follows this list).
You could also use services like RequestBin, webhook.site, etc. if you just want to receive and inspect webhook data.
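If you'd rather run the receiver yourself, here is a minimal sketch using an ASP.NET Core minimal API (assuming a .NET 6+ web project with implicit usings; the /webhook route is made up, and you'd expose it publicly with a tunnelling tool such as ngrok so the hosted GIS system can reach it):

    var builder = WebApplication.CreateBuilder(args);
    var app = builder.Build();

    // Log each incoming POST body so you can verify the webhook fired.
    app.MapPost("/webhook", async (HttpRequest request) =>
    {
        using var reader = new StreamReader(request.Body);
        string body = await reader.ReadToEndAsync();
        Console.WriteLine($"Received webhook payload: {body}");
        return Results.Ok(); // 200 tells the sender the delivery succeeded
    });

    app.Run();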
Regards

Is it possible for Firebase to update itself?

I need to provide every user with data from different sites. Each of the sites provides data in JSON format, but some of them restrict the maximum number of requests.
My idea for a solution is to download the data to Firebase periodically; users will then access just the Firebase database.
From the docs, it seems to me that Firebase can somehow use HTTP requests.
Can I use Firebase to periodically update itself via HTTP requests?
Or should I set up a server that will do the task?
I am pretty new to these topics, so any tip on where to look for information will be appreciated.
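For what it's worth, the "server which will do the task" idea can be as small as a scheduled job that pulls the JSON and writes it into the database (the database itself won't fetch external URLs on its own). A rough sketch, assuming C#, Cloud Firestore, a hypothetical source URL, and a placeholder project ID:

    using System;
    using System.Collections.Generic;
    using System.Net.Http;
    using Google.Cloud.Firestore;

    // Run this on a schedule (cron, scheduled Cloud Function, etc.) so the
    // rate-limited source APIs are only hit periodically.
    var http = new HttpClient();
    string json = await http.GetStringAsync("https://example.com/api/data"); // hypothetical source

    FirestoreDb db = FirestoreDb.Create("my-project"); // placeholder project ID

    // Cache the raw payload plus a timestamp; clients then read Firestore
    // instead of calling the rate-limited source directly.
    await db.Collection("siteData").Document("example-site").SetAsync(new Dictionary<string, object>
    {
        ["payload"] = json,
        ["updatedAt"] = Timestamp.GetCurrentTimestamp()
    });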

Using Web Services to Load Customer Orders using .NET

I would like to create web service(s) that I can publish to an external-facing network to allow our customers' teams to send us CRUD operations on customer orders.
What would be the best practice in this case, using Microsoft or open-source technologies, to serve the customer requests?
Option 1:
The web service accepts data as XML/JSON
Stores the data locally in a file
A task picks up the file and attempts to load the data in the background
Send an email for records that failed
The drawback here is that the response from the web service will not be real-time, and validation will be limited.
Option 2:
The web service accepts data as XML/JSON
Attempts the data load
Responds immediately with whether the load was a success or failure
The drawback here is whether the infrastructure can handle it if the volume of orders increases severalfold in the near future.
I am open to using REST with WCF or Web API, and any other helpful technologies that can be scaled as demand grows.
Have you tried message queueing?
Basically, in this architecture, there is a client application (called the producer) that submits a message to the message broker (message queue), and there is another application (called the consumer) that connects to the broker and subscribes to have messages processed.
The message can be just simple information or a task that will be processed by another application.
An application can act as both producer and consumer.
There are many message queue implementations; one of them is RabbitMQ.
Here is a complete intro: https://www.cloudamqp.com/blog/2015-05-18-part1-rabbitmq-for-beginners-what-is-rabbitmq.html
Since the communication is done through a middleman (the message queue), it will not provide an immediate response. But you don't need to send the processing result (i.e., order processing in your case) by email, since the application can subscribe to a message carrying the result.
It is well suited to handling a huge processing load. As always, you can start small (even free) and scale up in the future.
Take a look at the pricing details at https://www.cloudamqp.com/, which provides the RabbitMQ software as a service.
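To make the producer/consumer idea concrete, here is a minimal sketch assuming C#, the RabbitMQ.Client package, and a local broker; the queue name and payload are placeholders:

    using System;
    using System.Text;
    using RabbitMQ.Client;
    using RabbitMQ.Client.Events;

    var factory = new ConnectionFactory { HostName = "localhost" }; // placeholder broker
    using var connection = factory.CreateConnection();
    using var channel = connection.CreateModel();

    // Durable queue so queued orders survive a broker restart.
    channel.QueueDeclare(queue: "orders", durable: true, exclusive: false,
                         autoDelete: false, arguments: null);

    // Producer side: the web service serializes the incoming order and hands it off.
    var body = Encoding.UTF8.GetBytes("{\"orderId\": 42, \"customer\": \"Acme\"}");
    channel.BasicPublish(exchange: "", routingKey: "orders",
                         basicProperties: null, body: body);

    // Consumer side: a separate worker subscribes and processes each order.
    var consumer = new EventingBasicConsumer(channel);
    consumer.Received += (sender, ea) =>
        Console.WriteLine($"Processing order: {Encoding.UTF8.GetString(ea.Body.ToArray())}");
    channel.BasicConsume(queue: "orders", autoAck: true, consumer: consumer);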
I do this using ActiveMQ as a central message broker (you can also use Azure Service Bus if you have an Azure subscription) and pre-baked domain objects. For your scenario it might look like this:
The web service accepts data as XML/JSON
Yes, you have a REST service that accepts multipart requests, let's say JSON, as it's easier to work with on the client side. Before you can send messages, it's usually best to convert the incoming client message to a domain message, so all message consumers know the exact format to expect and can therefore validate the message. I usually create these using xsd.exe on Windows with an XSD file that describes the format of the object; xsd.exe turns that XSD into a C# class. It's then just a matter of taking the JSON fields and populating an Order class. That Order then gets sent as a message to the broker. At that point you're in guaranteed-messaging land, as JMS will take care of that and ActiveMQ will take care of message persistence.
Stores the data locally in a file
Rather than a file, you convert the incoming JSON to a domain class, e.g. an Order instance. You'll never see JSON or XML beyond this point, as it's all domain classes from here.
A task picks up the file and attempts to load the data in the background
Yes, the broker has routes defined in a Camel config that tell it to, for example, send messages coming in on the /client topic to the /orders topic. The task is set up as a durable topic subscriber, so it automatically gets that Order domain object.
Send an email for records that failed
If the Order object contains information about the client (email, etc.), then the task can send the email on failure, but a better pattern is to route the failed Order to the /error topic, where a different task (again a durable topic subscriber) picks it up and logs it, sends email, audits, etc.
if the volume of orders increases severalfold in the near future
You can cluster the brokers and run multiple Order consumers. If you separate the failure handling into another route, all the order task has to do is process the order and route the message to either the /error or /success topic, depending on the outcome. So each route provides a small piece of the puzzle, and you can scale up the pieces if the puzzle gets too big.
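A condensed sketch of the broker interaction described above, assuming C# with the Apache.NMS.ActiveMQ package and a local broker; the topic name, client ID, and Order XML are placeholders (in reality the Order class generated by xsd.exe would be serialized instead):

    using System;
    using Apache.NMS;
    using Apache.NMS.ActiveMQ;

    var factory = new ConnectionFactory("tcp://localhost:61616"); // placeholder broker URL
    using IConnection connection = factory.CreateConnection();
    connection.ClientId = "order-service"; // required for durable subscriptions
    connection.Start();
    using ISession session = connection.CreateSession();
    ITopic topic = session.GetTopic("orders");

    // The durable subscriber (the background task) is registered first, so it
    // receives Orders even if it is offline when they are published.
    using IMessageConsumer consumer = session.CreateDurableConsumer(topic, "order-task", null, false);

    // Producer side: the REST service would serialize the populated Order
    // domain object; a literal XML string stands in for it here.
    using (IMessageProducer producer = session.CreateProducer(topic))
    {
        producer.DeliveryMode = MsgDeliveryMode.Persistent; // broker persists the message
        producer.Send(session.CreateTextMessage("<Order><Id>42</Id></Order>"));
    }

    if (consumer.Receive(TimeSpan.FromSeconds(5)) is ITextMessage message)
        Console.WriteLine($"Received order: {message.Text}");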

Retrieving WSO2 DAS API usage information

I have a custom front end set up and running in Grails. This is linked to WSO2 user creation, subscription, and subsequent clicks where a user can invoke the published APIs. I also have DAS configured with a custom database (MySQL). I want to keep a graph in my front end that will show API usage. How do I get this information? Is it possible via some REST call, or perhaps directly by querying the database? I have significant scaling needs, so direct querying might incur high latency costs.
The API usage data are collected, analysed, and exposed by DAS. DAS has a REST API as well as a JavaScript API for retrieving the summarised data.
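As an illustration only (the endpoint below is a placeholder, not a documented DAS path; check the Analytics REST API docs for your DAS version for the real one), the call pattern from your Grails front end or a back end is an authenticated HTTP GET returning summarised JSON. In C#:

    using System;
    using System.Net.Http;
    using System.Net.Http.Headers;
    using System.Text;

    var http = new HttpClient { BaseAddress = new Uri("https://localhost:9444") }; // placeholder host/port
    var credentials = Convert.ToBase64String(Encoding.UTF8.GetBytes("admin:admin")); // placeholder credentials
    http.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Basic", credentials);

    // Hypothetical summary-table endpoint; substitute the real analytics path.
    string json = await http.GetStringAsync("/analytics/tables/API_USAGE_SUMMARY");
    Console.WriteLine(json);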