REST API data transfer restriction - JSON

Is there a way to allow a REST call to receive JSON data for presentation in the caller's UI at the time of the call, but prevent the user from being able to write that data to a datastore?

Related

Save local data in Dialogflow fulfillment (Node.js)

When I use Dialogflow + Google Assistant, I usually save temporary data with conv.data and save to user storage with conv.user.storage in my webhook (a Cloud Function written in JavaScript).
But when I use Dialogflow + Telegram, I can't find an equivalent.
I want to save a score in a quiz: across 10 questions, the system should be able to keep the score in some kind of temporary storage and calculate the total after the quiz is finished.
Both conv.data and conv.user.storage are Actions on Google concepts, so are only available with that integration.
The rough equivalent of conv.data would be to have a Context with a long lifespan. Your webhook would save the data you're interested in preserving for the session in this context and then retrieve it in a later webhook call.
This is only maintained for a session, however. Any longer-term storage would require you to have a unique identifier for the user which you can use as a key into a database or data store you maintain.
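A minimal sketch of that idea, assuming the Dialogflow V2 webhook request/response format; the quiz-data context name and the score parameter are made up for the quiz example in the question:

```typescript
// A long-lived Context used as per-session storage in a Dialogflow V2 webhook.
// The "quiz-data" context name and score parameter are placeholders for the
// quiz example; the request/response shape is the Dialogflow V2 webhook format.
import express from "express";

const app = express();
app.use(express.json());

app.post("/webhook", (req, res) => {
  // The request carries the session path and any currently active contexts.
  const session: string = req.body.session;
  const contexts: Array<{ name: string; parameters?: any }> =
    req.body.queryResult?.outputContexts ?? [];

  // Read the running score out of our storage context, if it exists yet.
  const quizCtx = contexts.find((c) => c.name.endsWith("/contexts/quiz-data"));
  const score: number = (quizCtx?.parameters?.score ?? 0) + 1; // +1 for a correct answer

  // Write the updated score back with a long lifespan so it survives the
  // remaining turns of the 10-question quiz.
  res.json({
    fulfillmentText: `Your score so far is ${score}.`,
    outputContexts: [
      {
        name: `${session}/contexts/quiz-data`,
        lifespanCount: 50, // comfortably outlives a 10-question quiz
        parameters: { score },
      },
    ],
  });
});

app.listen(3000);
```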

Authentication on a REST API in Power BI

I would like to use Power BI to query data obtained through a REST API. This requires initial authentication with a username and password through a dedicated endpoint, which returns an authentication token in a JSON response.
This token must then be passed in the header of subsequent calls. Is this possible in Power BI? I couldn't find any obvious way to authenticate first and store the result in a variable that can be reused to get the data.
Thanks!
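For reference, the two-step flow the question describes (authenticate once, then pass the token on each data call) looks roughly like this when sketched in TypeScript. The endpoint paths and the token field name are placeholders, and inside Power BI itself this logic would have to be expressed in Power Query rather than in code like this:

```typescript
// The two-step token flow from the question, sketched for illustration only.
// "/auth/login", the "token" field, and "/api/data" are placeholder names.
async function fetchDataWithToken(baseUrl: string, user: string, pass: string) {
  // Step 1: authenticate against the dedicated endpoint; the token comes
  // back in a JSON response.
  const authRes = await fetch(`${baseUrl}/auth/login`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ username: user, password: pass }),
  });
  const { token } = await authRes.json();

  // Step 2: pass the token in the header of the subsequent data call.
  const dataRes = await fetch(`${baseUrl}/api/data`, {
    headers: { Authorization: `Bearer ${token}` },
  });
  return dataRes.json();
}
```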

ASP.NET Web API HTTP response data size

I have an ASP.NET Web API that gets a list of data from a database using a very heavy SQL query (a stored procedure) and then serializes it to JSON. The result can sometimes be more than 100,000 rows, which exceeds the 4 MB maximum I have for the HTTP JSON response. I've been trying to use pagination to limit the result size, but it pulls down performance: every time the user clicks to the next page, the heavy SQL command runs again. If I don't use pagination, though, the result is sometimes larger than 4 MB and my client-side grid won't render properly. I don't have a way to check the JSON data size in the Web API before sending it back to the client. So my questions are:

1. Is there any way to check the data size in ASP.NET Web API before sending it back to the client? For example, if it's more than 4 MB, send a response saying "please narrow your date range to return less data" (see the sketch below). Would this be a good idea in application design?
2. Is there any way to save the entire result in a cache somewhere with ASP.NET Web API, so that when the user paginates, the data comes from the cache instead of another database query?
3. Is there any way to store the entire result in a cache or a temp file on the client side (Angular 5), so that pagination doesn't trigger another HTTP call to the Web API?

I would be more than happy to hear about anyone's experience or suggestions! Thank you very much!
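A sketch of questions 1 and 2 together, written here in Node/TypeScript rather than ASP.NET since the logic carries over directly (in ASP.NET the equivalent checks would live in a controller). The 4 MB limit, the 10-minute cache TTL, and the getRowsFromDb stub are assumptions for illustration:

```typescript
// Question 1: measure the serialized payload before sending it.
// Question 2: run the heavy query once, then serve later pages from a cache.
import express from "express";

const MAX_BYTES = 4 * 1024 * 1024; // the 4 MB response limit
const cache = new Map<string, { rows: unknown[]; expires: number }>();

// Stand-in for the heavy stored-procedure call.
async function getRowsFromDb(from: string, to: string): Promise<unknown[]> {
  return []; // the real query would run here
}

const app = express();

app.get("/data", async (req, res) => {
  const { from, to, page = "1", pageSize = "500" } =
    req.query as Record<string, string>;
  const key = `${from}:${to}`;

  // Question 2: run the heavy query once per date range, then serve every
  // later page of the same range from the cache until it expires.
  let entry = cache.get(key);
  if (!entry || entry.expires < Date.now()) {
    entry = { rows: await getRowsFromDb(from, to), expires: Date.now() + 10 * 60_000 };
    cache.set(key, entry);
  }

  const start = (Number(page) - 1) * Number(pageSize);
  const slice = entry.rows.slice(start, start + Number(pageSize));

  // Question 1: check the serialized size before sending it to the client.
  const body = JSON.stringify(slice);
  if (Buffer.byteLength(body, "utf8") > MAX_BYTES) {
    res.status(413).json({ error: "Please narrow your date range to return less data." });
    return;
  }
  res.type("application/json").send(body);
});

app.listen(3000);
```

For question 3, the same idea works client-side: keep the full result in an Angular service and slice pages from it in memory, accepting that a very large result still has to cross the wire once.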

Implementing an IoT Power BI table schema

I'm currently implementing an IoT solution that has a bunch of sensors sending information in JSON format through a gateway.
I was reading about doing this on Azure but couldn't quite figure out how the JSON schema and Event Hubs work together to display the info in Power BI.
Can I create a schema and upload it to Power BI, then connect it to my device?
There are multiple sides to this. To start with, IoT ingestion in Azure is done through Event Hubs, as you mentioned. If your gateway is able to make a RESTful call to the Event Hubs entry point, Event Hubs will receive the data and store it temporarily for the specified retention period. Stream Analytics will then consume the data from Event Hubs, enable you to do further processing, and divert the data to different outputs. In your case, you can set one of the outputs to be a Power BI dashboard, which you can authorize with an organizational account (more on that later), and the output will automatically be tied to Power BI.

The data schema part is interesting: the JSON itself defines the data table schema to be used on the Power BI side, and it propagates from Event Hubs to Stream Analytics to Power BI with the first JSON package sent. Once the schema is there it is fixed, and the rest of the data being streamed in should be in the same format.
If you don't have an organizational account at hand to use with Power BI, you can register your domain under Azure Active Directory and use that account, since it is considered within your org.
There may be a way of altering the schema afterwards using the Power BI REST API. Kindly find the links below; I haven't tried it myself, though.
https://msdn.microsoft.com/en-us/library/mt203557.aspx
Stream analytics with powerbi
Hope this helps, let me know if you need further info.
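To make the schema propagation concrete, here is a rough sketch of a gateway posting one reading to Event Hubs over REST. The namespace and hub names are placeholders, and generating the SAS token from the Event Hubs access key is omitted; it is read from an environment variable here:

```typescript
// A gateway posting one sensor reading to Event Hubs over REST. The namespace
// and hub names are placeholders; the SAS token must be generated from your
// Event Hubs access key and is assumed to be available in an env variable.
const NAMESPACE = "my-iot-namespace"; // placeholder
const EVENT_HUB = "sensor-readings";  // placeholder
const SAS_TOKEN = process.env.EVENT_HUB_SAS_TOKEN!; // "SharedAccessSignature sr=...&sig=...&se=...&skn=..."

async function sendReading(): Promise<void> {
  // The fields of the first JSON package sent become the table schema on the
  // Power BI side, so later events should keep exactly this shape.
  const reading = {
    deviceId: "sensor-42",
    temperature: 21.7,
    humidity: 0.54,
    timestamp: new Date().toISOString(),
  };

  const res = await fetch(
    `https://${NAMESPACE}.servicebus.windows.net/${EVENT_HUB}/messages`,
    {
      method: "POST",
      headers: { Authorization: SAS_TOKEN, "Content-Type": "application/json" },
      body: JSON.stringify(reading),
    }
  );
  if (res.status !== 201) throw new Error(`Send failed with HTTP ${res.status}`);
}
```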
One way to achieve this is to send your data to Azure Event Hubs, read it, and send it to Power BI with Stream Analytics. Listing all the steps here would be too long, so I suggest you take a look at a series of blog posts I wrote describing how I built a demo similar to what you're trying to achieve. That should give you enough info to get started.
http://guyb.ca/IoTAzureDemo

How do I insert/update data into offsite databases that don't have an API available?

I'm trying to figure out how to insert/update data in offsite databases that don't have an API available. Since there's no API, I've thought of an approach I could take to insert/update the data in their database.
They would first need to build a script, place it at an accessible location on their web server, and supply me the URL. I could then make a cURL POST request to that URL, passing a JSON array of the data that needs to be inserted. The script on their server would handle parsing the JSON array and doing the insert/update in the database.
I think this should work, but what security issues would I be opening them up to?
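The call being described would look something like this; the endpoint URL, the X-Api-Key header, and the record fields are all placeholders for illustration:

```typescript
// The call the question describes: a cURL-style POST of a JSON array to the
// URL they supply. The URL, the X-Api-Key header, and the record fields are
// placeholders for illustration.
const records = [
  { externalId: 101, name: "Widget A", price: 9.99 },
  { externalId: 102, name: "Widget B", price: 14.5 },
];

async function pushRecords(endpointUrl: string, sharedSecret: string): Promise<void> {
  const res = await fetch(endpointUrl, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      "X-Api-Key": sharedSecret, // a shared credential so the script can reject other callers
    },
    body: JSON.stringify(records),
  });
  if (!res.ok) throw new Error(`Upsert failed with HTTP ${res.status}`);
}
```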
What you described is them creating an API. Just because the URL invokes a script and isn't written in something like Java or PHP doesn't mean it's not an API.
You need to make sure the URL is secure so only authorized people can invoke it, and they would probably want to do data validation (see the sketch below).
You should let them decide whether that is easier than standing up a more robust, non-script-based solution.
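A rough sketch of the receiving script with those two safeguards, a shared-secret check and payload validation, in TypeScript/Express; the X-Api-Key header, the record fields, and the upsertRecord helper are hypothetical:

```typescript
// The receiving script with the two safeguards mentioned above: a
// shared-secret check so only authorized callers can invoke it, and data
// validation before anything touches the database. The X-Api-Key header,
// record fields, and upsertRecord helper are hypothetical.
import express from "express";

const app = express();
app.use(express.json());

// Stand-in for the real insert/update, which should use parameterized
// queries rather than string concatenation.
async function upsertRecord(rec: { externalId: number; name: string; price: number }) {
  /* UPSERT into the database here */
}

app.post("/import", async (req, res) => {
  // 1. Authorization: reject anyone who doesn't present the shared secret.
  if (req.header("X-Api-Key") !== process.env.IMPORT_API_KEY) {
    res.sendStatus(401);
    return;
  }

  // 2. Validation: make sure the payload is the expected JSON array of records.
  const body = req.body;
  if (!Array.isArray(body)) {
    res.status(400).json({ error: "Expected a JSON array of records." });
    return;
  }
  for (const rec of body) {
    if (
      typeof rec.externalId !== "number" ||
      typeof rec.name !== "string" ||
      typeof rec.price !== "number"
    ) {
      res.status(400).json({ error: "A record failed validation." });
      return;
    }
  }

  for (const rec of body) await upsertRecord(rec);
  res.json({ imported: body.length });
});

app.listen(8080);
```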