When I use Dialogflow + Google Assistant, I usually save temporary data with conv.data and persist data with conv.user.storage in my webhook (a Cloud Function written in JavaScript).
But when I use Dialogflow + Telegram, I can't find an equivalent.
I want to save a score in a quiz: across 10 questions, the system should keep the score in some temporary storage and calculate the total when the quiz finishes.
Both conv.data and conv.user.storage are Actions on Google concepts, so are only available with that integration.
The rough equivalent of conv.data would be to have a Context with a long lifespan. Your webhook would save the data you're interested in preserving for the session in this context and then retrieve it in a later webhook call.
This is only maintained for a session, however. Any longer-term storage would require you to have a unique identifier for the user which you can use as a key into a database or data store you maintain.
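For example, a minimal sketch of the long-lived Context approach, using the raw Dialogflow v2 webhook format in a Node.js Cloud Function, might look like the following. The context name "quiz-session", the lifespan, and the scoring logic are purely illustrative:

    // Read and write a long-lived context that carries the quiz score.
    const functions = require('firebase-functions');

    exports.dialogflowWebhook = functions.https.onRequest((req, res) => {
      const session = req.body.session; // "projects/<project>/agent/sessions/<session-id>"
      const contexts = req.body.queryResult.outputContexts || [];

      // Look up the current score, defaulting to 0 on the first question.
      const quizCtx = contexts.find(c => c.name.endsWith('/contexts/quiz-session'));
      const score = (quizCtx && quizCtx.parameters && quizCtx.parameters.score) || 0;

      // ...evaluate the user's answer here, then write the context back.
      const newScore = score + 1;
      res.json({
        fulfillmentText: `Correct! Your score is now ${newScore}.`,
        outputContexts: [{
          name: `${session}/contexts/quiz-session`,
          lifespanCount: 50, // long enough to outlive the 10 questions
          parameters: { score: newScore }
        }]
      });
    });

This works the same way for the Telegram integration, since contexts live in Dialogflow itself rather than in any particular channel.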
What I'm trying to do is host a JSON document that will then, essentially, serve as a hosted version of json-server. I'm aware I can do something similar with My JSON Server, but I plan to move my entire architecture to GCP, so I want to get more familiar with it.
At first I looked into the Storage JSON API, but it seems like that's just for getting data about buckets rather than the items in the buckets themselves. I created a bucket called test-json-api and added a test-data.json, but there's seemingly no way to access the data in the JSON file via this API.
I'm trying to keep it as simple as possible for testing purposes. In time, I'll probably use Firestore, but for now I'd like to avoid all that complexity and instead have a simple GET and a PUT/PATCH to a JSON file.
The Storage JSON API you are talking about is only for getting and updating metadata, not for getting and updating the data inside an object. Objects in a Google Cloud Storage bucket are immutable; one way to update one is to download the object's data from the bucket in your code, update it, and then upload it to the bucket again.
As you want to deal with JSON documents, you may explore Cloud Datastore or Cloud Firestore. If you wish to use Firebase, you may also explore the Firebase Realtime Database.
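If you do want to stay with a plain JSON file in a bucket for now, a rough sketch of that read-modify-write cycle with the @google-cloud/storage Node.js client could look like this (the bucket and object names are the ones from the question; the merge logic is just an example):

    const { Storage } = require('@google-cloud/storage');
    const storage = new Storage();

    async function patchJson(patch) {
      const file = storage.bucket('test-json-api').file('test-data.json');

      // "GET": download the whole object and parse it.
      const [contents] = await file.download();
      const data = JSON.parse(contents.toString());

      // "PATCH": merge the changes in memory.
      Object.assign(data, patch);

      // "PUT": overwrite the object with the updated document.
      await file.save(JSON.stringify(data), { contentType: 'application/json' });
      return data;
    }

Keep in mind this is not safe with concurrent writers, which is one of the reasons Datastore/Firestore becomes the better fit once you outgrow a single test file.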
A very quick and dirty way to make it easy to read some of the information in the JSON doc is to use that information in the blob name. For example, the key information in
doc = {'id':3, 'name':'my name', ... }
could be stored in an object called "doc_3_my name", so that it can be read while browsing the bucket. You can then download the right doc if you need to see all the original data. An object name can be up to 1024 bytes of UTF-8 (with some exclusions), which is normally sufficient for surfacing basic information.
Note that you should never store PII like this.
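Under the same assumptions as above (the test-json-api bucket from the question, an illustrative saveDoc helper), the naming trick could be as simple as:

    const { Storage } = require('@google-cloud/storage');
    const storage = new Storage();

    // Store the document under a name that surfaces its key fields, e.g. "doc_3_my name".
    async function saveDoc(doc) {
      const objectName = `doc_${doc.id}_${doc.name}`;
      await storage
        .bucket('test-json-api')
        .file(objectName)
        .save(JSON.stringify(doc), { contentType: 'application/json' });
    }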
https://cloud.google.com/storage/docs/objects
I am using Google Cloud Tasks to create tasks and Firebase Functions to handle them.
I can pass a payload to my handler using the body parameter.
But I'm not sure whether it's secure to pass private data in the payload without encoding it, or whether it's better to use some cipher.
I also found this:
In App Engine tasks, both the queue and the task handler run within the same Cloud project. Traffic is encrypted during transport and never leaves Google datacenters
but I'm not sure if it applies to this case
Thanks!
It all depends on what you put behind the word "secure".
Is your data encrypted at rest? Yes, even in Cloud Tasks (before the task is triggered).
Is your data encrypted in transit? Yes, even when Cloud Tasks triggers your endpoint.
Can someone see the data of a task? Yes: a user with sufficient permissions can perform a GET on the task and view its body data.
So, yes, it's secure if you set the correct permissions for your users. You can also cipher the data, with Cloud KMS for example, if the Cloud Tasks payload is private and you don't want an operator who manages Cloud Tasks to access it; in that case, don't grant that user any permission on Cloud KMS.
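If you do decide to cipher the payload, a sketch with the @google-cloud/kms and @google-cloud/tasks Node.js clients might look like the following. Every resource name here (project, location, queue, key ring, key, handler URL) is a placeholder, not something Cloud Tasks requires:

    const { CloudTasksClient } = require('@google-cloud/tasks');
    const { KeyManagementServiceClient } = require('@google-cloud/kms');

    const tasks = new CloudTasksClient();
    const kms = new KeyManagementServiceClient();

    async function enqueueEncrypted(payload) {
      // Encrypt the payload with Cloud KMS so it is unreadable in the task body.
      const keyName = kms.cryptoKeyPath('my-project', 'us-central1', 'my-keyring', 'task-key');
      const [{ ciphertext }] = await kms.encrypt({
        name: keyName,
        plaintext: Buffer.from(JSON.stringify(payload)),
      });

      // Create an HTTP task whose body is the base64-encoded ciphertext.
      const parent = tasks.queuePath('my-project', 'us-central1', 'my-queue');
      const task = {
        httpRequest: {
          httpMethod: 'POST',
          url: 'https://us-central1-my-project.cloudfunctions.net/taskHandler',
          body: Buffer.from(ciphertext).toString('base64'),
        },
      };
      await tasks.createTask({ parent, task });
    }

The Firebase Function that handles the task would then call kms.decrypt({ name: keyName, ciphertext }) using a service account that has the decrypt role, while the people who only manage the queue get no KMS permission at all.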
I'm considering creating a simple API to serve my app. Basically it would be a very simple monitoring system (sending a true/false value) that uploads the value to a database or BigQuery. According to the Firebase documentation, the Functions service stores the IP address temporarily (which is considered personal data) - https://firebase.google.com/support/privacy/.
Does this mean I would need to ask users for consent to use this API in order to comply with GDPR?
I am working with an API for automating tasks in a company I work for.
The software will run from a single server and there will be only one instance of the sensitive data.
I have a tool that our team uses at the end of every day.
The token only needs to be requested once, since it has a roughly 30-minute timeout.
Since I work with the Salesforce API, the user has to enter his/her password either way, since that ties the ticket to their account.
The API's OAuth2 tokens and all of their sensitive components need to be secured.
I use PowerShell and a module called FileCryptography to produce an AES-encrypted version of my config.json.
In my config file, I store all the component keys that are needed to generate the token itself.
Steps:
1. Base64-encode the strings.
2. Use the FileCryptography module to encrypt the JSON file with a secret key into an AES file.
3. When the API needs to produce a token, it works in reverse to get all the data back.
Is this a valid way of securing sensitive API data, or is there a more efficient way?
P.S.: I understand that nothing is completely secure and anything can be reverse-engineered; I just need something that will keep at least 90% of people away from this data.
I'm currently implementing an IoT solution that has a bunch of sensors sending information in JSON format through a gateway.
I was reading about doing this on Azure, but I couldn't quite figure out how the JSON schema and Event Hubs work together to display the info on PowerBI.
Can I create a schema, upload it to PowerBI, and then connect it to my device?
There are multiple sides to this. To start with, IoT ingestion in Azure is done through Event Hubs, as you've mentioned. If your gateway is able to make a RESTful call to the Event Hubs endpoint, Event Hubs will receive this data and store it temporarily for the retention period specified. Then Stream Analytics will consume the data from Event Hubs, let you do further processing, and divert the data to different outputs. In your case, you can set one of the outputs to be a PowerBI dashboard, which you authorize with an organizational account (more on that later), and the output will automatically be tied to PowerBI.

The data-schema part is interesting: the JSON itself defines the data table schema to be used on the PowerBI side, and it propagates from Event Hubs to Stream Analytics to PowerBI with the first JSON package sent. Once the schema is there it is fixed, and the rest of the data being streamed in should be in the same format.
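To illustrate the "first package defines the schema" point, here is a minimal sender using the @azure/event-hubs JavaScript SDK; the connection string, hub name, and field names are placeholders, and your gateway may equally use the raw REST endpoint instead:

    const { EventHubProducerClient } = require('@azure/event-hubs');

    async function sendReading() {
      const producer = new EventHubProducerClient('<connection-string>', '<event-hub-name>');
      const batch = await producer.createBatch();

      // The shape of this first JSON body is what flows through Stream Analytics
      // and ends up as the table schema on the PowerBI side.
      batch.tryAdd({
        body: { deviceId: 'sensor-01', temperature: 21.5, timestamp: new Date().toISOString() },
      });

      await producer.sendBatch(batch);
      await producer.close();
    }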
If you don't have an organizational account at hand to use with PowerBI, you can register your domain under Azure Active Directory and use that account since it is considered within your org.
There may be a way of altering the schema afterwards using the PowerBI REST API. Kindly find the links below; I haven't tried it myself, though.
https://msdn.microsoft.com/en-us/library/mt203557.aspx
Stream analytics with powerbi
Hope this helps, let me know if you need further info.
One way to achieve this is to send your data to Azure Event Hubs, read it, and send it to PowerBI with Stream Analytics. Listing all the steps here would be too long, so I suggest that you take a look at a series of blog posts I wrote describing how I built a demo similar to what you're trying to achieve. That should give you enough info to get started.
http://guyb.ca/IoTAzureDemo