I've been going over the docs for Google Cloud Datastore. I have connected to it from a Google Apps Script and want to experiment with using it instead of GAS's built-in ScriptDB, which has many problems.
I can't figure out how to define a "kind" with the JSON API. It looks like I can write objects, but each object is required to belong to a certain "kind". I've seen how to define kinds in Google App Engine using Python, but I don't think that applies here.
As a (relatively) schemaless transactional database, Cloud Datastore does not need to be told about a kind before you write an Entity of that kind. Simply construct an Entity with the desired kind in its Key's path.
See the Kinds and Identifiers documentation for examples.
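To make this concrete, here is a sketch of a commit request body for the Datastore v1 REST API (POST to `.../v1/projects/PROJECT_ID:commit`), which you could send from Apps Script with `UrlFetchApp` after authenticating. The point is that the kind appears only in the entity's key path; the kind name "Task" and the properties here are illustrative.

```python
import json

# Commit request body for the Datastore v1 REST API. The kind "Task"
# is never declared anywhere beforehand -- it exists as soon as an
# entity's key path names it.
commit_body = {
    "mode": "NON_TRANSACTIONAL",
    "mutations": [
        {
            "insert": {
                # The kind lives in the key path; with no "id" or
                # "name" field, Datastore auto-allocates an ID.
                "key": {"path": [{"kind": "Task"}]},
                "properties": {
                    "description": {"stringValue": "Buy milk"},
                    "done": {"booleanValue": False},
                },
            }
        }
    ],
}

print(json.dumps(commit_body, indent=2))
```

The first write of this payload both creates the entity and, in effect, brings the "Task" kind into existence.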
Related
I have several JSON files that represent the payloads for different APIs (I can map which API to call based on the file name, but other methods could be applied as well).
What is the best practice for populating my data in the application from those JSON files?
My first thought was to use an automation framework (REST Assured, for example) to accomplish this, but I think it might be overkill for my scenario.
P.S. A snapshot of the DB or querying the DB directly is not an option because of the nature of the application.
I'm working with the Gmail API and need to save the historyId to determine what changed in the mailbox between Pub/Sub events.
However, I don't need to store all the historyIds; I just need to pull the old historyId, use it in my function, and overwrite it with the new one.
I'm wondering what kind of architecture would be best for this. I can't use the temporary storage of Google Cloud Functions because it is not persistent.
Using Google Sheets requires extra authorization within the Cloud Function. Do I really need to create a new Cloud Storage bucket for one text file?
It seems like Cloud Datastore would be your best alternative to Cloud Storage if your use case is to persist, retrieve, and update the historyId as log data at low cost. Google Cloud Functions together with Cloud Datastore gives you a serverless log system.
Datastore is a NoSQL document database built for automatic scaling, high performance, and ease of application development. It can handle large amounts of non-relational data at a relatively low price, and it has a user-friendly web console.
I found a very useful web tutorial you can follow to architect a Cloud Functions with Cloud Datastore solution:
Create Cloud Datastore
Obtain Datastore Credential
Update your Code
Deploy your Cloud Function
Send a Request to Cloud Function
Check the Log on Datastore
Take a look at the full tutorial here. Hope this helps you.
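The read-then-overwrite flow for a single historyId record can be sketched as below. To keep this runnable on its own, `DatastoreStub` is an in-memory stand-in for the real google-cloud-datastore client; in an actual Cloud Function you would replace it with `datastore.Client()` and the equivalent get/put calls. The kind and key names are made up.

```python
class DatastoreStub:
    """In-memory stand-in for a Datastore client (illustration only)."""

    def __init__(self):
        self._entities = {}

    def get(self, key):
        return self._entities.get(key)

    def put(self, key, entity):
        self._entities[key] = entity


# One well-known key holding a single tiny record -- no bucket,
# no spreadsheet, just one entity.
KEY = ("GmailSync", "historyId")


def handle_pubsub_event(client, new_history_id):
    """Fetch the previous historyId, overwrite it with the value from
    the Pub/Sub event, and return the old one for processing."""
    entity = client.get(KEY) or {}
    old_history_id = entity.get("value")
    client.put(KEY, {"value": new_history_id})
    return old_history_id


client = DatastoreStub()
handle_pubsub_event(client, "1001")        # first event: nothing stored yet
old = handle_pubsub_event(client, "1042")  # returns "1001", stores "1042"
```

Since each event only ever reads and rewrites one entity, this stays comfortably inside Datastore's free tier for this kind of workload.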
Our company is in the process of migrating our Google App Engine website's data from Google Cloud Datastore to MySQL (Cloud SQL, to be precise).
I have written all the conversion routines to copy the current data entity tables from Datastore to MySQL, and I am at the point where the code in our repos needs rewriting to interact with MySQL instead of Datastore.
Pretty much all of the current Datastore entities subclass ndb.Model, apart from one key table: our User entities, which subclass webapp2_extras.appengine.auth.models.User.
Subclassing from this does a lot of nice things behind the scenes, such as setting up Unique and UserToken entries and taking care of sessions: creating them when a user logs in and destroying them when a user logs out.
Now that the user table will live in MySQL, all these niceties that Datastore was providing will need implementing separately, and I am at a loss where to start.
What would be the best way to achieve this?
The site uses the Google App Engine standard environment with the Python 2.7 runtime and the webapp2 framework. I am using SQLAlchemy to interact with the MySQL instance in the backend.
Any help or advice on this would be greatly appreciated and please let me know if you need any further specifics.
Searching the webapp2 docs, I found this information:
http://webapp2.readthedocs.io/en/latest/tutorials/auth.html#login-with-sessions
I did a little poking around in the Google App Engine SDK to see how the same interface methods are set up within webapp2_extras.appengine.auth.models.User.
I created an AuthModel interface with the basic properties webapp2 needs to create a session, and I implemented the methods described on that page. The methods call SQLAlchemy queries, which in turn interact with the SQLAlchemy User class I had set up, and return the parameters webapp2 and GAE need to do the rest.
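A hedged sketch of that interface is below. The method names (`get_by_auth_token`, `create_auth_token`, `delete_auth_token`, `get_id`) mirror what webapp2_extras.appengine.auth.models.User exposes; to keep the sketch self-contained, plain dicts stand in for the SQLAlchemy session and queries, and everything else (class name, token format) is illustrative.

```python
import secrets


class AuthModel:
    """Minimal user model exposing the token methods webapp2's auth
    machinery calls. The dicts below are stand-ins for SQLAlchemy
    queries against User and UserToken tables."""

    _tokens = {}  # (user_id, token) -> True; real code: a UserToken table

    def __init__(self, user_id):
        self.user_id = user_id

    def get_id(self):
        return self.user_id

    @classmethod
    def get_by_auth_token(cls, user_id, token):
        # Real code: session.query(UserToken).filter_by(...).first()
        if (user_id, token) in cls._tokens:
            return cls(user_id), token
        return None, None

    @classmethod
    def create_auth_token(cls, user_id):
        # Real code: insert a UserToken row and commit the session.
        token = secrets.token_hex(16)
        cls._tokens[(user_id, token)] = True
        return token

    @classmethod
    def delete_auth_token(cls, user_id, token):
        # Real code: delete the matching UserToken row.
        cls._tokens.pop((user_id, token), None)
```

With this shape in place, webapp2's session handling keeps working while the actual storage moves from Datastore entities to MySQL rows.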
I am having problems getting WF4 to call JSON services. Has anyone used other JSON-based workflow engines that are either free or open source and have a good designer?
What do you want to do? Do you want an activity to call a specific service that returns a JSON object? If so, then you just need to create a custom activity that calls this service with an HTTP client, for instance.
Have a look at Workflow Engine. Although the engine itself uses XML, I'm sure it wouldn't be hard to implement a converter. It also has a visual HTML5-based designer.
We would like to hook up an iPhone app directly to a database in the cloud. From the iPhone app, we would like to use some REST API to retrieve data from the cloud database. We prefer using MySQL or Mongo as the data store.
We prefer not setting up our database system on AWS or another hosted provider. We would like a cloud service that offers the database layer out of the box.
What database-as-a-service companies support this, and what are the pros/cons of each?
We looked around, and it seems like MongoLab is the only service offering a REST API.
Mongo is a great database for API interaction, since its query language uses JavaScript expressions. Here are two open source API libraries implemented in Python:
Sleepy Mongoose
Eve
Primary advantage: JavaScript can handle the querying part of data retrieval.
Primary disadvantage: the API libraries above have difficulty expressing complex queries.
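To give a feel for the query style, here is a sketch of building a find URL for Sleepy Mongoose's REST interface. The endpoint shape (`/db/collection/_find` with a JSON `criteria` parameter on the default port 27080) follows Sleepy Mongoose's documented conventions; the database and collection names are made up.

```python
import json
import urllib.parse


def find_url(host, db, collection, criteria, limit=None):
    """Build a Sleepy Mongoose _find URL. The criteria dict is a
    plain MongoDB query document, JSON-encoded into the query string."""
    params = {"criteria": json.dumps(criteria)}
    if limit is not None:
        params["limit"] = limit
    query = urllib.parse.urlencode(params)
    return "http://{}:27080/{}/{}/_find?{}".format(host, db, collection, query)


url = find_url("localhost", "blog", "posts", {"status": "published"}, limit=10)
```

Simple equality filters like this translate cleanly; it is the deeply nested operators (`$or`, `$elemMatch`, aggregation-style pipelines) where these REST layers tend to get awkward, which is the disadvantage noted above.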