Google Cloud Platform - I need a webhook to get JSON data. How to approach it?

I am fairly new to the Google Cloud Platform world and I am struggling with my first steps there.
So, what I want to know is how to make a webhook app that will run 24/7 and "catch" data sent from another 3rd-party service (later I will try to do something with this data: manipulate it and push it into a DB, but that's another question to ask).
I have set up a Linux-based instance on GCP, but what's next?
I am familiar with PHP, but this time I want to do it in Python (which I am learning nowadays).
Which service in GCP should I use, and how do I set up the server to catch all the data the 3rd-party service sends?

This sounds like a perfect fit for Google App Engine. As long as the 3rd-party service makes HTTP requests, App Engine is a great fit. You can write your application in Python, PHP, Java, or just about anything else, then GAE takes care of the rest. No need to manage Linux, instances, firewall rules, or anything else.
If your load is minimal, you may even fit into the free tier and pay nothing to run your app.
Check out the GAE Python docs at https://cloud.google.com/appengine/docs/python/.

If you also need recurring background work alongside your webhook, App Engine supports cron jobs. Here is a guide on scheduling Python tasks on Google App Engine: Scheduling Tasks With Cron for Python
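A minimal sketch of the "catch JSON" part, using only Python's standard library (a hypothetical handler; on App Engine you would typically expose a WSGI app such as Flask instead, but the request handling looks the same):

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def parse_payload(body):
    """Decode a JSON request body into a dict; raise ValueError if it isn't a JSON object."""
    data = json.loads(body.decode("utf-8"))
    if not isinstance(data, dict):
        raise ValueError("expected a JSON object")
    return data

class WebhookHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        try:
            payload = parse_payload(self.rfile.read(length))
        except ValueError:
            self.send_response(400)
            self.end_headers()
            return
        # TODO: manipulate `payload` and push it into a DB here
        self.send_response(200)
        self.end_headers()

def run(port=8080):
    HTTPServer(("", port), WebhookHandler).serve_forever()

# run()  # blocks; on App Engine you would expose a WSGI app (e.g. Flask) instead
```

The 3rd-party service then just POSTs its JSON to your app's URL; everything after `parse_payload` is yours to fill in.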

Related

Running python script on database file located in google drive

I have a database file located on my own (private) Google Drive, and it's updated on a daily basis.
I wrote a Python script to track the data in the database file, but I currently have to download the DB file to my PC and then run my script locally, every single day.
I am wondering if there are better options, so I wouldn't have to download the DB file and move it manually.
From some searching on the web I found that there is no way to run the script inside the Google Drive folder (obviously due to security issues), and using Google Cloud Platform is not a real option since it's not a free service (and, as I understood it, there is no free trial).
Anyway, any method that would make my life easier would be happily accepted.
Sorry, that's not possible as far as I know, at least not in the way you have asked the question.
It may be that you are looking for a database hosting service, which, as a rule, is not free. I remember seeing a SQL viewer around; I don't know if it is still available, and I don't think it was accessible from a local Python script.
Google Cloud Platform, like other providers, does however offer a free tier of services, though whether you stay within it depends on how much usage you need and also on your location, and it can get quite involved quite quickly. There are also various options to choose from; BigQuery, for example, may be something you are interested in.
A Possible Workaround
I assume that the reason you don't want to download it is that it is tedious, not that your network connection or hard drive can't handle it. If so:
The simple workaround may be to have your Python script automatically download the database via the [Google Drive API](https://developers.google.com/drive), modify or read it, and then upload it again if you need to. This can all be done within one script using the API, so all you would need to do is run the Python script; you wouldn't need to manually download the file and move it into a folder. You would also not need to worry about going over the free tier and getting billed.
You could even use the Google Drive desktop software to keep the database file synced locally. Since the file is local, I doubt the software would be able to block a Python script on the file system.
If you don't even want to think about it, you could set up the Python script with a cron job, or a Scheduled Task if you are on Windows.
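As a sketch of that workaround, assuming you already have an OAuth 2.0 access token (obtaining one via the Google auth libraries is a separate step), the Drive v3 `files.get` endpoint with `alt=media` downloads the file's content; the file ID and token below are placeholders:

```python
import urllib.request

DRIVE_API = "https://www.googleapis.com/drive/v3/files"

def drive_download_url(file_id):
    """Drive v3 'download file content' endpoint (alt=media returns the raw bytes)."""
    return "%s/%s?alt=media" % (DRIVE_API, file_id)

def download_db(file_id, access_token, dest_path):
    """Fetch the file with an OAuth 2.0 bearer token and save it locally."""
    req = urllib.request.Request(
        drive_download_url(file_id),
        headers={"Authorization": "Bearer " + access_token},
    )
    with urllib.request.urlopen(req) as resp, open(dest_path, "wb") as out:
        out.write(resp.read())

# download_db("your-file-id", "your-access-token", "local.db")
# ...then run your tracking script against local.db
```

Uploading the modified file back would be a similar request against the `files.update` endpoint, and the official `google-api-python-client` wraps all of this if you prefer not to build requests by hand.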

coding, data migration and deployment process using jive

I am new to Jive and currently going through the documentation provided at https://docs.jivesoftware.com/.
What I am looking for:
Is there a specific editor for writing code in Jive-x?
How do I migrate data into Jive?
What deployment process does Jive follow, i.e. where to develop, test, and deploy?
So if anyone who has worked with Jive can provide some links/tips, that would help.
There is no specific editor for writing code for Jive. It is a personal preference, which might also depend on whether you are writing a plugin in Java or an add-on in JS. I generally prefer IntelliJ.
The best option for migrating data into Jive is the REST API. It is important to rate-limit the requests so as not to overload the instance, but the API should be able to handle a considerable number of requests, depending on the underlying infrastructure. You could in theory also use the DB to migrate data into Jive, but that would require deep knowledge of the Jive architecture, and the chances of breaking something are high.
For development and early testing, the best option is a local instance, which you can set up following these steps. For full end-to-end testing, the best option is a UAT environment that replicates the production instance/infrastructure as closely as possible.
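As an illustration of the rate-limited REST approach, here is a hedged Python sketch that POSTs documents to the v3 `/api/core/v3/contents` endpoint; the base URL, credentials, and field values are placeholders, and a real migration would also need error handling and retries:

```python
import json
import time
import urllib.request

def make_document(subject, html):
    """Minimal Jive v3 content payload for a document (field names per the v3 REST API)."""
    return {
        "type": "document",
        "subject": subject,
        "content": {"type": "text/html", "text": html},
    }

def migrate(base_url, auth_header, docs, delay=0.5):
    """POST each (subject, html) pair to /api/core/v3/contents, pausing between requests."""
    for subject, html in docs:
        body = json.dumps(make_document(subject, html)).encode("utf-8")
        req = urllib.request.Request(
            base_url.rstrip("/") + "/api/core/v3/contents",
            data=body,
            headers={"Content-Type": "application/json", "Authorization": auth_header},
        )
        urllib.request.urlopen(req)
        time.sleep(delay)  # crude rate limit so the instance isn't overloaded

# migrate("https://your.jive.example", "Basic ...", [("Title", "<p>Hello</p>")])
```

The fixed `sleep` is the simplest possible throttle; for a large migration you would tune the delay (or batch size) against what your instance's infrastructure can absorb.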

Firebase Automatic Sync with Local PC

I'm working on a project that takes data from a Weintek HMI, puts it on a web server, and then sends it to an application I created in Android Studio.
I've found that Firebase can help me with this task.
In EasyBuilder, which works with my HMI, I can create a MySQL database to store the data.
The problem is how to automatically update the Firebase database from the MySQL database at a fixed interval, so I can access the data in the Android app.
If there is no solution with MySQL, can someone suggest another method to extract the data and use some web server to sync it with the Android app?
I don't know your specific needs in terms of data volume or application, but as a workaround, maybe this can help you:
I usually use MQTT, which many Weintek HMIs support, to send telemetry data, and then use Node-RED to process and redirect the data to a database, email, SMS, Telegram, CSV, TXT... depending on the need. In your case that could be Firebase (which I have never used).
It works great for me, as I don't have to worry about HMI limitations.
The downsides are data reliability, in terms of confirming that when the HMI sends, the server listens and writes (though there are certainly ways to deal with this), and the fact that you need to have a server running Node-RED.
If you have never done so: in Weintek HMIs you can easily send the MQTT payload cyclically using macros.
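To sketch the listening side in Python (Node-RED would normally play this role), assuming the HMI publishes JSON payloads and that the third-party paho-mqtt package is installed; the broker and topic names below are made up:

```python
import json

def parse_telemetry(payload):
    """Decode one MQTT message payload (assumed to be JSON) into a dict."""
    data = json.loads(payload.decode("utf-8"))
    if not isinstance(data, dict):
        raise ValueError("expected a JSON object from the HMI")
    return data

def run(broker="broker.local", topic="hmi/telemetry"):
    # Requires the third-party paho-mqtt package (pip install paho-mqtt).
    import paho.mqtt.client as mqtt

    def on_message(client, userdata, msg):
        record = parse_telemetry(msg.payload)
        # TODO: forward `record` to Firebase / a database here
        print(msg.topic, record)

    client = mqtt.Client()
    client.on_message = on_message
    client.connect(broker)
    client.subscribe(topic)
    client.loop_forever()

# run()  # blocks; broker and topic names above are placeholders
```

For the reliability concern the answer mentions, publishing at MQTT QoS 1 and acknowledging writes back to the HMI is one common approach, though the details depend on your setup.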

Azure Web Jobs - Triggered from queue - triggered or continuous?

I have a WebJob I've created and I want to deploy it to Azure. However, I'm confused by the configuration options when you create it.
It's a WebJob that's triggered from an Azure Storage queue. It works fine locally.
However, when I go to create the WebJob in Azure, my choices are Triggered or Continuous.
If I choose Continuous, I get a choice of Single or Multi.
If I choose Triggered, I'm given a choice of Scheduled or Manual. I don't want Scheduled, and I'm not sure what Manual means; that doesn't seem right either.
I know the WebJob that's triggered from the Azure queue is really "polling" rather than triggered... so it seems like Continuous is the right choice. But I'm not sure.
So the question is: when creating a WebJob that's triggered from an Azure queue, what is the right deployment configuration?
It sounds like you are using the Azure WebJobs SDK. In SDK scenarios, even though your individual functions are 'triggered', the WebJob as a whole runs continuously (i.e. your exe keeps running and does its own internal triggering). So what you want is Continuous / Multi. There is no reason to use Singleton in most cases, and it's not relevant anyway until you scale out to multiple instances.

Schedule function to pull CSV file into database

Sorry, I have not done any coding in Web API, so I don't have any sample to share here.
But I just wanted to know how we can schedule a task on the server for a Web API to read a CSV file every 15 or 30 minutes.
From the client we are successfully able to push the CSV to my FTP server, where my application is located. But how do we retrieve the data from the CSV at a fixed interval?
WebAPI is the wrong tool for the job.
You can certainly push (or upload specifically) CSV data to a WebAPI method, and that method can do whatever you need to do with that data. But WebAPI doesn't run on a timer. It's not going to periodically look for data to process, you have to send it that data.
A web application is a request/response system. In order to do something, you have to send it a request.
For an application that periodically runs on a timer, consider a Windows Service or even simply a Console Application which is scheduled to run with some job scheduler (Windows Task Scheduler, cron, etc.). That would run periodically without the need for input or a user interface at all.
Now, if the code you need to use is in the WebAPI application already, that's ok. You can refactor that code into a separate Class Library in the same solution as the WebAPI application. Then add the new application (Windows Service or Console Application) to that solution and reference that same Class Library. Both applications can share the same supporting code. Indeed, the application layer should be as thin as possible over a set of common business logic that can be shared by multiple applications.
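To make the scheduled-job shape concrete, here is a small sketch (in Python rather than .NET, purely for illustration) of a console script that parses the uploaded CSV and hands each row to the persistence layer; the path and processing are placeholders:

```python
import csv
import io

def parse_rows(text):
    """Parse CSV text into a list of dicts keyed by the header row."""
    return list(csv.DictReader(io.StringIO(text)))

def import_file(path):
    """Read one uploaded CSV from disk and hand each row to the database layer."""
    with open(path, newline="") as f:
        rows = parse_rows(f.read())
    for row in rows:
        pass  # TODO: insert `row` via the shared business-logic layer
    return len(rows)

# Schedule this script (e.g. every 15 minutes) with Windows Task Scheduler or cron;
# it exits after each run, so no always-on web process is needed.
```

The same structure carries over to a .NET console application: the parsing and insert logic lives in the shared class library, and the scheduler just launches a thin executable around it.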