How to schedule a REST API call with SSIS?

I have built a RESTful API for my web scraper and was wondering if there is a way to schedule API calls in SSIS, something similar to a cron job scheduler.

You can create an SSIS package with a Script Task that calls the API. Deploy the package, then create a SQL Server Agent Job that runs the package on whatever schedule you need (daily, weekly, monthly, every x minutes, etc.).
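(The Script Task itself is written in C#, typically with HttpClient; purely to illustrate the call it wraps, here is a minimal sketch in Python, with the endpoint URL as a placeholder assumption.)

import urllib.request  # standard library; the endpoint below is a placeholder

API_URL = "https://example.com/api/scrape"

def call_scraper_api():
    # The scheduled package ultimately just issues this HTTP request.
    with urllib.request.urlopen(API_URL, timeout=30) as response:
        print(response.status, response.read()[:200])

if __name__ == "__main__":
    call_scraper_api()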
Hope this helps.

Related

Can Timer Triggers call Activity Triggers in Azure?

I am creating a timer trigger function that is going to perform certain checks. If a certain condition is hit, I want to send an email, and I have set this up through an Activity Trigger. I keep getting the error:
The function 'activityTriggerName' doesn't exist. Additional info: No orchestrator functions are currently registered!
I am brand new to durable functions and triggers in Azure; any direction on whether this is allowed, or other ideas, would be appreciated.
Activity functions can only be initiated by an orchestrator. The usual pattern is to have a trigger function start an orchestrator, and the orchestrator then initiates one or more activity functions.
If you're using Visual Studio, the Orchestrator template will create a sample that includes an HTTP trigger, an orchestrator and an activity.
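To make the shape of that pattern concrete, here is a minimal sketch using the Python Durable Functions programming model (the same structure applies in C#). The function names ("check_orchestrator", "send_email") and the payload are assumptions, and each function lives in its own folder with a function.json declaring its bindings (timer trigger plus durable client for the starter, orchestration trigger for the orchestrator, activity trigger for the activity).

# timer_starter/__init__.py - timer-triggered client that starts the orchestration
import azure.functions as func
import azure.durable_functions as df

async def main(mytimer: func.TimerRequest, starter: str) -> None:
    client = df.DurableOrchestrationClient(starter)
    await client.start_new("check_orchestrator", None, None)

# check_orchestrator/__init__.py - orchestrator that decides whether to run the activity
import azure.durable_functions as df

def orchestrator_function(context: df.DurableOrchestrationContext):
    condition_hit = True  # stand-in for your real check
    if condition_hit:
        yield context.call_activity("send_email", "alert payload")

main = df.Orchestrator.create(orchestrator_function)

# send_email/__init__.py - activity that actually sends the email
def main(payload: str) -> str:
    # send the email here (SendGrid binding, SMTP, etc.)
    return "sent: " + payload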

How to use Parallel.ForEach for an ADO RecordSet in SSIS

I am doing an SSIS project for ETL. My scenario is as follows:
I want to loop in parallel over an ADO recordset and call an API from a Script Component. How can I call the API using Parallel.ForEach, or is there another alternative?
Thanks for the help in advance.

How to call a Dataflow job written in Python (3.x) from Cloud Functions in GCP

My goal is to create a mechanism so that when a new file is uploaded to Cloud Storage, it triggers a Cloud Function. Eventually, this Cloud Function will trigger a Cloud Dataflow job.
I have a restriction that the Cloud Dataflow job should be written in Python, and the Cloud Function should also be in Python.
The problem I have been facing is that I cannot call the Cloud Dataflow job from a Cloud Function.
Yes, you can! Start by packaging your Dataflow job as a template. There is a small update to make so the job reads its input values as runtime template parameters; this changes only the job's header/configuration, not the "real" processing.
Then trigger your template via the REST API. There is no dedicated Python client library for that, but there is an answer here with a code example.
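For illustration, a minimal sketch of that trigger, assuming a background Cloud Function bound to the bucket's object-finalize event and a Dataflow template already staged in GCS. It uses the generic google-api-python-client to call the templates.launch REST method; the project, region, template path, and parameter names are placeholder assumptions.

from googleapiclient.discovery import build  # package: google-api-python-client

# Placeholder assumptions - replace with your own values.
PROJECT = "my-project"
REGION = "us-central1"
TEMPLATE_PATH = "gs://my-bucket/templates/my_dataflow_template"

def on_new_file(event, context):
    # Background Cloud Function triggered by google.storage.object.finalize.
    input_file = "gs://{}/{}".format(event["bucket"], event["name"])

    dataflow = build("dataflow", "v1b3", cache_discovery=False)
    request = dataflow.projects().locations().templates().launch(
        projectId=PROJECT,
        location=REGION,
        gcsPath=TEMPLATE_PATH,
        body={
            "jobName": "file-triggered-job",
            # "input" must match a ValueProvider parameter defined in the template.
            "parameters": {"input": input_file},
        },
    )
    response = request.execute()
    print(response)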

How to invoke Python script from google apps script

I have a small Google Apps Script which is invoked with an onClick event to retrieve the last row added. The next step is that I need the Apps Script to invoke a Python script internally. Is there a way to do that?
I am new to Apps Script, so any help is appreciated. Thank you in advance.
Google Apps Script is essentially JavaScript, so no, you cannot execute Python directly inside it.
One way I can think of is exposing that Python code via an HTTP call: you host it behind a URL and invoke that URL from Apps Script. REST APIs come in handy for breaking the boundaries between languages.
But yes, in this approach you do need to host an API. I do that using Google Cloud Functions, which are really brilliant for this: building an API in Python/Node.js is quick and easy (a few minutes) and comes with a ready HTTPS URL.
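As a sketch of the hosted side only (the entry-point name, URL, and payload field are assumptions): an HTTP-triggered Cloud Function in Python that Apps Script could call with UrlFetchApp.fetch once deployed.

import json

def run_script(request):
    # HTTP Cloud Function entry point; request is a flask.Request.
    payload = request.get_json(silent=True) or {}
    last_row = payload.get("lastRow")  # e.g. the row your onClick handler retrieved

    # ... run whatever the Python logic needs to do with last_row ...
    result = {"status": "ok", "received": last_row}

    return (json.dumps(result), 200, {"Content-Type": "application/json"})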

How to run a Google Apps Script function from another project

I have a few sheets with functions bound to each spreadsheet. Each spreadsheet has its own functions and uses SpreadsheetApp.getUi to run an HTML service. I would like to initiate function calls in all the sheets from a master spreadsheet project. Is that possible? Something like getting a handle to the other spreadsheet's project and running a script in that project?
You have two options:
Publish your scripts as libraries and include each one in your other script projects.
Publish your scripts as web apps, with specific functions acting as individual pseudo-webhooks. Sort of like a distributed API.
There are pros and cons to each. Neither is really about maintainability.
The library option gives you code completion, whereas the web app option lets you (if you wish) run code asynchronously.
Both carry different speed penalties. Library-enabled scripts are slower, as described in the documentation, and web apps will be slower because of UrlFetch latency.
Library functions use the runtime allowed for them in the host script, whereas web apps effectively extend runtime and some quotas.
Documentation:
Publish your scripts as a library
Running apps script as an endpoint
Using a library seems to me the fastest and simplest way, and it also protects your code.
For this you need:
1 spreadsheet (or any Google doc) containing code - SCRIPT A
1 stand-alone script - SCRIPT B - that you publish as a library.
In the editor of SCRIPT A:
you add the library key
in the SCRIPT A code you can then call the functions of SCRIPT B
function callFunctionOfScriptB() {
  LibraryIdentifier.functionNameinScriptB();
}
You find LibraryIdentifier in the Identifier column of the popup when you click Resources > Libraries.
functionNameinScriptB is the name of the function you want to call in SCRIPT B.