I am looking to create an SSIS package which downloads a file from an Azure DevOps Git repository.
The file is an Excel spreadsheet.
The SSIS package should download this file to a local directory, where it can then be used for further processing.
Is this possible to achieve using SSIS?
I'm afraid not; this cannot currently be achieved via SSIS. What is available in Azure DevOps is building and deploying SSIS packages; running an SSIS package inside an Azure DevOps pipeline is not supported at this time.
At present, you can make use of Azure Data Factory, which is another Azure service. See this doc.
But as far as I know, it also does not support a copy-file activity via SSIS. If you want the Excel file to be available for further processing with SSIS, you may consider using the Azure File Copy task to copy the Excel file to Azure Blob Storage:
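As a sketch of that suggestion, a pipeline step using the Azure File Copy task might look like the following; the file path, service connection, storage account, and container names are assumptions, not values from the question:

```yaml
steps:
- checkout: self   # pulls the repo (including the .xlsx) onto the agent

- task: AzureFileCopy@4
  displayName: 'Copy Excel file to blob storage'
  inputs:
    SourcePath: '$(Build.SourcesDirectory)/data/report.xlsx'  # hypothetical path in the repo
    azureSubscription: 'my-azure-connection'                  # hypothetical service connection
    Destination: 'AzureBlob'
    storage: 'mystorageaccount'                               # hypothetical storage account
    ContainerName: 'ssis-input'
```

From the blob container, downstream processing (e.g. Azure Data Factory, or an SSIS package using the Azure Feature Pack) can pick the file up.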
You can do just about anything you want in Azure DevOps using either the PowerShell release task or the Run PowerShell on Remote Machines release task. The only catch is setting up WinRM on the remote servers: opening ports/firewall settings, using X509 self-signed certs if you want to do this over SSL, configuring WinRM listeners, setting up trusted hosts, etc. It took me two months to finally get WinRM set up and working correctly from Azure DevOps (in my case that included creating a special security group policy to allow the WinRM services to run unimpeded on the remote machines joined to the domain).

Once you have WinRM working, though, Azure DevOps can drive anything you can script with PowerShell, so the effort was worth it for me. If you embark on this adventure, take the time to write some PowerShell test scripts that call Invoke-Command to exercise all of the WinRM security features mentioned above; this will save you a lot of time troubleshooting a remote connection over WinRM.
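As a hedged sketch, once WinRM is in place, a release step using the Run PowerShell on Remote Machines task might look like this (the machine name, port, and variable names are assumptions):

```yaml
- task: PowerShellOnTargetMachines@3
  displayName: 'Smoke-test WinRM connectivity'
  inputs:
    Machines: 'server01.contoso.local:5986'   # hypothetical host; 5986 = WinRM over HTTPS
    UserName: '$(deployUser)'                 # secret pipeline variables, not hard-coded creds
    UserPassword: '$(deployPassword)'
    CommunicationProtocol: 'Https'
    ScriptType: 'Inline'
    InlineScript: 'Write-Output "WinRM OK on $env:COMPUTERNAME"'
```

Running a trivial inline script like this first separates WinRM/connectivity problems from problems in your actual deployment script.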
I have configured an Azure Release pipeline for my deployment. But I want to run MySQL scripts on a MySQL server using Azure DevOps tasks. Can someone help me with the best way to run the scripts?
1. What task should I use from the Azure marketplace?
2. Should I run all scripts in one task, or each script as a separate task?
3. How do I wait while a script is running?
MySQL Toolkit for Windows is a VSTS / TFS extension that contains helpful tasks for build and release definitions targeting MySQL servers. You can run an ad-hoc MySQL command, a script, or a collection of scripts on Windows agents, including hosted Windows agents (Linux agents are not supported).
In addition, you can add a step that runs a PowerShell/batch script to execute the SQL scripts, and you can also create a custom build task and publish it to VSTS.
BTW, you could use PowerShell's Start-Sleep to wait while a script is running.
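If you'd rather avoid a marketplace extension, a single command-line step that loops over an ordered script folder addresses all three questions: one task keeps the ordering explicit, and because the mysql client runs each script synchronously, the step automatically waits until it finishes. A minimal sketch, assuming hypothetical DB_HOST/DB_USER/DB_PASS pipeline variables and a migrations folder:

```yaml
steps:
- script: |
    set -e                                  # stop on the first failing script
    for f in migrations/*.sql; do           # hypothetical folder of ordered scripts
      echo "Running $f"
      mysql --host=$(DB_HOST) --user=$(DB_USER) --password=$(DB_PASS) mydb < "$f"
    done
  displayName: 'Run MySQL scripts'
```

Naming the scripts with a sortable prefix (001_, 002_, ...) keeps the execution order deterministic.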
Update: you could use the Copy Files task to copy files from a source folder to a target folder using match patterns.
I have two servers that I am working on locally. The first is a front-end in Vue.js, and the second is a back-end in Flask; the client makes API requests to the back-end.
I have to upload these two to a remote Linux VM (Debian), for which I have credentials, and I can successfully connect to it via PuTTY.
How do I transfer my two directories to the VM?
Then, I should change the address that the client uses for API requests to the server; is that all, or will I have to do something else?
You can copy directories over the scp or sftp protocols. In your case, this is most easily done with WinSCP.

scp and sftp (implemented by WinSCP) and ssh (implemented by PuTTY) all run over the SSH protocol. PuTTY gives you a remote terminal (i.e. you can run commands on the server), while WinSCP uploads, downloads and manages files on it.

If you are developing something, it is likely that you will need to do this deployment regularly. These tools are only good for one-off deployments; in professional environments deployment is automated and happens quickly.

It is very likely that you also have a database in your project. The most common options there are either db-level synchronization or dumping the database into files and synchronizing at the file level, but that is another topic.

It is also unlikely that you need two different VMs for the Vue.js and Flask apps. You can host them together on a single VM, which makes your task far easier.

You will likely have a hard time getting your deployment working well on your server; this is all just the beginning. But don't worry: once you've learned it all, it will be easy!
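To make the first transfer concrete, here is a minimal sketch: bundle the two directories into one archive locally, then push it with pscp (which ships with PuTTY) or scp. The folder names and VM address are assumptions, and the remote-transfer commands are shown as comments since they need your real host:

```shell
# Stand-ins for your two local project directories (use your real folder names)
mkdir -p vue-frontend flask-backend

# Bundle both into a single archive so only one file has to be transferred
tar czf deploy.tar.gz vue-frontend flask-backend

# From your machine, push the archive to the VM (address is hypothetical):
#   pscp deploy.tar.gz user@your-vm-address:/home/user/
# Then unpack it on the VM inside your PuTTY session:
#   tar xzf deploy.tar.gz
```

A single archive also preserves file permissions and avoids transferring thousands of small files (e.g. node_modules, which you should rebuild on the VM anyway).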
I am trying to deploy a project from VSTS to Azure. I have a publish settings file and need to know how to use it in a pipeline to deploy the project to Azure, or any other solution to deploy to Azure.
Yes, you can store the plain password in a secret variable of the build/release, then pass the password through an MSBuild argument (/p:Password={variable}).
After that you can specify the publish profile (a .pubxml file instead of the .publishsettings file) in the Visual Studio Build task (e.g. /p:SkipInvalidConfigurations=true /p:DeployOnBuild=true /p:PublishProfile="{profile name}").
Alternatively, you can write a script and add a Batch Script task to run it and deploy to Azure. Please refer to Deploying to Azure from VSTS using publish profiles and msdeploy for details.
You can also use the Azure App Service Deploy task to deploy the Azure Web App. Please refer to How to deploy to Azure using Team Services Release Management for details.
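Putting the pieces above together, a Visual Studio Build step might look like the following sketch (the publish profile name and secret variable name are assumptions):

```yaml
- task: VSBuild@1
  displayName: 'Build and deploy using a publish profile'
  inputs:
    solution: '**/*.sln'
    msbuildArgs: >-
      /p:SkipInvalidConfigurations=true
      /p:DeployOnBuild=true
      /p:PublishProfile="MyAzureProfile"
      /p:Password=$(PublishPassword)
```

Here MyAzureProfile refers to a .pubxml file checked into the project's Properties/PublishProfiles folder, and PublishPassword is the secret variable holding the deployment password.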
I'm using SQL 2016 and we're converting over a bunch of SSIS packages (from way back in 2005). In our old architecture we had development and production. We're moving now to source control in VSO and we're staging our deployments. We have local development on developer machines, then we post to Dev, then to QA, then Staging, then finally production.
We've figured out how to use SSIS Environment Variables (AWESOME!) and we're able to run the packages on local dev machines from inside Visual Studio using SSDT. Then we deploy the project to an .ispac file, which we copy to the Dev server and import into our SSIS Catalog in SSMS. Then in SSMS we try to change the variables for each environment.
The problem is the Data Connection. I was passing the connection string and the authentication password as a parameter into a shared connection. So the connection read those values in from the project parameter when executing. Then we were going to change those values for each environment. It turns out on the server we need to execute using Integrated Security. Since we're testing remotely we can't use Integrated Security on our local machines. So basically local dev is SQL Authentication but Dev, QA, Staging, and Production environments will all be tested on servers using Integrated Security.
I can't seem to get this to work right. I have two Project Parameters DB_ConnectionString and DB_Password. I also have a shared Connection (OLEDB SQL) which in the package is parameterized. We use the Project Parameters for the connection so at execution it's using the project parameters to plug in the string and password.
When I publish to live I need Integrated Security. So I tried putting an Integrated Security connection string into the Environment value for DB_ConnectionString, but then it still requires a password. That isn't really working; I'm getting a connection error:
SSIS Error Code DTS_OLEDBERROR
"Invalid authorization specification"
If you can avoid using parameters, you will be better off, in my opinion. When publishing to the catalog, you can set your connection strings based on the environment. This way you can have SQL auth on development machines and integrated auth when running in any of your server environments.
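As a sketch of that approach, assuming a project already deployed to a hypothetical Finance folder in SSISDB, you can create one environment per stage and store the stage-specific connection string in it using the built-in catalog stored procedures (all names below are illustrative):

```sql
USE SSISDB;

-- One environment per stage (repeat for Dev, Staging, Production)
EXEC catalog.create_environment
     @folder_name = N'Finance', @environment_name = N'QA';

-- Integrated Security string for the server environments; the local-dev
-- environment would hold a SQL-auth string instead, with no password needed here
EXEC catalog.create_environment_variable
     @folder_name = N'Finance', @environment_name = N'QA',
     @variable_name = N'DB_ConnectionString', @data_type = N'String',
     @sensitive = 0,
     @value = N'Data Source=qa-sql01;Initial Catalog=Sales;Provider=SQLNCLI11.1;Integrated Security=SSPI;';
```

You then map the shared connection manager's ConnectionString property to this environment variable on the project's Configure page in SSMS, and pick the matching environment reference when executing in each stage.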
Is there a way to configure the generated SSIS deployment wizard (manifest) to only allow for SQL Server deployment? Basically I'm looking to eliminate this screen, or disable the file system deployment option.
It is possible to build a small app based on the SSIS API that implements only the SQL Server deployment type. For an example, see http://www.selectsifiso.net/?p=510
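If building a custom app is more than you need, the dtutil command-line tool that ships with SQL Server can also copy a package from the file system straight into SQL Server, bypassing the wizard (and its file-system option) entirely. A sketch with illustrative server and path names:

```bat
rem Copy a package file into the MSDB package store on the target server
dtutil /FILE "C:\packages\LoadSales.dtsx" /DESTSERVER "SQLPROD01" /COPY SQL;"\LoadSales"
```

Wrapping this in a batch script gives operators a one-command deployment with no wizard screens to click through.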