Azure DevOps deploy does not always overwrite cshtml (Razor) files

Sometimes when we do a deploy, the last committed files with the .cshtml extension are not updated. We have not seen this happen with other file types. It could be that a file is in use and can't be overwritten. Is it possible to add an extra step to the deployment process so we can avoid this?

That is strange. If you are using the Azure App Service Deploy task within Azure Pipelines to deploy to your Web App, you have the option to Remove additional files at destination. Enabling this option deletes files in the Azure App Service that have no matching files in the App Service artifact package or folder being deployed.
Depending on the chosen deployment method, there are other helpful deployment options, such as:
Rename locked files: Rename any file that is still in use by the web server by enabling the MSDEPLOY_RENAME_LOCKED_FILES=1 setting in the Azure App Service's application settings. When set, this option lets msdeploy rename files that are locked during app deployment, so you can avoid deployment failures with ERROR_FILE_IN_USE errors.
Take App Offline: Select this option to take the Azure App Service offline by placing an app_offline.htm file in the root directory before the synchronization operation begins. The file will be removed after the synchronization completes successfully.
Having these in place can streamline your deployments and make them robust. Here is the complete reference for the task: Azure App Service Deploy task.
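For illustration, here is a rough YAML sketch of the task with these options enabled (the service connection, app name and package path are placeholders; check the task reference for the exact input names and defaults):
- task: AzureRmWebAppDeployment@4
  inputs:
    azureSubscription: 'MyAzureServiceConnection'  # placeholder service connection
    appType: 'webApp'
    WebAppName: 'my-web-app'                       # placeholder app name
    package: '$(System.DefaultWorkingDirectory)/**/*.zip'
    RemoveAdditionalFilesFlag: true                # Remove additional files at destination
    RenameFilesFlag: true                          # Rename locked files
    TakeAppOfflineFlag: true                       # Take App Offline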

Using a Configuration File instead of the System Registry

The Portal UI React application uses Registry settings instead of a local settings.json file to run the application in the local environment. This is a pain for the developer because every time a Registry setting is updated the system needs a restart, which is not an advisable approach in a fast-moving development environment. Using the Registry settings instead of a local JSON-based configuration file means less flexibility and more dependency.
I propose moving all the configuration into a local JSON file and checking that file into the application's repository.
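For example, a hypothetical settings.json (the keys below are only illustrative, not our real settings) could look like this:
{
  "apiBaseUrl": "https://localhost:5001/api",
  "featureFlags": {
    "enableNewDashboard": true
  },
  "logLevel": "debug"
}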
If there is any other approach that would make this scenario easier to work with, then please share your thoughts.
Thanks
Iftekhar

How do I download CSV changes from my Heroku app?

I made a Heroku app with Streamlit and I use two CSV files to save changes. The app is a schedule for group plans; the changes people make in the schedule are visible online, but when I check my git repository it is not updated.
How can I download the modified CSV files?
Thanks
Your git repository stores the application source code, which gets deployed to Heroku.
At runtime your application uses the Heroku dyno's local storage when saving files (not the git repository). You need to download/fetch the CSV files from the running application.
Given that the Heroku filesystem is ephemeral (local files are removed when the application restarts), it is not a good idea to persist data on the local filesystem; use external storage instead.
You can check out some options in the HerokuFiles GitHub repository. If you want the CSV files to be stored with the application source code, you can use PyGithub to perform a commit.
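A minimal sketch of committing the CSV back to GitHub with PyGithub (the token, repository name and file path are placeholders):
from github import Github

gh = Github("YOUR_GITHUB_TOKEN")            # personal access token (placeholder)
repo = gh.get_repo("your-user/your-repo")   # target repository (placeholder)

with open("schedule.csv") as f:             # CSV file written by the Streamlit app
    new_content = f.read()

# Update the existing file in the repository (use create_file if it does not exist yet)
existing = repo.get_contents("schedule.csv")
repo.update_file(existing.path, "Update schedule from Heroku app", new_content, existing.sha)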

Trigger external pipeline / job after Jira in OpenShift has started

I'm running Jira in OpenShift using the basic image from Atlassian: https://hub.docker.com/r/atlassian/jira-software
So far most things work fine.
I installed a plugin using the web UI, which worked as well.
But now I'm running into an issue when a pod is restarted. The pod uses the image and naturally (as specified) my plugin is not installed anymore. I can install the plugin via web service calls and register it as an OSGi module for Jira, but I don't want to do this manually. Building a pipeline or job for this is quite easy (I'm thinking Jenkins or Ansible Tower), but so far I didn't find a way to trigger this pipeline after the pod is started (or, better, after Jira is started).
Anyone got an idea how to handle this?
Thanks and best regards. Sebastian
Why not create a custom image based on the Atlassian image with everything you need installed?
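A rough Dockerfile sketch (the image tag, plugin file name and JIRA home path are assumptions based on the default layout of the Atlassian image):
FROM atlassian/jira-software
# Bake the plugin into the default JIRA home so it is present after every Pod restart
COPY my-plugin.jar /var/atlassian/application-data/jira/plugins/installed-plugins/
Keep in mind that files baked into that path will be hidden if you later mount jira-home as a volume.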
As far as I know, there isn't a way to trigger a pipeline when a Pod is started; only Webhook, Image Change, and Config Change triggers are available. You'll need to write a Jenkinsfile to script all of the installation and setup you want, but then that can be triggered in one of the three ways mentioned.
I'm thinking an Image Change trigger would work best for you, so when the latest version of Atlassian's image comes out, you can run your pipeline to set everything up on the latest version.
Also, just curious, but do you have some persistent storage attached to the Jira pod? If not, you'll lose everything in Jira if the Pod dies; that means tickets, boards, comments, everything.
Update:
Looking at this page, it looks like most of the stuff you're trying to persist is stored in jira-home, so maybe mounting that as a persistent volume will be a good solution for you (see the volume-mount sketch after the quoted documentation below).
You're correct that the tickets are stored in the database, but I'm guessing the database connection settings are getting wiped when the Pod is cycled.
The jira-home directory stores your application and database connection settings, as well as a subdirectory for your plugins.
dbconfig.xml
This file (located at the root of your JIRA home directory) defines all details for JIRA's database connection. This file is typically created by running the JIRA setup wizard on new installations of JIRA or by configuring a database connection using the JIRA configuration tool.
You can also create your own dbconfig.xml file. This is useful if you need to specify additional parameters for your specific database configuration, which are not generated by the setup wizard or JIRA configuration tool. For more information, refer to the 'manual' connection instructions of the appropriate database configuration guide in Connecting JIRA to a database.
jira-config.properties
This file (also located at the root of your JIRA home directory) stores custom values for most of JIRA's advanced configuration settings. Properties defined in this file override the default values defined in the jpm.xml file (located in your JIRA application installation directory). See Advanced JIRA configuration for more information.
In new JIRA installations, this file may not initially exist and, if so, will need to be created manually. See Making changes to the jira-config.properties file for more information. This file is typically present in JIRA installations upgraded from version 4.3 or earlier, whose advanced configuration options had been customized (from their default values).
plugins/
This is the directory where plugins built on Atlassian's Plugin Framework 2 (i.e. 'Plugins 2' plugins) are stored. If you are installing a new 'Plugins 2' plugin, you will need to deploy it into this directory under the installed-plugins sub-directory. 'Plugins 1' plugins should be stored in the JIRA application installation directory.
This directory is created on JIRA startup, if it does not exist already.
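A rough sketch of what mounting jira-home as a persistent volume could look like in the pod template of the DeploymentConfig (the claim name is a placeholder; the mount path assumes the default JIRA home of the Atlassian image):
spec:
  template:
    spec:
      containers:
        - name: jira
          image: atlassian/jira-software
          volumeMounts:
            - name: jira-home
              mountPath: /var/atlassian/application-data/jira
      volumes:
        - name: jira-home
          persistentVolumeClaim:
            claimName: jira-home-pvc   # placeholder claim name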

How could I automatically upload files from my directory to a server? [duplicate]

An ASP.NET application (running on Windows Server/IIS 7) has to transfer large files uploaded by the current user to an external SFTP server. Due to the file size, the idea is to do this asynchronously.
The idea is that the ASP.NET application stores the uploaded file in a local directory on the Windows server. The current user can continue their work. A Windows service or a Quartz job (other tools(*)/ideas?) is then responsible for transferring the file to the external SFTP server.
(*) Are there existing tools that listen for changes in a Windows directory and then move the files to an SFTP server (including handling communication errors/retries)?
If there is no existing solution, have you had similar requirements? What do we have to consider? Because the connection to the SFTP server is not very stable, we need robust error handling with automatic retry functionality.
To watch for changes in a local directory in .NET, use the FileSystemWatcher class.
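A minimal sketch of using it (the folder path is a placeholder; hand the detected files to your own SFTP upload and retry logic):
using System;
using System.IO;

class Watcher
{
    static void Main()
    {
        var watcher = new FileSystemWatcher(@"c:\local_folder_to_watch");
        // Raised when a new file appears in the watched folder
        watcher.Created += (sender, e) =>
        {
            // e.FullPath is the new file; queue it here for SFTP upload with retries
            Console.WriteLine("New file: " + e.FullPath);
        };
        watcher.EnableRaisingEvents = true;
        Console.ReadLine(); // keep the process alive while watching
    }
}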
If you are looking for an out of the box solution, use the keepuptodate command in WinSCP scripting.
A simple example of WinSCP script (e.g. watch.txt):
open sftp://username:password@host/
keepuptodate c:\local_folder_to_watch /remote_folder
exit
Run the script like:
winscp.com /script=watch.txt
Though this works only if the uploaded files are preserved in the remote folder.
(I'm the author of WinSCP)

Publish Web does not include some dependency assemblies

In the past I have been using batch files to prepare release packages targeting different environments such as test, staging and production, and then copying the files to the Web site folders through various means. The batch files may run XmlPreProcess to alter Web.config for different environments.
Lately I have been trialing the Publish Web feature of VS 2012, after installing Web Deploy 3 on the server side. The result looks good for Hello World.
However, I have a WCF app: MyWcfApp.dll depends on MyWcfContracts.dll and MyWcfImplementation.dll, which depend on MyData.dll and MySql.Data.dll (yes, I am using MySQL). All these files appear in the build folder, say MyWcfApp\bin\Debug.
When running Publish Web, I got a warning: The database provider for this connection string, MySql.Data.MySqlClient, is not supported for incremental database publishing. Incremental database publishing is supported only for SqlClient as well as Entity Framework Code First models.
Also, the other dependent assemblies such as MySql.Data.dll did not get copied over to the server.
Apparently Publish Web does a lot of "smart" things by analyzing Web.config and making a lot of assumptions.
Question 1:
Is it a good idea to use Publish Web to deploy a WCF service?
Question 2:
Is it possible to run a pre-deployment script, say running XmlPreProcess, before the deployment so I could target different environments?
Question 3:
Is it possible to ask Publish Web not to analyze Web.config and just copy every assembly and file in the build folder?
For the specific issue (Question 3) of not copying over dependent-upon assemblies:
I am working with a small WCF Service Application that I am deploying to my local file system (then hosting the site in IIS) and had the problem of depended-upon assemblies not being copied to the local folder. The solution for me had two steps:
In the service's References, highlight each reference you need to be copied and hit F4. In the Properties window make sure 'Copy Local' is set to 'true' (see the project-file sketch below).
Right-click the service project and select Properties. Click the 'Package/Publish Web' section, and from the 'Items to deploy...' dropdown select 'All files in this project folder'.
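For reference, the 'Copy Local' setting from the first step corresponds to the Private metadata on the reference in the .csproj file; a rough sketch (the assembly name is just an example):
<Reference Include="MySql.Data">
  <Private>True</Private>  <!-- Copy Local = true -->
</Reference>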