Speed up downloading artifacts in Release Management - azure-pipelines-release-pipeline

I have a project in TFS whose build produces about 3k files, which we call artifacts.
Very frequently I get status output like this:
No download tasks have completed in 1 minutes. Remaining task statuses:
WaitingForActivation: 1 task(s).
1 downloads remaining.
1 downloads remaining.
1 downloads remaining.
...
No download tasks have completed in 6 minutes. Remaining task statuses:
WaitingForActivation: 1 task(s).
I've tried compressing the artifacts so that only a single file is downloaded, but I didn't have success with that.
Could the community give me a hint about how to speed up the artifact download? Is my idea of archiving during the build and unzipping during the release a good approach?
I tried that, but I wasn't able to download only the zip file that was created.

We have started using robocopy to download build artifacts in recent
versions of the vsts-agent. Download performance using robocopy should be
better than the performance with the v1 agent as well. In case upgrading
to TFS 2018 is not a viable option, please get the latest agent from here
and configure it against your TFS server:
https://github.com/Microsoft/vsts-agent/releases
If you are not using TFS 2018, I suggest you use the latest build agent, which uses robocopy and should give better download performance.
Another option is zipping each published artifact (or creating a NuGet package for it) and then unzipping it after the drop. You could use the Archive Files task or a third-party extension from the marketplace; a sketch of this approach is included at the end of this answer. Take a look at these two related questions:
TFS build v2 agent downloads artifacts slowly, v1 unaffected
VSTS agent very slow to download artifacts from local network share
Besides, Release Management previously downloaded, by default, all the artifacts published by the build definition that you selected. Now you are able to add a configuration option to a release definition to force agents to download only the artifacts that are required for task execution. This will also speed up your release pipeline. Note: this is only available in the on-premises version starting with TFS 2018 Update 2.
For details, take a look at this blog: Speed up your VSTS Releases by Partially Downloading Artifacts
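For reference, here is a minimal sketch of the archive/extract idea in YAML form, assuming a YAML build plus a deployment phase on a version that supports them; ArchiveFiles, PublishBuildArtifacts, and ExtractFiles are the standard in-box tasks, while the paths and artifact name below are placeholders:

# Build side: archive the ~3k output files into a single zip and publish only the zip.
steps:
- task: ArchiveFiles@2
  inputs:
    rootFolderOrFile: '$(Build.BinariesDirectory)'            # placeholder for your build output
    includeRootFolder: false
    archiveType: 'zip'
    archiveFile: '$(Build.ArtifactStagingDirectory)/drop.zip'
    replaceExistingArchive: true
- task: PublishBuildArtifacts@1
  inputs:
    PathtoPublish: '$(Build.ArtifactStagingDirectory)/drop.zip'
    ArtifactName: 'drop'

# Release side: the agent now downloads one zip instead of 3k small files,
# and the Extract Files task unpacks it before the deployment tasks run.
steps:
- task: ExtractFiles@1
  inputs:
    archiveFilePatterns: '$(System.DefaultWorkingDirectory)/drop/drop.zip'
    destinationFolder: '$(System.DefaultWorkingDirectory)/unzipped'
    cleanDestinationFolder: true

In a classic TFS release definition the Extract Files task is added through the designer rather than YAML, but its inputs are the same.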

Related

How to commit file .REAL_VERSION on PR and not trigger build with github actions?

We have a specific process we are looking to potentially port from CircleCI if this works:
1. Developer posts a PR.
2. CI job 'buildMyStuff' triggers off the PR (or any changes except to .REAL_VERSION).
3. CI job 'buildMyStuff' adds and commits .REAL_VERSION with the CircleCI build number (used in the git tag and by CD in the deploy job).
4. Here we want to prevent CI job 'buildMyStuff' from doing a recursive build as it sees the PR changed (because it pushed .REAL_VERSION).
5. Developer sees the build pass and 1 day later squash-merges into the master/main branch.
6. Now job 'deployMyStuff' runs and does a git tag using the contents of .REAL_VERSION so it can re-use the artifacts built by CI in step #2, since they are 100% guaranteed to be the same and do not need rebuilding (saving a ton of time and build credits). It also deploys to the staging environment.
NOTE on step #5: if the branch is not up to date with master, the developer has to click "Update to master", kicking off a new build again (you have to be up to date with master AND have CI passing before merge).
Now, in CircleCI, committing .REAL_VERSION triggers another build (i.e. step 4 above), and using their special [ci skip] does not work, since that results in skipping the next build AND THEN the deploy job too (i.e. steps 4 and 6).
Basically, we want a CI build to commit ONE file during a PR but not trigger any builds, by ignoring based on either:
the author of the commit (the circleci user, perhaps)
OR [ci skip 'job name'] in the git commit message
OR never building on changes to .REAL_VERSION
OR something else to prevent that 1 build
Can we do this with GitHub builds?
Regarding ignoring certain files in the trigger, you can scope this with the paths-ignore option:
on:
  push:
    branches:
      - main
    paths-ignore:
      - '.REAL_VERSION'
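Since the build in this scenario triggers off the pull request, the same filter is also available on the pull_request event; below is a minimal sketch, assuming 'buildMyStuff' is the CI job and main is the base branch (the build step itself is a placeholder):

on:
  pull_request:
    branches:
      - main
    paths-ignore:
      - '.REAL_VERSION'

jobs:
  buildMyStuff:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - run: echo "real build steps go here"   # placeholder

Note that for pull_request events the path filters are evaluated against all files changed in the pull request rather than just the latest push, so if the PR touches other files a new run can still start; GitHub Actions also honors [skip ci] in the head commit message of a pull request as an alternative.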

Can github actions automatically open web pages?

My requirement is to automatically open a web page every 5 hours (for example: www.xxx.com; I don't need to see the content of this page, just open it). The page takes 3 minutes to load, and then I just close the page.
Can I use Github Actions to achieve my requirement?
Yes, this is technically possible. GitHub Actions has the on.schedule event, which allows you to essentially set up a cron schedule for execution of your workflow. Your cron schedule for running the workflow every 5 hours would be something like 0 */5 * * * (see here).
If you just want to open the webpage for 3 minutes, your workflow run step could chain the URL-opening command (xdg-open www.example.com can be used to open a URL) with a sleep 180 using &&. I leave the actual composition of the workflow file as an exercise to the reader 🙂.
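That said, here is a minimal sketch of such a workflow, assuming a plain HTTP request is enough (curl is used here instead of xdg-open because the hosted runner has no desktop browser, and the URL is a placeholder):

name: ping-page
on:
  schedule:
    - cron: '0 */5 * * *'   # every 5 hours (UTC)

jobs:
  ping:
    runs-on: ubuntu-latest
    steps:
      - name: Request the page and wait about 3 minutes
        run: curl -s -o /dev/null "https://www.example.com" && sleep 180

Keep in mind that scheduled workflows run against the default branch, the cron is evaluated in UTC, and runs can be delayed during periods of high load.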

Windows UWP crashing when uploaded to Store

I'm having a major problem with my app Small Acorns Vegetarian Recipes when uploaded to the Windows Store.
When I test the app locally, then create an App Package and test by sideloading it, I cannot replicate the errors that are occurring in the live app from the Store. I have also downloaded the live app on 3 different Windows 10 devices and the app works without crashing.
Below is the crash report from Windows Dev Center, which records this error as a STOWED_EXCEPTION of type System.UnAuthorizedException. There are multiple errors, but they all look the same as the one below.
9NBLGGH2RQT4 Small Acorns 2016-W3 1/18/2016 12:00:00 AM e5fbe8d3-2fcc-9405-e339-795d8ec35826 "1 SmallAcorns_W10_7ffbdf7a0000 0xC179E8
2 SmallAcorns_W10_7ffbdf7a0000 0xFCD461
3 SmallAcorns_W10_7ffbdf7a0000 0xFF7263"
9NBLGGH2RQT4 Small Acorns 2016-W3 1/18/2016 12:00:00 AM 45df7950-ae20-259c-3f6e-4ec2a1758559 "1 SmallAcorns_W10_581b0000 0x98DC05
2 SmallAcorns_W10_581b0000 0x98DE13
3 SmallAcorns_W10_581b0000 0xC14E95
4 SmallAcorns_W10_581b0000 0xC14E5E
5 SmallAcorns_W10_581b0000 0xC14E49
6 SmallAcorns_W10_581b0000 0xC14BC3
7 SharedLibrary System::Runtime::ExceptionServices::ExceptionDispatchInfo.Throw 0x1C
8 SharedLibrary $13_System::Runtime::CompilerServices::TaskAwaiter.ThrowForNonSuccess 0x4A
9 SharedLibrary $13_System::Runtime::CompilerServices::TaskAwaiter.HandleNonSuccessAndDebuggerNotification 0x3C
10 SharedLibrary $13_System::Runtime::CompilerServices::TaskAwaiter.ValidateEnd 0x16
11 SharedLibrary $13_System::Runtime::CompilerServices::ConfiguredTaskAwaitable::ConfiguredTaskAwaiter.GetResult 0x9
12 SmallAcorns_W10_581b0000 0xC1548C
13 SharedLibrary System::Runtime::ExceptionServices::ExceptionDispatchInfo.Throw 0x1C
14 SharedLibrary $22_System::Threading::Tasks::ExceptionDispatchHelper::__c__DisplayClass0._ThrowAsync_b__3 0x19
15 SharedLibrary $13_System::Threading::WinRTSynchronizationContext::Invoker.InvokeCore 0x3C"
The health reports are next to useless, as it's clearly an async method failing, and I can no longer download the .cab files to test with WinDbg (like you could with Windows 8.1 apps).
I have absolutely no idea where this error is occurring in the app, as the app works on the multiple devices I've tested it on.
Has anyone managed to use WinDbg with a universal app? Any help is much appreciated as I am totally stuck with this issue.
If anyone has the time to test my app (it's hidden but available from the link above), that would be great. Thanks.
Does the problem happen only on your device?
For the debugging tools, you can use the updated ones for Windows 10 :)
https://dev.windows.com/en-us/downloads/windows-10-sdk
Debugging Tools for Windows 10
This is because of the ".NET Native tool chain" compilation and/or "code optimization" settings used when building in Release. Try turning them off one by one, or both at the same time, and start debugging in Release on a device. You won't be able to see all the info you get from a Debug build, but at least you'll be able to find where and why your app crashes.

Mercurial / hgweb : How to get latest revision description for each file on browse page?

I have a local repo, and I need to view the latest change description on a per-file basis via the web interface.
Look at these examples (a folder in the NetBeans sources):
1 - on the NetBeans native server
2 - on the Bitbucket server
I have "1", but I want "2" (where we can see the latest revision description for each file). So I tried to modify the hgweb templates (https://www.mercurial-scm.org/wiki/Theming) to get this functionality, but in the file list page template (manifest), the variables for getting the revision description are not accessible. Only the file name / size / permissions etc. are available.
What can I do?
You cannot do this by just modifying templates. You'd have to iterate over the files, look at the linkrev of the filelog revision referred to by the manifest, and get the changeset message from there.

Google AppsScript server error on getFiles()

I'm getting a server error on:
var contents = folder.getFiles(1501, 500);
It ran fine for the first three iterations: 0,500; 501,500; and 1001,500. There are over 3,000 files in the folder. I've clicked it many times over the last several hours. Would anything else cause the error?
Also, how are files added to a folder? If I add files, can I find the new ones by looking at the end of the list or at the beginning?
I just played with the 500, reducing it, and got it to run a few more times, up to 1,900 files in total, and then it stopped again. Is there some limit on the maximum number? It looks like no more than 1,999 files. Is this true? I guess I have to split the directory.