Compare File Versions With Batch Script - ms-access

I've created an Access database to be shared across the entire department, split into a front end and a back end. Unfortunately, I can't figure out an easy way to ensure that all users are consistently running the newest version of the front end on their local machines as I add requested updates.
To overcome this, I created an install batch script that creates a shortcut on their desktop, as well as placing the front end and an "update" batch script in a custom folder on their PC. The shortcut actually points to the "update" batch script, which downloads the newest version of the front end (overwriting the existing one) and then opens it.
Ideally, this would not download the front end every time, but only when the version on the network is newer than the one on the local machine. Unfortunately, I can't figure out how to do this with an .accdb file (though I've seen information for executable files). We are using Access 2010 with the Access 2007 file format. I still haven't figured out how to attach a version number to the front end itself, but I'm open to including a text file alongside it simply to store that version number. Any suggestions?
Below is the script I currently have for the update file.
@ECHO OFF
CLS
REM Pull the latest front end and this update script from the network share
XCOPY "\\NetworkPath\Install\*.accdb" C:\Reserved\Database /y /q
XCOPY "\\NetworkPath\Install\Update.bat" C:\Reserved\Database /y /q
CLS
ECHO Starting database...
START "" "C:\Reserved\Database\FrontEnd.accdb"

I've done the exact same thing, and solved the problem of only re-downloading the frontend when it has changed by using the xcopy command with the /d switch:
xcopy /yqd \\network\frontend.accdb frontend.accdb
Xcopy reference: http://www.microsoft.com/resources/documentation/windows/xp/all/proddocs/en-us/xcopy.mspx?mfr=true
That works, but it leaves a small gap in the logic: if someone is using their local copy of the frontend while you push a new version to the network, and they then exit the frontend and run the script again, it won't download the new version, because the user's local copy will have a later modification time.
To overcome this, I actually make a copy of the local frontend and start that copy from the script, instead of starting the downloaded one. That way the downloaded copy keeps its original modification time and xcopy's time check works correctly. You do have to train your users to ignore the local copies of the accdb file and only use the script.
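Putting that together, the update script from the question might look like this under that approach (the FrontEnd-local.accdb working-copy name is an assumption, and the sketch assumes the install script already created the local folder):
@ECHO OFF
REM /d copies only when the network file is newer than the local one
XCOPY "\\NetworkPath\Install\FrontEnd.accdb" C:\Reserved\Database /y /q /d
REM Launch a working copy so the downloaded file keeps its modification time
COPY /Y "C:\Reserved\Database\FrontEnd.accdb" "C:\Reserved\Database\FrontEnd-local.accdb"
START "" "C:\Reserved\Database\FrontEnd-local.accdb"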

Related

Publishing an Intranet page uses another user's local publish path for the changed file

Object reference not set to an instance of an object.
When I publish an Intranet page update, I get an error that shows a user's local publish path for one of the files instead of the web path. It isn't even MY local publish path.
When I publish to my local publish location, the changed file doesn't even publish/update locally.
When I run the project, it seems to work the first time (meaning no error), but when I perform that same task a subsequent time, I get the Object reference error.
How can I fix the project/solution in order to hit the file/code I've changed?
We are using C#, ASP.NET Core, Visual Studio 2019, on Windows 10.
I've tried cleaning and rebuilding, deleting upon build, restarting Visual Studio, restarting my system, getting latest, and undoing checkout and starting over.
The cause: I was replacing just the one .aspx file I had made code changes to.
You must replace the entire set of project files after Publish, not just the select files you (think) you've worked on; changes are made to files behind and beyond the targeted file(s).
Publish locally, back up the Prod files, copy the local files to the Prod server, voila!

Sharing a downloaded file between steps on GitHub Actions

I am downloading a file from AWS S3 in a github action. In the next step (same job) I am trying to edit the file. Sometimes the file is still there, and sometimes it isn't.
Each step runs a bash script, and I check at the end of the first step that the file exists. The file is being downloaded to the $HOME directory, so the path to the file is /home/runner/my-file.json
Where should I download the file to, to guarantee that it is still there on the next step?
Just to close this off: files downloaded to $HOME are persisted between steps in the same job. I finally realised that the next step was trying to edit the file twice concurrently (I'm using some lerna scripts), and that is why it was sometimes reported as empty.
Unfortunately, you can't keep this file after the workflow run finishes, so to persist it you should push the file to the repo the action is running on, which can be done through git commands.
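A minimal sketch of the git commands such a step might run (the file name and the bot identity below are assumptions):
# Copy the file from $HOME into the checked-out repository first
cp "$HOME/my-file.json" .
git config user.name "github-actions[bot]"
git config user.email "github-actions[bot]@users.noreply.github.com"
git add my-file.json
git commit -m "Add file downloaded from S3"
git push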

Need to pass an SSIS variable to the Execute Process Task to make a dynamic command line argument for WinSCP

I have a folder 'DATA' at an SFTP location from which I need to download a set of files to a common location, and then copy the respective files to different folder locations.
File Names are:
Test1.csv
Test2.csv
Test3.csv
Test4.csv
Test5.csv
I want the files to be downloaded first to the location below:
G:\USER_DATA\USER_USER_SYNC\Download
These files belong to different schemas and have to be processed separately by different SSIS packages for further transformation and loading.
For various reasons we have to keep them at a common location first and then move or copy them afterwards.
Here's my command line argument.
/log=G:\USER_DATA\USER_USER_SYNC\SFTP_LOG\user_sync_winscp.log /command "open sftp://username:password@stransfer.host.com/ -hostkey=""ssh-rsa 2048 9b:63:5e:c4:26:bb:35:0d:49:e6:74:5e:5a:48:c0:8a""" "get /DATA/Test1.csv G:\USER_DATA\USER_USER_SYNC\Download\" "exit"
Using the above, I am able to download one file at a time.
Since I first need the files at a common folder location, I am planning to add another Execute Process Task to copy the files:
/C copy /b G:\USER_DATA\USER_USER_SYNC\Download\Test1.csv G:\USER_DATA\USER_USER_SYNC\Testing1
/C copy /b G:\USER_DATA\USER_USER_SYNC\Download\Test2.csv G:\USER_DATA\USER_USER_SYNC\Testing2
and so on...
I am looking for a way to download all the available files to a common folder location and then move or copy them to the different folder locations.
I have since changed the design and followed a new approach. Thanks to Martin for fixing the SFTP-related issues and for his continuous support.
The new SSIS package has the following tasks:
Step 1. It looks for the latest updated files on the SFTP server and downloads the given files Test1.csv and Test2.csv to G:\USER_DATA\USER_USER_SYNC\Download\
Here are my command line arguments:
/log=G:\USER_DATA\USER_USER_SYNC\SFTP_LOG\user_sync_winscp.log /command "open sftp://bisftp:*UFVy2u6jnJ]#hU0Zer5AjvDU4#K3m@stransfer.host.com/ -hostkey=""ssh-rsa 2048 9b:63:5e:c4:26:bb:35:0d:49:e6:74:5e:5a:48:c0:8a""" "cd /DATA" "get -filemask=""*>=today"" Test1.csv Test2.csv G:\USER_DATA\USER_USER_SYNC\Download\" "exit"
Step 2. My requirement was to further copy each file to a different folder location, so that the respective process can pick up the corresponding file and start transforming and loading it into SQL Server.
This step executes the Windows cmd process and copies Test1.csv to the new location:
G:\USER_DATA\USER_USER_SYNC\Testing1
Command line arguments:
/C copy /b G:\USER_DATA\USER_USER_SYNC\Download\Test1.csv G:\USER_DATA\USER_USER_SYNC\Testing1
Likewise, I have another Execute Process Task to copy Test2.csv to the new location:
G:\USER_DATA\USER_USER_SYNC\Testing2
Command line arguments:
/C copy /b G:\USER_DATA\USER_USER_SYNC\Download\Test2.csv G:\USER_DATA\USER_USER_SYNC\Testing2
The given solution works fine; however, there are a couple of things that still need to be handled.
Since I am downloading only the latest files using -filemask="*>=today", everything runs fine as long as the Execute Process Task finds the latest files on the SFTP server. If they are not there, the next Execute Process Task fails with the error below.
The process exit code was "1" while the expected was "0".
What I understand is that it is failing because it has nothing to copy or move.
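For what it's worth, the copy can be guarded so that cmd still exits with 0 when there is nothing to copy (a sketch, untested):
/C if exist G:\USER_DATA\USER_USER_SYNC\Download\Test1.csv (copy /b G:\USER_DATA\USER_USER_SYNC\Download\Test1.csv G:\USER_DATA\USER_USER_SYNC\Testing1)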
Is there any way to capture the exit code returned from the first Execute Process Task and store it in a variable, so that an expression can decide whether to start the next task?
Second, as you can see, I am using two Execute Process Tasks to copy files from one location to another. Can we combine these two commands into one Execute Process Task?
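For instance, both copies might be chained into a single cmd invocation (a sketch, untested; && stops at the first failing copy, while a single & would run both regardless):
/C copy /b G:\USER_DATA\USER_USER_SYNC\Download\Test1.csv G:\USER_DATA\USER_USER_SYNC\Testing1 && copy /b G:\USER_DATA\USER_USER_SYNC\Download\Test2.csv G:\USER_DATA\USER_USER_SYNC\Testing2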
Any suggestions are most welcome; I also think this issue needs to be addressed as a separate question.

How to run/include a batch file in an InstallShield setup

I have created a setup using InstallShield and everything is working fine. Now I have a batch file and want to run it with the setup. I know we can create a custom action, and I have already created a custom action to run a PowerShell script, which is working fine.
Can anyone help/guide me on which custom action to use to execute the batch file?
I also want to run a MySQL script from the InstallShield setup.
What I have tried:
I have tried creating different custom actions, but I don't know exactly which custom action is used to execute a batch file.
ASSUMING AN MSI INSTALLATION:
In your MSI Installation Designer pane, click on "Custom Actions and Sequences"; in the top part of the middle pane right click on "Custom Actions" and choose "New EXE -> Stored in Binary Table". Give it a name (and description, if you like). When it saves, right-click it and start the Custom Action Wizard.
The Action Type should be "Launch an executable" and the Location should be "Stored in the Directory Table".
For the Action Parameters "Source", choose the directory you want to start in. For "Target", enter a command to invoke your script, like cmd /c .\RunMyScript.bat arg1 arg2 ... (assuming the script is in the directory you started in). If the script is in a different folder, you can put one of your directory variables in brackets: cmd /c [INSTALLDIR]bin\script.bat. Typically the directory variable will already include a trailing backslash; using these variables with the bracket syntax helps make sure the action works even if the user chooses a different installation folder.
If the script is in a folder that is not a required part of the installation, you may need to make your command be something like cmd /c if exist .\script.bat .\script.bat - so that the custom action does not fail if the feature containing your script is not selected for install (or is removed when an installation is modified.)
I have typically wanted execution to be synchronous (install waits until script finishes before moving on); if your script does not return a reliable exit code, or if you don't want the installation to abort if the script fails, choose the one with "(Ignores error code)".
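As an aside, a batch script can make its exit code explicit, so that a synchronous custom action has something reliable to act on; a minimal sketch (myprogram.exe is a placeholder):
@echo off
REM The real work of the post-install step goes here
myprogram.exe /quiet
REM Return an explicit code: 0 = success, nonzero aborts the install
if errorlevel 1 exit /b 1
exit /b 0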
The custom action should typically be in the InstallExec sequence, after files are installed (but a script you run during an uninstall or repair could run earlier or in a different sequence.)

Get the latest updated file from FTP Folder

Kindly see this screencast to get a better idea of our requirement:
https://www.screenr.com/QmDN
We want to automate the text datasource generation and its connection to MS Excel, to make it easier for the end user to connect the text datasource (CSV) to MS Excel so that they can generate their own reports.
The steps I have in mind:
Use WinSCP FTP Client with Scripting
Write script to get the most recent updated file from FTP Folder
Or instead of step 2, download all generated files from FTP to a Shared Folder on the Network.
Get the most recent version of the Generated CSV File
Rename the file to the Standard Naming Convention. This must be the name used in MS Excel as the CSV Text Datasource.
Delete all other files
I developed a sample script that WinSCP can use to download the files from the FTP folder:
# Automatically abort script on errors
option batch abort
# Disable overwrite confirmations that conflict with the previous
option confirm off
# Connect
open CSOD
# Change remote directory
cd /Reports/CAD
# Force binary mode transfer
option transfer binary
# Download file to the local directory d:\
#get "Training Attendance Data - Tarek_22_10_21_2014_05_05.CSV" "D:\MyData\Business\Talent Management System\Reports\WinCSP\"
get "*.CSV" "D:\MyData\Business\Talent Management System\Reports\WinCSP\Files\"
# Disconnect
close
exit
Then, I can schedule the above code to run periodically using this command:
winscp.com /script=example.txt
The above sample is working fine, but the main problem is how to identify the most recent file, so that I can rename it and delete all the other files.
Appreciate your help.
Tarek
Just add the -latest switch to the get command:
get -latest "*.CSV" "D:\MyData\Business\Talent Management System\Reports\WinCSP\Files\"
For more details, see the WinSCP article Downloading the most recent file.
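Folded into the scheduled command above, the whole download could then be a single winscp.com call (a sketch based on the script in the question; the stored session CSOD and the paths come from there, and the doubled quotes are needed because the local path contains spaces):
winscp.com /command ^
    "option batch abort" ^
    "option confirm off" ^
    "open CSOD" ^
    "cd /Reports/CAD" ^
    "option transfer binary" ^
    "get -latest *.CSV ""D:\MyData\Business\Talent Management System\Reports\WinCSP\Files\""" ^
    "exit"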
You don't specify the language you use, so here is a Ruby script that downloads the most recent file from an FTP path, just to demonstrate how easily and tersely this can be done with a scripting language like Ruby.
require 'net/ftp'

Net::FTP.open('url of ftpsite') do |ftp|
  ftp.login("username", "password")
  path = "/private/transfer/*.*"
  # file[55..-1] gives the filename part of the string returned by ftp.list
  most_recent_file = ftp.list(path)[2..-1].sort_by { |file| ftp.mtime(file[55..-1]) }.reverse.first[55..-1]
  puts "downloading #{most_recent_file}"
  ftp.getbinaryfile(most_recent_file, File.basename(most_recent_file))
  puts "done"
end