I thought I could just copy it into .ebextensions and then go
config= require("../.ebextensions/config.json")
in my lib/server.js, but not so.
Do I have to subscribe to some native config game or can I just have my secret files copied over with
eb deploy
To add a file on top of the files tracked in git, I have done the following. This deploys whatever is on the file system, so it should probably be improved to fetch a branch from git and zip that up instead (a sketch of that follows after the config.yml snippet):
rm package.zip
zip package.zip `echo config/config.json; git ls-tree --full-tree -r HEAD | awk '{print $4}'`
eb deploy
and append to .elasticbeanstalk/config.yml
deploy:
artifact: package.zip
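A sketch of that improvement (untested; mybranch is a placeholder for whatever branch you deploy): build the zip from what is committed on the branch with git archive, then add the untracked config file on top.
rm -f package.zip
git archive --format=zip -o package.zip mybranch
zip package.zip config/config.json
eb deploy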
Assuming you meant eb deploy, make sure you committed these config files.
Is there any way to create two applications in OpenShift that use the same git repo (though perhaps different branches)?
I am basically looking for a super simple way to create one "experimental" or "dev" application and one production one.
Thanks!
This blog post on release management covers the topic in more detail:
https://blog.openshift.com/release-management-in-the-cloud/
Here's a quick summary of the process...
# setup
cd LOCAL_APP_DIRECTORY
rhc app create STG_APP_NAME CARTRIDGE_TYPE
rhc app create --no-git PROD_APP_NAME CARTRIDGE_TYPE
git remote add production -m master PROD_GIT_URL
git push -f production master
git remote rename origin staging
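# optional sanity check that staging and production point where you expect
git remote -v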
# deployment
git push staging # or simply, git push
git push production
# emergency rollbacks:
git log # return commit history, example hash: 28c5555352a902c549c965da30cf7559c80f328e
git push staging 28c5555352a902c549c965da30cf7559c80f328e:master
git push production 28c5555352a902c549c965da30cf7559c80f328e:master
I am working with Git repositories in the following way:
I have the master repository and several remotes on the different production machines.
I am pushing the production code to the remotes and restarting the services for the changes to take effect.
I am about to switch from Git to Mercurial and I would like to know in advance how I can achieve something like that.
You add entries to the [paths] section of your local clone's .hg/hgrc file, for example:
[paths]
remote1 = http://path/to/remote1
remote2 = http://path/to/remote2
You can then use commands like hg push remote1 to send changesets to that repo. If you want that remote repo to update its working directory, you'd need to put a changegroup hook in place at that remote location that does an update. That would look something like:
[hooks]
changegroup = hg update > /dev/null 2>&1 && path/to/script/restart-server.sh
Not everyone is a big fan of having remote repos automatically update their working directories on push, and it's certainly not the default.
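For the workflow in the question (push to several production machines and let each restart itself), a small wrapper along these lines could push everywhere in one go; remote1 and remote2 are just the names from the [paths] example above:
for remote in remote1 remote2; do
    hg push "$remote"   # each remote's changegroup hook then updates and restarts
done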
If you want to add a default path, set default in your project's .hg/hgrc file, as follows:
[paths]
default = https://path/to/your/repo
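With a default path in place, plain hg pull and hg push need no argument; for example:
hg pull    # pulls from https://path/to/your/repo
hg push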
Good Luck.
You could have a look at the hg-git plugin:
adding the ability to push to and pull from a Git server repository from Mercurial.
This means you can collaborate on Git based projects from Mercurial, or use a Git server as a collaboration point for a team with developers using both Git and Mercurial.
Note: I haven't tested that tool with the latest versions of Mercurial.
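If you do try it, the usual setup (from memory; details may differ between hg-git versions) is to enable the extension in your ~/.hgrc:
[extensions]
hggit =
Then clone or push using a Git URL (the repository below is a placeholder):
hg clone git://github.com/user/repo.git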
If you're on Unix and you have Git installed, you can use this bash function to readily add a path to the remotes without a text editor:
add-hg-path() {
    # use git config's INI writer to append the entry to the repo's hgrc
    git config -f "$(hg root)/.hg/hgrc" --add "paths.$1" "$2"
    # normalize whitespace (git config indents entries with a tab, which hgrc doesn't need)
    awk '{$1=$1}1' "$(hg root)/.hg/hgrc" > /tmp/hgrc.tmp
    mv /tmp/hgrc.tmp "$(hg root)/.hg/hgrc"
}
Then invoke it with:
$ add-hg-path remote1 https://path.to/remote1
If someone would like to build a PowerShell equivalent, I'd like to include that as well. Other potential improvements include error checking on the parameters and factoring out the call to $(hg root).
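An untested sketch of those improvements, checking the arguments and calling hg root only once:
add-hg-path() {
    if [ "$#" -ne 2 ]; then
        echo "usage: add-hg-path NAME URL" >&2
        return 1
    fi
    local hgrc
    hgrc="$(hg root)/.hg/hgrc" || return 1
    git config -f "$hgrc" --add "paths.$1" "$2"
    awk '{$1=$1}1' "$hgrc" > /tmp/hgrc.tmp && mv /tmp/hgrc.tmp "$hgrc"
}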
I am trying to use Hudson to do CI on a CVS repository but I receive the following error when I try to run the build:
Started by user admin
[Pilot1] $ "C:\Program Files (x86)\CVS Suite\CVSNT\cvs.exe" -Q -z3 -d :sserver:login:_server:/CVSRepo co -P -d workspace -D "Thursday, March 3, 2011 2:20:08 PM UTC" ITitC/
cvs checkout: in directory .:
cvs checkout: cannot open CVS/Entries for reading: No such file or directory
java.io.IOException: No such directory exists. Did you specify the correct branch? Perhaps you specified a tag: c:\ path\workspace
at hudson.scm.CVSSCM.archive(CVSSCM.java:474)
at hudson.scm.CVSSCM.access$100(CVSSCM.java:123)
at hudson.scm.CVSSCM$1.invoke(CVSSCM.java:381)
at hudson.scm.CVSSCM$1.invoke(CVSSCM.java:374)
at hudson.FilePath.act(FilePath.java:753)
I am able to run the command successfully if I remove "ITitC" (the module) from the end and run it directly through the cmd prompt. I was also able to quickly create a folder with that name inside the "workspace" directory at the start of the Hudson build, but it is deleted with each new build.
Is there a way I can force Hudson or CVSNT to create folders as needed either before each build or by default? Is this a problem with CVSNT? I'm not too attached to CVSNT and am willing to replace it with a better option.
We encountered the same issue with CVS 1.11.xx and were not able to solve it. As a result, we moved to CVS 1.12.xx and Hudson works fine. It seems this issue is specific to some CVS server builds and is not related to the CVS client.
I encountered a similar issue
[workspace] $ cvs.exe -Q -z3 -d :pserver:user#server01:/cvsrepo/projectrepo co -P -N -d . -D "Thursday, June 13, 2013 9:24:00 PM UTC" Module1 Module2
cvs.exe checkout: cannot open CVS/Entries for reading: No such file or directory
With only one module specified, the checkout works and the contents of the module sit directly beneath the workspace folder created by Hudson (no Module1 folder is created).
For multiple modules, the checkout only works if a directory name is specified with -d. The error occurs when Hudson's default of . is used.
[workspace] $ cvs.exe -Q -z3 -d :pserver:user#server01:/cvsrepo/projectrepo co -P -N -d customDir -D "Thursday, June 13, 2013 9:53:46 PM UTC" Module1 Module2
$ computing changelog
Finished: SUCCESS
Specifying the customDir creates a customDir folder beneath workspace with Module1 and Module2 folders below it.
This behavior happens on the command line as well as inside Hudson, so it must be a feature of CVS.
Have just started using git. Building and installing it was easy. Then I went into the directory of one of my web projects and added a git repo to it.
$ cd ~/Sites/webapp
$ git init (and so on)
I also set up gitweb, and when I added ~/Sites/webapp to the $projectroot setting in gitweb.cgi, it appeared in my browser when I went to http://localhost/gitweb/gitweb.cgi
My question is thus -- from what I understand, git doesn't have a central repo concept. Every project that I may be working on will have its own git repository. Since my projects are all over my hard disk, their respective repos are also all over the hard disk. How do I add multiple repositories to gitweb? Is there some kind of central registry of all my repos? Should I really rejig how I work, and move all my projects to a central directory? How is this done?
Better late than never, I guess.
I solved it by creating a project root directory and symlinking the git repositories into it.
In gitweb's configuration, point the project root at that directory:
$projectroot = "/opt/gitweb";
Then, in /opt/gitweb:
ln -s ~/Sites/webapp webapp.git
ln -s ~/someotherplace/whereitis/application application.git
I do this by creating an empty repository and linking to the repositories I want to browse. This workaround is necessary because of the complicated way instaweb runs gitweb and sets the project root.
git init instaweb
cd instaweb
ln -s ~/projects/gitproj1 gitproj1
ln -s ~/projects/gitproj2 gitproj2
git instaweb --httpd webrick
The server is up and running now and the homepage will list a .git project (which is the empty repository you just initialised) along with the two actual projects you linked to.
I am working on a system that performs continuous integration and I am looking for a method I can use to get the most recent changeset from a Mercurial repository without creating a repository locally.
I have considered using clone, but that creates a local working directory. Since this will be occurring on a build server, I would prefer not to do that because of the .hg directory and the diffs it contains; all I want is essentially an export of the files at the tip revision and nothing more.
This request may not even be possible, and it's very likely that I just do not understand DVCS very well. However, if I cannot do what I want to do, is there a workaround?
It's possible using 'hg archive', depending on how your remote repository is set up.
If it's available over HTTP using hgweb.cgi or hg serve, you can hit the archive link programmatically to get the files you want. Example:
wget https://www.mercurial-scm.org/repo/hg/archive/tip.tar.gz --output-document=- | tar xz
Or, if it's available over SSH:
ssh you@there.com hg archive --type=tgz - | tar xz
You can use:
$ hg clone http://your_repo repo
$ cd repo
$ hg archive ../export/
$ cd ..
$ rm -rf repo
$ cd export
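If the working directory is the main concern, an untested variant of the above clones without checking one out (-U) and archives straight from that clone:
$ hg clone -U http://your_repo tmp-clone
$ hg archive -R tmp-clone -r tip export/
$ rm -rf tmp-clone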
From Mercurial's help files:
$ hg help archive
hg archive [OPTION]... DEST
create an unversioned archive of a repository revision
You can use:
http://merc/raw-file
to retrieve a list of files in the repository or
http://merc/raw-file/filename
to get a specific file.
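For example, fetching a single file at the tip revision might look like this (the host and file path are placeholders, and hgweb usually expects the revision, e.g. tip, in the raw-file URL):
wget http://merc/raw-file/tip/path/to/file --output-document=file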