I have installed Git, Node.js and npm on my machine and have successfully been able to run a progressive web app in Chrome through localhost. Now what about when I want to run this web app on a public server? Will I have to install Git, Node.js and npm on my web hosting account? Or are these components generally already installed on web hosting servers (for example, hosts that provide a control panel like cPanel)?
By the way, would you be able to recommend any good FTP application for Mac that can upload zillions of files easily (not 1 by 1)?
You will need Node on the host server, but not NPM or Git (although NPM will be there by default when you install Node).
Typically what you want is to set up a "Continuous Integration / Continuous Delivery" (CI/CD) platform. You can use free options like Travis or Jenkins, or you can just use a shell script and run it through AWS Lambda or something like that.
Very simplified version:
You push code to Git
CI/CD detects the code check-in (polls) and pulls latest from Git on an Agent
CI/CD runs a "build" on the Agent, which for Node is at least npm install but can include Grunt, Gulp, Webpack, or a host of other useful steps (a rough sketch follows this list).
CI/CD publishes the result of the build to a target server.
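As a very rough sketch, steps 3 and 4 together could be a shell script along these lines (the repo URL, server name, and npm scripts are placeholders, not tied to any particular CI tool):

$ git clone https://your-git-server/your-app.git && cd your-app
$ npm install                 # restore dependencies
$ npm test                    # run the test suite; a non-zero exit fails the build
$ npm run build               # e.g., a Webpack/Gulp bundling step defined in package.json
$ scp -r dist/ deploy@production-server:/var/www/your-app   # publish the build output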
Here you have five machines involved:
Git server
Your local dev box
CI/CD server
CI/CD agent
Production server
Hope this helps you get started in the right direction.
Since the last DreamHost update, which banned VPS users from root/administrator access, I am wondering how I can install Gulp on my hosting, as it looks like admin access is needed to install it, which DreamHost clearly refuses to give.
So the problem is that there are several tasks I need to accomplish on my server via Gulp.
Is there any alternative/solution to this problem (other than changing the hosting service :))?
I'm fed up with not being able to deploy my production code to the production environment.
Thanks.
DreamHost VPS are managed servers; they try to strike a balance between giving customers more computing power and isolation (as opposed to shared hosting) while taking some of the management burden off customers' hands. This results in a compromise where you don't have to care about the underlying OS and basic services at all and can focus on your applications... but you're somewhat limited in what you can install on the machine.
The good news is that you can manage npm packages as an unprivileged user. While I am not a node expert, I have managed to upgrade npm and install gulp and all dependencies on my local user.
I first created a new dir for my version of npm (mkdir ~/npm) and then set the npm prefix to it:
$ npm config set prefix ~/npm
Then it's time to install a newer version of npm in that dir:
$ npm install -g npm
After that's done, add the new prefix to the $PATH variable in your .bash_profile (and export it in the current session too): export PATH="$HOME/npm/bin:$PATH". Then confirm with which npm that the npm command bash will execute is /home/$YOUR_VPS_USER/npm/bin/npm.
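For example (assuming bash and the ~/npm prefix set above; the home directory path depends on your VPS username):

$ echo 'export PATH="$HOME/npm/bin:$PATH"' >> ~/.bash_profile
$ source ~/.bash_profile
$ which npm
/home/$YOUR_VPS_USER/npm/bin/npm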
Now you can install the rest:
npm install -g bower
npm install -g grunt-cli
npm install -g gulp
npm install -g yo
If you need more power and accept the responsibilities of running an unmanaged server, though, DreamHost Cloud is a better fit for you.
I have a working app on an OpenShift server. My question is: how do I update OpenShift's Git repo for my application if I make some changes using SSH access to OpenShift? I mean without using all this stuff with pull/push to my local machine.
If I understand you correctly, you would like to modify source code without using git. I am not sure why you would want that. All that stuff with pull/push gives you version-control flexibility which can save you a lot of time when you screw something up. For example, you push a brand new UI to production which turns out to be buggy. With git, you have the flexibility to revert back to the previous version and work on a different branch to fix the bug in the UI.
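For example, rolling back that buggy UI push is only a couple of commands (the commit hash is illustrative):

$ git log --oneline     # find the commit that introduced the buggy UI
$ git revert abc1234    # create a new commit that undoes it
$ git push              # push the fix, triggering a fresh deploy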
OpenShift follows a conventional app structure: Git for source control, Maven for builds, JBoss EAP (for example) for the app server, Jenkins for continuous integration, etc. So when you push using Git, OpenShift will automatically build using Maven, then deploy to the server.
If you would like to disregard all the advantages that OpenShift has to offer, use rhc ssh appname to work directly on the server.
I'm creating a yeoman generator for my web projects.
But I wonder how I can try and test my changes before publishing them?
Since I have installed it once, it will not run my local development version; instead it runs the installed version.
Any suggestions on how I can test-run my local development version?
I finally found some information on how to accomplish this:
if you wish to develop on the generators code base, and debug locally, a common way to do so is to rely on npm link
git clone the generators repo locally
cd into that repository and run npm link. It'll install required dependencies and install the package globally, using a symbolic link to your local version.
If you want to install sub generators, you need to do so in the context of a yeoman-generator package linked earlier. Cd into the sub generators package you have cloned locally and run npm link.
We now have everything linked and known on the system; we now need to link the sub-generator repo into the parent one, the yeoman-generator repo cloned and linked in steps 1 & 2.
https://github.com/yeoman/generator/wiki/Testing-generators
EDIT:
Updated link for info: https://yeoman.io/authoring/index.html
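Put together, the linking flow from those steps looks roughly like this (the sub-generator name is hypothetical):

$ git clone https://github.com/yeoman/generator.git
$ cd generator && npm link              # steps 1 & 2: deps installed, package symlinked globally
$ cd ../generator-mysubgen && npm link  # same for your locally cloned sub-generator
$ cd ../generator
$ npm link generator-mysubgen           # link the sub-generator into the parent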
If by "running locally" you mean the ability to test your generator and its flow you can simply do this.
In your project directory folder run npm link. If this passes in flying colors, go to step 2.
Open a terminal and cd into the folder you wish to initiate a project.
Run yo generator-theNameOfYourGenerator. This will run your generator.
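In shell terms it is just this (the generator name is made up):

$ cd ~/projects/generator-mygen      # your generator's source folder
$ npm link
$ mkdir ~/generator-test && cd ~/generator-test
$ yo mygen                           # note: without the generator- prefix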
I have a Mercurial repository (on Bitbucket) with some code (Java) and I want to do CI builds on a cloud-based Jenkins server (at Jelastic, running on CentOS). My problem is that I haven't been able to do a proper installation of Mercurial on the Jenkins server.
The Jenkins build fails with the following message:
ERROR: Failed to clone https://bitbucket_jenkins_user:some_password@bitbucket.org/repo_owner/my_repository because hg could not be found; check that you've properly configured your Mercurial installation
Setup information
It's a private Mercurial repository, hosted at Bitbucket
In Bitbucket I have set up a Service to trigger the Jenkins build, after a Push has happened
I have defined a specific bitbucket jenkins user in my Mercurial repository, it has only read rights and it logs in using simple https authentication
Jenkins runs on a Tomcat 7, hosted in a Jelastic cloud environment, on CentOS 6
The Mercurial plugin was installed through the Jenkins interface, by Manage plugins
The build is configured as being triggered remotely (by the service defined in Bitbucket)
Build results are the same when started manually and when triggered from a push to the repository
When I first did this I was under the impression that installing the Mercurial plugin in Jenkins would be enough, that it would also install the needed Mercurial binaries to be able to connect to the repository and get the code. I have realized that I was wrong and that on the Manage Jenkins / Configure System page I need to specify my Mercurial installation.
Questions
Is it possible to create a Mercurial installation without SSH access and without running a "yum install mercurial"?
In the Jenkins interface, what can I specify when choosing the "Install Automatically" option?
When defining an installer, I have experimented with the "Extract zip/tar.gz" option, but what can I write as the "Download URL for binary archive"?
Jenkins also offers an installer option of "Run command". What kind of commands could that be, maybe a "yum install ..." or "rpm ..."?
Since my server is cloud-based, getting SSH access is a paid add-on which I would prefer to avoid. But if that is my only option I will of course do it, thereby getting access to run commands on the server. However, running "yum install mercurial" on CentOS seems to give only the 1.4 version of Mercurial. The current version as I write this is 2.6.3; would I need to download the sources and compile it myself, or is it possible to get that as a binary for CentOS somewhere?
The Mercurial Plugin page has a section on how to use the Auto Installation options to install Mercurial using ArchLinux packages.
"The plugin supports generic tool auto-installation methods for your Mercurial installation, though it does not publish a catalog of Mercurial versions. For users of Linux machines (with Python preinstalled), you can use ArchLinux packages. For example, in /configure under Mercurial installations, add a Mercurial installation with whatever Name you like, Executable = INSTALLATION/bin/hg, Install automatically, Run Command, Label = linux (if desired to limit this to slaves configured with the same label), Command = [ -d usr ] || wget -q -O - http://www.archlinux.org/packages/extra/i686/mercurial/download/ | xzcat | tar xvf - (or …/x86_64/… for 64-bit slaves), Tool Home = usr, and configure a job with this installation tied to a Linux slave."
See https://wiki.jenkins-ci.org/display/JENKINS/Mercurial+Plugin
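For reference, that quoted download command, run by hand on a 64-bit slave, would be the following (it unpacks the ArchLinux Mercurial package into ./usr with no root needed; as the quote notes, Python must already be present for hg to run):

$ [ -d usr ] || wget -q -O - http://www.archlinux.org/packages/extra/x86_64/mercurial/download/ | xzcat | tar xvf -
$ ./usr/bin/hg --version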
I have a job in Hudson server A which builds an artifact and deploys it to Nexus. I have another job in a completely separate Hudson server B which needs to download the artifact and deploy it. This job is normally run manually, and the person running it needs to indicate which version of the artifact to deploy - they may not always want to deploy the latest version (e.g. to roll back to a previous known good version).
Currently, I achieve this by using a parameterized build, and require the user to pass in the artifact version number; the job then uses the Execute shell build step to run wget on a URL constructed using the parameter. This is error prone.
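For example, the Execute shell step probably amounts to something like this, with the Nexus host, group, and artifact IDs as placeholders following the standard Maven repository layout, and VERSION being the build parameter:

$ wget "http://nexus.example.com/content/repositories/releases/com/example/myapp/${VERSION}/myapp-${VERSION}.war"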
Ideally I'd like a plugin that lets the user browse the artifact versions in the Nexus repository and pick and choose the one to deploy, but I'm open to other suggestions. A plugin that also handles the download would be nice, but I can live without it as long as I can still get a string that I can use in shell commands.
I've looked through the available Hudson & Jenkins plugins around Maven-style artifact repositories, but they all seem more concerned with pushing artifacts into repos than with getting them back down.
I'm using Hudson's "Copy Artifact" in other jobs, to get artifacts from other Hudson jobs on the same server, but this doesn't work across different Hudson servers, which is why I've turned to Nexus (which we're already using anyway).
Does anyone have any suggestions?
I recommend using Rundeck to execute your deployments.
There is a Rundeck plugin for Nexus that enables Rundeck to display a pull-down menu of the versions available in Nexus.
There is a Rundeck plugin for Jenkins that can be used to invoke deployments via Rundeck and kick off post-deployment jobs (like integration testing) in Jenkins.