I just erased a hard drive from my old system and would like to install Ubuntu on it, along with PHP, RoR, MySQL and Apache, to serve as my development environment. My primary HDD runs Win7.
If I do the majority of my work in Win7 and save to the 2nd HDD (Ubuntu), can I access my development sites on Ubuntu as if it were a separate box? Would all my paths be by drive letter instead of by IP?
You would have to use virtualization to accomplish this. Adding another hard drive doesn't give you any additional IPs; you still have just your regular outward-facing IP (LAN or WAN) and your loopback (127.0.0.1).
You can read these VirtualBox instructions on setting up an internal network, with each virtual guest having its own local IP. Other virtualization software has similar capabilities.
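As a rough sketch, assuming VirtualBox on the Windows host and a guest named "Ubuntu Dev" (the VM name is a placeholder); a host-only adapter is one common way to give the guest a local IP that the host can reach:
# Create a host-only adapter on the Windows host (appears as
# "VirtualBox Host-Only Ethernet Adapter").
VBoxManage hostonlyif create
# Attach the Ubuntu guest's first NIC to it.
VBoxManage modifyvm "Ubuntu Dev" --nic1 hostonly --hostonlyadapter1 "VirtualBox Host-Only Ethernet Adapter"
# Internal networking (guest-to-guest traffic only) works similarly:
# VBoxManage modifyvm "Ubuntu Dev" --nic1 intnet --intnet1 devnet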
EDIT: If you just want to serve from a directory on D, you can customize DocumentRoot to be e.g.
DocumentRoot "D:/public_html"
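On Apache 2.4 you would typically also grant access to that directory; a minimal sketch using the same placeholder path:
<Directory "D:/public_html">
    Require all granted
</Directory>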
Related
I have two servers that I am running locally. The first is a front-end in Vue.js, and the second is a back-end in Flask. From the client I make API requests to the second.
I have to upload these two to a remote Linux VM (Debian), for which I have credentials and which I can successfully connect to via PuTTY.
How do I transfer my two directories to the VM?
Then, should I change the address that the client uses for API requests to the server, and that is all? Or will I have to do something else?
You can copy directories over the scp or sftp protocol. In your case, this is most easily done with the WinSCP software.
scp, sftp (implemented by WinSCP) and ssh (implemented by PuTTY) all use the SSH protocol. PuTTY gives you a remote terminal (i.e. you can issue commands to the server), while WinSCP uploads, downloads and manages files on it.
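If you prefer the command line, a minimal scp sketch (the directory names, user and host are placeholders for your own):
# Copy both project directories recursively to the VM over SSH.
scp -r ./vue-frontend ./flask-backend user@your-debian-vm:/home/user/apps/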
If you are developing something, it is likely that you will need to do this deployment regularly. These tools are only good for one-off deployments; in professional environments the deployment is automated and happens quickly.
It is very likely that you also have a database in your project. Here the most common options are either some db-level synchronization, or dumping the database into files and synchronizing at the file level. But that is another topic.
It is also unlikely that you will need two different VMs, one for the Vue.js app and one for the Flask app. You could wire them together on a single VM, which would make your task far easier.
You will likely have a hard time getting your deployment working well on your server. This is all just the beginning. But don't worry: once you've learnt it all, it will be easy!
My friend and I have to do a project in MySQL, and I am trying to find out how we can work on it together from our own workstations. Is there a way for both of us to work on the same database without being physically present in the same place?
I am fairly new to MySQL.
Just get cheap shared hosting that offers MySQL databases, and preferably also phpMyAdmin and cPanel (or any other panel) so you can easily manage your databases. That way you have minimal hassle creating and managing databases that you can both access.
Google "compare cheap shared hosting" and that's it. Or take this quick Google result I got: https://www.pcmag.com/picks/the-best-cheap-web-hosting-services
If you're not happy to pay $1-$2 per month for hosting, you can install WAMP or XAMPP on a Windows machine, but then you'll also have to Google how to forward ports from your router.
Edited 3 Aug 2020: For free MySQL hosting with up to 5 MB of storage, check https://www.freemysqlhosting.net
You can use a program like XAMPP to host MySQL and port-forward phpMyAdmin so your friend can reach your database:
https://www.apachefriends.org/index.html
If you have MySQL running on a server or on your local computer, you can both access phpMyAdmin (http://server-ip/phpmyadmin). You can also use programs like Navicat (paid) or HeidiSQL (free).
You can make the database remotely accessible, in which case your friend can connect and work on it as well.
Without knowing more about your network setup it is difficult to say how you should proceed, but generally it is enough to spin up a mysql-server instance on the host machine, then forward a WAN port to the LAN address and port of the host machine.
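A minimal sketch of that on the host machine, assuming a Debian/Ubuntu-style MySQL install; the database name, user and password below are placeholders:
# 1. Let MySQL listen on all interfaces: under [mysqld] in
#    /etc/mysql/mysql.conf.d/mysqld.cnf set
#      bind-address = 0.0.0.0
#    then restart the service: sudo systemctl restart mysql
# 2. Create a user your friend can connect with and grant access to the shared database.
mysql -u root -p -e "CREATE USER 'friend'@'%' IDENTIFIED BY 'choose-a-password'; GRANT ALL PRIVILEGES ON project_db.* TO 'friend'@'%'; FLUSH PRIVILEGES;"
# 3. On your router, forward an external port (e.g. 3306) to this machine's LAN IP, port 3306.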
We have developed some software and "encapsulated" it in a virtual machine that we run with VirtualBox, from the command line in a non-interactive way (no graphical interface). We send some instructions to the virtual machine, and it outputs some resulting files. We have tested this locally on a Linux machine. Now we would like to send this to many people using Linux, but we realize they will have different distributions, system library versions, etc., and then our VM might fail. So my question is: is it possible to have something like a static binary version of VirtualBox (or any other similar system / VM / container) that does not need the system libraries, so that it can be run like a static binary?
It would be important to know what the 'special' requirements of your solution are regarding system libraries and the like.
If you are using a standard host configuration, a standard VirtualBox install should be able to run the VM on any host OS.
Since a VM runs its own kernel, it is, for the most part, not dependent on host libraries. The exception to this is when accessing/controlling host resources (disk, network, etc.). That said, VirtualBox provides ways to access the most common resources (disk, network, etc.) that are transparent to the VM. This means the VM is always configured in the same way, regardless of whether the host is Windows, Linux or Mac, and you can export your VM on Linux and import it on other platforms without having to tweak it.
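For example, a minimal sketch of packaging the VM as an appliance and running it headless on another host (the VM and file names are placeholders):
# On your machine: export the VM as a portable appliance (OVA).
VBoxManage export "OurAppVM" -o ourapp.ova
# On the recipient's machine: import it and run it without a GUI.
VBoxManage import ourapp.ova
VBoxManage startvm "OurAppVM" --type headless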
A container (e.g. Docker) is more complicated, since it shares the kernel of the host, and its behaviour depends on how the host kernel is configured.
Again, if your application doesn't depend on 'special' access to host resources, a Docker container will run the same way on all host OSes (Linux provides a native kernel, while Windows and Mac run a Linux virtual machine and then run the containers inside it).
If you feel this doesn't answer your question, please share more details about the 'special' needs/configurations of your application, so we can dive deeper into this.
Have you looked at providing portable instances of the VM that can run on different host systems?
This example shows how to create one for a Windows host (check it out here), but I'm sure it can be done for other host systems as well.
OK, so I have an app with a Node/Express API and everything works fine on localhost. I'm trying to figure out how to make everything work on cPanel running on Apache. The client-side stuff works, but I am unable to fetch any data from the backend. I've searched and looked, yes, but I'm still quite unsure how to approach this. Do I have to use a virtual host, and if so, what are the specific steps I need to take?
NodeJS doesn't run on Apache or Nginx. The most you can do with these web servers is set up a reverse proxy.
NodeJS has its own web server. cPanel won't help you in that regard; you only need to install NodeJS on your server (you must have SSH/root access) and run it from there. You can daemonize your Node process to keep it running by installing PM2 or Forever (npm packages).
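A rough PM2 sketch, assuming your entry point is app.js (the file and process names are placeholders):
# Install PM2 globally, start the API and keep it alive across reboots.
npm install -g pm2
pm2 start app.js --name my-api
pm2 save
pm2 startup   # prints a command to register PM2 as a boot service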
Here's a good answer (search before asking; the issue might already have been covered):
Run node.js on cpanel hosting server
cPanel typically runs Apache or another web server that is shared among all the cPanel/unix accounts. The web server listens on port 80. Depending on the domain name in the requested URL, the web server uses "Virtual Hosting" to figure out which cPanel/unix account should process the request, i.e. in which home directory to find the files to serve and scripts to run. If the URL only contains an IP address, cPanel has to default to one of the cPanel accounts.
Ordinarily, without root access, a job run by a cPanel account cannot listen on port 80. Indeed, the available ports might be quite restrictive. If 8080 doesn't work, you might try 60000. To access a running node.js server, you'll need to have the port number it's listening on. Since that is the only job listening on that port on that server, you should be able to point your browser to the domain name of any of the cPanel accounts or even the IP address of the server, adding the port number to the URL. But, it's typical to use the domain name for the cPanel account running the node.js job, e.g. http://cPanelDomainName.com:60000/ .
Of course port 80 is the default for web services, and relatively few users are familiar with optional port numbers in URLs. To make things easier for users, you can use Apache to "reverse proxy" requests on port 80 to the port that the node.js process is listening on. This can be done using Apache's RewriteRule directive in a configuration or .htaccess file. This reverse proxying of requests arguably has other benefits as well, e.g. Apache may be a more secure, reliable and manageable front-end for facing the public Internet.
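A minimal .htaccess sketch of that reverse proxy, assuming the Node.js job listens on port 60000 as in the example above (the port is a placeholder and mod_proxy must be available for the [P] flag):
RewriteEngine On
# Send everything that isn't an existing file or directory to the Node.js app.
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^(.*)$ http://127.0.0.1:60000/$1 [P,L]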
Unfortunately, this setup for node.js is not endorsed by all web hosting companies. One hosting company that supports it, even on its inexpensive shared hosting offerings, is A2Hosting.com. They also have a clearly written description of the setup process in their Knowledge Base.
Finally, it's worth noting that the developers of cPanel are working on built-in node.js support. "If all of the stars align we might see this land as soon as version 68," i.e. perhaps early 2018.
References
Apache Virtual Hosting - http://httpd.apache.org/docs/2.4/vhosts/
Apache RewriteRule Directive - http://httpd.apache.org/docs/2.4/mod/mod_rewrite.html
A2Hosting.com Knowledge Base Article on Configuring Node.js - https://www.a2hosting.com/kb/installable-applications/manual-installations/installing-node-js-on-managed-hosting-accounts
cPanel Feature Request Thread for node.js Support - https://features.cpanel.net/topic/nodejs-hosting
Related StackOverflow Questions
How to host a Node.Js application in shared hosting
Why node.js can't run on shared hosting?
It is worth pointing out that NodeJS support still hadn't come to cPanel as of early 2019.
Does anyone know an easy way to synchronize your /etc/hosts file across multiple machines? I use a MacBook, a Mac Mini, a Windows machine and a Linux VM to develop websites, so it would be ideal for all of them to have the same hosts config.
Instead of having a separate /etc/hosts file on each machine, you might consider using a DNS server.
You can run rsync on any of the Macs/Linux machines, and DeltaCopy on Windows.
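A minimal sketch of pushing the file from one machine to the others over SSH (the user and hostnames are placeholders, and writing /etc/hosts on the remote side needs root):
# Copy the local hosts file to each Unix-like machine, then move it into place with sudo.
rsync -av /etc/hosts user@macmini.local:/tmp/hosts
ssh user@macmini.local 'sudo cp /tmp/hosts /etc/hosts'
rsync -av /etc/hosts user@linux-vm.local:/tmp/hosts
ssh user@linux-vm.local 'sudo cp /tmp/hosts /etc/hosts'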