Single database for many domains approach - MySQL

Good morning.
I have to build a PHP/MySQL system that runs on 20 domains for 20 different cities (for example). The system running on these 20 domains is identical, and so is the database.
My issue is: I intend to create a single database to serve these 20 domains, distinguishing the cities by something like a city_id.
I would like to know if this is the best practice, or if the right way is to create one database for each city/domain.
The domains are hosted on the same server, and the core system is outside the public_html directory.
/mysystem_classes
/public_html/city1.com
/public_html/city2.com
/public_html/city3.com
/public_html/city20.com
To serve images, CSS, and JS I will use something like a CDN.

Normally you would set up just one virtual host, install your application there, and let all the domains point to that software. Which domain serves which website is then determined not at the server level but at the application level.
TYPO3, for example, works like that when building multiple sites with one instance of TYPO3 and one shared MySQL database (using TypoScript or the backend configuration to define which domain belongs to which site ID).
WordPress has a multisite feature, which can easily be set up to use several subdomains. It uses one database and a single software instance but can deliver multiple blogs on different domains (e.g. city1.example.org, city2.example.org). You will need to set up a wildcard domain (i.e. *.example.org) so that all possible subdomains point to the single vHost. This is similar to how the basic WordPress.com blogs work. See: http://codex.wordpress.org/Create_A_Network
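A minimal sketch of that application-level routing for the 20-city setup (the domain names and IDs below are assumptions, not from the question): a shared bootstrap file maps the requested host to a city_id, and every query in the core is scoped by it.

```php
<?php
// Map each domain to its city_id (illustrative values).
function cityIdForHost(string $host): ?int
{
    $map = [
        'city1.com' => 1,
        'city2.com' => 2,
        'city3.com' => 3,
        // ... one entry per city domain, up to city20.com
    ];
    // Normalize: strip an optional "www." prefix and lowercase.
    $host = strtolower(preg_replace('/^www\./i', '', $host));
    return $map[$host] ?? null;
}

// Usage in the shared core (PDO assumed):
// $cityId = cityIdForHost($_SERVER['HTTP_HOST']);
// $stmt = $pdo->prepare('SELECT * FROM articles WHERE city_id = ?');
// $stmt->execute([$cityId]);
```

With this approach each of the 20 document roots only needs a tiny index.php that includes the shared core from /mysystem_classes.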

I believe you are looking for MySQL replication.

Remote access to an Access database

I need to develop a very simple database (probably no more than 4-5 tables, with up to 50 records per table) for my company, with the following requirements:
The database itself (most likely an Access file) must be stored on a server and accessed through http://www.something.com/my_db.mdb
Users from 6 different countries (with generally low Internet bandwidth) must be able to access this database and to view / edit it through a few masks, as well as produce automatic reports / extracts
The whole solution must be as robust and as low-tech as possible, to reduce maintenance issues (ideally, no development at all)
I cannot pay for an Access license for each user, and using OpenOffice or LibreOffice is not an option (because I cannot go and install it on the computers of all the users)
My first (and naive?) idea was to:
1) Create the .mdb file containing only the data and store it on a web server
2) Create the edit masks and the automatic reports in another file that uses the online file as its data source
3) Deploy the file containing the edit masks to the computers of all users
4) The users then only have to open their local file to edit the remote DB through their edit masks
Is my approach somehow realistic? Do you see another approach that would make more sense? Can I implement my solution with a single Access license?
Thanks a lot in advance for your inputs and insights!
If you provide just the .mdb file as a file source accessible via HTTP, the users won't be able to connect to the database, because an HTTP GET simply downloads the .mdb file to their local computer. When they edit something in the database (e.g. add a record), the change happens only in their local copy of the file.
If you want to use an Access database, the simplest approach I can think of is to implement a very small web application (e.g. in ASP.NET) which connects to the .mdb file (the .mdb file can then sit in a private directory on the server). Your web application is then deployed to Internet Information Server (Microsoft IIS as the web server).
You can provide the data forms as a web application, which you implement using ASP.NET, or develop separate clients which access web services you develop with .NET.
You could try cloud-based solutions like Google Firebase.
For a requirement of this type, one should not use static Access tables, because Access is a front-end database; instead, use a back-end database such as SQL Server Express. SSE is free, and one is better positioned to provide real web-based features if needed in the long run.
Further, in terms of cost and management, one should really consider using one of the online DB services such as Zoho, Knack, Airtable, etc. One of these could well be faster and less expensive than creating a web app from scratch for such a small requirement.

From a newbie: Can one MediaWiki installation have two wikis?

We are planning to use MediaWiki as the basis for our product documentation. Access control will be used to grant customers access to content.
We would also like to use mediawiki for some of our internal documentation, stuff that customers should not access.
Is it possible to configure one installation of mediawiki such that one group of users sees certain wiki content and that another group of users sees other wiki content? If so, please point me to the appropriate documentation as I am not even sure what this would be called (thus I am uncertain where to look).
Thank you.
If by one installation you mean one database, it is sort of possible but extremely unwise. See this section of the manual for explanation and Category:Page specific user rights extensions (especially the Lockdown extension) if you decide to try it anyway.
Using the same installation directory (i.e. the same PHP files) but separate databases is fine. The manual page about wiki farms describes a few ways to do it.
If you mean that you want to restrict the "view" permission for certain pages to a specific group, then the answer is "kind of, maybe". With the default MediaWiki installation that is not possible, as MediaWiki is designed to be "open" to all users (at least for the view permission). You can only restrict whether a certain group can read or cannot read, but this always applies to all pages.
Maybe your problem can be solved by actually having two wikis, instead of two "sections" in one wiki. For this you would need:
One MediaWiki installation on your file system (unzip the MediaWiki tarball release), e.g. /var/www/html/mediawiki/
Two MySQL databases (or two database prefixes)
Two different urls (e.g. example.com/wiki1 and example.com/wiki2 or wiki1.example.com and wiki2.example.com)
A bit more complex MediaWiki configuration
Now, you first need to create two virtual hosts in your webserver. Both should point to the installation directory of your MediaWiki (/var/www/html/mediawiki/). In the next step you need to create a configuration that differs depending on which wiki the user requested (i.e. which URL was used). This is a bit tricky and a mostly undocumented area of MediaWiki, but it works like this:
You create a wgConf object
You fill this wgConf object with valid wikis (usually you use a unique name, e.g. the dbname)
You let wgConf extract all settings (using the name of the wiki, e.g. the dbname)
This part is more or less documented on the wgConf manual page. The trickier part is parsing the URL correctly and setting all the information you need. The Wikimedia Foundation uses a script called MultiVersion. This tool does a bit more than just parsing the URL to identify the wiki, but that's fine. With MultiVersion you would then set the configuration variable wgDBname, which you then use to load the wgConf data. For more information, ask specific questions and look into the git repository of the Wikimedia Foundation's configuration. I use a similar approach with just two wikis and a much smaller MultiVersion (based on the WMF's idea), so maybe that will help you understand how to configure the wikis, too.
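As a rough sketch of the steps above (the wiki names, database names, and domains here are assumptions, not a definitive setup), the shared LocalSettings.php could use $wgConf like this:

```php
<?php
// LocalSettings.php fragment -- illustrative names, adapt to your farm.
$wgConf = new SiteConfiguration();
$wgConf->wikis = [ 'publicwiki', 'internalwiki' ];

// Decide which wiki was requested, based on the host name.
$host = $_SERVER['HTTP_HOST'] ?? '';
$wikiID = ( $host === 'wiki2.example.com' ) ? 'internalwiki' : 'publicwiki';

$wgConf->settings = [
    'wgDBname' => [
        'publicwiki'   => 'publicwiki',
        'internalwiki' => 'internalwiki',
    ],
    'wgSitename' => [
        'publicwiki'   => 'Product Docs',
        'internalwiki' => 'Internal Docs',
    ],
    'wgServer' => [
        'publicwiki'   => 'https://wiki1.example.com',
        'internalwiki' => 'https://wiki2.example.com',
    ],
];

// Export the per-wiki settings into the global variables MediaWiki reads.
$wgConf->extractAllGlobals( $wikiID );
```

This is only the configuration skeleton; host parsing in a real farm is usually more defensive than a single ternary.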
You probably also want to make sure that the wikis can create interwiki links, e.g. to link from your internal wiki to documentation in your public wiki and vice versa. And you probably want some database tables to be shared between the wikis, so your users only need to register once to access both (and set the internal wiki's read permission to false, so that you have to grant access to users explicitly). See $wgSharedDB and the manual on shared databases. The configuration of my two wikis uses this feature to share the user tables.

How to password protect website hosted on Amazon Web Services (AWS)

I wanted to create a website that would be like a Dropbox of sorts, which just has files that I and my organization can access. I want to password protect the website with a simple username and password. I have my own domain. I have been looking all over the web to find out how to do this (I am a beginner) and found that .htaccess and .htpasswd can be used to secure a website, similar to what is shown here: http://www.htaccesstools.com/articles/password-protection/
But I cannot seem to get it to work. I am using an S3 bucket and putting the .htaccess and .htpasswd files in the same folder as the index.html file. Do you know how I could give my site simple password protection (that isn't visible in the source code or in the HTML)? I am not sure whether I am putting these files in the right directory. Thank you for taking the time to read this, and I hope it makes sense!
Anyone else had this issue?
Amazon AWS alone won't do it, and .htaccess and .htpasswd are also not the right tools here: they are Apache features, and an S3 bucket is a static file host that does not process them.
Get yourself a cheap hosting account with a company like hostgator or godaddy or namecheap or any other that will host your web page and give you PHP and MySQL.
You cannot accomplish what you want using just JavaScript/jQuery. Those languages run in the browser, but you want to store your files on a server, so you need a language that runs on the server, which is usually PHP. (The other popular option is ASP, which is from Microsoft and runs on costly and complex Microsoft servers. PHP is free and runs on free Linux, and is therefore what practically all of the cheap web hosting companies provide. MySQL is the free database analogous to Microsoft SQL Server.)
Next, watch a video tutorial on creating a PHP / MySQL login system, such as the ones over at:
phpAcademy (now called codecourse, apparently)
theNewBoston.com
You need to learn more about:
PHP sessions
Ajax
jQuery
MySQL (possibly)
On a basic website, you can stick your files into directories and control who can access those directories by whether or not they are logged in.
You can determine whether a visitor is "logged in" by asking for a username/password and setting a session variable. Session variables are simply variables stored on the server rather than on the user's own computer, and that is where the security state must reside anyway: every visitor has their own computer, while your files live on the central server.
Anyway, in a weekend of video watching and trial-and-error you can probably get something cobbled together that will do what you want.
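A minimal sketch of the session-based gate described above (the hard-coded user is a placeholder; a real site would look the user up in MySQL and store only the hash):

```php
<?php
// Placeholder credential check -- a real app would SELECT the stored
// password hash from MySQL and compare it with password_verify().
function checkLogin(string $user, string $pass): bool
{
    $users = ['alice' => password_hash('secret', PASSWORD_DEFAULT)];
    return isset($users[$user]) && password_verify($pass, $users[$user]);
}

// At the top of every protected page or file-download script:
// session_start();
// if (empty($_SESSION['logged_in'])) {
//     header('Location: login.php');
//     exit;
// }
```

The login page would call checkLogin() and, on success, set $_SESSION['logged_in'] = true before redirecting to the protected area.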

Openshift - Redirect requests only on specific gear

I'm using HAProxy and I have more than one gear, but I have to use the file system.
The problem is that gears don't share a file system, so I wonder whether I have to set up HAProxy in a way that lets me redirect specific requests to a specific gear (the one that contains the cron jobs).
Must I use HAProxy, or are there alternatives?
Edit
Sharing the file system across gears would be great but is not strictly necessary. My users don't need access; I just want the ability to write files on the same gear, by using a specific URL or any other trick.
For example, it would be enough if a specific URL always went to a specific (and always the same) gear.
You should use something like Amazon S3 to store your files; then they will be accessible to all of the gears in your scaled application. You would need an Amazon S3 access library that is available for your language. If you are using Ruby on Rails, I would suggest Paperclip.
I don't believe you will be able to modify the HAProxy configuration enough to always send a specific user to a specific gear within your application, and that is also a really bad programming practice to get into the habit of...

How can I point one domain to different websites

I have two pages built with two website builders: one with Wix.com, and the other with kickofflabs.com. I want to host both of the pages on my domain and hosting. I can only host one now; how can I host two? I used a CNAME to point one of the websites.
Thanks in Advance!
The closest thing would be to create one or two subdomains, and point each one of those at the appropriate hosting. For instance, page1.domain.com would point to your wix.com page, and page2.domain.com would point to your kickofflabs.com page.
See this SO question: Subdomain on different host
To be honest, you'd probably be better off in the long run just using one website builder.
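For illustration, the DNS records for two such subdomains might look like this (the hostnames and CNAME targets are made up; use the values Wix and KickoffLabs actually give you):

```
; hypothetical zone fragment for example.com
page1.example.com.   IN  CNAME  your-site.wixdns.net.
page2.example.com.   IN  CNAME  pages.kickofflabs.com.
```

Each builder then needs to be told which custom (sub)domain it should answer for.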
You should use an A record to point your domain to the server, and you should configure that server to receive those requests.